User Interaction with Touch Devices

Hi everyone,

I’ve been thinking about this lately, and discussing it with my coworkers: how user interaction is changing thanks to smart touch devices.

In the past (before smart touch devices), users usually interacted with physical buttons. When I type on my keyboard, I feel each individual key. I know when I’ve hit one key or another, because I can feel it respond to my movement (haptic feedback). When I use my MP3 player, I can feel each button being pressed down. I can find the button I’m looking for without looking, because I can just feel around for it. The same applies to many other devices.

Enter smart phones and tablets.

I currently have a Droid RAZR, one of many smartphones available. It has exactly three physical buttons: power and a volume rocker (volume up and volume down). That’s it. Everything else is purely touch driven. Many other smart devices follow the same pattern.

These devices provide no haptic or tactile feedback, aside from the feel of the glass itself and the occasional vibration, which is generally identical from one virtual “button” to the next. Even with this lack of feedback, though, I’m generally able to unlock my screen (including entering a passcode on the nine-dot pattern grid) and then do something simple like start my music playing… all without looking.

I wanted to start a thread to get your feedback on how you think this shift in interfaces will affect user interaction, in particular how it will affect mobile websites and applications. Also, do you think this shift will migrate more and more to traditional desktops as well, as more tablet PCs become available?

Thoughts?

These days, more and more websites are being designed to be “responsive”, which means they adapt their layout to the screen size of whatever device they’re being viewed on. So, to answer your question: I don’t think the shift in interfaces will affect user interaction much, provided the website is usable and contains all the essential elements users need.
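For anyone curious what “responsive” means in practice: the page queries the viewport size (and, increasingly, the input type) and adjusts its layout to match. Below is a minimal sketch in plain browser TypeScript; the 600px breakpoint, the “nav” element id, and the “compact” class are placeholders I made up for illustration, not part of any standard.

```typescript
// A minimal sketch (plain browser TypeScript, no framework assumed) of how a
// page might adapt its UI to screen size and touch input. The breakpoint,
// element id, and class name below are hypothetical, chosen for illustration.

// matchMedia lets a script react to the same breakpoints a CSS media query uses.
const smallScreen: MediaQueryList = window.matchMedia("(max-width: 600px)");

// "pointer: coarse" roughly means a finger on a touch screen;
// "pointer: fine" roughly means a mouse or trackpad.
const touchInput: MediaQueryList = window.matchMedia("(pointer: coarse)");

function applyLayout(): void {
  const nav = document.getElementById("nav");
  if (!nav) return;

  // Collapse the navigation into a compact, touch-friendly form on small
  // touch screens; keep the full desktop layout otherwise.
  if (smallScreen.matches && touchInput.matches) {
    nav.classList.add("compact");
  } else {
    nav.classList.remove("compact");
  }
}

// Re-apply whenever the viewport crosses the breakpoint (e.g. a tablet
// rotating, or a desktop browser window being resized).
smallScreen.addEventListener("change", applyLayout);
applyLayout();
```

In real projects most of this logic lives directly in CSS media queries; the script version just makes the mechanism explicit. And the “pointer: coarse” query is what lets a site tell a finger on glass apart from a mouse, which ties back to the original touch question.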