Interaction design, Invisible interface, Embodiment

Removing the interface

Is just using voice control removing the interface?

Are gesture control and recognition removing the interface?

For both of these I would say no: it's not removing the interface. A better term is reducing the interface. Gesture recognition is still a means of physically interacting with a device.

Do we just mean more intuitive interfaces?

There was talk that apps such as Clear for the iPhone have an invisible interface. On the surface this would appear to be true: everything is gesture-controlled, and if you are already familiar with those gestures it will seem natural, as if there is no interface and you just do it. To me that is still an interface. You still need to swipe, tap and pinch, and all of these are learned. The difference is that you're not just clicking menu, add new, okay.

I posed the question, 'What about the line where intuitive takes away from efficiency?' Should devices be less intuitive but geared towards higher levels of efficiency, even if that requires some learning? We have to learn to speak and to walk; why should everything after a certain point be intuitive? Why not just learn again?

Location recognition gets it wrong. Just because I am by the sea doesn't mean I want to see more pictures of the seaside. If I'm in the kitchen and I search Google, I don't automatically want to look up recipes. We can't continue to rely on location as a valuable signal for filtering information for a user.

Published November 29, 2012 by Sam Billingham in Everyware