NUI – Natural User Interface

UI Design | 13th July 2017
A Natural User Interface gives the user a much higher level of immersion; it is simpler and easier to explore than earlier interfaces, which relied on physical hardware. The Natural User Interface is also known as the “invisible interface”, fading from the user’s attention over successive interactions, and it is based on nature and on natural human behaviour.

The user quickly abstracts away the technology involved in the interaction, focusing essentially on goals and daily tasks such as checking the time, looking up tomorrow’s weather forecast, confirming when a meeting is scheduled, or communicating a warning about a slight delay.
 

Is this the main reason behind the ever-growing, staggering number of mobile devices? Yes. The learning effort is quite low and accessibility keeps improving, which makes a great contribution to digital inclusion. NUI is still expanding towards natural interaction through voice and movement, and even through our thoughts.

 

According to Dan Saffer, the Natural User Interface has 12 principles:

1) DESIGN FOR FINGERS, NOT CURSORS.

Keep in mind touch targets with dimensions suited to a finger on a mobile device. These areas must be larger than their desktop/mouse counterparts; a size of 10-14 mm is recommended for targets that will be touched with the fingertip.

(Image: Touch UI target sizes, via BeyondPLM)
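To make the numbers concrete, here is a minimal sketch, assuming an Android/Kotlin context that the article does not prescribe, which converts the recommended physical size into pixels at runtime so a target never shrinks below roughly 10 mm on any screen density:

```kotlin
import android.util.TypedValue
import android.view.View

// Minimal sketch (assumption: Android/Kotlin): keep a view's touch target
// at or above ~10 mm regardless of the device's pixel density.
fun enforceMinTouchTarget(view: View, minMm: Float = 10f) {
    val minPx = TypedValue.applyDimension(
        TypedValue.COMPLEX_UNIT_MM,      // interpret the value as millimetres
        minMm,
        view.resources.displayMetrics    // density of the screen the view is on
    ).toInt()
    view.minimumWidth = minPx
    view.minimumHeight = minPx
}
```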

2) ALWAYS KEEP PHYSIOLOGY AND KINESIOLOGY IN MIND.

Repetitive tasks must be avoided, and the movements that are feasible and comfortable for a person should be taken into account, avoiding difficult ones, for instance a drag that is too hard to perform with just the thumb on the screen.

(Image: good and not-good screen zones for the thumb, via XDA Developers)

3) GORILLA ARM.

The use of touch-sensitive screens requires attention and coordination from the human body, since the positioning of the arm, eyes and hand, and the overall posture, vary with the device and its use. This is self-evident when you are holding a smartphone or a tablet while standing, seated, using one hand, using both, etc. Prolonged use of a touchscreen mounted in a vertical position is what introduces us to Gorilla Arm Syndrome.

4) SCREEN COVERAGE.

Know your own hand. Sometimes our fingers drag the palm of the hand along with them, and this covers part of the information available on the screen. Strategic placement of the User Interface’s graphic elements can help keep all the elements visible.

5) BE AWARE OF THE TECHNOLOGY USED.

It’s important to know how our interface will be used: whether it serves a touchscreen, a sensor, or even a camera that detects gestures in order to infer an interaction.

6) ALWAYS BEAR IN MIND THAT THE MORE CHALLENGING THE GESTURE, THE HARDER THE INTERACTION AND THE FUNCTIONALITY BECOME TO USE.

7) ON RELEASE, NOT ON PRESS.

Activate functions when the user lifts the finger, not when the finger first touches the screen; this also gives the user a chance to cancel by sliding away before releasing.
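As a rough illustration, assuming an Android/Kotlin context and a hypothetical `performAction` callback, a raw touch listener can make the rule explicit: the action fires only on release, and sliding off the view before lifting cancels it.

```kotlin
import android.view.MotionEvent
import android.view.View

// Minimal sketch (assumption: Android/Kotlin): fire the action on ACTION_UP
// (finger lifted), never on ACTION_DOWN, and let the user cancel by sliding
// off the view before lifting. `performAction` is a hypothetical callback.
fun bindReleaseAction(view: View, performAction: () -> Unit) {
    view.setOnTouchListener { v, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> true  // acknowledge the press, do nothing yet
            MotionEvent.ACTION_UP -> {
                // Only act if the finger is still inside the view's bounds.
                val stillInside = event.x in 0f..v.width.toFloat() &&
                        event.y in 0f..v.height.toFloat()
                if (stillInside) performAction()
                true
            }
            else -> false
        }
    }
}
```

The platform’s standard click listener already behaves this way; the sketch just makes the on-release timing and the cancel-by-sliding-away escape hatch visible.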

8) AFFORDANCE.

Use a system of touches and gestures that are already obvious to the user: simple, intuitive gestures that draw them into the system. And we should never fight the basic affordances and the “native” knowledge of our user.



9) AVOID THE INVOLUNTARY ACTIVATION OF ACTIONS.

A variety of everyday movements made by the user can trigger the system by accident.
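One common safeguard, shown here as a hedged Android/Kotlin sketch with a hypothetical `deleteItem` callback, is to keep accident-prone taps harmless and require a deliberate long press before anything destructive happens:

```kotlin
import android.view.View

// Minimal sketch (assumption: Android/Kotlin): a plain tap never does anything
// irreversible; deleting requires a long press, which rarely happens by accident.
// `deleteItem` is a hypothetical callback used for illustration only.
fun guardDestructiveAction(deleteButton: View, deleteItem: () -> Unit) {
    deleteButton.setOnClickListener {
        // e.g. show a hint such as "long-press to delete"
    }
    deleteButton.setOnLongClickListener {
        deleteItem()
        true  // consume the long press so the plain click is not also triggered
    }
}
```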

10) GESTURES AND COMMAND KEYS.

Use a simple UI kit and provide easy access through buttons and drag movements (menu items, etc.), but also offer advanced, agile gestures that work like shortcuts once learned, as in the sketch below.
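For example, in a minimal Android/Kotlin sketch (the names `archiveButton`, `itemRow` and `archiveItem` are hypothetical), the same action stays reachable through an obvious button while a learned swipe acts as the shortcut:

```kotlin
import android.view.MotionEvent
import android.view.View
import kotlin.math.abs

// Minimal sketch (assumption: Android/Kotlin): the same "archive" action is
// reachable through an obvious button and, once learned, through a horizontal
// swipe shortcut on the row itself.
fun bindArchiveShortcuts(archiveButton: View, itemRow: View, archiveItem: () -> Unit) {
    // Easy-to-discover path: a visible button.
    archiveButton.setOnClickListener { archiveItem() }

    // Expert shortcut: a deliberate horizontal swipe across the row.
    var downX = 0f
    itemRow.setOnTouchListener { v, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { downX = event.x; true }
            MotionEvent.ACTION_UP -> {
                // Require a swipe of at least a third of the row's width,
                // so it reads as intentional rather than accidental.
                if (abs(event.x - downX) > v.width / 3f) archiveItem()
                true
            }
            else -> false
        }
    }
}
```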

11) A VARIETY OF REQUIREMENTS. THE SAME FUNCTIONALITY CAN BE ACCESSED THROUGH DIFFERENT, BUT SIMILAR, GESTURES.

Bearing in mind, for instance, that the user may be left- or right-handed.

12) DETERMINE THE COMPLEXITY OF THE GESTURE ACCORDING TO THE COMPLEXITY AND THE FREQUENCY OF THE TASK TO BE PERFORMED.

If the task is simple and frequent, it should be triggered by a simple gesture with quick access, like the Floating Action Button in Material Design.
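As a small illustration, assuming the Material Components library and a hypothetical `composeMessage` action, the most frequent task gets the cheapest possible gesture: a single tap on an always-visible FAB.

```kotlin
import com.google.android.material.floatingactionbutton.FloatingActionButton

// Minimal sketch (assumption: Material Components for Android): the single
// most frequent task is one tap away on the FAB, while rarer tasks can live
// behind menus or longer gesture paths. `composeMessage` is hypothetical.
fun bindPrimaryAction(fab: FloatingActionButton, composeMessage: () -> Unit) {
    fab.setOnClickListener { composeMessage() }
}
```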


 

Originally published in Portuguese on


Read about the huge challenge of Designing Cross-Platform Experiences.
