Human features

A human can use many different parts of their anatomy to trigger an interaction with a computer. The most common method is the tactile (touch-based) user interface, using inputs such as mouse clicks, touchscreens, buttons, faders and rotary controls (turn pots).
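
As a minimal sketch of a tactile interaction, the Python program below uses the tkinter toolkit that is built into Python: it simply waits for the user to click (or tap) a button and responds. The window title and button text are illustrative, not taken from any particular device.

    import tkinter as tk

    def on_click():
        # Runs each time the button is pressed (by mouse click or touchscreen tap).
        label.config(text="Button pressed!")

    root = tk.Tk()
    root.title("Tactile input")

    label = tk.Label(root, text="Press the button")
    label.pack(padx=20, pady=10)

    tk.Button(root, text="Press me", command=on_click).pack(pady=10)

    root.mainloop()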

In the last 10 years, technological advancements have allowed humans to use other methods of interaction.

One of the most prominent of these, and one that may come to rival tactile/touch methods, is the auditory (sound-based) user interface. Consider the way smart home hubs such as Amazon’s Echo with Alexa, and built-in assistants such as Siri on the iPhone, let people ask a computer questions. If the device is connected to the internet, most answers can be retrieved in seconds.
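
The Python sketch below gives a rough idea of how a sound-based interface can work. It uses the third-party SpeechRecognition library and assumes a microphone, the PyAudio package and an internet connection are available; it records one spoken question and sends it to an online speech-to-text service so the program can act on the text. It is an illustration only, not how any particular assistant is built.

    import speech_recognition as sr

    # Set up the recogniser and listen for one spoken question.
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)   # allow for background noise
        print("Ask your question...")
        audio = recognizer.listen(source)

    try:
        # Send the recording to an online speech-to-text service; this step
        # only works if the device is connected to the internet.
        question = recognizer.recognize_google(audio)
        print("You asked:", question)
    except sr.UnknownValueError:
        print("Sorry, the speech could not be understood.")
    except sr.RequestError:
        print("The speech service could not be reached - check the connection.")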

Other methods of interaction that are becoming more common include visual user interfaces such as iris recognition and face recognition. These tend to be used for security (for example, to unlock a device) or by users with a physical impairment.
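
The sketch below shows only the first step of a visual user interface: detecting whether a face is present in a camera image, using the OpenCV library and its bundled Haar cascade face detector. Recognising whose face it is, as a phone unlock feature must, needs a further trained recognition step that is not shown here.

    import cv2

    # Load OpenCV's pre-trained frontal-face detector (a Haar cascade).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    camera = cv2.VideoCapture(0)   # open the default webcam
    ok, frame = camera.read()      # capture a single frame
    camera.release()

    if ok:
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
        print("Faces detected:", len(faces))
    else:
        print("Could not read from the camera.")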

Let’s investigate!

Take a look at this article and read about human user interface interaction of the near future.

When designing a user interface for a human, you must consider the differences between people and the barriers some face in accessing technology. The gap these barriers create is called the digital divide.

Glossary

Digital divide
The gap between those who can access technology and those who cannot. It can be caused by many different factors, including age, poverty, infrastructure, culture and education.

To narrow the digital divide, the design of a user interface must take into account:

  • Accessibility needs – consider physical or mental impairments (see the sketch after this list).
  • Education – some users will have a high level of digital literacy and others will not; the user interface must support and adapt to both groups.
  • Demographics – consider age, culture and other demographic differences so that more people can use the interface.
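
As a simple illustration of accessibility options, the Python sketch below uses the built-in tkinter toolkit to offer two settings a more inclusive interface might provide: larger text and a high-contrast colour scheme. The option names, font and colours are assumptions for the sake of the example, not a standard.

    import tkinter as tk

    def apply_settings():
        # Re-style the label according to the options the user has ticked.
        size = 18 if large_text.get() else 11
        fg, bg = ("white", "black") if high_contrast.get() else ("black", "white")
        message.config(font=("Arial", size), fg=fg, bg=bg)

    root = tk.Tk()
    root.title("Accessibility options")

    large_text = tk.BooleanVar()
    high_contrast = tk.BooleanVar()

    message = tk.Label(root, text="Welcome", font=("Arial", 11),
                       fg="black", bg="white")
    message.pack(padx=20, pady=20)

    tk.Checkbutton(root, text="Large text", variable=large_text,
                   command=apply_settings).pack(anchor="w")
    tk.Checkbutton(root, text="High contrast", variable=high_contrast,
                   command=apply_settings).pack(anchor="w")

    root.mainloop()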

Let’s investigate!

Research a range of different devices (e.g. mobile phones, tablets, robots) and investigate the types of user interface features they use and how the user interacts with them.