If you’re going to design for mobile, you’ll need to consider both the way the device is used and the specifics of the device itself. There are some general principles that can help mobile designers get started, but don’t forget that these don’t replace the need for user research. They are guidelines, not hard-and-fast rules.
There are many things to consider when designing for mobile, and while many are standard UX considerations, there are mobile-specific design considerations too. Are you going to integrate your mobile offering with your current offering? If so, will you use responsive design or adaptive design?
A lot of this boils down to context, i.e. the context in which the mobile device will be used. If your users access the mobile web from their desks, that’s awesome, but many users don’t. They’ll be trying to use their devices in the supermarket, on their daily commute, on the walk to the coffee shop, etc.
That means you’re also going to have to consider how to reduce distractions and make it easy for the user to focus on the task at hand.
Josh Clark, the author of Tapworthy: Designing Great iPhone Apps, offers three categories for mobile web access:
Microtasking: When the user interacts with their device for brief but frenzied periods of activity
Local: When the user wants to know what’s going on around them
Bored: When the user has nothing better to do and is looking to be entertained or otherwise diverted
Basic Design Considerations for the Mobile Web
You don’t have as much screen real estate on mobile devices as you do on PCs and laptops, and you’ll normally be designing for multiple screen sizes. You need to decide early whether to use responsive design (where the layout adapts fluidly in the browser) or adaptive design (where your servers detect the device and deliver a tailored layout).
You want to focus on a “mobile first” approach, which means designing for the smallest mobile platforms first and increasing complexity from there.
A good process to follow would be:
Group device types based on similar screen sizes and try to keep this to a manageable number of groups
Define content rules and design adaption rules that enable you to display things well on each group of devices
Try to adhere as closely to web standards (W3C) as possible when implementing flexible layouts
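As a rough illustration of the first two steps above, here is a minimal Python sketch of grouping viewport widths into a manageable number of device groups. The breakpoint values and group names are assumptions for the example; real values should come from analytics on your actual audience’s devices.

```python
# Hypothetical breakpoints (CSS pixels) -> device group names.
# Real breakpoints should be derived from your own device analytics.
BREAKPOINTS = [
    (480, "small-phone"),
    (768, "large-phone"),
    (1024, "tablet"),
]

def device_group(viewport_width: int) -> str:
    """Map a viewport width to a named device group."""
    for max_width, group in BREAKPOINTS:
        if viewport_width <= max_width:
            return group
    return "desktop"  # anything wider falls into the largest group
```

Once each group is named, you can attach content rules and adaptation rules (step two) to the group rather than to individual devices, which keeps the number of layouts manageable.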
Don’t forget that there are many different browsers for the mobile web and the wider Internet. You want to support as many of these as possible, including those that are no longer current (such as the BlackBerry and Nokia WebKit browsers).
Keep Navigation Simple
Keypads and touch screens don’t allow for navigation as precise as a mouse does, so try to:
Prioritize navigation based on the way users work with functionality, with the most popular features at the top
Minimize the levels of navigation involved
Ensure labelling is clear and concise for navigation
Offer short-key access to different features
Remember to offer at least a 30×30 pixel space for touch screen tap points
Ensure that links are visually distinct and make it clear when they have been activated too
Make it easy to swap between the mobile and full site (if you choose to implement separate versions)
Keep Content to a Minimum
Don’t overwhelm your users – respect the small screen space. Keep content to a minimum.
Make sure that content is supported on all devices, or avoid it; Flash, for example, is not universally supported, so don’t use it.
Make page descriptions short and to the point, so that bookmarks stay relevant.
Reduce the Inputs Required from Users
The less the user has to fiddle with their phone, the more they’re going to enjoy using your mobile web offering. Consider:
Keeping URLs short.
Offering alternative input mechanisms (video, voice, etc.)
Minimizing inputs in forms (you can always ask for more data when the user logs on to the desktop)
Allowing permanent sign in (most smartphones are password or fingerprint protected – the risks of staying logged in are less than on the desktop)
Keeping scrolling to a minimum and only allowing scrolling in one direction
Remember Mobile Connections Are Not Stable
Mobile connections can be a colossal PITA in areas with patchy service. Don’t make things hard on your users. Try:
Retaining data so that it’s not lost in a connection break
Minimizing page size for rapid loading
Killing off ad networks and other third-party scripts on mobile sites, which consume huge amounts of bandwidth and data
Keeping the number of embedded images to a minimum and reducing the size of those you do use (speeding up load times)
Continuous Integrated Experiences
As users move between mobile and the desktop they’re going to expect similar experiences. Remember to:
Maintain continuity. If they log into your webstore on mobile they should be able to track orders and make purchases just like they would on the desktop.
Maintain consistency. Offer the option to switch between mobile and desktop offerings at will.
Maintain brand. The look and feel of each version should be similar.
The Take Away
Mobile is different from the traditional desktop environment, and while standard UX and usability considerations still apply in a mobile context, the mobile environment also brings new design considerations. It’s important for mobile designers to pay attention to the details in order to deliver the best possible user experiences.
Interaction design is an important component within the giant umbrella of user experience (UX) design. In this article, we’ll explain what interaction design is, some useful models of interaction design, as well as briefly describe what an interaction designer usually does.
A simple and useful understanding of interaction design
Interaction design can be understood in simple (but not simplified) terms: it is the design of the interaction between users and products. Most often when people talk about interaction design, the products tend to be software products like apps or websites. The goal of interaction design is to create products that enable the user to achieve their objective(s) in the best way possible.
If this definition sounds broad, that’s because the field is rather broad: the interaction between a user and a product often involves elements like aesthetics, motion, sound, space, and many more. And of course, each of these elements can involve even more specialised fields, like sound design for the crafting of sounds used in user interactions.
As you might already realise, there’s a huge overlap between interaction design and UX design. After all, UX design is about shaping the experience of using a product, and much of that experience involves some interaction between the user and the product. But UX design is more than interaction design: it also involves user research (finding out who the users are in the first place), creating user personas (why, and under what conditions, would they use the product), performing user testing and usability testing, etc.
The 5 dimensions of interaction design
The 5 dimensions of interaction design(1) are a useful model for understanding what interaction design involves. Gillian Crampton Smith, an interaction design academic, first introduced the concept of four dimensions of an interaction design language, to which Kevin Silver, senior interaction designer at IDEXX Laboratories, added the fifth.
1D: Words

Words—especially those used in interactions, like button labels—should be meaningful and simple to understand. They should communicate information to users, but not so much information that they overwhelm the user.
2D: Visual representations
This concerns graphical elements like images, typography and icons that users interact with. These usually supplement the words used to communicate information to users.
3D: Physical objects or space
Through what physical objects do users interact with the product? A laptop, with a mouse or touchpad? Or a smartphone, with the user’s fingers? And within what kind of physical space does the user do so? For instance, is the user standing in a crowded train while using the app on a smartphone, or sitting on a desk in the office surfing the website? These all affect the interaction between the user and the product.
4D: Time

While this dimension sounds a little abstract, it mostly refers to media that change with time (animation, videos, sounds). Motion and sounds play a crucial role in giving visual and audio feedback to users’ interactions. Also of concern is the amount of time a user spends interacting with the product: can users track their progress, or resume their interaction some time later?
5D: Behaviour

This includes the mechanism of a product: how do users perform actions on the website? How do users operate the product? In other words, it’s how the previous dimensions define the interactions of a product. It also includes the reactions—for instance, emotional responses or feedback—of users and the product.
Important questions interaction designers ask
How do interaction designers work with the 5 dimensions above to create meaningful interactions? To get an understanding of that, we can look at some important questions interaction designers ask when designing for users, as provided by Usability.gov(2):
What can a user do with their mouse, finger, or stylus to directly interact with the interface? This helps us define the possible user interactions with the product.
What about the appearance (colour, shape, size, etc.) gives the user a clue about how it may function? This helps us give users clues about what behaviours are possible.
Do error messages provide a way for the user to correct the problem or explain why the error occurred? This lets us anticipate and mitigate errors.
What feedback does a user get once an action is performed? This allows us to ensure that the system provides feedback in a reasonable time after user actions.
Are the interface elements a reasonable size to interact with? Questions like this help us think strategically about each element used in the product.
Are familiar or standard formats used? Standard elements and formats are used to simplify and enhance the learnability of a product.
So what do interaction designers do?
Well, it depends.
For instance, if the company is large enough and has huge resources, it might have separate jobs for UX designers and interaction designers. In a large design team, there might be a UX researcher, an information architect, an interaction designer, and a visual designer, for instance. But for smaller companies and teams, most of the UX design job might be done by 1-2 people, who might or might not have the title of “Interaction Designer”. In any case, here are some of the tasks interaction designers handle in their daily work:
Design strategy

This is concerned with what the goal(s) of a user are, and in turn what interactions are necessary to achieve those goals. Depending on the company, interaction designers might have to conduct user research to find out what the goals of the users are before creating a strategy that translates those goals into interactions.
Wireframes and prototypes
This again depends on the job description of the company, but most interaction designers are tasked to create wireframes that lay out the interactions in the product. Sometimes, interaction designers might also create interactive prototypes and/or high-fidelity prototypes that look exactly like the actual app or website.
If you have just started embarking on your journey through the Design Thinking process, things might seem a little overwhelming. This is why we have prepared a useful overview of the Design Thinking process, as well as some of the popular Design Thinking frameworks commonly used by global design firms and national design agencies.
To begin, let’s have a quick overview of the fundamental principles behind Design Thinking:
Design Thinking starts with empathy, a deep human focus, in order to gain insights which may reveal new and unexplored ways of seeing, and courses of action to follow in bringing about preferred situations for business and society.
It involves reframing the perceived problem or challenge at hand, and gaining perspectives, which allow a more holistic look at the path towards these preferred situations.
It encourages collaborative, multi-disciplinary teamwork to leverage the skills, personalities and thinking styles of many in order to solve multifaceted problems.
It initially employs divergent styles of thinking to explore as many possibilities as it can, deferring judgment and creating an open ideation space to allow the maximum number of ideas and points of view to surface.
It later employs convergent styles of thinking to isolate potential solution streams, combining and refining insights and more mature ideas, which pave a path forward.
It engages in early exploration of selected ideas, rapidly modelling potential solutions to encourage learning while doing, and allows for gaining additional insight into the viability of solutions before too much time or money has been spent.
It tests the prototypes that survive the process further to remove any potential issues.
It iterates through the various stages, revisiting empathetic frames of mind and then redefining the challenge as new knowledge and insight are gained along the way.
It starts off chaotic and cloudy, steamrolling towards points of clarity until a desirable, feasible and viable solution emerges.
As we have seen from the definitions and descriptions, Design Thinking means many things to many people, and this theme persists into practical implementation as well. There is a wide variety of process breakdowns and visualisations, typically ranging between 3 and 7 steps. Each process step or phase embodies one or more of the core ingredients of Design Thinking, those being reframing, empathy, ideation, prototyping and testing. These different implementation frameworks or models might have different names and numbers of stages, but they embody the same principles laid out in the bullet points above.
Modelled on Early Traditional Design Processes
The earliest process expressions of Design Thinking were almost exact replications of the traditional design process, with the later addition of deeper empathy and more specific forms of multidisciplinary collaboration. Taken from Herbert Simon’s 1969 seminal work The Sciences of the Artificial, the design process of define, research, ideate, prototype, choose, implement, and learn has been the cornerstone of design practice for decades.
Popular Design Thinking Frameworks
Heart, Head and Hand
The Design Thinking Process is a blend of Heart, Head and Hand. This means the process is based on vision, need, emotion and feeling to begin with, continuing on to the cognitive processing for ideation and evaluation and then diving into practical creation by hand. It’s a holistic process and demands input from all of our faculties in order to be successful.
The Deep-Dive was IDEO’s first expression of this process, which they aired live on ABC Nightline back in the late ’90s. The Deep-Dive process comprises the following steps:
Deloitte acquired the Deep-Dive process in 2006.
d.school’s 5 Stage Process
The Stanford Design School (d.school), now known as the Hasso Plattner Institute of Design, began teaching a Design Thinking process with the following 3 steps:
They have since moved on to formulate and open source their famous 5-stage process (Empathise, Define, Ideate, Prototype, Test), which is widely used. This is the process we also recommend.
The d.school represents the 5-stage process with their hexagonal Design Thinking lenses. The lenses are purposely presented this way so that they are seen more as enablers or modes of thinking, rather than as concrete linear steps.
IDEO’s Design Thinking Process
IDEO uses a different process, and while it has only three stages, it covers pretty much the same ground as the other processes covered here. The three stages are:
Inspire: The problem or opportunity that motivates the search for solutions
Ideate: The process of generating ideas
Implement: The path that leads from the project room to the market
IDEO have also released a deck of IDEO Method Cards covering the modes Learn, Look, Ask and Try, each with its own collection of methods for an entire innovation cycle.
HCD – Human Centred Design
IDEO has also developed contextualised toolkits that repackage the Design Thinking process. One such iteration focuses on social innovation settings in developing countries. For this context, the terminology needed to be simplified, made memorable and restructured for the typical kinds of challenges faced. The HCD process (Human Centred Design) was re-interpreted as an acronym meaning Hear, Create, Deliver.
Hear

Similar to the early phases in other Design Thinking processes, the Hear stage is about developing an empathic understanding of users, as well as defining the problem the team is trying to solve. It serves the purpose of gaining a solid foundation in the context of the problem and sufficiently reframing it in order to progress. In this phase of the process, design thinkers need to:
identify their challenge,
recognise existing knowledge in the challenge space,
identify people to engage with to understand the deeper human side of the challenge,
develop points of view or stories to guide the creation phase.
Create

Similar to the Ideate and Prototype phases in d.school’s 5-stage approach, the Create stage is concerned with exploration, experimentation and learning through making. It involves pinpointing potential areas of exploration and then engaging those closest to the problem to co-create solutions. This allows design teams to maintain the highest levels of empathy during the early design phases, as well as weed out potentially problematic assumptions made by designers who do not sufficiently understand the context. In this phase, design teams:
Highlight opportunities to explore from the insights gained in the Hear phase
Recruit participants for the co-design task from a diverse pool of those affected
Maintain awareness of sensitivities by avoiding judgements
Facilitate action-orientated creation of tangible solutions
Deliver

The Deliver phase of the HCD process is centred on logistical implementation and overcoming any obstacles that may exist when rolling out a solution in the required context. Though the solutions arrived at may provide a functional patch to a problem, getting buy-in from communities and bypassing any other roadblocks on the path to implementation is essential for the process to be completed successfully.
Design Council of the UK: 4 D’s
The Design Council of the UK has settled on 4 D’s: Discover, Define, Develop and Deliver. They make use of a Double Diamond process diagram to indicate two cycles of divergent and convergent thinking and activities.
Frog Design’s 3 D’s (Discover, Design, Deliver) have been replaced with Explore, Converge and Support, indicating a focus on more than just finite projects or products: an ongoing relationship with their clients well after the delivery date.
What x 4
Jeanne Liedtka and Tim Ogilvie’s book, Designing for Growth, puts forward a unique spin on the same journey, reframing the terminology into a more inquisitive and intuitive 4 W’s. Jeanne Liedtka is a professor of business administration at the Darden School of the University of Virginia, while Tim Ogilvie is the founder of the innovation consultancy Peer Insight, and both are experts in design thinking and strategic thinking. Their 4 W’s process involves asking:
What is? Exploring the current reality
What if? Envisioning alternative futures
What wows? Getting users to help us make some tough choices
What works? Making it work in-market, and as a business
What if—one of the most powerful phrases in the English language, and for good reason.
The LUMA System
The LUMA Institute, a global firm that teaches innovation and human-centred design, has its own expression of the Design Thinking modes: Looking, Understanding and Making. Each mode unfolds through a series of steps, complete with a proprietary user manual and method cards, and the system allows you to remix a wide range of processes through the three modes using the methods specific to your needs.
The Take Away
We could spend weeks exploring the Design Thinking processes, their differences and similarities, and the merits of variety or conformity. It is important for us to peel away the facade in order to understand the foundations. To the first-timer, at first sight, the Design Thinking process seems mysterious, chaotic and at times complex. However, it’s a discipline that will grow on you with direct practice. You will learn things in a practical manner that no theory can adequately cover, growing in confidence with each new experience. You may even be tempted to develop your own expression of these steps, modes and phases to suit a completely new context, and that’s part of the beauty of Design Thinking.
(This article is kindly sponsored by Adobe.) Voice-enabled interfaces are challenging the long dominance of graphical user interfaces and are quickly becoming a common part of our daily lives. According to a survey run by Adobe, 76 percent of smart speaker owners increased their usage of voice assistants over the last year.
In this article, I’ll share a flow that you can use to create voice-based experiences. But before we dive into the specific recommendations on how to design for voice, it’s important to understand the user expectations about it.
Why Do People Expect More From Voice?
Voice User Interfaces (VUIs) not only introduce a change in the way people interact with machines, but they also raise the bar for the quality of interaction. When people interact with GUIs and have trouble with them, they often blame themselves; but when people interact with VUIs and are unable to complete a task, they blame the system.
Why is that? Well, talking is the most natural and convenient medium of communication between people, and people are confident in their talking skills. This has a direct influence on retention rates: a 2017 report by Voicelabs states there’s only a 6 percent chance a user will be active in the second week after downloading a voice application.
Many designers think that designing voice-based experiences is completely different from graphical user interfaces. That’s not true.
Designing voice-based experiences is not a new direction in UX design; it’s a next natural step. It’s possible to adapt the design process that we use for visual interfaces for voice-based products.
There are five steps that should take place before starting development of a voice product:
The great thing about this process is that it can be applied to all types of voice interfaces, whether voice-enabled, voice-only or voice-first.
Similar to any other digital product we design, we need to apply user-first design in the context of voice user interfaces. The goal of user research is to understand the needs and behaviors of the target user. The information you gather during this step will be a foundation for product requirements.
Identify The Target Audience
Defining and researching the target audience of a product should be one of the first steps in the design process.
Here’s what to focus on during this step:
Current experience. Look at how users are solving their problem now. By identifying pain points, you’ll find the cases where voice can benefit your users.
User language. The exact phrases that a target user uses when they speak with other people. This information will help us to design a system for different utterances.
During this step, we need to shape our future product and define its capabilities.
Define Key Scenarios Of Interaction
Scenarios come before specific ideas for the app; they’re a way to think about the reasons someone might have to use a VUI. You need to design scenarios that have high value for your target users. If you have many scenarios and don’t know which ones are important, create a use case matrix to evaluate each individual scenario. The matrix will tell you which scenarios are primary, which are secondary and which are nice-to-haves.
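A use case matrix can be as simple as a scoring table. The following Python sketch is one illustrative way to build it; the scenario names, the 1-5 scoring scales (user value and frequency of use), and the bucket thresholds are all assumptions for the example, not a standard method:

```python
# Hypothetical scenarios, each scored 1-5 on (user value, frequency of use).
scenarios = {
    "reorder favourite meal": (5, 5),
    "track delivery": (4, 4),
    "browse full menu": (3, 2),
    "change account email": (2, 1),
}

def classify(value: int, frequency: int) -> str:
    """Bucket a scenario by its combined score."""
    score = value + frequency
    if score >= 8:
        return "primary"
    if score >= 5:
        return "secondary"
    return "nice-to-have"

# The resulting matrix tells you where to focus design effort first.
matrix = {name: classify(v, f) for name, (v, f) in scenarios.items()}
```

However you weight the scores, the point is the same: make the prioritisation explicit so the team agrees on which scenarios are primary before any dialogues are written.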
Make Sure Key Scenarios Work With Voice
There should be a compelling reason to use voice. Users should be able to solve the problem faster or more efficiently using voice than any of the alternative experiences.
A few common cases when voice interaction might be preferable for users:
When the user’s hands are busy (while driving or cooking);
When using voice is an easier and more natural way to interact (for example, it’s much easier to tell your smart speaker to “Play Jazz” rather than jump to a media center and select the right option using a GUI).
Your goal for this step is to identify both common and specific cases that your users will benefit from. It’s also important to consider the limitations of voice interactions. For example, selecting from a long list of menu items is problematic with voice interactions. A good rule of thumb is to keep choices short and to the point — 3 selections maximum. If you find you have more than 3, it’s best to reframe the scenario.
With voice prototypes, it’s important to start at the drawing board. The first step is to tackle the voice user flows of your experience, which form the basis to which all user interaction will map back.
Storyboards visualize interactions and flows in context and make them feel more realistic.
Dialogues are the building blocks of voice user flows. For each key scenario that the voice app will support, start creating conversational dialogues between the user and the app. Strive to make interacting with the app as familiar as having a regular conversation with a real person. Human conversation is complex; it often has many twists and turns. It will be important to take this into account when working through your scenarios and writing dialogues.
A few general recommendations for creating great dialogues:
Reduce the number of steps it takes to complete a task.
Try to eliminate unnecessary information and questions wherever possible. The design should solve the user’s problem in the minimum number of steps. Remember that the longer it takes to complete the task, the less enjoyable the experience becomes. No one likes products that drain their valuable time, and this is especially true for repetitive tasks. Your app should provide a delightful user experience whether it’s the first time a user completes the task or the 100th time this month.
Don’t teach “commands”.
Given how hard discovery can be in voice, some designers try to teach users commands so they know what to say. Don’t do that. Phrasing should be natural and intuitive, so users will know what to say in particular situations.
Be brief with your questions and responses.
It’s relatively easy to overload users with too much information when you use long sentences. Try to simplify messages to their essence wherever possible. Users enjoy concise and well-defined information from a VUI. Think about what information is primary and what information is secondary. Follow Grice’s maxim of quantity, which says that a speaker should provide the listener with as much information as is necessary to advance the perceived purpose of the interaction, but not more.
Here is a dialogue that doesn’t follow the maxim of quantity:
User: “Where’s a good place to go for pizza?”
System: “I have five places for you. The first one, ‘Pizza & Pasta’, is located at 1300-1398 Diamond St, San Francisco, 15 minutes away from you and open from 8:00 AM till 8:00 PM; the second, ‘Italy’, is located at 4100-4148 20th St, San Francisco, 1 hour away from you and open from 9:00 AM till 6:00 PM,…”
The dialogue tries to present all options in a single turn. It’s better to try to narrow down the list by asking a clarifying question:
User: “Where’s a good place to go for pizza?”
System: “There are several pizza restaurants in the area — would you like to walk, or drive?”
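The narrowing-down behaviour above can be sketched as a simple rule: if there are more than a handful of matches, ask a clarifying question instead of reading out the whole list. This Python sketch is illustrative only; the 3-item limit follows the rule of thumb mentioned earlier, and the restaurant names and question wording are assumptions:

```python
MAX_CHOICES = 3  # keep spoken lists short, per the rule of thumb above

def respond(matches: list[str], clarifying_question: str) -> str:
    """Read out a short list, or ask a question that narrows it down."""
    if len(matches) <= MAX_CHOICES:
        return "I found: " + ", ".join(matches) + "."
    return clarifying_question

# With five matches, the system asks rather than overwhelming the user.
reply = respond(
    ["Pizza & Pasta", "Italy", "Napoli", "Slice House", "Crust"],
    "There are several pizza restaurants in the area. Would you like to walk, or drive?",
)
```

In a real voice app the clarifying question would itself be designed per scenario, but the underlying dispatch logic stays this simple.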
No matter how good your VUI is, one thing is for sure: things will go wrong. And you need to design for those situations. Graceful handling of errors is one of the biggest challenges you’ll face when designing for voice, and it’s vital to start thinking about it right from the time you write your dialogues.
Here are a few common cases that can cause error states:
Some users can mispronounce words and the natural language processing system won’t understand the request.
Even between humans, speech is sometimes ambiguous. For example, someone might say “Good” and that could be defined as ‘Yes, I agree’ or just be an indication that they’re listening.
Cases when the system is unable to provide valuable information for users. For example, the user wants to book a flight, but the system doesn’t have relevant options.
In all cases, the dialogue shouldn’t end in a dead end; the system should provide valuable options. Always try to get the dialogue back on track.
Let’s check out the first example:
User: “Book me a flight from San Diego to Dallas on Monday.”
System: “I couldn’t find any flights from San Diego to Dallas on Monday.”
It’s evident that the user faces a dead end. However, it’s relatively easy to modify the dialogue to provide some options:
User: “Book me a flight from San Diego to Dallas on Monday.”
System: “I couldn’t find any flights from San Diego to Dallas on Monday. Would you like to search for another date?”
Another example, from a food-ordering app:
System: “What’s the ZIP code for delivery?”
User: “Hmm, 811415”
System: “This ZIP is invalid. Try again.”
Not only is the word “invalid” too technical, it also doesn’t help get the user back on track. It’s better to offer the user a hint so they can understand what to do next:
System: “What’s the ZIP code for delivery?”
User: “Hmm, 811415”
System: “This ZIP is longer than expected. It should be five digits, not six.”
Reflect Your Unique Identity
All voices project a persona, whether you plan for one or not. Don’t leave your VUI persona to chance; reflect your unique brand and identity in the dialogues you write. Imagine how your ideal employee would interact with customers and try to reflect that in the wording of your dialogues.
Tone of voice has measurable impacts on users’ perceptions of a product. That’s why it’s important to consider the emotional needs of your users when choosing a tone.
Bake Empathy In Interactions
Voice interfaces should take user emotions into account. People like not only friendly people but also friendly computers. For example, when someone wants to book a flight and provides information about the trip, the system might respond, “Sounds like a fun trip!” The response should be slightly different each time to prevent the feeling of interacting with a machine.
Confirm When A Task Has Been Completed
It’s vital to think about where in the conversation flow users need confirmations. Usually, people expect a final confirmation at the end of a dialogue. For example, when a user schedules an event, they might want to hear “The event is on your calendar now.” Another typical scenario is a checkout flow: let the user know that the transaction has been successfully recorded.
Use explicit confirmations for important actions and implicit ones for routine tasks. For example, if you ask Alexa to send money to a friend, you probably want to hear “The [amount of money] was sent to [name of the person]” rather than just “OK.” At the same time, when you ask Alexa to turn off the lights in the garage, hearing “The lights in the garage are off” every time might be too much, so be sure to test confirmations carefully to find out which ones your users feel are critical to feeling successful with the VUI.
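The explicit-versus-implicit distinction can be captured in a tiny dispatch rule. This sketch assumes each action is tagged with its importance; the action names and confirmation templates are invented for the example:

```python
# Hypothetical action registry: name -> (is_critical, confirmation template).
ACTIONS = {
    "send_money": (True, "{amount} was sent to {recipient}."),  # explicit
    "lights_off": (False, "OK."),                               # implicit
}

def confirmation(action: str, **details: str) -> str:
    """Give a detailed confirmation for critical actions, a brief one otherwise."""
    critical, template = ACTIONS[action]
    return template.format(**details) if critical else template

msg = confirmation("send_money", amount="$20", recipient="Alex")
```

Whether an action counts as critical is a design decision to validate in testing, not something to decide once and hard-code forever.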
A good conversational system keeps track of the dialogue, memorizing all previous turns and previous interactions. A solid system will use this information to offer users a more personalized experience.
For example, when a user orders pizza, the system might remind them about their previous order:
User: “I want to order a pizza.”
System: “Last time you ordered Quattro Formaggio from Pizza & Pasta. Do you want to order it again?”
User: “Yay, I do!”
Cover Alternate Phrases
People can use different words to describe the same thing, and it’s vital to take this into account when designing your VUI. For each voice user flow you designed in the previous step, think about the different ways users could phrase those requests. Consider word variations and synonyms they might use.
Depending on the capabilities of your voice product, the number of utterances that users can vocalize when interacting with VUI can easily run into the hundreds, making the task of mapping them out really complex. Fortunately, there are special tools available to help you with that. For example, if you design apps for Alexa, you can use Amazon Echo Utterance Expander for that purpose.
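Tools of this kind essentially take a template with alternates and enumerate every combination. A minimal sketch of the idea in Python — the template syntax (parentheses with `|`-separated alternates, an empty alternate marking an optional word) is made up for illustration and is not Amazon's actual format:

```python
import itertools
import re

def expand(template):
    """Expand a pattern like "(order|get) (a|) pizza" into every
    concrete utterance it covers. Alternates sit in parentheses,
    separated by '|'; an empty alternate makes the word optional."""
    parts = re.split(r"(\([^)]*\))", template)
    options = [p[1:-1].split("|") if p.startswith("(") else [p]
               for p in parts]
    utterances = []
    for combo in itertools.product(*options):
        # join the pieces and collapse any doubled spaces
        utterances.append(" ".join("".join(combo).split()))
    return utterances

print(expand("(order|get) (a|) pizza"))
# ['order a pizza', 'order pizza', 'get a pizza', 'get pizza']
```

Even this toy version shows how quickly the combinations multiply: three groups of three alternates each already yield 27 utterances.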
Test Your Dialogues
Now that you have all your dialogues written, it’s time to start testing them. Why? Because the way we speak is far less formal than the way we write. To make sure you design dialogues that sound natural, it’s vital to test them before moving on to prototyping. Two simple techniques will help you do it:
Record and play back audio of your dialogues. You’ll hear nuances of words and sentences that just aren’t natural.
Role-play conversations to make sure they’re natural and intuitive. A technique called ‘Wizard of Oz’ will help you quickly identify the problems in your dialogues. If you’re a Mac user, you can use a tool called Say Wizard to make things easier.
Prototype Your App
Now that we’ve written, mapped and tested our dialogues we can finally move on to designing and prototyping the experience. Adobe XD makes it easy for designers to create a working prototype for voice-enabled Amazon or Google apps and test it with real users. The tool allows you to prototype the actual voice inputs and outputs for the app. A typical interaction consists of user input and system responses:
To design user requests, we need to create voice triggers. To add a new voice trigger, drag a connector from an element in one artboard to another. When the attributes menu opens, select Voice from the Trigger menu and add your utterance in the Command field.
Speech Playback will simulate the response of the voice app. To add Speech Playback, you need to select Time as the Trigger and set the action to Speech Playback.
Adobe XD allows you to prototype for voice-first products like the Amazon Echo Show, and voice-only products such as Google Home.
Last but not least, if you design an Amazon Alexa Skill for the Amazon Echo Show or Amazon Echo Spot, XD provides a VUI kit for those devices. You can download it here. The kit provides all the building blocks you need to get started building an Alexa skill.
Testing is a mandatory part of the design process. Without testing, you can’t say whether your app will work for your users or not.
Test Your Prototypes With Target Users
Conduct usability testing sessions with representatives of your target audience, and observe how users interact with your app. Track the task completion rate and CSAT (Customer Satisfaction Score). If possible, try to record a video of each session.
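If you record a pass/fail outcome and a 1–5 satisfaction rating per session, both metrics are straightforward to compute. A minimal sketch — the “satisfied at 4 or above” threshold is a common convention, not a fixed standard:

```python
def completion_rate(outcomes):
    """Share of sessions in which the participant completed the task.
    `outcomes` is a list of booleans, one per session."""
    return sum(outcomes) / len(outcomes)

def csat(ratings, satisfied_from=4):
    """Customer Satisfaction Score: the percentage of 1-5 ratings at
    or above the 'satisfied' threshold (4 is a common convention)."""
    return 100 * sum(1 for r in ratings if r >= satisfied_from) / len(ratings)

print(completion_rate([True, True, False, True]))  # 0.75
print(csat([5, 4, 3, 2, 5]))                       # 60.0
```

Tracking both numbers across test rounds gives you a simple, comparable signal of whether dialogue changes actually help.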
Use Test Simulators
Both Amazon and Google provide testing tools that let you test your Skill or Action in a simulation of the hardware devices and their settings. This testing will give you a good feel for the voice experience in the real world.
Refine The Voice Application After Launch
Once you’ve rolled out your app, you should track how it is being used with analytics. Here are some of the key metrics to keep an eye on:
Intents and utterances,
User engagement metrics,
You’ll find most of the metrics you need within your Skill developer account, without any additional coding.
Human-computer interaction has never been about graphical user interfaces. First and foremost, it has always been about communication. It’s evident that voice will be a natural way for the new generation of users to interact with technology, and as a designer, you should be ready for these new challenges and the opportunities they unlock for new ways of looking at interaction design.
Fitts’s Law provides a model of human movement, established in 1954 by Paul Fitts, which can accurately predict the amount of time taken to move to and select a target. Although originally developed according to movement in the physical world, in human-computer interaction Fitts’s Law is typically applied to movement through the graphical user interface using a cursor or other type of pointer. Fitts’s Law has been formulated mathematically in a number of ways; however, its predictions are consistent across the many different mathematical representations.
Put simply, Fitts’s Law states “…the time to acquire a target is a function of the distance to and size of the target”. As the distance increases, movement takes longer and as the size decreases selection again takes longer. Whilst Paul Fitts established his law of movement before the advent of graphical user interfaces, the law is no less robust when applied to navigation through the virtual world.
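One widely used mathematical representation is the Shannon formulation, common in HCI research. Here MT is the movement time, D the distance from the start point to the centre of the target, W the width of the target, and a and b are empirically determined constants for the pointing device:

```latex
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

The logarithm captures the law's core prediction: doubling the distance or halving the width of a target adds a roughly constant increment to selection time, rather than doubling it.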
Applying Fitts’s Law to User Interface Design
The size of a target and its distance from the user’s current position within the user interface affect user experience in a number of ways. Some of the major implications for user interface design and user experience in turn are considered below:
1. Command buttons and any other interactive elements in the graphical user interface must be distinguished from non-interactive elements by size. Whilst it may seem obvious, user interface design often ignores the fact that the larger a button is, the easier it is to click with a pointing device. As interactive objects decrease in size, their smaller surface area demands a level of precision that increases selection times.
2. The outer edges and corners of the graphical user interface can be acquired with greater speed than anywhere else in the display, due to the pinning action of the screen. Because the pointing device can move no further once it reaches the outermost points of the screen, the cursor is effectively pinned at the periphery of the display.
3. Pop-up menus better support immediate selection of interactive elements than dropdown menus as the user does not have to move the cursor from its current position. Therefore, graphical designs that allow the user to interact without moving help to reduce the ‘travel time’.
4. Selecting options within linear menus, whether vertical (e.g. dropdown menus) or horizontal (e.g. top-level navigation), takes longer than clicking options in pie menus – where choices are arranged in a circle. Travelling distance is the same for all options in pie menus, unlike linear menus where distance increases the further along or down the list of options the user goes. In addition, the size of target areas is large in the pie menu, with the wedge-shaped buttons affording a larger margin for error when moving the cursor.
5. Task bars impede movement through the interface as they require a more time-consuming level of precision than when options are placed on the outer limits of the screen. Although unconnected to Fitts’s Law, multiple task bars can introduce a certain level of confusion or at the very least require the user to engage consciously with the screen arrangement to ensure appropriate selection.
A Simple Rule
The aim of user interface design should be to reduce the distance from one point to the next and make the target object large enough to enable prompt detection and selection of interactive elements without sacrificing accuracy. One particular aspect of this is ensuring that users are able to click anywhere on an interactive element to carry out the assigned action. For example, links, whether they are in the form of text, images or buttons (such as those seen in the ASOS screenshot below), should afford clicking on the whole region.
Unfortunately, in web design especially, users are often forced to position the cursor directly over a specific portion of a link in order to interact. It is important to design according to the users’ expectations; they should be able to click anywhere within the clearly defined boundaries.
Mobile designs need to take into account the way that users work with a mobile phone. That means understanding that distractions can come into play when the smartphone is in use and also ensuring that you make the input process as simple as possible to counteract their impact. Luke Wroblewski suggests the “one thumb, one eyeball” test as an efficient way of coming to grips with this problem. It may help make your mobile designs more user friendly and enhance the mobile user experience.
Analysing usage patterns for smartphones is a complicated affair. The ThinSlices app design site takes a close look at how smartphones are utilized in daily life and some of the emerging patterns of use for mobile devices.
The Usage of Smartphones
ThinSlices offers these insights into mobile usage:
People use their phones at home rather than at work in 68% of cases
72% of smartphone users don’t let their phones out of their reach at any time
More than half of all mobile phones are now smartphones
Half of all smartphone users consider their phone to be their primary access point for the Internet
There are seven categories of smartphone usage, and just three of them – socializing, shopping and “me time” – account for 77% of all time spent on the smartphone (Source: Harvard Business Review). The full list of categories is:
socializing (interacting with other people),
shopping,
self-expression (interests and hobbies),
discovery (seeking information),
preparation (getting ready for other activities),
accomplishing (managing health, productivity or finances),
“me time” (relaxing or being entertained). “Me time” is the most dominant usage, with 46% of all time spent on a smartphone dedicated to this activity.
Nearly half of mobile users only use their smartphones for traditional phone activities – calling and texting. These users do not download apps or surf the web, indicating the potential for growth in the app/mobile web space on smartphones.
The most used category of apps in the world is social apps (Facebook, YouTube, Google+, WeChat, Twitter, Skype, WhatsApp and Instagram), but the single most used app is Google Maps (suggesting that “on the go” access is very common on smartphones).
Usage times vary by culture (Chinese use theirs most after lunch, Europeans in the afternoon and Americans in the evenings).
The average user interacts with their phone 150 times a day!
This data is interesting not just because of the opportunities it represents for UX designers in terms of the app market itself, but also because it suggests that there is no standard pattern of mobile usage either. This confirms that the best UX design for mobile is one which will take into account the possibility of the user being distracted from the task(s) they set out to do.
The One Thumb, One Eyeball Test for Good Mobile Design
Luke Wroblewski, Product Director at Google, notes that in a distracted environment, the best form of interaction with a smartphone is one which delivers high speed interaction with very easy to use functionality. He calls the typical mobile usage experience a “one thumb, one eyeball” experience, since the highly distracted environment causes most mobile users to engage in one-handed use with short spans of partial attention.
The one thumb, one eyeball test is thus about finding out if your mobile design allows users to easily use the app with one hand and partially distracted attention. In other words: Can users perform a certain number of tasks with just one hand in under 60 seconds?
If an interaction is measured in minutes or seconds, anything that complicates it is likely to hinder the user experience. Users who engage with their smartphones 150 times a day are often not going to have the time to spend 5-10 minutes working out how to interact with an app or mobile website. They expect you to cater for their “need for speed” in the design, and if you don’t, they’ll go elsewhere to someone who will.
The one thumb, one eyeball test was proposed by Luke during the design of “Polar”, an app designed to create photo polls and allow voting on them. The objective was that a user should be able to create a new poll in less than a minute using only one thumb to do so.
The results were impressive: Luke’s team delivered a process so simple that most users could create a new poll in thirty seconds. They also tested whether voice input could deliver a faster experience, and concluded that it wasn’t significantly faster than the one-thumb input process. (Note: this may be because users are more familiar with one-thumb input than with voice input, and the efficiencies might improve as voice input becomes a more widely used form of interaction with smartphones.)
User experience designers may find that the one thumb, one eyeball test is a great way to conduct simple usability research for mobile apps and mobile websites. Certainly, it will not be an expensive test to conduct and may appeal to even the smallest design/development teams on the tightest budgets.
The Take Away
Mobile app and web use is very different from the desktop. Users face a variety of different situations throughout the course of the day and in order to deliver a high-quality user experience, designers need to optimize the interaction with their products to enable the highest chances of user acceptance. The one thumb, one eyeball test is a simple measure to see if the design delivers this simplicity of interaction.
For existing products, it’s also recommended to check the number of times each color is actually used in code. By doing that you’ll notice that some of the colors are used in many different places, while others are used only once. The insight you get will help you organize your palette.
2. Let brand colors form the basis
If you’re working for an established brand, most probably the brand has an established brand palette of colors.
Don’t modify the brand colors for UI design
You shouldn’t get creative with brand colors, because by doing that you’ll deviate from the brand guidelines.
Try to use the primary brand color for most of the “chrome” of the app
Try to use brand colors for the UI elements that make up the structure of your design — headers, footers, menus, etc.
If your brand color works both for light and dark background, you can use it as a layout color for your design.
3. Define foundational colors
Establish your whites and blacks
When it comes to selecting whites and blacks, it’s better not to choose extreme versions of these colors. White doesn’t have to be absolute white (#FFFFFF) and, similarly, black doesn’t have to be absolute black.
Depending on how strict you want to be with your color palette, you may want to include a range of tints (a color mixed with white) and shades (a color mixed with black). But be careful! Having too many tints and shades can make the procedure of color selection harder for designers.
Find a low contrast neutral color
Low contrast neutral colors are bad for elements that require reading but absolutely fine for elements like input fields. Input fields don’t need to stand out very much, so a low contrast neutral color can help you create a perfect UI container.
Limit the number of primary colors
Ideally, you should have a small number of approved primary colors (1–3 primaries that represent your brand) and a sufficient number of accent colors.
4. Define interactive colors
Interactive colors are colors that we use for interactive elements — buttons, links, and other UI controls that users can click or touch.
Limit the total number of interactive colors
If possible, try to use only one color as your primary interactive color. By doing that you’ll help your user memorize this color.
You can create lighter and darker versions of your interactive color. Color shades can help you convey different states for your UI elements — for example, a pressed state/hover state.
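Lighter and darker variants don't have to be hand-picked; one common approach is to generate them by mixing the base color with white or black. A minimal sketch (the mixing technique is one convention among several, such as HSL lightness adjustment):

```python
def shade(hex_color, factor):
    """Lighten (factor > 0) or darken (factor < 0) a '#RRGGBB' color
    by mixing it toward white or black; factor is in [-1, 1]."""
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (1, 3, 5))
    target = 255 if factor > 0 else 0
    mix = abs(factor)
    r, g, b = (round(c + (target - c) * mix) for c in (r, g, b))
    return f"#{r:02X}{g:02X}{b:02X}"

print(shade("#3366CC", -0.2))  # a darker variant for a pressed state
print(shade("#3366CC", 0.2))   # a lighter variant for a hover state
```

Generated shades like these can then be checked for contrast before being added to the palette.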
Strive for consistency
Color can be a helpful wayfinding tool for your users. It’s a good idea to use the same color for links and buttons. By doing that you help your users to recognize interactive elements.
5. Define denotive colors
Denotive colors are colors that mean something. You’ll need to have colors for states such as error, warning, and success.
Error state color
Use a shade of red to indicate the error state. If one of your brand colors happens to be red, it’s better to avoid using it for error messages. Why? Because by doing that you make users associate your brand color with problems.
Success state color
Use a shade of green to indicate the success state. If one of your brand colors is green, it’s absolutely fine to use it for the success state: users will associate your brand color with a positive outcome.
Limit the number of denotive colors
Ideally, you should have only one color for error and another for success. Just be sure that the colors you choose work on both low- and high-contrast backgrounds.
Disabled state color
The disabled state is traditionally grayed out; usually, designers use a low opacity color for that. One crucial thing to remember when selecting a disabled state color: make sure it still has enough contrast to be readable for your users.
6. Clear naming conventions
If you’re saving your colors in the design system, make sure to give clear names for each color you use. Color names should be both easily understood and memorable. Both designers and developers should be able to easily refer to particular colors defined in the system.
Try to avoid using gradations of adjectives (lightBlue, darkBlue); use functional names instead — names that describe the color by its role and place in the UI.
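A sketch of what functional naming might look like as design tokens. The token names and hex values here are invented for illustration; real systems (e.g. Material Design) define their own schemes:

```python
# Adjective-based names say what a color looks like, not where it is
# used -- they break down as soon as the palette changes.
ADJECTIVE_NAMES = {"lightBlue": "#66B2FF", "darkBlue": "#004C99"}  # avoid

# Functional names describe the color's role in the UI, so designers
# and developers can refer to the same token unambiguously.
FUNCTIONAL_NAMES = {
    "color-action-primary": "#004C99",      # buttons, links
    "color-background-surface": "#FAFAFA",  # cards, panels
    "color-text-error": "#B00020",          # error messages
}

def token(name):
    """Look a color up by its functional role in the UI."""
    return FUNCTIONAL_NAMES[name]

print(token("color-text-error"))  # #B00020
```

With functional names, rebranding means changing a token's value once, rather than hunting down every misleadingly named "lightBlue".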
Create accessible color palettes so that people who are color blind are able to use your products.
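Accessibility can also be checked numerically. WCAG 2.x defines a contrast ratio based on the relative luminance of two sRGB colors, with 4.5:1 the minimum for body text at level AA. A direct implementation of the published formula:

```python
def relative_luminance(hex_color):
    """WCAG 2.x relative luminance of an sRGB color given as '#RRGGBB'."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (1, 3, 5))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, from 1:1 up to 21:1.
    WCAG AA requires at least 4.5:1 for normal body text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#FFFFFF", "#000000"), 1))  # 21.0 -- the maximum
```

Running every text/background pair in your palette through a check like this catches contrast failures before they reach users.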
This subject may seem incredibly “big” for a single article, but it’s about the specific nature of usability that we often overlook or confuse. With this appreciation, you’ll be able to design more effectively, and your website’s usership will be able to grow, too.
Usability replaced the outmoded label “user friendly” in the early 1990s, though it took time to settle on the definition we use now. Different approaches to what made a product “usable” splintered between viewing it with the product in mind (i.e., ergonomic design, such as a curved keyboard); viewing it from the point of view of the user (how much work and satisfaction/frustration he/she experiences using it); and viewing the user’s performance, which involves how easy the product is to use in the real world.
“Usability” refers to the ease of access and/or use of a product or website. It’s a sub-discipline of user experience design. Although user experience design (UX Design) and usability were once used interchangeably, we must now understand that usability provides an important contribution to UX; however, it’s not the whole of the experience. We can accurately measure usability.
A design is not usable or unusable per se; its features, together with the user, what the user wants to do with it, and the user’s environment in performing tasks, determine its level of usability. A usable interface has three main outcomes:
It should be easy for the user to become familiar with and competent in using the user interface on the first contact with the website. If we take a travel agent’s website that a designer has made well, the user should be able to move through the sequence of actions to book a ticket quickly.
It should be easy for users to achieve their objective through using the website. If a user has the goal of booking a flight, a good design will guide him/her through the easiest process to purchase that ticket.
It should be easy to recall the user interface and how to use it on subsequent visits. So, a good design on the travel agent’s site means the user should learn from the first time and book a second ticket just as easily.
This isn’t the only set of requirements for usability. For example, a usable interface will be relatively error-free when used.
We can measure usability throughout the development process, from wireframes to prototypes to the final deliverable. Testing can be done with paper and pencil but also remotely when we have higher-fidelity prototypes.
It’s important to analyze the users’ performance and concerns with a web design as early as possible. From there, we can apply a set of guidelines with a grain of salt; because they tend to be general, we need to adapt them to our specific area. Guidelines show a product’s features proven to improve usability. We can fine-tune design revisions according to these guidelines, as long as we look at all the dimensions. Sometimes, it might just involve tweaking a menu layout; or, it might involve looking much higher.
We have to consider the user at all points when determining usability. If our designs are to be “usable”, they have to pass the test with a minimum number of criteria. If our product were a mouse and not a website, we’d have to ensure that it conformed to standards (to receive that all-important “CE” imprint). For a website, it might be easier to explore how our design ranks alongside a competitor’s. Let’s go back to the travel agent’s and see where we might improve our design.
Our travel site:
Users can navigate to the “buy” button in 294 seconds, on average.
Returning users navigate to the “buy” button in 209 seconds, on average.
18% of users bought a ticket on finding the landing page.
42% of users went no further than the landing page.
Happy Huzzah’s Getcha There, Inc.:
Users can navigate to the “buy” button in 198 seconds, on average.
Returning users navigate to the “buy” button in 135 seconds, on average.
32% of users bought a ticket on finding the landing page.
12% of users went no further than the landing page.
Glancing at these metrics tells us something. We need to check out what “Happy Huzzah’s Getcha There, Inc.” is doing, because something’s certainly working there!
In addition to content, we have web development and design considerations for usability. These are (mainly) outlined as follows:
Servers used to host websites are a usability consideration. Two major factors to consider when selecting servers are:
Speed – Google ranks by usability to some extent. How quickly your page loads is one of the ranking factors — so, speed to load is also a Search Engine Optimization (SEO) concern. A website that’s slow to load and slow to respond turns users off. Servers influence how fast a page will load depending on their capacity, specialization, etc. Naturally, it’s not just servers that influence the speed of a page — the web designer has a lot of influence over this in the way he/she serves images, graphics, etc., too.
Downtime – During downtime, a website is completely inaccessible. It’s fair to say that most websites will experience the occasional moment of downtime when a server falls offline. However, some suffer more than most; choosing a reliable server enables the delivery of a better user experience. One bad experience might have a user shrug and come back later. More, and that user may go somewhere else.
Focus the HTML you use on delivering a better user experience. While, to date, only mobile websites benefit from user experience ranking on Google, it’s probably fair to infer that in the future this will also be true on all platforms. Some key considerations for your HTML include:
Use ALT tags – ALT tags are used in conjunction with images; they let you convey additional information about the image that isn’t displayed as part of the main text. ALT tags assist with indexing in search engines (they let you tell the search engine about the content of the image). They also help with screen-reader narration for visually impaired users.
404 Not Found Page – Broken links happen, particularly in large websites. While, ideally, you should test all links on a regular basis and repair any broken ones, it’s a good idea to have a plan for when a user encounters a broken link. That plan is the “404 Not Found Page” — a well-designed 404 page will try to assist the user in returning to a positive experience. The default 404 page isn’t helpful in this respect. Clunky and primitive, it gives users the impression that they’ve come to the end of an escalator that isn’t attached to a floor. They don’t want to fall off and land on an archaic message. As a designer, never lose sight of that. That little courtesy goes a long way.
The visual factors that impact the overall user experience are the factors where, normally, you, the designer, have the most control. That means paying careful attention to:
Font Size and Color – Choose fonts that are easy to read. That means high levels of contrast with the background and font sizes large enough for users to read easily. If some of your user base is elderly or visually impaired, make fonts larger.
Branding – Branding, in particular the company logo, helps users know where they are online. Based on eye movement patterns, the ideal place for the logo is the top-left corner of the screen. This is where users who read from left to right are most likely to look when first arriving on the site.
Layout Colors – Colors need to be consistent in order to convey branding and also to develop an aesthetic appeal. In addition, they must deliver readability. Often, they need to convey hierarchy of information, too.
Navigation – For users to get the most from a website, they need to get from point A (the entry point) to point B (where they want to be) as quickly and easily as possible. That means providing useful navigation systems, including (for larger websites) search functions, to facilitate that transition.
Content – The web designer may or may not be responsible for creating the website copy, but there are design elements in the way you display that copy for user experiences:
Headings – Organize content into manageable chunks through the use of headings, sub-headings, etc. This means developing a scheme for consistent display of each type of heading throughout the website, ensuring a consistent experience as users navigate around the site.
Paragraphs – Make paragraphs clear and easily recognizable to help prevent the user from being overwhelmed by a “wall of text”. You can also apply Gestalt principles to paragraphs to help better illustrate the relationships between blocks of content.
Website Usability Tools
Testing your website is easy, thanks to a host of tools. Many are free; some are freemium, others premium. Get one that works for your website, then let it gather the data about usability. Many let you test on your existing usership; you can tell from the data what they’re experiencing, what’s going right and not-so right. Here’s a list of some:
WebPage FX is a tool for testing the readability of content on a website.
Pingdom offers an insight into speed of response from your website.
An Element of User Experience
It would be wonderful if we could draw the borders of user experience as if it were a country on a map. Unfortunately, the reality is more fuzzy. As much as we like making sense of phenomena and applying frameworks, we must remember that users are people. As such, they make decisions steered by logic and emotions.
As we saw above, many designers get confused at the difference between usability and the larger branch of user experience. Core areas of the user experience include (Usability, 2014):
Usability: A measure of a user’s ability to arrive on a site, use it easily, and complete the desired task. Remember, we’re designing websites, where there is flow, rather than focusing on page design and assuming everything will flow later.
Useful content: The website should include enough information in an easily digestible format that users can make informed decisions. Keep Hick’s Law in mind here: streamline your design to be simple. Use restraint.
Desirable/Pleasurable Content: The best user experiences come when the user can form an emotional bond with the product or website. That means moving beyond usable and useful and on to developing content that creates that bond. Emotional design is a huge part of the user experience. An English grammar website that offers daily tips might prove itself useful. But if that tip is funny, users won’t only remember the rule; they may return for more!
Accessibility: For people with different levels of disability, online experiences can be deeply frustrating. There are a set of accessibility standards with which sites should conform to assist the visually impaired, the hearing impaired, the motion impaired, etc. Content for the learning disabled needs careful consideration in order to provide a more complete user experience, too.
Credibility: The trust that your website engenders in your users also plays a part in the user experience. One of the biggest concerns users have online is security (in many cases, they worry about privacy, too). Addressing these concerns through your design, for example by showing security features and having easily accessible policies regarding these concerns, can help create a sense of credibility for the user.
Naturally, the usability of a design is important. However, we need to consider usability alongside these other concerns to create a great user experience. The UX comes as much from graphical design, interaction design, content, etc. as it does from usability alone.
The Take Away
Usability refers to how easily a user interacts with a website or product. It comes under the heading of UX design, but is not the whole story of user experience design. In usability, we designers have to focus on three aspects in particular:
Users should find it easy and become proficient when using a design interface.
They should be able to achieve their goal easily through using that design.
They should be able to learn the interface easily, so that return visits are just as, if not more, easy.
We should analyze our web design when determining usability, taking into account everything from accessibility and usefulness of content to credibility and designing content users will enjoy. That means thinking ahead. Who are your users? Might they have trouble reading your text? Can you make them smile or laugh by adopting a fun tone (e.g., edutainment, blending entertainment and education, is useful when teaching)? Users will want to feel reassured that they are navigating securely. Make them feel so.
You also should consider the realities of the web. Finding a reliable server for your site that loads quickly is crucial. At the HTML level, you should use ALT tags and design a helpful catch page in case a link is broken.
Visual factors, including layout colors and content formatting are important, too. Having a good-looking site is all very well, but can users navigate easily?
Finally, test, test, and test. A plethora of website usability tools exist. Never underestimate the value of testing from an early stage. By working out where users click, for example, you’ll be well on track to learning their ways and how usable your site is.
Usability tests are critical for the success of any product. One must note that the scope of conducting the usability test is vast and can be practically done over any product from cloud-based software to futuristic gaming consoles.
However, in this blog we will discuss specifically how usability tests work for websites. The principles of web usability are similar to those for any other product, but they are even more important given that there are billions of websites today.
The point is that a lot of websites are similar to each other in plenty of ways. If you’re not standing out or being useful, users are likely to move on to the next website.
Procedure for Running Website Usability Test
Damian Rees, co-founder of Experience Solutions, explains how he adapted usability testing to websites for the most reliable results. Since the internet is so widely accessible, one of his core principles is to set criteria and expectations beforehand so that your tests recruit participants with the right level of technical proficiency. While doing so, here are four points to keep in mind:
#1. Ask users to behave naturally.
Your website must support multiple test cases and modes of use. Getting started with open-ended tasks will give you a preview of how users use your website outside of the testing environment.
#2. Allow users to finish the task as they want.
If users are getting sidetracked, let them be. In the real world, you won't be there to keep them on track. The purpose of the test is to see how a user interacts with your website.
#3. Do thorough competitor analysis.
Only testing your own site will rob you of the broader context. After all, it isn't just about how users interact with your website; it's about tailoring your website based on how they use other websites.
#4. Hide the testing site.
Simple psychology dictates that your users are bound to become self-conscious and less honest if you tell them which site you're testing. They may figure it out by the end, but the longer you delay the reveal, the more accurate the results.
Catering to these points will obviously help, though don't be too rigid. Keep a relaxed attitude and give users space, and you'll get more natural results.
Criteria for Running Website Usability Tests
When conducting a usability test for a website, certain criteria apply that might not be relevant to other products. Jacob Gube, founder of Six Revisions, a web and app development company, says that qualitative feedback alone is not enough for websites, because simple technical performance criteria like speed affect the user experience drastically.
There are six basic criteria that must be taken into account when running website usability tests:
#1. Content
Content is the heart of any website, and hence special stress is placed on this criterion. Pay extra attention to the site's legibility, comprehension, language and grammar, and ease of reading. There are various readability tools available on the internet for you to use.
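Many readability tools are built on formulas like the Flesch Reading Ease score (higher scores mean easier text; 60–70 roughly corresponds to plain English). As a rough illustration, here is a minimal Python sketch of that formula; the syllable counter is a crude vowel-group heuristic, so real tools will report slightly different numbers.

```python
import re

def count_syllables(word):
    """Crude heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

print(round(flesch_reading_ease("The cat sat on the mat. It was happy."), 1))
```

Running a score like this over your page copy gives a quick, repeatable signal to compare drafts, though it is no substitute for watching real users read.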
#2. Navigation
Navigation is critical to decreasing the bounce rate and increasing time on page. Make sure you place proper links, CTAs, etc. evenly on the page. In how many clicks can your users finish their tasks? Card sorting will help you answer this question.
#3. Task Success
One of the most important aspects of a usability test is finding out whether users are able to complete their tasks or not: for example, creating an account or placing an order. To get started with this criterion, assign an open task to users to analyze the task success rate, and then directly follow up with a single question.
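Task success rate is usually just the share of participants who completed the task. A minimal sketch, using hypothetical session outcomes:

```python
def task_success_rate(results):
    """Fraction of participants who completed the task.
    `results` maps participant id -> True (success) or False (failure)."""
    if not results:
        return 0.0
    return sum(results.values()) / len(results)

# Hypothetical outcomes for the task "create an account".
outcomes = {"p1": True, "p2": False, "p3": True, "p4": True, "p5": False}
print(f"{task_success_rate(outcomes):.0%}")  # prints "60%"
```

Binary success/failure is the simplest version; some teams also score partial completions, but the basic metric stays the same: completions divided by attempts.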
#4. UX Design
It is easy to lose sight of user satisfaction while focusing on the measurable criteria above. There are various methods (tests, interviews, field studies, diary studies) to help you assess this factor. The point is, just being usable is not enough; the experience should also be delightful.
#5. Speed
Do you like waiting for a website to load? Neither do your users. Speed is indeed significant for your website's success, since UX, functionality, and SEO performance all depend on it. Google PageSpeed Insights is a handy tool for measuring your website's speed.
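Before reaching for a full tool, you can get a quick sanity check by timing how long a page's raw HTML takes to fetch. This is only a rough proxy, since it ignores rendering, scripts, and images; the function name and URL below are illustrative.

```python
import time
import urllib.request

def measure_load_time(url, attempts=3):
    """Average time in seconds to fetch the raw HTML of a page.
    Measures server response and transfer only, not rendering."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

# Example usage (requires network access):
# print(f"{measure_load_time('https://example.com'):.2f}s")
```

Averaging over several attempts smooths out network jitter; for anything beyond a sanity check, use a dedicated tool that also measures render time and page weight.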
Eyetracking is particularly useful for understanding details in users’ reading behaviors and how they deal with advertisements. But new UX teams shouldn’t employ eyetracking for their initial usability studies. Only at the highest UX-maturity levels should a team start using eyetracking because eyetracking has several downsides, including:
difficulty of tracking the eyes of people with thick-rim eyeglasses or heavy eye makeup
Luckily there is now an alternative: instead of tracking users’ eyes, we can track their ears.
Eartracking in Its Infancy
We first became aware of the benefits of eartracking during our research with cats and dogs. Many animals have ears that visibly turn in the direction of their attention. Turning the ears is clearly an evolutionary adaptation that allows predators to keenly follow their prey, and it also enables potential prey to notice a stalking predator before it comes into visible range.
(Some dogs have floppy ears that don’t turn, but even these ears are adaptive: floppy-eared dogs are usually so cute that humans will feed them, and they won’t need to hunt in the first place. Also, floppy-eared dogs have high job security: the United States Transportation Security Administration has decided that the vast majority of bomb-sniffing dogs in airports should be floppy-eared breeds because of better passenger acceptance caused by extra cuteness.)
Humans don't have floppy ears, and our ears don't visibly turn in the direction of potential dangers or potential food. However, evolution has preserved vestiges of ear-turning muscles, as is clearly demonstrated by anybody who wiggles their ears. Humans have some ability to make small ear movements under conscious control. What is less commonly known is that we also exhibit subconscious micromovements of the ears. When analyzing ear movements as an indicator of human reaction, we look at two new biometrics: 1) the distance the ear moves, and 2) how many times the ear moves per second (known as microwiggles).
These micromovements are not observable by the naked eye, because the ear moves less than 0.1 mm (0.004 inches), and most people don't know that the human ear can microwiggle up to six times in one second. In fact, because microwiggles of the ear are so unobtrusive, they have not been the subject of serious research until now.
Ear–Mind Hypothesis Is Real
The main finding from our research is that the ear–mind hypothesis is as valid as the eye–mind hypothesis that underlies the use of eyetracking in user research. The eye–mind hypothesis states that people look at what they are interested in, which is why we can use measures of gaze direction to estimate what the user is attending to. Similarly, the (now confirmed) ear–mind hypothesis states that micromovements of the ear are directed toward things that are startling or surprising to the user.
Note the difference between the two sense–mind hypotheses: the eyes might look at anything that is merely interesting, while the ears react only to unexpected stimuli that are potentially of high interest or importance. This difference is obviously caused by the evolutionary background for micromovements of the ear. Fossils more than 160,000 years old have indicated that our ancestors had ear macromovements, about a thousand times more pronounced than our ear movements today. These movements supported survival in eat-or-be-eaten scenarios, where noticing surprising or startling things was of utmost importance.
Though small, ear micromovements can be picked up by an 8K video camera placed close enough to the ear that's being studied. (8K cameras are not yet common, but NHK in Japan has been experimenting with this next-generation video technology since December 2018 and was kind enough to lend us one of their cameras.)
A second technology advance now allows us to turn micromovement video streams into true eartracking and tell where the user's attention is directed. A machine-learning algorithm has been trained with 10,000 hours of video recordings from our most recent eartracking study, during which we tracked users' ears as they attempted standard tasks on a wide variety of websites. Unfortunately, running the resulting AI software in real time (which is obviously required for practical use of eartracking in a usability study, since, in order to ask followup questions after a task, the facilitator needs to know what the user attended to, as well as how far and how many times her ears microwiggled) currently requires a supercomputer rated at 50 petaflops.
Eartracking is not a highly sensitive technique: it can only capture environmental stimuli that receive a high level of interest. Thus, a phenomenon like banner blindness, which we have documented through our eyetracking studies, would not be something we'd expect to capture with eartracking, because we do not expect users to be surprised by advertising banners (unless they come with loud noises that play automatically). As a result, banners would not register in eartracking even if users did pay attention to these ads (which we know from eyetracking that they don't).
Another interesting contrast between eyetracking and eartracking relates to gender differences. First, let me point out that we almost never observe any substantial differences between male and female users in usability studies. In terms of the user interaction, users of both genders are equally annoyed by, for example, zigzag layouts that impede scanning.
However, there are certainly some differences in the content that different genders find interesting. For instance, men in our eyetracking research looked much more at certain body parts in photos. (For the sake of remaining a family website, we won’t go into further details, but the heatmaps are in our book.)
Eartracking found another interesting difference between male and female users: male ears twitched quite substantially when the user interface included pictures of wooly mammoths or stereo equipment. (In fact, these are the only instances in our entire research study where the micromovements reached their maximum extent of 0.1 mm, and up to five wiggles in one second. Nonmammoth and nonstereo UI elements rarely registered more than 0.05 mm, though photos of elephants scored 0.08 mm, possibly because of the resemblance between mammoths and elephants during the initial 100 ms of exposure.) In contrast, female ears didn't twitch any more on pages showing wooly mammoths or stereo equipment than on webpages with other pictures.
Why do men’s ears react more strongly than women’s ears to webpages with pictures of wooly mammoths? We can only speculate, but it’s likely that during the era of the cave people, it was mainly the men of the tribe who were assigned to hunt the wooly mammoth, and since such a kill would be a major win, the hunters got highly attuned to recognizing this animal. As for the stereo equipment, your guess is as good as mine.
The finding of gender differences in mammoth webpages was due to a lucky coincidence: we employed the new eartracking technology with a few of the users during our recent study of UX design for children, and happened to include the wooly-mammoth page on National Geographic Kids’ website. (Subsequently, we repeated this test with adult users and confirmed the finding.)
Eartracking Strengths over Eyetracking
It’s still early days in applying eartracking to UX research, but it seems a promising new methodology. Compared to eyetracking, eartracking has the advantage of being able to measure surprises, which will be particularly valuable for game user research. There’s also an obvious accessibility advantage when testing with people who are blind and can’t participate in eyetracking studies.
On the other hand, eartracking has some weaknesses that parallel the disadvantages of eyetracking. As we mentioned before, certain user characteristics are difficult for current eyetracking equipment. Similarly, eartracking equipment can’t track people with:
hair that covers the ears
diamond stud earrings that reflect light into the camera, or any ear cuffs
If your user research budget has an extra $100 million that you don’t know how to spend, and you have a highly advanced UX team and mature product team, consider allocating the money to an eartracking study.