A project to present how gesture control can be integrated into vehicles.
Eric Ries, author of Lean Startup, describes a startup as:
“A human institution designed to deliver a new product or service under conditions of extreme uncertainty”
I was responsible for defining the high-level structure of the project together with the team, designing flows and interactions, and conducting user testing.
User Interface Design
Usability Testing and Interviews
“Design isn’t just about beauty; it’s about market relevance and meaningful results.”
In a startup like uSens, I am always directly involved in the business side, learning the company strategy in order to design the right solutions for the company's business.
This project was initiated because our business manager was reaching out to several automotive companies, but we didn't have a proper product to showcase our technology.
We only had 2 weeks including implementation.
To showcase the performance of our hand tracking technology.
To demonstrate how the hand tracking technology can be used in a car.
For our business team to attract clients' interest and thus bring potential revenue to the company.
For the users
Be able to interact with the infotainment system using their hands.
Stay focused on driving and keep their eyes on the road.
Spend less time looking at the screen when interacting with the infotainment system.
Everything was NEW. It was new to the company, because we hadn't done anything in the automotive industry before. It was also new to the industry, so there wasn't much I could look up for reference.
Time for completing this project was very limited.
Technology limitations. The algorithm wasn't yet robust enough, and we had only a few gestures we could actually use. We had to figure out how to design with them in a good way.
Since everything was new to us, time was limited, and we could only design for the technology we had, we made some constraints and assumptions when designing this project.
After the company's business strategy was confirmed, we learned that this demo targets cars with smaller screens for their in-car infotainment systems. We discussed with the engineers and decided on a fairly common and simple approach: using an Android tablet to simulate an in-car infotainment screen, and our existing sensor to track the driver's hand motions.
RESEARCH & INSIGHTS
We interviewed our business manager to learn what kind of people he was talking to at other companies. We also did secondary research online. Based on the limited information we had, we defined our target audience, who are most likely our clients: around 25 to 50 years old, familiar with cars and existing infotainment systems, interested in new technologies, open-minded, and willing to try new things.
There are certain rules and regulations we need to follow in the safety category. Based on a survey conducted by the U.S. Army Human Engineering Laboratory, the single-glance time for a radio task should be between 0.5 and 1.5 seconds. At a speed of 65 mph, which equals about 29 meters per second, every moment the driver's eyes are off the road increases the chance of an accident. Our app should therefore require no more than one second per command, including the time to obtain information from the screen.
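To make these numbers concrete, here is a minimal sketch (the speed conversion and glance times come from the figures above; the helper name is my own) showing how far a car travels while the driver glances at the screen:

```python
MPH_TO_MPS = 0.44704  # 1 mph expressed in meters per second

def distance_during_glance(speed_mph: float, glance_s: float) -> float:
    """Distance in meters a car covers while the driver's eyes are off the road."""
    return speed_mph * MPH_TO_MPS * glance_s

# At 65 mph (~29 m/s), the upper-bound 1.5-second glance covers:
print(round(distance_during_glance(65, 1.5), 1))  # 43.6 meters
```

Even the "acceptable" 1.5-second glance means traveling more than 40 meters blind, which is why we aimed for under one second per command.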
PIONEERS AND COMPETITORS
Based on research, there are some existing concepts of hand interactions in cars including BMW’s gesture control system which has already been used on some of their cars.
There are two types of gestures: dynamic gestures and static gestures.
The dynamic gestures offer the user a "mouse-like" function, whereas the static gestures provide a "button-like" function.
The mouse-like operation requires the user's constant input and attention, whereas the button-like operation requires a one-time input and no sustained attention.
ANALYZE CURRENT GESTURES
After checking with the algorithm team, we found out that the only resource we had, the set of gestures we could use, was one they had trained for a different project. There weren't enough gestures, and many didn't make sense for a driver to use in a car. Since I didn't have the authority to tell the algorithm team what to do, I had to make sense of what I got through the power of design.
It would have been much better if we could have designed first and let the algorithm team train for a better user experience. However, there was not enough time for that. As a designer, I had to face the reality and the constraints, and design around them.
At this point, I started to see some patterns for connecting gestures with functions. Design plays an important role in bridging them and making sense of them.
Only a few apps are universally used in a vehicle, such as music, navigation, and climate control. Since time was short, we decided to build only music and a simple incoming-call feature for this short-term project.
The importance of affordance needs no introduction; it has become a fundamental element in understanding the psychological communication between users and objects.
Here comes the process of connecting gestures with functions. Many people use their left and right hands to remember and explain coordinate systems, so I adopted the same concept: I use the direction of the thumb to indicate directional functions, such as the next and previous buttons in the music app.
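The gesture-to-function mapping described above can be sketched as a simple lookup table. The gesture names and commands here are illustrative placeholders, not the actual set we shipped:

```python
# Hypothetical mapping of static gestures to music-app commands.
GESTURE_COMMANDS = {
    "thumb_right": "next_track",      # thumb points right -> directional "next"
    "thumb_left": "previous_track",   # thumb points left -> directional "previous"
    "open_palm": "play_pause",
    "fist": "mute",
}

def handle_gesture(gesture: str) -> str:
    """Return the command for a recognized gesture; ignore unknown ones."""
    return GESTURE_COMMANDS.get(gesture, "no_op")

print(handle_gesture("thumb_right"))  # next_track
```

Keeping the mapping in one table made it easy for us to reason about which gestures felt natural for which functions.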
As for the visual feedback, the color changes according to each action performed by the driver.
Motion was added to reinforce the visual feedback for the dynamic gestures - the more the hand moves, the more the card flips.
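The "more the hand moves, the more the card flips" behavior is essentially a clamped linear mapping from hand displacement to flip angle. A minimal sketch, with the swipe distance and angle range as assumed values:

```python
def flip_angle(hand_dx: float, full_swipe: float = 0.2) -> float:
    """Map horizontal hand displacement (meters) to a card flip angle (degrees).

    A full swipe of `full_swipe` meters flips the card 180 degrees;
    the fraction is clamped so partial motion yields a partial flip.
    """
    fraction = max(-1.0, min(1.0, hand_dx / full_swipe))
    return fraction * 180.0

print(flip_angle(0.1))  # 90.0: halfway through the swipe, the card is half flipped
print(flip_angle(0.5))  # 180.0: clamped at a full flip
```

Tying the animation directly to hand position gives the driver continuous confirmation that the system is tracking the gesture.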
We found out that people had trouble remembering all the gestures at the very beginning, which made it hard for first-time users to interact with the app at all because they weren't sure what to do with their hands.
So we made a short tutorial teaching all the right gestures.
We did simple testing on the tutorial as well, to see whether people understood a virtual skeleton hand performing the gestures. The results were not good. In the end, we decided to film real hands performing the gestures so that users could map them to their own hands much more easily.
Since a video explains a complex concept better than images and text, we decided to make a promotional video.
Gesture control has a steep learning curve.
Gesture control has a higher entry threshold but requires less attention.
There's no doubt that in many scenarios, gesture control is much better than pressing a button; the infotainment system is one example. While driving, it is important to keep the driver's attention on the road at all times. In other words, the infotainment system should require as little attention as possible. Unlike physical buttons and a touchscreen, where the user needs to remember the exact position of each function and spend time locating it each time, gesture control gives the user the freedom to perform a function without looking at the screen.
I believe that gesture control will someday become mainstream and people will eventually get used to it naturally, just as they did when Apple introduced the touchscreen iPhone to the world in 2007. Before that, when the first touchscreen phones came out in the early '90s, most people doubted their functionality and usability. But look at us now: are you not using a touchscreen phone?
Many people seem to have major concerns about taking away physical buttons. Take the removal of the home button from the iPhone as an example. It had been the iconic element of the iPhone since the first generation, and people complained about losing a button they had gotten so used to.
But now, people have started to enjoy the convenience of a full screen and find it hard to go back to a phone with a physical button on the front. Every other phone brand has started to follow this trend as well. This is how quickly a good new way of interaction can be introduced into people's lives and change their habits.
A similar trend is happening in the automotive industry. One example is Tesla. The Model S adopted a large breakthrough screen to replace the conventional one, and people were impressed. Then, in the Model 3, Tesla introduced touchscreen-centric operation and removed almost all physical buttons from the control panel. Again, it challenges the conventional way of interaction, and people seem to like it.
Eliminating tangible controls opens a new door for us to reconsider human-machine interaction. What if the screen is not a touchscreen? What if the screen is not reachable by hand? What if the screen is even bigger than Tesla's? There are many possibilities and opportunities for us to rethink and redesign the whole system of in-car interactions.