This is an HMI design challenge I completed at New York University, where my research centered on designing a future vision for self-driving systems. Although it was primarily a visual challenge, I still wanted to inform my visual decisions with user-centered methodology.
RESEARCH
The design question was very broad, so I started by dissecting it: with so many self-driving experiences out there, which specific scenario are we tackling? I began by conducting user surveys and interviews, asking, “What comes to mind when I talk about self-driving cars?”
Interestingly, people most often picture fully automated vehicles, the more futuristic, sometimes even sci-fi version of the technology. Among the findings, one stood out: many users reported feeling reluctant to adopt self-driving features because of safety concerns.
CONVERGE
With that research in mind, I centered this project on a North Star vision: a Level 5, fully automated system, because that is the area that needs the most UX support and user advocacy. Imagining that future, my role as a UX designer is to identify how we can facilitate the transition into a fully autonomous era so that users feel safe and place more trust in the system.
RESEARCH
Now, the next piece of the design question: there are so many HMI options, so which specific medium do we need?
I researched existing HMI methods, and that’s when I started looking at the AR heads-up display (HUD). I noticed that many companies currently use AR HUDs for navigation or to alert drivers to road conditions.
EXPERT INTERVIEW
To help me decide how to move into the design stage, I interviewed two experts who have worked extensively with self-driving vehicles. Here are some of the key insights they offered.
CONVERGE
DEVELOP
TESTING
Now, how do I bring this design to an audience? How do I test a future-vision project? I did not have much support at NYU in this area, so I had to come up with alternatives on my own.
The main feedback I received was along the lines of “it looks cool,” “it’s interesting to see,” and “great idea.” Ultimately, users still knew they were wearing a VR headset and were never in real danger, but it was a first step of testing I could take with limited resources.
NEXT STEPS
Ultimately, I’m still designing for a North Star vision, so I need to pull back and think about the present: how do we implement changes in the current reality, or two to five years into the future?
This project was my attempt at an HMI visual challenge in a problem space I’m very passionate about: informing my visual decisions with user-centered thinking and exploring how we can continue to serve users in a future heavily shaped by technology.