Voice Interfaces

The next-generation in-car voice system for Mercedes-Benz.


project information

I spearheaded the complete redesign of the in-car voice system for the 2019 Mercedes-Benz system release. In-car voice systems have traditionally been robotic, command-based, and limited. This redesign moved the next-generation system toward more organic speech interactions, supported by understandable on-screen feedback.

The final product can be seen in the 2019 A-Class models. 


Mercedes-Benz R&D


Timeline: 2 years



Revolutionized in-car voice systems to behave more naturally and align more closely with the consumer's mental model of voice assistants.


Mercedes-Benz’s new MBUX in-car assistant and smart UI rocks
— TechCrunch
... the system worked [like] Amazon Alexa or Google Assistant [and] integrated well...
— The Verge



I started by pinpointing the disconnects between speech dialogues and their corresponding on-screen interactions. My team defined principles for both digital and dialogue interactions. From there I dove deeper into the needs of the user: defining use cases, creating flows, prototyping, testing, creating final screen PSDs, and visualizing the voice feedback look & feel.


why use screen content to support the dialogue? 

Robust levels of information feedback

Reminders for where you were in the original conversation.


dialogue & interaction flows

This project was a joint effort between our speech team, development team, and my team (design). Our speech colleagues provided my team with some predetermined dialogue flows, which we assessed from a user experience perspective. As a result, we proposed major dialogue changes based on our findings.

Provided Interaction

Suggested Interaction


Speech Team Dialogue Flow (Provided)

Design Team Dialogue Flow (Suggested)

Above you can see an example of a provided dialogue flow. If a command returned a negative result, the entire interaction would be canceled. I suggested that we provide a resolution instead: continue the dialogue with the user and offer to complete the initial command. This "resolution" pattern was adopted for the rest of the dialogue system.



The information architecture above is a result of that collaboration. It showcases the final version of the primary speech interaction flow with the resulting on-screen feedback. 



layouts and on-screen interactions

I defined the on-screen interactions to work primarily for voice & touch, and secondarily for the rotary dial. For example, the tiled results are designed so that the user can simply say "next page" or swipe to the next page; both interactions result in a paging transition to the next set of results. Concepts were user tested to confirm usability and understandability.
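The modality-agnostic paging described above can be sketched roughly as follows. This is an illustrative sketch, not the production code: the names (`Pager`, `handle_input`) and event tuples are assumptions, and the point is only that voice and gesture events normalize to the same paging action.

```python
class Pager:
    """Holds tiled results and tracks the current page."""

    def __init__(self, items, page_size=6):
        self.items = items
        self.page_size = page_size
        self.page = 0

    @property
    def max_page(self):
        return max(0, (len(self.items) - 1) // self.page_size)

    def next_page(self):
        # Both the "next page" voice command and the swipe gesture land here,
        # so the on-screen transition is identical for either modality.
        self.page = min(self.page + 1, self.max_page)
        start = self.page * self.page_size
        return self.items[start:start + self.page_size]


def handle_input(pager, event):
    """Map any input modality onto the shared paging action."""
    if event in (("voice", "next page"), ("gesture", "swipe_left")):
        return pager.next_page()
    return None
```

Designing around one shared action per intent is what keeps the modalities interchangeable: the screen never needs to know whether the user spoke or swiped.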



These wireframes outline the assets & interactions needed throughout the voice UI, as defined by me and two other design colleagues. These particular screens are visual responses to dialogue interactions, including: the voice-specific area, tiled results, map results, messages, and weather.




I created all of the hero screens and layouts for the entire voice system within Photoshop: over 40 different screens, including 18 individual backgrounds for the weather screens to indicate different weather patterns for day and night (visible below).


feedback states

Additionally, I defined the main feedback states to keep the user informed of where s/he is in the interaction.

  • Waiting for Input

  • User Speaking

  • Processing

An example can be seen in the video below. 
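The three states above can be sketched as a small state machine. This is an illustrative sketch only; the transition triggers (`speech_start`, `speech_end`, `result_ready`) are assumptions, not the production system's events.

```python
from enum import Enum


class FeedbackState(Enum):
    WAITING_FOR_INPUT = "waiting"    # system is listening, no speech yet
    USER_SPEAKING = "speaking"       # voice activity detected
    PROCESSING = "processing"        # utterance captured, intent being resolved


# Each (state, event) pair maps to the next feedback state the UI should show.
TRANSITIONS = {
    (FeedbackState.WAITING_FOR_INPUT, "speech_start"): FeedbackState.USER_SPEAKING,
    (FeedbackState.USER_SPEAKING, "speech_end"): FeedbackState.PROCESSING,
    (FeedbackState.PROCESSING, "result_ready"): FeedbackState.WAITING_FOR_INPUT,
}


def step(state, event):
    """Advance the feedback state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Tying each state to one distinct visual treatment is what lets the user always answer "is it my turn to talk?" at a glance.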


how it works

This was one of the many commercials Mercedes-Benz released for the in-car voice system.


Aqua Terra Culinary

A lunch ordering system for web and mobile.


project information

This project required a redesign of the web application for an internal-facing employee dashboard, a consumer-facing interface, and the creation of a mobile application. After experiencing the original consumer-facing website, I determined the new site needed to be optimized for usability and understandability. Based on the client's preferences and budget, the mobile application would be a simplified extension of the website, allowing only for ordering.


aqua terra culinary


Timeline: 2 months



  • Reduction of steps
  • Clarification of information
  • Optimized for ease and accessibility  

There’s no nutritional information for the meals I’m looking at. Since my child has a nut allergy, I want to be extra careful about what I’m ordering.
— User

wireframe web

I defined pain points of the current web application with a usability test. To revitalize the areas that needed it, I reorganized the layout so the most important pieces of information are easy to view, and shortened the number of steps in the checkout flow.


wireframe mobile

As a simplified version of the website, the mobile application focuses primarily on viewing the month’s lunch options, selecting, and ordering. 




I utilized a bright and friendly color palette since the lunches were intended for elementary school children. Even though the main ordering would be done by their parents, the cheerful tones would differentiate it from other functional websites.



I carried the website's color palette over into the mobile application. In this example, the parent has two children that he needs to order for. The two children are color coded to ensure each gets the meal that s/he wants.


Voice-Based AI 

Design the future intelligent car system.


project information

Like J.A.R.V.I.S. in Iron Man, Samantha in Her, and countless other imaginary personifications of voice-based artificial intelligence, my team tackled defining in-car AI methods of the very near future. 

Some of these concepts, like understanding and responding to implicit dialogue, have already been implemented in the most recent Mercedes-Benz system (2019). Most, however, will be spoken of in broad terms since these concepts aren’t yet in production.


Mercedes-Benz R&D


Timeline: 4 months



  • Dramatic qualitative improvements in user feedback regarding

    • Convenience

    • Ease of use

  • Pushed the current production timelines to include implicit dialogue interactions




My team focused on several large categories regarding how users could benefit from an in-car voice system.

Our concepts were built atop a machine learning base to ensure the system's ability to mold to the user's needs and personality, and to reflect natural speech patterns. Additionally, I further defined how to provide feedback on the AI states utilizing cabin lighting and graphical elements. 



In order to narrow the scope of this expansive task, my team wrote and clustered pain points of the current system to define areas of focus. We each took a few topics to own.



I sketched solutions for the use cases I brainstormed. My team consolidated all ideas and formed design principles to align our results.


general concept testing

My team created prototypes of the high-level system concepts to test several voice interactions we designed out:

  • Interrupting a user's conversation to provide information pertinent to the conversation

  • Using only the GUI to suggest a stop along the way

  • Offering a smaller number of curated results rather than a large list

We partnered with our in-house research team to perform qualitative tests. 

Interrupting me while I’m speaking to another person is kind of creepy. I don’t mind if the car collects data, I just don’t want to know that it’s doing it.
— User 9

further concept testing

As a result of the general concept test, we were able to focus our attention on concepts that were well received, and of greater perceived value to the user. We refined the questions that we needed to answer so that ultimately we could provide the most accurate recommendations and guidelines to the production team.


visual - digital

I explored visual variations of how an intelligent system could manifest digitally, utilizing space along the sides of the screen as well as designing options with depth. In Cinema 4D I tested motion and surfaces. 


visual - environment

The entire cabin space was considered in the design of communicating intelligent states. My team and I specified which areas would be most appropriate for communicating high-, moderate-, and low-importance suggestions.


Machine Learning 

Define a pattern for interacting with the system intelligence.


project information

As the car becomes more capable, the user has more opportunity to utilize the interior cabin space for other tasks. There's a period between a system that doesn't understand the user at all and a system that is human-like itself, during which the system still needs to be trained to learn about human behavior. This is the era that we're in right now. This project focused on creating interactions for this in-between time and on how the user could interact with the still-growing system intelligence. 

No designs are visualized due to confidentiality.


Mercedes-Benz R&D

role: ux & interaction designer

Timeline: 6 months



Presented concepts to the Head of Design, Head of Engineering, and other company executives, influencing the company to rethink the presentation of intelligence to the user. 




My team designed a system that modifies the interface as it gets to know the user more over time. We tackled several new features. To effectively test and present these new concepts, we put together a prototyping setup where we were able to show the multiple scenarios.


user journey

My team identified our users and ensured we accounted for every kind of scenario that s/he would encounter during any time of the day by assessing needs through multiple user journeys.



I wireframed interaction variants for each feature. My team laser cut and assembled a prototype setup where we could use mounted iPads to simulate the individual and interdependent screen interactions, allowing us to test our multi-screen concepts.



Using a rich wireframe styling, my team created an interactive prototype that responded to touch with speech and UI changes. I used the prototyping tool Flinto for parts that required only visual interactions. Our UX Engineer was able to program the voice interactions that my team designed.


Client Project 

A smart lockbox mobile application.


project information

I was hired as a freelance designer to design the visuals of a mobile application that would accompany the company's smart lockbox. The client had previously worked with another designer who had explored a darker styling; I went in a lighter direction to encourage a more modern feel. Additionally, I advised on the interactions and architecture to ensure optimal understandability and usability.


startup client

role: freelance designer



  • Created clear solutions for mobile interactions & navigability
  • New investment buy-in



I was given a series of wireframes to derive the visual styling and screens from. The flows weren't intuitive, so I suggested that my client and I work together to create a more straightforward access point for all the available features. 



Prior to beginning my process, I clarified what all of the important features were, why they were important, and what kind of functionality was required. I then quickly sketched out some ideas. Below are some examples of what I proposed.



As we continued our collaboration, we would discuss and I would provide input as visualized here.



The main color palette was carried over from the previous generation look & feel; however, the final results are quite different. I updated the typefaces and designed a new icon family.




Multimodal Interactions 

Explore and define new system interaction methods.


project information

As in-car systems become more intelligent, there are more opportunities to provide the user with relevant and contextual information. Additionally, alternative ways to interact with digital products come to light, bringing the machine-human relationship closer to the organic way that humans communicate with one another. This project tackled how to provide the user with appropriate information through understandable, natural interactions.

No designs are visualized here due to confidentiality.


Mercedes-Benz R&D

role: interaction designer

Timeline: 2 months



Showcased the versatility of alternative modes of interacting through a direct presentation to the Head of Design. 




I explored different input methods (gaze, gestures, hovering, and so on) to determine which were optimal for the end goal of reducing the user's cognitive load. From there I defined the specific usages within each system feature to create an ease of use not currently seen in today's in-car programs. 



With a focus on the map feature, I explored many options to use alternative methods of interacting including:

  • Hover

  • On proximity

  • Gestures

  • Force Touch

I designed the output to alter contextually based on where the user was in the system.



I then created rich wireframes showcasing the results of specific interactions. Many more variations and flows were created in Sketch, and then prototyped in Flinto.



I created a detailed prototype in Flinto for our UX Research team to test. The trickiest interaction to test was the "on approach." I utilized a Microsoft Kinect device to simulate a reaction to the user approaching the screen.
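An "on approach" trigger of this kind can be sketched as a simple distance threshold with hysteresis, so the UI doesn't flicker when a hand hovers near the boundary. This is a hedged sketch under assumptions: the distance reading stands in for whatever the Kinect reports, and the threshold values are invented for illustration.

```python
class ApproachDetector:
    """Toggle an 'approaching' state from a stream of distance readings."""

    def __init__(self, enter_m=0.35, exit_m=0.50):
        self.enter_m = enter_m    # closer than this -> user is approaching
        self.exit_m = exit_m      # farther than this -> back to ambient view
        self.approaching = False

    def update(self, distance_m):
        # Two separate thresholds (hysteresis) prevent rapid on/off flicker
        # when the user's hand hovers right at the boundary.
        if not self.approaching and distance_m < self.enter_m:
            self.approaching = True       # e.g. expand the map controls
        elif self.approaching and distance_m > self.exit_m:
            self.approaching = False      # collapse back to the ambient state
        return self.approaching
```

In a test rig, each depth frame would feed `update()`, and the prototype would swap between ambient and expanded layouts whenever the returned state changes.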


Terafina, Inc. 

A financial tablet application.


project information

I partnered with the Terafina, Inc. product team to create a better experience for banking end users as they go through the online application form on a tablet. The main goal was to determine how to reduce the application drop-off rate over the course of a 2-week sprint. 


Terafina, inc.

role: researcher, ux & interaction designer

Timeline: 2 weeks



Improved rates of process completion within the final product.



usability test

To determine the pain points in the user flow, my team and I performed a usability study. I tested the application flow with 7 randomly chosen individuals. Below are our results.


It feels like a long process. I've gone through multiple pages, but I'm still on step one.
— User 2


Some major issues that the users encountered as they went through the application included:

  • Lacking a sense of progress through the form
  • Lacking a sense of security as a banking system 

As a result, my team and I determined that the most important thing to tackle during this two-week sprint was the design of the progress bar. It was the most noticeable missing piece of the system. Users repeatedly stated that they would have exited the form at stage 2 because there was no clear indication of where they were or how many pages were left.



As the primary point of reference for the user, the progress bar should reflect the number of steps still needed to complete the process & provide the user with a feeling of movement through the forms. Below are some of my sketches that explored those capabilities.



I wireframed three progress bar variations that best resolved the issues that the users had described.



My team created rough prototypes to test in a comparative study. To determine which was the proper recommendation for our client, we measured success through the qualitative rating of three key components:

  1. Understanding of Current Location
  2. Progress through the Form
  3. Anticipation of Next Steps
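A comparative study like this can be summarized by averaging the component ratings per variant. The sketch below is purely illustrative: the variant names, rating values, and equal weighting of the three components are all assumptions, not the study's actual data or method.

```python
from statistics import mean

# Hypothetical 1-5 ratings per key component, per progress-bar variant.
ratings = {
    "variant_a": {"location": [4, 5, 4], "progress": [3, 4, 4], "anticipation": [4, 4, 5]},
    "variant_b": {"location": [3, 3, 4], "progress": [4, 4, 3], "anticipation": [3, 3, 3]},
}


def overall(variant_scores):
    """Average the three component means into one comparison score."""
    return mean(mean(values) for values in variant_scores.values())


# The recommendation is the variant with the highest overall score.
best = max(ratings, key=lambda name: overall(ratings[name]))
```

Keeping the three components separate until the final average also makes it easy to see whether a variant wins overall while still losing badly on one component.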

final recommendation


 Human-Machine Communication 

Research project.


project information

I pitched this exploratory concept because I believe we need to define how people and machines can communicate with each other more organically. In human-to-human interaction, the majority of communication is nonverbal, yet machine-to-human communication has typically been explicit, through spoken or written dialogue. In theory we can use alternative, non-explicit means (light, sound, touch, smell) to communicate more naturally with a user. I led the team in this exploration, primarily pushing research & ideation focused on a dynamic car cabin environment that could appropriately respond to a user's observed emotional state.

No designs are visualized here due to confidentiality. However, my process can be read about below.


Mercedes-Benz R&D

role: lead, interaction designer, prototyper

Timeline: 2 months




I created an in-car user journey that I broke down by phases of emotional output. With my team, I then brainstormed what would be appropriate responses to these emotions. Below I further define the expected response flows to specific primary emotions I determined were high priorities based on the original user journey.


information areas

Within the cabin space, I assessed where primary and secondary forms of communication would be most effectively placed for the user. 

  • Dashboard

  • Window

  • Footwell

  • Ceiling



I decided to use our limited amount of time to focus primarily on light and projection. I explored varying intensities and hues of light, testing how each would affect users individually on an emotional scale, and at what point it became distracting. I used an LED strip and an Arduino board setup for testing. 
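Sweeping hue and intensity conditions for such a test can be enumerated programmatically. This is a sketch under assumptions: the specific hues and intensity steps are invented, and in the actual rig the RGB values would drive the Arduino-controlled LED strip rather than sit in a list.

```python
import colorsys


def led_rgb(hue_deg, intensity):
    """Convert a hue (degrees) and intensity (0-1) to 8-bit RGB for an LED strip."""
    # Full saturation; intensity maps to the HSV value channel.
    r, g, b = colorsys.hsv_to_rgb(hue_deg / 360.0, 1.0, intensity)
    return (round(r * 255), round(g * 255), round(b * 255))


# Illustrative test matrix: three hues crossed with three intensity levels,
# giving nine light conditions to rate for emotional effect and distraction.
conditions = [(h, i) for h in (0, 120, 240) for i in (0.25, 0.5, 1.0)]
frames = [led_rgb(h, i) for h, i in conditions]
```

Generating the matrix this way keeps the study conditions reproducible: every participant sees exactly the same hue/intensity combinations in a known order.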



I also explored a number of ways sounds could be used to communicate emotion. Utilizing a soundboard and music theory, I created tones and chords that would elicit a feeling.
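The music-theory side of this can be sketched with equal temperament: each semitone multiplies a root frequency by 2^(1/12), and chord qualities are just interval patterns above the root. The specific chords and the emotional labels in the comments are illustrative assumptions; the original exploration used a soundboard rather than generated tones.

```python
def note_freq(root_hz, semitones):
    """Frequency of a pitch `semitones` above the root (12-tone equal temperament)."""
    return root_hz * 2 ** (semitones / 12)


def chord(root_hz, intervals):
    """Frequencies (Hz, rounded) for a chord built from semitone intervals."""
    return [round(note_freq(root_hz, s), 1) for s in intervals]


MAJOR_TRIAD = (0, 4, 7)   # commonly heard as bright / positive
MINOR_TRIAD = (0, 3, 7)   # commonly heard as darker / more somber

calm = chord(220.0, MAJOR_TRIAD)     # A3 major: a warm, settled sound
tense = chord(220.0, MINOR_TRIAD)    # A3 minor: the same root, darker color
```

Parameterizing chords by interval pattern makes it cheap to audition many root pitches and qualities against each target emotion.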



Our sense of touch is an extremely powerful tool for discerning what's going on around us. I mainly used changes in temperature and material textures to enhance the situation. 


setup build

As my team was exploring these concepts, I also built a makeshift dashboard by molding and attaching acrylic pieces to a metal platform. This was used for physical prototyping of the concepts. 



Through animations, and the physical prototype, I created several light environments that cover the system's response to the user emitting positivity, excitement, and relaxation.


Project & Product Sharing 

A content sharing web application for large companies.


project information

I was asked to design a content sharing platform. Large companies, such as Mercedes-Benz, have multiple teams working on a variety of overlapping projects. The project goal was to reduce redundant work, promote collaboration, and encourage cross-functional behavior by offering a transparent platform for teams to share their work & product testing. 


Mercedes-Benz R&D

role: visual designer

Timeline: 2 weeks




The homepage's desired functionality was to allow for browsing, so I provided a variety of layout proposals that promoted that interaction. 



My client liked the casual browsing that a standard grid layout afforded, but wanted a more unique look and feel. I decided to create a free-flowing template where the natural direction of the images would prompt the user to keep browsing and finding new things.

Below is the base flow including a breakdown of the content page.



I put together a moodboard emphasizing minimal colors, graphical geometric shapes, and warm accents.