CINDY

Voice Interfaces

The next generation in-car voice system for Mercedes-Benz.

 

Project information

I spearheaded the complete redesign of the in-car voice system for the 2019 Mercedes-Benz system release. In-car voice systems have commonly been robotic, command-based, and limited. This redesign moved the next-generation system toward more organic speech interactions, supported by clear, understandable on-screen interactions.

The final product can be seen in the 2019 A-Class models. 

 
 

Mercedes-Benz R&D

role: UX, INTERACTION & VISUAL DESIGNER

Timeline: 2 years

 
 
 

impact 

Revolutionized the in-car voice system to behave more naturally and align more closely with the consumer's mental model of voice assistants.

 

Mercedes-Benz’s new MBUX in-car assistant and smart UI rocks
— TechCrunch
... the system worked [like] Amazon Alexa or Google Assistant [and] integrated well...
— The Verge
 
 

 
 
 

CONCEPT

I started by pinpointing the existing disconnects between speech dialogues and their corresponding on-screen interactions. My team defined principles for both digital and dialogue interactions. From there I dove deeper into the needs of the user: defining use cases, creating flows, prototyping, testing, producing final screen PSDs, and visualizing the voice feedback look & feel.

 

why use screen content to support the dialogue? 

  • Robust levels of information feedback
  • Reminders for where you were in the original conversation

 

dialogue & interaction flows

This project was the result of a joint effort between our speech team, development team, and my team (design). Our speech colleagues provided my team with some predetermined dialogue flows, which we assessed from a user experience perspective. Based on our findings, we proposed major dialogue changes.

 
 
Provided Interaction

Suggested Interaction

 
 

Speech Team Dialogue Flow (Provided)

Design Team Dialogue Flow (Suggested)

Above you can see an example of a provided dialogue flow. If a command's search came back empty, the entire interaction would simply cancel. I suggested that we provide a resolution by continuing the dialogue with the user and offering a way to complete the initial command. This "resolution" of interactions was adopted for the rest of the dialogue system.
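To make that pattern concrete, here is a minimal sketch of the resolution logic in C++ (all names and data are hypothetical; the production dialogue engine was far richer). Instead of cancelling on an empty result, the handler keeps the dialogue open and offers a way to complete the command:

    #include <iostream>
    #include <optional>
    #include <string>
    #include <vector>

    // Hypothetical point-of-interest lookup; returns nothing on a miss.
    std::optional<std::string> findPoi(const std::string& query,
                                       const std::vector<std::string>& db) {
        for (const auto& poi : db)
            if (poi.find(query) != std::string::npos) return poi;
        return std::nullopt;
    }

    int main() {
        std::vector<std::string> db = {"Thai Palace", "Sushi Bar"};

        if (auto hit = findPoi("Italian", db)) {
            std::cout << "Navigating to " << *hit << ".\n";
        } else {
            // Provided flow: a miss would simply cancel the interaction.
            // Suggested flow: keep the dialogue open and offer a resolution.
            std::cout << "I couldn't find any Italian restaurants nearby. "
                         "Shall I search in the next town?\n";
        }
    }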

 

architecture

The information architecture above is a result of that collaboration. It showcases the final version of the primary speech interaction flow with the resulting on-screen feedback. 

 

 

layouts and on-screen interactions

I defined the on-screen interactions to work primarily for voice & touch, and secondarily for the rotary dial. For example, the tiled results are designed so that the user can simply say "next page" or swipe to the next page; either interaction results in a paging transition to the next set of results. Concepts were user-tested to confirm usability and understandability.
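As a rough sketch of that input-agnostic design (all names are hypothetical, not the production architecture): voice and touch are normalized into the same abstract intent, so a single handler drives the paging transition no matter which modality produced it.

    #include <iostream>
    #include <string>

    enum class Intent { NextPage, PrevPage, None };

    // Both modalities resolve to the same abstract intent.
    Intent fromVoice(const std::string& utterance) {
        return utterance == "next page" ? Intent::NextPage : Intent::None;
    }
    Intent fromSwipe(int deltaX) {
        return deltaX < 0 ? Intent::NextPage
             : deltaX > 0 ? Intent::PrevPage
                          : Intent::None;
    }

    // One handler performs the paging transition for every input source.
    void handle(Intent i, int& page) {
        if (i == Intent::NextPage) ++page;
        if (i == Intent::PrevPage && page > 0) --page;
        std::cout << "Showing result page " << page << "\n";
    }

    int main() {
        int page = 0;
        handle(fromVoice("next page"), page);  // spoken command
        handle(fromSwipe(-120), page);         // swipe left
    }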

 

wireframes

These wireframes outline the assets & interactions needed throughout the voice UI, as defined by me and two other design colleagues. These particular screens are visual responses to dialogue interactions, including the voice-specific area, tiled results, map results, messages, and weather.

 
 
 

screens

 

I created all of the hero screens and layouts for the entire voice system in Photoshop: over 40 different screens, including 18 individual backgrounds for the weather screens to indicate different weather patterns for day and night (visible below).

 
 
 
 
 

feedback states

Additionally, I defined the main feedback states to keep the user informed of where s/he is in the interaction:

  • Waiting for Input
  • User Speaking
  • Processing
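As a rough sketch of the underlying logic (the transition conditions are simplified assumptions, not the production behavior), the three states form a small cycle:

    #include <iostream>

    enum class VoiceState { WaitingForInput, UserSpeaking, Processing };

    // Simplified, hypothetical transition logic for the three states.
    VoiceState next(VoiceState s, bool speechDetected, bool utteranceEnded) {
        switch (s) {
            case VoiceState::WaitingForInput:
                return speechDetected ? VoiceState::UserSpeaking : s;
            case VoiceState::UserSpeaking:
                return utteranceEnded ? VoiceState::Processing : s;
            case VoiceState::Processing:
                // Once the response is delivered, listen again.
                return VoiceState::WaitingForInput;
        }
        return s;
    }

    int main() {
        VoiceState s = VoiceState::WaitingForInput;
        s = next(s, true, false);   // speech detected -> User Speaking
        s = next(s, false, true);   // utterance ended -> Processing
        std::cout << static_cast<int>(s) << "\n";  // prints 2 (Processing)
    }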

An example can be seen in the video below. 

 

how it works

This early prototype illustrates the basic interaction flow that I defined and highlights the wave states that I designed.

 
 
 

Aqua Terra Culinary

A lunch ordering system for web and mobile.

 

project information

This project required a redesign of the web application for an internal-facing employee dashboard and a consumer-facing interface, plus the creation of a mobile application. After experiencing the original consumer-facing website, I determined the new site needed to be optimized for usability and understandability. Based on the client's preferences and budget, the mobile application would be a simple extension of the website, allowing only for ordering.

 
 

aqua terra culinary

role: UX, INTERACTION & VISUAL DESIGNER

Timeline: 2 months

 
 
 

impact

  • Reduction of steps
  • Clarification of information
  • Optimized for ease and accessibility  
 
 

 
 
There’s no nutritional information for the meals I’m looking at. Since my child has a nut allergy, I want to be extra careful about what I’m ordering.
— User
 

wireframe web

I identified pain points in the current web application through a usability test. To revitalize the areas that needed it, I reorganized the layout so that the important pieces of information are easy to view, and I shortened the checkout flow.

 
 
 
 

wireframe mobile

As a simplified version of the website, the mobile application focuses primarily on viewing the month’s lunch options, selecting, and ordering. 

 

 
 
 
 

Website

I utilized a bright and friendly color palette since the lunches were intended for elementary school children. Even though the ordering would mainly be done by their parents, the cheerful tones would differentiate the site from other purely functional websites.

 
 
 
 
 

Mobile

I carried the website's color palette over into the mobile application. In this example, the parent has two children he needs to order for. The two children are color-coded to ensure each gets the meal that s/he wants.

 
 

Voice Based AI 

Design the future intelligent car system.

 

project information

Inspired by J.A.R.V.I.S. in Iron Man, Samantha in Her, and countless other imaginary personifications of voice-based artificial intelligence, my team tackled defining in-car AI interaction methods of the very near future.

Some of these concepts, like understanding and responding to implicit dialogue, have already been implemented in the most recent Mercedes-Benz system (2019). 

 
 

Mercedes-Benz R&D

role: UX, INTERACTION & VISUAL DESIGNER

Timeline: 4 months

 
 
 

impact

  • Dramatic qualitative improvements in user feedback regarding
    • Convenience
    • Ease of use
  • Pushed the current production timelines to include implicit dialogue interactions 
 
 

 
 

concept

The areas of focus that my team defined as a result of prior research include:

  • Adaptivity
  • Personalization
  • Organic Interactions

Our concepts were built atop a machine learning base to ensure the system's ability to mold to the user's needs and personality, and to reflect natural speech patterns. Additionally, I defined how to provide feedback about the AI's states using cabin lighting and graphical elements.

 

Scope

In order to narrow the scope of this expansive task, my team wrote out and clustered pain points of the current system to define areas of focus. Below are the use cases I created for the brainstormed topics I was responsible for.

 
 
 

Define

I sketched solutions for the use cases I brainstormed. My team consolidated all ideas and formed design principles to align our results.  

 
 
 

general concept testing

My team created prototypes of the high-level system concepts, primarily testing:

  • Range of Proactivity
    • From interrupting a user's conversation to provide pertinent information
    • To less invasive cases, like gently suggesting a coffee stop along the way to work
  • Offering a smaller number of curated results rather than a large list
  • Content changing on the screen (not user-initiated)

We partnered with our in-house research team to perform qualitative tests. 

 

curated results

proactivity & memory

 
Interrupting me while I’m speaking to another person is kind of creepy. I don’t mind if the car collects data, I just don’t want to know that it’s doing it.
— User 9
 
I like that the system knows me, and I don’t have to go through a bunch of results to decide where I want to go.
— User 4
 

further concept testing

As a result of the general concept test, we were able to focus our attention on concepts that were well received, and of greater perceived value to the user. We refined the questions that we needed to answer so that ultimately we could provide the most accurate recommendations and guidelines to the production team.

For example, in the wireframes below I cover visibility of curated music selection & handling of car warnings. Many more variations and flows were created in Sketch, then prototyped through Flinto.

 

visual - digital

I explored visual variations of how an intelligent system could manifest digitally, utilizing space along the sides of the screen as well as designing options with depth. I tested motion and surfaces with Cinema 4D.

 
 
 
 

visual - environment

The entire cabin space was considered in designing how to communicate intelligent states. Below is a breakdown of each piece of the ecosystem.

 
 
 

Machine Learning 

Define a pattern for interacting with the system intelligence.

 

project information

As the car becomes more capable, the user has more opportunity to use the interior cabin space for other tasks. There's a period between a "dumb" system and a "smart" system when the system still needs to get to know the user's preferences. This project focused on creating interactions for this in-between time, defining how the user could interact with the still-learning system intelligence.

 
 

Mercedes-Benz R&D

role: ux & interaction designer

Timeline: 6 months

 
 
 

impact

Presented concepts to the Head of Design, Head of Engineering, and other company executives, prompting the company to rethink how intelligence is presented to the user.

 
 

 
 

concept

My team designed a system that modifies the interface as it gets to know the user over time. We tackled a home screen redesign, introduced a timeline history feature, and developed new interaction patterns. To effectively test and present these new concepts, we ultimately put together a prototyping setup where we could show cross-screen interactions and content changes across alternating scenarios.

Here I highlight one feature: the Rich Content Area.

 

User Journey

My team identified our users and, by assessing needs through multiple user journeys, ensured we accounted for every kind of scenario s/he would encounter at any time of day.

 
 
 

feature: rich content area

The Rich Content Area (RCA) is the interaction area created so the system can surface intelligent, contextual suggestions. Since its initial behavior is similar to a notification's, I spent time identifying the differences, sketching & wireframing the use cases associated with the RCA, and creating animations to showcase those differences.

 
 
 

notification

The notification drops down from the top of the screen and is a confirmation of a completed interaction; tapping the notification can result in a direct change of the on-screen content.

RCA (RICH CONTENT AREA)

The RCA interaction drops down from the top and offers the user contextually intelligent suggestions that require the user's approval before the system acts on them. The user can also adjust parts of the suggestion & see details when the RCA notification is swiped down and revealed. RCA interactions do not affect content on the main screen.
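As a rough sketch of that distinction (the types and fields are hypothetical, purely to contrast the behaviors): a notification models something already done, while an RCA suggestion stays pending until the user approves it, and acting on it never touches the main screen.

    #include <string>

    // Hypothetical model of a notification: it confirms a completed
    // interaction, and tapping it may change the main-screen content.
    struct Notification {
        std::string text;
    };

    // Hypothetical model of an RCA suggestion: nothing happens until
    // the user approves, and no call here mutates the main screen.
    struct RcaSuggestion {
        std::string text;       // e.g. "Start your usual playlist?"
        bool approved = false;  // pending until the user approves
        bool expanded = false;  // swiping down reveals adjustable details

        void expand()  { expanded = true; }
        void approve() { approved = true; }
    };

    int main() {
        RcaSuggestion s{"Start your usual playlist?"};
        s.expand();   // user swipes down to see and adjust details
        s.approve();  // only now does the system act on the suggestion
    }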

 
 
 

Create

I wireframed interaction variants for each feature. My team laser-cut and assembled a prototype setup with mounted iPads to simulate the individual and interdependent screen interactions, so we could test our multi-screen concepts.

 
 
 
 
 

prototype

Using a rich wireframe styling, my team created an interactive prototype that responded to touch with speech and UI changes. This particular prototype showcased the Rich Content Area and the types of suggestions that would surface.

 
 
 
 

Client Project 

A smart lockbox mobile application.

 

project information

I was hired as a freelance designer to create the visual design for a mobile application that would accompany the company's smart lockbox. The client had previously worked with another designer on a darker styling; I went in a lighter direction to encourage a more modern feel. Additionally, I advised on the interactions and architecture to ensure optimal understandability and usability.

 
 

startup client

role: freelance designer

 
 
 

impact

  • Created clear solutions for mobile interactions & navigability
  • New investment buy-in
 
 

 
 

Wireframes

I was given a series of wireframes to derive the visual styling and screens from. The flows weren't intuitive, so I suggested that my client and I work together to create a more straightforward access point for all the available features.

 
 
 

recommendations

Prior to beginning my process, I clarified what all of the important features were, why they were important, and what kind of functionality was required. I then quickly sketched out some ideas. Below are some examples of what I proposed.

 
 
 
 

collaboration

As our collaboration continued, we would discuss options and I would provide input, as visualized here.

 
 
 
 

styleguide

The main color palette was carried over from the previous-generation look & feel; however, the final results are quite different. I updated the typefaces and designed a new icon family.

 
 
 

screens

 
 

Multimodal Interactions 

Explore and define new system interaction methods.

 

project information

As in-car systems become more intelligent, there are more opportunities to provide the user with relevant, contextual information. Additionally, alternative ways to interact with digital products come to light, bringing the machine & human relationship closer to the organic way humans communicate with one another. This project tackled how to provide the user with appropriate information through understandable, natural interactions.

 
 

Mercedes-Benz R&D

role: interaction designer

Timeline: 2 months

 
 
 

impact

Showcased the versatility of alternative modes of interacting through a direct presentation to the Head of Design. 

 
 

 
 

concept

I explored different input methods (gaze, gestures, hovering, and so on) to determine which were best suited to the end goal of reducing the user's cognitive load. From there I defined the specific usages within each system feature to create an ease of use not currently found in today's in-car programs.

 

sketches

With a focus on the map feature, I explored many options for alternative methods of interacting, including:

  • Hover
  • On proximity
  • Gestures
  • Force Touch

I designed the output to change contextually based on where the user was in the system. Please take a look at my sketches below.

 
 
 

wireframes

Below are rich wireframes that I created showcasing the results of specific interactions. Many more variations and flows were created in Sketch, then prototyped in Flinto.

 
 
 

prototype

Press play to follow the user discovering places on the map.

 
 
 
 

Terafina, Inc. 

A financial tablet application.

 

project information

I partnered with the Terafina, Inc. product team to create a better experience for banking end users as they go through the online application form on a tablet. The main goal was to determine how to reduce the application drop-off rate over the course of a 2-week sprint.

 
 

Terafina, inc.

role: researcher, ux & interaction designer

Timeline: 2 weeks

 
 
 

impact

Improved rates of process completion within the final product.

 
 

 
 

Usability Test

To determine the pain points in the user flow, my team and I performed a usability study. I tested the application flow with 7 randomly chosen individuals. Below are our results.

 

It feels like a long process. I've gone through multiple pages, but I'm still on step one.
— User 2

 

Some major issues that the users encountered as they went through the application included:

  • Lacking a sense of progress through the form
  • Lacking a sense of security as a banking system 

As a result, my team and I determined that the most important thing to tackle during this two-week sprint was the design of the progress bar: it was the most noticeable missing piece of the system. Users repeatedly stated that they would have exited the form at stage 2 because there was no clear indication of where they were or how many pages were left.

 
 
 

sketches

As the primary point of reference for the user, the progress bar should reflect the number of steps still needed to complete the process & provide the user with a feeling of movement through the forms. Below are some of my sketches that explored those capabilities.

 
 
 

wireframes

I wireframed three progress bar variations that best resolved the issues that the users had described.

 
 
 

prototype

My team created rough prototypes to test in a comparative study. To determine which was the proper recommendation for our client, we measured success through the qualitative rating of three key components:

  1. Understanding of Current Location
  2. Progress through the Form
  3. Anticipation of Next Steps
 

final recommendation

 
 
 
 

 Human-Machine Communication 

Research project.

 

project information

I pitched this exploratory concept because I believe we need to define how people and machines can communicate with each other more organically. In human-to-human interactions, the majority of communication is nonverbal, yet machine-to-human communication has typically been explicit, through spoken or written dialogue. In theory, we can use alternative, non-explicit means (light, sound, touch, smell) to communicate more naturally with a user. I led the team in this exploration, primarily pushing research & ideation focused on a dynamic car cabin environment that could appropriately respond to a user's observed emotional state.

 
 

Mercedes-Benz R&D

role: lead, interaction designer, prototyper

Timeline: 2 months

 
 

 
 

Define

I created an in-car user journey, which I broke down by phases of emotional output. With my team, I then brainstormed what people thought would be appropriate responses to these emotions. Below I further define the expected response flows for specific primary emotions I determined were high priorities based on the original user journey.
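As a minimal sketch of what one of those response flows could look like in code (the emotions, responses, and mapping below are illustrative assumptions, not the project's actual findings):

    #include <iostream>
    #include <map>
    #include <string>

    enum class Emotion { Stress, Fatigue, Joy };

    // A cabin response pairs a primary form (dashboard/window light)
    // with a secondary form (sound); all values are placeholders.
    struct CabinResponse {
        std::string light;
        std::string sound;
    };

    int main() {
        std::map<Emotion, CabinResponse> flows = {
            {Emotion::Stress,  {"warm, dim ambient light", "low, calm chord"}},
            {Emotion::Fatigue, {"cool 'noon sun' brightness", "alerting tone"}},
            {Emotion::Joy,     {"pastel ambient pulse", "bright major chord"}},
        };

        CabinResponse r = flows[Emotion::Stress];
        std::cout << "Light: " << r.light << " | Sound: " << r.sound << "\n";
    }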

 

information areas

Within the cabin space, I assessed where primary and secondary forms of communication would be most effectively placed for the user. 

  • Dashboard and Window Areas: Primary Forms
  • Ceiling and Footwell Areas: Secondary Forms 
 

light

I decided to use our limited time to focus primarily on light and projection. I explored varying intensities and hues of light, testing how each would affect users individually on an emotional scale, and how much was distracting. I used an LED strip and an Arduino board setup for testing. Below are some sketches I created in response to emotional use cases.

With the use of microcontrollers and an SD card adapter, I programmed short color- & hue-changing animations onto an LED strip, which was then able to mimic the lighting environments below.
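A minimal Arduino-style sketch of that kind of animation (the pin, pixel count, and colors here are placeholders, and the actual rig read its frames from the SD card), cross-fading the strip between a warm "tungsten" tone and a cooler "overcast" tone:

    #include <Adafruit_NeoPixel.h>

    const int LED_PIN = 6;      // placeholder data pin
    const int NUM_PIXELS = 60;  // placeholder strip length
    Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

    void setup() {
      strip.begin();
      strip.show();  // start with all pixels off
    }

    void loop() {
      // Linear blend from warm (255, 147, 41) to cool (180, 200, 255).
      for (int step = 0; step <= 255; step++) {
        uint8_t r = map(step, 0, 255, 255, 180);
        uint8_t g = map(step, 0, 255, 147, 200);
        uint8_t b = map(step, 0, 255, 41, 255);
        for (int i = 0; i < NUM_PIXELS; i++) {
          strip.setPixelColor(i, strip.Color(r, g, b));
        }
        strip.show();
        delay(30);  // controls the animation speed
      }
    }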

Noon Sun: alert, jarring

Overcast: aloof, alert

Tungsten (lightbulb): lethargic, calm

 
 
 

sound

I also explored a number of ways sound could be used to communicate emotion. Utilizing a soundboard and music theory, I created tones and chords that would elicit a feeling. Some examples of the abstracted tones are below.
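As a hedged sketch of how such tones could be produced on a simple test rig (the pin and note choices are illustrative; the actual exploration used a soundboard): a major triad tends to read as bright and positive, a minor triad as more subdued, and since Arduino's tone() plays only one frequency at a time, the chord is arpeggiated.

    const int BUZZER_PIN = 8;  // placeholder piezo pin

    // A major triad (A, C#, E) vs. A minor triad (A, C, E), in Hz.
    const int MAJOR[3] = {440, 554, 659};
    const int MINOR[3] = {440, 523, 659};

    void playTriad(const int* notes) {
      for (int i = 0; i < 3; i++) {
        tone(BUZZER_PIN, notes[i], 200);  // 200 ms per note
        delay(250);
      }
      noTone(BUZZER_PIN);
    }

    void setup() {
      playTriad(MAJOR);  // brighter, "positive" cue
      delay(500);
      playTriad(MINOR);  // more subdued cue
    }

    void loop() {}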

 
 
 

haptics

Our sense of touch is an extremely powerful tool for discerning what's going on. I mainly used changes in temperature and material texture to enhance the situation.

 
 
 

setup build

As my team was exploring these concepts, I was also building a makeshift dashboard by molding and attaching acrylic pieces to a metal platform. This was used for physical prototyping of the concepts. 

 

environments

Below is a run-through of some light environments I created, covering the system's response to positivity, excitement, and relaxation.

  1. Interior with no modifications
  2. Soothing: the windows tint to create a darker, more cradling environment; water is projected onto the dashboard and, as the light catches it, flows serenely around the user; the undertone is warm.
  3. Positivity: pastel colors surround the user and pulse in an ambient manner; 2-5 minutes.
  4. Positivity: see the world through rose-colored glasses; the windows tint to a warmer hue so the user can enjoy his/her surroundings from a different perspective.
  5. Joy: minuscule LEDs embedded in the dashboard light up with a joyful shade of blue; enough to create an environment without an onslaught of light.
  6. Excitement: the ambient light pulses with a variety of colors; 1-2 minutes.
 
 
 
 

Project & Product Sharing 

A content sharing web application for large companies.

 

project information

I was asked to design a content sharing platform. Large companies, such as Mercedes-Benz, have multiple teams working on a variety of overlapping projects. The project goal was to reduce redundant work, promote collaboration, and encourage cross-functional behavior by offering a transparent platform for teams to share their work & product testing.

 
 

mercedes-Benz R&D

role: visual designer

timeline: 2 weeks

 
 

 
 

Proposals

The homepage's desired functionality was to allow for browsing, so I provided a variety of layout proposals that promoted that interaction. 

 
 
 
 
 

wireframes

My client liked the casual browsing that a standard grid layout afforded, but wanted a more unique look and feel. I decided to create a free-flowing template where the natural direction of the images would prompt the user to browse and keep finding new things.

Below is the base flow including a breakdown of the content page.

 
 
 

moodboard

I put together a moodboard emphasizing minimal colors, graphical geometric shapes, and warm accents. 

 
 

screens