A white Mercedes mini-van with the blue Ergosign logo is parked in a courtyard.

Prototyping for autonomous driving: Ergosign usability testing

Lukas Flohr Senior UX Designer • Specialist Mobility

Esther Barra Senior Communication Manager

Dieter Wallach Founder • Managing Director

12/03/2024 • 7 minutes reading time

Where does the Wizard of Oz fit into the world of mobility? 

Picture yourself in a car, cruising down the road without a care in the world and without anyone actively driving the car. A long-talked-about scenario is finally on the verge of becoming reality, with a lot of research and testing going into ensuring that autonomous cars drive safely and that passengers feel at ease.

In a world where artificial intelligence calls the shots on the road, how do you trust a machine to navigate bustling streets and keep you safe? That is the focus of a recent study, “Prototyping Autonomous Vehicle Windshields with AR and Real-Time Object Detection Visualization: An On-Road Wizard-of-Oz Study”, by Dr. Lukas Flohr, Professor Dr. Dieter Wallach, Joseph S. Valiyaveettil, and Professor Antonio Krüger. They explored how transparent information and passenger-focused interfaces can help people understand and embrace the AI-powered revolution on our roads.

The Challenge

While autonomous vehicles (AVs) are approaching technological maturity, they still face substantial challenges in gaining public acceptance. Trust is a primary barrier to AV adoption, hinging on factors such as performance expectations and vehicle reliability. Privacy and security concerns, software errors, and potential hacker attacks further contribute to trust-related barriers. Additional user concerns encompass safety, usability, accessibility, and comfort.

Addressing these challenges is crucial for fostering the adoption of AV technology. A comprehensive understanding of users, systems, and their environment is vital in shaping human-computer interaction strategies to counteract these challenges.

A Solution 

Human-machine interfaces (HMIs) are the primary interaction point between passengers and level 4 and 5 AVs. They play a critical role in overcoming acceptance challenges and in building trust. Transparent internal communication has been identified as a key factor for high acceptance and a positive user experience (UX). Studies on HMIs, including augmented reality (AR) based concepts, demonstrate their potential to increase trust and situation awareness in automated vehicles. However, findings emphasize the importance of careful design and of weighing the amount of information provided, as excessive detail does not necessarily lead to increased trust. These insights underscore the nuanced relationship between information transparency and user trust. This project set out to answer three research questions by prototyping and testing through a Wizard-of-Oz approach:

  1. Can we increase AV passengers’ acceptance and UX by providing transparent system information via (AR-based) visualization of detected objects in the vehicle windshield? 

  2. How and when should this information be displayed during AV rides in urban environments? 

  3. How can we create a suitable prototyping framework to investigate research questions 1 and 2, as well as related questions, in complex urban real-life environments?

The Ergosign usability testing car with Lukas Flohr standing to the left and Joseph Valiyaveettil to the right.
Lukas Flohr, Joseph Valiyaveettil and the testing vehicle.

Toto, I've got a feeling we're not in Kansas anymore.

The deployment, and hence the testing, of driverless rides in complex urban environments faces significant limitations. Overcoming them is essential for empirical concept studies related to AVs. Context-based prototyping emerges as a viable approach to consider both the complex context and the user experience. Simulators may offer controllability and reproducibility, but they struggle to represent high-fidelity complex environments and remain confined to artificial lab contexts.

In contrast, Wizard-of-Oz (WoOz) set-ups can be applied in real-world environments on public roads, enabling the evaluation of AVs and their human-machine interfaces. The method involves hidden human drivers (wizards) simulating the automation. While effective in practice, WoOz studies present challenges in maintaining the illusion and ensuring consistent driving styles, so they require a carefully crafted cover story. Our ruse? We told participants that our AV had a state-of-the-art windshield with AR-based real-time object detection visualization, that we were testing this windshield, and that the set-up still required someone in the driver's seat for emergencies.

Prototyping to test the solution 

Our WoOz set-up simulated an AV ride through the city using an electric Mercedes-Benz EQV minivan, providing a flexible and easily replicable basis for future experiments. The windshield HMI prototype displayed real-time augmented reality (AR) bounding-box visualizations over detected objects. Three concept variants were designed: baseline information, counts of detected objects per class in a status bar, and a combination of both. These variants aimed to gauge how the amount of displayed information affects passengers' experience.
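To make the concept variants more tangible, here is a minimal sketch of how such a windshield overlay could be rendered. This is our illustration under assumptions, not the study's actual implementation: detections are assumed to arrive from an external object detector as class/bounding-box pairs, and OpenCV is used purely for drawing the three variants described above.

```python
# Minimal sketch (illustrative only): rendering the three HMI variants
# on top of a camera frame. Detections are assumed to come from some
# external object detector as (label, (x1, y1, x2, y2)) pairs.

from collections import Counter
from typing import Iterable, Tuple

import cv2
import numpy as np

Detection = Tuple[str, Tuple[int, int, int, int]]  # (class label, bounding box)


def render_hmi(frame: np.ndarray,
               detections: Iterable[Detection],
               variant: str = "combined") -> np.ndarray:
    """Overlay a windshield-style HMI onto a frame.

    variant: "boxes"    - AR-style bounding boxes over detected objects
             "counts"   - status bar with per-class object counts only
             "combined" - both
    """
    out = frame.copy()
    detections = list(detections)

    if variant in ("boxes", "combined"):
        for label, (x1, y1, x2, y2) in detections:
            cv2.rectangle(out, (x1, y1), (x2, y2), (0, 255, 255), 2)
            cv2.putText(out, label, (x1, max(0, y1 - 8)),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 255), 2)

    if variant in ("counts", "combined"):
        counts = Counter(label for label, _ in detections)
        bar = " | ".join(f"{cls}: {n}" for cls, n in sorted(counts.items()))
        cv2.rectangle(out, (0, 0), (out.shape[1], 36), (30, 30, 30), -1)
        cv2.putText(out, bar or "no objects detected", (12, 25),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (255, 255, 255), 2)

    return out


if __name__ == "__main__":
    # One synthetic frame with two hypothetical detections.
    frame = np.zeros((480, 854, 3), dtype=np.uint8)
    detections = [("car", (320, 220, 520, 340)), ("pedestrian", (120, 200, 170, 330))]
    cv2.imwrite("hmi_preview.png", render_hmi(frame, detections, "combined"))
```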

Our study involved 30 participants, with a balanced gender distribution and a diverse age range. Participants experienced test rides in the WoOz vehicle through an urban environment, with stops at parking lots to change HMI variants. Data collection included quantitative and qualitative measures, such as standardized questionnaires and single-item assessments of perceived risk, safety, well-being, and nausea.

The research methodology also incorporated a semi-structured interview and debriefing session to gather participants' feedback on their preferences. This comprehensive approach aimed to provide valuable insights into enhancing AV passengers' acceptance and UX by leveraging transparent system information, with a focus on the challenges posed by real urban environments. The study's findings can contribute to the development of effective and user-friendly HMIs for future autonomous vehicles.

A trip in the prototyping vehicle.

Results and key take-aways:  

  1. The study revealed that providing visual system feedback on detected objects significantly enhances the acceptance and UX of AVs. Participants expressed a strong preference for augmented reality (AR) overlays on the AV windshield, considering them instrumental in improving contextual understanding and trust in the system. However, some participants found them to be too distracting, favoring only general information on the overall system state. 

  2. Participants in the study advocated for configurable display settings in AVs, allowing them to control the presentation of certain information. In terms of content, most participants favored visualizing only objects impacting their ride, classified by hazard level, rather than constant visual feedback on all detected objects (see the sketch after this list).

  3. Maintaining the WoOz deception requires a thoughtful cover story and suitable hardware appearance, with participants' attention shifted toward the futuristic HMI prototype. 73% of the participants believed that the car was driving autonomously, which indicates a successful application of the WoOz paradigm.
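As a purely hypothetical illustration of the configurable, hazard-based display mentioned in point 2 (the hazard scores, thresholds, and field names are our assumptions, not part of the study's prototype), such a preference could translate into a simple filtering step before rendering:

```python
# Hypothetical sketch: show only objects likely to impact the ride,
# based on a passenger-configurable hazard threshold.

from dataclasses import dataclass
from typing import Iterable, List, Tuple


@dataclass
class ScoredDetection:
    label: str                          # e.g. "pedestrian", "car"
    box: Tuple[int, int, int, int]      # (x1, y1, x2, y2)
    hazard: float                       # 0.0 (irrelevant) .. 1.0 (directly impacts the ride)


@dataclass
class DisplaySettings:
    show_all_objects: bool = False      # constant feedback on every detected object
    hazard_threshold: float = 0.5       # otherwise: only objects above this level


def select_for_display(detections: Iterable[ScoredDetection],
                       settings: DisplaySettings) -> List[ScoredDetection]:
    """Return only the detections the passenger has chosen to see."""
    if settings.show_all_objects:
        return list(detections)
    return [d for d in detections if d.hazard >= settings.hazard_threshold]
```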

Limitations and potential for future work: 

Like most studies, this one had some limitations, which point to three areas for future research:

  1. The study sample: participants' affinity for technology was quite high, which may limit external validity and affect belief in the WoOz deception.

  2. The HMI prototype: a more holistic HMI with a less narrow focus on object detection would allow a broader examination of the topic. Likewise, a more powerful system would reduce performance limitations.

  3. The WoOz approach: while advantageous for dynamic urban context exploration, the method presents challenges in terms of objectivity and reliability. Overall, the framework created for the study is valuable for prototyping and may be extended to investigate various aspects of autonomous vehicle experiences and HMI concepts.

The future of mobility and AVs, and how UX research and design can contribute to the field, is a passion of ours. We look forward to shaping this future, as we already do in the STADT:up research project. You can read up on STADT:up here. Our experience in mobility does not stop there, however. We also contributed to APEROL, CARIAD (Audi), Thyssenkrupp's carvaloo, and even Lufthansa's AVIATOR platform.

Lukas is a Senior UX Designer and our Mobility Specialist. He has been with Ergosign since 2016 and is currently finishing his doctorate, focusing on interaction with autonomous vehicles. In his research, Lukas looks at context-based prototyping methods and the human-centered, collaborative development of interaction concepts for future mobility.

Lukas Flohr • Specialist Mobility

Dieter Wallach is a Ph.D. cognitive scientist who, as a UX pioneer and university lecturer, significantly influenced the German-speaking user experience scene. He is the founder and co-managing director of Ergosign. Dieter Wallach conducts research as a professor of Human-Computer Interaction and Usability Engineering in the Department of Computer Science and Microsystems Technology at the University of Kaiserslautern.

Dieter Wallach • Managing Director

The full citation of the above-mentioned paper:

Lukas A. Flohr, Joseph S. Valiyaveettil, Antonio Krüger, and Dieter P. Wallach. 2023. Prototyping Autonomous Vehicle Windshields with AR and Real-Time Object Detection Visualization: An On-Road Wizard-of-Oz Study. In Designing Interactive Systems Conference (DIS ’23), July 10–14, 2023, Pittsburgh, PA, USA. ACM, New York, NY, USA, 15 pages.