The 2025 User Experience: Predictions for the Future of Personalized Technology

By Stefan Marti, Vice President of Future Experience & AI at HARMAN

History is witness to the many scientific leaders and technology visionaries who have tried to predict which innovations the future will hold. Not every prediction comes to fruition, but some have not been far off. We may not have flying cars like The Jetsons (yet), but self-driving cars are close to becoming a reality. At HARMAN, our engineers, designers, and developers keep a constant eye on the future in our best effort to anticipate the ‘next big thing.’ When it comes to what the next five or ten years will bring for the User Experience (UX) and personalized tech space, I have a few bets of my own…

Implicit Over Explicit User Interfaces
Within the next five years, I expect a broad transition from explicit to implicit user interfaces. Your computer’s keyboard is a typical example of an explicit interface: it does nothing without direct input from the user, and it certainly isn’t learning anything in the process. An implicit interface, by contrast, does not lie dormant until the user gives it a stimulus. Rather, it continuously measures the user’s cognitive, emotional, and physiological state, as well as their reactions, and responds automatically to ensure a better user experience.

We’re now working on a new platform at HARMAN that will gauge user experience as it relates to Over-the-Air (OTA) updates. Cars with OTA capabilities perform updates overnight, so when drivers get in the car the next morning, they are greeted with a notification that one or more of the vehicle’s systems, such as navigation or entertainment, have been updated. At HARMAN, we’re envisioning a Driver Monitoring System (DMS) that can assess how these updates are being received by using psychophysiological signals to understand the driver’s reactions. We are looking at metrics such as facial expressions, emotional responses, cognitive load, and heart rate variability. With advances like these in the DMS domain, which turn explicit user interfaces into implicit ones, we can help OEMs identify which system changes are performing well without any explicit input from the end user.
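To make this concrete, here is a minimal sketch of how a DMS might fold several psychophysiological signals into a single “reception score” for a given OTA update. The signal names, weights, and numbers are illustrative assumptions, not HARMAN’s actual model.

```python
from dataclasses import dataclass

@dataclass
class DriverSignals:
    """Illustrative psychophysiological readings, each normalized to 0..1."""
    facial_valence: float   # 0 = negative expression, 1 = positive
    cognitive_load: float   # 0 = relaxed, 1 = overloaded
    hrv_stress: float       # 0 = calm heart rate variability, 1 = stressed

def ota_reception_score(s: DriverSignals) -> float:
    """Fold the signals into one score; higher means the update landed well.
    The weights are made-up placeholders, not a production model."""
    return (0.5 * s.facial_valence
            + 0.3 * (1.0 - s.cognitive_load)
            + 0.2 * (1.0 - s.hrv_stress))

# Example: a driver reacting to this morning's navigation update.
reaction = DriverSignals(facial_valence=0.8, cognitive_load=0.3, hrv_stress=0.2)
print(f"reception score: {ota_reception_score(reaction):.2f}")  # prints 0.77
```

Aggregated across a fleet, scores like this could tell an OEM which update was well received and which one confused drivers, with no survey or button press involved.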

Cognitive Audio Tech
There’s nothing better than listening to music while driving, but when you hit heavy traffic or realize you’re lost, your first instinct is usually to turn the volume down: the additional stimuli are overwhelming and increase your cognitive workload. Today, you have to turn the volume down or pause the music manually, but what if your device could monitor your mental workload and adapt the audio automatically?
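As a rough illustration of where this could go, a workload-aware player might duck the volume whenever an estimated cognitive-load score crosses a threshold. The estimator, threshold, and ducking factor below are hypothetical, not a shipping feature.

```python
def adapt_volume(volume: float, cognitive_load: float,
                 high_load: float = 0.7, duck_factor: float = 0.4) -> float:
    """Duck playback volume when estimated workload is high.
    cognitive_load is assumed to be a 0..1 estimate derived from driving
    context (traffic, navigation state); the numbers are placeholders."""
    if cognitive_load >= high_load:
        return volume * duck_factor   # quieter, but not silent
    return volume                     # leave the music untouched

print(adapt_volume(0.8, cognitive_load=0.9))  # heavy traffic -> ~0.32
print(adapt_volume(0.8, cognitive_load=0.2))  # open road     -> 0.8
```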

At HARMAN, we’re exploring how to outfit headphones with EEG sensors to monitor the wearer’s cognitive load. Of course, there are products on the market that perform a simplified version of this task today – such as heart-rate monitoring on fitness trackers – but over time these devices will be able to measure how much attention the wearer is paying to the music by identifying their emotional and cognitive state through physiological markers. In other words, your headphones will be sophisticated enough to recognize when you are no longer paying full attention and will automatically pause or rewind the audio content based on that information.
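In code, the playback logic could look something like the sketch below. The EEG-derived attention estimate, the attention threshold, and the rewind window are all illustrative assumptions.

```python
ATTENTION_FLOOR = 0.4   # hypothetical threshold below which we intervene
REWIND_SECONDS = 15     # assumed window to replay the missed passage

class AttentionAwarePlayer:
    """Toy player driven by an external attention estimate in 0..1."""

    def __init__(self):
        self.position = 0.0   # playback position in seconds
        self.playing = True

    def tick(self, attention: float, dt: float = 1.0) -> None:
        if self.playing:
            self.position += dt
        if self.playing and attention < ATTENTION_FLOOR:
            # Listener drifted off: back up and pause until they return.
            self.position = max(0.0, self.position - REWIND_SECONDS)
            self.playing = False
        elif not self.playing and attention >= ATTENTION_FLOOR:
            self.playing = True   # attention restored: resume playback

player = AttentionAwarePlayer()
for att in [0.9, 0.8, 0.3, 0.2, 0.7]:   # simulated EEG attention samples
    player.tick(att)
    print(f"attention={att:.1f} playing={player.playing} pos={player.position:.0f}s")
```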

These future-facing solutions can also be applied in the automotive space to adapt the car’s interior environment (music, temperature, lighting color, etc.). Our engineers are working on solutions capable of observing the user’s mood and recognizing their emotions to automatically change the playlist or adjust the intensity of the air conditioning. This emphasis on machine learning, and the phasing out of explicit, continuous user commands, is the future of true personalization.
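A deliberately simplified version of that mood-to-cabin mapping might look like this; the emotion labels and presets are placeholders, not actual tuning.

```python
# Hypothetical mapping from a recognized emotion to cabin adjustments.
CABIN_PRESETS = {
    "stressed": {"playlist": "calm acoustic", "ac": "low",    "color": "soft blue"},
    "tired":    {"playlist": "upbeat pop",    "ac": "high",   "color": "cool white"},
    "happy":    {"playlist": "keep current",  "ac": "medium", "color": "warm amber"},
}

def adjust_cabin(detected_emotion: str) -> dict:
    """Pick interior settings for the detected mood (default: change nothing)."""
    return CABIN_PRESETS.get(detected_emotion,
                             {"playlist": "keep current", "ac": "unchanged",
                              "color": "unchanged"})

print(adjust_cabin("stressed"))
```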

Automatic Personalization
Eventually, of course, personalization will be largely automatic. Today you can personalize many devices by inputting your own preferences, but in the future that manual step will disappear as devices learn from you automatically.

Take, for example, a smart home thermostat. Maybe you like the temperature to be a bit cooler at night, so you actively set your preferred temperature. Eventually, the system learns your preference and begins to set itself automatically. However, this combination of explicit and implicit interfaces can lead to a negative user experience: you may find yourself needlessly double-checking whether the thermostat has already lowered the temperature or whether you need to adjust it manually. With a truly personalized device, this redundancy will be eliminated. The issue is that today’s devices aren’t capable of telling you what they have learned about you, which is why HARMAN is working on conversational interfaces that let users review the information the system has gathered so far and correct anything that is off. In the future, these devices will also be able to monitor physiological signals and combine that information with the environmental preferences they have already learned to automatically create the ideal environment.
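To sketch the idea, a learning thermostat could blend each manual override into a per-period preference and, crucially, be able to report back what it has learned. The exponential moving average and all the numbers here are illustrative assumptions, not HARMAN’s implementation.

```python
class LearningThermostat:
    """Toy thermostat that learns per-period setpoints from manual overrides.
    A production system would also model occupancy, weather, and more."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha    # how strongly a new override shifts the estimate
        self.learned = {}     # period name -> learned setpoint in deg C

    def manual_set(self, period: str, temperature: float) -> None:
        """Blend each manual adjustment into the learned preference."""
        old = self.learned.get(period, temperature)
        self.learned[period] = (1 - self.alpha) * old + self.alpha * temperature

    def explain(self) -> str:
        """The 'conversational' piece: report what has been learned so far."""
        if not self.learned:
            return "I haven't learned your preferences yet."
        parts = [f"{p}: {t:.1f} deg C" for p, t in self.learned.items()]
        return "So far I've learned -> " + ", ".join(parts)

t = LearningThermostat()
for temp in (19.0, 18.5, 18.0):   # a few nights of manual adjustments
    t.manual_set("night", temp)
print(t.explain())   # "So far I've learned -> night: 18.6 deg C"
```

Being able to ask the device what it believes, and to correct it in plain language, is what turns a black-box learner into something a user can actually trust.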

At HARMAN, our goal is to revolutionize how people interact with their digital ecosystem to improve their lives in meaningful and elegant ways. In the future, true personalization capabilities will allow our devices to improve our environments, habits, and experiences without constant commands from the user, thus creating seamless user experiences.