Thursday, December 20, 2018

Emotive Driving Is A "Thing" At Korean Auto Maker KIA

Much of the focus on self-driving cars has, naturally, centered on the driving technology; however, KIA feels the future user experience that a fully automated car can potentially deliver also deserves attention. Image Credit: KIA via MotorAuthority (2018)


Next month at the 2019 International CES show in Las Vegas, running from January 8th to January 11th, KIA plans to reveal a window, or better, a sensory space in which human and vehicle become one.

"READ" that ... Artifical Intelligence in real-time.

At CES 2019, Kia will look into the future, to a time when self-driving cars are the norm. Image Credit: KIA via MediaPost (2018)

This is edited and excerpted from MediaPost Communications -

Kia To Unveil In-Car Tech For 'Emotive Driving' Future 

In a move aimed at a post-autonomous driving era, Kia Motors plans to show new technologies, including one that analyzes a driver’s emotional state.

Kia’s ‘Space of Emotive Driving’ exhibit at the coming 2019 CES will highlight the concept of a new Real-time Emotion Adaptive Driving (or, READ) system, technology created in collaboration with the MIT Media Lab.

The READ system can optimize and personalize the cabin space of a vehicle by analyzing a driver’s emotional state in real time through artificial intelligence-based bio-signal recognition technology, according to Kia (a rough, illustrative sketch of such a loop appears just after this excerpt).

“We have developed READ system to create an interactive future mobility in-cabin space by converging cutting-edge vehicle control technology and AI-based emotional intelligence,” stated Albert Biermann, president and head of the research and development division of Kia Motors. “READ system will enable continuous communication between drivers and vehicles through the unspoken language of emotional feeling, thereby providing an optimized human senses-oriented space for drivers in real-time.”
[Reference Here]
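
To make the READ description above a bit more concrete, here is a minimal, purely hypothetical sketch of what a real-time emotion-adaptive loop could look like in software. The sensor fields, emotion labels, and cabin presets below are illustrative assumptions made for this post only; Kia and the MIT Media Lab have not published the actual inner workings of READ.

```python
# A minimal, hypothetical sketch of a real-time emotion-adaptive cabin loop.
# Nothing here reflects Kia's actual READ implementation; the sensor values,
# emotion labels, and cabin adjustments are illustrative stand-ins only.

from dataclasses import dataclass
from enum import Enum, auto


class Emotion(Enum):
    CALM = auto()
    STRESSED = auto()
    DROWSY = auto()


@dataclass
class BioSignals:
    heart_rate_bpm: float      # e.g. from a seat or wearable sensor
    facial_valence: float      # -1.0 (negative) .. 1.0 (positive), from a cabin camera
    eyelid_openness: float     # 0.0 (closed) .. 1.0 (fully open)


def classify_emotion(signals: BioSignals) -> Emotion:
    """Toy rule-based stand-in for an AI bio-signal recognition model."""
    if signals.eyelid_openness < 0.4:
        return Emotion.DROWSY
    if signals.heart_rate_bpm > 100 and signals.facial_valence < 0.0:
        return Emotion.STRESSED
    return Emotion.CALM


def adapt_cabin(emotion: Emotion) -> dict:
    """Map the inferred emotional state to hypothetical cabin settings."""
    presets = {
        Emotion.CALM:     {"lighting": "warm", "music": "ambient", "seat_massage": "off"},
        Emotion.STRESSED: {"lighting": "soft blue", "music": "slow tempo", "seat_massage": "gentle"},
        Emotion.DROWSY:   {"lighting": "bright cool", "music": "upbeat", "seat_massage": "pulsed"},
    }
    return presets[emotion]


if __name__ == "__main__":
    # Simulated sensor reading for one moment of a drive.
    sample = BioSignals(heart_rate_bpm=108.0, facial_valence=-0.3, eyelid_openness=0.9)
    state = classify_emotion(sample)
    print(state.name, adapt_cabin(state))
```

In a production system, the toy rule-based classifier above would presumably be replaced by a trained model running continuously against streaming cabin-sensor data, but the overall sense-classify-adapt loop would look broadly similar.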

The MIT Media Lab has helped by integrating an adaptation of its groundbreaking technology profiled in FORBES, which reported that MIT has created a special headset that allows one to communicate with a computer system by simply thinking what one wants to say to it, kind of like being able to issue a command to Amazon's Alexa or Apple's Siri telepathically.

It's called AlterEgo and it's a wearable device that attaches to one's jaw and face (don't worry, it's totally non-invasive and doesn't break the skin, although the prototype looks a little goofy), where electrodes pick up neuromuscular signals triggered when one says words in one's head. These signals aren't detectable when one looks at someone who is verbalizing internally.
[ht: FORBES - Reference Here]
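
For readers curious how "reading" silent speech might translate into software, here is a deliberately simple, hypothetical sketch. The electrode channel count, command templates, and nearest-neighbor matching are invented for illustration; the actual AlterEgo prototype relies on trained neural networks over real neuromuscular recordings, not the toy matcher shown here.

```python
# A minimal, hypothetical sketch of the AlterEgo idea: electrode readings from the
# jaw and face are matched to a silently "spoken" command. The channel count,
# template values, and command names are invented for illustration only.

import math

# Pretend each command was previously recorded as an averaged 4-channel
# neuromuscular signature (values are made up).
COMMAND_TEMPLATES = {
    "play music":  [0.82, 0.10, 0.55, 0.31],
    "call home":   [0.15, 0.77, 0.42, 0.68],
    "set a timer": [0.44, 0.33, 0.91, 0.12],
}


def euclidean(a: list[float], b: list[float]) -> float:
    """Distance between an incoming reading and a stored command signature."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def decode_subvocal(reading: list[float]) -> str:
    """Return the stored command whose signature is closest to the reading."""
    return min(COMMAND_TEMPLATES, key=lambda cmd: euclidean(reading, COMMAND_TEMPLATES[cmd]))


if __name__ == "__main__":
    # A simulated reading that should land nearest the "call home" template.
    silent_reading = [0.18, 0.74, 0.40, 0.65]
    print(decode_subvocal(silent_reading))  # -> call home
```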

... notes from The EDJE




TAGS: KIA, MIT, MIT Media Lab, Real-time Emotion Adaptive Driving, READ, AlterEgo, CES, The EDJE
