Wednesday 23 June 2010

Talk at Automotive Interiors Expo 2010

Today I gave a talk at the Automotive Interiors Expo 2010 in Stuttgart. Traditionally this fair is about car seats, wood for the dashboard, colors and lights - and they are still there. The talks, however, were much more about digital technologies, and there was a lot on the human-machine interface. We are very interested in this topic (last year we started the auto-ui.org conference series) and published an article about automotive UI research in IEEE Pervasive Computing [1]. The slides for my talk, entitled "Multimodal human-computer interaction in the car: Novel interface and application concepts", are available online. I first introduced pervasive computing, then talked a little about an application platform for the car, and then gave an overview of some of our recent projects on automotive user interfaces, in particular:
  • Gazemarks, support for attention switching by eye tracking [2]
  • Vibrofeedback in the steering wheel [3]
  • Text input while driving [4] and gesture interaction on the steering wheel [5]
  • Video communication in the car [6]
  • The design space for automotive user interfaces [7, 8]
  • Our open-source driving simulator [9, 10] for evaluating the attention demands of secondary tasks
While walking back I saw a concept car (www.edag-light-car.com/) with an interesting public display integrated into its rear.



[1] Schmidt, A., Spiessl, W., and Kern, D. 2010. Driving Automotive User Interface Research. IEEE Pervasive Computing 9, 1 (Jan. 2010), 85-88. DOI= http://dx.doi.org/10.1109/MPRV.2010.3
[2] Kern, D., Marshall, P., and Schmidt, A. 2010. Gazemarks: gaze-based visual placeholders to ease attention switching. In Proceedings of the 28th international Conference on Human Factors in Computing Systems (Atlanta, Georgia, USA, April 10 - 15, 2010). CHI '10. ACM, New York, NY, 2093-2102. DOI= http://doi.acm.org/10.1145/1753326.1753646
[3] Kern, D., Hornecker, E., Marshall, P., Schmidt, A., and Rogers, Y. 2009. Enhancing Navigation Information with Tactile Output Embedded into the Steering Wheel. In Proceedings of the 7th international Conference on Pervasive Computing (Pervasive 2009, Nara, Japan). Springer LNCS 5538, 42-58.
[4] Kern, D., Schmidt, A., Arnsmann, J., Appelmann, T., Pararasasegaran, N., and Piepiera, B. 2009. Writing to your car: handwritten text input while driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI '09. ACM, New York, NY, 4705-4710. DOI= http://doi.acm.org/10.1145/1520340.1520724
[5] Pfeiffer, M., Kern, D., Schöning, J., Döring, T., Krüger, A., and Schmidt, A. 2010. A multi-touch enabled steering wheel: exploring the design space. In Proceedings of the 28th international Conference Extended Abstracts on Human Factors in Computing Systems (Atlanta, Georgia, USA, April 10 - 15, 2010). CHI EA '10. ACM, New York, NY, 3355-3360. DOI= http://doi.acm.org/10.1145/1753846.1753984
[6] Tai, G., Kern, D., and Schmidt, A. 2009. Bridging the Communication Gap: A Driver-Passenger Video Link. In Mensch und Computer 2009, Berlin, Germany.
[7] Kern, D. and Schmidt, A. 2009. Design space for driver-based automotive user interfaces. In Proceedings of the 1st international Conference on Automotive User interfaces and interactive Vehicular Applications (Essen, Germany, September 21 - 22, 2009). AutomotiveUI '09. ACM, New York, NY, 3-10. DOI= http://doi.acm.org/10.1145/1620509.1620511
[8] Photos of over 100 different car UIs from IAA 2010
[10] CARS: Open source software for the driving simulator.