Thursday, 30 April 2009
Open Lab Day in Essen
Today we had an open lab day - our first one in Essen. We invited colleagues, admin staff, students, friends, and family to have a look at how we spend our days ;-) and at the interesting systems we create with our students and in our research projects. We had several applications running on our multi-touch table, showed two prototypes in the automotive domain (text input while driving and vibration feedback in the steering wheel), demonstrated a new form of interaction with a public display, and let people try an eye-tracking application.
Andreas Riener visits our lab
Andreas Riener from the University of Linz came to visit us for 3 days. In his research he works on multimodal and implicit interaction in the car. We talked about several ideas for new multimodal user interfaces. Andreas had a pressure mat with him and we could try out what sensor readings we get in different setups. It seems that providing redundancy in the controls in particular could create interesting opportunities - hopefully we find the means to explore this further.
Tuesday, 28 April 2009
Meeting on public display networks
Sunday night I travelled to Lugano for a meeting on public display networks. I figured that going there by night train was the best option - leaving Karlsruhe at midnight and arriving there at 6am. As I planned to sleep the whole way, my assumption was that the felt travel time would be zero. I made my plan without the rail company… the train was 2 hours late and I walked up and down the platform in Karlsruhe for 2 hours - and interestingly, the problem would have been far less annoying had the public displays provided the relevant information… The most annoying thing was that passengers had no information on if or when the train would come, and no one could tell (no one was at the station, nor was anyone taking calls at the hotline).
The public display - really nice state-of-the-art hardware - showed nothing for an hour, then showed that the train was one hour late (when it was already more than an hour past the scheduled time), and finally the train arrived 2 hours late (with the display still showing a 1-hour delay). How hard can it be to provide this information? It seems with current approaches it is too hard…
On my way back I observed a further example of the shortcomings of content on public displays. In the bus office they had a really nice 40-50 inch screen showing the teletext departure page. The problem: it was the teletext for the evening, as the staff have to switch the pages manually. Here too it is very clear that the information is available, but the current delivery systems are not well integrated.
In summary, it is really a pity how poorly public display infrastructures are used. There seem to be a lot of advances on the hardware side but little on the content delivery, software, and system side.
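To make the content-delivery gap concrete, here is a minimal sketch of what the missing software side could look like: a display client that renders whatever delay estimate the operator's feed provides and is honest when no information is available. The function names and message formats are my own illustrative assumptions, not any real operator API.

```python
# A sketch of the missing content-delivery step: render one departure line
# from a (hypothetical) delay feed, including an honest "unknown" state
# instead of a blank or stale screen.

def render_departure(scheduled, delay_minutes):
    """Format one departure line; delay_minutes=None means no estimate yet."""
    if delay_minutes is None:
        return f"{scheduled}  delay unknown - please ask staff"
    if delay_minutes == 0:
        return f"{scheduled}  on time"
    return f"{scheduled}  approx. {delay_minutes} min late"

print(render_departure("00:05", None))  # 00:05  delay unknown - please ask staff
print(render_departure("00:05", 120))   # 00:05  approx. 120 min late
```

Even this trivial fallback logic would have been more informative that night than state-of-the-art hardware showing nothing.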
Labels: displays, project-topic, public spaces, travel
Saturday, 25 April 2009
Offline Tangible User Interface
When shopping for a sofa I used an interesting tangible user interface - magnetic stickers. For each of the sofa systems, customers can create their own configuration by placing these magnetic stickers on a background (everything at a scale of 1:50).
Once the customer is happy with the configuration, the shop assistant makes a Xerox copy (I said I did not need a black-and-white copy - I could make my own color copy with my phone), calculates the price, and writes up an order. The interaction with the pieces is very good, and it also works great as a shared interface - much nicer than comparable screen-based systems. I could imagine that with a bit of effort one could create a phone application that scans the customer's design, calculates the price, and provides a rendered image of the configuration in the chosen color (in our case green ;-). Could be an interesting student project…
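The pricing step of such an app is straightforward once the modules have been recognized from the photo. A minimal sketch, where the module types and prices are entirely made up for illustration:

```python
# Given the module types a (hypothetical) image recognizer extracts from a
# photo of the 1:50 magnetic-sticker layout, compute the total price.
# The catalog and prices below are invented for illustration.

PRICE_LIST = {"corner": 499, "seat": 299, "armrest": 149, "stool": 199}

def price_configuration(modules):
    """Sum the catalog price of each recognized module."""
    return sum(PRICE_LIST[m] for m in modules)

# What the recognizer might extract from the customer's photo:
recognized = ["corner", "seat", "seat", "armrest"]
print(price_configuration(recognized))  # 1246
```

The hard part of the student project would of course be the recognition and rendering, not this lookup - but it shows how little stands between the paper interface and a digital quote.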
Thursday, 23 April 2009
App store of a car manufacturer? Or the future of cars as an application platform.
When preparing my talk for the BMW research colloquium I realized once more how much potential there is in the automotive domain (if you look at it from a CS perspective). My talk was on the interaction of the driver with the car and the environment, and I assessed the potential of the car as a platform for interactive applications (slides in PDF). Thinking of the car as a mobile terminal that offers transportation is quite exciting…
I showed some of our recent projects in the automotive domain:
- enhancing communication in the car: basically studying the effect of a video link between driver and passenger on driving performance and on communication
- handwritten text input: where would you put the input and the output? Input on the steering wheel and visual feedback in the dashboard is a good guess - see [1] for more details.
- making it easier to interrupt tasks while driving: we have some ideas for minimizing the cost that interrupting secondary tasks imposes on the driver, and explored them with a navigation task.
- multimodal interaction, and tactile output in particular: we looked at how to present navigation information using a set of vibrotactile actuators. We will publish more details on this at Pervasive 2009 in a few weeks.
Towards the end of my talk I invited the audience to speculate with me on future scenarios. The starting point: imagine you permanently store all the information that goes over the bus systems in the car, and transmit it wirelessly over the network to backend storage. Then imagine 10% of the users are willing to share this information publicly. That really opens up a whole new world of applications. Taking this a bit further, one question is what the application store of a car manufacturer will look like in the future. What could you buy online (e.g. fuel efficiency? more power in the engine? a new layout for your dashboard? …)? Seems like an interesting thesis topic.
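As a toy illustration of the scenario, here is a sketch of one application such shared data would enable: a fleet-wide statistic computed only over the opted-in fraction of users. The record fields and the opt-in flag are assumptions I made up for this sketch, not any real vehicle bus format.

```python
# A toy sketch of the speculation above: uploaded bus-data records from many
# cars, with a statistic computed over the ~10% of users who opted to share.
# Field names and the opt-in model are illustrative assumptions.

import statistics

def fleet_average_consumption(records):
    """Average fuel consumption (l/100km) over records whose owner shares data."""
    shared = [r["l_per_100km"] for r in records if r["shared"]]
    return statistics.mean(shared) if shared else None

records = [
    {"car": "a", "l_per_100km": 7.2, "shared": True},
    {"car": "b", "l_per_100km": 9.1, "shared": False},  # opted out
    {"car": "c", "l_per_100km": 6.4, "shared": True},
]
print(round(fleet_average_consumption(records), 1))  # 6.8
```

Comparing your own consumption against such a shared baseline is exactly the kind of service a manufacturer's app store could sell.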
[1] Kern, D., Schmidt, A., Arnsmann, J., Appelmann, T., Pararasasegaran, N., and Piepiera, B. 2009. Writing to your car: handwritten text input while driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4705-4710. DOI= http://doi.acm.org/10.1145/1520340.1520724
Friday, 3 April 2009
Visit to Newcastle University, digital jewelry
I went to see Chris Kray at Culture Lab at Newcastle University. Over the next months we will be working on a joint project on a new approach to creating and building interactive appliances. I am looking forward to spending some more time in Newcastle.
Chris showed me around their lab and I was truly impressed. Besides many interesting prototypes in various domains, I have not seen such a number of different ideas and implementations of tabletop systems and user interfaces anywhere else. For a picture of me in the lab trying out a special vehicle, see Chris' blog.
Jayne Wallace showed me some of her digital jewelry. A few years back she wrote a very interesting article with the title "All this useless beauty" [1] that provides an interesting perspective on design and suggests beauty as a material in digital design. The approach she takes is to design deliberately for a single individual, so that the design fits their personality and their context. She created a communication device that connects two people in a very simple and yet powerful way [2]. A further example is a piece of jewelry that makes the environment change to provide some personal information - technically it is similar to the work we have started on encoding interests in the Bluetooth friendly names of phones [3], but her artefacts are much prettier and more emotionally exciting.
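The friendly-name trick from [3] can be sketched in a few lines: interests are appended to the device name so that nearby devices can read them during discovery, without pairing. The "#" tag format below is my own illustrative assumption; only the 248-byte limit on Bluetooth friendly names is from the specification.

```python
# A minimal sketch of the idea in [3]: encode interests into the Bluetooth
# friendly name so they are visible to nearby devices during discovery.
# The "#" separator format is an assumption made for this sketch.

MAX_NAME_BYTES = 248  # Bluetooth limits the device name to 248 bytes

def encode_interests(base_name, interests):
    """Append interest tags to the device name, staying within the limit."""
    name = base_name
    for tag in interests:
        candidate = f"{name}#{tag}"
        if len(candidate.encode("utf-8")) > MAX_NAME_BYTES:
            break  # drop tags that no longer fit
        name = candidate
    return name

def decode_interests(name):
    """Split an encoded name back into the base name and interest tags."""
    base, *tags = name.split("#")
    return base, tags

encoded = encode_interests("Nokia N95", ["music", "hci", "cycling"])
print(encoded)                    # Nokia N95#music#hci#cycling
print(decode_interests(encoded))  # ('Nokia N95', ['music', 'hci', 'cycling'])
```

The charm of Jayne's jewelry is precisely that it hides this kind of plumbing behind a beautiful artefact.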
[1] Wallace, J. and Press, M. 2004. All this useless beauty. The Design Journal, Volume 7, Issue 2. (PDF)
[2] Jayne Wallace. Journeys. Intergeneration Project.
[3] Kern, D., Harding, M., Storz, O., Davis, N., and Schmidt, A. 2008. Shaping how advertisers see me: user views on implicit and explicit profile capture. In CHI '08 Extended Abstracts on Human Factors in Computing Systems (Florence, Italy, April 05 - 10, 2008). CHI '08. ACM, New York, NY, 3363-3368. DOI= http://doi.acm.org/10.1145/1358628.1358858
Wednesday, 1 April 2009
Ubicomp Spring School in Nottingham - prototyping user interfaces
On Tuesday and Wednesday afternoon I ran practical workshops on creating novel user interfaces, complementing the tutorial on Wednesday morning. The aim of the practical was to motivate people to question more fundamentally the user interface decisions we make in our research projects.
On a very simple level, an input user interface can be seen as a sensor, a transfer function or mapping, and an action in the system being controlled. To demonstrate this I showed two simple JavaScript programs that let participants play with the mapping from the mouse to the movement of a button on the screen, and with moving through a set of images. If you twist the mapping functions, really simple tasks (like moving one button on top of another) may get complicated. Similarly, if you change the way the sensor is used (e.g. instead of moving the mouse over a surface, having several people move a surface over the mouse), such simple tasks may become really difficult, too.
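The sensor-mapping-action view can be sketched in a few lines (here in Python rather than the JavaScript demos; all function names are my own). The same input stream produces very different behaviour once the mapping is twisted:

```python
# Sensor -> transfer function (mapping) -> action, for a 1-D cursor.
# Twisting only the mapping turns an easy task into a hard one.

def identity_mapping(dx):
    return dx  # cursor follows the sensor directly

def inverted_gain_mapping(dx):
    return -2 * dx  # inverted and amplified: simple targeting gets hard

def move_cursor(position, sensor_deltas, mapping):
    """Feed a stream of sensor readings through a mapping to move the cursor."""
    for dx in sensor_deltas:
        position += mapping(dx)
    return position

# The same "move 5 units right" input stream under both mappings:
deltas = [1, 1, 1, 1, 1]
print(move_cursor(0, deltas, identity_mapping))       # 5
print(move_cursor(0, deltas, inverted_gain_mapping))  # -10
```

Swapping the mapping function is exactly what the workshop groups did physically with their cardboard and fabric interfaces.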
With this initial experience, an optical mouse, a lot of materials (e.g. fabrics, cardboard boxes, picture frames, toys, etc.), some tools, and 2 hours of time, the groups started to create their novel interactive experiences. The results included a string puppet interface, a frog interface, an interface to the (computer) recycling, a scarf, and a close-contact dancing interface (the music only plays if bodies touch and move).
The final demos of the workshop were shown before dinner. Seeing the whole set of new interface ideas, one wonders why so little of this happens beyond the labs in the real world, and why people are happy to live with current efficient but rather boring user interfaces - especially in the home context…
Ubicomp Spring School in Nottingham - Tutorial
The ubicomp spring school in Nottingham had an interesting set of lectures and practical sessions, including a talk by Turing Award winner Robin Milner on a theoretical approach to ubicomp. When I arrived on Tuesday I had the chance to see Chris Baber's tutorial on wearable computing. He provided really good examples of wearable computing and its distinct qualities (also in relation to wearable use of mobile phones). One example that captures a lot about wearable computing is an adaptive bra - one instance of a class of interesting future garments. The basic idea is that these garments detect the wearer's activity and change their properties accordingly. A different example in this class is a shirt/jacket/pullover/trouser that can change its insulation properties (e.g. by storing and releasing air) according to the external temperature and the user's body temperature.
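The control logic of such a garment can be sketched very simply: a function from sensed temperatures to an insulation level. The thresholds and the 0-2 scale below are invented purely for illustration, not from Chris Baber's tutorial:

```python
# A toy sketch of the adaptive-garment idea: pick an insulation level
# (e.g. how much air the fabric stores) from outside and body temperature.
# Scale and thresholds are illustrative assumptions.

def insulation_level(outside_c, body_c):
    """Return insulation on a 0 (fully vented) to 2 (maximum) scale."""
    if body_c > 37.5:      # wearer is hot, e.g. exercising: vent
        return 0
    if outside_c < 5:      # cold environment: store air, insulate
        return 2
    if outside_c < 15:
        return 1
    return 0

print(insulation_level(outside_c=2, body_c=36.8))  # 2
print(insulation_level(outside_c=2, body_c=38.0))  # 0 (activity overrides cold)
```

The interesting research questions are of course in the sensing and the actuating fabric, not in this decision rule - but it shows how little "intelligence" is needed once the garment can sense and act.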
My tutorial was on user interface engineering, and I discussed what is different about creating ubicomp UIs compared to traditional user interfaces. I showed some trends (including technologies as well as a new view on privacy) that open up the design space for new user interfaces. Furthermore, we discussed the idea of creating magical experiences in the world and the dilemma of user creativity versus user needs.
There were about 100 people at the spring school from around the UK - it is really exciting how much research in ubicomp (and somehow in the tradition of Equator) is going on in the UK.