User testing

what: UX concept for large-display interaction
duration: 4 months
team: Arganka Yahya, Biju Thankachan, & Tuomas Alahäivälä
roles: interaction design, analysis, programming, video production
tools: pen & paper, Processing, iMovie
links: presentation and video demo

heart.wall is a concept for embodied interaction with large wall displays. We built a prototype that uses an infrared (IR) camera to detect human presence in front of a wall, designed a heartbeat metaphor, and projected it onto the wall with a projector.

Overview

Walls have become invisible and unremarkable. We interact with them without fully realizing their existence; we lean on them, write on them, and stick post-it notes to them. What if a wall could interact with us? How could it sense our presence? We imagine a future in which walls are put to better use through pervasive technology, for example in museum installations or public spaces, supporting the broader concept of a smart city.

To pursue the idea, we built a prototype with an infrared (IR) camera that detects human presence in front of the wall, designed a heartbeat metaphor, and projected it onto the wall with a projector. We investigated three control-display (CD) gain values as an initial step toward seamless interaction with the system, and held a small usability session to find the preferred CD gain value and gather feedback from potential users.

Design and Evaluation Methods

This section describes the design process and evaluation method. First, the usage environment and user personas are described. Second, the development process of heart.wall, including the technology used and the interaction techniques, is explained. Finally, the test procedure is described.

Usage Environment

A public space such as a museum installation or a corridor, with a flat wall and empty space in front of it, was selected as the usage environment. A flat surface is required in this project because of the limitations of current projector technology.

User Personas

The first persona is Marja, a 25-year-old Finnish woman who studies Interactive Media at the University of Tampere. She is tech-savvy, often plays video games, and loves designing websites. She would like to use the installation between lectures to relax, get into a playful mood, find inspiration, and then discuss it with friends.

The second persona is Pekka, a 28-year-old Finnish man who recently graduated from a design school in Tampere. He is currently founding his own startup at Protomo. He is interested in new interaction techniques that could be applied to commercial products, and would like to use the installation to try different control gestures in an audiovisual environment.

Usage Scenarios

Marja has just finished a class and is walking to the university restaurant. On her way she finds an interactive wall display in the hallway and is attracted by the movements and sounds of the system. She decides to stop; seconds later, the wall plays a sound and the display grows or shrinks as she moves. She steps in front of the screen and starts interacting with the wall. After three minutes of interaction, she remembers lunch and prepares to leave. When Marja leaves the interaction area, the system starts playing a screensaver, waiting for the next person to interact.

Pekka attends a meeting at Demola Tampere. He sees someone interacting with the large display and decides to come closer to see what is happening. As he approaches, he notices that a circle starts to grow and beat. He moves back and forth and realizes that he, too, can interact with the system. The other user notices his presence and starts to maintain personal space, yet they manage to use the system simultaneously.

Development Process

We started with two brainstorming sessions and continued by sketching ideas on paper. After that, we looked for inspiration and similar projects. We then defined the usage environment, personas, and scenarios, and created a simple PowerPoint presentation to illustrate the kind of interaction we were pursuing. Because of limited space and resources, we developed the system for both virtual testing and real testing. Lastly, we conducted a small usability test to investigate the preferred CD gain value.

Hardware and Software

A Microsoft Kinect for Xbox 360 (Model 1414) is used in this project. Its IR depth camera has a range of 1.2 to 3.5 meters. A projector and a sound system are connected to a MacBook Pro to present the interactive display.

Figure 1. An early trial: detecting a palm and leaving a trace of it

Processing is used as the development environment. The OpenNI framework and NITE middleware are used to process the depth data from the Kinect. OpenNI is an open-source SDK for the development of 3D-sensing middleware libraries and applications; we used SimpleOpenNI, an OpenNI and NITE wrapper library for Processing. Two other libraries, ControlP5 and Minim, are used for the control user interface and sound.
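To give a sense of how these pieces fit together, here is a minimal sketch of the tracking setup. It is a simplified illustration rather than the prototype code: it assumes SimpleOpenNI’s user tracking and getCoM() call, and the exact method names (for example enableUser() and getUsers()) vary between SimpleOpenNI versions.

```
// Minimal illustration: follow the first tracked user's center of mass
// and draw a circle at the corresponding screen position.
// Method names follow the SimpleOpenNI examples; they differ slightly between versions.
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(1024, 768);
  noStroke();
  context = new SimpleOpenNI(this);
  context.enableDepth();   // depth stream from the Kinect
  context.enableUser();    // scene/user tracking (older versions take a skeleton-profile argument)
}

void draw() {
  background(0);
  context.update();

  int[] users = context.getUsers();
  if (users.length > 0) {
    PVector com = new PVector();
    PVector comScreen = new PVector();
    context.getCoM(users[0], com);                         // center of mass in real-world millimeters
    context.convertRealWorldToProjective(com, comScreen);  // map to depth-image coordinates

    float x = map(comScreen.x, 0, context.depthWidth(), 0, width);
    float y = map(comScreen.y, 0, context.depthHeight(), 0, height);
    fill(255, 80, 80);
    ellipse(x, y, 120, 120);   // the heart.wall circle follows the user
  }
}
```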

Figure 2. A desktop app used to simulate user movements before implementing the interaction with the Kinect

Interaction Techniques

Two interaction techniques are described in this report; the rest can be found in the project’s demonstration video (see Additional Materials). The first is stepping into or out of the interaction area. When the user steps into the usage area, the display is activated and a circle appears on the screen, its center corresponding to the user’s center of mass. For example, the projector initially shows a neutral image; as the user steps into the designated area (roughly 3 × 3 m), the system activates and displays the circle, indicating that it has recognized the user’s presence.
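In the spirit of the desktop simulator shown in Figure 2, the sketch below captures only this activation logic, with the mouse standing in for the Kinect data: userPresent and the center-of-mass coordinates are placeholders for values the tracking code would supply.

```
// Simplified activation logic: active state while a "user" is present,
// screensaver otherwise. Mouse input stands in for the Kinect tracking.
boolean userPresent = false;
float comX, comY;   // user's center of mass mapped to screen coordinates

void setup() {
  size(1024, 768);
  noStroke();
}

void draw() {
  // Stand-in for the tracker: pretend a user is present while the mouse is pressed.
  userPresent = mousePressed;
  comX = mouseX;
  comY = mouseY;

  if (userPresent) {
    background(0);
    fill(255, 80, 80);
    ellipse(comX, comY, 120, 120);   // active: circle follows the user
  } else {
    background(30);                  // idle: screensaver placeholder
    fill(120);
    text("screensaver", width / 2 - 40, height / 2);
  }
}
```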

The second is stepping closer to or away from the display. This uses a real-world metaphor: objects appear larger as you get closer, and vice versa. The circle grows and the heartbeat sound gets louder as the user approaches the display, and the circle shrinks and the sound fades as the user steps away. We used the CD gain value to control how strongly the circle size responds to the user’s distance.
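The sketch below illustrates this mapping under simplified assumptions of our own: the user’s distance, which the prototype reads from the Kinect depth data, is simulated with the mouse, and the constants (range limits, base size, growth factor) are placeholders rather than the prototype’s actual parameters.

```
// Illustrative distance-to-size mapping (not the exact prototype code).
// The mouse's vertical position simulates the user's distance from the wall.
float cdGain = 2.0;        // one of the tested values: 1, 2 or 3
float minDist = 1200;      // Kinect's usable depth range, in mm
float maxDist = 3500;
float baseDiameter = 80;   // circle size at the far edge of the range

void setup() {
  size(1024, 768);
  noStroke();
}

void draw() {
  background(0);
  // Simulated distance: top of the window = far (3500 mm), bottom = near (1200 mm).
  float userDistance = map(mouseY, 0, height, maxDist, minDist);

  // Closer to the wall -> larger circle; the CD gain scales the growth rate.
  float closeness = maxDist - userDistance;
  float diameter = baseDiameter + cdGain * closeness * 0.1;

  // Normalized loudness in [0, 1] that could drive the heartbeat sound volume.
  float loudness = constrain(map(userDistance, maxDist, minDist, 0, 1), 0, 1);

  fill(255, 80, 80, 100 + 155 * loudness);
  ellipse(width / 2, height / 2, diameter, diameter);
}
```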

Procedures

The participant’s task was to match the size of a target circle with the circle representing their position by moving back and forth. A turn was completed when they had successfully matched 10 targets. We tested three CD gain values, one per turn: CD = 1, 2, and 3. Completion time was recorded for each turn. Each participant repeated the whole experiment twice to reduce error in data collection.
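As a sketch of the task bookkeeping (not the actual test harness), the snippet below assumes the participant’s circle diameter comes from a mapping like the one above; a turn of 10 matches is timed with millis(), and the tolerance value is an assumption made only for illustration.

```
// Illustrative task logic: a turn ends after 10 successful size matches,
// and the completion time for that turn is recorded.
int targetsMatched = 0;
int targetsPerTurn = 10;
float targetDiameter;
float tolerance = 10;   // how closely the diameters must agree, in pixels (assumed value)
int turnStart;

void startTurn() {
  targetsMatched = 0;
  targetDiameter = random(100, 400);   // first target size
  turnStart = millis();
}

void checkMatch(float participantDiameter) {
  if (abs(participantDiameter - targetDiameter) < tolerance) {
    targetsMatched++;
    if (targetsMatched == targetsPerTurn) {
      int turnTime = millis() - turnStart;             // completion time for this CD gain
      println("Turn completed in " + turnTime + " ms");
    } else {
      targetDiameter = random(100, 400);               // next target
    }
  }
}
```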

The experiment was a within-subjects design with CD gain value as the factor. A one-way within-subjects analysis of variance (ANOVA) was used as the parametric statistical test. Since the ANOVA result was not significant, no further analysis was done.

Result

Table 1 and Figure 3 show the mean zooming times for the three CD gain values (± standard error of the mean). CD = 1 had a mean of 3720.67 deciseconds, CD = 2 had 3745.17 deciseconds, and CD = 3 had 3622.17 deciseconds. No statistically significant differences were found in the one-way within-subjects analysis of variance (ANOVA).

The analysis of the current test results was inconclusive about the effect of CD gain on zooming time. The next section discusses possible adjustments and further research goals.

Table 1. Mean zooming time and standard error of the mean (in deciseconds) for each CD gain value

Test Number CD = 1 CD = 2 CD = 3
1 5246 4098 3593
2 3158 3432 2803
3 4020 2490 5435
4 3778 3507 3031
5 3106 4580 3013
6 3016 4364 3858
Mean 3720.67 3745.17 3622.17
S.E.M. 346.91 312.76 397.49

Figure 3. Mean zooming times and S.E.M.s for the three CD gain values

Conclusion

Based on our simple experiment and analysis, we identified several factors that limit our insight. First, the sample size was small: only three participants performed the tasks, each with two attempts. If the effect is weak, a larger sample size may be needed before an analysis of variance can show whether differences in CD gain change user performance. Second, there was variability among the participants: some excelled at a certain CD gain value, while others had difficulties at the same value. This could be addressed by carefully recruiting more participants with more specific requirements and restrictions.

The test environment was also a challenge. We installed the system in OASIS, a social research space at the University of Tampere. On one or two occasions, people walked into the interaction area while a participant was performing the task, which caused the sensor to confuse the participant with the person who had just entered. Although we tried to keep the interaction area clear, a dedicated empty space and better control of the test environment would keep the experiment running smoothly.