Play in social and tangible interactions

Now that the IxDA has posted a video of my presentation at Interaction 09 to Vimeo, I thought it would be a good idea to provide a little background to the talk. I had already posted the slides to SlideShare, so a full write-up doesn’t seem necessary, but to provide a little context I will summarize the talk below.

Summary

The idea of the talk was to look at a few qualities of embodied interaction and relate them to games and play, in the hope of illuminating some design opportunities. Without dwelling on what embodiment really means, suffice it to say that there is a school of thought which holds that our thinking originates in our bodily experience of the world around us, and in our relationships with the people in it. I used the example of an improvised information display I once encountered in the paediatric ward of a local hospital to highlight two qualities of embodied interaction: (1) meaning is socially constructed and (2) cognition is facilitated by tangibility.1

[Slide: ix09-lightning-talk-presented012]

With regard to the first aspect — the social construction of meaning — I find it interesting that in games there is a distinction between the official rules of a game and the rules arrived at through mutual consent by the players, the latter being how the game is actually played. Using the example of an improvised manège in Habbo, I pointed out that under-specified design tends to encourage the emergence of such interesting uses. What it comes down to, as a designer, is understanding that once people get together to do stuff, and it involves the thing you’ve designed, they will layer new meanings on top of what you came up with, and that this is largely out of your control.

[Slide: ix09-lightning-talk-presented015]

For the second aspect — cognition being facilitated by tangibility — I talked about how people use the world around them to offload mental computation. For instance, as people get better at playing Tetris, they backtrack more than they did when they first started playing. They are essentially using the game’s space to think with. As an aside, I pointed out that in my experience, sketching plays a similar role when designing. As with the social construction of meaning, for epistemic action to be possible, the system in use needs to be adaptable.

[Slide: ix09-lightning-talk-presented025]

To wrap up, I suggested that, when it comes to the design of embodied interactive stuff, we are struggling with the same issues as game designers. We’re both positioning ourselves (in the words of Eric Zimmerman) as meta-creators of meaning; as designers of spaces in which people discover new things about themselves, the world around them and the people in it.

Sources

I had several people come up to me afterwards, asking for sources, so I’ll list them here.

  • the significance of the social construction of meaning for interaction design is explained in detail by Paul Dourish in his book Where the Action Is
  • the research by Jean Piaget I quoted is from his book The Moral Judgement of the Child (which I first encountered in Rules of Play, see below)
  • the concept of ideal versus real rules is from the wonderful book Rules of Play by Katie Salen and Eric Zimmerman (who in turn have taken it from Kenneth Goldstein’s article Strategies in Counting Out)
  • for a wonderful description of how children socially mediate the rules to a game, have a look at the article Beyond the Rules of the Game by Linda Hughes (collected in the Game Design Reader)
  • the Will Wright quote is from an interview in Tracy Fullerton’s book Game Design Workshop, second edition
  • for a discussion of pragmatic versus epistemic action and how it relates to interaction design, refer to the article How Bodies Matter (PDF) by Scott Klemmer, Björn Hartmann and Leila Takayama (which is rightfully recommended by Dan Saffer in his book, Designing Gestural Interfaces)
  • the Tetris research (which I first found in the previously mentioned article) is described in Epistemic Action Increases With Skill (PDF), an article by Paul Maglio and David Kirsh
  • the “play is free movement…” quote is from Rules of Play
  • the picture of the guy skateboarding is a still from the awesome documentary film Dogtown and Z-Boys
  • for a lot of great thinking on “loose fit” design, be sure to check out the book How Buildings Learn by Stewart Brand
  • the “meta-creators of meaning” quote is from Eric Zimmerman’s foreword to the aforementioned Game Design Workshop, 2nd ed.

Thanks

And that’s it. Interaction 09 was a great event, and I’m happy to have been a part of it. Most of the talks seem to be online now, so why not check them out? My favourites by far were John Thackara and Robert Fabricant. Thanks to the people of the IxDA for all the effort they put into increasing interaction design’s visibility to the world.

  1. For a detailed discussion of the information display, have a look at this blog post.

A day of playing around with multi-touch and RoomWare

Last Saturday I attended a RoomWare workshop. The people of CanTouch were there too, and brought one of their prototype multi-touch tables. The aim of the day was to come up with applications combining RoomWare (open source software that can sense the presence of people in spaces) with multi-touch. I attended primarily because it was a good opportunity to spend a day messing around with a table.

Attendance was varied, so while programmers were putting together a proof-of-concept, designers (such as Alexander Zeh, James Burke and me) came up with concepts for new interactions. The proof-of-concept was up and running at the end of the day: the table could sense who was in the room and display his or her Flickr photos, which you could then move around, scale, rotate, etc. in the typical multi-touch fashion.
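For readers who want a feel for how such a proof-of-concept might be wired together, here is a minimal Python sketch. It is not the workshop code: it assumes a hand-maintained mapping from RoomWare presence IDs to Flickr user IDs (the `PRESENCE_TO_FLICKR` dictionary is made up) and a Flickr API key, and it uses Flickr’s public `flickr.people.getPublicPhotos` REST method to fetch photo URLs for whoever is sensed in the room.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical mapping from RoomWare presence IDs to Flickr user IDs;
# in a real set-up this association would have to come from somewhere else.
PRESENCE_TO_FLICKR = {
    "badge-0042": "12345678@N00",
}

FLICKR_REST = "https://api.flickr.com/services/rest/"
API_KEY = "YOUR_FLICKR_API_KEY"  # placeholder


def public_photo_urls(flickr_user_id, limit=10):
    """Fetch a handful of public photo URLs for one Flickr user."""
    params = urllib.parse.urlencode({
        "method": "flickr.people.getPublicPhotos",
        "api_key": API_KEY,
        "user_id": flickr_user_id,
        "per_page": limit,
        "format": "json",
        "nojsoncallback": 1,
    })
    with urllib.request.urlopen(f"{FLICKR_REST}?{params}") as response:
        data = json.load(response)
    return [
        f"https://live.staticflickr.com/{p['server']}/{p['id']}_{p['secret']}.jpg"
        for p in data["photos"]["photo"]
    ]


def photos_for_room(detected_presence_ids):
    """Collect photo URLs for everyone currently sensed in the room."""
    urls = []
    for presence_id in detected_presence_ids:
        flickr_id = PRESENCE_TO_FLICKR.get(presence_id)
        if flickr_id:
            urls.extend(public_photo_urls(flickr_id))
    return urls
```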

The concepts designers came up with mainly focused on pulling in Last.fm data (again using RoomWare’s sensing capabilities) and displaying it for group-based exploration. Here’s a storyboard I quickly whipped up of one such application:

[Storyboard: RoomWare + CanTouch + Last.fm]

The storyboard shows how you can add yourself from a list of people present in the room. Your top artists then flock around you. When more people are added, lines are drawn between you; the thickness of each line represents how similar your tastes are, according to Last.fm’s taste-o-meter. Shared top artists flock in such a way as to end up closest to all of the people they relate to. Finally, artists can be acted on to listen to their music.
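To make the storyboard a little more concrete, here is a small Python sketch of the two placement rules it implies: line thickness scales with the similarity score (assumed here to be a number between 0 and 1, as a taste-o-meter comparison would give you), and a shared artist sits at the centroid of the people who listen to it. The names and numbers are purely illustrative.

```python
from dataclasses import dataclass


@dataclass
class Person:
    name: str
    x: float  # position on the table surface
    y: float


def line_thickness(similarity, min_px=1.0, max_px=12.0):
    """Map a similarity score (0.0-1.0) to a stroke width in pixels."""
    similarity = max(0.0, min(1.0, similarity))
    return min_px + similarity * (max_px - min_px)


def shared_artist_position(people):
    """Place a shared artist at the centroid of everyone who listens to it."""
    xs = [p.x for p in people]
    ys = [p.y for p in people]
    return sum(xs) / len(xs), sum(ys) / len(ys)


# Example: two people at opposite table edges with a 0.7 similarity score.
anna, bob = Person("anna", 0.1, 0.5), Person("bob", 0.9, 0.5)
print(line_thickness(0.7))                  # ~8.7 px
print(shared_artist_position([anna, bob]))  # (0.5, 0.5), between the two
```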

When I was sketching this, it became apparent that the orientation of elements should follow very different rules than on regular screens. I chose to sketch things so that they all point outwards, with the middle of the table as the orientation point.
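Here is a rough sketch of that orientation rule, assuming mathematical coordinates (y increasing upwards) and counter-clockwise rotation: each element is rotated so that its default “up” direction points away from the table centre. Frameworks with a downward y-axis would need the sign flipped.

```python
import math


def outward_rotation(x, y, cx=0.5, cy=0.5):
    """Counter-clockwise rotation (degrees) that turns an element's default
    'up' direction so it points away from the table centre at (cx, cy)."""
    angle_of_outward_vector = math.degrees(math.atan2(y - cy, x - cx))
    return (angle_of_outward_vector - 90.0) % 360.0


# Elements at the four edges of a unit-square table (centre at 0.5, 0.5):
for pos in [(0.5, 1.0), (1.0, 0.5), (0.5, 0.0), (0.0, 0.5)]:
    print(pos, outward_rotation(*pos))
# (0.5, 1.0) -> 0.0    top edge: no rotation
# (1.0, 0.5) -> 270.0  right edge: 90 degrees clockwise
# (0.5, 0.0) -> 180.0  bottom edge: upside down
# (0.0, 0.5) -> 90.0   left edge: 90 degrees counter-clockwise
```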

Spending a day immersed in multi-touch stuff made some interesting design challenges apparent:

  • With tabletop surfaces, stuff is physically closer to or further away from you. Proximity of elements can be unintentionally interpreted as saying something about aspects such as importance, relevance, etc. Designers need to be even more aware of placement than before, and conventions from vertically oriented screens no longer apply: top-of-screen becomes furthest away and therefore least prominent, instead of most important.
  • With group-based interactions, it becomes tricky to determine whom to address and where to address him or her. Sometimes the system should address the group as a whole. When five people are standing around a table, text-based interfaces become problematic, since what is legible from one end of the table is unintelligible from the other. New conventions need to be developed for this as well. Alexander and I philosophized about placing text along circles and animating it so that it circulates around the table, for instance (see the sketch after this list).
  • Besides these, many other interface challenges present themselves. One crucial piece of information for solving many of them is knowing where people are located around the table. This issue can be approached from different angles. By incorporating sensors in the table, detection may be automated and interfaces could be made to adapt automatically. This is the techno-centric angle. I am not convinced this is the way to go, because it diminishes people’s control over the experience. I would prefer to make the interface itself adjustable in natural ways, so that people can mold the representation to suit their context. With situated technologies like this, auto-magical adaptation is an “AI-hard” problem, and the price of failure is a severely degraded user experience from which people cannot recover, because the system won’t let them.
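As an illustration of the “text along circles” idea mentioned in the list above, here is a small Python sketch that places each character of a label on a circle around the table centre and gives it a rotation so it reads along the arc; nudging the start angle each frame makes the text circulate. The rendering layer is left out, and none of this is RoomWare code.

```python
import math


def layout_text_on_circle(text, cx, cy, radius, start_deg=0.0, step_deg=6.0):
    """Place each character of `text` on a circle around (cx, cy).
    Returns (char, x, y, rotation_deg) tuples for a drawing layer to render.
    Increase `start_deg` a little every frame to make the text circulate."""
    placed = []
    for i, char in enumerate(text):
        deg = start_deg + i * step_deg
        rad = math.radians(deg)
        x = cx + radius * math.cos(rad)
        y = cy + radius * math.sin(rad)
        # Rotate each glyph so its baseline follows the circle (tangent);
        # the sign of the extra 90 degrees depends on the framework's y-axis.
        placed.append((char, x, y, (deg + 90.0) % 360.0))
    return placed


# One frame of a circulating label, on a circle with a radius of 200 units:
for char, x, y, rot in layout_text_on_circle("Now playing: Tetris", 0, 0, 200):
    print(f"{char!r} at ({x:.0f}, {y:.0f}) rotated {rot:.0f} deg")
```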

All in all the workshop was a wonderful day of tinkering with like-minded individuals from radically different backgrounds. As a designer, I think this is one of the best ways to be involved with open source projects. On a day like this, technologists can be exposed to new interaction concepts while they are hacking away. At the same time, designers get that rare opportunity to play around with technology as it is being shaped. Quick-and-dirty sketches like the ones Alexander and I came up with are definitely the way to communicate ideas; the goal is to suggest, not to describe, after all. Technologists should feel free to elaborate and build on what designers come up with, and vice versa. I am curious to see which parts of what we came up with will find their way into future RoomWare projects.