Play in social and tangible interactions

Now that the IxDA has posted a video of my presentation at Interaction 09 to Vimeo, I thought it would be a good idea to provide a little background to the talk. I had already posted the slides to SlideShare, so a full write-up doesn't seem necessary. To provide a little context, though, I will briefly summarize it.


The idea of the talk was to look at a few qualities of embodied interaction and relate them to games and play, in the hopes of illuminating some design opportunities. Without dwelling on what embodiment really means, suffice it to say that there is a school of thought which holds that our thinking originates in our bodily experience of the world around us, and in our relationships with the people in it. I used the example of an improvised information display I once encountered in the paediatric ward of a local hospital to highlight two qualities of embodied interaction: (1) meaning is socially constructed and (2) cognition is facilitated by tangibility.1


With regard to the first aspect — the social construction of meaning — I find it interesting that in games there is a distinction between the official rules of a game and the rules arrived at through the players' mutual consent, the latter being how the game is actually played. Using the example of an improvised manège in Habbo, I pointed out that under-specified design tends to encourage the emergence of such interesting uses. What it comes down to, as a designer, is understanding that once people get together to do things, and it involves the thing you've designed, they will layer new meanings on top of what you came up with, which is largely out of your control.


For the second aspect — cognition being facilitated by tangibility — I talked about how people use the world around them to offload mental computation. For instance, as people get better at playing Tetris, they backtrack more than when they first started playing. They are essentially using the game's space to think with. As an aside, I pointed out that in my experience, sketching plays a similar role when designing. As with the social construction of meaning, for epistemic action to be possible, the system in use needs to be adaptable.


To wrap up, I suggested that, when it comes to the design of embodied interactive stuff, we are struggling with the same issues as game designers. We're both positioning ourselves (in the words of Eric Zimmerman) as meta-creators of meaning; as designers of spaces in which people discover new things about themselves, the world around them and the people in it.


I had several people come up to me afterwards asking for sources, so I'll list them here.

  • the significance of the social construction of meaning for interaction design is explained in detail by Paul Dourish in his book Where the Action Is
  • the research by Jean Piaget I quoted is from his book The Moral Judgement of the Child (which I first encountered in Rules of Play, see below)
  • the concept of ideal versus real rules is from the wonderful book Rules of Play by Katie Salen and Eric Zimmerman (who in turn have taken it from Kenneth Goldstein's article Strategies in Counting Out)
  • for a wonderful description of how children socially mediate the rules of a game, have a look at the article Beyond the Rules of the Game by Linda Hughes (collected in the Game Design Reader)
  • the Will Wright quote is from an interview in Tracy Fullerton's book Game Design Workshop, second edition
  • for a discussion of pragmatic versus epistemic action and how it relates to interaction design, refer to the article How Bodies Matter (PDF) by Scott Klemmer, Björn Hartmann and Leila Takayama (which is rightfully recommended by Dan Saffer in his book Designing Gestural Interfaces)
  • the Tetris research (which I first found in the previously mentioned article) is described in Epistemic Action Increases With Skill (PDF), an article by Paul Maglio and David Kirsh
  • the "play is free movement…" quote is from Rules of Play
  • the picture of the guy skateboarding is a still from the awesome documentary film Dogtown and Z-Boys
  • for a lot of great thinking on "loose fit" design, be sure to check out the book How Buildings Learn by Stewart Brand
  • the "meta-creators of meaning" quote is from Eric Zimmerman's foreword to the aforementioned Game Design Workshop, 2nd ed.


And that's it. Interaction 09 was a great event, and I'm happy to have been a part of it. Most of the talks seem to be online now, so why not check them out? My favourites by far were John Thackara and Robert Fabricant. Thanks to the people of the IxDA for all the effort they put into increasing interaction design's visibility to the world.

  1. For a detailed discussion of the information display, have a look at this blog post.

A day of playing around with multi-touch and RoomWare

Last Saturday I attended a RoomWare workshop. The people of CanTouch were there too, and brought one of their prototype multi-touch tables. The aim for the day was to come up with applications of RoomWare (open source software that can sense the presence of people in spaces) and multi-touch. I attended primarily because it was a good opportunity to spend a day messing around with a table.

Attendance was multifaceted, so while programmers were putting together a proof of concept, designers (such as Alexander Zeh, James Burke and I) came up with concepts for new interactions. The proof of concept was up and running at the end of the day: the table could sense who was in the room and display his or her Flickr photos, which you could then move around, scale, rotate, etc. in the typical multi-touch fashion.

The concepts designers came up with mainly focused on pulling in data (again using RoomWare's sensing capabilities) and displaying it for group-based exploration. Here's a storyboard I quickly whipped up of one such application:

RoomWare + CanTouch + Last.fm

The storyboard shows how you can add yourself from a list of people present in the room. Your top artists flock around you. When more people are added, lines are drawn between you. The thickness of a line represents how similar your tastes are, according to Last.fm's taste-o-meter. Also, shared top artists flock in such a way as to be closest to all related people. Finally, artists can be acted on to listen to music.
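For illustration, the storyboard's layout rules could be sketched in code. Everything here is an assumption of mine: the function names, the idea of a pairwise similarity score in the 0–1 range, and the centroid placement are hypothetical, not part of RoomWare or any music service's API.

```python
def line_thickness(similarity, min_px=1.0, max_px=12.0):
    """Map a taste-similarity score in [0, 1] to a stroke width,
    so that more similar tastes draw a thicker connecting line."""
    return min_px + similarity * (max_px - min_px)

def shared_artist_position(listener_positions):
    """Place a shared top artist at the centroid of the people who
    listen to it, so it ends up closest to all related people."""
    xs = [x for x, _ in listener_positions]
    ys = [y for _, y in listener_positions]
    n = len(listener_positions)
    return (sum(xs) / n, sum(ys) / n)
```

The centroid is the simplest way to satisfy "closest to all related people"; a real implementation would also need to resolve collisions between flocking artists.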

When I was sketching this, it became apparent that the orientation of elements should follow very different rules from regular screens. I chose to sketch things so that they all point outwards, with the middle of the table as the orientation point.
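That outward-pointing rule is simple to state in code. A minimal sketch, assuming plain 2D coordinates with y pointing up and a known table centre (the function name is mine):

```python
import math

def outward_rotation(cx, cy, x, y):
    """Rotation (in degrees) that makes an element at (x, y) point
    away from the table centre (cx, cy)."""
    return math.degrees(math.atan2(y - cy, x - cx))
```

An element to the right of the centre gets a rotation of 0 degrees, one above it 90 degrees, and so on around the table.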

By spending a day immersed in multi-touch stuff, some interesting design challenges became apparent:

  • With tabletop surfaces, stuff is physically closer or further away. Proximity of elements can be unintentionally interpreted as saying something about importance, relevance, and so on. Designers need to be even more aware of placement than before, and conventions from vertically oriented screens no longer apply: top-of-screen becomes furthest away, and therefore least prominent instead of most important.
  • With group-based interactions, it becomes tricky to determine whom to address and where to address him or her. Sometimes the system should address the group as a whole. When five people are standing around a table, text-based interfaces become problematic, since what is legible from one end of the table is unintelligible from the other. New conventions need to be developed for this as well. Alexander and I philosophized about placing text along circles and animating it so that it circulates around the table, for instance.
  • Besides these, many other interface challenges present themselves. One crucial piece of information for solving many of them is knowing where people are located around the table. This issue can be approached from different angles. By incorporating sensors in the table, detection may be automated and interfaces could be made to adapt automatically. This is the techno-centric angle. I am not convinced it is the way to go, because it diminishes people's control over the experience. I would prefer to make the interface itself adjustable in natural ways, so that people can mold the representation to suit their context. With situated technologies like this, auto-magical adaptation is an "AI-hard" problem, and the price of failure is a severely degraded user experience from which people cannot recover, because the system won't let them.
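The circulating-text idea from the second point could be sketched like this. It is purely illustrative, assuming each glyph can be placed and rotated individually; the function and its parameters are hypothetical, not anything we actually built.

```python
import math

def text_on_circle(text, radius, t, speed=0.2, spacing=0.08):
    """Place each character of `text` on a circle of the given radius,
    phase-shifted by t * speed so the string slowly circulates around
    the table. Returns (char, x, y, rotation_radians) tuples."""
    placements = []
    for i, ch in enumerate(text):
        theta = t * speed + i * spacing
        x = radius * math.cos(theta)
        y = radius * math.sin(theta)
        # rotate each glyph so it lies tangent to the circle
        placements.append((ch, x, y, theta + math.pi / 2))
    return placements
```

Redrawing with an increasing t makes the text orbit the table, so every position around it becomes legible in turn.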

All in all the workshop was a wonderful day of tinkering with like-minded individuals from radically different backgrounds. As a designer, I think this is one of the best ways to be involved with open source projects. On a day like this, technologists can be exposed to new interaction concepts while they are hacking away. At the same time, designers get that rare opportunity to play around with technology as it is shaped. Quick-and-dirty sketches like the ones Alexander and I came up with are definitely the way to communicate ideas. The goal is to suggest, not to describe, after all. Technologists should feel free to elaborate and build on what designers come up with, and vice versa. I am curious to see which parts of what we came up with will find their way into future RoomWare projects.