A day of playing around with multi-touch and RoomWare

Last Saturday I attended a RoomWare workshop. The people of CanTouch were there too, and brought one of their prototype multi-touch tables. The aim for the day was to come up with applications of RoomWare (open source software that can sense the presence of people in spaces) and multi-touch. I attended primarily because it was a good opportunity to spend a day messing around with a table.

Attendance was multifaceted, so while programmers were putting together a proof-of-concept, designers (such as Alexander Zeh, James Burke and I) came up with concepts for new interactions. The proof-of-concept was up and running at the end of the day: the table could sense who was in the room and display his or her Flickr photos, which you could then move around, scale, rotate, etc. in the typical multi-touch fashion.

The concepts designers came up with mainly focused on pulling in Last.fm data (again using RoomWare’s sensing capabilities) and displaying it for group-based exploration. Here’s a storyboard I quickly whipped up of one such application:

RoomWare + CanTouch + Last.fm

The storyboard shows how you can add yourself from a list of people present in the room. Your top artists flock around you. When more people are added, lines are drawn between you. The thickness of the line represents how similar your tastes are, according to Last.fm’s taste-o-meter. Also, shared top artists flock in such a way as to be closest to all related people. Finally, artists can be acted on to listen to music.
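To make the mechanics of that storyboard a bit more concrete, here is a minimal, hypothetical sketch of the underlying model. The names, positions and similarity score are made up; in the concept the score would come from Last.fm’s taste-o-meter, and the positions would be animated (the “flocking”) rather than computed once.

```python
# Sketch only: invented data, no Last.fm calls, no animation.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    x: float          # position on the table surface
    y: float
    top_artists: set[str]

def line_thickness(similarity: float, max_px: float = 8.0) -> float:
    """Map a 0..1 taste-o-meter score to a stroke width in pixels."""
    return 1.0 + similarity * (max_px - 1.0)

def shared_artist_position(artist: str, people: list[Person]) -> tuple[float, float]:
    """Place a shared artist at the centroid of everyone who has it among
    their top artists, so it ends up closest to all related people."""
    related = [p for p in people if artist in p.top_artists]
    if not related:
        raise ValueError(f"nobody here listens to {artist}")
    xs = [p.x for p in related]
    ys = [p.y for p in related]
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Example with invented data:
alice = Person("alice", 0.2, 0.8, {"Radiohead", "Beirut"})
bob = Person("bob", 0.9, 0.1, {"Radiohead", "Autechre"})
print(line_thickness(0.7))                                # thicker line, similar tastes
print(shared_artist_position("Radiohead", [alice, bob]))  # midway between both people
```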

When I was sketching this, it became apparent that orientation of elements should follow very different rules from regular screens. I chose to sketch things so that they all point outwards, with the middle of the table as the orientation point.
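As an illustration, here is one way that “point outwards” rule could be expressed. This is a sketch under assumed conventions (screen-style coordinates, degrees, 0° meaning upright); a real tabletop toolkit would have its own.

```python
import math

def outward_rotation(x: float, y: float, cx: float, cy: float) -> float:
    """Rotation (degrees) for an element at (x, y) so that its 'up'
    direction points away from the table centre (cx, cy)."""
    angle_to_element = math.atan2(y - cy, x - cx)  # 0 rad = straight to the right
    # Shift by 90 degrees so the element's top edge faces outwards.
    return math.degrees(angle_to_element) - 90.0

# An element on the right-hand edge of a unit-square table:
print(outward_rotation(1.0, 0.5, 0.5, 0.5))  # -90.0 under this convention
```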

Spending a day immersed in multi-touch stuff made some interesting design challenges apparent:

  • With tabletop surfaces, stuff is physically closer or further away. Proximity of elements can be unintentionally interpreted as saying something about aspects such as importance, relevance, etc. Designers need to be even more aware of placement than before, plus conventions from vertically oriented screens no longer apply. Top-of-screen becomes furthest away and therefore least prominent instead of most important. 
  • With group-based interactions, it becomes tricky to determine who to address and where to address him or her. Sometimes the system should address the group as a whole. When 5 people are standing around a table, text-based interfaces become problematic, since what is legible from one end of the table is unintelligible from the other. New conventions need to be developed for this as well. Alexander and I philosophized about placing text along circles and animating it so that it circulates around the table, for instance (see the sketch after this list).
  • Besides these, many other interface challenges present themselves. One crucial piece of information for solving many of them is knowing where people are located around the table. This issue can be approached from different angles. By incorporating sensors in the table, detection may be automated and interfaces could be made to adapt automatically. This is the techno-centric angle. I am not convinced this is the way to go, because it diminishes people’s control over the experience. I would prefer to make the interface itself adjustable in natural ways, so that people can mold the representation to suit their context. With situated technologies like this, auto-magical adaptation is an “AI-hard” problem, and the price of failure is a severely degraded user experience from which people cannot recover because the system won’t let them.
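Here is a rough sketch of the circulating-text idea from the second bullet: each character is laid out along a circle centred on the table and rotated to stay tangent to it, and advancing the phase a little every frame makes the label travel around the table. The radius, spacing and centre values are placeholders, not anything we actually built.

```python
import math

def circular_layout(text: str, radius: float, phase: float,
                    cx: float = 0.0, cy: float = 0.0):
    """Yield (char, x, y, rotation_degrees) for each character, spread
    evenly around the circle and rotated to follow its curve."""
    n = len(text)
    for i, ch in enumerate(text):
        angle = phase + 2 * math.pi * i / n
        x = cx + radius * math.cos(angle)
        y = cy + radius * math.sin(angle)
        rotation = math.degrees(angle) - 90.0  # tangent to the circle
        yield ch, x, y, rotation

# Advancing `phase` each frame circulates the text around the table:
for ch, x, y, rot in circular_layout("NOW PLAYING", radius=40.0, phase=0.1):
    print(f"{ch!r} at ({x:5.1f}, {y:5.1f}) rotated {rot:6.1f} deg")
```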

All in all the workshop was a wonderful day of tinkering with like-minded individuals from radically different backgrounds. As a designer, I think this is one of the best ways to be involved with open source projects. On a day like this, technologists can be exposed to new interaction concepts while they are hacking away. At the same time, designers get that rare opportunity to play around with technology as it is shaped. Quick-and-dirty sketches like the ones Alexander and I came up with are definitely the way to communicate ideas. The goal is to suggest, not to describe, after all. Technologists should feel free to elaborate and build on what designers come up with, and vice versa. I am curious to see which parts of what we came up with will find their way into future RoomWare projects.
