PhD update – January 2019

Thought I’d post a quick update on my PhD. Almost five months have passed since my previous post. I’ve been developing my plan further, and you’ll find an updated description below. I’ve also put together my very first conference paper, co-authored with my supervisor Gerd Kortuem. It’s a case study of the MX3D smart bridge for Designing Interactive Systems 2019. We’ll see if it gets accepted. But in any case, writing something has been hugely educational. And once I finally figured out what the hell I was doing, it was sort of fun as well. Still kind of a trip to be paid to do this kind of work. Looking ahead, I am setting goals for this year and the nearer term as well. It’s all very rough still, but it will likely involve research through design as a method and perhaps object-oriented ontology as a theory. All of this will serve to operationalise and evaluate the usefulness of the “contestability” concept in the context of smart city infrastructure. To be continued, and I welcome all your thoughts!


Designing Smart City Infrastructure for Contestability

The use of information technology in cities increasingly subjects citizens to automated data collection, algorithmic decision making and remote control of physical space. Citizens tend to find these systems and their outcomes hard to understand and predict [1]. Moreover, the opacity of smart urban systems precludes full citizenship and obstructs people’s ‘right to the city’ [2].

A commonly proposed solution is to improve citizens’ understanding of systems by making them more open and transparent [3]. For example, the GDPR prescribes people’s right to an explanation of automated decisions they have been subjected to. For another example, the city of Amsterdam offers a publicly accessible register of urban sensors, and is committed to opening up all the data it collects.

However, it is not clear that openness and transparency in and of themselves will yield the desired improvements in the understanding and governing of smart city infrastructures [4]. We would like to suggest that for a system to be perceived as accountable, people must be able to contest its workings: from the data it collects, to the decisions it makes, all the way through to how those decisions are acted on in the world.

The leading research question for this PhD is therefore how to design smart city infrastructure (urban systems augmented with internet-connected sensing, processing and actuating capabilities) for contestability [5]: the extent to which a system supports the ability of those subjected to it to oppose its workings as wrong or mistaken.

References

  1. Burrell, Jenna. “How the machine ‘thinks’: Understanding opacity in machine learning algorithms.” Big Data & Society 3.1 (2016): 2053951715622512.
  2. Kitchin, Rob, Paolo Cardullo, and Cesare Di Feliciantonio. “Citizenship, Justice and the Right to the Smart City.” (2018).
  3. Abdul, Ashraf, et al. “Trends and trajectories for explainable, accountable and intelligible systems: An HCI research agenda.” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, 2018.
  4. Ananny, Mike, and Kate Crawford. “Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability.” New Media & Society 20.3 (2018): 973–989.
  5. Hirsch, Tad, et al. “Designing contestability: Interaction design, machine learning, and mental health.” Proceedings of the 2017 Conference on Designing Interactive Systems. ACM, 2017.
