Robots, immaterials & control

I made this presentation at the Canadian Centre for Architecture on 14 February. The purpose of the presentation was to inspire architecture students for a three-day competition to envision how information technology will change the architectural experience of the city of Montreal in 50 years.

Here are the slides, or you can watch the whole talk here (starts at 44:43).

Sneak peek of I for Interface walkshop materials

The studio received a sneak peek of the participant materials for the I for Interface walkshop tomorrow (2pm, 1 December 2012) at the CCA.

The materials consist of a beautiful map and stickers of urban interfaces for participants in the tour. There is also a large scale map we will use for a group discussion about what we observe after the tour. The materials have been designed by the supremely talented Jessica Charbonneau from the CCA / TagTeam Studio.

If you haven’t signed up yet, there may still be a few places available. To register for the event please call +1 (514) 939-7002 or email

Neil Clavin Studio presents ‘Interface’ walkshop at CCA

Neil Clavin Studio invites you to participate in ‘Interface’, a walkshop on urban interfaces in Montreal, at the Canadian Centre for Architecture (CCA) on Saturday 1 December 2012.

The walkshop is a guided tour around Montreal observing and photographing urban interfaces such as CCTV cameras, environment sensors and public displays. Participants will develop an awareness of urban interfaces and discuss how networked technology influences their experience of the city. The walkshop is aimed at professionals, students and citizens interested in design, urbanism and technology.

The ‘Interface’ walkshop is part of the ABC: MTL exhibition currently running at the CCA. ABC: MTL is a self-portrait of Montreal, an urban abecedary and open-source initiative that maps contemporary Montreal in a diversity of ways and media.

The ‘Interface’ walkshop takes place at the CCA, Montreal at 2pm on Saturday 1 December. A follow-up co-design session, where participants can explore potential new interfaces to improve the urban experience of Montreal, takes place on Saturday 8 December.

To register for either event please call +1 (514) 939-7002 or email. The event is free to attend but places are limited.

Natural input on film

View on Vimeo

Natural input is the future of interfaces – from air gestures to facial gestures, from touch input to voice input. Apple appears to be pursuing a natural input strategy for the evolution of its products – from iOS gestures and Siri voice commands to filings for facial gesture patents for software to respond to a user’s emotional state while using software.

Hardware manufacturers are seeking new ways to differentiate their products as we move away from abstracted towards more direct ways of interacting with applications and systems. This leads to shorter learning curves for input and more intuitive and efficient modes of interaction. Early mouse users had to be taught to move the mouse on a table top rather than placing it directly on the screen. Leap Motion’s system evolved from its inventor’s frustration with the arcane combinations of selections and mouse clicks needed to shape 3D models on screen – manipulations that would take seconds with a piece of clay.

What are other possible forms of input for the future? How can emerging technologies be applied to create natural input interfaces?

If we make the experience of interacting with computing devices closer to how we interact with people, animals and physical objects, we move towards a more cohesive and natural model of interaction. This is a model where we continue to point at, grab and move objects on screen as if they were physical artifacts, or speak to interfaces as if they were obedient servants. It is also a model which moves towards a more subtle, nuanced language of non-verbal communication like tone of voice, posture and even genetic identifiers.

Examining interfaces in science fiction the studio creates a ‘future ethnographic study‘* and analysis of how we might use natural input interfaces tomorrow.


Star Wars, 1977. [Film] George Lucas, USA: 20th Century Fox

The practice drone tracks the user’s posture, looking for undefended parts of the body. It may use a motion sensor to target moving objects unless line of sight is blocked by bright light sources.


Blade Runner, 1982. [Film] Ridley Scott, USA: Warner Brothers

The user interacts with a system for examining 3-dimensional photographs via voice input. The user can ask the system to pan, zoom in on and enhance areas of the photograph, and can also request a print of a selected area.

Emotion (including engagement, eye gaze, paralanguage, clothing)

Scott, T. (2012) Prometheus Viral – “Quiet Eye”. [video online]

Software analyses video input of a user for facial gestures, gaze aversion, and voice set and voice quality. The combination of inputs provides an analysis of the emotional state of the user including honesty, coercion, excitement, malintent and anxiety. The software also analyses clothing and accessories worn by the user which may imply other character traits. In this example the software detects a cross pendant worn by the subject, which may signify some of the subject’s beliefs. Facial recognition processes confirm the authenticity of the user against a database of similarly named individuals.

Blade Runner, 1982. [Film] Ridley Scott, USA: Warner Brothers

The hardware helps an interviewer analyse a subject’s emotional response through involuntary iris dilation and changes in respiration.

2001: A Space Odyssey, 1968. [Film] Stanley Kubrick, Metro-Goldwyn-Mayer

The system uses visual and voice input to interface with and analyse the emotional state of the user. The user’s emotional state is analysed via voice set and qualities, posture and respiration. The system also uses visual input to lip read when audio is obscured.


Gattaca, 1997. [Film] Andrew Niccol, USA: Columbia Pictures

An electronic barrier authenticates users via DNA analysis against a database of approved users.

Minority Report, 2002. [Film] Steven Spielberg, USA: 20th Century Fox

Retina sensors identify the user to present targeted advertising messages and seamlessly approve payments for services such as public transportation.


Minority Report, 2002. [Film] Steven Spielberg, USA: 20th Century Fox

The user stands in a situated zone for interacting with the system, manipulating 2-dimensional projections on a virtual panoramic screen. This is a ‘work station’ for focussed, non-casual computing. Perhaps the situational aspect is due to the complexity and expense of the technology, or the need for security and access to the database. The ‘light gloves’ worn by the user seem to indicate a ‘master user’ status to prevent interference from others in the space.

Reportedly the actor found filming these scenes so tiring he needed to take breaks after only a few minutes.  

Iron Man 2, 2010. [Film] Jon Favreau, USA: Paramount Pictures


The user manipulates 3-dimensional projected gestural interfaces distributed throughout a dedicated workspace.

Prometheus, 2012. [Film] Ridley Scott, USA: 20th Century Fox

The user interacts with a projected gestural interface in an attentive seated position. The complex layering may be a special setting for android users with faster cognitive processing capability than humans.

*This post is inspired by Joe Malia’s great study on video conference systems on film.

Week 10: In motion

Munich

It’s been a busy week, with the feeling that the constant push of the studio is starting to create some significant motion.

There were some great responses to the Urban Interface Safari walkshop and the studio is in dialogue with a couple of organisations about some pretty damn exciting projects for networked objects and cities. Let’s see what develops.


Week 9: Photos & analysis – Urban Interface Safari walkshop


On Sunday 11 March 2012 a group of designers and a digital journalist toured Cologne for an Urban Interface Safari ‘walkshop’ (in collaboration with Bottled City) to find, document and discuss interfaces in public space. You can retrace our tour via either this Flickr map or this Google map and see all of the photos here. Below is a summary of the findings and discussion. I’d like to thank everyone who took time to join, take photos and inform fresh thinking on how we can integrate public interfaces into our city to be more accessible, pleasant, unobtrusive and convenient: Martin Beyerle, Marcus Bösch, Andreas Echterhoff, Jan Güra, Katharina Schlösser and Jan Schröder.

The open data, low input, remote access future of public interfaces

Today our cities bristle with networks of cameras, sensors and displays we barely notice. The designers, planners and manufacturers of these public interfaces shape our experience of the city.

Cities and services project public, perception-forming and long-lived artefacts into the urban sphere. These organisations must make these interfaces effective for the citizens and customers who use and ultimately pay for them. For cities and services to fully serve the public, we must design public interfaces for minimal user input, open data, clear information design and remote access via mobile devices.

Release open data from public interfaces to citizens

Some public interfaces clearly display their functions – information displays, clocks and digital thermometers – but many sensors are anonymous, perching quietly on walls and cornices. What data are they collecting? Where is the data going? Who is using it? How can we access it? Public sensors rarely declare their function, who uses their information or how we can access it. Public sensors could be given IP addresses so that their data can be accessed via a mobile device – e.g. one could read the atmospheric conditions for the corner one is standing on from the nearby wind speed and barometric pressure sensors.

Public interface APIs can also be made accessible via the internet for personal use and re-appropriation via services like Pachube.
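As a sketch of what this open access could look like: services in the Pachube mould expose each sensor feed as JSON over HTTP, with a list of datastreams and their current values. The endpoint, feed shape and field names below are assumptions modelled on Pachube’s v2 JSON format, and the location and readings are invented for illustration.

```python
import json

# A hypothetical feed would be fetched with something like:
#   GET https://api.pachube.com/v2/feeds/<feed_id>.json  (plus an API-key header)
# Here we parse a sample payload directly instead of calling the network.

def latest_readings(feed):
    """Map each datastream id (e.g. 'wind_speed') to its current value."""
    return {ds["id"]: ds["current_value"] for ds in feed.get("datastreams", [])}

# An invented feed for the street corner one is standing on:
sample_feed = json.loads("""
{
  "title": "Street-corner weather feed (hypothetical)",
  "datastreams": [
    {"id": "wind_speed", "current_value": "4.2",  "unit": {"symbol": "m/s"}},
    {"id": "pressure",   "current_value": "1013", "unit": {"symbol": "hPa"}}
  ]
}
""")

print(latest_readings(sample_feed))
# prints {'wind_speed': '4.2', 'pressure': '1013'}
```

A mobile client pointed at a sensor’s address could render exactly this kind of summary in place, which is all the ‘IP address on a cornice’ idea really requires.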

CCTV cameras were a prominent feature of our walkshop, with a taxonomy of sizes and shapes related to civic, corporate and private use. Some cameras attempted to camouflage themselves, adopting the colouring of their surroundings.

Again there was no declaration of what data was being collected or how the public could access that data. Perhaps it is this lack of declaration of the use of CCTV surveillance which makes many of us so paranoid about its presence. To this extent CCTV permeates popular culture as a counter-culture icon, from a music video composed of CCTV footage to its presence in street art. Walkshoppers in London can look out for flying CCTV drones during the London Olympics this year.



Design public interfaces for minimal input and touchless interaction where possible

In the discussion following our walkshop some of us unexpectedly bonded over a shared fear of contagion through touching public interfaces from ATMs to transit, from the unpleasantness of an ATM on a Saturday night to commuters sneezing onto surfaces.

This fear of touching public objects was contrasted with an enthusiasm for touchless interfaces like Near Field Communication (NFC), which we saw at the central train station, or even QR codes (despite their many poor deployments, as highlighted on WTF QR Codes).

Scan with Care

This desire for touchless interfaces makes sense in the urban context as seen in existing successful interfaces like motion and pressure sensors.

These interfaces ‘fade into the noise of the city’. The desire for touchless or gesture interfaces also calls into question the idea of large-scale interactive outdoor public displays, with the problems of ‘multiple inputs and potentially thousands of users sharing the same surface’ as observed by Sami Niemelä of Nordkapp.

A preferable model may be calmer, more ambient interfaces, like the automation of flushing, taps and paper towels in public bathrooms as observed by Dan Saffer.

Provide clear information design for public displays and extend interfaces to mobile devices

Faced with a lonely and unloved public wifi kiosk, complete with touch screen and web camera, we discussed why such interfaces are not used. The conclusion was that as a public interface it adds no value to the user. The required functionality is already available in a more personal, private and hygienic form in a smartphone. There is no need to use public communication devices or even public wifi.

More value is added by interfaces like an NFC touchpoint used via the mobile phone, where the phone acts as a ‘remote control’ for interacting with the city – ‘checking in’ on train journeys and car sharing services.


Although access to remote information like train times is greatly valued, there is still a real need for clear, prominent, well-designed public displays. When dashing through the station there is neither the time nor the co-ordination to start up a smartphone app and research the departure time and platform for the next fast train. Of course there is potential for more predictive design of such smartphone applications, sensing the context of use via location or an intimacy with the user’s daily routine and calendar.


As public interfaces permeate our cities and further inform our direct experience of the city, care must be taken with their design, development and deployment. Public interfaces must be designed for minimal – preferably touchless – interaction, developed for open access to public data, and deployed as clearly designed information systems extending their interfaces contextually to mobile devices.

Week 8: Urban Interface Safari – Sunday 11 March 2012

Sunday 11 March 2012 is the first Urban Interface Safari in Cologne – it should be an interesting and fun tour!

A few reminders:

  • We meet at 14:00 (2pm) outside the Cologne Hbf in front of Zeitcafe (look at that big arrow in the picture!).
  • We wait until 14:15 for any latecomers and then start the tour.
  • If you are taking photos with your smartphone please turn on location information. This way we can look at the location of photos on a map later.
  • Bring a pen.
  • It will be 11 °C tomorrow, so wear some sensible clothes.
  • I look a bit like this…

Finally the worksheets (walksheets?) are back from the printer so we are all set. See you tomorrow!