In Digital Ground, Malcolm McCullough writes about how tuning is a central part of interaction design practice: part of the challenge of any project is getting to the point where you can start tweaking the variables that determine the behaviour of your interface for the best feel.
“Feel” is a word I borrow from game design; Steve Swink wrote a whole book on it, Game Feel. It is a funny term: we are trying to make things that are purely visual behave in such a way that they evoke sensations derived from the physical realm. Many games depend heavily on getting feel right. Basically all games built on a physics simulation of some kind require good feel for a good player experience to emerge.
Physics simulations have been finding their way into non-game software products for some time now, and they are an increasingly large part of what makes a product, er, feel great. They are often at the foundation of the signature moments that set a product apart from the pack, moments also known as microinteractions. To get them just right, being able to tune well is very important.
The behaviour of microinteractions based on physics simulations is determined by variables. For example, the feel of a spring is determined by the mass of the weight attached to it, the spring’s stiffness and the friction that resists the motion of the weight. These variables interact in ways that are hard to model in your head, so to get the feel just right you need to change each variable repeatedly and re-run the simulation. This is time-consuming, cumbersome and resistant to the easy exploration of alternatives essential to a good design process.
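To make the tuning problem concrete, here is a minimal sketch of such a spring in Python. The variable names and the semi-implicit Euler integrator are illustrative choices of mine, not taken from any particular framework:

```python
# A toy damped spring animating a value toward a target. The three
# variables named above (mass, stiffness, friction) are the tuning surface.

def spring_step(position, velocity, target, mass, stiffness, friction, dt):
    """Advance the spring by one time step; return (position, velocity)."""
    displacement = position - target
    spring_force = -stiffness * displacement   # pulls toward the target
    damping_force = -friction * velocity       # resists motion
    acceleration = (spring_force + damping_force) / mass
    velocity += acceleration * dt              # semi-implicit Euler
    position += velocity * dt
    return position, velocity

# Run the spring from 0 toward 100 and watch it settle.
pos, vel = 0.0, 0.0
for frame in range(120):                       # two seconds at 60 fps
    pos, vel = spring_step(pos, vel, 100.0, mass=1.0,
                           stiffness=120.0, friction=14.0, dt=1 / 60)
print(round(pos, 2))  # close to 100 once the motion has died down
```

Even in this toy version, small changes to stiffness or friction shift the motion from snappy to sluggish to bouncy, which is exactly why tuning by trial and error gets tedious.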
In The Setup, game designer Bennett Foddy talks about a way to improve on this workflow. Many if not all of his games are playable physics simulations with punishingly hard controls. He suggests using a hardware interface (a MIDI controller) to tune the variables that determine a game’s feel while it runs. In this way the loop between changing a variable and seeing its effect in-game is dramatically shortened, and many different combinations of values can be explored easily. Once a satisfactory set of values has been found, it can be written back into the code for future use.
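Here is a rough sketch of what that wiring could look like in Python, assuming a MIDI controller and the mido library; the CC numbers and parameter ranges are placeholders for whatever your hardware actually sends:

```python
import mido

# Current values of the spring variables from the earlier sketch.
params = {"mass": 1.0, "stiffness": 120.0, "friction": 14.0}

# Map a controller knob (its CC number) to a parameter and its range.
# These assignments are assumptions; check what your controller emits.
knobs = {
    1: ("mass", 0.1, 5.0),
    2: ("stiffness", 10.0, 500.0),
    3: ("friction", 0.0, 50.0),
}

with mido.open_input() as port:  # opens the default MIDI input
    for msg in port:
        if msg.type == "control_change" and msg.control in knobs:
            name, lo, hi = knobs[msg.control]
            # MIDI CC values run from 0 to 127; rescale to the range.
            params[name] = lo + (msg.value / 127.0) * (hi - lo)
            print(params)  # feed these into the running simulation
```

Run this next to the simulation and every twist of a knob updates the live values, so the change-and-observe loop collapses to almost nothing.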
I do believe such a setup is still non-trivial to make work with today’s tools. A quick check confirms that Framer, for example, does not have OSC support. There is an opportunity here for prototyping environments such as Framer to support it. The approach is not limited to motion-based microinteractions either; it can be extended to the tuning of variables that control other aspects of an app’s behaviour.
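For what it’s worth, the receiving side need not be complicated. Here is a sketch of an OSC listener using the python-osc library; the addresses and port are assumptions of mine, not anything Framer or another tool actually exposes:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Tunable values for some behaviour of the app, e.g. motion sensing.
params = {"sensitivity": 0.5, "threshold": 1.0}

def set_param(address, value):
    # "/tune/sensitivity" -> params["sensitivity"]
    params[address.rsplit("/", 1)[-1]] = value
    print(params)

dispatcher = Dispatcher()
dispatcher.map("/tune/sensitivity", set_param)
dispatcher.map("/tune/threshold", set_param)

# Any OSC-speaking controller or app can now push values to port 9000.
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```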
For example, when we were making Standing, we would have benefited hugely from hardware controls to tweak the sensitivity of its motion-sensing functions while using the app. Instead, we were forced to repeatedly change numbers in the code and rebuild the app again and again. It was quite a pain to get right, and to this day I have the feeling we could have made it better if only we had had the tools to do it.
Judging from snafus such as the poor feel of the latest Twitter desktop client, there is a real need for better tools for tuning microinteractions. Just as pen tablets have become indispensable to those designing the form of user interfaces on screens, I think we might soon find a small set of hardware knobs on the desks of designers working on the behaviour of user interfaces.