Lecture 07: Interfaces and Devices of Control

Some of these notes were originally produced by Owen Green.

Preamble

What do we mean by ‘interface’?

It’s easy to collapse the idea to just being about knobs and dials, but this approach misses a lot and tends to produce slightly disappointing results. Whilst the bulk of this session will revolve around practical demonstrations of getting data from the outside world into Max, it is important that we first take the time to think properly about what we are trying to achieve with our interfaces and interfacing.

Function & Affordance

[Figure: tech1]

The orthodox conception of technology assumes a simple, static model in which technologies represent functions in an uncomplicated way. By this account the ‘interface’ is just something that mediates our actions, and can be judged more or less successful according to how well it enables one to perform some function. This model only really works for relatively simple cases: by and large, the notion of what function a technology performs simply isn’t so clear cut (especially for things like musical instruments).

[Figure: tech2]

One way of approaching the richness of the topic beyond simply knobs and dials is through the concept of affordances, which comes from James Gibson’s work developing an ecological account of perception. Affordances give us a more sophisticated idea of what something is supposed to do by highlighting that what something can be used for is actually a function of the relationship between an organism, an object and wider environmental factors (including the autobiography of the organism). For instance, a chair affords sitting to us humans because its shape and our shape make this an evident possibility, but also because we have a personal and cultural history of chairs being things to sit on. To a woodworm, the affordances are wholly different.

By this account, interface takes on a much more dynamic role as something that arises between a person and a thing: what this thing seems able to do is shaped in part by its actual physical characteristics, but also by our curiosity, desires and sense of possibility. This recalls the concept of intra-action developed by the theorist-physicist Karen Barad.

Applied to our working context, then, our point is that the way we design interfaces for our sound-making gizmos, especially when dealing with software, is a more sensitive set of problems than just finding something that works. The decisions we make will promote some ways of interacting and occlude others, and precisely how this happens is a function of us as cultural beings and individuals, but also of the kind of work we’re trying to do (e.g. making something to support making a collaborative soundtrack for a film…).

Gesture

A major concern when crafting interfaces for soundful software revolves around the relationship between the characteristics of our physical movement and the resulting sound.

How these relationships work out has a significant bearing on how playable we find our interface, as well as how intuitive and how flexible.

Things to consider in particular are whether the interface affords being timely for the context we’re working in:

  • Can we get to where we need to be, sonically, in the time available?
  • Can we start and stop with appropriate speediness?

and the extent to which a sound seems appropriate to the physical movements we’re making:

  • Do differences in physicality produce desirable differences in sound?
  • Does the system respond to our excitement?

Mapping

The standard vocabulary for engineering these relationships between physical and sonic gesture centres on the concept of mapping.

Conceptually, we regard the problem as one of relating a set of controls (data from somewhere) to a set of parameters of our synth / system. I’m sure this is familiar to many of you who have used, say, Ableton Live.

A mapping describes the way that an input relates to an output. We’ve already seen some of this in previous weeks, where we’ve scaled inputs into some range, or seen that we can interpolate between presets with pattrstorage. Viewed in this way, we have a powerful toolbox of techniques inherited from control systems engineering.
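
To make this concrete, here is a minimal sketch in Python (rather than in Max itself) of what a mapping amounts to: a function from control values to parameter values. Both the range-scaling and the preset interpolation above are instances of this; all names and ranges here are invented for illustration.

    # A mapping is just a function from control values to parameter values.
    # scale() mirrors what Max's [scale] object does; interpolate_presets()
    # mirrors interpolating between pattrstorage presets.

    def scale(x, in_lo, in_hi, out_lo, out_hi):
        """Linearly map x from [in_lo, in_hi] to [out_lo, out_hi]."""
        return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

    def interpolate_presets(a, b, t):
        """Blend two presets (dicts of parameter values) by t in [0, 1]."""
        return {k: a[k] + t * (b[k] - a[k]) for k in a}

    # e.g. a MIDI value (0-127) driving a filter cutoff (20 Hz to 20 kHz)
    cutoff = scale(64, 0, 127, 20.0, 20000.0)

    bright = {"cutoff": 8000.0, "resonance": 0.2}
    dark = {"cutoff": 400.0, "resonance": 0.7}
    halfway = interpolate_presets(dark, bright, 0.5)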

We can conceptualise different kinds of mappings between controls and parameters:

  • one to one: a single control drives a single parameter
  • one to many: a control drives many parameters
  • many to one: different controls affect a single parameter
  • many to many: some arbitrarily complex relationship between controls and parameters.

This separation of concerns certainly makes engineering sense in terms of keeping the problems tractable and maximising the potential for re-using software components, but it does miss out on the textural richness of real physical systems, where what we regard as ‘controls’ may themselves be interestingly cross-coupled rather than totally independent, for instance.
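
To make those categories concrete, here is a hedged Python sketch of a one-to-many and a many-to-one mapping (parameter names and weightings are invented for the example):

    # Two of the mapping topologies above, sketched in Python.

    def one_to_many(control):
        """One control (0-1) drives several synthesis parameters at once."""
        return {
            "grain_size": 10.0 + control * 490.0,  # ms
            "density": 1.0 + control * 99.0,       # grains per second
            "spread": control ** 2,                # nonlinear: slow start, fast finish
        }

    def many_to_one(pressure, tilt):
        """Two controls (each 0-1) combine to drive a single parameter."""
        return 0.7 * pressure + 0.3 * tilt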

This leads to an important idea to bear in mind: actual physical actions may be radically non-linear, surprising, noisy, and extensible in ad hoc ways, whereas systems designed according to an engineering orthodoxy may be sterile, unexciting, closed. A pressing question is always how to preserve the excitement of the physical and the messy, to which there are essentially two pathways: engineering less (e.g. not de-noising) or engineering much, much more (e.g. machine learning).

Your options in designing a computer-sound control system

Option 1 – controller to sound

A bespoke piece or project designed specifically for the interface you have to hand:

  • benefits
  • challenges

Option 2 – some control is needed but you don’t know what it will be

A project that needs control but may have different controllers as the piece and your studio evolves

  • benefits
  • challenges

Option 3 – controller needs to do multiple things simultaneously

  • benefits
  • challenges

Option 4 – a change in controller will radically change how things are played but not the system / sound itself

  • benefits
  • challenges

Demos

Getting things into Max

  • MIDI (see the sketch after this list)
  • Phone
  • HI (joysticks and other human-interface devices, via Max’s hi object)
  • Others? (Camera? Microphones?)
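
As context for the MIDI route, here is a minimal sketch of watching controller data arrive, written in Python with the mido library (pip install mido python-rtmidi) rather than in Max, where ctlin and notein do the equivalent job. The port choice is an assumption: mido simply picks the first available input.

    # Print incoming MIDI control-change messages, for illustration only.
    import mido

    with mido.open_input() as port:   # first available MIDI input port
        for msg in port:              # blocks, yielding messages as they arrive
            if msg.type == 'control_change':
                print(f"cc {msg.control} = {msg.value} (channel {msg.channel})")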

Dealing with Data

  • Scaling, smoothing, labelling, routing
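
As an illustration of the smoothing and routing steps, here is a sketch in Python (in Max, slide and route do similar jobs; all names below are invented for the example):

    # A one-pole lowpass (exponential moving average) is the workhorse
    # smoother for jittery control data; routing is dispatch on a label,
    # much like Max's [route].

    class Smoother:
        """y[n] = y[n-1] + a * (x[n] - y[n-1]); smaller a = smoother but laggier."""
        def __init__(self, a=0.1):
            self.a = a
            self.y = 0.0

        def __call__(self, x):
            self.y += self.a * (x - self.y)
            return self.y

    def route(label, value, handlers):
        """Send a labelled value to the handler registered for that label."""
        if label in handlers:
            handlers[label](value)

    smooth = Smoother(a=0.2)
    handlers = {"cutoff": lambda v: print("cutoff ->", round(v, 1))}
    for raw in [0, 127, 120, 125]:    # a jittery control stream
        route("cutoff", smooth(raw), handlers)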

Mapping

  • Let’s try to make an example together
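
One possible skeleton for that example (a sketch only; the names, ranges and exponential curve are invented): take a raw controller value, normalise it, smooth it, then fan it out one-to-many.

    # raw 0-127 value -> normalise -> one-pole smooth -> several parameters
    state = {"smoothed": 0.0}

    def step(raw, a=0.1):
        x = raw / 127.0                                    # normalise to 0-1
        state["smoothed"] += a * (x - state["smoothed"])   # tame jitter
        s = state["smoothed"]
        return {
            "cutoff": 20.0 * (1000.0 ** s),  # exponential sweep, 20 Hz to 20 kHz
            "gain": s,
        }

    for raw in [0, 64, 127]:
        print(step(raw))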

Tools of interest

Iannix – sequencing – www.iannix.org/en/
Antescofo – score following – forumnet.ircam.fr/product/antescofo-en/
Qlab – show control – figure53.com/qlab/ (can pay per day for full functionality)
Osculator – osculator.net/download/ (it’s paid, but quite cheap given the hours required to make something similar in Max/PD)

— what else?