So much innovative technology has been developed for the Brain
Opera that it is impossible to mention it all here. Brain Opera
technology is a natural extension
of the Hyperinstruments project, started at the MIT Media Lab
in 1986 by Tod Machover and Joe Chung, and joined by Neil Gershenfeld
in 1991 and myself in 1993. At first designed to enhance the virtuosity
of some of the world's greatest performers, from Yo-Yo Ma to Prince,
hyperinstruments started evolving in 1991 towards the development
of expressive tools for non-professional musicians. The Brain
Opera is the culmination to date of this work, and points the
way to the further development of expressive objects (furniture,
remote controls, clothing, etc.) and responsive environments (including
living rooms, concert halls, and department stores).
Among the more significant new technical developments for the
Brain Opera are the Harmonic Driving system, the Melody Easel,
the Rhythm Tree, the Gesture Wall, the Digital Baton, the Singing
and Speaking Trees, and the Sensor Carpet. Among the project's
numerous software innovations are the Singing Trees (analysis
of every nuance and "feeling" of vocal quality); Harmonic
Driving (parametric algorithms that allow a piece of music to
be shaped and "personalized" while it is playing); the
Rhythm Tree (which analyzes multiple-person behavior to create
a complex systemic reaction); the Performance Hyperinstruments
(which forge an array of continuous gesture and discrete positional
information into intuitive, natural controls); and the entire
Brain Opera system, which is itself a complex networked environment
capable of integrating new elements into an existing structure
automatically or in human-assisted fashion.
Below is a graphical and technical discussion of the technological
developments of each of the individual Brain Opera experiences:
1) Forest Stations
A floormat switch detects the user's presence, and starts the
experience. The user then navigates through the interactive database
using a hand-held piezoresistive mouse that detects the center
of pressure of the thumb, and moves the pointer accordingly on
an embedded color LCD VGA display.
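To make the mapping concrete, here is a minimal C sketch of how a
center-of-pressure reading might drive the pointer; the normalized
input range, dead zone, and gain are illustrative assumptions, not
the actual Forest Station values.

    /* Sketch: center of pressure (cx, cy in -1..1, 0 = thumb centered)
       to pointer displacement. DEAD_ZONE and GAIN are assumed values. */
    #include <stdio.h>
    #include <math.h>

    #define DEAD_ZONE 0.10f   /* ignore tiny off-center pressure */
    #define GAIN      12.0f   /* pixels per update at full deflection */

    static void pressure_to_pointer(float cx, float cy, int *dx, int *dy)
    {
        *dx = (fabsf(cx) > DEAD_ZONE) ? (int)(cx * GAIN) : 0;
        *dy = (fabsf(cy) > DEAD_ZONE) ? (int)(cy * GAIN) : 0;
    }

    int main(void)
    {
        int dx, dy;
        pressure_to_pointer(0.5f, -0.2f, &dx, &dy);  /* thumb up and right */
        printf("move pointer by (%d, %d)\n", dx, dy);
        return 0;
    }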
2) Harmonic Driving
The presence of a seated participant is detected when a light
beam pointed across the chair is interrupted. The user controls
the experience with a novel joystick made from a large, bendable
spring. Two-axis bending angles are measured using capacitive
sensing to detect the relative displacement between the spring's
coils at its midpoint. Twist is also measured with a potentiometer
that rotates through the relative angle between the top and bottom
of the spring.
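As a rough illustration, a two-point linear calibration can convert
the raw capacitive readings into bend angles; the ADC values, the
assumed linearity, and the 45-degree range below are inventions for
the sketch, not measured properties of the spring sensor.

    /* Sketch: raw capacitive coil-displacement reading to bend angle,
       via linear interpolation between calibrated endpoints. */
    #include <stdio.h>

    typedef struct {
        float rest;       /* raw reading with the spring straight */
        float full;       /* raw reading at maximum bend */
        float max_deg;    /* bend angle (degrees) at maximum bend */
    } axis_cal;

    static float bend_angle(float raw, const axis_cal *cal)
    {
        return (raw - cal->rest) / (cal->full - cal->rest) * cal->max_deg;
    }

    int main(void)
    {
        axis_cal x = { 512.0f, 900.0f, 45.0f };  /* hypothetical 10-bit ADC values */
        axis_cal y = { 520.0f, 150.0f, 45.0f };
        printf("bend: x=%.1f deg, y=%.1f deg\n",
               bend_angle(700.0f, &x), bend_angle(400.0f, &y));
        return 0;
    }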
3) Melody Easel
We use a pressure-sensitive IntelliTouch Screen from ELO, based
on ultrasound propagation through the screen surface. This device
produces 8 bits of pressure information, along with precise x
and y screen coordinates. The figure shows an earlier version
of the Melody Easel, where we postulated the addition of a shunt-mode
Fish sensor array placed around the monitor to sense noncontact
hand gestures above the screen. The Fish was dropped from the
final implementation, since the touch screen itself produced ample
parameters for the music generation software and proved a
sufficiently satisfying interface.
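A minimal sketch of how a touch event might feed the music software
follows; the particular assignments (pitch from x, brightness from y,
volume from pressure) are invented for illustration and are not the
Melody Easel's actual mapping.

    /* Sketch: turn one touch event (x, y, 8-bit pressure) into
       music-control parameters in MIDI-style 0..127 ranges. */
    #include <stdio.h>

    typedef struct { int x, y; unsigned char pressure; } touch_event;

    static void touch_to_music(const touch_event *t, int screen_w, int screen_h,
                               int *pitch, int *brightness, int *volume)
    {
        *pitch      = 36 + (t->x * 48) / screen_w;   /* 4 octaves across screen */
        *brightness = (t->y * 127) / screen_h;       /* vertical position */
        *volume     = (t->pressure * 127) / 255;     /* scale 8-bit pressure */
    }

    int main(void)
    {
        touch_event t = { 320, 120, 200 };
        int pitch, brightness, volume;
        touch_to_music(&t, 640, 480, &pitch, &brightness, &volume);
        printf("pitch=%d brightness=%d volume=%d\n", pitch, brightness, volume);
        return 0;
    }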
4) Rhythm Tree
A simple microprocessor on each pad analyzes the signal coming from a piezoelectric (PVDF) strip, which picks up the strike. A set of simple parameters is extracted from a 5 ms sample of the PVDF signal after a significant peak has been detected, indicating a valid hit. These parameters include the polarity of the initial PVDF signal peak (indicating a top vs. side impact), the number of zero crossings detected (distinguishing a sharp elastic hit from a dull slap with the hand remaining on the pad), and the net integrated signal amplitude (producing 14 bits of velocity information).
After a fast, bit-slice poll from the bus host, the struck pads send their data across a shared RS-485 serial network to a host processor, which formats the data into MIDI and passes it to the main computer running the ROGUS music generation software. To simplify cabling, up to 32 pads can be daisy-chained (like a string of Christmas lights) onto a single host and bus line; 10 such strings will run in the Brain Opera Lobby.
Each pad also houses a bright LED, which can be illuminated with a dynamically variable intensity. The pads are completely programmable via MIDI system-exclusive commands, and we have written a Visual Basic application to adjust the parameters (e.g. trigger sensitivity, light flash options, trigger rejection flags, integration time) for individual pads or groups of pads, allowing the Rhythm Tree to be brought rapidly into a working configuration. In addition to these capabilities, the rhythm pad string used with the performance instruments is augmented with Gesture Wall sensors, enabling hand motion above the pads to be tracked before the pads are struck.
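The per-pad analysis can be sketched in C roughly as follows; the
signed 8-bit sample format, the buffer length, and the use of the
first sample's sign as the initial peak polarity are assumptions for
illustration, not the actual PIC firmware.

    /* Sketch: extract the three hit parameters described above from a
       buffer of PVDF samples captured after a valid-hit peak. */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        int positive_peak;   /* 1 = initial peak positive (top hit), 0 = side hit */
        int zero_crossings;  /* many = sharp elastic hit, few = dull slap */
        int velocity;        /* integrated |amplitude|, clamped to 14 bits */
    } hit_params;

    static hit_params analyze_hit(const signed char *buf, int n)
    {
        hit_params p = {0, 0, 0};
        long sum = 0;

        p.positive_peak = (buf[0] >= 0);             /* polarity of first sample */
        for (int i = 0; i < n; i++) {
            sum += abs(buf[i]);                      /* integrate amplitude */
            if (i > 0 && (buf[i] >= 0) != (buf[i-1] >= 0))
                p.zero_crossings++;                  /* count sign changes */
        }
        p.velocity = (sum > 0x3FFF) ? 0x3FFF : (int)sum;  /* clamp to 14 bits */
        return p;
    }

    int main(void)
    {
        signed char sample[8] = { 90, 40, -60, 30, -20, 10, -5, 2 };
        hit_params p = analyze_hit(sample, 8);
        printf("top=%d crossings=%d velocity=%d\n",
               p.positive_peak, p.zero_crossings, p.velocity);
        return 0;
    }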
5) Gesture Wall
A performer steps onto a plate that has a harmless low-frequency
(50 kHz), low-voltage (10 V) RF signal applied to it.
This signal then couples through the performer's shoes and
is broadcast through their body to a set of four pickup antennas
located on goosenecks around the perimeter of the screen. These
signals change with the distance of the performer from the respective
sensors (an LED mounted in each sensor glows with increasing intensity
as the performer's body approaches). The sensor data is transferred
to a PC running ROGUS, where it is analyzed for gestural characteristics.
Before starting the experience, the user must calibrate out the
coupling strength of their shoes and body mass, which vary considerably
from person to person. This is accomplished by touching a reference
pickup electrode, which adjusts the transmitted signal such that
every participant radiates equally. A Fish sensor with linearizing
log amplifiers is used for the sensing hardware, just as with
the Sensor Chair. Hand-on-calibrator state is detected by breaking
an IR beam directed across the calibrator surface.
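The calibration step might look like the sketch below, which steps
the transmitter drive until the reference pickup reads a fixed
target; the hardware hooks, target value, and step size are
hypothetical stand-ins for the real Fish autocalibrator.

    /* Sketch: adjust transmit drive so every participant radiates
       equally. The pickup is simulated as a fixed fraction of the
       drive level, standing in for shoe/body coupling. */
    #include <stdio.h>

    #define TARGET     512   /* desired reference-pickup reading (assumed) */
    #define TOLERANCE    4
    #define DRIVE_MAX 1023

    static int g_drive = DRIVE_MAX / 2;
    static void set_transmit_drive(int level) { g_drive = level; }
    static int  read_reference_pickup(void)   { return (g_drive * 3) / 4; }
    static int  hand_on_calibrator(void)      { return 1; } /* IR beam broken */

    static int calibrate(void)
    {
        while (hand_on_calibrator()) {
            int err = TARGET - read_reference_pickup();
            if (err > -TOLERANCE && err < TOLERANCE)
                break;                                  /* converged */
            int next = g_drive + ((err > 0) ? 8 : -8);  /* step toward target */
            if (next < 0) next = 0;
            if (next > DRIVE_MAX) next = DRIVE_MAX;
            set_transmit_drive(next);
        }
        return g_drive;
    }

    int main(void) { printf("final drive=%d\n", calibrate()); return 0; }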
6) Digital Baton
A small microprocessor in the baton samples signals from 5 pressure-sensitive
resistors potted into the baton skin (to measure finger and hand
pressure) and 3 orthogonal accelerometers in the baton base (to
measure sweeping gestures and beats). These signals are sent through
a wire to the host computer running ROGUS. A camera housing a
position-sensitive photodiode looks at an infrared LED mounted
at the baton tip. This camera is sensitive only to the 20 kHz
signal emitted by the LED; all other light sources are ignored.
The photodiode in the camera directly produces a signal that determines
the horizontal and vertical coordinates of the baton tip; no video
processing is required. See below for an older conceptual implementation
of the Digital Baton.
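For a lateral-effect position-sensitive photodiode, each axis
coordinate follows from the normalized difference of the
photocurrents at the two opposing electrodes; the sketch below
applies that standard relation to made-up currents, assuming a
duolateral device.

    /* Sketch: baton-tip coordinates from a duolateral PSD. Position
       along an axis is proportional to (i2 - i1) / (i2 + i1), where
       i1, i2 are the photocurrents at the two opposing electrodes. */
    #include <stdio.h>

    static double psd_axis(double i1, double i2)
    {
        double total = i1 + i2;
        return (total > 0.0) ? (i2 - i1) / total : 0.0;  /* -1..1 */
    }

    int main(void)
    {
        /* Hypothetical photocurrents (arbitrary units) after
           synchronous detection of the 20 kHz LED signal. */
        double x = psd_axis(0.8, 1.2);   /* tip right of center */
        double y = psd_axis(1.5, 0.5);   /* tip below center */
        printf("baton tip at (%.2f, %.2f)\n", x, y);
        return 0;
    }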
7) Audience Sensing in the Performance Space
A Sensor Floor, composed of a 6 x 10 foot mat surface atop a matrix
of 64 pressure-sensitive piezoelectric (PVDF) wires, measures
the position and intensity of footsteps, turning them into MIDI
note events. Upper-body motion is sensed above this region
with a pair of quadrature-demodulated 2.6 GHz Doppler radars with
beams formed by flat, 4-element micropatch arrays. These devices
produce MIDI controller values corresponding to the amount, velocity,
and direction of motion, projected onto the beam axis (normal to
the radiators).
In addition, an 8-channel MIDI-controlled ranging sonar system
(using simple pulse-echo detection with 40 kHz piezoceramic heads
and a time-varied-gain (TVG) amplifier) has been developed to monitor
the distance to people and objects from 1 to 25 feet away.
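One standard way to read direction from quadrature Doppler outputs
is the rotation sense of the I/Q vector; the sketch below applies
that cross-product discriminator to invented samples, and may well
differ from the analog processing actually built for these radars.

    /* Sketch: direction and amount of motion from consecutive I/Q
       Doppler samples. sign(I*dQ - Q*dI) gives the rotation sense
       of the I/Q vector, hence approach vs. recession. */
    #include <stdio.h>
    #include <math.h>

    static int doppler_step(double i0, double q0, double i1, double q1,
                            double *amount)
    {
        double cross = i0 * (q1 - q0) - q0 * (i1 - i0); /* rotation sense */
        *amount = sqrt(i1 * i1 + q1 * q1);              /* envelope */
        if (cross > 0) return +1;   /* approaching */
        if (cross < 0) return -1;   /* receding */
        return 0;                   /* no motion */
    }

    int main(void)
    {
        /* Two samples of a counterclockwise-rotating I/Q vector. */
        double amount;
        int dir = doppler_step(1.0, 0.0, 0.95, 0.31, &amount);
        printf("direction=%+d amount=%.2f\n", dir, amount);
        return 0;
    }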
8) The Sensor Chair
As labeled on the chair layout diagram, above, the copper plate (A) affixed to the top of the
chair cushion is a transmitting antenna being driven at roughly 70 kHz. When a person is
seated in the chair, they effectively become an extension of this antenna; their body acts as a
conductor which is capacitively coupled into the transmitter plate.
Four receiving antennas (B) are mounted at the vertices of a
square, on poles placed in front of the chair. These pickups receive
the transmitted signal with a strength that is determined by the
capacitance between the performer's body and the sensor antenna.
As the seated performer moves his hand forward, the intensities
of these signals are thus a function of the distances between
the hand and corresponding pickups. The pickup signal strengths
are digitized and sent to a Macintosh computer, which estimates
the hand position. A pair of pickup antennas are also mounted
on the floor of the chair platform, and are used to similarly
measure the proximity of left and right feet, providing a set
of pedal controllers. In order for a performer to use these sensors,
he must be seated in the chair, and thus coupled to the transmitting
antenna. Other performers may also inject signal into the pickup
antennas if they are touching the skin of the seated individual,
thus becoming part of the extended antenna system. The sensor
antennas are synchronously demodulated by the transmitted signal;
this produces a receiver tuned precisely to the waveform broadcast
through the performer's body and rejects background from other
sources.
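A strength-weighted centroid of the pickup positions is one plausible
hand-position estimator; the sketch below uses it with illustrative
antenna coordinates and signal values, and the actual Sensor Chair
software may use a different fit.

    /* Sketch: estimate hand position from four calibrated pickup
       strengths (larger = hand closer to that antenna). */
    #include <stdio.h>

    typedef struct { double x, y; } point;

    /* Antennas at the corners of a unit square in the sensing plane. */
    static const point antenna[4] = {
        {0.0, 0.0}, {1.0, 0.0}, {0.0, 1.0}, {1.0, 1.0}
    };

    static point estimate_hand(const double s[4], double *z)
    {
        point p = {0.0, 0.0};
        double total = 0.0;
        for (int i = 0; i < 4; i++) {
            p.x += s[i] * antenna[i].x;
            p.y += s[i] * antenna[i].y;
            total += s[i];
        }
        if (total > 0.0) { p.x /= total; p.y /= total; }
        *z = total;   /* rough proxy for overall proximity */
        return p;
    }

    int main(void)
    {
        double s[4] = {0.9, 0.4, 0.5, 0.2}, z;  /* hand near one pickup */
        point p = estimate_hand(s, &z);
        printf("hand at (%.2f, %.2f), proximity %.2f\n", p.x, p.y, z);
        return 0;
    }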
A pair of footswitches (D) are incorporated in this system to
provide sensor-independent triggers. These are used for changing
parameters when the foot pedals are dedicated to generating musical
sounds, or for instigating triggers when the performer is not
seated and hence unable to use the sensors.
The hand sensor antennas (B) are composed of a copper mesh encased
inside a translucent plastic bottle. A halogen bulb mounted inside
this mesh is illuminated with a voltage proportional to the detected
sensor signal (and thus a function of the proximity of the performer's
hand to the sensor), or is driven directly by
the Macintosh computer as a MIDI light-instrument. Four lights
are mounted below the platform (F); these are correspondingly
driven by the foot-sensor signals or directly through MIDI. A
digital display (E) is also mounted on one of the sensor posts;
this is similarly defined as a MIDI device, and is driven by the
Macintosh to provide performance cues (e.g. the amount of time or
triggers remaining in a particular musical mode). The sensors
are used to trigger and shape sonic events in several different
ways, depending on the portion of the composition that is being
performed. The simplest modes use the proximity of the performer's
hand (or head in the case of Teller's closing bit) to the plane
of the hand sensors (z) to trigger a sound and adjust its volume,
while using the position of the hand in the sensor plane (x,y)
to change the timbral characteristics. Other modes divide the
x,y plane into many zones containing sounds that are triggered
when the hand crosses into them (e.g. the percussion mode).
Several modes produce audio events that are also sensitive to
the velocity of the hands and feet.
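The zone-triggering behavior of modes like the percussion mode can
be sketched as follows; the grid size and trigger action are
assumptions for illustration.

    /* Sketch: divide the x,y sensor plane into a grid of zones and
       fire a sound whenever the hand enters a new zone. */
    #include <stdio.h>

    #define GRID 4   /* 4 x 4 zones across the plane (assumed) */

    static int zone_of(double x, double y)   /* x, y normalized to 0..1 */
    {
        int col = (int)(x * GRID), row = (int)(y * GRID);
        if (col >= GRID) col = GRID - 1;
        if (row >= GRID) row = GRID - 1;
        return row * GRID + col;
    }

    int main(void)
    {
        /* A short hand trajectory; trigger on each zone entry. */
        double path[][2] = { {0.1, 0.1}, {0.15, 0.12}, {0.4, 0.1}, {0.8, 0.6} };
        int last = -1;
        for (int i = 0; i < 4; i++) {
            int z = zone_of(path[i][0], path[i][1]);
            if (z != last)
                printf("trigger sound for zone %d\n", z);
            last = z;
        }
        return 0;
    }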
9) Summary of Embedded Electronics Cards Designed for this
Project
Fishbrain Board: HC11 with MIDI, RS-232, bootload, 4 channels
of analog conditioning, proto area, user port, etc. Essentially
a functional block for embedded MIDI or serial controllers.
Updated Fish: Fish, with bootload and a few minor tweaks
added.
Gesture Wall Utility: Card with 8-channel light driver,
Fish autocalibrator, MIDI input interface, and connector changeouts.
Plugs into Fish.
Calibrator Corrector: Adjustable nonlinear correction
for Gesture Wall autocalibrator.
Autocalibrator Hand Sensor: Card with sensor electrode
matched to hand size, with optical hand-down detection, buffer
amplifier, and LED drivers.
Brain Opera Drivers: Small cards based around the AD712, which
buffer Fish signals and send them down RJ-11 cable. Flat frequency
compensation and low front-end gain produce very few drift
problems. An LED is also present, which the Gesture Wall utility
lights to make the sensors glow.
New Log Amps: New, more accurate, simpler 4-channel log
amps based around the Burr-Brown LOG100JP. Used in the Gesture Wall.
Harmonic Driving Utility: Card with 8-channel light driver,
MIDI input interface, LED drivers, and buffers for twist pot and
occupant-detector photocell. Plugs into Fish.
Quad Buffer Amplifiers: 4 channels of Fish buffer amplifiers
based on AD713, also with frequency flattening and low front-end
gain for low drift.
Digital Drumpad: PVDF sampling and LED drive with PIC
on RS-485 bus.
Drumpad Concentrator: Card that sits with a Fishbrain
at the base of the drumpad bus; essentially an intelligent UART
for the HC11 with RS-485 drive.
Sensor Floor Interface: Card with buffer amplifiers, peak
detectors, and multiplexers to interface to a Fishbrain and scan
64 channels of analog input from PVDF wires impregnated into a
carpet to detect footsteps and measure their energy/location.
Doppler Radar Head: Card with micropatch phased array
and electronics (local oscillator, quadrature diode demodulator,
opamp driver) to measure motion.
Doppler Radar Analog Processor: Card with a direction discriminator,
envelope followers, and filters to convert the Doppler signals
into a triad of voltages corresponding to the direction of motion,
amount of motion, and rapidity of motion.
Pulsed Sonar Head: Card to manage a simple time-of-flight (TOF)
sonar and produce an output gate (going low when the ping goes out,
high on its return), a signal envelope, and an analog voltage
proportional to range. These signals, plus power and trigger inputs,
are applied via an RJ-45 cable.
8-Channel Sonar Dispatcher: Card to allow a Fishbrain to
manage up to 8 independent pulsed sonar heads.
Warning: not all of these cards are currently ready for flawless
production; some require hand patches.
10) Students and Collaborators
Forest Stations: Patrick Pelletier, Will Oliver
Harmonic Driving: Matt Gorbet
Melody Easel: Kai-Yuh Hsiao
Rhythm Tree: Ara Knaian, Josh Smith, Matt Reynolds
Digital Baton: Teresa Marrin, Chris Verplaetse
Sensor Floor: Craig Alber
Doppler Radar: Matt Reynolds
Sensor Chair: Ed Hammond, Pete Rice, Eran Egozy
Object Design: Maggie Orth, Ray Kinoshita, Sharon Daniel
Electronics Fabrication: Rick Ciliberto, Joel Rosenberg