Miburi
The MIBURI
The MIBURI was released commercially by the Yamaha Corporation’s Tokyo-based experimental division in 1994 [1]. The MIBURI is characterized as an “inside-in” system according to Axel Mulder’s three categories of motion-sensing systems:
- inside-in - sensor(s) and source(s) that are both on the body;
- inside-out - on-body sensors that sense artificial external sources;
- outside-in - external sensors that sense artificial sources on the body [2].
It conforms to what Todd Winkler refers to as the ‘body sensor’ group of controllers (the others being spatial sensors, acoustic models and ‘new instruments’) [3].
The MIBURI system comprises a vest with embedded flex sensors, two hand-grips with buttons, shoe inserts with pressure sensors, and a belt-worn signal-distribution unit joined by a cable to a small synthesizer/MIDI converter. A wireless version, conforming to Japanese wireless-frequency regulations, was available within Japan only.
The MIBURI’s belt unit, the “MBU-20”, processes data from the sensors into MIDI pitch and velocity information. The unit can be programmed to interpret the data using three ‘trigger’ modes: ‘Cross-point’ mode, ‘Stop’ mode, and ‘All’, a combination of both. ‘Cross-point’ mode measures the speed of the transducer’s flexion as it traverses its zero point (the point at which the flex sensor is straight). The six flex sensors send twelve notes, because they measure the inward and outward movement of each joint as separate notes. ‘Stop’ mode sends the note and maximum velocity values at the conclusion of a gesture. ‘All’ interprets sensor data in both modes simultaneously [4].
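The trigger-mode behaviour described above can be sketched in code. This is a hypothetical reconstruction, not Yamaha’s actual firmware: the function names, the signed sensor scale (negative for inward bend, positive for outward) and the velocity scaling factor are all assumptions.

```python
# Hypothetical sketch of the MIBURI belt unit's trigger modes.
# Sensor samples are assumed to be signed flex readings:
# negative = inward bend, positive = outward bend, zero = straight.

def cross_point_events(samples, note_in, note_out, scale=4):
    """'Cross-point' mode: emit a note each time the signal crosses zero;
    velocity is proportional to the speed of the crossing. Inward and
    outward movement trigger separate notes, which is why six flex
    sensors yield twelve notes."""
    events = []
    for prev, curr in zip(samples, samples[1:]):
        if prev < 0 <= curr:                      # inward -> outward crossing
            events.append((note_out, min(127, abs(curr - prev) * scale)))
        elif prev >= 0 > curr:                    # outward -> inward crossing
            events.append((note_in, min(127, abs(curr - prev) * scale)))
    return events

def stop_events(samples, note, threshold=1):
    """'Stop' mode: emit one note at the conclusion of a gesture (when
    movement stops), carrying the maximum speed reached during it."""
    events = []
    peak = 0
    for prev, curr in zip(samples, samples[1:]):
        speed = abs(curr - prev)
        peak = max(peak, speed)
        if speed < threshold and peak >= threshold:  # gesture concluded
            events.append((note, min(127, peak)))
            peak = 0
    return events
```

‘All’ mode would simply run both functions over the same sample stream and merge the resulting events.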
The mapping of each sensor is highly programmable. Each sensor can be mapped on the synthesizer unit, the “MSU-20”, to any MIDI note and interpreted in any of the three modes outlined above according to 48 different response modes. The response modes (preset by Yamaha) define the curve by which the sensor’s output is mapped to velocity. All of the above definitions are components of a single map ‘Preset’; 32 programmable preset positions are available.
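The preset scheme can be illustrated with a small sketch. The curve names and preset layout here are invented for illustration; only the counts (48 fixed response modes, 32 preset slots) come from the description above.

```python
# Illustrative sketch of a MIBURI-style preset: each sensor slot maps to
# a MIDI note, a trigger mode, and a response curve shaping raw sensor
# speed (0-127) into MIDI velocity. Curve names are assumptions standing
# in for Yamaha's 48 preset response modes.

CURVES = {
    "linear": lambda x: x,
    "soft":   lambda x: int((x / 127) ** 2 * 127),    # favours gentle motion
    "hard":   lambda x: int((x / 127) ** 0.5 * 127),  # reaches full velocity quickly
}

def make_preset(assignments):
    """assignments: {sensor_name: (midi_note, trigger_mode, curve_name)}"""
    return {
        sensor: {"note": note, "mode": mode, "curve": CURVES[curve]}
        for sensor, (note, mode, curve) in assignments.items()
    }

def velocity_for(preset, sensor, raw_speed):
    """Shape a raw sensor speed through the sensor's response curve."""
    return max(0, min(127, preset[sensor]["curve"](raw_speed)))
```

A performer’s 32 preset positions would each hold one such mapping table, selectable during performance.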
These features make the MIBURI extremely effective as a controller. However, the MIBURI’s synthesizer unit is limited in its possibilities as a sound source and, more importantly, is only able to process gestures in a direct relationship to the sounds they produce.
Teresa Marrin assesses these shortcomings in the following way:
‘One place where the Miburi might be strengthened, however, is in its reliance on triggers; the synthesizer unit which plays the music has no "intelligence" other than to provide a one-to-one mapping between actions and sounds. In fact, the Miburi's responses are as direct as the Theremin's. But because it has so many more controls available for the user than the Theremin it might be advantageous to make use of those controls to create or shape more complex, autonomous musical processes.’ [5]
The need to ‘tether’ the MIBURI to its synthesizer unit is also clearly a drawback for movement detection and a restriction for the dancer. However, the MIBURI has the robust design and very predictable sensor output that might be expected from one of the principal electronic-instrument manufacturers.
The Miburi represents a rare excursion by a large, commercial organization into the uncharted waters of interface design. [6].
The MIBURI may be combined with more sophisticated sound sources and software-based interactive mapping such as MAX/MSP. Extensions of its basic functions include control of video [7], lighting [8], utilization as a component of a “multimedia orchestra” [9] and “to help children engage their whole bodies while interacting with computers” [10].
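Routing the MIBURI’s MIDI output to non-audio media, as in the extensions above, amounts to parsing note-on messages and dispatching them to handlers. A minimal sketch follows, assuming raw MIDI bytes as input; in practice a MIDI input port or an environment such as MAX/MSP would supply the messages, and the handler names here are hypothetical.

```python
# Illustrative sketch: route MIBURI note-on messages to non-audio
# events (video cues, lighting). The dispatch table is an assumption.

NOTE_ON = 0x90  # MIDI status nibble for note-on

def parse_note_ons(midi_bytes):
    """Extract (channel, note, velocity) triples from a raw MIDI byte
    stream, skipping other messages and zero-velocity note-offs."""
    notes = []
    i = 0
    while i + 2 < len(midi_bytes):
        status = midi_bytes[i]
        if status & 0xF0 == NOTE_ON and midi_bytes[i + 2] > 0:
            notes.append((status & 0x0F, midi_bytes[i + 1], midi_bytes[i + 2]))
            i += 3
        else:
            i += 1
    return notes

def dispatch(notes, handlers):
    """Route each note-on to a handler keyed by MIDI note number."""
    fired = []
    for channel, note, velocity in notes:
        if note in handlers:
            fired.append(handlers[note](velocity))
    return fired
```

In a performance patch each handler would fire a video, lighting or software event rather than (or alongside) a synthesized sound.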
Composers of music for the MIBURI
- Saburo Hirano http://www.nn.iij4u.or.jp/~shirano/
- Hitomi Kaneko
- Hiroshi Chu Okubo, “Professional Miburist” http://www.miburi.org/
- Lindsay Vickery
References
- ^ Marrin, Teresa and Paradiso, Joseph, “The Digital Baton: a Versatile Performance Instrument”, Proceedings of the International Computer Music Conference, Thessaloniki, Greece, pp. 313-316, 1997.
- ^ Mulder, Axel, “Human movement tracking technology”, Hand Centered Studies of Human Movement Project, Technical Report 94-1, 1994.
- ^ Winkler, Todd, Composing Interactive Music: Techniques and Ideas Using Max. Cambridge, Massachusetts: MIT Press, pp. 315-8, 1998.
- ^ Yamaha MIBURI Manual, p. 41 (publisher and date not stated).
- ^ Marrin, Teresa, “Toward an Understanding of Musical Gesture: Mapping Expressive Intention with the Digital Baton”, M.S. thesis, Program in Media Arts and Sciences, Massachusetts Institute of Technology, 1996. web.media.mit.edu/~marrin/Thesis.htm
- ^ Marrin, Teresa, “Toward an Understanding of Musical Gesture: Mapping Expressive Intention with the Digital Baton”, M.S. thesis, Program in Media Arts and Sciences, Massachusetts Institute of Technology, 1996. web.media.mit.edu/~marrin/Thesis.htm
- ^ http://music.columbia.edu/fest99/festreport/multimedia/Ping_Bang/description.html
- ^ Vickery, Lindsay, “The Yamaha MIBURI MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne, 2002.
- ^ Nishimoto, Kazushi, Mase, Kenji and Fels, Sidney, “Towards Multimedia Orchestra: A Proposal for an Interactive Multimedia Art Creation System”, ICMCS, Vol. 1, pp. 900-904, 1999.
- ^ Zigelbaum, Jamie, Millner, Amon, Desai, Bella and Ishii, Hiroshi, “BodyBeats: whole-body, musical interfaces for children”, CHI Extended Abstracts, pp. 1595-1600, 2006.
External References
- http://www.yamaha.co.jp/design/pro_1990_08.html
- Geers, Doug, “Multimedia Interactive Works, Miller Theater (Columbia Interactive Arts Festival)”, Computer Music Journal 24.2 (Summer 2000): p. 85(3), 2000.
- Marrin, Teresa, “Inside the ‘Conductor’s Jacket’: Analysis, Interpretation and Musical Synthesis of Expressive Gesture”, Ph.D. thesis, Massachusetts Institute of Technology, 2000.
- Marrin, Teresa, “Toward an Understanding of Musical Gesture: Mapping Expressive Intention with the Digital Baton. M.S. Thesis, Media Laboratory. Cambridge, MA, MIT”, 1996
- Mulder, A., 1994. Human movement tracking technology, Hand Centered Studies of Human Movement Project, Technical Report 94-1
- Nagashima, Yoichi, “Real-Time Interactive Performance with Computer Graphics and Computer Music”, in Proceedings of the 7th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Man-Machine Systems, IFAC, 1998.
- Nishimoto, Kazushi, Mase, Kenji, Fels, Sidney, “Towards Multimedia Orchestra: A Proposal for an Interactive Multimedia Art Creation System. ICMCS, Vol. 1 1999: pp. 900-904”, 1999
- Vickery, Lindsay, “The Yamaha MIBURI MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne.
- Yamaha Corporation, MIBURI R3 Manual. Tokyo, Japan: Yamaha Corporation, 1996. http://www2.yamaha.co.jp/manual/pdf/emi/japan/port/MiburiR3J.pdf
- Winkler, Todd, “Composing Interactive Music: techniques and ideas using Max. Cambridge Massachusetts, MIT Press”, 1998.
- Zigelbaum, Jamie, Millner, Amon, Desai, Bella, Ishii, Hiroshi, “BodyBeats: whole-body, musical interfaces for children. CHI Extended Abstracts 2006: pp. 1595-1600”, 2006