Miburi

From Wikipedia, the free encyclopedia

The Miburi is a wearable musical instrument which was released commercially by the Yamaha Corporation’s Tokyo-based experimental division in 1995.[1]

Categorisation and functions of the Miburi

The Miburi can be characterized as an “inside-in” system according to Axel Mulder’s three categories of motion sensing systems:

  • inside-in - sensor(s) and source(s) that are both on the body;
  • inside-out - on-body sensors that sense artificial external sources;
  • outside-in - external sensors that sense artificial sources on the body.[2]

It conforms to what Todd Winkler refers to as the ‘body sensor’ group of controllers (the others are spatial sensors, acoustic models and ‘new instruments’).[3]

The Miburi system consists of a vest with embedded capacitive displacement sensors, two hand-grips, shoe inserts with pressure sensors, and a belt-worn signal distribution unit joined by a cable to a small synthesizer/MIDI converter. A wireless version, conforming to Japanese wireless frequency regulations, was available only within Japan.

The Miburi's belt unit, the “MBU-20”, processes data from the sensors into MIDI pitch and velocity information. The unit can be programmed to interpret the data using three ‘trigger’ modes: ‘Cross-point’ mode, ‘Stop’ mode, and ‘All’ mode, a combination of both. ‘Cross-point’ mode measures the speed of the transducer’s flexion as it traverses its zero point (the position at which the flex sensor is straight). The six ‘flex’ sensors can send twelve notes, because inward and outward movement of each joint is registered as a separate note. ‘Stop’ mode sends the note and the maximum velocity value at the conclusion of a gesture. ‘All’ mode interprets sensor data in both modes simultaneously.[4]
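
The trigger logic can be sketched in code. The following Python fragment is a hypothetical reconstruction based only on the behaviour described above; the Miburi's actual firmware is not published, and the sample values, scaling factor and rest threshold used here are invented for illustration.

    # Hypothetical model of one flex sensor's trigger logic; not Yamaha's code.

    def cross_point_events(samples, note_out, note_in, scale=4.0):
        """'Cross-point' mode: emit a note when the bend value crosses zero
        (sensor straight); velocity is derived from the crossing speed."""
        events = []
        for prev, curr in zip(samples, samples[1:]):
            if (prev < 0 <= curr) or (prev > 0 >= curr):      # zero crossing
                speed = abs(curr - prev)
                note = note_out if curr >= prev else note_in  # outward vs inward bend
                events.append(("note_on", note, min(127, int(speed * scale))))
        return events

    def stop_events(samples, note, rest=1.0, scale=4.0):
        """'Stop' mode: emit one note when the gesture comes to rest, with
        velocity taken from the maximum speed reached during the gesture."""
        events, peak = [], 0.0
        for prev, curr in zip(samples, samples[1:]):
            speed = abs(curr - prev)
            peak = max(peak, speed)
            if speed < rest and peak >= rest:                 # movement has stopped
                events.append(("note_on", note, min(127, int(peak * scale))))
                peak = 0.0
        return events

    # 'All' mode would simply merge the output of both interpretations.
    bend = [-30, -10, 5, 25, 40, 42, 42]                      # invented sensor readings
    print(cross_point_events(bend, note_out=60, note_in=62))
    print(stop_events(bend, note=64))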

The mapping of each sensor is highly programmable. Each sensor can be mapped on the synthesizer unit, the “MSU-20”, to any MIDI note and interpreted in any of the three trigger modes outlined above, according to 48 different response modes. The response modes (preset by Yamaha) define the manner in which the sensor’s output is mapped to velocity. All of these definitions are components of a single map ‘Preset’; 32 programmable preset positions are available.
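
As a sketch of what one such preset amounts to, the fragment below models a map preset as plain data. All field names and values are invented for illustration; the authoritative description of the preset format is the Miburi R3 manual cited below.

    # Hypothetical representation of a map preset; field names are invented.
    from dataclasses import dataclass

    @dataclass
    class SensorMapping:
        sensor: str          # e.g. "left_elbow_out", "right_wrist_in"
        trigger_mode: str    # "cross_point", "stop" or "all"
        midi_note: int       # 0-127
        response_mode: int   # index into the 48 factory velocity-response curves

    # A preset is the full set of per-sensor assignments; the unit stores
    # up to 32 such presets, selectable during performance.
    preset_01 = [
        SensorMapping("left_elbow_out", "cross_point", 60, 7),
        SensorMapping("left_elbow_in",  "cross_point", 62, 7),
        SensorMapping("right_knee_out", "stop",        36, 12),
    ]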

Evaluation of the Miburi

These features make the Miburi extremely effective as a computing input device. However, the Miburi’s synthesizer unit is limited in its possibilities as a sound source and, more importantly, is only able to process gestures in a direct one-to-one relationship to the sounds they produce.

The need to ‘tether’ the Miburi to its synthesizer unit is also clearly a drawback for movement detection and a restriction for the dancer. However, the Miburi has the robust design and very predictable sensor output that might be expected from one of the principal electronic musical instrument manufacturers.

The Miburi may be combined with more sophisticated sound sources and software-based interactive mapping environments such as Max/MSP. Extensions of its basic functions include control of video,[5] lighting,[6] use as a component of a “multimedia orchestra”,[7] and “to help children engage their whole bodies while interacting with computers”.[8]
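
The contrast with the built-in one-to-one behaviour can be illustrated by a small software-side mapping layer of the kind such environments provide. The sketch below uses plain Python rather than Max/MSP, and every rule and value in it is invented for illustration.

    # Invented example of indirect, stateful gesture-to-sound mapping
    # performed in software, outside the Miburi's own synthesizer unit.

    def remap(note, velocity, state):
        """One incoming gesture may produce one or several output events,
        depending on state accumulated from earlier gestures."""
        state["count"] = state.get("count", 0) + 1
        if state["count"] % 4 == 0:
            # every fourth gesture is expanded into a three-note chord
            return [(note, velocity), (note + 4, velocity), (note + 7, velocity)]
        return [(note, velocity // 2)]

    state = {}
    for gesture in [(60, 100), (62, 90), (64, 80), (65, 110)]:
        print(remap(*gesture, state))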

Composers of music for the Miburi

  • Saburo Hirano[9] - "Ping Bang" (1995) for solo Miburi, which uses the Miburi as a multimedia controller
  • Susumu Hirasawa
    • P-MODEL

References

  1. ^ Marrin, Teresa and Paradiso, Joseph, “The Digital Baton: a Versatile Performance Instrument”, International Computer Music Conference, Thessaloniki, Greece, pp. 313-316, 1997.
  2. ^ Mulder, Axel, “Human movement tracking technology”, Hand Centered Studies of Human Movement Project, Technical Report 94-1, 1994.
  3. ^ Winkler, Todd, “Composing Interactive Music: Techniques and Ideas Using Max”, Cambridge, Massachusetts: MIT Press, pp. 315-318, 1998.
  4. ^ Yamaha Corporation, “Miburi R3 Manual”, Tokyo, Japan: Yamaha Corporation, 1996.
  5. ^ "Ping-Bang II Description".
  6. ^ Vickery, Lindsay, “The Yamaha Miburi MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne.
  7. ^ Nishimoto, Kazushi, Mase, Kenji, Fels, Sidney, “Towards Multimedia Orchestra: A Proposal for an Interactive Multimedia Art Creation System”, ICMCS, Vol. 1, 1999, pp. 900-904.
  8. ^ Zigelbaum, Jamie, Millner, Amon, Desai, Bella, Ishii, Hiroshi, “BodyBeats: whole-body, musical interfaces for children”, CHI Extended Abstracts 2006, pp. 1595-1600, 2006.
  9. ^ "Saburo HIRANO Home page". Archived from the original on 2011-03-05. Retrieved 2010-02-12.
  10. ^ "Home". miburi.org.
  11. ^ Vickery, Lindsay, “The Yamaha Miburi MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne.

Notes

  • Mulder, A., 1994. “Human movement tracking technology”, Hand Centered Studies of Human Movement Project, Technical Report 94-1. http://www.xspasm.com/x/sfu/vmi/HMTT.pub.html
  • Nagashima, Yoichi, “Real-Time Interactive Performance with Computer Graphics and Computer Music”, Proceedings of the 7th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Man-Machine Systems, IFAC, 1998. http://nagasm.suac.net/ASL/paper/ifac98.pdf
  • Nishimoto, Kazushi, Mase, Kenji, Fels, Sidney, “Towards Multimedia Orchestra: A Proposal for an Interactive Multimedia Art Creation System”, ICMCS, Vol. 1, 1999, pp. 900-904. https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=779322
  • Vickery, Lindsay, “The Yamaha MIBURI MIDI jump suit as a controller for STEIM’s Interactive Video software Image/ine”, Proceedings of the Australian Computer Music Conference 2002, RMIT, Melbourne.
  • Yamaha Corporation, “MIBURI R3 Manual”, Tokyo, Japan: Yamaha Corporation, 1996. http://www2.yamaha.co.jp/manual/pdf/emi/japan/port/MiburiR3J.pdf