Time governs every aspect of music: pulse, metre, rhythmic pattern, syncopation, and the length and form of a composition, as well as the critical need for musicians to synchronise their parts in performance. In networked music environments, the importance of time is magnified. The Musical Instrument Digital Interface (MIDI), introduced in 1983, remains one of the most efficient and adaptable protocols for real-time music over the Internet. Unlike audio data, which demands significant bandwidth and is therefore more exposed to network latency, MIDI transmits a tiny fraction of the data while retaining musical precision and its capacity for innovative, multi-modal artistic expression.
MIDI and the Multi-Dimensional Nature of Time
Time is not a singular entity but a multi-dimensional phenomenon. Scientific time, for example, governed by clocks and entropy, differs from musical time, which is cyclical and fluid (Pressing, 1993). Human perception of time further complicates this, introducing subjective experiences of rhythm, pulse, and duration, in which events are perceived as having a beginning, a middle, and an end. MIDI is well positioned to navigate these layers of time, facilitating precise temporal coordination without the restrictions imposed by continuous audio streaming.
Clock time is measurable, continuous, and directional, providing the foundation for synchronisation in all digital systems. However, musical time operates on multiple levels: the moment-to-moment interplay of notes, the structural repetition of motifs, and the broader cyclical patterns of loops and rhythmic structures (Pressing, 1993). MIDI excels in all these domains, transmitting both note data and time-sensitive control signals to shape musical expressions in real-time.
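The relationship between musical time and clock time can be made concrete with a small calculation. In MIDI sequencing, musical positions are expressed in ticks at a given resolution (pulses per quarter note, or PPQ), and only a tempo maps them onto seconds. The function name and the PPQ value of 480 below are illustrative conventions, not part of the MIDI specification itself:

```python
# Illustrative conversion between MIDI's musical time (ticks) and clock time.
# Assumes a sequencer-style resolution in pulses per quarter note (PPQ).

def ticks_to_seconds(ticks: int, tempo_bpm: float, ppq: int = 480) -> float:
    """Convert a tick position to seconds at a given tempo."""
    seconds_per_beat = 60.0 / tempo_bpm       # clock time of one quarter note
    return ticks * seconds_per_beat / ppq     # scale by tick resolution

# One bar of 4/4 at 120 BPM occupies two seconds of clock time.
print(ticks_to_seconds(4 * 480, tempo_bpm=120))  # 2.0
```

The same tick position yields a different duration at a different tempo, which is precisely the separation of musical time from clock time that the paragraph above describes.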
Why MIDI is Superior to Audio for Data Transmission
Unlike the high-bandwidth streaming of audio, MIDI transmits only the essential information needed to recreate a performance. An uncompressed stereo audio stream can exceed 1 Mbps, whereas MIDI messages operate in the range of kilobits per second (Grasso, 2020). This opens the door to near-instantaneous transmission across a network, and because MIDI conveys discrete messages such as Note On (with velocity) and Note Off rather than audio's continuous data stream, it is far less susceptible to network congestion and latency-related artifacts.
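The scale of this difference is easy to verify with back-of-envelope arithmetic. The sketch below compares uncompressed CD-quality stereo audio with a deliberately busy MIDI stream; the event rate of 100 messages per second is an assumed figure for a dense performance, and the 10 bits per byte reflects classic serial MIDI framing (start and stop bits included):

```python
# Back-of-envelope comparison: uncompressed stereo audio vs. a busy MIDI stream.

def audio_bitrate(sample_rate=44_100, bit_depth=16, channels=2):
    """Bits per second for uncompressed PCM audio."""
    return sample_rate * bit_depth * channels

def midi_bitrate(events_per_second=100, bytes_per_event=3, bits_per_byte=10):
    """Bits per second for a stream of 3-byte channel voice messages,
    framed as on the classic 31.25 kbaud serial MIDI link."""
    return events_per_second * bytes_per_event * bits_per_byte

print(audio_bitrate())  # 1411200 bits/s, i.e. ~1.4 Mbps
print(midi_bitrate())   # 3000 bits/s, i.e. 3 kbps
```

Even under these generous assumptions for MIDI activity, the audio stream carries several hundred times more data per second.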
Audio streaming faces fundamental constraints in WAN (Wide Area Network) environments. The Internet prioritises data throughput over low-latency communication, meaning that audio packets are often subject to buffering and dropouts (Wilson, 2018). In contrast, MIDI transmits discrete event messages, ensuring that musical intent is preserved even under variable network conditions. Furthermore, MIDI data can be timestamped and adjusted in real time, allowing for predictive latency compensation that is impractical with raw audio (Lazzaro & Wawrzynek, 2011).
The Role of Repetition, Loops, and Synchronisation in MIDI Performance
Repetition is a core element of musical time, manifesting as loops, motifs, and rhythmic cycles. MIDI's capacity to encode and manipulate repeated sequences allows for dynamic interaction between performers, ensuring that even remote collaborations maintain cohesion. Unlike audio, which must be streamed in full, MIDI can instantaneously trigger and modify sound sources already present at each node, preserving the nuance of live performance without the cost of continuous transmission.
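The economy of this approach is that a loop need only be described once: a single bar of pattern data can be expanded into an arbitrarily long stream of events at each node, instead of being transmitted repeatedly. A minimal sketch, with illustrative names and a PPQ of 480 assumed:

```python
# Expand a one-bar pattern into absolute-time note events.
# pattern: list of (tick_offset_within_bar, note_number) pairs.

def expand_loop(pattern, repeats, bar_ticks=1920):
    """Repeat a bar-length pattern, offsetting each pass by one bar."""
    events = []
    for rep in range(repeats):
        base = rep * bar_ticks
        events.extend((base + offset, note) for offset, note in pattern)
    return events

kick = [(0, 36), (960, 36)]               # kick drum on beats 1 and 3
print(len(expand_loop(kick, repeats=4)))  # 8 events cover four full bars
```

Four bars of performance here cost two transmitted pattern entries; the same four bars of stereo audio would cost megabits.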
Additionally, MIDI’s precision in timing extends beyond musical events to control data such as modulation, the synchronisation of visual elements, and interaction with external hardware. Alongside complementary protocols such as Open Sound Control (OSC), MIDI can drive complex multimedia experiences, reinforcing its role as a definitive tool for real-time artistic synchronisation (Clark & Basman, 2017).
The Future of MIDI in Telematic Music Performance
Despite ongoing advances in audio transmission, MIDI remains unparalleled in its efficiency and adaptability. Future developments in Telematic Music Performance will further leverage MIDI’s strengths, incorporating AI-driven predictive timing and adaptive jitter correction to refine real-time interaction. As a foundation of digital music, MIDI continues to bridge the divide between scientific time and human musical intuition, ensuring that the future of live online collaboration remains synchronised, expressive, and limitless.