ZOUND live project initiated


Last week, with my Zenexity Hackday team, I initiated "ZOUND live", a follow-up to the previous "ZOUND" experiment but much more ambitious this time: using the Web Audio API, the new Web MIDI API, and some electronic music software experience, we started our own collaborative web-based modular audio tracker.

Live demo of the Hackday application

Inspiration

A lot of features were inspired by existing software like SunVox or Renoise. However, our version uses 100% web technologies and adds collaborative, real-time aspects.

Our Tracker

The application has a tracker where you can enter notes.

Our Audio modules

The application integrates modular music concepts.

The web techs

About Web MIDI API

We bought a few cheap MIDI controllers to interact with our application.

MIDI stands for Musical Instrument Digital Interface; it is the protocol that a lot of electronic musical instruments have used for a few decades.

The Web MIDI API is a recent specification which makes MIDI devices accessible from a web page via a JavaScript API.

Recently, Chrome has started to implement it, and it is available in Chrome Canary (the dev version) behind a flag that you need to enable.

This is the perfect time to start experimenting with it!
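To give an idea of what the API looks like, here is a minimal sketch. The `parseMIDIMessage` helper is my own illustration (not ZOUND's actual code), and the browser part is shown as comments since the API was only available behind a flag:

```javascript
// Decode a raw 3-byte MIDI message into something usable.
// The status byte's upper nibble is the command, the lower nibble the channel.
function parseMIDIMessage(data) {
  var command = data[0] >> 4;
  var channel = data[0] & 0x0f;
  // Note-on with velocity 0 is conventionally treated as note-off.
  if (command === 9 && data[2] > 0) {
    return { type: "noteon", channel: channel, note: data[1], velocity: data[2] };
  }
  if (command === 8 || (command === 9 && data[2] === 0)) {
    return { type: "noteoff", channel: channel, note: data[1] };
  }
  return { type: "other", channel: channel };
}

// In the browser (Chrome Canary with the flag enabled), roughly:
// navigator.requestMIDIAccess().then(function (midi) {
//   midi.inputs.forEach(function (input) {
//     input.onmidimessage = function (e) {
//       console.log(parseMIDIMessage(e.data));
//     };
//   });
// });
```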

However, what I feared the most happened on the Hackday: the MIDI API broke in the morning, after a Chrome update during the night! A first version of a browser MIDI permission had been implemented, but I never managed to get it working. As of writing, the API still seems broken on Mac.

Well, it was already too late for the Hackday. Fortunately, we fell back on an alternative that relies on a Java applet to access MIDI devices; it was a laggy polyfill though...

Lesson learned: a nightly feature is a nightly feature; never assume features you enable via flags are stable (I never did, but it was a Hackday after all!).

BTW, cheers to @toyoshim who is implementing the MIDI API in Chrome :-)

Using Web Audio API

The Web Audio API is a high-level JavaScript API for processing and synthesizing audio in web applications.

The good thing about this API is that it is already a modular audio API, so it's not that hard to build a modular audio application on top of it!
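For instance, a simple synth voice is mostly a matter of connecting nodes together. Below is a simplified sketch, not the application's actual modules; only the `midiToFreq` helper is plain JavaScript, the node graph itself runs in the browser:

```javascript
// Convert a MIDI note number to a frequency in Hz (A4 = note 69 = 440 Hz).
function midiToFreq(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// In the browser (AudioContext, or webkitAudioContext at the time):
// one oscillator module routed through a gain module to the speakers.
// var ctx = new AudioContext();
// var osc = ctx.createOscillator();
// var gain = ctx.createGain();
// osc.frequency.value = midiToFreq(69); // 440 Hz
// gain.gain.value = 0.5;
// osc.connect(gain);
// gain.connect(ctx.destination);
// osc.start();
```

Each Web Audio node maps naturally to one "module" of a modular tracker, which is what makes the API such a good fit.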

Playframework

Play Framework has been used for broadcasting events between clients via WebSocket and synchronizing everything on the interface. It only broadcasts events and does not save the song yet.
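On the client side, this kind of broadcasting boils down to serializing each user action and applying the actions received from others. The wire format below is a hypothetical illustration (the real message schema isn't shown in this post):

```javascript
// Hypothetical JSON envelope for events relayed by the server over WebSocket.
function encodeEvent(user, type, payload) {
  return JSON.stringify({ user: user, type: type, payload: payload });
}

function decodeEvent(raw) {
  return JSON.parse(raw);
}

// In the browser (URL and event names are illustrative):
// var ws = new WebSocket("ws://localhost:9000/stream");
// ws.onmessage = function (e) {
//   var evt = decodeEvent(e.data);
//   if (evt.type === "noteSet") tracker.setNote(evt.payload); // apply remote edit
// };
// ws.send(encodeEvent("greweb", "noteSet", { track: 2, line: 8, note: 64 }));
```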

Backbone.js

Backbone.js was used for the models, the views, and its nice event system. It was a good library for prototyping and architecting the different parts of the application.

I found Backbone.js especially good for linking all the parts together, particularly for the network logic. This leads to a very reactive style of programming:
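The idea is that views and the network layer subscribe to model changes rather than being called explicitly from everywhere. Backbone itself isn't loaded in this sketch, so it uses a minimal stand-in for its `change:attribute` event system; the real code would extend `Backbone.Model`:

```javascript
// Minimal stand-in for Backbone's model events (illustration only).
function Note(attrs) {
  this.attributes = attrs || {};
  this.handlers = {};
}
Note.prototype.on = function (event, fn) {
  (this.handlers[event] = this.handlers[event] || []).push(fn);
};
Note.prototype.set = function (key, value) {
  this.attributes[key] = value;
  (this.handlers["change:" + key] || []).forEach(function (fn) { fn(value); });
};

// Reactive style: changing the model triggers whoever is listening,
// e.g. a view re-render and a WebSocket broadcast.
var note = new Note({ pitch: 60 });
var lastBroadcast = null;
note.on("change:pitch", function (pitch) {
  lastBroadcast = pitch; // here the real app would send the event to the server
});
note.set("pitch", 64);
```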

The team

This project has been started during our monthly Hackday at Zenexity, I want to thank my 6 awesome coworkers for being part of the project:

  • @mrspeaker for his awesome electronic music knowledge.
  • @bobylito for his brilliant ideas and his JavaScript skills.
  • @mandubian for his playframework experience and JSON superpower!
  • @etaty for helping with the server synchronization.
  • @skaalf for his cool DrumBox module.
  • @Noxdzine for his talented design work.

This was actually my first real experience of project management, and it was quite cool!

A Hackday is only one day, and such an ambitious project is hard to achieve in a single day: the project architecture needed to be somewhat ready, with a working PoC, before the Hackday. I also wanted everyone to have fun experimenting with the Audio API parts and not be blocked on boring parts.

As a team manager, I also had to define goals to achieve for the Hackday.

Wow, I realize it's not an easy task to manage a team when running out of time!

But fortunately, I think we achieved them just in time!

We ended the Hackday with a real-time demonstration of our application, with 4 people interacting together using MIDI controllers.

More to come!

Today, we have a first working version of a collaborative tracker with basic modular audio features:

  • MIDI note support + MIDI control assignment, allowing module properties to be changed.
  • a single tracker with a 32-line loop and 23 tracks.
  • synchronization of everything (the tracker and modules) for all connected clients.
  • an off-mode allowing one user to prepare a track that is muted for the other users.
  • play/pause and record mode!
  • cursors of users displayed on the tracker.

Stay tuned, because there are so many features to come!

The project on Github
