Using Processing and an open source machine learning tool called the “Wekinator”, I was able to create a program that reads input from an accelerometer and drives a Processing sketch that animates a cube and acts as a very basic synthesizer. I programmed a BBC micro:bit to stream its x, y, z accelerometer values over serial; a Processing sketch translates that serial data into OSC (Open Sound Control) messages and forwards them to the Wekinator, which I use to “train” my program. Once the Wekinator is done training, I can run the program and control the sketch’s visuals and sound directly with my accelerometer.
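The heart of that pipeline is the bridge that turns serial accelerometer lines into OSC messages. My actual bridge is a Processing sketch, but the same logic can be sketched in Python. This is a rough sketch, not my exact code: it assumes the micro:bit prints lines like `x,y,z` over USB serial, that the serial port is at `/dev/ttyACM0`, and that the Wekinator is listening at its defaults (`/wek/inputs` on port 6448). The `pyserial` and `python-osc` libraries stand in for Processing’s Serial and oscP5 libraries.

```python
def parse_accel_line(line):
    """Parse one 'x,y,z' serial line into three floats, or None if malformed."""
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        return [float(p) for p in parts]
    except ValueError:
        return None

def run_bridge(port="/dev/ttyACM0", baud=115200):
    # Hardware/network libraries imported here so the parser above
    # can be used and tested without them installed.
    import serial                                  # pyserial
    from pythonosc.udp_client import SimpleUDPClient  # python-osc

    # Wekinator listens for input features on port 6448 by default.
    client = SimpleUDPClient("127.0.0.1", 6448)
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            values = parse_accel_line(ser.readline().decode("ascii", "ignore"))
            if values is not None:
                # One OSC message per reading: three floats on /wek/inputs.
                client.send_message("/wek/inputs", values)

if __name__ == "__main__":
    run_bridge()
```

During training, the Wekinator records these `/wek/inputs` values alongside whatever outputs you demonstrate; once trained, it sends its own OSC messages back out (to port 12000 by default) for the sketch to turn into graphics and sound.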
Minimum: make a more complex application with the Wekinator
Target: create a wearable, or use dance moves to train the Wekinator to produce visuals/actions
Stretch: create a program that allows me to use a MIDI controller to train the Wekinator to produce visuals