Visualizations using Machine Learning

Overview
Using Processing and an open-source machine learning tool called the Wekinator, I created a program that reads input from an accelerometer and pipes that information into a Processing sketch that animates a cube and acts as a very basic synthesizer. I programmed a BBC micro:bit to send out its x, y, and z accelerometer values over serial, and a Processing sketch translates that serial data into OSC (Open Sound Control) messages. The OSC messages containing the accelerometer values then go to the Wekinator, which I use to "train" my program. Once the Wekinator is done training, I can run the program and control the Processing sketch, which draws the cube and produces sound, directly with my accelerometer.
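To make the serial-to-OSC step concrete, here is a minimal sketch of what that translation involves. My actual sketch is written in Processing, but the same logic is shown below in plain Python: parse a comma-separated accelerometer line from the serial port and pack it into an OSC message by hand. The `/wek/inputs` address is the Wekinator's default input address; the exact serial line format (`"x,y,z"`) is an assumption about how the micro:bit prints its readings.

```python
import struct

def osc_pad(s: bytes) -> bytes:
    # OSC strings are null-terminated and padded out to a multiple of 4 bytes
    return s + b"\x00" * (4 - len(s) % 4)

def encode_osc_message(address: str, values) -> bytes:
    # Minimal OSC message: padded address, padded type-tag string (",fff"),
    # then each argument as a big-endian 32-bit float
    packet = osc_pad(address.encode("ascii"))
    packet += osc_pad(("," + "f" * len(values)).encode("ascii"))
    for v in values:
        packet += struct.pack(">f", float(v))
    return packet

def parse_serial_line(line: str):
    # Assumes the micro:bit prints raw accelerometer counts as "x,y,z"
    x, y, z = (int(p) for p in line.strip().split(","))
    return x, y, z

# One reading from the serial port becomes one OSC packet for the Wekinator
x, y, z = parse_serial_line("112,-45,1020\n")
packet = encode_osc_message("/wek/inputs", (x, y, z))
```

In practice the resulting packet is sent as a UDP datagram to the port the Wekinator listens on (6448 by default); an OSC library such as oscP5 in Processing does this encoding for you.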

System Diagram

(System diagram image: miniproject)

Project Goals
Minimum: make a more complex application with the Wekinator
Target: create a wearable or a way to use dance moves to train the Wekinator to produce visuals/actions
Stretch: create a program that allows me to use a MIDI controller to train the Wekinator to produce visuals
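Whichever input trains the model, the Wekinator sends its results back out as OSC messages (by default to address `/wek/outputs` on port 12000), so the visuals/sound sketch just needs to decode incoming float arguments. The sketch below is a hedged, stdlib-only illustration of that decoding step, the mirror image of the encoder; a real Processing sketch would get this from an OSC library instead.

```python
import struct

def _read_osc_string(data: bytes, i: int):
    # Read a null-terminated OSC string starting at offset i,
    # then advance past its padding to the next 4-byte boundary
    end = data.index(b"\x00", i)
    s = data[i:end].decode("ascii")
    i = end + 1
    i += (-i) % 4
    return s, i

def decode_osc_floats(packet: bytes):
    # Decode a simple OSC message whose arguments are all 32-bit floats,
    # e.g. the continuous outputs the Wekinator emits after training
    address, i = _read_osc_string(packet, 0)
    tags, i = _read_osc_string(packet, i)
    values = []
    for t in tags.lstrip(","):
        if t == "f":
            values.append(struct.unpack_from(">f", packet, i)[0])
            i += 4
    return address, values
```

The decoded floats would then drive whatever the sketch controls, e.g. the cube's rotation and the synthesizer's pitch.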

Annie Kelly

This entry was posted in assignments, Project 2.1.
