Gesture-Based Interaction

Abstract:

As our computers evolve, we need better ways to interact with them than keyboards or touch screens. Gesture-based interaction with mobile devices is becoming more and more popular. The idea is to interact with a host device through a set of predefined gestures. To process a user's gestures, people follow different approaches, including depth cameras, accelerometers, gyroscopes, etc.

In this project I created a glove that you can wear to control the sound volume of your laptop by rotating your hand to the left (to increase the volume) or to the right (to decrease the volume). An accelerometer is attached to the glove, and the volume of the host device is set based on the value of the accelerometer's x-axis.

Implementation:

For this project we used a "LightBlue Bean", an Arduino-compatible board that has a built-in accelerometer and connects to a host computer over Bluetooth (Figure 1).

Figure 1: The LightBlue Bean.

We also connected a push button to the Bean (Figure 2 shows the circuit diagram). As you can see in Figure 2, holding the button connects pin #4 to VCC, so the digital value of pin #4 reads "1" for as long as the button is held. We also programmed the Bean with the code available on GitHub (linked below).

Figure 2: Circuit diagram of the push button connected to the Bean.

Then we used Node-RED on the host computer to read the value of pin #4 and the acceleration data every second. Node-RED provides a browser-based flow editor that makes it easy to wire together flows using the wide range of nodes in its palette. Figure 3 shows our flow. The "timestamp" node generates a message every second and sends it to "bb3 read scratch", which reads the value of pin #4 so we can check whether the button is pressed. The orange block then checks whether the value of pin #4 is "1"; if it is, it sends a message to the "bb3 accel" block, which reads the acceleration data from the Bean and forwards it over a TCP socket to a script written in Python (a minimal sketch of the receiving end is shown after Figure 3).

Figure 3: Our flow in Node-RED.
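
To make the hand-off concrete, here is a minimal sketch of what the receiving end of that TCP socket could look like, assuming the Node-RED TCP node is configured to send each reading to 127.0.0.1:7000 as a newline-delimited JSON object such as {"x": 0.42, "y": 0.01, "z": 0.9}. The address, port, and message format are assumptions for illustration, not necessarily what our actual flow sends.

import json
import socket

HOST, PORT = "127.0.0.1", 7000   # assumed address/port of the Node-RED TCP node

def run_server():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen(1)
    conn, _ = server.accept()        # Node-RED connects and streams readings
    buffer = b""
    while True:
        data = conn.recv(1024)
        if not data:
            break
        buffer += data
        while b"\n" in buffer:       # one JSON reading per line
            line, buffer = buffer.split(b"\n", 1)
            reading = json.loads(line)
            print("x acceleration:", reading["x"])

if __name__ == "__main__":
    run_server()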

In the Python script we check the acceleration value for the x-axis, which can be a number between -1.13 and 1.13. We then convert the x acceleration to a number between 0 and 100 (vpercent) and use the PulseAudio command-line tool on Linux to change the volume on the host computer with this command:

os.system("pactl set-sink-volume 0 " + str(vpercent) + "%")
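
As a rough sketch of that conversion, assuming the reading is clamped to the -1.13 to 1.13 range and that sink 0 is the active PulseAudio sink (both assumptions), the mapping looks something like this:

import os

X_MAX = 1.13  # approximate full-scale x-axis reading from the Bean

def set_volume_from_x(x_accel):
    # Clamp the reading, then map [-1.13, 1.13] linearly onto [0, 100].
    x = max(-X_MAX, min(X_MAX, x_accel))
    vpercent = int(round((x + X_MAX) / (2 * X_MAX) * 100))
    # PulseAudio volume control on Linux; sink "0" is assumed to be the default output.
    os.system("pactl set-sink-volume 0 " + str(vpercent) + "%")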

Figure 4 shows the actual glove that we have created:

Figure 4: The completed glove.

 

You can find all the code on GitHub:

https://github.com/S-Mohammad-Hashemi/Gesture/

You can also find more information about Node-RED at http://nodered.org/.

Goals:

Minimum: The minimum goal is to create a gesture-based music player, so that a user wearing this glove can control the volume and change tracks (skipping to the next or previous song). We also want to add other common music-player functions to the glove, such as play, pause, stop, fast forward, and rewind. The glove will work based on a set of predefined gestures: the accelerometer data will be sent to the host device, where we will build a feature vector from the accelerations and feed it to a convolutional neural network to classify the different gestures (a sketch of such a classifier is shown below).
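
As a sketch of how that classifier could be prototyped in Python with Keras (just one possible toolkit; the window length, channel count, and number of gesture classes below are placeholder assumptions):

import tensorflow as tf

WINDOW = 50          # samples per gesture window (assumed)
CHANNELS = 3         # x, y, z acceleration
NUM_GESTURES = 7     # e.g. volume up/down, play, pause, stop, next, previous (assumed)

# A small 1D convolutional network over windows of raw acceleration data.
model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 5, activation="relu", input_shape=(WINDOW, CHANNELS)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(windows, labels, epochs=...) once labeled gesture recordings are collected.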

Target: Instead of using only predefined gestures, we can let users train the glove themselves and map a specific gesture to a specific task on the host machine. For example, a user could define a "thumbs up" gesture to open a web browser.

Stretch: Instead of communicating with a Linux machine, the user should be able to use this glove to perform specific tasks on an Android device. The biggest challenge for this part is that I have not yet found an API to connect these Beans to an Android phone.

Idea for a larger project: The user should be able to use this glove to control a smart home, so that by performing a specific gesture the user can turn the house lights on or off, change TV channels, and so on.

 
