Kinect Invisible Drumset

This program uses the RGB camera and tracking program to display a tracked user in front of a background, with a drum set image drawn over the user. While the Kinect is tracking the user, the z coordinate of the drum area is set relative to the user's right knee, and specific x/y coordinates then trigger the sounds that correspond to each drum in the image.
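As a rough illustration of the idea (not the project's actual code), here is a minimal sketch of what the hit detection could look like with SimpleOpenNI: the right hand joint is projected into screen space, compared against a drum's x/y region, and checked against a depth taken from the right knee. The region bounds, confidence threshold, and depth tolerance are placeholder values.

import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser();   // user tracking (SimpleOpenNI 1.96 API)
}

void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}

void draw() {
  kinect.update();

  int[] users = kinect.getUsers();
  if (users.length == 0) return;
  int userId = users[0];
  if (!kinect.isTrackingSkeleton(userId)) return;

  // Real-world joint positions (millimetres)
  PVector rightHand = new PVector();
  PVector rightKnee = new PVector();
  float handConf = kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, rightHand);
  float kneeConf = kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, rightKnee);
  if (handConf < 0.5 || kneeConf < 0.5) return;   // skip low-confidence frames

  // Project the hand into screen space so it can be compared
  // against the x/y region drawn for each drum.
  PVector handScreen = new PVector();
  kinect.convertRealWorldToProjective(rightHand, handScreen);

  // Hypothetical snare region: a screen rectangle, plus a depth check
  // relative to the right knee so the hit plane follows the player.
  boolean overSnare = handScreen.x > 200 && handScreen.x < 320 &&
                      handScreen.y > 300 && handScreen.y < 380;
  boolean atDrumDepth = abs(rightHand.z - rightKnee.z) < 100;   // within ~10 cm of the knee's depth

  if (overSnare && atDrumDepth) {
    // trigger the snare sample here
  }
}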

The program has a few limitations. When the user sits down, the Kinect has trouble tracking the user at a high confidence level; this appears to be a SimpleOpenNI limitation. The kick drum is also not very accurate, and a physical button to push would work better for it. Lastly, the way Processing is set up, only one sound can play at a time. A workaround would be to record a separate sound for each combination of drums that can be hit, then use a series of if-else statements to check whether multiple drums are being struck and trigger the corresponding combination sound. I haven't implemented this, since it would require a lot of code rewriting, mainly having each drum hit return a boolean value and checking every combination to activate the corresponding sound. A number of files are needed, including the background, drum, and sound files.
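For reference, here is a minimal sketch of that combination-sound workaround, assuming the Minim library and hypothetical pre-mixed sample files (the sound library and file names in the actual project may differ).

import ddf.minim.*;

Minim minim;
AudioSample snare, hihat, snareAndHihat;   // snareAndHihat is a hypothetical pre-mixed combination sample

void setup() {
  minim = new Minim(this);
  snare         = minim.loadSample("snare.wav");         // file names are placeholders
  hihat         = minim.loadSample("hihat.wav");
  snareAndHihat = minim.loadSample("snare_hihat.wav");   // both drums mixed into one file
}

// Each drum's hit test would return a boolean; the combination logic
// then picks the single sample that covers everything hit this frame.
void playHits(boolean hitSnare, boolean hitHihat) {
  if (hitSnare && hitHihat) {
    snareAndHihat.trigger();
  } else if (hitSnare) {
    snare.trigger();
  } else if (hitHihat) {
    hihat.trigger();
  }
}

void draw() {
  // The hit detection (as in the tracking sketch above) would set the
  // boolean flags each frame, then call:
  // playHits(hitSnare, hitHihat);
}

With more drums, the number of combination samples grows quickly, which is the main reason this would take a lot of rewriting.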

The code is long, so I've uploaded it to my GitHub repository here.
