
SkrillTrex: The Dubstep Dino with Arduino and Processing

Sometimes you have an idea. It makes no sense. You can think of zero practical application for it, except for its potential for amusement. It's a "because I can" project. This is one of those.

What is it? It's a TRex that drops bass. A dubstep dino. SkrillTrex. Late one night, actually early morning at that point, I was at a hackathon with several of my co-workers for TechDisrupt. It was around 3 am and a huge case of the "let's get loopy" had settled in. One of my colleagues was walking around, dropping bass, yelling "WUB WUB WUB" as she circled our work table. That is where the idea started.

WUB WUB WUB

What if I could create dubstep? What if I could just drop bass whenever I wanted? What if it was a different type of controller – something different than the typical MIDI boards and controllers DJs use? What if it was a puppet?

I started to search Amazon for puppets. I found an alligator. Not bad. A raven. Yep, getting closer. Then I found it – a TRex. Yep, I could make SkrillTrex, the dubstep dino. I could add some bad hair that looked like a mop. I could find some hipster glasses. Apparently other people thought a TRex and dubstep seemed like a good combo.


One click and it was paid for, and on its way to me, while my Amazon purchasing history was irreparably screwed.

I bought a flex sensor from Adafruit, and we realized it would need a bit more structure with some cardboard to fit nicely in the dino's mouth.

A flex sensor is an analog sensor. It returns a range of values, so you can hook it up and see what values you are getting to help calibrate the range. I hooked it up to an Arduino Uno, where I started to note the readings I was getting from the sensor, from open (not bent) to closed (bent in half). Once I had those values, I could do whatever I wanted with them.
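
To get those readings into Processing later, the simplest route is to have the Arduino Serial.println each raw analogRead value and listen for it on the other end. A minimal calibration sketch along those lines – the port index and baud rate are just whatever your setup uses – looks roughly like this:

import processing.serial.*;

Serial arduino;        // connection to the Arduino Uno
int flexValue = 0;     // latest raw flex sensor reading

void setup() {
  size(200, 200);
  println(Serial.list());                            // find your Uno's port in this list
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');                         // fire serialEvent once per full line
}

void draw() {
  background(0);
  text(flexValue, 20, height / 2);                   // watch this while opening and closing the mouth
}

void serialEvent(Serial port) {
  String line = port.readStringUntil('\n');
  if (line != null) {
    flexValue = int(trim(line));
    println(flexValue);
  }
}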


Originally I intended, and still do, to map those values and use them in Ableton – a music program many people use to create dubstep, since you can connect to it through MIDI. I discovered that making dubstep was something I could potentially learn from the educational platform of the interwebs, YouTube. But for the purpose of just getting a prototype out, I chose to use Processing.

With Processing, I captured the video camera feed and drew it back to the canvas, pixelating the image by dividing it into a grid and finding the average color of each cell. Then I'd draw the video capture back using shapes, something super simple to do in Processing.
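
Stripped down to the essentials, that part of the sketch looks roughly like this (the 640x480 capture size, the 10 pixel grid and the ellipses are just placeholder choices):

import processing.video.*;   // Processing's Video library

Capture cam;
int gridSize = 10;           // size of each cell in the grid

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  noStroke();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  if (cam.width == 0) return;   // no frame from the camera yet
  cam.loadPixels();
  background(0);

  // Walk the frame one grid cell at a time, average the color inside
  // each cell, and draw a single shape with that color.
  for (int y = 0; y < cam.height; y += gridSize) {
    for (int x = 0; x < cam.width; x += gridSize) {
      float r = 0, g = 0, b = 0;
      int count = 0;
      for (int dy = 0; dy < gridSize && y + dy < cam.height; dy++) {
        for (int dx = 0; dx < gridSize && x + dx < cam.width; dx++) {
          color c = cam.pixels[(y + dy) * cam.width + (x + dx)];
          r += red(c);
          g += green(c);
          b += blue(c);
          count++;
        }
      }
      fill(r / count, g / count, b / count);
      ellipse(x + gridSize / 2, y + gridSize / 2, gridSize, gridSize);
    }
  }
}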


Processing has a great library for manipulating and playing sound called Minim. Minim can be used outside of Processing as well. To use Minim in Processing, you need to import the library. A library can be imported by going to Sketch > Import Library > Add Library and then searching for Minim. A bonus is that you get a bunch of examples that show you how to work with Minim.
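
Once it's installed, getting a clip playing only takes a handful of lines – something like this, with a placeholder filename standing in for one of the dubstep clips in the sketch's data folder:

import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setup() {
  size(400, 200);
  minim = new Minim(this);
  // "wub.mp3" is a placeholder – drop any clip into the sketch's data folder
  player = minim.loadFile("wub.mp3");
  player.loop();
}

void draw() {
  background(0);
}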

With Minim, I would change the volume and the position of a random dubstep audio clip based on the flex sensor input. Minim has the ability to change volume or gain, but not all computers and configurations support these functions, so I had to check what my setup supported by inspecting the object, and discovered I'd have to change the gain. You can check to see if Minim has a control by doing something like: hasVolume = player.hasControl(Controller.VOLUME);
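
Continuing with the player from the sketch above, the check and the fallback to gain look something like this:

boolean hasVolume = player.hasControl(Controller.VOLUME);
boolean hasGain   = player.hasControl(Controller.GAIN);
println("volume: " + hasVolume + ", gain: " + hasGain);

if (hasVolume) {
  player.setVolume(0.8);    // typically a 0.0 to 1.0 range when it's supported
} else if (hasGain) {
  player.setGain(-10.0);    // decibels – the usable range depends on your system
}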

After realizing that I'd have to use gain, I had to find out what range the gain was operating within. For me, gain had a scale from -80 to 0, so I had to map the values from the flex sensor to that range.
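
With the open and closed readings from the calibration step, Processing's map() does the heavy lifting. The raw range below is just an example – plug in whatever your sensor actually reports:

// flexValue is the raw serial reading; 250 (open) and 600 (closed) are example values
float gain = map(flexValue, 250, 600, -80, 0);
gain = constrain(gain, -80, 0);   // keep noisy readings inside the gain range
player.setGain(gain);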

Drawing an audio waveform based on the left and right channels of the sound buffer is easy to do in Minim with some built-in functions. I would then layer on the waveform every time the dino dropped some bass. Wub wub wub.
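
The pattern is basically the one from Minim's own examples – inside draw(), something like this, with arbitrary vertical offsets and scaling:

stroke(255);
for (int i = 0; i < player.bufferSize() - 1; i++) {
  float x1 = map(i, 0, player.bufferSize(), 0, width);
  float x2 = map(i + 1, 0, player.bufferSize(), 0, width);
  // left channel drawn near the top of the sketch, right channel below it
  line(x1, 50 + player.left.get(i) * 50, x2, 50 + player.left.get(i + 1) * 50);
  line(x1, 150 + player.right.get(i) * 50, x2, 150 + player.right.get(i + 1) * 50);
}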


I recorded a SkrillTrex demo and put it up on YouTube.

Right now, this is a prototype using Processing, but I hope to port it to a Spark Core and create a few more puppets to control different aspects of the composition, integrating with Ableton. Finally, I hope to capture a recording of the session as well. So there's still lots to do, but it was something small and fun to execute to get the idea across. Imagine a few people creating music together, using puppets. Using a TRex, an owl, a beaver – imagine if I could find a honey badger puppet?


Curious about the code? I posted a Gist of the sketch here and below. If you are interested in getting the package including sound for the entire project, just hit me up on Twitter.

