Final Documentation – Beats Exposed

The Project

Beats Exposed is an interactive performance experience that breaks down the barrier between audience and performer. By exposing the body's vital signs, the performer invites the audience to see beyond the polished act and into the extreme physical and personal effort behind it.

Beats Exposed is built to be used in performance on, or off, stage. It is lightweight and able to run in a variety of settings.

Our current iteration of the project is performed with an aerialist. It exposes the exertion in an artform that is extremely demanding, yet typically meant to appear effortless.

Our project senses the aerial performer's heartbeat with a Polar pulse sensor and sends this pulse wirelessly via two Moteino boards. The pulse is then used in a P5 (JavaScript) sketch to affect audio and visualizations.

In this experience, the audience hears the sound of a heartbeat timed with the performer’s pulse. The visualization, also reacting to the pulse, projects from the ceiling onto the performer, surrounding area, and any audience members that have come in close. The resulting experience is intimate, personal and engaging.

_A1A0183
_A1A0206
_A1A0236
The Polar band is able to send a reliable heart pulse even during movement

Our method for measuring the heartbeat of the performer changed drastically over the course of our testing. The first version worked via sticky electrodes attached to the chest and connected to an EKG operational amplifier, which amplifies the minute voltages generated by the heart. We quickly ran into noise issues whenever the performer made even a slight movement. After trying some simple signal processing in Arduino, as well as modifying our EKG breakout board to include heavy filtering, we found it very hard to get a reliable signal.
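As a rough illustration (not our exact code), the "simple signal processing" we tried was along these lines, with the analog pin, averaging window, and beat threshold all placeholders:

// The kind of simple processing we tried on the amplified EKG signal:
// a moving average to smooth the waveform and a fixed threshold to call a beat.
// Pin, window size, and threshold are placeholders; this approach still fell
// apart as soon as the performer moved.
const int EKG_PIN = A0;
const int WINDOW  = 8;          // moving-average window
int samples[WINDOW];
int idx = 0;
long total = 0;
const int THRESHOLD = 600;      // placeholder beat threshold (0-1023 scale)
bool aboveThreshold = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  total -= samples[idx];
  samples[idx] = analogRead(EKG_PIN);
  total += samples[idx];
  idx = (idx + 1) % WINDOW;
  int smoothed = total / WINDOW;

  // A rising edge through the threshold counts as one beat
  if (!aboveThreshold && smoothed > THRESHOLD) {
    aboveThreshold = true;
    Serial.println("beat");
  } else if (smoothed < THRESHOLD) {
    aboveThreshold = false;
  }
  delay(2);                     // roughly 500 samples per second
}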

After a lot of research into portable EKG units worn by the 'patient', I began to learn the importance of using an accelerometer alongside the electrodes in order to filter out movement noise. The first time I heard of such a thing was buried in a YouTube video:

 

Screen Shot 2015-12-15 at 11.53.24 AM

Combining the operational amplifier with an accelerometer via a small microcontroller is also described in the datasheet for the heart-rate sensor.

Trying to implement this kind of digital signal processing ourselves was quickly starting to look too difficult for a project due in 2 weeks!

Trying to change around the circuit on the breakout board was a challenge!

Eventually we found a product, the Polar T34 pulse monitor, that handles this processing. By using this band as our sensor, we were able to get a very accurate heartbeat pulse that was free from noise, even during movement.

Our final human interface works by receiving a pulse signal from the Polar monitor for every heartbeat, which is then transmitted wirelessly by a Moteino (over a 915 MHz serial bridge) worn by the performer. This 915 MHz signal is received by a second Moteino that sends the pulse along to P5 via serial. To make sure that P5 sees every pulse, we stretch each heartbeat pulse to about 320 ms.
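As a minimal sketch of the receiving side (assuming the LowPowerLab RFM69 library, example node/network IDs, and a one-byte message per beat; our actual code is in the GitHub gist linked below):

// Receiver Moteino: listens for a "beat" packet on 915 MHz and forwards
// one byte over serial for the P5 sketch to read.
#include <RFM69.h>
#include <SPI.h>

#define NODE_ID     2        // this receiver (example ID)
#define NETWORK_ID  100      // must match the wearable transmitter
#define FREQUENCY   RF69_915MHZ

RFM69 radio;

void setup() {
  Serial.begin(9600);
  radio.initialize(FREQUENCY, NODE_ID, NETWORK_ID);
}

void loop() {
  if (radio.receiveDone()) {   // a packet arrived from the wearable
    if (radio.ACKRequested()) {
      radio.sendACK();         // acknowledge so the wearable can retry on loss
    }
    Serial.write('B');         // one byte per heartbeat for P5 to read
  }
}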

Our project used lots of Kapton tape to insulate between the battery and three circuit boards

Our original proposal included a breath/stretch sensor in addition to the pulse sensor. We thought implementing this would be relatively simple. In reality, the stretch sensor was very sensitive to movement and we were unsuccessful at getting a usable reading.

It took three iterations over about one month to get the pulse sensor working as we intended. We tried an optical pulse sensor, a three-electrode EKG heart monitor, and a two-electrode heartbeat monitor. All three of these sensors were plagued with electrical noise whenever the performer moved. Finally we arrived at the Polar T34 heartbeat band, which is designed to produce a reliable heartbeat reading even during heavy physical movement. We found this band provided an extremely accurate representation of the performer's heartbeat, even while they were moving.

We spent a lot of time working with the serial communication to get a reliable signal. Once the signal was in P5, we found we had to simplify our sketches significantly in order for them to run reliably. In the future we would likely try using Processing for visuals.

Our final code for the project can be found on GitHub: https://gist.github.com/lisajamhoury/e8a7de48f8155078efe2

Interestingly, some of our testers had a negative reaction to the sound of the heartbeat played over headphones. We don’t yet understand why some find the sound unbearable, while others find it centering and calming. We will continue to refine the sound and user test to get a better understanding of this sensitivity.

 

Final Project – More updates

Our final project is coming along great so far. The biggest change we have introduced is switching to the Polar heart band instead of the electrodes and AD8232 heart monitor we started with. We had found a second circuit in the AD8232 datasheet that adds heavier noise filtering, but modifying the SparkFun breakout board to match it meant following tiny traces and components, which was hard.

Screen Shot 2015-12-02 at 11.11.17 AM

Even after modifying the AD8232 breakout board to the heavy noise-filtering circuit, we were still unable to get a reliable signal

After extensive research, we learned that in order to reliably monitor the tiny voltages associated with heart rate during movement, we would need an accelerometer, so that the movement data could be used to filter motion noise out of the signal. Luckily a product exists that takes care of a lot of this front-end work already: the Polar heart band!

The Polar band is able to send a reliable heart pulse even during movement

The Polar band sends a low-frequency, 16 ms pulse for every heartbeat. We receive this with a proprietary Polar receiver, which then pulls an input pin high on the wearable Moteino for every beat.

This pulse is too short for p5 to catch reliably, so we stretch its duration to 320 ms before outputting it to serial:

Screen Shot 2015-12-02 at 11.25.36 AM
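The screenshot above shows our actual code; a minimal sketch of the same pulse-stretching idea looks roughly like this (the pin numbers are assumptions):

// Pulse stretcher: the Polar receiver drives BEAT_IN high for ~16 ms per beat.
// We re-emit each beat as a 320 ms HIGH on BEAT_OUT so nothing downstream
// (radio link, serial, P5) misses it.
const int BEAT_IN  = 3;    // output of the Polar receiver module (assumed pin)
const int BEAT_OUT = 9;    // stretched pulse read by the radio/serial code (assumed pin)
const unsigned long STRETCH_MS = 320;

unsigned long beatStart = 0;
bool stretching = false;

void setup() {
  pinMode(BEAT_IN, INPUT);
  pinMode(BEAT_OUT, OUTPUT);
}

void loop() {
  if (!stretching && digitalRead(BEAT_IN) == HIGH) {
    stretching = true;               // rising edge: start a new stretched pulse
    beatStart = millis();
    digitalWrite(BEAT_OUT, HIGH);
  }
  if (stretching && millis() - beatStart >= STRETCH_MS) {
    stretching = false;              // 320 ms elapsed: end the pulse
    digitalWrite(BEAT_OUT, LOW);
  }
}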

Here is our updated BOM for the wearable:

Screen Shot 2015-12-02 at 11.40.03 AM

Final Project – Riser and BOM updates

This week we honed in on our Bill of Materials and riser diagram for our final. For the wireless serial aspect of our project, we are using the RFM69 chipset at 915 MHz.

We decided to eliminate the upper dome and just go with the bottom one. Currently I am experimenting with mounting the projector in the ceiling vs. in the dome with a mirror in the ceiling. Either way our new goal is for the projections to be on the floor.

FullSizeRender 9

 

Messing with projectors and mirror squares in my apartment:

IMG_1274

Stop Motion animation – A journey through P comp

We moved ahead with our stop motion animation using Karl the lego construction man.

The process was very time-consuming, but I enjoyed it, and I felt the three of us (Katy, Mathura and myself) all worked very equally on it. From setup to editing, the process took us 12 hours for a 49-second video!

The music was taken from Chip's Challenge, an old computer game:

 

Physical Computing Final – Humanizing Performance

The key ingredient in performance art of any kind is the human performer. For our Physical Computing final, Lisa Jamhoury, Danielle Butler, and I would like to use electrical sensors and computational media to better translate the action and emotion of a performer to his or her audience. For this project, the focus is an aerial performance. We would like to draw an audience closer to an aerial performer, and to do so outside the confines of a traditional performance space. To meet these goals, we will create a portable device that takes inputs from the human performer and translates them into lighting and projected visuals.

some of our original sketches:

FullSizeRender 4

FullSizeRender 5

DSCF0310-700

One main priority is to only use physical inputs that produce definite translations. I think using inputs that we can fully count on, and coding them to work as fluidly as possible, is key. (I'm not sold on the Xbox Kinect.)

The basis of my idea of interacting with the performer comes from my work restoring a sculpture designed by Robert Rauschenberg. In 2012, I was asked by Trisha Brown Dance Company to restore the electronic set pieces for Astral Converted. The set was composed of eight wireless, aluminum-framed towers, each containing light-sensor-activated sound and lighting. Designed by Robert Rauschenberg in 1989 and constructed by engineers from Bell Laboratories, the towers turn their lamps on and off based on the performers' movements. Each tower also plays a different part of the musical score via an individual tape deck.

Screen Shot 2015-10-28 at 10.54.34 AM

Screen Shot 2015-10-28 at 10.54.42 AM

I'm excited about taking this idea of car headlights and tape decks controlled by a dancer's movement, and scaling it up with digital sensors and stronger visuals.

FullSizeRender 8

Here is our initial block diagram:

FullSizeRender 6

Our project will take readings from a stretch sensor and a heart-rate sensor on a performer and translate them to RGBW LED strips and projected images running from a p5 sketch.
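A first-pass test sketch for the wearable side might look like this (assuming the stretch sensor is a simple voltage divider on A0 and a single PWM channel standing in for the RGBW strip):

// Proposal-stage test: map a stretch sensor (voltage divider on A0) to the
// brightness of one LED channel and echo the raw reading to serial so a p5
// sketch can use it. Pin numbers and ranges are assumptions.
const int STRETCH_PIN = A0;
const int LED_PIN     = 9;    // PWM pin standing in for one RGBW strip channel

void setup() {
  Serial.begin(9600);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int raw = analogRead(STRETCH_PIN);            // 0-1023
  int brightness = map(raw, 0, 1023, 0, 255);   // scale to PWM range
  analogWrite(LED_PIN, brightness);
  Serial.println(raw);                          // p5 reads this over serial
  delay(20);                                    // ~50 readings per second
}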

Here is our timeline and BOM so far:

Screen Shot 2015-11-04 at 10.31.38 AM

Screen Shot 2015-11-04 at 10.33.50 AM

ICM Final – Humanizing Performance art

The key ingredient in performance art of any kind is the human performer. For our ICM final, Lisa Jamhoury and I would like to use computational media to better translate the action and emotion of a performer to his or her audience. This goal is part of a bigger project that Lisa, Danielle Butler, and I have dreamed up, whose focus is an aerial performance. We would like to draw an audience closer to an aerial performer, and to do so outside the confines of a traditional performance space.

FullSizeRender 4

Every idea starts with chicken scratches (and coffee)

The part of this project we will be executing in ICM is the projected visuals, which are manipulated by sensors placed on the performer and the aerial apparatus, and possibly by sensors on or directed at the audience.

My goal is to only use physical inputs that produce definite translations. I think using inputs that we can fully count on, and coding them to work as fluidly as possible, is key. (I'm not sold on the Xbox Kinect.)

Some of what I did in p5 earlier this semester will prove very helpful as basic stepping stones for this project.

Manipulating p5 elements via serial data and physical sensors:

visuals based on sound:

Screen Shot 2015-10-27 at 8.46.34 AM

 

A lot of visuals that our classmates have done come to mind:

Jamie Charry:

http://itp.jcharry.com/category/visual_language/

Screen-Shot-2015-10-12-at-11.00.51-PM

Yao:

http://www.wangxinyao-design.com/?p=110

屏幕快照-2015-09-22-2.49.59-AM

My humble moon that Yao helped me with:

IMG_0945

 

The basis of my idea of interacting with the performer comes from my work restoring a sculpture designed by Robert Rauschenberg. In 2012, I was asked by Trisha Brown Dance Company to restore the electronic set pieces for Astral Converted. The set was composed of eight wireless, aluminum-framed towers, each containing light-sensor-activated sound and lighting. Designed by Robert Rauschenberg in 1989 and constructed by engineers from Bell Laboratories, the towers turn their lamps on and off based on the performers' movements. Each tower also plays a different part of the musical score via an individual tape deck.

Screen Shot 2015-10-28 at 10.54.34 AM

Screen Shot 2015-10-28 at 10.54.42 AM

I'm excited about taking this idea of car headlights and tape decks controlled by a dancer's movement, and scaling it up to p5 and digital sensors.

Physical Computing – final project ideas

I'm now starting to brainstorm some ideas to implement for the physical computing final. Since we only have 4 or 5 weeks to create something, I'm hoping to close in on a plan that I can use as a stepping stone for other projects I do here.

As of now I am between two ideas:

  • A portable light and sound device that is used in conjunction with a performance piece
  • An interactive light that is powered by salt water and is made predominantly of reusable / biodegradable materials

The portable performance piece idea I have is most certainly a borrowed one. In 2011 and 2012, I worked on restoring a sculpture developed by Robert Rauschenberg and Bell Labs engineer Billy Klüver. Completed in 1989, the set consisted of eight battery-powered towers on wheels that lit and produced sound for a dance piece choreographed by Trisha Brown. Both sound and light were controlled by the dancers' movement.

Screen Shot 2015-10-28 at 10.54.34 AM

Screen Shot 2015-10-28 at 10.54.42 AM

My idea is to produce a similar portable device, or set of devices, that creates light and sound for a performance piece. In the 1980s, Rauschenberg's sculptures certainly had a more futuristic feeling to them than they did in 2012. I think it would be great to take this concept and give it a more modern feel, or at the very least use more contemporary components than photocells, car headlights, and cassette decks. An important part of this project for me is to make it as portable as possible while remaining as scalable as possible.

 

My second idea is very different! I have been working on a light operated by saltwater for a few years. My concept is rooted in developing a light that can be used by anyone in the world, anywhere in the world that saltwater is available. There are several lanterns and Kickstarter projects that operate on saltwater:

51xogt40RBL

i2zva2xeqzm1fknjx3gn

 

My plan would be to utilize the saltwater battery part, but add an interactive element to it, use materials other than plastic, and keep its internal components as simple as possible. I want my project to produce more than a simple utility light; it needs to be fun as well. Working with materials that are friendly to the environment is also an important part.

 

 

Hipster Oscilloscope

This week I used the p5 sound library to map an oscilloscope to the microphone on my computer. This came after a week of frustration trying to work with RGB pixel tape and p5. I definitely need some more time to get that working! I decided learning about routing sound in p5 would be a good fallback.

The code uses the waveform generator in the p5 sound library to draw three waveforms.

By toggling a DOM button, you can switch between two visualizations.

Screen Shot 2015-10-27 at 8.46.25 AM

Screen Shot 2015-10-27 at 8.46.34 AM

The slider controls the volume / amplitude.

Here is the sketch:

http://www.aaronparsekian.com/projects/oscilloscope/

RGB proximity candle

For the midterm, Jarrett and I made an LED candle that is controlled with a proximity sensor. The proximity sensor controls the hue of the RGB LED, and via serial over Bluetooth it controls the volume of a sound file in p5.

Here is an initial test:

IMG_1141

Here is the code on the Arduino:

The function that I used to convert a single hue value to RGB is very elegant. It was written by Harley Pebley and can be found here: http://www.skylark-software.com/2011/01/arduino-notebook-rgb-led-and-color_21.html
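A simplified version of the candle logic looks like this (pin numbers are assumptions, and the hue conversion below is a standard HSV-to-RGB routine rather than Pebley's exact function):

// Proximity candle sketch (illustrative): an analog proximity sensor sets the
// hue of a common-cathode RGB LED, and the raw reading is echoed over serial
// (via the Bluetooth module) so p5 can scale the sound volume.
const int PROX_PIN = A0;
const int R_PIN = 9, G_PIN = 10, B_PIN = 11;    // PWM pins (assumed)

void hueToRGB(float hue, byte &r, byte &g, byte &b) {
  // hue in [0, 360), full saturation and value
  float h = hue / 60.0;
  float x = 1.0 - fabs(fmod(h, 2.0) - 1.0);
  float rf = 0, gf = 0, bf = 0;
  if      (h < 1) { rf = 1; gf = x; }
  else if (h < 2) { rf = x; gf = 1; }
  else if (h < 3) { gf = 1; bf = x; }
  else if (h < 4) { gf = x; bf = 1; }
  else if (h < 5) { rf = x; bf = 1; }
  else            { rf = 1; bf = x; }
  r = rf * 255; g = gf * 255; b = bf * 255;
}

void setup() {
  Serial.begin(9600);     // goes out over the Bluetooth serial module
}

void loop() {
  int raw = analogRead(PROX_PIN);              // 0-1023
  float hue = map(raw, 0, 1023, 0, 359);       // distance sets the hue
  byte r, g, b;
  hueToRGB(hue, r, g, b);
  analogWrite(R_PIN, r);
  analogWrite(G_PIN, g);
  analogWrite(B_PIN, b);
  Serial.println(raw);                         // p5 maps this to volume
  delay(30);
}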

And here is the p5 sketch used to control the sound via serial: