Part 10 – Immersive Projections – Wii Balance Board & Processing

In this work I create an immersive experience that allows physical movement to directly change a person's immediate surroundings. While practicing meditation or yoga on a balance board, the values of the participant's weight distribution influence the position, scale and rotation of projected shapes. I created the shapes dataset by running my machine-learning-generated model through object detection, where the machine looks for things it recognises in an image and highlights each shape with a label and a confidence percentage. Created using Processing.org, Runway ML and OSCulator.


I wanted to come back to my earlier work, where movement tracked on the Wii balance board was directly linked to my Processing sketch and the visual projection of shapes.

I started refining this work by revisiting the version of the sketch that works without the balance board. I wanted to begin by adding a coloured background that could potentially change with movement on the balance board. For example, the closer someone is to the middle of the mat, the brighter the colour, or perhaps it moves through different shades.

I began by adding a background whose green value is chosen with the random() function, and simplifying the code to draw only one shape at a time.
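As a minimal sketch of that starting point (the shape and the colour values here are illustrative, not my actual code):

```processing
// Background whose green channel is picked by random() each frame,
// with only one shape drawn on top at a time.
void setup() {
  size(500, 500);
}

void draw() {
  background(0, random(255), 80);         // random green value
  ellipse(width/2, height/2, 120, 120);   // a single shape per frame
}
```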

Dan Shiffman ‘What is lerp? (Linear Interpolation)’ : https://www.youtube.com/watch?v=8uLVnM36XUc
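The lerp() function covered in the video interpolates between two values: lerp(a, b, t) returns a + (b − a) * t. A small sketch of how it can ease a value towards a target (the target and the easing amount are just examples):

```processing
float x = 0;  // current (eased) position

void setup() {
  size(500, 500);
}

void draw() {
  background(0);
  // Move 5% of the remaining distance towards the target each frame,
  // so changes arrive gradually rather than abruptly.
  x = lerp(x, mouseX, 0.05);
  ellipse(x, height/2, 40, 40);
}
```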

Thursday 15th April

I have been considering how to present this work, and I imagine it as an immersive experience where light and sound complement yoga practice. I like the idea that the movement of the body somehow affects and changes the participant's immediate surroundings and environment. This gives someone seeking meditation and breathing practice, combined with intentional movement, a certain amount of control over their environment. Yoga often focuses on observing and appreciating the environment we practice in, and I would like to explore how our mindfulness practice can have a direct impact on our space.

I believe it is vital to show a person (me) in this immersive experience in the documentation of this work; since people won't be able to experience it physically, I have to present it in a planned and thought-out way. When I was making test videos with my phone in these early stages, I did some stretching exercises to visualise how the projection would look with someone practicing yoga or meditation on the Wii balance board in front of it. I wore black clothes, which don't allow the projection to display, and I realised I really enjoy how the projections looked on my skin. I am therefore thinking about trying this again in a lighter outfit, wearing something white or possibly beige, allowing the vibrant colour to appear on my body. I believe this adds another dimension to the work: being fully immersed in light and colour while practicing these movements and breathing techniques allows for a new experience.

Since I want this work to be an immersive experience, I was aiming to have generative sound and audio affected in some way by movement on the balance board. I looked at the Sound library (https://processing.org/tutorials/sound/) and implemented an example of a synthesiser with sine-wave oscillators; instead of being controlled with the mouse, it is controlled with the virtualX and virtualY values of the balance board.
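A minimal version of that synthesiser, based on the SinOsc example from the Sound library; here mouseX/mouseY stand in for the board's virtualX/virtualY values, and the frequency and amplitude ranges are placeholders:

```processing
import processing.sound.*;

SinOsc sine;

void setup() {
  size(500, 500);
  sine = new SinOsc(this);
  sine.play();
}

void draw() {
  // Left/right position controls pitch, front/back controls volume.
  // In the balance board version these come from virtualX / virtualY.
  sine.freq(map(mouseX, 0, width, 80, 800));
  sine.amp(map(mouseY, 0, height, 0.0, 0.5));
}
```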

Thursday 22nd April

It was my studio day, and Jen let me use a projector upstairs, so I did a few experiments projecting onto plinths and considered how to stage the outcome of this work.

Jen is going to help me make the transitions in my code smoother, so that the changes between background colours and shapes are not so abrupt. Over the weekend I will need to think about which audio files I want to use and manipulate, and also consider how to stage the final work and outcome. Ailsa also gave me a great idea of green-screening a video of the projection projecting onto me, which could be interesting and a direct link to my latent space walk works.

Monday 26th April

Jen has provided me with much clearer code to make the shapes fade onto the screen, and has also added a sound layer on top, where the audio mixes between four sound files depending on the mouse's x position. I am still deciding which sound files I want to use, but in the meantime I changed one of the files to the sound of waves I recorded in semester 1.
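The mixing idea can be sketched roughly like this; this is not Jen's actual code, and the file names are placeholders. Each of the four clips is loudest at its own region of the x axis and fades out as the position moves away from it:

```processing
import processing.sound.*;

SoundFile[] clips = new SoundFile[4];

void setup() {
  size(500, 500);
  for (int i = 0; i < 4; i++) {
    clips[i] = new SoundFile(this, "clip" + i + ".wav");  // placeholder names
    clips[i].loop();
  }
}

void draw() {
  // Map x position onto the range 0..3; clip i is at full volume when
  // pos == i and silent once pos is more than 1 away from it.
  float pos = map(mouseX, 0, width, 0, 3);
  for (int i = 0; i < 4; i++) {
    clips[i].amp(constrain(1 - abs(pos - i), 0, 1));
  }
}
```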

This morning I am working on making the code compatible with OSCulator and the Wii balance board, meaning that instead of the mouse, the values from the board will drive the code.
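In outline the change looks like this, using the oscP5 library; the port and address pattern are placeholders and must match whatever routing is set up in OSCulator:

```processing
import oscP5.*;

OscP5 osc;
float virtualX = 0.5, virtualY = 0.5;  // normalised board values (0..1)

void setup() {
  size(500, 500);
  osc = new OscP5(this, 9000);  // listen on the port OSCulator sends to
}

void oscEvent(OscMessage msg) {
  // The address pattern depends on the OSCulator routing.
  if (msg.checkAddrPattern("/wii/balance")) {
    virtualX = msg.get(0).floatValue();
    virtualY = msg.get(1).floatValue();
  }
}

void draw() {
  background(0);
  // The board values take over the role the mouse played before.
  ellipse(virtualX * width, virtualY * height, 60, 60);
}
```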

I will include a screen recording of these early stages of playing with the code. So far it is responding to the Wii balance board, but I still need to make changes to the audio, and the placement of the shapes needs to be mapped better. I have also realised that there might be a problem with the dataset: too much empty space is saved around the shapes, resulting in odd placement, which I will look into.

Early-stage testing of the balance board affecting the display of shapes and audio.

Tuesday 27th April

Today I had my support session with Paul, and he spent a lot of time helping me with the code, working on adding rotation, lerping and colour modes to create more interesting visual results. The balance board now has more effect on the visual outcomes, and there is more control over the placement of the shapes.
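A rough sketch of how those three additions fit together (this is not Paul's actual code; mouse values stand in for the board, and all ranges are illustrative):

```processing
float angle = 0;
float sc = 1;

void setup() {
  size(500, 500);
  colorMode(HSB, 360, 100, 100);  // hue-based colour mode
  rectMode(CENTER);
}

void draw() {
  background(200, 40, 95);
  // Ease rotation and scale towards targets derived from the input,
  // so the shapes respond smoothly rather than jumping.
  angle = lerp(angle, map(mouseX, 0, width, 0, TWO_PI), 0.05);
  sc    = lerp(sc,    map(mouseY, 0, height, 0.5, 2.0), 0.05);
  translate(width/2, height/2);
  rotate(angle);
  scale(sc);
  fill(map(mouseX, 0, width, 0, 360), 80, 90);  // hue follows position
  rect(0, 0, 100, 100);
}
```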

Paul’s sketch creating rectangles and drawing them, rotating and scaling according to values from the balance board.
Adding my dataset of shapes to Paul’s sketch.

Using Illustrator to turn my dataset of shape images into vector shapes. This will allow me to change the colour of my shapes using code in Processing. Create New > 500px x 500px > scale > Image Trace > Silhouettes > Save As > .SVG > CSS Properties: Entity References
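Once saved as .SVG, a shape can be loaded into Processing and recoloured from code. A minimal sketch of the idea (the file name is a placeholder):

```processing
PShape silhouette;

void setup() {
  size(500, 500);
  silhouette = loadShape("shape.svg");  // a traced silhouette from Illustrator
  silhouette.disableStyle();            // ignore the SVG's own fill/stroke
}

void draw() {
  background(255);
  fill(50, 100, 220, 150);  // colour and transparency now set in code
  stroke(0);
  shape(silhouette, 150, 150, 200, 200);
}
```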

Playing with choosing colour, colour modes, transparency and trying stroke.

Sped up screen recording while drawing shapes using balance board.

Wednesday 28th April

Today I have been focusing on adding sound and audio to my sketch. I first tried adding the pure tones I worked with earlier in the semester, and I thought about getting some audio of crystal bowls; if I had the means, I could even have built a sculpture where the Wii balance board directly plays the crystal bowls. In the end I decided to use an instrument called the kalimba, because to me it makes very calming yet interesting sounds and chords, so I wanted to try it out with my sketch.

I asked for my boyfriend's help and we recorded some short audio clips of various chords. I then added the sound code from Jen's example to the sketch I worked on with Paul yesterday, and luckily I managed to make it work and map it to the Wii balance board! Here is a quick video of me testing the new code with sound on the Wii balance board.

Sounds used in the Processing sketch.

I really enjoy the audio; a couple of people mentioned it sounds like wind chimes, which I like, since I am trying to achieve a melodic and calming experience. The movement on the balance board has a slight impact on the sound and volume, but there is a consistent, dominant chord playing every few seconds, which makes a great anchor for timing breath work. I am extremely happy about this: it means that practicing meditation on the board doesn't make wild, distracting sounds that throw the person completely out of rhythm; instead the audio is there as a guide and allows for relaxation.

I had Ailsa and Amber test out the Wii balance board with the new visuals in the studio today, and they enjoyed it. I am happy I was able to have a couple of people interact with it, especially since they were unfamiliar with this work. Sadly I couldn't get access to a projector today, so the visuals were on my laptop.

When I got home I considered how I can present my immersive experience at home: I need it to be dark, the studio closes at 5pm, I was unable to use the necessary equipment, and I'd need someone to do yoga, which is difficult to organise in the studio, so I will need to stage a final version of this work at home. I realised my small hallway is all white and surrounded by doors, providing a very small space, which made me think it would be a great place to create the immersive ambience and experience I want to convey. Since I made the shapes into vectors, they show up crisp and clear in great quality even though I am using a cheap projector. Over the next couple of days I am hoping to get some expressive videos of a person doing yoga on the board in the corridor, becoming immersed in the blue shapes.

Test-printing stills from the Processing sketch.

As much as I want this work to be an immersive experience, I realised I also really enjoy the stills it produces. I am really happy with my decisions on colour, transparency and stroke, and I enjoy seeing these shapes as clear vectors. Back in January I was interested in creating an experience where a person's practice of meditation and physical movement has a direct link with a visual output, and I am extremely happy I have managed to achieve that. These still images are also data visualisations: a snapshot in time of someone's physical movement.

The last thing I wanted to attempt before submission was to create a small virtual gallery of my works. I took some screenshots along the way, and the final version can be found in the Final Artefacts Documentation part of my learning journal. I exported two builds, one for macOS and one for Windows, since my little sister uses a PC and I wanted someone to test it out for me before submission.

Video showing how drawing with the Wii balance board shapes dataset works.

Immersive Projections provides a new way to practice yoga, where physical movement has an impact on and influence over one's immediate surroundings. The sound is slightly adjusted by movement, but a steady chord plays to help with guided breathing during practice. This change to the surroundings allows for a tranquil and calm practice, so that even though it is staged at my home, the space is completely transformed and full attention can be placed on mindfulness practice. The shapes are unique yet organic, even though they were ultimately selected through machine-learning object detection.
