Part 8 – Slitscan – Transforming Yoga Practice Videos

I wanted to transfigure the visuals of practicing yoga, stepping away from realistic images of me holding poses and moving towards creating abstract forms and illusions of how a machine might visualise my movement. Slit-scans created using processing.org; original videos are of me practicing yoga.


Monday 15th March

Today I was looking through some example sketches in Processing and I found the SlitScan example, which uses the webcam to create a slit-scan image, almost like the sideways panorama mode on phones.

Testing the SlitScan example in Processing.

I had the idea that this could be really interesting if I fed in a yoga practice to see how that kind of movement would be registered. I think this would tie in nicely with some of my Runway outputs.

I started by just using the sketch from the examples, which reads in video from my laptop webcam.
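The core of the technique can be sketched in a few lines. This is my reconstruction of the idea behind the SlitScan example rather than the exact example code: each frame, a single vertical line of pixels is copied from the webcam image into the next column of the window, so movement smears sideways over time. It assumes the Processing Video library is installed and a webcam is available.

```processing
// Minimal slit-scan sketch, along the lines of the SlitScan example
// that ships with the Processing Video library (a reconstruction,
// not the exact example code).
import processing.video.*;

Capture cam;
int drawX = 0;  // where the next column of pixels will be drawn

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
  background(0);
}

void draw() {
  if (cam.available()) {
    cam.read();
    // copy one vertical line from the centre of the frame
    // into the next column of the sketch window
    copy(cam, cam.width / 2, 0, 1, cam.height, drawX, 0, 1, height);
    drawX = (drawX + 1) % width;  // wrap around like a panorama
  }
}
```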

Tuesday 23rd March

These are some screenshots of me testing the processing slitscan sketch while doing yoga.

Rebecca McSherry lent me some code she had been working on at the start of the year, which allows a movie to be read in instead of the webcam. This meant I could begin experimenting with higher-quality videos in the slit-scans.

A screenshot of a yoga video in the slit-scan Processing sketch.
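Swapping the webcam for a movie file mainly means replacing `Capture` with the Video library's `Movie` class. A minimal sketch of that variant might look like this (the filename `yoga.mp4` is a placeholder; the video file goes in the sketch's data folder):

```processing
// Movie-based slit-scan: the Movie class replaces Capture, and
// movieEvent() delivers each new frame. "yoga.mp4" is a
// hypothetical filename for illustration.
import processing.video.*;

Movie mov;
int drawX = 0;

void setup() {
  size(1280, 720);
  mov = new Movie(this, "yoga.mp4");  // placeholder filename
  mov.loop();
  background(255);
}

void movieEvent(Movie m) {
  m.read();  // grab each new frame as it becomes available
}

void draw() {
  if (mov.width == 0) return;  // wait for the first frame to arrive
  copy(mov, mov.width / 2, 0, 1, mov.height, drawX, 0, 1, height);
  drawX = (drawX + 1) % width;
}
```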

My vision for these visuals is to project them onto large-scale abstract paintings I will paint. I aim to have large white rectangles in the backgrounds of the paintings so that I can clearly project the moving slit-scans onto them.

6 April 2021

Above I was figuring out how to play video through VPT as a projection, but the next step was to figure out how to make a Processing sketch play through VPT.

It took me some time to figure out, as the last time I used VPT was in first year, but I worked out that I needed to download a Processing library and add some extra code to my sketch to make it work.
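The usual way to get a Processing sketch into VPT on macOS is the Syphon library (installed via Sketch > Import Library), and the extra code amounts to a few lines. A sketch of the additions, assuming Syphon is the library in question; the server name "Yoga Slitscan" is just an example:

```processing
// Extra code to publish a sketch's frames to VPT via Syphon
// (macOS only). The sketch must use the P2D or P3D renderer.
import codeanticode.syphon.*;

SyphonServer server;

void setup() {
  size(1280, 720, P2D);  // Syphon needs an OpenGL renderer
  server = new SyphonServer(this, "Yoga Slitscan");  // name VPT will list
}

void draw() {
  // ...draw the slit-scan as before...
  server.sendScreen();  // publish the current frame to Syphon
}
```

In VPT the sketch then appears as a Syphon source that can be assigned to a layer like any video input.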

Friday 9 April

Over the last week I have been working on finishing my paintings and now I can experiment with projecting onto them.

I first mapped out where the paintings were with ‘solids’. Since I have to project in the dark, I will be illuminating the paintings with the projector, so the white solids will actually be what makes the paintings visible.

I have actually decided to change the slit-scan which will be projected onto the paintings. Previously I started with quite a smooth slit-scan, shown below.

After trying this one, I would have been happy with it in a gallery space where the 20-minute video could play in a loop. However, since everything is online and digital this year, I won’t be posting a 20-minute video of the slit-scan, so I have decided to adapt. Instead I took the 20-minute video of me doing yoga and turned it into a one-minute time-lapse. This sped-up version is what will be fed into Processing and then projected onto the paintings.

I also didn’t want to project the same thing three times onto the different paintings, so the middle painting will display the slit-scan taken from the middle pixel column of the video, the left painting from the left, and the right painting from the right pixel column. I did most of my practice on the middle of the mat, so the left and right windows are more likely to be filled with white, but I enjoy this since the movement that does appear seems subtle.

Speeding up the video also added another change: everything looks more pixelated. I actually enjoy this, as it provides a contrast to the abstract ‘blob’ shapes which I painted.
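The only difference between the three sketches would be which column of the source video gets sampled. A sketch of how that might look, assuming the time-lapse video is the input (`yoga_timelapse.mp4` is a placeholder filename):

```processing
// Three-painting variant: each sketch samples a different vertical
// line of the source video. Only sampleX changes between them.
import processing.video.*;

Movie mov;
int drawX = 0;

void setup() {
  size(1280, 720);
  mov = new Movie(this, "yoga_timelapse.mp4");  // placeholder filename
  mov.loop();
  background(255);
}

void movieEvent(Movie m) { m.read(); }

void draw() {
  if (mov.width == 0) return;       // wait for the first frame
  int sampleX = 0;                  // left painting
  // int sampleX = mov.width / 2;   // middle painting
  // int sampleX = mov.width - 1;   // right painting
  copy(mov, sampleX, 0, 1, mov.height, drawX, 0, 1, height);
  drawX = (drawX + 1) % width;
}
```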

Preview from VPT8.

I had the three sketches running in Processing and connected to VPT8 through the Syphon library, so everything was running in real time rather than from pre-recorded videos, which I was quite happy about.

Setting up the paintings to project onto.

Slit-scans created using processing.org; original videos are of me practicing yoga. Each of these three slit-scans takes a vertical line of pixels from the original video and places the lines next to each other one by one, in turn deconstructing the original video. The original video of me practicing yoga was around 20 minutes long; I turned this into a one-minute time-lapse, which created the pixelated effect. Below is a slightly smoother version of the video, titled ‘Smoother_slitscan’.

I enjoy these slit-scans and I believe the visual outcomes are very reminiscent of my works using machine learning. I think using slit-scans was a fun way to create an illusion of how a machine might visualise the movement of the human body. I am always very aware that any movement shown on a computer is made up of pixels, frames and images, so in my mind it never truly evokes human movement like that of yoga practice. This is why I felt it would be interesting to exaggerate this weakness of machines and computers, where they lack the depth and understanding of complex physical human movement and the morphing of the body, and present it as something deformed.
