Synesthesia-like Mappings of Lightness, Pitch, and Melodic Interval

“Synaesthesia is a condition in which someone experiences things through their senses in an unusual way, for example by experiencing colour as a sound, or a number as a position in space.”

Cambridge Dictionary

Lighter stimuli were more strongly associated with higher pitches, and darker stimuli with lower pitches.

Results from Experiment 1

The patterns in stimulus choices seemed stronger against a black background than against a white one (something to consider for the project). However, this strength dissipates when there is a large set of lightness levels "from which to choose the visual lightness level".

Larger melodic intervals produced more extreme stimulus choices.

Lighter stimuli were chosen for ascending melodic intervals and, as with pitch, darker stimuli were chosen for descending melodic intervals.

Results from Experiment 3

Nonsynesthetes are reliably able to reproduce (or imagine?) the same qualities that synesthetes experience, which raises the question, as Marks (1974) points out, of whether stimuli in different senses might tap a "common connotative meaning mediated by higher cognitive processes". -> This could give the project an application beyond its own production.

Pitch is not unidimensional (one-dimensional) but has at least two different dimensions: pitch height (absolute frequency) and tone chroma (the relative location of the pitch within a scale, collapsed across octaves). -> To build on this, Bachem ('Tone Height and Tone Chroma as Two Different Pitch Qualities', 1950) looked at tone height and tone chroma: "the general term 'tone chroma' refers to the quality common to all musical tones with the same denomination." In his summary he says that tone chroma is an exact logarithmic function of frequency, represented by the mantissa (the part after the decimal point) of the base-2 logarithm.
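To make the chroma idea concrete (my own worked example, not from the paper), the chroma of a frequency f can be written as

\[ \operatorname{chroma}(f) = \log_2 f \bmod 1 \]

For A4 at 440 Hz, log2(440) ≈ 8.781, giving a chroma of 0.781; for A5 at 880 Hz, log2(880) ≈ 9.781, the same chroma of 0.781 but a tone height one octave greater. This is why notes with the same name share a chroma across octaves.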

References:

Cambridge Dictionary. 'Synaesthesia'. https://dictionary.cambridge.org/dictionary/english/synaesthesia.

Hubbard, Timothy L. ‘Synesthesia-like Mappings of Lightness, Pitch, and Melodic Interval’. The American Journal of Psychology 109, no. 2 (1996): 219–38. https://doi.org/10.2307/1423274.

Bachem, A. ‘Tone Height and Tone Chroma as Two Different Pitch Qualities’. Acta Psychologica 7 (1 January 1950): 80–88. https://doi.org/10.1016/0001-6918(50)90004-7.

Adding a Pause Button

Toby suggested that adding a pause button would be beneficial to the program. To do this I needed to look into noLoop() and loop(), the Processing functions that stop Processing from calling draw() and then resume calling it.

This led me to https://forum.processing.org/one/topic/start-stop-pause-processing.html, where user GoToLoop describes how to use the noLoop() and loop() functions, as well as the internal variable looping, which indicates whether draw() is currently being called.

They did it using a key press, but I decided to use a ControlP5 button to handle this functionality.

To use ControlP5 buttons you have to define a function with the same name as the button. I made a pause() function and used the same logic described by GoToLoop, adding some extra code to pause and play the song when the button is pressed.

ControlP5 buttons fire once when the sketch starts. This meant that the sketch would pause itself immediately and I would have to unpause it with a key press before I could play a song or properly start my sketch. This led me to https://forum.processing.org/two/discussion/1368/controlp5-buttons-controls-trigger-automatically-on-sketch-start, in which user Sojamo describes how to stop this happening using .setBroadcast(). Setting broadcast to false, initialising the button's other properties, and then setting it back to true means the button does not fire when the sketch loads.
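Putting the two ideas together, the pattern looks roughly like this. This is a reconstruction rather than my exact code, and it assumes the song is a Minim AudioPlayer and that "track.mp3" stands in for the real file:

import controlP5.*;
import ddf.minim.*;

ControlP5 cp5;
Minim minim;
AudioPlayer song;

void setup() {
  size(400, 400);
  minim = new Minim(this);
  song = minim.loadFile("track.mp3");   // hypothetical file name
  cp5 = new ControlP5(this);
  cp5.addButton("pause")
     .setBroadcast(false)    // stop the button firing while it is configured
     .setPosition(10, 10)
     .setSize(80, 30)
     .setBroadcast(true);    // re-enable events once it is set up
}

void draw() {
  background(0);
}

// ControlP5 calls this automatically because it matches the button's name
void pause() {
  if (looping) {             // 'looping' is the internal flag GoToLoop describes
    noLoop();
    song.pause();
  } else {
    loop();
    song.play();
  }
}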

Threading and Philips Hue

Updating the Philips Hue was causing performance issues, since the sketch would pause until its PUT HTTP request had been sent. Although this didn't happen every time, when it did it would freeze the visualisation for about half a second, which took away from the immersive experience I was trying to create in the first place.

To solve this problem, I thought about placing the HTTP requests into separate threads so that the requests can be sent asynchronously with respect to the visuals.

To do this I watched:

Here, threads in Processing are explained thoroughly, and I was able to incorporate threads into my sketch, which fixed the performance issue: the visuals are no longer paused.

I found that for it to work effectively I had to wrap the call in "if (frameCount % 10 == 0)". If I took that check out and called the thread directly in draw(), it would lag and the light wouldn't update for about a second. Not sure why.
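The relevant part of my draw() now looks roughly like this; updateHue() is a stand-in name for the function that builds the JSON and sends the PUT request:

void draw() {
  // ...visuals drawn here...

  // Only fire the request every 10th frame (about 6 times a second at 60 fps);
  // calling thread() on every frame made the light lag by around a second.
  if (frameCount % 10 == 0) {
    thread("updateHue");   // runs updateHue() on a separate thread
  }
}

// Stand-in for the function that sends the PUT request to the bridge
void updateHue() {
  // build the JSON state string and send it here
}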

Updating the Lissajous Figure

I have updated the Lissajous figure so that it will come out of the FFT spectrum in the middle of my sketch.

The FFT spectrum used to look like this:

To incorporate the Lissajous figure, I found where the parametric equations gave a line that was near the FFT spectrum and used an if statement to “activate” the Lissajous figure. It now looks like:

Initially, six lines will be in the Lissajous figure (and six lines taken out of the FFT spectrum). After one full cycle of the figure, three lines will be reincorporated into the FFT spectrum. After another full cycle, all lines will be back in the spectrum.

The next step is to figure out how to make this happen continuously rather than being hard-coded.

Incorporating Philips Hue

Philips Hue uses the CIE colour space; it takes just two float inputs between 0 and 1 (the x and y chromaticity values) instead of the three inputs for RGB, with brightness sent separately. Therefore, I need to be able to convert from RGB to CIE. Luckily, Tim on StackOverflow has uploaded a function to convert RGB to CIE specifically for Philips Hue (https://stackoverflow.com/questions/22564187/rgb-to-philips-hue-hsb).
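For reference, the usual shape of that conversion is below. This is a sketch based on the maths in the Hue developer documentation, not a copy of Tim's exact function: gamma-correct the RGB channels, convert to XYZ with the Wide RGB D65 matrix, then normalise to xy.

float[] rgbToXy(float r, float g, float b) {
  // normalise 0-255 channels to 0-1 and apply gamma correction
  r /= 255.0; g /= 255.0; b /= 255.0;
  r = (r > 0.04045) ? pow((r + 0.055) / 1.055, 2.4) : r / 12.92;
  g = (g > 0.04045) ? pow((g + 0.055) / 1.055, 2.4) : g / 12.92;
  b = (b > 0.04045) ? pow((b + 0.055) / 1.055, 2.4) : b / 12.92;
  // convert to XYZ using the Wide RGB D65 conversion matrix
  float X = r * 0.664511 + g * 0.154324 + b * 0.162028;
  float Y = r * 0.283881 + g * 0.668433 + b * 0.047685;
  float Z = r * 0.000088 + g * 0.072310 + b * 0.986039;
  float sum = X + Y + Z;
  if (sum == 0) {
    return new float[] { 0, 0 };
  }
  return new float[] { X / sum, Y / sum };
}

The two floats returned map straight onto the "xy" field in the PUT request body.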

I have incorporated this function into my sketch to translate colours from the sketch into colours for the light. At the moment, the light is set to replicate the background colour of the sketch. I tried to update the light 60 times per second but this slowed down the visualisation, so I have cut it down to six times per second. From what I can see there is no discernible disadvantage in the light's responsiveness, but it does still stick sometimes. I might try dropping it to three times per second.

On top of this, I have also altered the brightness; by default it sits at 75 out of 255, and when a beat is detected it jumps to 255.

Something I need to think about, or possibly ask for help with, is that the range of colours my background can reach is really limited to blues and yellows. After adding the keyboard functionality to change a random value, I can also get pinks and greens, but no other colours.

Controlling Philips Hue using Processing

Firstly, I used https://developers.meethue.com/develop/get-started-2/ to find the IP address of my Philips Hue Bridge, generate an API key, and see which lights I have connected.

There is also some information on which HTTP requests are used to control the light. I have used GET requests to see what state the light is in, such as whether it's turned on, its saturation, brightness, and hue levels, and its colour. This information is returned in JSON format.

Some of the attributes of my Hue light.
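The JSON returned for a light looks roughly like this; the values below are illustrative, not my light's actual state:

{
  "state": {
    "on": true,
    "bri": 75,
    "hue": 8402,
    "sat": 140,
    "xy": [0.4573, 0.4100],
    "colormode": "xy",
    "reachable": true
  },
  "type": "Extended color light",
  "name": "Hue color lamp 1"
}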

To set the state of the light I have used PUT requests. I then did some experimenting using the CLIP API debugger.

The documentation for what actions the different requests do can be found at https://developers.meethue.com/develop/hue-api/.

To control the light in Processing I needed to figure out how to make PUT requests. I thought I had found a Processing sketch I could use, but it didn't work. This led me to https://forum.processing.org/two/discussion/23026/how-to-send-http-put-request and eventually to this library: https://github.com/acdean/HTTP-Requests-for-Processing. Getting the library to work properly was a little confusing; I had to include the PostRequest Java file directly in the sketch, since that version has a method(), which isn't in the version bundled with the library.

Once I had the PUT requests working, I was then able to interface with the light through Processing.

A simple sketch I wrote to randomise the colour and loop through the brightness level of the light:

import http.requests.*;

public void setup() {
  smooth();
  // GET the current state of light 1
  GetRequest get = new GetRequest("http://192.168.1.201/api/aN-2gDI1JkTvIyZ-YoPoK5tam-JKHDN2KMfRpwqS/lights/1");
  get.send();
  println("response: " + get.getContent());
}

public void draw() {
  for (int i = 0; i < 255; i++) {
    // random CIE xy colour, with the brightness stepping up each iteration
    String input = "{\"xy\":[" + random(1) + ", " + random(1) + "], \"bri\":" + i + "}";
    PostRequest put = new PostRequest("http://192.168.1.201/api/aN-2gDI1JkTvIyZ-YoPoK5tam-JKHDN2KMfRpwqS/lights/1/state");
    put.method("PUT");   // method() comes from the PostRequest file included in the sketch
    put.addJson(input);
    put.send();
  }
}

The format for sending PUT requests to the light is http://<IP ADDR>/api/<API KEY>/lights/<LIGHT NUMBER>/state

The format for sending GET requests is http://<IP ADDR>/api/<API KEY>/lights/<LIGHT NUMBER>

The next step is to incorporate some of these requests into the visualisation tool; namely, to work out how to convert from the RGB colour mode to CIE (https://medium.com/hipster-color-science/a-beginners-guide-to-colorimetry-401f1830b65a), which is what the Hue uses.

Parametric Equations

Parametric equations are mainly used to express point coordinates. The x and y values are expressed in terms of another independent variable, usually t, which is then passed through trigonometric functions such as sine and cosine.

I have used parametric equations to create a 3D-looking shape that traverses in and around the screen. When there is a beat, the shape is assigned a random colour and alpha value, meaning it might not always be visible on the screen; this is a deliberate effect to draw the user's eye when it is visible and hopefully stops the visualisation becoming predictable and boring.

A screenshot to show what the parametric equations are doing.

The parametric equations I have used are based on this video:

float x1(float t) { return sin(t/10 + PI/2) * 60; }

float y1(float t) { return cos(t/30) * 300 + cos(t/10) * 300; }

float x2(float t) { return sin(t/10) * 200 - sin(t) * 2 + sin(t/50) * 400; }

float y2(float t) { return cos(t/20) * 200 + cos(t/12) * 20 + cos(t/50) * 600; }
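For context, a minimal way to wire these four functions up (my guess at the general approach rather than my exact sketch) is to connect the two parametric points with a line each frame and let the trails accumulate. Together with the four functions above, this runs on its own:

float t = 0;

void setup() {
  size(800, 800);
  background(0);
  stroke(255, 40);   // low alpha so the curve builds up gradually
}

void draw() {
  translate(width/2, height/2);
  // draw a few segments per frame so the figure traces itself out
  for (int i = 0; i < 5; i++) {
    line(x1(t), y1(t), x2(t), y2(t));
    t += 0.5;
  }
}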

An example of what everything looks like together:

I have also added the blendMode() function with the SCREEN parameter. As described at https://processing.org/reference/blendMode_.html, SCREEN "uses inverse values of the colours" where shapes overlap.

Adjusting the FFT Visuals

The main visualisation of the frequency bands from the FFT values is a circular ring in the middle of the screen where each line represents a frequency band.

I have changed how this is visualised. Instead of being a flat circular shape, I have added an effect so the ring looks three-dimensional; to do this, I simply add an offset to the x2 and y2 arguments of the line() function.
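In outline, the ring drawing looks something like the following; the radii, scaling, and offset motion are placeholders rather than my real values, and fft is a Minim FFT object:

import ddf.minim.analysis.*;   // for Minim's FFT

float angleOff = 0;

void drawRing(FFT fft) {
  float offX = cos(angleOff) * 40;   // circular offset for the fake-3D lean
  float offY = sin(angleOff) * 40;
  angleOff += 0.01;
  for (int i = 0; i < fft.specSize(); i++) {
    float a = map(i, 0, fft.specSize(), 0, TWO_PI);
    float r1 = 150;                        // inner radius of the ring
    float r2 = 150 + fft.getBand(i) * 4;   // outer radius driven by the band energy
    line(width/2 + cos(a) * r1, height/2 + sin(a) * r1,
         width/2 + cos(a) * r2 + offX,     // the offset on x2, y2 tilts the ring
         height/2 + sin(a) * r2 + offY);
  }
}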

The shape moves around in a circular fashion:

This shows how it is going towards the top right. The planned effect is that it is reaching out of the screen.

Here it is in the bottom left:

Circular Shapes

When I initially implemented the moving circular shapes in the background, the majority of the shape’s attributes were randomised.

I then went about changing the randomness and having each attribute be directly affected by an element of the music.

A new circle appears every time there is a beat. Once there are 20 circles on the screen, a new beat removes a circle, in the reverse of the order they appeared in.

The size of the shapes correlates with the sum of the FFT readings at the moment the beat occurs. On top of this, circles 150px or smaller have a larger range of colours than those over 150px. The colour is still random, which is something I am thinking about changing, but I'm not sure how yet.

The stroke weight of the shapes is dependent on the size, with larger shapes having a smaller stroke weight.

The transparency of the fill and of the outside of the shape is also dependent on its size, with smaller shapes being more opaque.

The speed at which the shapes move depends on the BPM of the music; a faster BPM means faster movement and vice versa.
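Pulled together, those mappings look roughly like this; the ranges in the map() calls are illustrative, not my real numbers, and onBeat() stands in for wherever the beat detector fires:

ArrayList<BgCircle> circles = new ArrayList<BgCircle>();

// called whenever a beat is detected
void onBeat(float fftSum, float bpm) {
  if (circles.size() < 20) {
    circles.add(new BgCircle(fftSum, bpm));
  } else {
    circles.remove(circles.size() - 1);   // remove in reverse order of appearance
  }
}

class BgCircle {
  float x = random(width), y = random(height);
  float size, strokeW, alpha, speed;

  BgCircle(float fftSum, float bpm) {
    size    = map(fftSum, 0, 500, 50, 400);   // size follows the summed FFT readings
    strokeW = map(size, 50, 400, 8, 1);       // larger shapes get thinner strokes
    alpha   = map(size, 50, 400, 255, 60);    // smaller shapes are more opaque
    speed   = map(bpm, 60, 180, 0.5, 4.0);    // faster BPM, faster movement
  }

  void display() {
    stroke(255, alpha);
    strokeWeight(strokeW);
    fill(255, alpha);
    ellipse(x, y, size, size);
    x += speed;   // simple drift; the real movement is more involved
  }
}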

Morphing Shapes

I have been trying to get some of the background objects to morph between different shapes.

At the moment, I am drawing the background circles using the ellipse() function. I know it's possible to morph between shapes after seeing this example: https://processing.org/examples/morph.html.

I attempted to make a different kind of object that flicks between a circle and a square while in the background. Firstly, I had to make the shapes. The example from the Processing website draws the shapes from lists of PVectors rather than morphing directly from an ellipse to a rect, which poses a number of problems, especially for performance and ease of implementation.
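The idea is the same as in the Processing example: sample both shapes as equal-length lists of PVectors, then ease every point towards its target each frame. A condensed sketch of the approach (the square sampling is a rough stand-in, not how the example does it):

int N = 40;                    // points per shape
PVector[] circlePts = new PVector[N];
PVector[] squarePts = new PVector[N];
PVector[] morphPts  = new PVector[N];
boolean toSquare = true;

void setup() {
  size(400, 400);
  for (int i = 0; i < N; i++) {
    float a = map(i, 0, N, 0, TWO_PI);
    circlePts[i] = new PVector(cos(a) * 100, sin(a) * 100);
    // crude square: push the circle point outward and clamp it to the boundary
    squarePts[i] = new PVector(constrain(cos(a) * 300, -100, 100),
                               constrain(sin(a) * 300, -100, 100));
    morphPts[i] = circlePts[i].copy();
  }
}

void draw() {
  background(0);
  translate(width/2, height/2);
  noFill();
  stroke(255);
  beginShape();
  for (int i = 0; i < N; i++) {
    PVector target = toSquare ? squarePts[i] : circlePts[i];
    morphPts[i].lerp(target, 0.05);   // ease each point towards its target
    vertex(morphPts[i].x, morphPts[i].y);
  }
  endShape(CLOSE);
  if (frameCount % 120 == 0) toSquare = !toSquare;   // flick between the shapes
}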

The above image shows the morphing shapes with a black stroke. Although I still like the idea, I'm not sure it looks clean or smooth when the program is running. This could be because I am moving a lot of individual vectors rather than a single object. Also, the sheer number of objects, both ellipses and morphing shapes, means the program runs noticeably slower when the maximum number of objects is on screen.

If I can find another way, or change the appearance of the shapes, then I think it could be a good visual effect. I could halve the number of circles and try to replicate the style of those shapes with the morphing shapes.

Dynamic Background

After my last meeting with Toby, he suggested fading between different colours in the background. The colour being faded to would correlate with the sum of the frequencies within a 5-second period, and this would continue indefinitely while a song was playing.

To do this, I first attempted to redraw the background directly, but quickly realised this probably was not the way to do it. After some more research I found the lerpColor() function.

To get the desired effect I now sum the readings from the frequency bands over the course of 150 frames, which is 150/60 = 2.5 seconds (Processing's draw() runs 60 times a second by default).

Every 2.5 seconds the sum is averaged and stored in a variable. Another variable holds the average from the previous 2.5 seconds, and lerpColor() is then used to blend between the colour derived from the previous average and the colour derived from the new one.
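In outline it works like this; the mapping from an average to a colour is invented for the example, since I haven't settled on the real one, and frameBandTotal() stands in for summing the FFT bands each frame:

float fftSum = 0;
float prevAvg = 0, currAvg = 0;
color fromC, toC;
float t = 1;

void setup() {
  size(400, 400);
  fromC = color(0);
  toC = color(0);
}

void draw() {
  fftSum += frameBandTotal();   // accumulate this frame's summed FFT bands

  if (frameCount % 150 == 0) {  // every 150 frames = 2.5 s at 60 fps
    prevAvg = currAvg;          // keep the previous window's average
    currAvg = fftSum / 150;     // average for the window just ended
    fftSum = 0;
    fromC = color(map(prevAvg, 0, 100, 0, 255), 60, 120);   // invented mapping
    toC   = color(map(currAvg, 0, 100, 0, 255), 60, 120);
    t = 0;
  }
  t = min(t + 1.0 / 150, 1);              // progress through the 2.5 s window
  background(lerpColor(fromC, toC, t));   // blend between the two colours
}

// stand-in for summing the FFT bands on this frame
float frameBandTotal() {
  return 50 + 50 * sin(frameCount * 0.01);
}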

For example, it blends from:

to:
