My main inspiration for this project was a sketch by bgstaal in which he generates two objects in separate windows that are aware of each other's position in space. When the windows overlap, the objects move between each window. This reminded me of looking at bacteria through a microscope, or of the hypothetical conditions of a primordial soup. I had previously described Thierry Feuz's artwork in the same way, and so figured I could connect the code with his visual aesthetic.

AI and biological research seem to go hand in hand. There are many new methods for extracting and applying live information from organisms. Furthermore, healthcare is an industry that repeatedly utilises AI's benefits for diagnosis and treatment.

Another big point of inspiration was Alexander Galloway's essay titled Uncomputer, in which he talks about whatever is "subordinated or excluded" by the standardized digital model. He proposes that an important part of computing is determining the practical limit of what can be done, but also incorporating the "indiscernible". He writes: "According to Parisi, 'error, indeterminacy, randomness, and unknowns in general have become part of technoscientific knowledge and the reasoning of machines.'"

Thus I imagined a storyline where it is x number of years in the future and AI has been taught to understand, modify, and create life. In this project I am trying to create a visual representation of a machine modifying organisms in a lab, seeking an unknown outcome. What characteristics would it want to make? How would these new forms of life interact with each other? What would it mean for humans to allow AI to get to this point?

Things that come to mind:

  • generative art
  • procedural randomness for textures
  • anti-aliasing
  • errors and mutations
  • [Image: two black windows, each showing a wireframe cube, one orange and one red; the orange cube sits half in one window, half in the other.]

    I found these codepens for visual inspiration:

  • Fabio Ottaviani's Chewing gum
  • Liam Egan's GLSL: Coral Blooms

    Firstly, I tried to use Ottaviani's chewing gum objects as a replacement for the default bgstaal cube. This was quite difficult because it took time to parse two other people's code and understand the technicalities of what they were actually trying to do. I have limited experience with three.js and window management, but I wanted to challenge myself to try something new. Blending the code together took a long time because of the scopes of the many variables and functions, so there was a lot of refactoring involved to improve readability for myself. My next steps will be to fix where the objects sit in the space and add the coral bloom texture from Liam Egan's sketch.

    I managed to centre the bubbles by dividing the position by the window width. There are more bubbles than there should be (it should be one per window), but the effect of the error looks a bit like a nucleus, so in keeping with Galloway's idea of the uncomputer: I'm okay with it! Now that the position is fixed, I want to incorporate the coral bloom texture to make it look more like primordial soup.
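The centring step can be sketched as a small helper (the function name and example numbers are mine, not from the project): each window subtracts its own screen offset from the bubble's world position, then divides by the window width, so the result is a normalised coordinate that stays consistent as the window moves.

```javascript
// Hypothetical helper: map a bubble's world-space x position into a
// window-relative coordinate in [-0.5, 0.5], where 0 is the centre.
function centreInWindow(worldX, windowScreenX, windowWidth) {
  const localX = worldX - windowScreenX; // position inside this window
  return localX / windowWidth - 0.5;     // normalise, then centre on 0
}

// A bubble at world x = 800 seen from a 1000px-wide window at screenX = 300
// lands at (800 - 300) / 1000 - 0.5 = 0, i.e. dead centre.
```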

    I rendered the coral blooms within the existing sketch, but lost the bubble shapes.

    I created an isolated context to figure out why I couldn't set the custom shader as a material for my bubbles, and successfully applied it onto a cube. This was my first time using shaders in three.js, so I hadn't realised I wasn't assigning the u_resolution uniform properly.
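The fix, roughly (a minimal sketch using plain objects rather than the project's actual code): the uniforms object is shared by reference with the material, so u_resolution needs a real value up front and can simply be mutated in place on resize.

```javascript
// Minimal sketch: build a uniforms object with a real u_resolution value.
// In three.js this would be passed to new THREE.ShaderMaterial({ uniforms, ... });
// plain objects are used here so the idea stands on its own.
function makeUniforms(width, height) {
  return {
    u_time:       { value: 0 },
    u_resolution: { value: { x: width, y: height } }, // must not stay at (0, 0)
  };
}

// The material keeps a reference to the same object, so mutating it in place
// on resize is enough; there is no need to rebuild the material.
function onResize(uniforms, width, height) {
  uniforms.u_resolution.value.x = width;
  uniforms.u_resolution.value.y = height;
}
```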

    I also learned that to apply a shader onto a sphere shape, you need normalized normal vectors for the lighting calculations. That's why I could only render cubes or planes.
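As a sketch of what that looks like (my own simplified shader pair, not the coral bloom code): three.js supplies the `normal` attribute and `normalMatrix` automatically, and the fragment shader has to re-normalize the interpolated normal before the lighting dot product, because interpolation across a curved surface de-normalizes it.

```javascript
// Simplified shader pair for a three.js ShaderMaterial (assumes the default
// built-in attributes/uniforms such as `normal` and `normalMatrix`).
const vertexShader = /* glsl */ `
  varying vec3 vNormal;
  void main() {
    vNormal = normalMatrix * normal; // pass the normal to the fragment stage
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;

const fragmentShader = /* glsl */ `
  varying vec3 vNormal;
  void main() {
    // re-normalize: interpolated normals on a sphere are no longer unit length
    vec3 n = normalize(vNormal);
    float light = max(dot(n, normalize(vec3(0.5, 0.8, 0.6))), 0.0);
    gl_FragColor = vec4(vec3(light), 1.0);
  }
`;
```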

    Now that the material works properly, there are other issues to address. The bubbles don't align as expected when the windows are brought together, and they no longer look 3D, so maybe the lighting needs to be changed. Switching entirely to 2D is also an option, given how much I have struggled with 3D.

    It almost works, but temperamentally. If you'd like to try the project, you can here. Copy and paste the URL into a second window and move each window around to move the bubbles (it is very slow). By this point I realised I had aimed too high on the technical side and was running out of time to establish a narrative. This led me to find Chrome's Window Management API and use it to create the sketch, which was a lot simpler and faster to implement. It also just runs better than my previous iterations, though I think the earlier slowness had something to do with how many times I was applying the custom shader.
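In outline (assuming Chrome's Window Management API, which asks the user to grant a window-management permission; the helper names are mine): each window reads its own position on the virtual desktop and subtracts it from shared world coordinates, so an object drawn by two windows lines up across them.

```javascript
// Browser-only part: query the screens available to this page.
// window.getScreenDetails() is the Window Management API entry point.
async function setUpSharedSpace() {
  const details = await window.getScreenDetails(); // triggers a permission prompt
  console.log('available screens:', details.screens.length);
}

// Pure helper: convert a shared world position into this window's local
// canvas position using the window's offset on the virtual desktop.
// In the browser, `win` would be { screenX: window.screenX, screenY: window.screenY }.
function worldToLocal(world, win) {
  return { x: world.x - win.screenX, y: world.y - win.screenY };
}
```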

    I found these sphere shapes in a YouTube tutorial by Creativeguru97 and used their code in my project. I like how they look like nuclei.

    What to add next?:

  • preloader for the main project page
  • styling to look like an experiment interface
  • user instructions
  • expected behaviours (what is the AI testing for?)
  • a fake error message:
    1. after some time (of adjusting the sliders?)...error! A mutation popup appears in the centre of the screen
    2. a terminal-style exit
    3. then closeAllWindows()
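The planned sequence could be sketched like this (all names, text, and timings are placeholders, not project code): show the popup, log a terminal-style exit line, then close every script-opened window.

```javascript
const openWindows = []; // would be filled by window.open(...) when the experiment starts

// Close every window this script opened (browsers only allow window.close()
// on windows the script itself opened).
function closeAllWindows(windows) {
  windows.forEach((w) => w.close());
  windows.length = 0;
}

// Browser-only: show the fake mutation error, then shut everything down.
function triggerMutationError() {
  const popup = document.createElement('div');
  popup.className = 'mutation-popup'; // centred via CSS
  popup.textContent = 'ERROR: unexpected mutation detected';
  document.body.appendChild(popup);

  setTimeout(() => {
    console.log('exit 1: experiment terminated'); // terminal-style exit
    closeAllWindows(openWindows);
  }, 3000);
}
```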

    sources: