Monday, September 27, 2021

Devlog 1: Jitter Foundations

This week I've been studying the fundamentals of Max/MSP, and the possibilities already feel almost endless. I started by learning the Jitter framework, Max's library for 2D and 3D visuals. I learned to play and manipulate footage in real time, as well as 3D objects.

Some of the most important information I've learned so far is that there are two basic types of numbers in programming: integers, which are whole numbers, and floats, which can hold decimal values. These are useful in different situations. For example, a binary 1 = yes / 0 = no command would be great for an on/off toggle, but wouldn't work at all for a dry/wet knob. For a knob, a float value between zero and one gives a continuous range of values instead of just the two binary states.
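As a rough sketch of the difference (plain Python rather than an actual Max patch, just to show the idea), a toggle and a dry/wet knob might be handled like this:

    # Hypothetical sketch: integer toggle vs. float knob (not Max code)

    def apply_bypass(signal, enabled):
        # enabled is an integer toggle: 1 = on, 0 = off
        return signal if enabled == 1 else 0.0

    def apply_dry_wet(dry, wet, mix):
        # mix is a float knob between 0.0 and 1.0, blending
        # continuously between the dry and wet signals
        return dry * (1.0 - mix) + wet * mix

    print(apply_bypass(0.8, 1))           # 0.8 (passes through)
    print(apply_dry_wet(0.8, 0.2, 0.25))  # 0.65, mostly dry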

3D foundation

I started with OpenGL 3D, on the theory that starting in a three-dimensional environment will make 2D come naturally, since it's the same ideas stripped down a dimension. I'll also be able to implement whatever 2D visuals I want within these 3D scenes.

My first 3D patch involved a plane with a Y position of -1 and a sphere that spun in mid-air. This was to practice initializing a scene, placing objects in it, and modifying their parameters in real time. I also added float values to control the scale parameter, letting me squash and stretch the object on all three axes.
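Roughly, what I was controlling boils down to a position, a rotation that advances every frame, and a per-axis scale. A minimal sketch of that idea (plain Python with made-up names, standing in for the patch, not actual Max objects):

    # Hypothetical sketch of the sphere's per-frame transform
    position = (0.0, 0.5, 0.0)   # floating above the plane at y = -1
    scale    = [1.0, 1.0, 1.0]   # one float per axis: squash/stretch x, y, z
    angle    = 0.0               # rotation accumulates each frame

    def update(frame_time, spin_speed=45.0):
        global angle
        angle = (angle + spin_speed * frame_time) % 360.0
        return position, tuple(scale), angle

    # e.g. stretch on x, squash on y
    scale[0], scale[1] = 1.5, 0.5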

Next, I continued developing this patch. The jit.gl.multiple object allowed me to create multiple instances of the same cube without having to individually code each one's path. The horizontal movement is determined by two cosine equations, and the scattering and randomization of the path comes from a noise modifier applied to the object's Z axis.
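In rough terms, each instance's position is two cosine terms for the horizontal movement plus noise on the depth axis. A hedged sketch of the math (Python, with parameter names invented for illustration):

    import math, random

    def cube_position(t, index, spread=2.0, noise_amount=0.3):
        # Hypothetical path: two cosine equations drive horizontal movement,
        # offset per instance so the cubes don't all overlap
        x = spread * math.cos(t + index * 0.5)
        y = spread * math.cos(0.7 * t + index * 0.9)
        # noise scatters the Z axis so the motion stops looking mechanical
        z = noise_amount * random.uniform(-1.0, 1.0)
        return (x, y, z)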

I continued this line of research, creating a patch of three unpredictably moving spheres that draw a white line segment to each other. This involved writing three equations, one for each connection.
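The connections themselves are just pairs of sphere positions; with three spheres there are three unique pairs, which is why it took three equations. Something like this sketch (again Python standing in for the patch logic, not the actual implementation):

    from itertools import combinations

    def connection_segments(sphere_positions):
        # Each line segment runs between two sphere positions.
        # Three spheres -> three unique pairs -> three segments.
        return list(combinations(sphere_positions, 2))

    spheres = [(0.0, 1.0, 0.0), (1.2, -0.3, 0.5), (-0.8, 0.4, -1.0)]
    for a, b in connection_segments(spheres):
        print("draw white line from", a, "to", b)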


2D foundation

It was during my exploration of this topic that I realized how powerful these tools really are. Built into Max is the Vizzie library, which takes a lot of the tedium out of video work while still keeping the modular nature and malleability of the rest of Max.

My first venture with Vizzie was a simple video effect chain with one effects bus. It was incredibly disorganized and unintuitive. With no design worked out beforehand, I went in blind and ended up with a dysfunctional tool. It was a learning experience.





After this, I had a much better understanding of Vizzie and the other tools available to me. After sketching a schematic and determining the specs I wanted, I was able to design a much better video mixer.

This design has footage folder support, 2 video input tracks, 4 effect sends per track, and a crossfader that uses math operators and a selection of functions to blend the footage cleanly.
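At its core the crossfader is just a weighted blend of the two video streams, where the fader position is a float between 0 and 1 and a curve shapes how it moves between them. A rough sketch of that math (Python, with names invented for illustration, not the actual patch):

    def crossfade(frame_a, frame_b, fader, curve=1.0):
        # fader: 0.0 = all track A, 1.0 = all track B
        # curve: 1.0 is a linear fade; other exponents bow the blend
        mix = fader ** curve
        return frame_a * (1.0 - mix) + frame_b * mix

    # e.g. with single brightness values standing in for whole frames:
    print(crossfade(1.0, 0.0, 0.25))  # 0.75: mostly track A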



 

With this quick homemade tool, I was able to generate visuals in real time that would normally take hours to render traditionally in After Effects or Premiere Pro. Since I'm programming with math and functions instead of dedicated effects, I can create a much wider range of visuals in real time. It's really incredible and already so rewarding. I'm looking forward to getting deeper and more confident with these frameworks so I can keep developing my own tools.

I'm writing a separate post going into more detail about this particular process.

Audio exploration / Integration

While it isn't the focus of my project, live audio implementation is something I experimented with as well. For every frame, the computer measures the amplitude of the audio at that moment. This produces a constant stream of float numbers that can be mapped to any knob, effect, or function. It's a simple process and could bring another level of interactivity to the visuals.
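The mapping itself is simple: take the per-frame amplitude, scale and clamp it into the 0-to-1 range, then remap it onto whichever parameter you want to drive. A minimal sketch of the idea (Python, with placeholder names, not how Max actually wires it up):

    def amplitude_to_param(amplitude, param_min=0.0, param_max=1.0, gain=1.0):
        # amplitude: measured audio level for the current frame
        # scale it, clamp to 0..1, then remap into the parameter's range
        level = min(max(amplitude * gain, 0.0), 1.0)
        return param_min + level * (param_max - param_min)

    # e.g. drive a blur amount between 0 and 10 from the audio level
    blur = amplitude_to_param(0.42, param_min=0.0, param_max=10.0, gain=2.0)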







