30 June 2016

EOYS


It's been a while since my last post. I've been recovering from the end-of-year show and working on wavelet turbulence up-resing on the GPU. Above is a video cycling through some of the presets that were used during the show.

I presented the current iteration as "Time Differential" and made an instruction manual for the Minilab that was printed and displayed next to the controller to solidify the idea that it is a video synthesizer. This was a quick and dirty 3d model I made in Houdini and rendered using wren.



An Apple Magic Trackpad 2 provided multitouch input for adding density and velocity. It wasn't exactly elegant, but it worked fairly well: I ran a really neat hack on my laptop that captures multitouch data and broadcasts it as OSC messages. A Processing sketch running alongside it received the position and finger-size information and sent it across an ad-hoc wifi network to my desktop, where TouchDesigner assembled it into a CHOP with one sample per finger. That CHOP was then sent as an array to my shaders. It was surprisingly responsive and more or less reliable.

Overall it went pretty well, with a good amount of interest and constructive feedback. The controller was available for the public to play with, and I've already made some changes to the knobs and layout as a result. A lot of people were interested in its sound-reactive capabilities, which led to a lot of clapping and yelling. Plus, kids loved it, which is a good omen I guess.



09 May 2016

Tukey Window



This is a quick video demonstrating a Tukey window modulating particle size, using particle age as the input signal. Alpha (the window-function parameter, not the alpha of RGBA) is set to 0.3, and the video is zoomed in to make the effect easier to see. Windowing the particle size eases the transitions at particle birth and death so there are fewer noticeable pops. Here's some GLSL code:

float tukeyWindow(float sig, float alpha) {
    const float PI = 3.14159265;
    if (alpha == 0.)                 // rectangular window
        return 1.;
    else if (sig < alpha / 2.)       // cosine taper at the start
        return 0.5 * (1. + cos(2. * PI / alpha * (sig - alpha / 2.)));
    else if (sig > 1. - alpha / 2.)  // cosine taper at the end
        return 0.5 * (1. + cos(2. * PI / alpha * (sig - 1. + alpha / 2.)));
    else                             // flat top
        return 1.;
}
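In the particle shader it can be applied roughly like this (a sketch; `age`, `life`, and `baseSize` are hypothetical names standing in for however your system tracks particle lifetime and size):

```glsl
// Hypothetical usage: scale point size by the window over normalized age.
uniform float baseSize;  // nominal particle size (assumed uniform)

// age and life assumed available per particle, with life > 0
float normalizedAge = clamp(age / life, 0., 1.);
gl_PointSize = baseSize * tukeyWindow(normalizedAge, 0.3);
```

With alpha at 0.3, the first and last 15% of a particle's life fade its size in and out, and the middle 70% stays at full size.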

Control




I've started integrating my Arturia Minilab to control various simulation parameters, inputs, and visualization options. Here is a quick video demonstrating the UI I've made, which captures the MIDI signals for all of the knobs, pads, and sliders. Pads 1-4 each select a different bank of knob settings, each with its own presets and color indicator. Values can be changed from both the controller and the on-screen UI. The keys work as well, but I'm focusing on the knobs and pads first, as they will be the primary way of sculpting the behavior of the system. The keys will likely act as gates and triggers for adding new input (i.e. sound or image data).

24 April 2016

Particle trails



Messing around with feedback to get flow lines from the fluid sim. Hue represents the velocity direction and value represents the velocity magnitude. Particles can bounce around obstacles and don't spawn inside them. One million particles at 60 fps.
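That color mapping can be sketched in GLSL like this (assumed names throughout; `hsv2rgb` is a common helper included here for completeness, and `maxSpeed` is a hypothetical normalization constant):

```glsl
// Map velocity direction to hue and speed to value (HSV -> RGB).
uniform sampler2D sVelocity;   // assumed: 2D velocity field texture
uniform float maxSpeed;        // assumed: speed that maps to full value

vec3 hsv2rgb(vec3 c) {
    vec3 p = abs(fract(c.xxx + vec3(0., 2./3., 1./3.)) * 6. - 3.);
    return c.z * mix(vec3(1.), clamp(p - 1., 0., 1.), c.y);
}

vec3 velocityColor(vec2 uv) {
    vec2 v = texture(sVelocity, uv).xy;
    float hue = atan(v.y, v.x) / (2. * 3.14159265) + 0.5; // polar angle -> [0,1]
    float val = clamp(length(v) / maxSpeed, 0., 1.);
    return hsv2rgb(vec3(hue, 1., val));
}
```

Note that `atan(0., 0.)` is undefined in GLSL, so stationary cells may want a guard in practice.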

21 April 2016

GPU particles



Here's 250,000 particles being advected by the velocity field, all done with GLSL shaders in TouchDesigner. The particles' value is currently just the velocity magnitude, but I'll be adding a lot more color options later. I'm maintaining 60 fps with a 512x512 fluid container.
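The advection step itself can be sketched like this (a minimal version with assumed names; positions live in a texture with one texel per particle, a common pattern for GPU particles, and the integration is plain explicit Euler):

```glsl
// Advect particle positions through the velocity field (explicit Euler).
uniform sampler2D sPositions;  // assumed: particle positions, one texel each
uniform sampler2D sVelocity;   // assumed: fluid velocity field
uniform float dt;              // timestep

out vec4 fragColor;

void main() {
    vec2 uv = gl_FragCoord.xy / vec2(textureSize(sPositions, 0));
    vec2 pos = texture(sPositions, uv).xy;      // position in [0,1] container space
    vec2 vel = texture(sVelocity, pos).xy;      // sample the field at the particle
    fragColor = vec4(pos + vel * dt, 0., 1.);   // write the new position
}
```

Each frame this shader renders into a second position texture, which is then swapped back in as `sPositions` (ping-pong buffering).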

I'm going to spend the next couple of weeks making big pushes in UI to prepare for a public demonstration at the end of year show. A lot to do still.

30 March 2016

Audio visualization, obstacle velocity



Here's a quick update that includes a couple of new features: passing velocity in with obstacle data, and rendering 3d geometry to a 2d image to use as input (although in this video, actual obstacles and density fields aren't present). I've got a simple particle simulation running on the CPU in TouchDesigner whose initial velocity and direction are controlled by an audio waveform. I followed along with Derivative's 2016 Norway Workshop to get the system set up. Particle velocities are rendered as colors and piped into the CUDA sim. Adding these features went fairly smoothly.
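One way to pack signed velocities into a color texture for the CUDA sim looks like this (a sketch; the 0.5-offset encoding and `maxVel` normalization are my assumptions, not necessarily what the workshop setup uses):

```glsl
// Encode a signed 2D velocity into RG, remapped from [-maxVel, maxVel] to [0, 1]
// so it survives being written to a normalized color texture.
uniform float maxVel;  // assumed: maximum expected speed per axis

vec4 encodeVelocity(vec2 vel) {
    vec2 encoded = clamp(vel / (2. * maxVel) + 0.5, 0., 1.);
    return vec4(encoded, 0., 1.);
}

// On the CUDA side the decode is the inverse: vel = (rg - 0.5) * 2 * maxVel.
```

With floating-point textures the remap isn't strictly necessary, but it keeps the pipeline working with 8-bit formats too.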

Here it's very apparent that the simulation container doesn't have any interesting border conditions, which can be a good and bad thing. It's another feature to add. In the coming months, I'll be working on the UI, different composition techniques, experimenting with looks, and improving stability to prepare for MAT's End of Year Show. I've also become very interested in Ted's Wavelet Turbulence paper which would really send the visuals through the roof. No timeline on that yet, but it's going on the list.

16 March 2016

Boundaries and color



This update includes a lot of changes to the simulation algorithm as well as some new features. I ended up restructuring the velocity solve to mirror the one found in GPU Gems, which is still based on Stam's solution, but is clearer about the pressure and temperature fields and simplifies the gradient subtraction a bit, so the performance is better.
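The gradient-subtraction pass in that formulation can be sketched like so (names assumed; `halfRdx` is 0.5 divided by the grid cell size, following the GPU Gems convention):

```glsl
// Subtract the pressure gradient from the intermediate velocity field,
// leaving it (approximately) divergence-free.
uniform sampler2D sPressure;   // assumed: scalar pressure field
uniform sampler2D sVelocity;   // assumed: intermediate velocity field
uniform float halfRdx;         // 0.5 / grid cell size

out vec4 fragColor;

void main() {
    ivec2 coord = ivec2(gl_FragCoord.xy);
    float pL = texelFetch(sPressure, coord + ivec2(-1,  0), 0).x;
    float pR = texelFetch(sPressure, coord + ivec2( 1,  0), 0).x;
    float pB = texelFetch(sPressure, coord + ivec2( 0, -1), 0).x;
    float pT = texelFetch(sPressure, coord + ivec2( 0,  1), 0).x;
    vec2 vel = texelFetch(sVelocity, coord, 0).xy;
    vec2 grad = halfRdx * vec2(pR - pL, pT - pB);  // central differences
    fragColor = vec4(vel - grad, 0., 1.);
}
```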

For the reaction coupling, I diffuse two density fields with a Laplacian kernel, apply the Gray-Scott equations, and then advect them. Since there is a slight amount of diffusion from bilinear interpolation in the advection step, I suspect the Gray-Scott feed and kill values might be a little off. Nonetheless, it appears stable. I'm passing a texture from TouchDesigner into CUDA to define obstacles. The color is coming from the velocity, with hue mapped to its polar angle.
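The diffuse-and-react step described above can be sketched like this (a rough version; `feed`, `kill`, and the diffusion rates are assumed uniforms, and the 5-point Laplacian stencil is one common choice of kernel):

```glsl
// One Gray-Scott reaction-diffusion update for chemicals U (.x) and V (.y):
//   du/dt = dU * lap(u) - u*v*v + feed * (1 - u)
//   dv/dt = dV * lap(v) + u*v*v - (feed + kill) * v
uniform sampler2D sDensity;  // assumed: U in .x, V in .y
uniform float dU, dV;        // diffusion rates for U and V
uniform float feed, kill;    // Gray-Scott feed and kill parameters
uniform float dt;

out vec4 fragColor;

vec2 laplacian(ivec2 c) {    // 5-point stencil
    return texelFetch(sDensity, c + ivec2( 1, 0), 0).xy
         + texelFetch(sDensity, c + ivec2(-1, 0), 0).xy
         + texelFetch(sDensity, c + ivec2(0,  1), 0).xy
         + texelFetch(sDensity, c + ivec2(0, -1), 0).xy
         - 4. * texelFetch(sDensity, c, 0).xy;
}

void main() {
    ivec2 c = ivec2(gl_FragCoord.xy);
    vec2 d = texelFetch(sDensity, c, 0).xy;
    float u = d.x, v = d.y;
    float reaction = u * v * v;
    vec2 lap = laplacian(c);
    float du = dU * lap.x - reaction + feed * (1. - u);
    float dv = dV * lap.y + reaction - (feed + kill) * v;
    fragColor = vec4(u + du * dt, v + dv * dt, 0., 1.);
}
```

The advection pass then moves both channels through the velocity field, which is where the extra bilinear-interpolation diffusion mentioned above creeps in.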

There are a lot of fields and variables to play with now. I'll have to decide if I want to start working on robustness and composition techniques, or to move forward with adding a particle system and pushing the resolution as far as possible.