30 June 2016

BCH Rotations



For about a year I've had a secret weapon I've been using in nearly all of my projects: a colorspace based on spherical coordinates called "BCH", which stands for Brightness, Chroma, Hue. It was described by Sergey Bezryadin and Pavel Bourov. There isn't much information available about it online, but these slides explain everything.

Everyday usage is similar to HSV, but BCH has many advantages. It is grounded in real-world light response according to the user's preferred standard white point reference (D65, D50, etc.). A color is represented by a vector whose magnitude is the brightness, whose inclination is the chroma (saturation), and whose azimuth is the hue. The motivation behind its development was more realistic exposure and contrast adjustment for low dynamic range images, such as standard JPEGs. I used this capability to great effect when performing Lukidus last year, where it drove the realtime color correction of video micrography.



Since the color is a vector, hue is a rotation through 2π. In the video above, the angle of the velocity is mapped to the hue, and I modulate independent multipliers for the sine and cosine components of the rotation. These both start at 0, so we only see red, but as I open them up, more colors are introduced, eventually reaching a rainbow pattern (a complete rotation) and passing through interesting color palettes along the way. I'm really interested in exploring this colorspace further and have some ideas for different mappings that could benefit from it.

Here's a Shadertoy I made with a BCH implementation and comparison to HSV:
https://www.shadertoy.com/view/lsVGz1

EOYS


It's been a while since my last post. I've been recovering from the end of year show and working on wavelet turbulence up-resing on the GPU. Above is a video cycling through some of the presets that were used during the show.

I presented the current iteration as "Time Differential" and made an instruction manual for the Minilab that was printed and displayed next to the controller to solidify the idea that it is a video synthesizer. This was a quick and dirty 3d model I made in Houdini and rendered using wren.



An Apple Magic Trackpad 2 provided multitouch input for adding density and velocity. It wasn't exactly elegant, but it worked fairly well: I ran this really neat hack on my laptop, which captures multitouch data and broadcasts it as OSC messages. A Processing sketch running alongside it received the position and finger size information and sent it across an ad-hoc wifi network to my desktop, which TouchDesigner then assembled into a CHOP where 1 sample = 1 finger. That CHOP was then sent as an array to my shaders. It was surprisingly responsive and more or less reliable.
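The data shape at the end of that pipeline is simple: one sample per finger. Here's a toy sketch of it in C (the struct names and the finger cap are mine; the OSC transport and channel names are omitted):

```c
/* Toy sketch of "1 sample = 1 finger": each frame carries an array of
   active touches with position and size, mirroring the CHOP layout. */
#define MAX_FINGERS 10

typedef struct { float x, y, size; } Finger;

typedef struct {
    Finger fingers[MAX_FINGERS];  /* sample i = finger i */
    int    count;                 /* active touches this frame */
} TouchFrame;

/* Append a touch; returns its sample index, or -1 if the frame is full. */
static int addFinger(TouchFrame *f, float x, float y, float size) {
    if (f->count >= MAX_FINGERS) return -1;
    f->fingers[f->count] = (Finger){ x, y, size };
    return f->count++;
}
```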

Overall it went pretty well, with a good amount of interest and constructive feedback. The controller was available for the public to play with, and I've already made some changes to the knobs and layout as a result. A lot of people were interested in its sound-reactive capabilities, which led to a lot of clapping and yelling. Plus, kids loved it, which is a good omen I guess.



09 May 2016

Tukey Window



This is a quick video demonstrating a Tukey window modulating particle size, using age as the input signal. Alpha (from the window function, not rgba) is set to .3, and the video is zoomed in to make the effect easier to see. Windowing the particle size eases the transitions at particle birth and death so there are fewer noticeable pops. Here's some glsl code:

float tukeyWindow(float sig, float alpha) {
    if (alpha == 0.) // rectangular window
        return 1.;
    else if (sig < alpha / 2.) // rising taper at birth
        return 0.5 * (1. + cos(2. * 3.14159265 / alpha * (sig - alpha / 2.)));
    else if (sig > 1. - alpha / 2.) // falling taper at death
        return 0.5 * (1. + cos(2. * 3.14159265 / alpha * (sig - 1. + alpha / 2.)));
    else // flat top
        return 1.;
}

Control




I've started integrating my Arturia Minilab to control various simulation parameters, inputs, and visualization options. Here's a quick video demonstrating the UI I've made, which captures the MIDI signals for all of the knobs, pads, and sliders. Pads 1-4 each select a different bank of knob settings, each with its own presets and color indicator. Values can be changed from both the controller and the on-screen UI. The keys work as well, but I'm focusing on the knobs first as they will be the primary way of sculpting the behavior of the system. The keys will likely act as gates and triggers for adding new input (i.e. sound or image data).
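The bank scheme boils down to a small piece of state. Here's a toy sketch in C (the knob count and names are my placeholders, not the actual Minilab mapping):

```c
/* Toy sketch of the pad/bank scheme: pads switch the active bank, and
   knob messages write into the active bank only, so each bank keeps
   its own preset. Counts are placeholders. */
#define NUM_BANKS 4
#define NUM_KNOBS 16

typedef struct {
    float knobs[NUM_KNOBS];   /* this bank's current values */
    int   colorIndex;         /* on-screen color indicator */
} Bank;

typedef struct {
    Bank banks[NUM_BANKS];
    int  active;              /* selected by pads 1-4 */
} ControlState;

static void padPressed(ControlState *s, int pad) {
    if (pad >= 0 && pad < NUM_BANKS) s->active = pad;
}

static void knobTurned(ControlState *s, int knob, float value01) {
    if (knob >= 0 && knob < NUM_KNOBS)
        s->banks[s->active].knobs[knob] = value01;
}
```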

24 April 2016

Particle trails



Messing around with feedback to get flow lines from the fluid sim. Color represents velocity direction, and value is the velocity magnitude. Particles can bounce around obstacles and don't spawn inside them. 1 million particles @ 60fps.

21 April 2016

GPU particles



Here are 250,000 particles being advected by the velocity field, done with glsl shaders in TouchDesigner. The value of each particle is currently just the magnitude of its velocity, but I'll be adding a lot more color options later. I'm maintaining 60fps with a 512x512 fluid container.
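The per-particle step is simple: sample the velocity grid at the particle's position and integrate. Here's a sketch in C (the shader does this on the GPU; the bilinear sampling and forward-Euler integration here are my assumptions about the scheme, not a port of it):

```c
#include <math.h>

/* Sketch of per-particle advection over a 2D velocity grid.
   N matches the 512x512 fluid container from the post. */
#define N 512

typedef struct { float x, y; } Vec2;

/* Bilinearly sample the velocity field at a continuous position. */
static Vec2 sampleVelocity(Vec2 grid[N][N], float x, float y) {
    int x0 = (int)x, y0 = (int)y;
    if (x0 < 0) x0 = 0; if (x0 > N - 2) x0 = N - 2;
    if (y0 < 0) y0 = 0; if (y0 > N - 2) y0 = N - 2;
    float fx = x - x0, fy = y - y0;
    Vec2 v00 = grid[y0][x0],     v10 = grid[y0][x0 + 1];
    Vec2 v01 = grid[y0 + 1][x0], v11 = grid[y0 + 1][x0 + 1];
    Vec2 v;
    v.x = (v00.x*(1-fx) + v10.x*fx)*(1-fy) + (v01.x*(1-fx) + v11.x*fx)*fy;
    v.y = (v00.y*(1-fx) + v10.y*fx)*(1-fy) + (v01.y*(1-fx) + v11.y*fx)*fy;
    return v;
}

/* forward-Euler advection: p += v(p) * dt */
static void advect(Vec2 *p, Vec2 grid[N][N], float dt) {
    Vec2 v = sampleVelocity(grid, p->x, p->y);
    p->x += v.x * dt;
    p->y += v.y * dt;
}
```

On the GPU, each particle is one texel and this runs once per particle per frame.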

I'm going to spend the next couple of weeks making big pushes in UI to prepare for a public demonstration at the end of year show. A lot to do still.

30 March 2016

Audio visualization, obstacle velocity



Here's a quick update that includes a couple of new features: passing velocity in with the obstacle data, and rendering 3d geometry to a 2d image to use as input (although in this video, actual obstacles and density fields aren't present). I've got a simple particle simulation running on the CPU in TouchDesigner whose initial velocity and direction are controlled by an audio waveform. I followed along with Derivative's 2016 Norway Workshop to get the system set up. Particle velocities are rendered as colors and piped into the CUDA sim. Adding these features went fairly smoothly.
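Rendering velocities as colors just means packing a signed 2D vector into the [0, 1] range of an image. Here's one way to do it, sketched in C (the post doesn't give the actual encoding; the symmetric range around a vmax is my assumption):

```c
/* Sketch: pack a velocity into color channels so it can travel through
   an image into the sim, and unpack it on the other side. */
typedef struct { float r, g, b; } Color;

static Color encodeVelocity(float vx, float vy, float vmax) {
    Color c;
    c.r = vx / (2.f * vmax) + 0.5f;   /* [-vmax, vmax] -> [0, 1] */
    c.g = vy / (2.f * vmax) + 0.5f;
    c.b = 0.f;                        /* unused channel */
    return c;
}

static void decodeVelocity(Color c, float vmax, float *vx, float *vy) {
    *vx = (c.r - 0.5f) * 2.f * vmax;
    *vy = (c.g - 0.5f) * 2.f * vmax;
}
```

With an 8-bit image format this quantizes velocity to 256 steps per axis, so a float texture is the better carrier if the pipeline allows it.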

Here it's very apparent that the simulation container doesn't have any interesting border conditions, which can be both a good and a bad thing; it's another feature to add. In the coming months, I'll be working on the UI, different composition techniques, experimenting with looks, and improving stability to prepare for MAT's End of Year Show. I've also become very interested in Ted's Wavelet Turbulence paper, which would really send the visuals through the roof. No timeline on that yet, but it's going on the list.