30 June 2016

BCH Rotations



For about a year I've had a secret weapon I've been using in nearly all of my projects: a colorspace based on spherical coordinates called "BCH," which stands for Brightness, Chroma, Hue. It was described by Sergey Bezryadin and Pavel Bourov. There isn't much information available about it online, but these slides explain everything.

Everyday usage is similar to HSV, but BCH has many advantages. It is grounded in real-world light response according to the user's preferred standard white point reference (D65, D50, etc.). The color is represented by a vector where the magnitude is the brightness, the inclination is the saturation, and the azimuth is the hue. The motivation behind its development was to have more realistic exposure and contrast adjustment control over low dynamic range images, such as standard JPEGs. I used this capability to great effect when performing Lukidus last year, where it drove the realtime color correction of video micrography.
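To make the vector picture concrete, here's a minimal Python sketch of the conversion as I understand it from the slides: linear RGB goes to XYZ with the standard sRGB/D65 matrix, then to Bezryadin and Bourov's "DEF" basis, and the spherical coordinates of that vector are the BCH values. The DEF coefficients are transcribed from the slides and should be treated as approximate, not authoritative.

```python
import math

# Linear-RGB -> XYZ (sRGB primaries, D65 white point). Standard matrix.
RGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

# XYZ -> DEF, the basis BCH is built on. Coefficients transcribed from
# the Bezryadin/Bourov slides; treat them as approximate.
XYZ_TO_DEF = [
    [ 0.2053,  0.7125,  0.4670],
    [ 1.8537, -1.2797, -0.4429],
    [-0.3655,  1.0120, -0.6104],
]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def rgb_to_bch(rgb):
    """Map a linear-RGB triple to (brightness, chroma, hue).

    Brightness is the length of the DEF vector, chroma is its
    inclination from the D axis (which points at the white reference),
    and hue is the azimuth in the E-F plane.
    """
    d, e, f = mat_vec(XYZ_TO_DEF, mat_vec(RGB_TO_XYZ, rgb))
    brightness = math.sqrt(d * d + e * e + f * f)
    chroma = math.acos(d / brightness) if brightness > 0 else 0.0
    hue = math.atan2(f, e)
    return brightness, chroma, hue
```

A nice sanity check is that white lands (numerically) right on the D axis, so its chroma comes out near zero, while a saturated red tilts well away from it.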



Since the color is a vector, hue is a rotation that wraps around at 2 * pi. In the video above, the angle of the velocity is mapped to the hue, and I modulate independent multipliers for the sine and cosine components of the rotation. Both start at 0, so we only see red, but as I open them up, more colors are introduced, eventually reaching a rainbow pattern (a complete rotation) and passing through interesting color palettes along the way. I'm really interested in exploring this colorspace further and have some ideas for different mappings that could benefit from it.
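The sine/cosine trick can be sketched in a few lines; the function name and the fallback for the degenerate all-zero case are my own choices, not anything from the project itself.

```python
import math

def modulated_hue(theta, m_sin, m_cos):
    """Rebuild a hue angle from independently scaled sine/cosine parts.

    theta would be the velocity angle (e.g. atan2(vy, vx)). With
    m_sin = m_cos = 1 the full rotation survives and you get a rainbow;
    with both at 0 every hue collapses to 0 (red); in-between values
    squash the rotation into partial palettes.
    """
    y = m_sin * math.sin(theta)
    x = m_cos * math.cos(theta)
    if x == 0.0 and y == 0.0:
        return 0.0  # degenerate case: everything collapses to red
    return math.atan2(y, x)
```

For example, with m_sin = 0 and m_cos = 1 the palette snaps to just two hues (0 and pi), which matches the "opening up" behavior described above.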

Here's a Shadertoy I made with a BCH implementation and comparison to HSV:
https://www.shadertoy.com/view/lsVGz1

EOYS


It's been a while since my last post. I've been recovering from the end-of-year show and working on wavelet-turbulence up-resing on the GPU. Above is a video cycling through some of the presets that were used during the show.

I presented the current iteration as "Time Differential" and made an instruction manual for the Minilab, printed and displayed next to the controller, to solidify the idea that it is a video synthesizer. This was a quick and dirty 3D model I made in Houdini and rendered using wren.



An Apple Magic Trackpad 2 provided multitouch input for adding density and velocity. It wasn't exactly elegant, but it worked fairly well: I ran this really neat hack on my laptop, which captures multitouch data and broadcasts it as OSC messages. A Processing sketch running alongside it received the position and finger-size information and sent it across an ad-hoc wifi network to my desktop, where Touch assembled it into a CHOP with 1 sample = 1 finger. That CHOP was then sent as an array to my shaders. It was surprisingly responsive and more or less reliable.
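For anyone curious what those finger messages look like on the wire, here's a minimal stdlib-only Python sketch of an OSC 1.0 message encoder. The "/finger" address and the (x, y, size) float layout are hypothetical stand-ins, not the actual addresses the hack or the Processing sketch used.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message with float32 arguments.

    Layout: padded address string, padded type-tag string (","
    followed by one "f" per argument), then big-endian float32s.
    """
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# One hypothetical message per finger: normalized x, y, and size.
packet = osc_message("/finger", 0.25, 0.75, 0.1)
```

The resulting packet can then be shipped to the desktop with a plain UDP socket.sendto, which is essentially what the relay over the ad-hoc network amounts to.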

Overall it went pretty well, with a good amount of interest and constructive feedback. The controller was available for the public to play with, and I've already made some changes to the knobs and layout as a result. A lot of people were interested in its sound-reactive capabilities, which led to a lot of clapping and yelling. Plus, kids loved it, which is a good omen, I guess.