Hello and welcome to 5 to 11 March 2017. What’s this weirdly named new project all about, then?
The provisional title for this project is RYGCBMKO. The first seven letters stand for the primary and secondary hues plus key black – red, yellow, green, cyan, blue, magenta and black (key). The O represents a circle.
The genre of this project is non-figurative abstract animation. The theme of the project is synchronised sound and motion, rhythm both seen and heard.
So every day since the start of the month, I’ve been creating at least one shot of abstract animation, forty-eight frames long. I’ve been working in both Blender (with Animation Nodes) and Krita (hand-drawn).
Here’s the guidance I’ve given myself for the visuals:
- For every day of the project, the artist will bank at least one piece of footage lasting two seconds (48 frames). If any day is skipped, Sunday will be used to catch up.
- Permitted hues are primaries and secondaries – red, yellow, green, cyan, blue, magenta, black and a minimal use of white for accents only. Grey is not permitted. Colour saturation should be strong but need not be at maximum. The colour space shall be display-referred sRGB. (Sorry, Troy.)
- Permitted shapes are cylindrical forms with rounded ends (capsules), toruses, spheres, and anything of the sort. Non-circular ellipses are not permitted. Arcs are not permitted except where they conform to the arc of a circle. Right angles or sharp angles of any sort are not permitted except when one element passes in front of another. Flat two dimensional planes are not permitted.
- Surfaces may be flat/smooth or make use of the shapes discussed above. Random or “violent” surface textures are not permitted. The surfaces of obviously three-dimensional forms should not behave according to real-life expectations. Normal-based shading effects are encouraged. Forms should not intersect.
- OpenGL and Blender Internal are permitted, but path-tracing (Cycles) is discouraged. If Cycles must be used (e.g. for motion blur, microdisplacement, etc.), the following nodes are permitted for shading purposes: Emission, Transparent, Mix, Add and Holdout.
- Any given element on screen must be translating, scaling and/or rotating. Nothing may remain still on screen with the exception of any flat background colour. Black will be the preferred background colour. Aligned elements must move in concert, not chaotically. Colours should not strobe violently.
- The elements on screen should strongly reference a 12 frame rhythm. Musically speaking, each piece of footage (aside from the titles) lasts for exactly four beats, with each beat lasting twelve frames and each sixteenth note lasting three frames.
- At no time may the video be directly produced by automatically interpreting (visualising) any audio. Oscilloscopes, visualisers and other direct algorithmic visual interpretations of the music are forbidden. Automated interpretation of the audio may, however, be used as an aid during production.
- Post-processing should be limited to editing, adding glow, sandwiching different elements to one another, etc. Motion blur is permitted to prevent frame strobing effects.
- It is permitted and encouraged for blocks of individual footage to be edited and cut up to better synchronise to the music.
So far my favourite pieces of footage are a little cometty thing skipping across what appears to be a dark liquid surface and a flashy metaball amoeba that escaped from the hypercolour 1990s.
Forty-eight frames works out at four beats for a piece of music ticking along at 120 beats per minute, or “twelve-frame time” in animation-speak. Here are the stipulations for the soundtrack:
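The arithmetic behind that can be sketched in a few lines – this is just a hedged worked example, assuming the 24 fps frame rate implied by the 48-frame / two-second figure:

```python
FPS = 24   # assumed project frame rate (48 frames / 2 seconds)
BPM = 120  # tempo stipulated for the soundtrack

frames_per_beat = FPS * 60 // BPM       # 1440 / 120 = 12 frames per beat
frames_per_sixteenth = frames_per_beat // 4  # 12 / 4 = 3 frames per sixteenth

shot_frames = 48
beats_per_shot = shot_frames // frames_per_beat  # 48 / 12 = 4 beats

print(frames_per_beat, frames_per_sixteenth, beats_per_shot)  # 12 3 4
```

So a 12-frame beat and a 3-frame sixteenth note fall straight out of 24 fps at 120 BPM, which is why every shot lines up to exactly four beats.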
- The genre of the accompanying music will be electronic, likely house music. The music will be 120 beats per minute to make it easier to synchronise the video. Each piece of footage will last for exactly four beats, with each beat lasting twelve frames and each sixteenth note lasting three frames.
- The sound sources will be analogue – the ARP Odyssey with the Volca Beats and the Monotron Delay for backup. The sound may be processed using stomp boxes and digital effects, but anything other than mastering effects in the digital audio workstation must be in place and set before the sound is recorded. The creative emphasis is being inspired through discovery and performance, not polishing turds.
- Sound will be recorded track-by-track in mono into a digital audio workstation for mixing and compiling.
- MIDI may not be used except for the purpose of synchronisation. MIDI note information is forbidden, but MIDI gate is OK. Sequenced pitch and gate information should be fed to the ARP Odyssey by the SQ-1 via patch cables. The artist is reminded that the SQ-1 can be clock-divided to sixteenth, eighth and quarter notes even when synched to the Volca Beats by holding down Play/Stop while turning the SQ-1 on and turning the mode knob to different settings.
- The sonic elements should also strongly reference a 12 frame beat. Audio may be produced by automatically interpreting (sonifying) video through the ARP Odyssey if the means are available. Recorded audio may be slowed down or sped up in factors of two for rhythm, and slowed down or sped up as necessary for tuning. Timestretching is forbidden because it sounds naff.
- It is at the artist’s discretion as to when the soundtrack shall be created. Soundtrack work shall not be considered a substitute for animation work.
The original delivery date was 1 April, but I’m keeping this going until Easter, which is when I wind up the secondment at the day job. The intent is to cut the footage and audio together over the Easter long weekend and have something ready to show off on Easter Monday.
In terms of quality, I have no expectations. This is a holding pattern project to try some new things, build new skills and establish a daily creative habit within a timeboxed project. Showing up and doing the work is more important than the watchability of the finished piece. This will probably not be portfolio material.
And once it’s finished, I take a breath, put it to one side and move on, very probably back to Pointy and Gronky. 🙂