Mine All Mine

This was always meant to be a back-burner project.

I had worked with Ol’ Man Jonny a while back on the video Armageddion Time, Using Blender Internal for Cell-Shading. This time we wanted to mix live-action with animation. My first thought was to use the then-new (it’s April 2021 at this point) AI-based rotoscoping to separate the backgrounds of videos, placing cartoon characters in and around the actors. Oh, but AI has grown so much since then.

Ol’ Man Jonny and his wife Zanita, however, had other plans.

New plan… to animate Zanita’s illustration work and combine it with greenscreen footage.

I was able to persuade Jonny that I could speed up the tune from 148 to 150 BPM. At 150 BPM a beat was exactly 10 frames long, so a dance animation loop would be 10, 20 or 40 frames long, and all the animals would keep perfect time. The increase of 2 BPM was basically unnoticeable.
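
For the curious, the arithmetic only works out this neatly at 25 fps, which the 10-frame beat implies the project used:

```python
# frames per beat at a given tempo and frame rate
fps = 25   # implied by the 10-frame beat; 150 BPM only divides evenly at 25 fps
bpm = 150
frames_per_beat = fps * 60 / bpm

print(frames_per_beat)  # 10.0 -> loops of 10, 20 or 40 frames stay on the beat
print(fps * 60 / 148)   # ~10.14 at the original 148 BPM, which would slowly drift
```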

We hired a large room in an independent gallery/studio in Sheffield for the day in November 2021, and pinned up a large sheet of papery greenscreen fabric. It was very cold, but I think that improved the dancing, despite the fact it was meant to be summer!

The cutouts would be created by Zanita and animated by me. I did some super-rough sketches explaining how to split the animals into cutouts, and Zanita went to work painting and scanning collections of cutout elements for me to animate.

Despite the fast cutting, this was all contained within a single scene, with the camera and the greenscreen footage plane teleporting around it. The world was a mix of 2D and 3D: everything apart from the hills was 2D cutouts, placed in a 3D scene and forever turning to face the camera. For the bees and the hedgehogs it was even more complex… these animals had different versions depending on which way they were facing. A collection of drivers read the direction of the camera and the animal, and selected which animation to show.
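
The post doesn’t show the driver setup, but the selection logic boils down to something like this sketch: a helper registered for driver expressions that buckets the camera’s direction, relative to the animal’s heading, into one of the pre-drawn facings. The function name, the four-way split and the wiring are my assumptions, not the actual rig:

```python
import math
import bpy

def facing_variant(cam_x, cam_y, obj_x, obj_y, obj_rot_z, variants=4):
    """Pick which pre-drawn facing (0 .. variants-1) to show, from the
    angle between the animal's heading and the direction to the camera."""
    to_cam = math.atan2(cam_y - obj_y, cam_x - obj_x)
    relative = (to_cam - obj_rot_z) % (2 * math.pi)
    return int(relative / (2 * math.pi / variants)) % variants

# Register it so a driver expression such as
#   facing_variant(cam_x, cam_y, obj_x, obj_y, obj_rot_z)
# can call it, with each argument wired up as a Transform Channel
# driver variable; the result can then drive e.g. a variant's visibility.
bpy.app.driver_namespace["facing_variant"] = facing_variant
```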

Well, actually, there were two scenes. There was one where I used Blender’s video editor to edit the greenscreen footage, then there was a 3D scene with a lake, hills round the edge covered in trees, and a few animals. The 3D scene was rendered using viewport rendering, which is about four times as fast as a full render.

I would be able to use my addon Greenscreen Within Eevee Pro to put the greenscreen footage inside a 3D scene and surround it with cutout animation.

One challenge was that the animals were all quite small. Even with cartoonishly inaccurate scale, it would be weird for a frog to be more than knee-high. We would have to either view the actors from low down with the animals in the foreground, or put the animals in trees, or cut between the animals and the people. I probably should have done more cutting between the animals and the people.

The hearts were particles, but they had to pop into existence only on the beat. I keyframed the particle lifetime and the emitter position so that, except on the beats, hearts were emitted several metres underground and lasted only a single frame.
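
A minimal sketch of that trick, with placeholder names and values (the real keyframes were set by hand, not scripted):

```python
import bpy

emitter = bpy.data.objects["HeartEmitter"]   # hypothetical object name
settings = emitter.particle_systems[0].settings

beat = 10  # one beat = 10 frames at 150 BPM / 25 fps
for frame in range(1, 401):
    on_beat = frame % beat == 0
    # Off the beat, park the emitter underground and kill hearts after one frame
    emitter.location.z = 0.0 if on_beat else -5.0
    settings.lifetime = 50 if on_beat else 1
    emitter.keyframe_insert("location", index=2, frame=frame)
    settings.keyframe_insert("lifetime", frame=frame)
```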

As I was reaching the end of the project, the bizarre happened. I had a series of three seizures one morning and was rushed to hospital for brain surgery. In March 2020 one of my fillings had fallen out, and as Covid prevented me from getting to my dentist, bacteria had spent the last two years growing from my gum into my brain. It was May 2022 by now, and our hopes of releasing for Summer 2022 were becoming unrealistic. It would have to wait a little. At this point the vid was in the bugfix stage… watching it back with a fine-tooth comb, looking for glitches. Most of the glitches came down to how the viewport handled videos on planes: sometimes they would play the wrong frame. After a while I came up with rendering two versions and editing the two together, as the random glitches did not always happen on the same frame.
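
For anyone facing the same problem, here is a sketch of how the disagreeing frames could be flagged automatically by diffing the two renders (my own idea, not necessarily the workflow used here; folder names are placeholders, and it uses Pillow rather than anything Blender-specific):

```python
from pathlib import Path
from PIL import Image, ImageChops

render_a = Path("render_a")   # hypothetical folders, one PNG per frame
render_b = Path("render_b")

for frame_a in sorted(render_a.glob("*.png")):
    frame_b = render_b / frame_a.name
    diff = ImageChops.difference(Image.open(frame_a), Image.open(frame_b))
    if diff.getbbox():  # None when the two frames are pixel-identical
        print(f"{frame_a.name}: renders disagree, check for a glitch")
```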

And we finally released it for Valentine’s Day 2023.

2D IN YOUR 3D


Blender Perspective Tricks

when you bring 2d into your 3d, be it images, greenscreen footage, text or grease pencil, there are methods for making it face the camera.

these are those.

00:00 Intro
01:03 Sidebar : Use Images as Planes
01:39 Copy Rotation
02:17 Sidebar : Pixel Art Textures
03:36 If it’s Grease Pencil
05:52 Sidebar : Multiple Cameras
06:42 Drivers for multiple cameras
11:37 Geometry nodes and Particles
16:29 Speech Bubbles – Always the same size!
20:08 Equirectangular projection
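
As a taste of the simplest of these tricks, the Copy Rotation chapter: giving a cutout plane the active camera’s rotation turns it into a billboard that always faces the view. A sketch, with the object name assumed:

```python
import bpy

plane = bpy.data.objects["Cutout"]   # hypothetical cutout plane
constraint = plane.constraints.new(type='COPY_ROTATION')
constraint.target = bpy.context.scene.camera   # plane now matches the camera's rotation
```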

I’ve been chipping away at this on my bus commute in the mornings and evenings (I don’t have a lot of free time, and my battery only lasts a third of the bus journey).

 

Free Automated Lipsync: Blender, Makehuman + Rhubarb

NEWS FLASH!

We can now do fully automated nine-phoneme lipsync, using the same program that Ron Gilbert and his team used on Thimbleweed Park.

Rhubarb Lipsync is created by Daniel S Wolf, and the Blender addon is created by Scaredyfish.

There are a lot of links in this tutorial, so here goes…

First there’s the software:

  • Rhubarb Lipsync Download Page:
    https://github.com/DanielSWolf/rhubarb-lip-sync/releases
  • Rhubarb Lipsync Blender Addon:
    https://github.com/scaredyfish/blender-rhubarb-lipsync
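
The addon drives Rhubarb for you, but it can also be run by hand. Here is a sketch of calling it from Python and reading back the mouth cues; the file names are placeholders, and the flags and JSON layout are as I understand them from the Rhubarb documentation:

```python
import json
import subprocess

# Analyse a dialogue recording and export the mouth cues as JSON
subprocess.run(
    ["rhubarb", "-f", "json", "-o", "cues.json", "dialogue.wav"],
    check=True,
)

with open("cues.json") as f:
    cues = json.load(f)["mouthCues"]

for cue in cues:
    # Each cue: start time, end time, and a mouth shape (A-F plus G, H, X)
    print(cue["start"], cue["end"], cue["value"])
```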

And then some bits I created myself:


I was really excited to discover this, even if it makes my previous tutorials obsolete!

 

13 TEETH

Short film and accompanying interviews created for Sheffield Festival of the Mind

Accompanying interviews:

To quote the festival’s press:

“Mike Futcher and Mark Gwynne Jones’s short film is a gothic thriller about a man who thinks the voices in his head are coming from his teeth. His quest to extract them (eek!) leads to a shocking revelation.
Based on Gwynne Jones’s narrative poem, the film mixes live action and animation to brilliantly unsettling effect. A rare exponent of dentistry noir, it asks the big question: what is behind consciousness?
In the accompanying documentary, The Making of 13 Teeth, Sheffield academics talk us through some of the film’s most striking images, including skeletons, nerves and teeth.
The film is unsuitable for young children.”