Mine All Mine

This was always meant to be a back-burner project.

I had worked with Ol’ Man Jonny a while back on the video Armageddion Time, using Blender Internal for cel shading. This time we wanted to mix live-action with animation. My first thought was to use the then-new (it’s April 2021 at this point) AI-based rotoscoping to separate the backgrounds of videos, placing cartoon characters in and around the actors. Oh, but AI has grown so much since then.

Ol’ Man Jonny and his wife Zanita, however, had other plans.

New plan… to animate Zanita’s illustration work and combine it with greenscreen footage.

I was able to persuade Jonny that I could speed up the tune from 148 to 150 BPM. At 150 BPM a beat was exactly 10 frames long, so a dance animation loop could be 10, 20 or 40 frames long and all the animals would keep perfect time. The increase of 2 BPM was basically unnoticeable.
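(For the curious, that 10-frame figure assumes the project’s 25 fps frame rate, which is what the arithmetic implies: frames per beat = fps × 60 ÷ BPM = 25 × 60 ÷ 150 = 10, so a four-beat bar lands on exactly 40 frames.)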

We hired a large room in an independent gallery / studio in Sheffield for the day in November 2021, and pinned up a large sheet of papery greenscreen fabric. It was very cold, but I think that improved the dancing, despite the fact the video was meant to be set in summer!

The cutouts would be created by Zanita and animated by myself. I did some super-rough sketches explaining how to split the animals into cutouts, and Zanita went to work painting and scanning collections of cutout elements for me to animate.

Despite the fast cutting, this was all contained within a single scene, with the camera and the greenscreen footage plane teleporting around it. The world was a mix of 2D and 3D: everything apart from the hills was a 2D cutout, placed in the 3D scene and forever turning to face the camera. For the bees and the hedgehogs this was even more complex… these animals had different versions depending on which way they were facing. A collection of drivers read the directions of the camera and the animal, and selected which animation to show.
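The real setup isn’t reproduced here, but as a rough sketch of the idea (the object names, the local -Y ‘forward’ convention and the angle thresholds are all illustrative, not taken from the project file), a driver function can work out where the camera sits relative to an animal and return an index that other drivers use to switch between the pre-drawn variants:

    import bpy
    from math import atan2, pi

    def facing_variant(cam_name, obj_name):
        """Return 0 (front), 1 (side) or 2 (back) depending on where the
        camera sits relative to the object's local -Y 'forward' axis."""
        cam = bpy.data.objects[cam_name]
        obj = bpy.data.objects[obj_name]
        # Direction from the object to the camera, in the object's local space.
        to_cam = obj.matrix_world.inverted() @ cam.matrix_world.translation
        angle = abs(atan2(to_cam.x, -to_cam.y))   # 0 = camera dead ahead
        if angle < pi / 3:
            return 0      # front-facing artwork
        elif angle < 2 * pi / 3:
            return 1      # side-facing artwork
        return 2          # back-facing artwork

    # Register the function so driver expressions such as
    # facing_variant("Camera", "Hedgehog") can call it (needs Python
    # auto-run enabled, or re-registering on file load).
    bpy.app.driver_namespace["facing_variant"] = facing_variant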

Well — actually there were two scenes. There was one where I used Blender’s video editor to edit the greenscreen footage, and then a 3D scene with a lake ringed by tree-covered hills and a few animals. The 3D scene was rendered using viewport rendering, which is roughly four times as fast as a full render.
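If you want to try that yourself, the viewport render can be kicked off from Python as well as from the View menu. A minimal sketch (run it from a workspace with a 3D Viewport open, since it renders whatever that viewport is showing):

    import bpy

    # Render the whole animation straight from the current viewport shading,
    # writing frames to the scene's normal output path. Much faster than a
    # full Eevee/Cycles render, at the cost of viewport-quality output.
    bpy.ops.render.opengl(animation=True, view_context=True)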

I would be able to use my addon Greenscreen Within Eevee Pro to put the greenscreen footage inside a 3D scene, and surround it with cutout animation.

One challenge was that the animals were all quite small. Even with cartoonishly inaccurate scale it would be weird for a frog to be more than knee-high. We would have to either view the actors from low down with the animals in the foreground, put the animals in trees, or cut between the animals and the people. I probably should have done more of that cutting between the animals and the people.

The hearts were particles, but they had to pop into existence only on the beat. I keyframed the particle lifetime and the emitter so that, except on the beats, the hearts were emitted several metres under the ground and lasted only a single frame.
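This isn’t the exact setup from the project file, but a hedged sketch of the trick looks something like this (the emitter name, depths and frame numbers are placeholders):

    import bpy

    scene = bpy.context.scene
    emitter = bpy.data.objects["HeartEmitter"]        # placeholder name
    settings = emitter.particle_systems[0].settings

    # One beat every 10 frames (150 BPM at 25 fps).
    beat_frames = set(range(scene.frame_start, scene.frame_end + 1, 10))

    for frame in range(scene.frame_start, scene.frame_end + 1):
        on_beat = frame in beat_frames

        # Off the beat, park the emitter well below the ground plane so any
        # hearts it emits are never seen; on the beat, pop it back up.
        emitter.location.z = 0.0 if on_beat else -10.0
        emitter.keyframe_insert(data_path="location", index=2, frame=frame)

        # Give off-beat particles a one-frame lifetime, on-beat ones a full beat.
        settings.lifetime = 10 if on_beat else 1
        settings.keyframe_insert(data_path="lifetime", frame=frame)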

As I was reaching the end of the project the bizarre happened. I had a series of three seizures one morning and was rushed to hospital for brain surgery. In March 2020 one of my fillings had fallen out, and as Covid prevented me from getting to my dentist, a bacterial infection had spent the last two years growing from my gum into my brain. It was May 2022 by now, and our hopes of releasing for Summer 2022 were becoming unrealistic. It would have to wait a little. At this point the vid was in the bugfix stage… going back through it with a fine-tooth comb looking for glitches. Most of the glitches were down to how the viewport handled videos on planes: sometimes they would play the wrong frame. After a while I settled on rendering two versions and editing the two together, as the random glitches were not always happening on the same frame.

And we finally released it for Valentine’s Day 2023.

2D IN YOUR 3D

2D IN YOUR 3D!!!

Blender Perspective Tricks

when you bring
2d into your 3d
be it
images,
greenscreen footage,
text or
grease pencil,
there are methods
for making it
face the camera.

these are those.

00:00 Intro
01:03 Sidebar : Use Images as Planes
01:39 Copy Rotation
02:17 Sidebar : Pixel Art Textures
03:36 If it’s Grease Pencil
05:52 Sidebar : Multiple Cameras
06:42 Drivers for multiple cameras
11:37 Geometry nodes and Particles
16:29 Speech Bubbles – Always the same size!
20:08 Equirectangular projection
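A taste of the first of those methods, the Copy Rotation trick: a hedged sketch with placeholder object names, not the exact setup from the video. Copying the camera’s rotation keeps a cutout plane parallel to the camera plane, which is usually what a 2D cutout wants:

    import bpy

    plane = bpy.data.objects["Cutout"]       # the image/greenscreen plane
    camera = bpy.data.objects["Camera"]

    # A Copy Rotation constraint keeps the plane facing the same way as the
    # camera, so the flat artwork never shows its edge.
    con = plane.constraints.new(type='COPY_ROTATION')
    con.target = camera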

I’ve been chipping away at this over my bus commute in the mornings and evenings (I don’t have a lot of free time, and my battery only lasts a third of the bus journey).

 

ODIN – the Kid on the Mountain – A Jig for a Kiss

Celebrating Odin being 5 months old today! (though it was filmed about 2 months ago)

This video features me, my son & my dad ~ Odin Futcher-Rose dancing, Micheal Futcher on graphics and Peter Darling on flute ~ and various other woolly wonders created & collected by my partner Kerry Rose – many of them part of a Hyperbolic Crochet Forest exhibition.

Created using my addon Greenscreen Within Eevee

Material Greenscreen




Every so often it’s good to take a bit of greenscreen footage, place it like a cardboard cutout in a 3D scene, and do some nice smooth CG camera moves.

Now you can key out the greenscreen within material nodes!
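The addon builds the keying nodes for you, but as a rough illustration of what keying inside a material means (a minimal sketch, not the addon’s actual node tree; it assumes Blender 3.3+ for the Separate Color node, and the threshold is a placeholder to tweak):

    import bpy

    mat = bpy.data.materials.new("GreenscreenKey")
    mat.use_nodes = True
    mat.blend_method = 'CLIP'   # pre-4.2 Eevee alpha clipping; newer versions differ
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()

    tex = nodes.new('ShaderNodeTexImage')        # load the greenscreen footage here
    sep = nodes.new('ShaderNodeSeparateColor')   # split footage into R / G / B
    max_rb = nodes.new('ShaderNodeMath')         # the larger of red and blue
    max_rb.operation = 'MAXIMUM'
    greenness = nodes.new('ShaderNodeMath')      # how much greener than red/blue?
    greenness.operation = 'SUBTRACT'
    keep = nodes.new('ShaderNodeMath')           # 1 = keep the pixel, 0 = key it out
    keep.operation = 'LESS_THAN'
    keep.inputs[1].default_value = 0.1           # keying threshold - tweak to taste
    bsdf = nodes.new('ShaderNodeBsdfPrincipled')
    out = nodes.new('ShaderNodeOutputMaterial')

    links.new(tex.outputs['Color'], sep.inputs['Color'])
    links.new(sep.outputs['Red'], max_rb.inputs[0])
    links.new(sep.outputs['Blue'], max_rb.inputs[1])
    links.new(sep.outputs['Green'], greenness.inputs[0])
    links.new(max_rb.outputs[0], greenness.inputs[1])
    links.new(greenness.outputs[0], keep.inputs[0])
    links.new(tex.outputs['Color'], bsdf.inputs['Base Color'])
    links.new(keep.outputs[0], bsdf.inputs['Alpha'])
    links.new(bsdf.outputs['BSDF'], out.inputs['Surface'])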

Credits in order of appearance:
Little House by dono
Aquarius Loading theme from Super Carling the Spider by Joe Dixon
Floppy Disk Drive Write by mrauralization
Material Girl by Madonna arranged for C64 by Sami Sepp
Photo by Eugene Capon from Pexels
Peacock by Magdabed

Free Automated Lipsync: Blender, Makehuman + Rhubarb

NEWS FLASH!

We can now do fully automated nine-phoneme lipsync, using the same program that Ron Gilbert and his team used on Thimbleweed Park.

Rhubarb Lipsync is created by Daniel S Wolf, and the Blender addon is created by Scaredyfish.

There are a lot of links in this tutorial, so here goes…

First there’s the software:

  • Rhubarb Lipsync Download Page:
    https://github.com/DanielSWolf/rhubarb-lip-sync/releases
  • Rhubarb Lipsync Blender Addon:
    https://github.com/scaredyfish/blender-rhubarb-lipsync

And then some bits I created myself:


I was really excited to discover this – even if it makes my previous tutorials obsolete!
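To give a flavour of what ‘fully automated’ means here: Rhubarb listens to the dialogue and spits out a list of timed mouth shapes (its default TSV output is a time in seconds and a shape letter per line). The addon above applies those for you, but a hedged sketch of doing it by hand (the file name and the shape-key naming are assumptions) would look roughly like this:

    import bpy

    obj = bpy.context.active_object
    fps = bpy.context.scene.render.fps

    # Parse Rhubarb's TSV output: "<time in seconds>\t<mouth shape letter>".
    with open(bpy.path.abspath("//dialogue.tsv")) as f:
        cues = [line.split() for line in f if line.strip()]

    # Assumes one shape key per Rhubarb mouth shape, named e.g. "Mouth_A".
    shapes = sorted({shape for _, shape in cues})
    for time, shape in cues:
        frame = float(time) * fps
        for name in shapes:
            key = obj.data.shape_keys.key_blocks.get("Mouth_" + name)
            if key:
                key.value = 1.0 if name == shape else 0.0
                key.keyframe_insert(data_path="value", frame=frame)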

 

Lipsync for the Lazy

LIPSYNC for the LAZY on Blender.

Episode 1: Tweened Cartoon.

A quick and simple way to animate dialog.

This would be ideal for a YouTuber who wants to represent themselves as an avatar, or for quickly animating a cartoon that consists mostly of dialog.

In later episodes I’m going to look at using more realistic characters (e.g. from Makehuman) and stop-motion-style characters with replacement mouths and faces.
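The heart of the trick here is one mouth-open shape key whose value reacts to the loudness of the dialogue track (presumably via Blender’s Bake Sound to F-Curves, which is the usual way to do this). A hedged sketch of the setup steps, scripted (the object and file names are placeholders, and the bake operator is easier to run from the Graph Editor than from a script):

    import bpy

    obj = bpy.context.active_object          # your character mesh

    # 1. Add a mouth-open shape key and give its value an initial keyframe,
    #    so there is an F-Curve for the sound bake to land on.
    obj.shape_key_add(name="Basis", from_mix=False)
    mouth = obj.shape_key_add(name="MouthOpen", from_mix=False)
    mouth.value = 0.0
    mouth.keyframe_insert(data_path="value", frame=1)

    # 2. With that F-Curve selected in the Graph Editor, bake the dialogue's
    #    loudness onto it: Key > Bake Sound to F-Curves (called Sound to
    #    Samples in recent Blender versions).
    # bpy.ops.graph.sound_bake(filepath="//dialogue.wav")   # 2.8-3.x name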

00:17 Touching up your sound in Audacity
00:54 Important setup steps
01:12 Import the sound
01:51 Add shape keys
03:31 Shapekeys react to audio
06:05 Shapekeys for expressions
09:22 Animate expressions
10:17 Why don’t we use actions for the body?
11:52 Setting up a pose-lib
12:14 Animating the body

A slightly more advanced version of the character can be found here: https://www.blendswap.com/blends/view/92042