Mine All Mine

This was always meant to be a back-burner project.

I had worked with Ol’ Man Jonny a while back on the video Armageddion Time, using Blender Internal for cel-shading. This time we wanted to mix live-action with animation. My first thought was to use the then-new (it was April 2021 at this point) AI-based rotoscoping to separate the backgrounds of videos, placing cartoon characters in and around the actors. Oh, but AI has grown so much since then.

Ol’ Man Jonny and his wife Zanita, however, had other plans.

New plan… to animate Zanita’s illustration work and combine it with greenscreen footage.

I was able to persuade Jonny that I could speed up the tune from 148 to 150 BPM. At 150 BPM a beat was exactly 10 frames long (at 25 fps), so a dance animation loop could be 10, 20 or 40 frames long and all the animals would keep perfect time. The increase of 2 BPM was basically unnoticeable.
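For anyone wanting to run the same numbers on their own tune, the arithmetic is just frame rate divided by beats per second. A quick sketch (the 25 fps frame rate is my inference from the 10-frames-per-beat figure, not something stated in the project files):

# Frames per beat = frames per second / beats per second.
def frames_per_beat(fps, bpm):
    return fps * 60.0 / bpm

print(frames_per_beat(25, 148))  # ~10.14 frames: loops slowly drift off the beat
print(frames_per_beat(25, 150))  # exactly 10 frames: 10/20/40-frame loops stay in time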

We hired a large room in an independent gallery/studio in Sheffield for the day in November 2021, and pinned up a large sheet of papery greenscreen fabric. It was very cold, but I think that improved the dancing, despite the fact that the video was meant to be set in summer!

The cutouts would be created by Zanita and animated by me. I did some super-rough sketches explaining how to split the animals into cutouts, and Zanita went to work painting and scanning collections of cutout elements for me to animate.

Despite the fast cutting this was all contained within a single scene, with the camera and the greenscreen footage plane teleporting around the scene. The world was a mix of 2D and 3D – everything apart from the hills was 2D cutouts, placed in a 3D scene, and forever turning to face the camera. For the bees and the hedgehogs this was even more complex… these animals had different versions depending on which way they were facing. A collection of drivers read the direction of the camera and the animal, and selected which animation to show.
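As a rough illustration of that driver idea (this is a sketch with made-up names, not the production setup), a small function in Blender’s driver namespace can compare the animal’s facing direction with the direction to the camera and return an index that picks which set of cutouts is visible:

import math
import bpy

def facing_variant(cam_x, cam_y, ani_x, ani_y, ani_rot_z):
    """Return 0 = facing the camera, 1 = side on, 2 = facing away.

    Assumes the cutout faces along +X when its Z rotation is zero.
    """
    # Angle from the animal to the camera in the XY plane.
    to_cam = math.atan2(cam_y - ani_y, cam_x - ani_x)
    # How far the animal's heading is from pointing at the camera,
    # wrapped into the range -pi..pi.
    diff = abs((ani_rot_z - to_cam + math.pi) % (2 * math.pi) - math.pi)
    if diff < math.pi / 4:
        return 0      # roughly towards the camera
    if diff < 3 * math.pi / 4:
        return 1      # roughly side on
    return 2          # roughly away from the camera

# Expose the function to drivers; a driver expression such as
# facing_variant(cam_x, cam_y, ani_x, ani_y, ani_rot_z) can then take its
# inputs from transform-channel driver variables on the camera and animal.
bpy.app.driver_namespace["facing_variant"] = facing_variant

The returned index would then drive something like a visibility or frame-offset property on each set of cutouts.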

Well, actually there were two scenes. There was one where I used Blender’s video editor to edit the greenscreen footage, and then there was a 3D scene with a lake, ringed by tree-covered hills, and a few animals. The 3D scene was rendered using viewport rendering, which is about four times as fast.
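By “viewport rendering” I mean Blender’s Viewport Render Animation. For the curious, the same thing can be kicked off from Python as a one-liner (run it from a normal Blender session; depending on the version it may want an active 3D Viewport):

import bpy

# Viewport (OpenGL) render of the whole animation - the fast path
# mentioned above, as opposed to a full final-quality render.
bpy.ops.render.opengl(animation=True)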

I would be able to use my addon Greenscreen Within Eevee Pro to put the greenscreen footage inside a 3D scene and surround it with cutout animation.

One challenge was that the animals were all quite small. Even with cartoonishly inaccurate scale, it would be weird for a frog to be more than knee-high. We would have to either view the actors from low down with the animals in the foreground, put the animals in trees, or cut between the animals and the people. I probably should have done more cutting between the animals and the people.

The hearts were particles – but they had to pop into existence only on the beat. I keyframed the particle lifetime and the emitter so that, except on the beats, the hearts would be emitted several metres under the ground and would last only a single frame.
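In script form the trick looks roughly like this (a sketch only: the object and particle-system names are invented, and it assumes the 10-frames-per-beat timing from earlier):

import bpy

emitter = bpy.data.objects["HeartEmitter"]   # hypothetical names
hearts = bpy.data.particles["Hearts"]

FRAMES_PER_BEAT = 10      # 150 BPM at 25 fps
FIRST_BEAT = 1
LAST_FRAME = 2000

for beat in range(FIRST_BEAT, LAST_FRAME, FRAMES_PER_BEAT):
    # Off the beat: emitter hidden underground, particles die after one frame.
    for frame in (beat - 1, beat + 1):
        emitter.location.z = -5.0
        emitter.keyframe_insert("location", index=2, frame=frame)
        hearts.lifetime = 1
        hearts.keyframe_insert("lifetime", frame=frame)

    # On the beat: pop up to the real height with a visible lifetime.
    emitter.location.z = 1.0
    emitter.keyframe_insert("location", index=2, frame=beat)
    hearts.lifetime = 30
    hearts.keyframe_insert("lifetime", frame=beat)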

As I was reaching the end of the project, the bizarre happened. I had a series of three seizures one morning and was rushed to hospital for brain surgery. In March 2020 one of my fillings had fallen out, and as Covid prevented me from getting to my dentist, a bacterial infection had spent the last two years growing from my gum into my brain. It was May 2022 by now, and our hopes of releasing for summer 2022 were becoming unrealistic. It would have to wait a little. At this point the vid was in the bugfix stage… watching it back with a fine-tooth comb, looking for glitches. Most of the glitches were in how the viewport handled videos on planes: sometimes they would play the wrong frame. After a while I came up with rendering two versions and editing the two together… as the random glitches were not always happening on the same frame.

And we finally released it for Valentine’s Day 2023.

LEAF – THE CONTROLS: different, but still tight

The controls in my recently released game LEAF are very different from those in most platformers – but they still work!

This is a reply to “5 Reasons Your Indie Platformer Game Sucks” by Jonas Tyroller: https://youtu.be/vFsJIrm2btU

(I hope I pronounced his name correctly)

The game can be downloaded here:

LEAF

LEAF

LEAF is a game for the Acorn Archimedes which I started in 1998, returned to and finished in 2003, never actually released, but displayed at galleries until my A3010 disk drive got tired of touring. After that it sat mouldering in a cupboard until I brought the disks to the RISC OS London Show 2018 and Rob Coleman kindly restored it. You can now download it and run it in an Archimedes emulator.


DOWNLOAD

I’m particularly keen to see if anyone actually starts making levels with it, but I guess that’s optimistic.

GETTING THE EMULATOR WORKING

The emulator that I’ve been using is Arculator – download from here

When you unzip and run it, you get a message about missing ROMs. You can download them from here

I chose riscos3_10.zip, but the other versions may well work too.

Next, put the contents of that zip in the appropriate subfolder of the ROMS folder. I chose RISC OS 3.1, so I put it in the RISCOS3 folder.

Next, you have to tell Arculator that you’re running RISC OS 3.

LEAF was created on an A3010, which has an ARM250 CPU, so choose that CPU type:


After that, you can use the disk menu to load the LEAF disk image!

CONTROLS

Z      : LEFT            
X      : RIGHT           
SHIFT  : JUMP            
RETURN : CATCH FLY       
SPACE  : GO THROUGH DOOR,
         PICK UP OBJECT, 
         SWITCH SWITCH

Free Automated Lipsync: Blender, Makehuman + Rhubarb

NEWS FLASH!

We can now do fully automated nine-phoneme lipsync, using the same program that Ron Gilbert and his team used on Thimbleweed Park.

Rhubarb Lipsync is created by Daniel S Wolf, and the Blender addon is created by Scaredyfish.

There are a lot of links in this tutorial, so here goes…

First there’s the software:

  • Rhubarb Lipsync Download Page:
    https://github.com/DanielSWolf/rhubarb-lip-sync/releases
  • Rhubarb Lipsync Blender Addon:
    https://github.com/scaredyfish/blender-rhubarb-lipsync

And then some bits I created myself:


I was really excited to discover this – even if it makes my previous tutorials obsolete!
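For the curious, what Rhubarb produces is a list of timed mouth shapes (letters A–F plus a few extended ones), which the addon then turns into keyframes. If you ever want to consume that output yourself, here’s a hedged sketch that reads Rhubarb’s JSON export and keys a matching set of single-letter shape keys; the file path, object name and 25 fps are all assumptions for the example, so double-check the JSON layout against your own output:

import json
import bpy

FPS = 25
face = bpy.data.objects["Face"]             # hypothetical object
keys = face.data.shape_keys.key_blocks      # shape keys named "A".."X"

# Rhubarb's JSON export lists mouth cues with start/end times in seconds
# and a mouth-shape letter as the value.
with open("/tmp/dialogue.json") as f:
    cues = json.load(f)["mouthCues"]

for cue in cues:
    frame = round(cue["start"] * FPS)
    for kb in keys:
        if len(kb.name) == 1:               # only the single-letter mouth shapes
            kb.value = 1.0 if kb.name == cue["value"] else 0.0
            kb.keyframe_insert("value", frame=frame)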

 

Lipsync for the Lazy

LIPSYNC for the LAZY on Blender.

Episode 1: Tweened Cartoon.

A quick and simple way to animate dialog.

This would be ideal for a YouTuber who wants to represent themselves as an avatar, or for quickly animating a cartoon that mostly consists of dialog.

In later episodes I’m going to look at using more realistic characters (e.g. from Makehuman), and stop-motion-style characters with replacement mouths and faces.

00:17 Touching up your sound in Audacity
00:54 Important setup steps
01:12 Import the sound
01:51 Add shape keys
03:31 Shape keys react to audio
06:05 Shape keys for expressions
09:22 Animate expressions
10:17 Why don’t we use actions for the body?
11:52 Setting up a pose-lib
12:14 Animating the body
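
To make the shape-key steps in that chapter list concrete: the video makes a shape key react to the audio (via the Graph Editor’s Bake Sound to F-Curves), but the underlying idea is just a shape key whose value is animated over time. A minimal hand-keyed sketch, with object and shape-key names that are examples only:

import bpy

face = bpy.data.objects["Avatar"]                      # hypothetical object
mouth = face.data.shape_keys.key_blocks["MouthOpen"]   # hypothetical shape key

# Closed, open, closed again - in the "react to audio" approach an
# F-curve generated from the sound file replaces these hand-set keys.
for frame, value in [(1, 0.0), (5, 1.0), (9, 0.0)]:
    mouth.value = value
    mouth.keyframe_insert("value", frame=frame)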

A slightly more advanced version of the character can be found here: https://www.blendswap.com/blends/view/92042