What was the brief from Smart Energy and how did you go about bringing it to life?
When we first saw this script we were all incredibly excited as we knew it would be a once-in-a-lifetime project. We had to recreate Einstein in digital form to help explain the benefits of smart meters.
We realised that to digitally recreate such an amazing human being of such historical importance would be immensely challenging but hugely rewarding. We had to make sure that not only was this a convincing CG human, but that it also portrayed Einstein with the familiarity that everyone has for him.
A slightly intimidating prospect, but a project like this doesn’t come around often. We spent the first few weeks researching Einstein, both his physical appearance through the ages and, very importantly, who he was as a person. What did he stand for and what did he believe in? How did he interact on camera and when surrounded by people? Ultimately we had to understand his personality intrinsically.
Unlike most awkward, introspective scientific geniuses of his generation, he always had a joke for the cameras and revelled in attention. He had a childlike wonder and saw the world very differently to most people. This is what set him apart, and his high level of intelligence was also apparent socially, in the glimpses we caught of his personal life.
What are the most important factors to consider when creating a photoreal CG human?
For VFX this project is unique and groundbreaking: lately we have seen an explosion of digital humans, but not ones you are entirely convinced by. So for us at The Mill, we spent months researching and developing a robust toolset so we could convincingly portray his personality.
To achieve this, we worked with DI4D to employ the most cutting-edge 4D volumetric capture to scan our actor. The most important factor when creating a photoreal CG human is building an animation rig, and interpreting motion data, in a way that maintains high fidelity to the performance or actor you are referencing, so volumetric capture made total sense.
The way the face moves is ultimately the key in all of this. Get this right and the rest is… definitely not easy, but a bit easier to dissect! What makes our system unique is that we are capturing sequences of blend shapes, as opposed to the more traditional method of single linear blend shapes. This allowed us to re-create the most subtle facial performances and elevate our facial workflow with the most intricate detail using our millFACS system.
This system was developed in order to manage complex data obtained from high-resolution expression and performance scanning.
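To make that distinction concrete for readers less familiar with facial rigging, here is a minimal, hypothetical sketch in Python (not The Mill’s millFACS code): a traditional rig adds static per-expression deltas to a neutral mesh, whereas a 4D capture provides each expression as a short sequence of scanned meshes, so the in-between poses are sampled from real performance data rather than a single straight-line blend.

```python
import numpy as np

# Hypothetical illustration only; not The Mill's millFACS implementation.

def linear_blendshapes(neutral, targets, weights):
    """Traditional linear blend shapes: one static target mesh per expression.

    neutral: (V, 3) float vertex positions; targets: dict name -> (V, 3) array;
    weights: dict name -> float in [0, 1].
    """
    result = neutral.copy()
    for name, weight in weights.items():
        result += weight * (targets[name] - neutral)
    return result

def sequence_blendshape(neutral, sequence, t):
    """Sequence-based shape from a 4D capture: the expression is a series of
    scanned frames, so intermediate poses come from real scan data rather than
    a straight interpolation between neutral and a single extreme.

    sequence: (F, V, 3) scanned mesh frames for one expression; t in [0, 1].
    """
    frame = t * (len(sequence) - 1)
    lo, hi = int(np.floor(frame)), int(np.ceil(frame))
    alpha = frame - lo
    sampled = (1.0 - alpha) * sequence[lo] + alpha * sequence[hi]
    return neutral + (sampled - neutral)
```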
Much of the spot was completed during lockdown, how did this change how the team worked?
The global pandemic certainly made everything a bit more challenging!
What was also unique about this project was that, unlike most big film VFX productions, we had to create it in a relatively short time frame and test whilst we worked in production. Not having the luxury of extensive R&D, and also working in lockdown, we had to over-communicate on a daily basis and share regular WIPs.
We eradicated daily meetings in favour of an open dialogue with the team and made sure video channels were always active so we could simulate the same working conditions as though we were in the office!
It’s worth mentioning that the 2D compositing was an incredible feat as well. Seamlessly integrating a CG head onto a live-action body took a lot of painstaking work to get right. In the end we had to age the whole body to match the face; this meant aging the hands and adding more skin detail to the chest and arms. To do this we employed some extra tracking to ‘pin’ the mesh back to the original live-action body, and then a layer of vector tracking on top so that the additional texture work would stick exactly where we needed it to.
It’s surprising how much movement there is in skin and how tricky it is to deal with!
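As a rough illustration of what ‘pinning’ added texture to moving skin involves, here is a small, hypothetical Python sketch (the actual work was done in a compositing package, not like this): a per-pixel motion-vector field derived from tracking is used to warp the aged-skin detail layer so it follows the plate frame by frame.

```python
import numpy as np

# Hypothetical sketch of vector-based texture 'pinning'; not the actual comp setup.
def warp_by_vectors(texture, vectors):
    """Warp a texture layer so it follows tracked skin movement.

    texture: (H, W, C) layer of added skin detail (e.g. age spots, liver marks).
    vectors: (H, W, 2) per-pixel motion vectors from tracking, in pixels,
             mapping each output pixel back to where it samples from.
    """
    h, w = vectors.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Sample the texture at the tracked source positions (nearest-neighbour
    # for brevity; a real comp would use filtered, sub-pixel sampling).
    src_x = np.clip(np.round(xs + vectors[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys + vectors[..., 1]).astype(int), 0, h - 1)
    return texture[src_y, src_x]
```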
Also, typically speaking, in the film world the actor is actually the digital double being re-created, so plenty of reference can be obtained and you can scan them directly. With Einstein, we had to find as many references as we could from the internet and archival searches, and ensure we maintained flexibility in the modelling and rigging process.
This is why our millFACS tool came in handy, as we could quickly iterate on the model of Einstein without totally destroying our face shapes.
The team was also incredible – each artist was truly a specialist in their field. A big shout out to Harsh, Sefki, Maria, Clare and Andreas who were total rockstars on this project.