VIRTUAL ABBA

My earlier Michael Jackson project, in which a virtual MJ posthumously sang and danced to an unreleased song "live" at the 2014 Billboard Music Awards, caught the eye of music producer Simon Fuller and the band ABBA. They asked me to recreate the band as their younger virtual selves for a music video, complete with dialogue and song.

Through a process developed and refined over the ten years leading up to this project, we codified the individual persona of each band member and then used machine learning tools to train their facial rigs. The finished video contains eight minutes of novel performance by all four ABBA characters, holding a conversation and singing a new ABBA song. The technique reinforced our pursuit of likeness through anatomically correct facial muscle, eye, and mouth models, and physically based lighting and shading in Arnold let us render realistic skin, eyes, teeth, and hair. This is not a deepfake; it is original character content.
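As a loose illustration of the rig-training idea (a minimal sketch, not our actual pipeline; every name, dimension, and value here is a hypothetical stand-in), one can picture a small persona-conditioned network that maps per-frame performance features to blendshape weights on a facial rig:

```python
# Hypothetical sketch: a persona-conditioned model that predicts facial-rig
# blendshape weights from per-frame performance features (audio, phonemes,
# emotion tags). Names, dimensions, and data are illustrative only.
import torch
import torch.nn as nn

FEATURE_DIM = 64       # per-frame performance features (assumed)
PERSONA_DIM = 16       # "codified persona" embedding per band member (assumed)
NUM_BLENDSHAPES = 120  # rig controls on the digital double (assumed)

class PersonaRigDriver(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEATURE_DIM + PERSONA_DIM, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, NUM_BLENDSHAPES),
            nn.Sigmoid(),  # blendshape weights constrained to [0, 1]
        )

    def forward(self, features, persona):
        # Condition every frame on the same persona embedding, so the
        # same input performance reads differently for each band member.
        return self.net(torch.cat([features, persona], dim=-1))

model = PersonaRigDriver()
persona_agnetha = torch.randn(1, PERSONA_DIM)  # stand-in embedding
frame_features = torch.randn(1, FEATURE_DIM)   # stand-in features
weights = model(frame_features, persona_agnetha)
print(weights.shape)  # torch.Size([1, 120])
```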

To recruit the specialized artist and technical talent scattered across the world, we established a virtual-musician production company of more than 50 people and built a cloud-first production pipeline, including an asset management system that offered a continuous interface to both in-house and remote artists. This new business model relies on neither local servers nor local artist workstations, effectively decentralizing staffing, asset, and production management.
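A cloud-first check-in/check-out flow of the kind described above might look something like the following minimal sketch, assuming an S3-style object store; the bucket name, key layout, and versioning scheme are hypothetical, not the production system itself:

```python
# Hypothetical sketch of a cloud-first asset check-in/check-out flow over an
# S3-style object store. Bucket name, key layout, and versioning scheme are
# assumptions for illustration only.
import boto3

BUCKET = "virtual-abba-assets"  # hypothetical bucket

s3 = boto3.client("s3")

def check_in(local_path: str, asset: str, version: int) -> str:
    """Publish a new version of an asset so any remote artist can pull it."""
    key = f"{asset}/v{version:03d}/{local_path.rsplit('/', 1)[-1]}"
    s3.upload_file(local_path, BUCKET, key)
    return key

def check_out(key: str, local_path: str) -> None:
    """Pull a published asset version down to a remote workstation."""
    s3.download_file(BUCKET, key, local_path)

# Example: an artist in one time zone publishes a groom, another pulls it.
# key = check_in("agnetha_hair_groom.abc", "agnetha/hair", 12)
# check_out(key, "/tmp/agnetha_hair_groom.abc")
```

Because every published version lives in the object store rather than on any local server, the same interface serves in-house and remote artists alike.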

The completed video is stuck in a legal quagmire and probably will never be released.

Full digi head, neck, and hair puppeteered by the codified persona of virtual Agnetha delivering an original eight-minute performance.

We have now established that it is possible to install a unique individual persona into a believably realistic digital double. While this is, indeed, legacy building, the question is whether there is something meaningful to preserve or whether it is just an exercise in vanity. Possibly both.