With an impressive proof of concept, Dutch animator Jim Derks has demonstrated on Reddit, among other places, what can already be achieved with open-source AI and commercially available technology.
He digitally rejuvenated Harrison Ford in several scenes of the movie "Cowboys & Aliens". Unfortunately, he has not (yet?) disclosed an exact workflow, but the result is a combination of Stable Diffusion with ControlNet and EBSynth. In addition, it is very likely that Blackmagic Fusion's denoising and Magic Mask tools were used, as they played a similar role in a previously disclosed workflow.
According to Jim Derks, the process took just 20 minutes of processing time per scene, though he gives no specific details about the hardware used.
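The EBSynth part of such a pipeline keeps processing time low by stylizing only a handful of keyframes with Stable Diffusion and propagating their look to the in-between frames. A minimal sketch of that keyframe selection step (hypothetical function names and interval, plain Python, not Derks' actual workflow):

```python
def pick_keyframes(n_frames, interval):
    """Choose every `interval`-th frame as a keyframe to be stylized by the AI."""
    return list(range(0, n_frames, interval))

def nearest_keyframe(frame_idx, keyframes):
    """Map an in-between frame to the keyframe whose stylized look it inherits."""
    return min(keyframes, key=lambda k: abs(k - frame_idx))

keys = pick_keyframes(120, 30)      # e.g. a 5-second clip at 24 fps
print(keys)                          # [0, 30, 60, 90]
print(nearest_keyframe(40, keys))    # 30 -> frame 40 borrows keyframe 30's style
```

Because only four frames out of 120 need a full diffusion pass here, the per-scene processing time stays in the range of minutes rather than hours.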
Stable Diffusion can turn old into young - even in film.
The result is by no means perfect yet, but it is already good enough that a large portion of potential viewers would not necessarily notice the fake. Compared with many other current AI applications for moving images, this demo video stands out for the temporal consistency of its result. The younger face already "sits" impressively firmly on its older original; disturbing glitches are hard to spot without pixel peeping.
Above all, this example demonstrates one thing: the easy replacement of people and objects in videos is just around the corner, and not just for a few Hollywood studios, but for anyone with access to some GPU processing power.
The quality will eventually be near-perfect for home users as well, and we'll soon see almost foolproof ways to make changes to clips in the timeline via voice and range selection: simply select the object you want and say or write what should change. Soon it won't take much more than that.