Inside Beat

'Star Wars' fans love CGI Luke Skywalker, but deepfake implications are dangerous

An eerily accurate CGI construction of Mark Hamill appears in "The Mandalorian" and "The Book of Boba Fett," prompting questions about what deepfakes could be used for in the future. – Photo by Mandalorian

Episode 6 of “The Book of Boba Fett,” released in early February, included a surprising guest appearance: a young Luke Skywalker in all of his CGI glory.

For those of us who don’t have Luke and Princess Leia action figures in our bedrooms, and who think Reylo is an unfortunate combination of letters rather than an epic OTP (one true pairing) name, it’s important to consider that the appearance of young Luke is more than a little concerning. Why? Mark Hamill, the actor who plays Luke, is 70 years old yet appeared to be about 28 in the new spin-off.

That’s right, folks: A young version of a beloved character was re-created with CGI because the real actor who plays him is too old to play a young person. So how and why did a young Luke appear on the screens of millions of "Star Wars" viewers? And why might that be disconcerting?

CGI Luke is a deepfake: a form of manipulated media created using software and artificial intelligence, or AI, to fabricate something that never happened while still relying on some element of truth.

An example of a deepfake is a video of President Joseph R. Biden Jr. uncomfortably lolling his tongue, which was retweeted by former President Donald J. Trump in April 2020 (back when he still had a Twitter account). Biden never moved his tongue that way, but he is a real person who was eerily made to look as if he had done something he hadn’t.

Although exciting to some and jarring to those more concerned about deepfakes, the CGI magic in “The Book of Boba Fett” isn’t the first time a deepfake Luke has appeared in "Star Wars" lore. In the season two finale of another spin-off, “The Mandalorian,” the computer construction also surprised viewers with a cameo appearance.

Someone who worked on the deepfake in “The Mandalorian” revealed how the first CGI Luke cameo was done. “They effectively reproduced a de-aged version of Hamill for the shots by combining the texture from his face and also (the body double's) younger face,” Richard Bluff, a visual effects supervisor at Industrial Light & Magic, told Esquire.

Even the voice of CGI Luke isn’t Hamill’s. It was created with a voice synthesizer application called Respeecher. Esquire also reports that a neural network was fed material of Hamill from that era and was able to learn and recreate his voice.

Deepfake technology is no doubt a neat addition to the storytelling toolbox. If the decaying flesh bags known as actors are no longer needed for visual storytelling at all, imagine the limitless spin-offs that could be created!

But the creation of such intensely real-feeling deepfakes calls into question the ethics of developing this technology in the first place.

One of the most pressing concerns that deepfake technology poses is a threat to credibility. And no, not just the credibility of movies: with technology that can literally rewrite history, deepfakes can distort what “real” even means.

There are those who argue — nihilists among them — that we each experience our own realities based on our past experiences and personal beliefs. What may be true for you may not be true for the person sitting next to you in class. You also probably won’t experience class the same way that your professor does or even in the same way that your peers do.

But because media is a pervasive, omnipresent force, and social media only amplifies its impact, reality is becoming even more subjective. The media a person consumes can lead them to believe something completely different from someone living a different life, or a life off the grid.

Can there truly be a shared material reality, which all of empiricism and most philosophy rely upon, if deepfakes disturb the legitimacy of what we consider real?

And what are the repercussions of faking a politician to say something they never did? Or an iconic person from history? How will democracy be affected if doctored videos can be used as a dangerous propaganda tool? 

But there’s also more to deepfakes than a threat to reality as we understand it now. 

The documentary “Roadrunner,” released in theaters last summer, was about Anthony Bourdain’s life. Using hours of footage shot before his tragic suicide, filmmaker Morgan Neville created what many have called a beautiful tribute to Bourdain.

But there was a major ethical dilemma: Neville used AI to create recordings of lines Bourdain never actually said aloud. Although Bourdain had written those words, many took issue with virtually resurrecting someone for a movie to say things he never said while alive, especially without a disclaimer.

The ultimate Foucauldian question: Will the dead be permitted to die in the future? 

Perhaps establishing some shared truths could be a first line of defense against deepfakes. Hamill was not, in fact, “de-aged,” as an IndieWire headline suggests. Hamill has accumulated 70 precious years of life: he did not act in "The Mandalorian" or "The Book of Boba Fett" and get digitally de-aged. Instead, a digital version bearing his likeness was created.

Another truth that should be shared: The dead, like Bourdain, should be allowed to die without the threat of virtual resurrection.

Media theorists like Guy Debord discuss how media shapes our experience of reality. Debord’s work wrestles with how the media representation becomes even more important, more viewed, more real than the actual thing itself. All of life becomes a distillation of a distillation of a distillation.

Debord put it best: “Everything that was directly lived has moved away into a representation.”

And now, the future of media might look like a distortion of Debord’s prescient analysis. Everything that was directly lived has moved into a computer-generated representation of a representation.

