Originally published at: https://boingboing.net/2020/02/20/how-the-mandalorian-was-shot-o.html
If only they’d had a good story to shoot with that wonderful tech. Like Firefly.
As of last summer the question “When will they bring back Firefly??” was legally old enough to get married in 42 states. Time to move on.
Not to mention that the movie (now there is an example of a god-awful story) pretty much ended any chance of there being more to tell.
UE4 can produce some pretty phenomenal visuals, so it doesn’t surprise me they’ve started using it as a VFX platform—even just for pre-production and on-set action that’ll be replaced with final work later. The Weather Channel’s been using similar realtime-render-to-broadcast stuff for a couple of years now, IIRC.
The camera-oriented parallax must be hell to look at for the actors, though.
I have it on good authority that the Lucasfilm folks really wanted to film a lot of season 2 on site in the “village” area of Disneyland’s “Star Wars: Galaxy’s Edge” (AKA Star Wars Land) but Disneyland management politely turned them down due to the logistical issues with needing to have it available for paying guests. Oh well. Maybe we’ll see a fan film secretly filmed on site a-la Escape From Tomorrow.
That’s … a whole lot of LEDs.
Now, when can I get an equivalent, but much smaller, setup to watch it on?
I’ve lost count of the number of times I’ve been shot on a virtual stage using the Unreal engine.
Usually it’s this one:
This is amazing. I’m constantly blown away by the increasing overlap between video game and movie/television production tools. 20+ years ago, when I imagined how film and games might start to merge, most of this was unimaginable - and if someone had described it to me, I would have thought it too far-fetched.
I wonder if Unreal gets used much for the final rendering, too. Seems like the focus is on creating a pipeline for taking assets from the tools where they might get their final render into Unreal just for the real-time aspect, but game engines like Unreal and Unity now have increasingly impressive offline rendering and compositing. I’ve certainly seen some photo-realistic renders from those engines.
Well, it’s only where the camera is pointing, so as long as the actors aren’t in frame with the backs of their heads being filmed, they might not even get a glimpse of it. Hell, they might not see any of it when they’re facing the camera or the other gaps in the screens. Though I wonder if, in general, the backgrounds are distracting for them, given they’re not intended for the actors’ benefit…
Earlier this year, I read some comment from a Fox exec idly musing about being “open” to the possibility of bringing back Firefly in some form. It was a whole lot of nothin’, but oh, the fevered speculation it caused… (and the absurdity of it - most of the actors are very busy, so it would be, at best, a couple random characters from the original show shoehorned into a drama with a new cast).
I’d be ok with “other tales from the 'verse” but Firefly itself is good and dead.
As am I. I had never even considered the potential of using a game engine for set design and lighting, and given how realistic those shots were in The Mandalorian, it’s clearly a technology that is already mature. How far we’ve come since the days of Asteroids.
Can’t wait to see the Mandalorian on Netflix.
/s
Until watching this video I hadn’t considered how difficult it must be to get the reflections right on an actor’s shiny armor if the scene was filmed entirely with the older green screen technology. Given that the main character of this show is pretty much reflecting his surroundings at all times, I imagine this new technique really pays off.
I just looked it up, and apparently a lot of the Captain Phasma shots in Episodes VII & VIII had to be re-done in CGI specifically to remove the green screen reflections.
“Moooo!”
Atomic Cow was my all-time fave. She was the headshot champion.
We always need more Summer Glau