It's important to distinguish between traditional post vfx and in-camera vfx, which has come into fashion in recent years.
In-camera vfx means that the final CGI is already present in the scene when it's shot. This is usually accomplished with giant LED screens. Typically the engine that runs these screens is Unreal.
One major advantage is that the cinematographer's main job, lighting design, gets easier compared to green screen workflows. The LED screens themselves are meaningful light sources (unlike traditional rear projection), so they contribute correct light rather than green spill which would have to be cleaned up in post.
The downside of course is that the CGI is nailed down and is mostly very hard to fix in post. I suppose that's what Gore Verbinski is criticizing — for a filmmaker, the dreaded "Unreal look" is when your LED screen set has cheesy realtime CGI backgrounds and you can't do anything about it because those assets are already produced and you must shoot with them.
The transition between the real set and the virtual set is usually very obvious. The LED volume content really should be completely replaced in post, with the screens used mainly for accurate lighting and actor reference.
Every producer in the industry is looking to cut costs wherever they can (at least here in Europe).
They’ll happily settle for “looks good enough for viewers who are distracted by their phones anyway” if it means the post budget item goes away completely.
The LED screen thing is so absurd that for a long time I assumed they just replace the content in post somehow and its purpose is merely to aid in lighting and for actors to orient better in the scene.
At least on The Mandalorian this is what happened. Everything behind the actors in the camera's frame would be green, while the rest of the volume was used as a low-res lighting reference for the scene. So essentially it would be a moving green screen. The Unreal output was never directly used in the finished show.
I guess current pipelines depend a lot on chroma key for the matte, so isolating the actors cheaply might be hard with such complex backgrounds? Seems like it might not be long until we can automate that in such a controlled environment, though.
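To make the chroma key point concrete, here is a minimal sketch of the naive distance-from-key-colour matte the comment above is gesturing at, written in plain C++ purely for illustration; the function name and the tolerance/softness parameters are invented for this example, and production keyers are far more sophisticated than this.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// One RGB pixel, 0-255 per channel.
struct Pixel { uint8_t r, g, b; };

// Very rough green-screen key: alpha goes to 0 (background) when a pixel is
// close to the sampled key colour and to 1 (foreground) when it is far away.
// keyR/keyG/keyB is the sampled screen colour; tolerance and softness are in
// the same 0-255 distance units and would be tuned per shot.
std::vector<float> chromaKeyMatte(const std::vector<Pixel>& image,
                                  uint8_t keyR, uint8_t keyG, uint8_t keyB,
                                  float tolerance, float softness) {
    std::vector<float> alpha(image.size());
    for (size_t i = 0; i < image.size(); ++i) {
        float dr = float(image[i].r) - keyR;
        float dg = float(image[i].g) - keyG;
        float db = float(image[i].b) - keyB;
        float dist = std::sqrt(dr * dr + dg * dg + db * db);
        // Linear ramp from background to foreground around the tolerance radius.
        alpha[i] = std::clamp((dist - tolerance) / softness, 0.0f, 1.0f);
    }
    return alpha;
}
```

The catch is that this only works when everything behind the actor is one flat colour; with full LED-wall imagery behind them there is no single key colour to subtract, which is presumably why the moving green patch described above keeps the matte cheap.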
I don’t see why it’s so absurd, with how cheap display tech has become recently. Ambitious, maybe, but it seemed to work pretty well in The Mandalorian.
It's a video game engine. It's got a ton of optimisations and tweaks to make it run in realtime, but if you're making a movie there's no reason not to spend hours rendering each frame. You don't need to optimise meshes at a distance, use real-time raytracing with noise reduction rather than simulating a thousand bounces, or limit yourself to 4K textures, and you can use as many particle effects and simulations as you'd like. You can't do this with a game engine though - Unreal does have the ability to render out video, but it's not going to be the same fidelity.
I didn't think they were actually using the video straight out of the Volume though - my assumption was they'd just use it to make sure the lighting reflected on to the actors nicely and then redo the CGI elements with something else.
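On the realtime-versus-offline point above, here is a toy C++ example (nothing to do with Unreal or any production renderer; the integrand and the sample budgets are made up) of why a realtime budget of a few samples per pixel needs aggressive denoising while an offline budget of thousands of samples does not: Monte Carlo error falls roughly with the square root of the sample count.

```cpp
#include <cmath>
#include <cstdio>
#include <random>

// Toy stand-in for the light arriving at one pixel: estimate the integral of
// sin(pi*x) over [0,1] (true value 2/pi) by Monte Carlo sampling, and watch
// the error fall as the sample budget grows.
int main() {
    const double kPi = 3.14159265358979323846;
    const double truth = 2.0 / kPi;

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uniform(0.0, 1.0);

    const int budgets[] = {1, 4, 64, 1024, 16384};
    for (int samples : budgets) {
        double sum = 0.0;
        for (int i = 0; i < samples; ++i) {
            sum += std::sin(kPi * uniform(rng));
        }
        double estimate = sum / samples;
        // Error shrinks roughly as 1/sqrt(samples): a few samples per pixel
        // stay noisy (hence denoisers), thousands per pixel do not.
        std::printf("%6d samples: estimate %.4f, error %.4f\n",
                    samples, estimate, std::fabs(estimate - truth));
    }
    return 0;
}
```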
> At the end, movies are about the stories, not just pretty graphics.
The great people at Pixar and DreamWorks would be a bit offended. Over the past three or so decades they have pushed every aspect of rendering to its very limits: water, hair, atmospheric effects, reflections, subsurface scattering, and more. Watching a modern Pixar film is a visual feast. Sure, the stories are also good, but the graphics are mind-bendingly good.
>> It's got a ton of optimisations and tweaks to make it run in realtime, but if you're making a movie there's no reason not to spend hours rendering each frame.
That's how it's used though? It only runs in real time for preview, but the final product is very much not rendered in real time at all. Obviously it's a very flexible tool that works with what you need and what you can afford - Disney runs it on their compute farm and can throw the resources at it to render at the fidelity required. But there are plenty of production houses which don't have those kinds of resources and have to make do with less. But then you wouldn't expect Pixar's own pipeline to work in those circumstances, would you?
>> Unreal does have the ability to render out video but it's not going to be the same fidelity.
I really encourage you to look into what's possible with UE nowadays. Custom made pipelines from Pixar or Dreamworks are better still, of course, but UE can absolutely stand next to them.
The problem is the way surface lighting/scattering is calculated, which does not match what traditional offline renderers do.
My issue with UE is the opposite: the engine went too far into cinema production, and making it a performant game engine requires code refactoring. At which point an open-source engine might be a better choice. It's a mix of two (three) worlds, and not the best choice for one specific use.
For what is actually hard to do, like character animation, UE is a good choice. The lighting can be replaced more easily than the animation system.
Money and deadlines are the real answer. VFX companies, even more so in recent years, are squeezed for time and budget by studios. Unreal Engine allows for fast iteration on CGI/VFX, so it dramatically reduces the time to make them, especially when the director changes their mind every Tuesday. It is the consequence, not the cause.
If every studio was willing to spend Michael Bay money on CGI, it wouldn't be a problem.
All of that SFX budget is worthless without deliberate art direction. Most modern blockbusters look bland and busy. The scale of spectacle that computer graphics allow for just doesn't "WOW" any more. It's a shame that this is what the movie industry has come to.
Visual noise from CGI has been a real problem since at least Transformers in 2007. That's my benchmark for it, the one where I really first remember the overwhelm being a distraction. "Just because you can, doesn't mean you should" is a lesson that keeps needing to be relearned.
> Nowadays, movie fans seem much less impressed by CGI in films. There's a general distaste for a perceived overuse of CGI in favor of practical effects, and there are a lot of complaints that recent CGI is less-convincing and more fake-looking than it used to be, even in the biggest budget films.
Funny it says this right after mentioning Jurassic Park. I, an avid JP fan who was blown away by the movie (and the book) as a dino-obsessed teenager, always thought it was the non-CGI dinos that didn't look that realistic (even if the "puppets" were fantastically done, it was more about the movement/animation). Although we have to keep in mind they used those mostly for close-up shots where CGI would've looked even worse.
This is the next evolution of the "My film does not use CGI" sneering. Sure, doing proper pre-rendered VFX with photo-realism is great and also people doing it love it. But can it be done on the budgets/fixed bids/turnarounds when the producer comes with "...and all of that will be a full virtual set and it should be streaming next Monday morning", for peanuts?..
If it's Gore saying it - maybe he should talk to his producers then, and ask them whether they actually have budgeted the "proper" VFX talent/timelines for the show. He has creative control - the people doing the work do not.
I'm really confused by that take.
If you watch the Corridor channel on YouTube, you can catch a lot of instances where Unreal is treated as a draft, or as the on-set reference, and almost always gets replaced before the final is shipped. Something doesn't add up here.
Having watched a great deal of Andromeda, Star Trek, and Hercules/Xena growing up, I would submit that weak video effects can be perfectly fine as long as the actors take them seriously enough.
Is anyone out there using Unreal for in-camera character/creature animation? My production experience with Unreal has been solely limited to background and lighting support, for which it is excellent.
Good explanation, and I also wondered why many of the CGI effects today are so unbelievably bad - and worse than decades ago.
It still doesn't explain why it is done:
• why do directors and producers sign off on effects that are just eye-bleeding bad?
• using a realtime engine to develop the effects doesn't preclude a real render-pass at the end to get a nice result instead of "game level graphics". A final render-pass surely can't be so expensive that ruining the movie is preferable? If a render-farm could do it 20 years ago, it can't cost millions today, can it?
The reason is it's a hell of a lot cheaper and easier to work with, and in general enables things to be done that would otherwise be cost prohibitive.
(And AFAIK they usually do a non-realtime run, but a high-end render going for maximum photorealism also requires a whole different pipeline for modelling and rendering, which would blow the budget even more)
I feel like there's a strong rose-tinted-glasses effect happening here. The early 2000s were especially full of absolutely dreadful CGI and VFX in almost every film that used them, unless you were Pixar, DreamWorks, or Lucasfilm. I can give you almost countless examples of this.
The only thing that has changed is that it's now easier than ever to make something on a cheap budget, but this absolutely used to happen 20-30 years ago too; horrible CGI was the standard, not the exception.
> • why do directors and producers sign off effects that are just eye-bleeding bad?
It's a bit cheaper.
> • using a realtime engine to develop the effects, doesn't preclude using some real render-pass at the end to get a nice result instead of "game level graphics".
It's probably a bit expensive, in effort or in processing time.
In both cases you aren't ruining a movie. You're just making it more mediocre. People rarely leave the cinema because the CGI is mediocre.
Gore Verbinski directed a film trilogy with absolutely impeccable art direction and possibly the best special effects in the history of film (they're also, subjectively, some of my favourite films of all time).
(-_-) You can CHANGE the light reflection algorithm in Unreal Engine from Lambert to Oren-Nayar. Google "unreal engine lambert to oren nayar light reflection": you have to modify files, but it's doable.
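For anyone curious what that swap actually involves, here is the standard Oren-Nayar diffuse term written as standalone C++ rather than UE's actual HLSL shader source; the names and structure below are illustrative only, not the engine's. With sigma = 0 it collapses back to plain Lambert.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Oren-Nayar diffuse term (qualitative model).
// n: surface normal, l: direction to light, v: direction to viewer (all unit length)
// sigma: roughness in radians, albedo: diffuse reflectance
float orenNayarDiffuse(const Vec3& n, const Vec3& l, const Vec3& v,
                       float sigma, float albedo) {
    const float PI = 3.14159265f;
    float nl = std::clamp(dot(n, l), 0.0f, 1.0f);
    float nv = std::clamp(dot(n, v), 0.0f, 1.0f);
    if (nl <= 0.0f) return 0.0f;

    float thetaI = std::acos(nl);            // angle between normal and light
    float thetaR = std::acos(nv);            // angle between normal and viewer
    float alpha  = std::max(thetaI, thetaR);
    float beta   = std::min(thetaI, thetaR);

    // Cosine of the azimuthal angle between light and view directions,
    // computed by projecting both onto the surface's tangent plane.
    Vec3 lProj = { l.x - n.x * nl, l.y - n.y * nl, l.z - n.z * nl };
    Vec3 vProj = { v.x - n.x * nv, v.y - n.y * nv, v.z - n.z * nv };
    float lLen = std::sqrt(dot(lProj, lProj));
    float vLen = std::sqrt(dot(vProj, vProj));
    float cosPhiDiff = (lLen > 1e-6f && vLen > 1e-6f)
        ? dot(lProj, vProj) / (lLen * vLen) : 0.0f;

    float s2 = sigma * sigma;
    float A = 1.0f - 0.5f * s2 / (s2 + 0.33f);
    float B = 0.45f * s2 / (s2 + 0.09f);

    // sigma = 0 gives A = 1, B = 0, i.e. plain Lambert: albedo/pi * cos(thetaI).
    return (albedo / PI) * nl *
           (A + B * std::max(0.0f, cosPhiDiff) * std::sin(alpha) * std::tan(beta));
}
```

In the engine itself this kind of math lives in the shader source, so changing it means editing files and rebuilding shaders rather than flipping a setting.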
VFX is just too damn expensive. It will get much worse with AI tools taking hold. Once these are 80% there but 10x cheaper, they will be (over)used everywhere, despite delivering clearly inferior results.
tl;dr: video game engines are replacing Maya. They (Unreal) don't have the properties (photorealism, or the lack of it) that mesh well with movies, and if overdone they produce an uncanny-valley effect. Going from Maya to game engines is a step back.
> In-camera vfx means that the final CGI is already present in the scene when it's shot. This is usually accomplished with giant LED screens.
Does this happen often? Are there any examples?
https://techcrunch.com/2020/02/20/how-the-mandalorian-and-il...
> They’ll happily settle for “looks good enough for viewers who are distracted by their phones anyway” if it means the post budget item goes away completely.
https://m.youtube.com/watch?v=gUnxzVOs3rk
> I didn't think they were actually using the video straight out of the Volume though - my assumption was they'd just use it to make sure the lighting reflected on to the actors nicely and then redo the CGI elements with something else.
Say you're making children's videos like Cocomelon or Bluey in 3D, you don't need all these nice things.
> At the end, movies are about the stories, not just pretty graphics.
People don't pay 45 eurodollars for IMAX because they like the story.
> Gore Verbinski directed a film trilogy with absolutely impeccable art direction and possibly the best special effects in the history of film
He knows what he's talking about!
Once the slop starts at the very basics, it's only natural that it extends to the CGI as well.
> you have to modify files, but it's doable
The requirement to recompile the engine makes this feature nonexistent for a film crew.