Look at the guy outside the window from about 1:35. On the non-corrected view he stays in the same position relative to the bars, but in the corrected one he moves relative to the bars.
I love it!
No, I really don’t think it’s any new advances in computing or anything like that. Again, 3D games have shown parallax for decades.
It really is that the modder just put the camera a few inches forward from the pivot point (I had described it as “putting the camera on a sphere”). The video above that @KingGhidorah posted explains it well.
You’ll notice that explanation video also shows an example of the camera being far behind the pivot point — what games call third-person mode. In that mode the camera swings quite widely when the player turns their head, and you see loads of parallax. Again, third-person mode is something that games have had for decades.
The only new thing is, instead of having the camera several feet behind the pivot point, like in third-person mode, it’s now three inches in front of it.
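The “camera on a sphere” idea above can be sketched in a few lines. This is a minimal illustration, not code from the mod: the function name, the pivot height, and the 0.08 m (~3 inch) offset are my own assumptions, and the yaw/pitch convention is an arbitrary right-handed one.

```python
import numpy as np

def camera_position(pivot, yaw, pitch, offset):
    """Place the camera `offset` units along the view direction from the
    neck/pivot point. Positive offset = slightly in front of the pivot
    (the mod's behavior); a large negative offset would be third-person."""
    # Forward vector from yaw (around the up axis) and pitch, in radians.
    forward = np.array([
        np.cos(pitch) * np.sin(yaw),
        np.sin(pitch),
        np.cos(pitch) * np.cos(yaw),
    ])
    return np.asarray(pivot, dtype=float) + offset * forward

# Turning the head 90 degrees sweeps the camera along a small sphere
# around the pivot, which is what produces the extra parallax:
p0 = camera_position([0, 1.7, 0], yaw=0.0,         pitch=0.0, offset=0.08)
p1 = camera_position([0, 1.7, 0], yaw=np.pi / 2,   pitch=0.0, offset=0.08)
```

With `offset=0` the camera just pivots in place, which is the ordinary first-person behavior.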
Yes, you’re exactly right… Since we use the mouse to move our character’s head, we don’t look around the same way as in the real world. Not to mention the lack of blinking, blind spots, and blurry peripheral vision… What we see in our mind’s eye is highly mediated by our brains, which take a sub-par signal and construct a cogent reality. Which all goes to say first-person games are not quite simulators, but more like representations of perception. And so they should be!
That’s what I thought when I watched the video. If this was added to a game it should also have a slider right under FoV to adjust how strong (the radius of the sphere) the effect is.
A properly-done VR implementation in a 6-DOF headset (one with motion tracking) will include this movement - it’s baked in by the headset’s motion tracking system.
Essentially, the headset is continually telling the game where your actual eyes are.
I see that this has been addressed elsewhere a bit without a direct reply, so I’ll add a bit more VR thought to make this post worthwhile.
In general, motion sickness occurs when there is a disagreement between what our vision is telling us about movement and what our inner ears are telling us. On a boat, plane, car, etc., this happens because we see the vehicle as a frame of reference in which we are stationary, but our inner ear is reporting movement.
In VR it works the opposite way: our inner ear tells us we aren’t moving, but our vision tells us we are. For this reason, games generally do not provoke much VR sickness when they have the player stand still, or even walk around in a stationary environment. But if the environment is moving around the player (flying games, free-floating zero-g games like Lone Echo), then things can get queasy fast.
Thanks! Good to know. So, not a solution to the severe nausea I get when trying VR.
Not really surprised. I figured it was mostly my inner ear delivering different data than my eyes were.
You know, I’m just glad that games have evolved this far. I’m so very happy not to still be stuck playing Pong.
I think I like it, but as to whether it’s more “correct”, that is a vexed question.
If you hold your head still and move your eyes, the camera is pivoting about the focal point, as in normal games. But IRL we don’t perceive eye movement, on its own, as a panning of the “camera”. If you move your head, and have two eyes, then this correction is apt, but the correction for each eye is quite different; I assume this is simulating the middle eye which no one actually has.
The problem is that projecting a scene onto a screen doesn’t really correspond to first-person vision to begin with – the viewer has to do a mental translation – and this “perspective correction” is introducing some arbitrary assumptions about how it should work: first, that you are a cyclops, and second, that what is on the screen should correspond to what is on your retina. Neither is “correct”, but people may or may not prefer the result. It’s kinda like using extreme focal lengths in cinematography: not wrong or right, just a choice.
For VR, the focal points of the virtual cameras have to correspond to the focal points of your two eyes for it to work at all. There’s a whole other can of worms there but I don’t want to go on one of my rants about what’s wrong with stereoscopic “3D.”
No hurling.
It just made me think: “oh they’ve used a slightly wider angle lens - maybe a bit too wide”
The cost to do this in a number of game engines is essentially free, and has been since the ’90s. Every time a character moves or changes view direction the camera position needs to be recalculated. The camera view is stored as a 4x4 matrix of values; moving it forward from the normal position would only take multiplying the matrix that results from the existing camera position by an additional static 4x4 matrix, a ‘transform 5 inches forward’ matrix. This would only need to be done once per scene render where the camera has moved. (This is cheap enough that some engines recalculate the camera position every frame.)
For comparison, rendering a single vertex can require multiple multiplications of 4x4 matrices against other 4x4 matrices or 4x1 vectors.
I could see a software renderer like DOOM having some cost to doing this, but ever since around the TNT2 era this has been doable entirely in hardware, making it super cheap.
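The “one extra static matrix multiply” described above can be sketched like this. This is an illustration under assumptions I’m adding, not engine code: it assumes an OpenGL-style column-vector convention where the view matrix maps world space to a view space with the camera looking down −Z, so moving the camera forward shifts the scene by +offset along view-space Z.

```python
import numpy as np

def offset_view(view, forward_offset):
    """Compose the existing 4x4 view matrix with a static 'move the
    camera forward' translation: one extra 4x4 multiply per camera
    update, exactly the cost described above."""
    t = np.eye(4)
    t[2, 3] = forward_offset  # translate along view-space Z
    return t @ view

view = np.eye(4)                     # identity view, for illustration only
shifted = offset_view(view, 0.127)   # ~5 inches, in metres
```

Since `t` never changes, it can be built once and reused every time the camera moves.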
Put me in the “that looks SO much better!” category. Why? Because it more closely matches our physical reality. I can hardly play FPS-type games because they are so vomit-inducing with the twitchy viewpoints (don’t get me started on how my kids play games). I’d like to see some action and fighting sequences, and this should be a check box in the video options of every game!
This is so weird, I was just thinking about this the other day while I was surveying my beautiful spaceship cockpit.
It’s perfect and Frontier desperately needs to do this for Elite: Dangerous.
Nice to see Vin Diesel branching out
I’ve written a lot of camera code in my 25 year career as a game developer, so I can comment a bit on this.
It is neither more nor less realistic, it’s just novel.
Game developers love to talk about creating new effects like this to “better simulate biology”. However, most game developers (myself included) know very little about biology or neurology. The classic example of this is the “walk bob” that first-person games all do. The idea is that our heads bob up and down when we walk, so making the camera do that must seem more realistic, right? The answer is no, because our brains cancel out that motion. Adding it back in artificially just creates a third new “thing” that our brains adapt to, and it’s a common cause of motion sickness in FPS games as well.
This is what’s happening here too. It’s different so you get excited, but your brain compensates for any difference between how your head moves and where your eyes are. The idea that your neck is a single pivot and your eyes are a simple offset from that is a ridiculous oversimplification of how vision works anyway.
None of this matters, because the bottom line is that you’re viewing a 2D projection of a 3D environment on a small screen. You have no depth perception, and your FOV is completely wrong. All bets are off for “simulating how your brain really sees the world”. That’s just tech bro hubris for how they think human visual systems work.
Instead, what happens is that the fish-eye lenses and bobbing walks in these games have become a visual language for games that is separate from realism. Kinda like how explosions in movies are always orange fireballs (real explosions are colorless shockwaves with some dust) or how eagles always sound a certain way (which is actually a red-tailed hawk). People have come to expect these things in games and like them, but they aren’t “real”. They’re just “how games look”. Whether you like the look more or less is not “realism”. Immersion and realism are also separate things, remember.
This wouldn’t be branching out for him as he founded a game development studio almost twenty years ago.
Yeah, seems exaggerated, as if the neck projects about two feet from the pivot. Pierson’s puppeteer?
They’re in the video description. Copied here:
Doom 2016 has the opposite problem, it feels like my eyes are some 2 feet ahead of my body. I keep bumping into walls when I go around corners because my body is so far back.
But VR movies have this problem big time – looking around always feels a bit off. And that’s on top of the fact that your viewpoint doesn’t change at all if you move around. They actually make me a little dizzy, especially if I’m standing.
OTOH, VR games don’t have this problem, your “eyes” move very naturally pretty much for free due to the way the headset is tracked.
Edit: And yes, the adjusted parts of the video are indeed more beautiful, but not because they’re “better” or “more natural”, but simply because you can see the depth that you know should be there. Moving a camera is much more powerful than just zooming.
Thank you so much for posting this. I totally get it now. This was very helpful.