IGN: After 1 year; where art thou, ray traced games?

Just like with Lumen, triangle intersections give you the most accurate result. All of the other representations (voxels, SDFs) are lower resolution. I would expect that as geometry gets more detailed we will rely on triangle ray tracing more, not less.
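
To make that concrete, here's a minimal sketch of the kind of ray-triangle test (Möller-Trumbore style) that triangle tracing bottoms out in; the Vec3 type and names are illustrative, not from any particular engine. The point is that a hit against a triangle is exact up to floating point, because the triangle is the actual surface rather than a voxel or distance-field approximation of it.

```cpp
// Minimal Möller-Trumbore ray/triangle intersection (illustrative sketch).
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns the ray parameter t of the hit, or nothing on a miss.
std::optional<float> intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    const float kEps = 1e-7f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return std::nullopt;  // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(orig, v0);
    float u = dot(s, p) * inv;                       // first barycentric coord
    if (u < 0.0f || u > 1.0f) return std::nullopt;
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;                     // second barycentric coord
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    float t = dot(e2, q) * inv;
    return t > kEps ? std::optional<float>(t) : std::nullopt;
}
```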

For voxels, I would imagine that accuracy is bounded by memory consumption. The accuracy of SDF representations depends on the number of functions used to represent the models. There's no theoretical limit for either of these geometric representations, so from an accuracy perspective they're just as viable as triangles, and they are in fact the preferred representation for heterogeneous participating media at high-fidelity film production rendering studios like Disney ...
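
As a toy illustration of the "accuracy scales with the number of functions" point: an SDF model is just a composition of distance functions, and you can keep unioning in more primitives to fit a shape as tightly as you like, paying only in memory and evaluation time. This is a hypothetical sketch, not how Lumen or any production renderer actually stores its fields.

```cpp
// Illustrative sketch: an SDF "model" as the union of primitive distance
// functions. Adding more primitives tightens the fit to the target shape;
// there's no theoretical ceiling, only memory and evaluation cost.
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
struct Sphere { Vec3 c; float r; };

// Signed distance from point p to a single sphere primitive.
float sphereSdf(const Sphere& s, Vec3 p) {
    float dx = p.x - s.c.x, dy = p.y - s.c.y, dz = p.z - s.c.z;
    return std::sqrt(dx*dx + dy*dy + dz*dz) - s.r;
}

// Union of all primitives: the minimum of their distances.
float modelSdf(const std::vector<Sphere>& prims, Vec3 p) {
    float d = 1e30f;
    for (const Sphere& s : prims) d = std::min(d, sphereSdf(s, p));
    return d;
}
```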
 
For lighting, you also don't need too much accuracy. It's only shadows where you need a closer mapping to the geometry, and even then lower-resolution proxies are very serviceable.
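
For instance, the classic sphere-tracing soft-shadow trick only needs a distance field that's roughly right, which is part of why coarse proxies hold up so well for shadows. A minimal sketch, assuming some SDF evaluator is available (nothing engine-specific):

```cpp
#include <algorithm>
#include <functional>

struct Vec3 { float x, y, z; };

// Classic sphere-traced soft shadow (the min(k*d/t) penumbra estimate):
// march the shadow ray through the distance field and darken based on how
// close the ray ever passes to the surface. The field only has to be
// approximately right, so a low-resolution proxy is serviceable here.
float softShadow(Vec3 ro, Vec3 rd, float tMax, float k,
                 const std::function<float(Vec3)>& sdf) {
    float shade = 1.0f;
    float t = 0.02f;                              // start bias off the surface
    while (t < tMax) {
        Vec3 p{ro.x + rd.x * t, ro.y + rd.y * t, ro.z + rd.z * t};
        float d = sdf(p);
        if (d < 1e-4f) return 0.0f;               // fully occluded
        shade = std::min(shade, k * d / t);       // penumbra estimate
        t += d;                                   // sphere-tracing safe step
    }
    return shade;
}
```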
 

Reflections? Also, with shadows, lower-resolution tracing produces self-occlusion artifacts. See the Activision presentation.
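
(The usual band-aid, sketched below with hypothetical names rather than Activision's actual fix, is to bias the shadow-ray origin off the true surface by however far the proxy can deviate from it; it trades self-occlusion for some detached-shadow artifacts.)

```cpp
// Illustrative mitigation for proxy self-occlusion: offset the shadow-ray
// origin along the shading normal by the proxy's worst-case deviation
// (e.g. half a voxel, or one SDF cell), so the ray starts outside the
// coarse surface. The bias value is scene-dependent; this is only a sketch.
struct Vec3 { float x, y, z; };

Vec3 shadowRayOrigin(Vec3 surfacePos, Vec3 shadingNormal, float proxyErrorBound) {
    return {surfacePos.x + shadingNormal.x * proxyErrorBound,
            surfacePos.y + shadingNormal.y * proxyErrorBound,
            surfacePos.z + shadingNormal.z * proxyErrorBound};
}
```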
 
Okay, I missed reflections, but they are a really small part of realistic rendering. Most reflections aren't glass-like, and even those that are tend not to be a major point of focus. You can get away with low-resolution proxy reflections in a beautifully lit environment without it being particularly noticeable.
 

I'm sure there's a guy on YouTube showing a tech demo using voxels for reflections.

And I'm sure The Tomorrow Children on PS4 did reflections that way.
 

Hopefully we will have UE5 games soon and can get a first-hand look. Lumen has some very significant limitations to make its SDF tracing work, and even then it falls back to triangle RT for the hard stuff. It remains to be seen whether those limitations matter in shipping games.
 
I see it like this: whatever form of ray tracing is done, dedicated hardware will generally always be much faster, without eating into the compute capabilities of the GPU. I believe Intel's new GPUs are oriented the same way.
 
Indeed, though the argument is really around flexibility versus performance. Without the same level of investment in competing ideologies, we'll never have a fair comparison. Maybe, for example, Larrabee was the ideal future, but because it couldn't compete on workloads designed for existing architectures, it was an evolutionary dead end? Kinda like the QWERTY keyboard - other keyboards are objectively better (because QWERTY was designed to slow down typing!), but because QWERTY is everywhere, it's the system everyone learns to use, and so the one that sticks.

As it is now, the GPU shader model is being pushed into programmability and flexibility it really wasn't designed for at its initial conception of tiny, linear programs, bringing with it a host of issues that have to be worked around instead of working with an ideal design.

This is not a stealth "Sony should be using Cell" post
 
Kinda like the QWERTY keyboard - other keyboards are objectively better (because QWERTY was designed to slow down typing!), but because QWERTY is everywhere, it's the system everyone learns to use, and so the one that sticks.
Actually I seem to recall that's actually a myth

From Wikipedia:
Research on efficiency
The Dvorak layout is designed to improve touch-typing, in which the user rests their fingers on the home row. It would have less effect on other methods of typing such as hunt-and-peck. Some studies show favorable results for the Dvorak layout in terms of speed, while others do not show any advantage, with many accusations of bias or lack of scientific rigour among researchers. The first studies were performed by Dvorak and his associates. These showed favorable results and generated accusations of bias.[36] However, research published in 2013 by economist Ricard Torres suggests that the Dvorak layout has definite advantages.[37]

In 1956, a study with a sample of 10 people in each group conducted by Earle Strong of the U.S. General Services Administration found Dvorak no more efficient than QWERTY[38] and claimed it would be too costly to retrain the employees.[34] The failure of the study to show any benefit to switching, along with its illustration of the considerable cost of switching, discouraged businesses and governments from making the switch.[39] This study was similarly criticised as being biased in favor of the QWERTY control group.[8]

In the 1990s, economists Stan Liebowitz and Stephen E. Margolis wrote articles in the Journal of Law and Economics[36] and Reason magazine[15] where they rejected Dvorak proponents' claims that the dominance of QWERTY is due to market failure brought on by QWERTY's early adoption, writing, "[T]he evidence in the standard history of Qwerty versus Dvorak is flawed and incomplete. [...] The most dramatic claims are traceable to Dvorak himself; and the best-documented experiments, as well as recent ergonomic studies, suggest little or no advantage for the Dvorak keyboard."[36][40]

EDIT: more daming/damming/damning (oh, it's the 3rd way): the winners of the fastest typing competitions tend to use QWERTY. You would think that if Dvorak etc. gave you a ~20% or whatever advantage, then people would choose them.

edit: sorry about using 'actually' twice in the first sentence, what a tosser
 
I know lots of people who use personal computers on a daily basis, not just for games but for "real" work and stuff. And plenty of people who buy a Mac and jump through the hoops to run games on them, either natively or by installing Windows, even though it's much more expensive to do that. And lots who only use Linux and, again, jump through all the hoops to game on that (although that's easier now). Back in the day I knew a guy who only gamed on Windows NT, which meant he could play Quake-engine games. He refused to dual boot; NT was his thing and he was all in on it. And they will tell me how Mac is better, or Linux is better, and they have their reasons. Some of them, I think, are just into the culture of being different, and like tweaking and doing the work to get a game running. And that's fine. But I know no one who uses Dvorak. Not even the hardcore "different" guys.
 

What's the difference between Larrabee and Ampere/Arc/RDNA? It still relied on coherent SIMD processing and would still benefit from the restrictions imposed by graphics APIs.
 
The APIs are part of the problem. You can't test how good a new architecture is if it requires an entire paradigm shift; you can only test how good it is inside the current paradigm it's not suited for.
 

The only significant limitation of Lumen in software ray tracing mode is that SDF generation has to be done offline, so you can't really represent deformable geometry in real time under this system. But considering it's supposed to work with Nanite, which doesn't work for deformable geometry either, hardware ray tracing currently doesn't buy you much of anything under these constraints ...
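
Roughly, the asymmetry looks like this (hypothetical stubs, not UE5's actual pipeline): the distance field is a one-time offline bake with no cheap per-frame update, while the triangle path at least has BVH refitting for deformed vertices.

```cpp
// Hypothetical stubs contrasting the two update paths; not UE5's API.
struct Mesh {};
struct DistanceField {};
struct TriangleBVH {};

DistanceField bakeSdf(const Mesh&) { return {}; }   // slow, offline, per asset
TriangleBVH buildBvh(const Mesh&) { return {}; }    // expensive, done on load
void refitBvh(TriangleBVH&, const Mesh&) {}         // cheaper per-frame update

void importAsset(const Mesh& m) {
    DistanceField df = bakeSdf(m);  // fine while the geometry stays rigid;
    (void)df;                       // there's no per-frame equivalent
}

void animateFrame(TriangleBVH& bvh, const Mesh& skinned) {
    // Hardware RT can at least refit the BVH to deformed vertices each
    // frame (still not free); the baked SDF has no comparable update.
    refitBvh(bvh, skinned);
}
```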
 
Actually I seem to recall thats actually a myth....
The 'slow down' part, perhaps, but the layout was designed to suit a mechanical typewriter, to prevent keys jamming, and the limitations QWERTY was designed around don't apply to computer input. Even if a better keyboard is possible, with better ergonomics, less fatigue, fewer injuries, and/or greater speed, it still won't get used.

This is the article cited from Reason magazine (and it's the little brother to the other article written by the same authors) - https://reason.com/1996/06/01/typing-errors/ - it isn't really talking about efficacy so much as why QWERTY became a standard, not just by being first but for other reasons, and the fallacy of people leaning on an example without properly confirming it: the old urban-myth problem.

As said article states, data suggesting Dvorak isn't advantageous tends to compare people with existing QWERTY experience converting over, which is not the same as comparing someone who grew up with only Dvorak against someone (of identical natural skill and keyboard use, developing at the same rate) who only knows QWERTY. In short, the comparisons are much like trying to judge the value of Larrabee by running the existing games of the time, instead of with 20 years of graphics evolution designed around Larrabee.

I can quite accept that people manage to overcome the limitations imposed by QWERTY, but QWERTY wasn't designed as the ideal computer input. It's just the best option because everyone was already used to it. Even if QWERTY isn't disadvantageous to typing, we are still 'locked in' to it. We'll be locked in to it regardless of how optimal it is for modern typing workloads, and there's no point trying to research the true ideal keyboard layout (Dvorak isn't necessarily that, so comparing QWERTY to Dvorak doesn't prove QWERTY isn't imposing limits). Same as everyone driving on the left in the UK and the right in the US. Same as mains electricity being 120V in the US versus 220V in Europe. We're wrestling with IPv4, which wasn't designed to be future-proof but just happened to be the starting point for addressing internet devices; everyone started running with it and building a network around it, and then inventing complicated fixes like NAT to overcome its inherent limitations. We end up with a lot of legacy baggage limiting future options where, even if we recognise a change would be beneficial, the cost to change is prohibitive.

That's where consoles used to have an advantage, allowing a whole new paradigm in a new machine with new software, though of course business concerns limited how much investment they got to develop and explore new ideas that conflicted with larger common patterns.

In short, it really is impossible to compare alternative techs fairly when one is mainstream and the other experimental. A huge amount of a system lies not just in its immediate qualities, but in the world and human thinking that is shaped around it. As hardware develops RT solutions, software will develop around that hardware, and an alternative paradigm that'd yield net better results (from different tradeoffs) can't prove itself or be adopted. We just have fringe cases like Dreams, where MM had to create their own entire tool-chain, an infinitesimally small investigation into the possibilities of non-triangle rendering against a world of decades of 3D triangle rasterisation thinking.
 