Next Generation Hardware Speculation with a Technical Spin [2018]

Status
Not open for further replies.
utilization of ray tracing games will not proceed unless we can offer ray tracing in all product ranges from LOW END to high end,”

So it can still come with the release of the next gen consoles unless they're releasing early next year? *crosses fingers*

Maybe AMD's plan is to launch dedicated HRT/ML hardware across everything from laptops and desktops to APUs in a fairly short timeframe; then they'd have a good chance of becoming the preferred way to program HRT/ML for games?
How each company implements HRT will differ. The DXR API specifies the inputs and the expected behavior and outcome of running a function; it's up to each vendor to come up with a strategy to make that happen. That's where the R&D lies: making it work on their GPUs across the whole lineup. If they haven't been committed to this for some time already, it's doubtful it will happen soon enough for next gen. There are two components here: writing the driver, and, once they find performance is poor (which it will be), building hardware to speed it up. It's not exactly straightforward. I just assumed AMD was further along on this, but if they're waiting to see whether things get adopted before committing, then they aren't working on it in time for next gen.
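That split between an API that fixes inputs and outcomes, and vendors who choose the implementation, can be sketched in a few lines of Python. All class and method names here are hypothetical, purely to illustrate the contract/implementation split; this is not any real DXR binding:

```python
from abc import ABC, abstractmethod

# Hypothetical illustration: the API layer (like DXR) fixes the inputs and the
# expected outcome; each vendor decides *how* to produce that outcome.
class RaytracingBackend(ABC):
    @abstractmethod
    def trace(self, origin, direction):
        """Return the distance to the closest hit, or None for a miss."""

class ReferenceBackend(RaytracingBackend):
    """A slow but correct stand-in: intersect a single unit sphere at the origin."""
    def trace(self, origin, direction):
        # Ray-sphere intersection: solve |o + t*d|^2 = 1 for t.
        ox, oy, oz = origin
        dx, dy, dz = direction
        a = dx*dx + dy*dy + dz*dz
        b = 2.0 * (ox*dx + oy*dy + oz*dz)
        c = ox*ox + oy*oy + oz*oz - 1.0
        disc = b*b - 4.0*a*c
        if disc < 0.0:
            return None                       # ray misses the sphere
        t = (-b - disc ** 0.5) / (2.0 * a)
        return t if t > 0.0 else None         # only hits in front of the origin

# Any backend satisfying the contract is interchangeable from the API side.
backend = ReferenceBackend()
print(backend.trace((0.0, 0.0, -3.0), (0.0, 0.0, 1.0)))  # 2.0: sphere surface is 2 units away
```

The API side never cares whether `trace` runs on shader cores, fixed-function hardware, or a software fallback; that is exactly the vendor's problem, and exactly where the driver and hardware R&D goes.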
 
Makes sense.

I bet both companies are watching to see if the X gains traction, but I guess most would say it's a non-factor for the majority.

On a side note, I have a feeling this gen was more comparison-heavy than last gen, and it's going to be worse next gen.

PS4 being the more powerful console could be more incidental, as it's Sony's first time in that position.
Some of the other senior members will be able to provide much more background on this topic. DF has been around for a long time, and I've mainly been a PC guy until this generation. So DF was new to me lol; I used to just be a Tom's, Anand, [H] person, looking at benchmarks and making purchasing decisions from that.
 
https://wccftech.com/amd-radeon-mi60-resnet-benchmarks-v100-tensor-not-used/
AMD’s Radeon MI60 AI Resnet Benchmark Had NVIDIA Tesla V100 GPU Operating Without Tensor Enabled

Footnotes are very important. They can reveal information that is vital to interpreting the metrics on display and sometimes they can also reveal caveats hidden in plain sight. AMD recently launched the world’s first 7nm GPU, the Radeon Instinct MI60, and it is a milestone in the ongoing transformation of AMD’s professional GPU side. The specifications are great and the performance spectacular, but the efforts put in by engineers might be overshadowed by something hidden in the footnotes. NVIDIA’s Tesla V100 GPU was gimped in the ResNet 50 benchmark.

“The 70W Tesla T4 with Turing Tensor Cores delivers more training performance than 300W Radeon Instinct MI60. And Tesla V100 can deliver 3.7x more training performance using Tensor Cores and mixed precision (FP16 compute / FP32 accumulate), allowing faster time to solution while converging neural networks to required levels of accuracy.” – NVIDIA

GPUs take a long time to design and develop and it is clear that AMD got blindsided in the Tensor department. That said, while Tensor cores can and do speed up certain calculations, they do not work in every case and FP32 is still a very important metric of performance. So yes, the MI60 has performance comparable to the Tesla V100, but only in FP32 mode. Overall training performance is vastly superior on the V100. If you are someone who uses Tensor to accelerate inference then the T4 is going to be more of a competitor than the V100.
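A quick numeric sketch of why the accumulator width in "FP16 compute / FP32 accumulate" matters: a narrow FP16 accumulator stalls once the running sum grows large, while a wide accumulator stays near the true result. Python's `struct` half-precision round-trip emulates FP16 storage here, and the native double stands in for the FP32 accumulator:

```python
import struct

def fp16(x: float) -> float:
    """Round x to the nearest IEEE half-precision value (emulates FP16 storage)."""
    return struct.unpack('e', struct.pack('e', x))[0]

# Accumulate 10,000 products of 0.1 * 0.1; the exact answer is 100.0.
a = b = fp16(0.1)

acc_fp16 = 0.0   # accumulator rounded back to FP16 after every add
acc_wide = 0.0   # wide accumulator (double here, standing in for FP32)
for _ in range(10_000):
    p = fp16(a * b)                  # FP16 multiply in both cases
    acc_fp16 = fp16(acc_fp16 + p)    # narrow accumulate: stalls once p < half an ulp
    acc_wide += p                    # wide accumulate: stays near the true sum

print(acc_fp16)             # stalls near 32, far from 100
print(round(acc_wide, 2))   # ≈ 99.95
```

Once the FP16 sum reaches ~32, each ~0.01 addend is smaller than half a unit in the last place, so every add rounds back to the same value and the sum stops growing. That is the class of error the wide accumulator in Tensor-core mixed precision avoids.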

Ouch my greatest fears realized on several fronts now. Grim future for next gen for me.
 
Some of the other senior members will be able to provide much more background on this topic. DF has been around for a long time, and I've mainly been a PC guy until this generation. So DF was new to me lol, I used to just be a Tom's, Anand, [H], person looking at benchmarks and making purchasing decisions from that.

It's okay. It's mostly anecdotal for me anyway, so it's nowhere near scientific.

Even if gaming media get more aggressive about showing the public the differences, even if they have to zoom in, it might not be enough to sway the majority if they could get almost two new "next gen" games instead of superior hardware.
 
https://wccftech.com/amds-david-wan...-dxr-until-its-offered-in-all-product-ranges/
This is unfortunate for my pro-RT stance on consoles. This will pare back the probabilities significantly.
If the R&D is not done by now, I'm not sure it could be delivered in time for launch.

I just went from 45% probable, to < 10%.
I already corrected this in another thread: WCCFTech used a machine translation, and Japanese-to-English machine translations don't seem to be that great.

Here's a translation by an actual Japanese speaker:
Mr. Wang said that "AMD will definitely respond to DXR," after prefacing it with "this is a personal view," but added that "for the time being, AMD will focus on improving the offline CG production environment centered on Radeon ProRender, which we provide free of charge."

"The spread of ray tracing in games will not proceed unless GPUs can use ray tracing in all ranges, from low end to high end," he said.
Wang doesn't say Navi won't support DXR, nor that AMD will only support it once they can do it in everything from low end to high end. He's saying that currently [Vega 20 just released] they're focusing on Radeon ProRender, and that ray tracing in games won't really spread until it's available on everything from low end to high end.
https://www.overclock3d.net/news/gp..._respond_to_directx_raytracing_-_david_wang/1
 
https://wccftech.com/amd-radeon-mi60-resnet-benchmarks-v100-tensor-not-used/
AMD’s Radeon MI60 AI Resnet Benchmark Had NVIDIA Tesla V100 GPU Operating Without Tensor Enabled

Ouch my greatest fears realized on several fronts now. Grim future for next gen for me.
Err, how is this news? It was stated in the damn launch slide deck that that particular test was done in FP32, which Volta's Tensor cores don't support. I'm not sure why you would want FP32 precision in this particular case (other than the obvious better result for the MI60), but since it's available in the application, I'm sure there are use cases for it too.
 
https://wccftech.com/amd-radeon-mi60-resnet-benchmarks-v100-tensor-not-used/
AMD’s Radeon MI60 AI Resnet Benchmark Had NVIDIA Tesla V100 GPU Operating Without Tensor Enabled

Ouch my greatest fears realized on several fronts now. Grim future for next gen for me.

I went straight to the comments section. Nothing more hilarious than reading the sideshow of AMD vs. Nvidia warrior comments, from my PC brethren no less, over benchmarks they'll never use or see within the foreseeable gaming future. I need some beer(s) and chips for this shit-show....
 
Raytracing in games won't really spread until it's available for everything from lowend to highend
Which is definitely true. Maybe not APU low-end, but if it doesn't fit into $200-and-up graphics cards, then I agree it's not going anywhere.
Again: in games.
 
over benchmarks they'll never use, or see within foreseeable gaming future.
Tensor Cores are used in games right now, through DLSS and AI upscaling, and for accelerating denoising for ray tracing.
Ouch my greatest fears realized on several fronts now. Grim future for next gen for me.
To be fair, Vega 20 is just a modified Vega 10. Adding Tensor Cores needs heavier and more pervasive modifications, so if AMD were to add them, expect that in Navi.
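On the denoising point above: even a crude spatial filter shows why denoising makes low-sample-count ray tracing viable. A plain 1D box filter is used here purely as a stand-in; real-time denoisers (and DLSS) are far more sophisticated, learned or temporally accumulated filters:

```python
import random
from statistics import pvariance

random.seed(7)

# Flat "ground truth" of 0.5, estimated with one noisy sample per pixel,
# like a 1-spp ray-traced image before denoising.
truth = 0.5
noisy = [truth + random.uniform(-0.2, 0.2) for _ in range(1000)]

def box_filter(img, radius=2):
    """Replace each pixel by the mean of its (2*radius + 1)-wide neighbourhood."""
    out = []
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        out.append(sum(img[lo:hi]) / (hi - lo))
    return out

denoised = box_filter(noisy)

# Averaging k independent samples divides the variance by roughly k,
# so the filtered estimate is visibly closer to the flat ground truth.
print(pvariance(noisy), pvariance(denoised))
```

With a 5-pixel window the per-pixel variance drops by roughly 5x, which is the basic trade (blur for noise) that smarter denoisers then try to make without losing edges.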
 
Tensor Cores are used in games right now, through DLSS and AI upscaling, and for accelerating denoising for ray tracing.

To be fair, Vega 20 is just a modified Vega 10. Adding Tensor Cores needs heavier and more pervasive modifications, so if AMD were to add them, expect that in Navi.
I guess it never occurred to me that AMD could have been blindsided by tensor cores. Google introduced them a while back, and AMD was still developing the MI60; it takes so long to go from vision to deployment that tensor cores arrived and reached a third generation in that time frame. I think that's the "negative Nancy" hit I was feeling.
 
Or go for a next-gen GPU from Nvidia with something like an i9 or Ryzen 7, if you want power and all that. Besides that, you could have a PS5 for the single-player exclusives: the best of both worlds, if one can justify a PS just for exclusives that might be played once or twice.
 
Or go for a next-gen GPU from Nvidia with something like an i9 or Ryzen 7, if you want power and all that.

Expensive high-performance consoles like the Neo Geo, CD-i and 3DO have tried and failed. The market for such consoles appears limited. In recent times we had the launch PS3, which sold incredibly poorly despite Sony dominating the previous two generations.

Microsoft could own such a market, not with hardware but with a controller-friendly UI for Windows 10. Steam Big Picture mode is 80% of the way there; it's all the Windows 10 crud requiring a keyboard and mouse that breaks the console 'experience'.
 
Devs will use that extra compute power to render nasal hair.
[image: Uncharted 4: A Thief's End (PS4) screenshot]


Probably.

Although Kratos has been in sticky situations too.

Microsoft could own such a market, not with hardware but with a controller-friendly UI for Windows 10. Steam Big Picture mode is 80% of the way there; it's all the Windows 10 crud requiring a keyboard and mouse that breaks the console 'experience'.
Would be nice if they implemented a radial keyboard option on console.
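A radial keyboard is straightforward to prototype: map the analog stick's deflection angle to one of N wedges of characters. The 8-wedge layout below is hypothetical, just to illustrate the mapping:

```python
import math

# Hypothetical 8-wedge radial layout; the stick angle selects a wedge.
WEDGES = ["abc", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwxyz"]

def select_wedge(x: float, y: float):
    """Map an analog-stick deflection (x, y) to a wedge, or None in the dead zone."""
    if math.hypot(x, y) < 0.3:            # dead zone: stick near centre
        return None
    angle = math.atan2(y, x) % (2 * math.pi)
    n = len(WEDGES)
    # Offset by half a wedge so wedge 0 is centred on the +x ("east") axis.
    idx = int((angle + math.pi / n) / (2 * math.pi / n)) % n
    return WEDGES[idx]

print(select_wedge(1.0, 0.0))    # east: "abc"
print(select_wedge(0.0, 1.0))    # north, two wedges round: "ghi"
print(select_wedge(0.05, 0.0))   # inside the dead zone: None
```

A second input (face button or trigger) would then pick one character within the selected wedge; the dead zone keeps a centred stick from spamming selections.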
 
IMO, they should concentrate on what they know and give consoles lots and lots of compute. I want next-gen physics.

I just want a big improvement in hair/fur quality, physics-driven animation and cloth simulation. Seriously, if all the first gen of ray tracing is going to bring is real-time reflections, I don't want them to waste silicon on it.

Red Dead 2 has eliminated cloth clipping on current gen; well, I haven't noticed any cloth or weapon clipping on my character yet, only hair.
 
I just want a big improvement in hair/fur quality, physics-driven animation and cloth simulation. Seriously, if all the first gen of ray tracing is going to bring is real-time reflections, I don't want them to waste silicon on it.

Red Dead 2 has eliminated cloth clipping on current gen; well, I haven't noticed any cloth or weapon clipping on my character yet, only hair.
As much as I want RTRT to make any kind of steps in the console market, I'm also fed up with the lack of proper physics simulations.
 
For consoles it can be a good idea not to have too-high demands and/or hopes. By the time they hit the toy stores, it might disappoint.
 