AMD Ryzen CPU Architecture for 2017

Not sure this has to do with Ryzen.
Very high framerates on lower IQ settings seem to be very dependent on the CPU's single-threaded performance. I think this is what was being discussed because Ryzen doesn't match Skylake or Kaby Lake in that department.


Maybe next gen headsets will be 120 Hz, giving a 60 Hz reprojection option, but at that point 120 Hz native will be better, so you'll still want a CPU capable of that. A 60 fps target for VR on PC is pointless today, or in the future, unless it's a fallback option.
You stated next gen headsets may be 120 Hz, but then you say 60 fps for VR is pointless in the future. Which one is it?
I've seen the difference between 90 Hz and 120 Hz reprojected from 60 Hz first-hand. If there's a chance developers can target 60 fps and use the extra performance for better IQ, I think they should.

I gather the problem may be the panel's controller. PSVR has a 120 Hz display controller, but the Vive and Oculus drive about 25% more pixels (2160×1200 vs 1920×1080). Maybe that's the reason: VR headsets are probably using smartphone display controllers that can't handle more than 1080p @ 120 Hz or 1440p @ 90 Hz.
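To put rough numbers on that controller-bandwidth idea, here is a back-of-the-envelope comparison in Python. The resolutions and refresh rates are the commonly cited panel specs, and the 120 Hz Vive/Rift row is purely hypothetical, only there to show what such a mode would demand:

```python
# Back-of-the-envelope pixel throughput for the headsets discussed above.
# Resolutions/refresh rates are the commonly cited panel specs; the 120 Hz
# Vive/Rift entry is hypothetical.

scenarios = {
    "PSVR panel @ 120 Hz (1920x1080)":         (1920 * 1080, 120),
    "Vive/Rift panel @ 90 Hz (2160x1200)":     (2160 * 1200, 90),
    "Vive/Rift panel @ 120 Hz (hypothetical)": (2160 * 1200, 120),
}

for name, (pixels, hz) in scenarios.items():
    print(f"{name}: {pixels / 1e6:.2f} MPix/frame, {pixels * hz / 1e6:.0f} MPix/s")
```

On these figures, driving the higher-resolution panels at 120 Hz would need roughly 25% more pixel throughput than either shipping configuration, which is at least consistent with a display-controller limit.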
 
Interesting, but shouldn't the scheduler mostly keep the process (at least the 8-thread vrad) on the same CPU in the two leftmost cases?
(i.e., that's the kind of thing we want from it once the Ryzen configuration is eventually reported properly, with the separate L3s)
 
You stated next gen headsets may be 120 Hz, but then you say 60 fps for VR is pointless in the future. Which one is it?

I was clear on that. 120 Hz native is better than 60 Hz reprojected to 120 Hz. So if you're buying a new CPU today with VR in mind, of course you wouldn't limit yourself to 60 fps-level performance. The 60 fps reprojected-to-120 Hz option would be a pure fallback, for those people who can't hit 120 Hz (or 90 Hz). But you don't buy a brand new gaming CPU for the purpose of playing VR without ensuring it can hit the desired frame rates. Anyway, all this talk of 60 or 120 Hz is nonsense at the moment: the target is 90 Hz.

I've seen the difference between 90 Hz and 120 Hz reprojected from 60 Hz first-hand. If there's a chance developers can target 60 fps and use the extra performance for better IQ, I think they should.

Yes, I know you're a big fan of PSVR.
 
Shouldn't we be arguing with tests and evidence about why 90 Hz is better and why people have problems when they use 60? At least that's what we would expect in this forum ;)

 
Shouldn't we be arguing with tests and evidence about why 90 Hz is better and why people have problems when they use 60? At least that's what we would expect in this forum ;)

Exactly.
I was simply explaining my anecdotal experience, and all the arguments I can see against it reduce to:
90 Hz is the target. It is known.
It is known.
It is known.
90Hz. It is known.
 
That $150 price difference is obviously relevant for both A- and B-style gamers. But I wouldn't say "trounces in gaming" when the downside mostly applies to a small percentage of competitive players. Only a small minority of players own a 120 Hz/144 Hz display and choose to play at reduced quality settings and reduced resolution (and own a top-of-the-line GPU).

People bench CPUs at lower resolutions so they can get a glimpse of the future capabilities of that CPU. Ryzen may seem fine now at 4K and some 1440p, but two years into the future, when GPUs eat 4K like candy, it will not be fine at all compared to the competition. With this performance, Ryzen is not friendly to dual-GPU configurations or to games that are heavily single-threaded: Arma 3, DayZ and Forza Horizon 3 come to mind.

If you have an i5 and a 1050 in your system now, you wouldn't feel the need to upgrade your CPU. Your GPU will only be pushing 55 fps in BF1 @1080p because that is its maximum capability. Now swap that 1050 for a 1080, and suddenly you are starved for CPU power. The same formula applies directly to Ryzen @4K: when bigger and more powerful GPUs arrive, Ryzen systems will suffer compared to the competition even @4K.
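To spell that formula out as a minimal sketch (the numbers below are made up for illustration, not benchmark results), the frame rate you actually see is roughly whichever of the CPU or GPU limits is lower:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower side (CPU or GPU) sets the frame rate you actually see."""
    return min(cpu_fps, gpu_fps)

# Illustrative numbers only:
print(effective_fps(cpu_fps=120, gpu_fps=55))   # 55  -> GPU-bound (the i5 + 1050 case)
print(effective_fps(cpu_fps=120, gpu_fps=160))  # 120 -> CPU-bound (the i5 + 1080 case)
```

Swapping in a faster GPU raises gpu_fps until cpu_fps becomes the limit, which is exactly the low-resolution benchmarking argument.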
 
People bench CPUs at lower resolutions so they can get a glimpse of the future capabilities of that CPU. Ryzen may seem fine now at 4K and some 1440p, but two years into the future, when GPUs eat 4K like candy, it will not be fine at all compared to the competition. With this performance, Ryzen is not friendly to dual-GPU configurations or to games that are heavily single-threaded: Arma 3, DayZ and Forza Horizon 3 come to mind.

If you have an i5 and a 1050 in your system now, you wouldn't feel the need to upgrade your CPU. Your GPU will only be pushing 55 fps in BF1 @1080p because that is its maximum capability. Now swap that 1050 for a 1080, and suddenly you are starved for CPU power. The same formula applies directly to Ryzen @4K: when bigger and more powerful GPUs arrive, Ryzen systems will suffer compared to the competition even @4K.
But one can argue that when GPUs are fast enough to push 120 fps at 4K, games will be better at using multi-core CPUs, and Ryzen performance will increase while 4-core/8-thread Intels will decrease.

This is the more likely scenario in my opinion.
 
But one can argue that when GPUs are fast enough to push 120 fps at 4K, games will be better at using multi-core CPUs, and Ryzen performance will increase while 4-core/8-thread Intels will decrease.
This scenario depends on unknown variables: future games may use more cores, or they may not; it's all up in the air. You shouldn't gamble on it too much, and the same argument was made back in the Bulldozer days. Developing games for multi-core CPUs is seemingly hard on PC; we literally needed a decade to arrive at the acceptable 4-core utilization we have now.

However, if they did manage to use more cores effectively, this would put all those 6800s and 6900s back on the radar as gaming-friendly, and with their superior single-threaded performance, Ryzen would still be at a disadvantage.
 
This scenario depends on unknown variables: future games may use more cores, or they may not; it's all up in the air. You shouldn't gamble on it too much, and the same argument was made back in the Bulldozer days. Developing games for multi-core CPUs is seemingly hard on PC; we literally needed a decade to arrive at the acceptable 4-core utilization we have now.

However, if they did manage to use more cores effectively, this would put all those 6800s and 6900s back on the radar as gaming-friendly, and with their superior single-threaded performance, Ryzen would still be at a disadvantage.
Not at less than half the price, and that's for sure.

Also, we already see games that use more than 4 cores, and Bulldozer would never have been competitive no matter the utilization; it was simply clocked too low.

The more people with access to more than 4 cores, the more games will use them. Right now Intel sells them at prices too high for more than a very small number of users, which is too few to justify the investment; with Zen that will change, and we will see that market grow, with the consequence of more developers looking at it.

From what I've seen, even a Pentium G4560 is enough CPU to play at 4K at the moment, but that doesn't mean it's recommendable to buy one.
 
Not at less than half the price, and that's for sure
Right now Intel sells them at prices too high for more than a very small number of users
Another unknown: if games suddenly invest heavily in 8 cores, these expensive Intel chips might justify their high price by providing a very big advantage in gaming compared to Ryzen.

Also, we already see games that use more than 4 cores
And in these games, Ryzen isn't even putting on a good show compared to Kaby Lake.

The more people with access to more than 4 cores, the more games will use them
Most people have i5s and i3s. So no, people will not be buying tons of 8-core Ryzen CPUs in a month; most will buy 4- or 6-core Ryzen CPUs. Programming for 4 cores will still take precedence for the majority of PC titles. Nevertheless, that is all up in the air, and you can't gamble a $500 CPU on such an uncertain possibility. You go with what you have right now; at least that's what most gamers are interested in doing.
 
I read that as the reverse: the data fabric runs at half of the external DDR clock.
For that matter, 32 B × 1.333 GHz = 42.7 GB/s,
and 42.7 / 2 = 21.3 GB/s.

Coincidence that a 32 B link cut in half, or alternating directions every cycle, comes out that close to the reported inter-CCX link?
My bad; I read The Stilt wrong.

Anyway, I thought they would be using NoC routers in a ring configuration, so the bus would be 32 B everywhere. But now it looks more like both CCXs get a 32 B direct path (?) to the memory controller, while perhaps there is a fast, data-only exclusive path between them, which is configured to be 16 B.
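For what it's worth, a quick sanity check of the arithmetic above, assuming DDR4-2666 (so a fabric clock locked to MEMCLK, 2666 MT/s / 2 = 1.333 GHz) and a 32 B/cycle link; halving it models either a 16 B link or one that alternates direction every cycle:

```python
# Sanity check of the inter-CCX bandwidth arithmetic. Assumptions: DDR4-2666,
# fabric clock = MEMCLK = 1.333 GHz, 32 bytes transferred per fabric cycle.

fabric_clock_ghz = 2666 / 2 / 1000   # 1.333 GHz
link_bytes = 32                      # bytes per fabric cycle (full-width link)

full_bw = link_bytes * fabric_clock_ghz   # GB/s
half_bw = full_bw / 2                     # 16 B/cycle, or alternating directions

print(f"full-width link: {full_bw:.1f} GB/s")  # ~42.7 GB/s
print(f"halved link:     {half_bw:.1f} GB/s")  # ~21.3 GB/s, close to the reported inter-CCX figure
```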
 
People bench CPUs at lower resolutions so they can get a glimpse of the future capabilities of that CPU. Ryzen may seem fine now at 4K and some 1440p, but two years into the future, when GPUs eat 4K like candy, it will not be fine at all compared to the competition. With this performance, Ryzen is not friendly to dual-GPU configurations or to games that are heavily single-threaded: Arma 3, DayZ and Forza Horizon 3 come to mind.
I would personally feel much more future-proof with an 8-core processor than a 4-core processor. This is the first time we have consumer-priced, competitive 8-core (16-thread) processors available. Intel is also bringing 6-core mainstream (non-HEDT) Coffee Lake CPUs to market later this year. After that point, 4 cores will no longer be the highest core count of new-architecture, high-clocked i7 processors. AAA game developers will ensure that their games run well on the best available consumer 8-core and 6-core CPUs. In a few years, 4-core will lose to 6-core and 8-core in gaming. People are too quick to forget: the same happened with high-clocked dual cores. Core 2 Duo was winning against the lower-clocked Core 2 Quad in games, but nowadays Core 2 Duo is dead, while you can still use your Core 2 Quad for gaming. The quad-core i7 is also starting to show noticeable gains over the i5 (8 threads vs. 4; the i5 is essentially an i7 without HT). Some years ago reviewers recommended the i5 for gamers because performance was pretty much identical to an equally clocked i7 (and sometimes even better, because HT was causing slowdowns in games).

We should expect to see Windows scheduler improvements regarding Ryzen's SMT and separate L3 caches in the near future. We have already seen huge gains in some games with SMT disabled and/or all threads locked to the first compute cluster (CCX). Game engines will also improve. Ryzen is currently the best bang for the buck for game programmers (http://www.hardware.fr/articles/956-12/compilation-visual-studio-mingw-w64-gcc.html). 8 cores are perfect for compiling code, but they also help a lot in cooking data and recompiling shaders. Ryzen will become the first AMD CPU in 10 years that's used by professional game developers, and as developers use Ryzen during development, their code will also run better on it.
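For anyone who wants to reproduce the "lock all threads to the first compute cluster" experiment, here is a minimal sketch using the Linux affinity API (on Windows the equivalent is the process affinity mask, e.g. via Task Manager or start /affinity). The mapping of logical CPUs 0-7 to the first CCX plus its SMT siblings is an assumption; verify it on your own system before trusting any numbers:

```python
import os
import subprocess

# Assumption: logical CPUs 0-7 are the four cores of the first CCX plus their
# SMT siblings. Check the topology (e.g. lstopo or /proc/cpuinfo) first.
FIRST_CCX = set(range(8))

def run_on_first_ccx(cmd):
    """Launch a program with its CPU affinity restricted to the first CCX (Linux only)."""
    os.sched_setaffinity(0, FIRST_CCX)  # restrict this process; the child inherits the mask
    return subprocess.call(cmd)

if __name__ == "__main__":
    run_on_first_ccx(["./my_benchmark"])  # hypothetical binary, stands in for the game or test
```

Note that this only constrains scheduling; disabling SMT itself still has to be done in the BIOS.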

Game engines today are heavily single-thread bound because their code bases were designed around DX11 (or, more truthfully, DX9). Vulkan and DX12 are gaining traction; in 2 years there will be AAA game engines designed solely around Vulkan and DX12. GPU-driven rendering is also gaining a huge amount of popularity. Everybody seems to be interested in it now, as GPUs in consoles are getting significantly fatter but CPUs not so much. This further reduces the single render-thread bottleneck (even in DX11 games).

Let's see how the 7700K (quad) performs vs 6-core and 8-core CPUs in 2 years. My bet is that the 1800X will have overtaken the 7700K in some of the latest AAA games.
 
Given the choice, clearly native 90 Hz is better. And PC headsets don't support 60 Hz anyway.
PlayStation VR games commonly run at 60 Hz with two time-warps per rendered frame -> 120 Hz output. That offers reduced head-tracking latency compared to 90 Hz: head movement updates at 120 fps, animation at 60 fps. IMHO this is the best compromise right now. Obviously 90 fps rendering with 180 Hz time-warp would be even better, but it is not cost-effective right now.
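To make that trade-off concrete, the relevant numbers are just the frame periods (illustrative arithmetic only):

```python
def period_ms(hz: float) -> float:
    """Time between consecutive updates at a given rate, in milliseconds."""
    return 1000.0 / hz

# PSVR-style: render at 60 fps, time-warp to 120 Hz output
print(f"render budget @ 60 fps: {period_ms(60):.1f} ms")   # 16.7 ms to render each frame
print(f"pose update @ 120 Hz:   {period_ms(120):.1f} ms")  # 8.3 ms between head-tracking updates

# Native 90 Hz for comparison
print(f"render budget @ 90 fps: {period_ms(90):.1f} ms")   # 11.1 ms to render each frame
print(f"pose update @ 90 Hz:    {period_ms(90):.1f} ms")   # 11.1 ms between head-tracking updates
```

The 60 fps -> 120 Hz path keeps the comfortable 16.7 ms render budget while refreshing head tracking every 8.3 ms, which is where the latency advantage over native 90 Hz comes from.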
 
However, if they did manage to use more cores effectively, this would put all those 6800s and 6900s back on the radar as gaming-friendly, and with their superior single-threaded performance, Ryzen would still be at a disadvantage.
Nobody is saying that the 6900K isn't a future-proof CPU. It is a very good and very well balanced CPU indeed. I would personally feel more future-proof with a 6900K than a 7700K, including for my gaming needs. But $1089 is too much for most people to spend on a CPU; the 1800X's $499 is more reasonable. The 1800X is not as good as the 6900K, but it has a fantastic price/performance ratio. This is the first time we have a competitive 8-core CPU in the same price class as the fast quads.
 
From the reddit AMA:

Q: Are there any plans for mobile Ryzens? Are we going to see laptops, hybrids, etc. in the near future powered by Ryzen?
Lisa Su: yes you will see ryzen mobile parts going into laptops and 2-in-1's in 2H2017.


That 4 W minimum on the mobile Raven Ridge isn't just for show.
:D
 
It would be amusing if we saw a Surface 5 using it. Not a Surface Pro 5, as I don't think MS would want to muck with what's inside the Pro line just yet.


If the performance/watt is similar and the iGPU is much better, then Microsoft will just miss an opportunity that some other company will seize.
Both HP and Acer have launched some pretty decent 2-in-1s in 2016. They're good candidates for this.
 