anexanhume
Just realized this tweet was already posted above by AzBat, probably where we both saw it.

Thanks, was just about to say I can't find where I saw it.
Don't get twisted though, Xbox is here as well! We've got that extra thicc #SeriesX alongside his smaller brother, the #XboxSeriesS! Got the whole gang back together again
Isn't dual stream just implying vertex and pixel shaders? That the multicore command processors are just more efficient at handling them than the prior GCN architecture was? I haven't kept up with the 'tech' stuff happening at AMD, so my memory of their architecture history is quite hazy.
Dual command processors - they run different levels of DX, one for the game and a lower-priority one for the OS.
@Jay actually answered this, but thought I'd quote so you saw it:
(That's my understanding too. I think the X1 had something similar in principle for games and dash overlay / snap).
Edit: funnily enough, I've just come across this in the stream I'm going through (link is timestamped):
I'm not sure about the exact relationship between the L2 and the Infinity Cache. It appears like one possible route for data is L2 to Infinity Cache, but there are options for controlling whether one or the other is used, at the driver level at least.

So does the L2 then go to the Infinity Cache, and the Infinity Cache to the memory controllers? Or are we looking at something closer to the ESRAM setup, where it's being deployed as a scratchpad that is not developer accessible?
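Purely as a toy sketch of the first reading (an L2 miss falling through to a last-level Infinity Cache before the memory controllers), with a boolean standing in for the driver-level control mentioned above. Every name, size and behaviour here is an assumption for illustration, not anything AMD has documented:

[code]
// Toy model of one *possible* arrangement: L2 miss -> last-level cache ("Infinity
// Cache") -> memory controllers, with a switch that can bypass the last level.
// All names and behaviour are invented for illustration.
#include <cstdint>
#include <cstdio>
#include <unordered_set>

struct CacheLevel {
    const char* name;
    std::unordered_set<uint64_t> lines;            // resident cache lines (toy: unbounded)
    bool lookup(uint64_t line) const { return lines.count(line) != 0; }
    void fill(uint64_t line) { lines.insert(line); }
};

int main() {
    CacheLevel l2{"L2"};
    CacheLevel llc{"Infinity Cache"};
    bool routeThroughLlc = true;                   // stand-in for a driver-level control

    auto read = [&](uint64_t addr) {
        uint64_t line = addr / 64;                 // 64-byte cache lines
        if (l2.lookup(line)) {
            std::printf("0x%llx: L2 hit\n", (unsigned long long)addr);
            return;
        }
        if (routeThroughLlc && llc.lookup(line)) {
            std::printf("0x%llx: L2 miss, Infinity Cache hit\n", (unsigned long long)addr);
        } else {
            std::printf("0x%llx: miss, serviced by the memory controllers\n", (unsigned long long)addr);
            if (routeThroughLlc) llc.fill(line);   // allocate in the last level on the way in
        }
        l2.fill(line);                             // always fill L2 for the requester
    };

    read(0x1000);   // cold miss: goes out to memory
    read(0x1040);   // different line: another miss
    read(0x1000);   // now an L2 hit
    return 0;
}
[/code]

Flip routeThroughLlc to false and every L2 miss goes straight to the memory controllers; that's the kind of choice the "driver-level control" comment seems to be pointing at.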
Stream would probably be the command buffers for the overall GPU. In GCN, it was usually something the block could only handle from one source at a time. The current gen had two command processor blocks so the system and the game could be handled in parallel, with less concern about latency or throughput problems from one client or the other blocking a single processor. RDNA introduced a command processor that could support two graphics queues, which may be related to what Microsoft was discussing.
Watch Dogs: Legion running on the XSX with RT @4K30:
This is easily the best example so far of cross-gen scaling.

This is the best footage of Watch Dogs I have seen; it looks pretty good.
I think they mentioned "virtualized command streams"? So it's probably one [multi-core] command processor with context switching. After all, dedicating a whole piece of hardware just for the dash seems like a waste.
Pretty sure it's 2 distinct command processors, with custom firmware for different DX levels.
The PS4 and Xbox One had two command processors, possibly because the unit could not be fully virtualized, or because context switching at the time didn't meet their quality-of-service requirements.
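To make the "two command streams at different priorities" idea concrete, here is a minimal sketch using plain PC D3D12: two direct command queues created at different priorities, one standing in for the game's stream and a lower-priority one standing in for the system/overlay stream. This is only an API-level analogy and an assumption on my part; it says nothing about how the console firmware actually splits or virtualizes its command processor(s), and error handling is omitted.

[code]
// Sketch only: two D3D12 direct queues at different priorities, loosely mirroring
// the "game stream + lower-priority system stream" idea discussed above. This is a
// PC API analogy, not the console implementation; HRESULTs are ignored for brevity.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device));

    // Higher-priority queue for the "game" command stream.
    D3D12_COMMAND_QUEUE_DESC gameDesc = {};
    gameDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    gameDesc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_HIGH;
    ComPtr<ID3D12CommandQueue> gameQueue;
    device->CreateCommandQueue(&gameDesc, IID_PPV_ARGS(&gameQueue));

    // Normal-priority queue standing in for the OS / dash overlay stream.
    D3D12_COMMAND_QUEUE_DESC osDesc = {};
    osDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    osDesc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;
    ComPtr<ID3D12CommandQueue> osQueue;
    device->CreateCommandQueue(&osDesc, IID_PPV_ARGS(&osQueue));

    // Command lists would be recorded and submitted to each queue independently;
    // how the two streams share the front end is up to the driver and hardware,
    // whether that's one virtualized command processor or two physical ones.
    return 0;
}
[/code]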
I am a bit perplexed that we barely see anything of the Series X's features promoted or demonstrated in actual exclusive games.
For the PlayStation 5 we see games running and interviews talking about features implemented in actual games.

The Series X is promoted as a machine that has these technologies and plays games, but barely any games are demoed.
Also, MS restructuring the SDK into GameCore has had a very big impact on the timeliness of devkit updates (and thus on what features are there for devs to leverage).
It is bewildering that, for the second generation launch in a row, Microsoft are seemingly behind Sony on firmware/SDK readiness. This is the world's biggest software company, who literally make Visual Studio, which is what devs use for PlayStation development.

Just... how? Was the demise of the XDK and the rise of the GDK a very recent decision?
Late 2019 to early 2020, going by the release notes of GameCore.