Deleted member 11852 (Guest)
> but the lack of DX12U features would likely make it perform worse.

I would fully expect this; I'm looking for the results to match much of what DF discovered in their One X vs Series S comparison.
> I would fully expect this, I'm looking for the results to match much of what DF discovered in their One X vs Series S comparison.

hmm.. yea this one is tougher to prove. I feel like the Matrix Experience might be the only thing released on the Xbox Series consoles that actually takes advantage of these new hardware features. I don't think the launch titles make particularly good use of them quite yet.
Unsurprisingly, in practice 6TF GCN >> 4TF RDNA2 if we ignore RT and the Zen 2 stuff. But that shouldn't be a surprise, as we had plenty of benchmarks already showing it. But I'd still choose the XSS, because a higher framerate is much more important than a higher resolution.

Summary lifted from Era..
"summary incoming
- on paper the One X's gpu > Series S's gpu; more memory and higher bandwidth
- however Series S cpu > One X's cpu, so One X is limited to 30fps but at higher resolutions
- Guardians of the Galaxy
- Series S - 1080p/30, better shadows and textures
- One X - 1440p-1890p/30
- Forza Horizon 5
- SS - 1440p/30 or 1080p/60 (both dynamic); 4x msaa; quality has higher object detail
- OX - 2160p/30 (dynamic), 4x msaa; object detail similar to performance mode on SS
- Halo Infinite
- SS - 1080p, 30 or 60; 30fps has framepacing issues
- OX - 2160p/30 or 1440p/60; 30fps has framepacing issues, 60fps has framerate drops
- Far Cry 6
- SS - 1224p/60fps, dynamic
- OX -2160p30fps, dynamic
- CoD Vanguard
- SS - 1440p/60 or 1080p/120, dynamic,
- OX - 2160p/60, dynamic, lots of visual effects downgraded; cutscenes are 40-50
- CoD Warzone
- SS - 1080p/60, actually Xbox One mode
- OX - 2160p/60, dynamic, actually 45-55
- Cyberpunk
- SS - 1080p-1440p/30, better pedestrian density
- OX - 1440p/30, driving dials fps to low 20s
- Metro Exodus
- SS - 1080p/60 (lower internal res), full ray traced GI
- OX - 2160/30
- pretty even so far for Oliver
- loading (OX HDD, OX SSD, SS)
- Vanguard: 24.4, 11.8, 3.5
- FH5: 85.5, 42.9, 24.3
- Halo: 61.3, 27.8, 14.1
- the Ascent: 128.9, 30.8, 30.3
- Backwards Compatibility
- Final Fantasy 13 -
- SS - 1152p, 2x msaa
- OX - 1728p, 2x msaa
- Mirrors Edge
- SS - 1440p/60, has drops
- OX - 2160p/30
- Series S has Auto HDR, better loading, and 60fps on some 360 games
- some One series games don't have One X enhancements on Series
- Doom 2016
- SS - 1080p
- OX - 2160p
- Red Dead Redemption 2
- SS - 864p
- OX - 2160p
- Far Cry 4
- SS - 60
- OX - high res, 30
- Prey
- SS - 60
- OX - high res, 30
- One X is better with older games
- Series S has some fps boost for some games, faster load times, also has new hardware features
- One X is competitive in recent titles and very much in older titles, but will struggle more with 60fps
- One X will be unsupported eventually while Series S will be support alongside the Series X"
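The loading-time gaps in the summary above are easier to see as ratios. A quick sketch, assuming the figures are in seconds and the triples are ordered One X HDD, One X with external SSD, Series S:

```python
# Loading times from the DF summary above.
# Assumed order per title: (One X HDD, One X + external SSD, Series S), in seconds.
times = {
    "Vanguard":   (24.4, 11.8, 3.5),
    "FH5":        (85.5, 42.9, 24.3),
    "Halo":       (61.3, 27.8, 14.1),
    "The Ascent": (128.9, 30.8, 30.3),
}

for game, (hdd, ssd, xss) in times.items():
    print(f"{game}: Series S loads {hdd / xss:.1f}x faster than One X HDD, "
          f"{ssd / xss:.1f}x faster than One X + SSD")
```

Interestingly, The Ascent is the one case where the external SSD all but closes the gap (~1.0x), which suggests that title's loading is CPU-bound rather than purely I/O-bound.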
> Unsurprisingly in practice 6TF GCN >> 4TF RDNA2 if we ignore RT and Zen 2 stuff.

It's interesting, though, that the One X at a similar resolution, or just a bit higher, would have outperformed the Series S almost always.
The One X is lacking a lot of hardware features that could be required for the Matrix Experience. I doubt it would run particularly well at the same fidelity. My general thought here is that the CPU and SSD requirements would already stop it from running on the One X; the GPU may do alright, but the lack of DX12U features would likely make it perform worse.
> Unsurprisingly in practice 6TF GCN >> 4TF RDNA2 if we ignore RT and Zen 2 stuff.

Would limiting a game to four cores disproportionately free up memory bandwidth for the GPU in the XSS? Its big memory-bandwidth deficit must be exacerbated by a CPU that's ~4x faster than the one in the X1X. Didn't DF's speculative tests show a (discrete, so no memory contention) 4TF RDNA perform close to a 6TF GCN?
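The contention question can be made concrete with a toy model. The total bandwidths are the published specs (224 GB/s on the Series S's fast 8 GB pool, 326 GB/s on the One X), but the per-core bandwidth draw is an invented illustrative assumption, not a measured number:

```python
# Toy model of CPU/GPU memory contention on a shared-memory console.
# Total bandwidths are the published specs; per-core draws are invented
# purely to illustrate the point, not measurements.
def gpu_bandwidth_left(total_gb_s: float, busy_cores: int, gb_s_per_core: float) -> float:
    """Bandwidth left over for the GPU after the CPU takes its share."""
    return total_gb_s - busy_cores * gb_s_per_core

xss_8 = gpu_bandwidth_left(224, 8, 4)  # hypothetical 4 GB/s per busy Zen 2 core
xss_4 = gpu_bandwidth_left(224, 4, 4)  # limiting the game to four cores
x1x_8 = gpu_bandwidth_left(326, 8, 1)  # hypothetical 1 GB/s per slower Jaguar core

print(xss_8, xss_4, x1x_8)  # 192 208 318
```

Even in this crude sketch, halving the active cores only claws back a modest slice of bandwidth, so core-limiting alone wouldn't erase the deficit against the One X's 326 GB/s.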
> I think I’d currently go with a One X but that will probably change within a year or 2.

I'm still surprised they didn't match One X GPU performance as a baseline. 6TF with 12GB of memory would have been great.
I think I’d currently go with a One X but that will probably change within a year or 2. Seems like it would be better if Series S didn't exist TBH.
> 6TF with 12GB of memory would have been great.

Sure, but the MSRP would have been $399 instead of $299... and going against a system with more of everything ($399 PS5 DE).
> Sure, but the MSRP would have been $399 instead of $299... and going against a system with more of everything ($399 PS5 DE).

That fact isn't lost on me. I'm just surprised they did it.
> That fact isn't lost on me. I'm just surprised they did it.

Yeah, they should have gone with 6TF & 12GB. I think I've written this before. There's no way that would have added an extra $100 to the price.
> Sure, but the MSRP would have been $399 instead of $299... and going against a system with more of everything ($399 PS5 DE).

There must surely have been options for a Series S giving up nothing technically to the One X (6TF GPU, 12GB) whilst still being a chunk cheaper than the PS5 DE (10TF, 16GB, 800GB SSD).
> They basically wanted a Game Pass box, as cheap and small as you could get it.

I know why people use the "Game Pass box" tag, but I personally have mixed feelings about that framing.
> The One X and Series S are targeting 2 different markets so I can understand why they went with the more underpowered console. They basically wanted a Game Pass box, as cheap and small as you could get it.

Absolutely, but what.. three.. four.. CPU generations and two GPU generations separate the APUs, along with 16nm vs 7nm.
I'm leaning towards the RAM differences being the issue, but RAM is cheap, so I wonder what a Series S with 12GB of RAM on a 192-bit bus would deliver, and what the cost difference would be.
> Absolutely, but what.. three.. four.. CPU generations and two GPU generations separate the APUs, along with 16nm vs 7nm.

The 5.2 TF 5500 XT is about as fast as the 6.2 TF RX 580. So isn't it expected that a 4 TF RDNA 2 part will come in slower than a 6 TF GCN one? According to AMD there was only a 1.25x improvement in IPC from GCN to RDNA 1, and any IPC improvements in RDNA 2 were linked to the Infinity Cache, which the consoles lack.
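Following that reasoning, the raw teraflop numbers can be normalised by AMD's quoted IPC gain. A rough sketch, assuming the ~1.25x figure applies directly:

```python
# Normalising teraflops by AMD's stated ~1.25x GCN -> RDNA IPC gain.
one_x_gcn_tf = 6.0    # One X GPU (GCN)
xss_rdna2_tf = 4.0    # Series S GPU (RDNA 2)
ipc_gain = 1.25       # AMD's GCN -> RDNA 1 figure quoted above

gcn_equivalent = xss_rdna2_tf * ipc_gain
print(f"4 TF RDNA2 ~= {gcn_equivalent} TF of GCN throughput")  # 5.0 TF, still short of 6
```

So even with the IPC credit, the Series S lands around ~5 TF of GCN-equivalent throughput, which is consistent with it trailing the One X whenever RT and CPU limits are out of the picture.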