Digital Foundry Article Technical Discussion Archive [2014]

Here's an AC4 PS4/PC comparison. Notice the textures are blurred on the PC version compared to the PS4. That's because cheap FXAA is enabled by default alongside every PC AA option. And that's with the PC version at 4xMSAA, by the way.

[image: a19kp23.png, AC4 PS4 vs PC comparison shot]

All modes except SMAA on the PC version are combined with FXAA. The reasoning behind this is that FXAA provides the transparency AA element that MSAA and the other methods don't.

I wouldn't describe it as cheap though. The blurring effect is extremely subtle, and in combination with just 2x TXAA the image quality is completely sublime. Well in excess of what's achievable with SMAA alone, despite the slightly sharper image produced by the SMAA option.
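To make that layering concrete, here's a minimal sketch in Python of the behaviour being described: every menu option except SMAA gets an FXAA pass appended. The mode and pass names are made up for illustration, not Ubisoft's actual ones.

[code]
# Hypothetical mapping of AC4 PC AA menu options to render passes,
# per the behaviour described above. All names are illustrative only.
AA_PASSES = {
    "FXAA":   ["fxaa"],
    "2xMSAA": ["msaa2x_resolve", "fxaa"],
    "4xMSAA": ["msaa4x_resolve", "fxaa"],
    "2xTXAA": ["txaa2x_resolve", "fxaa"],
    "SMAA":   ["smaa"],  # the one option with no FXAA layered on top
}

for option, passes in AA_PASSES.items():
    print(f"{option:7s} -> {' + '.join(passes)}")
[/code]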
 
I don't really know if I should be infracting you for trolling here. :???: If the XB1 version doesn't look better than the 360 version, the developers need to be shot. The most sane explanation for a delay is that the game already taxes the XB1, so getting it running in a decent flavour on the 360 is likely a real challenge.

I was running with the rumour that the 360 version is running at 720p@60 and doesn't look that different from the XB1 version. If the differences are as slight as is being made out, then what reason is there to buy the XB1?
 
Upscaled to 720p, I assume. I wonder, given the good performance they're getting from lower-spec PC systems, whether that helps the case that the 360 will be good enough? Lower res, fewer polys, lower-res textures, less debris, etc. I mean, we could assume the 360 and XB1 versions will be closer than in other cross-generation games (BF4, COD, Metal Gear), no?
 
DF's comment about the XB1 being matched by a £100 GPU, even powered by an i3 or FX-6300, makes me wonder about the CPU performance overhead MSFT set for itself (virtualization + a not-that-low-level API). DF's previous analyses have also shown that even the PS4 can dip where reasonable PC configurations don't, and I don't think the GPU was the issue there (the PS4's GPU is head and shoulders above Bonaire).
Analysis of CPU usage in a game like AC4 on PC also makes me wonder about the choice of 8 weak CPU cores. Some parts of game engines are said to be quite scalable, though one may wonder if both Sony and MSFT aimed a bit too low. Another issue is architectural: how performance scales from one Jaguar compute cluster to two. One only has to look at a couple of PC CPU reviews to see that scaling the core count succeeds to different degrees on different architectures, and that's on pretty scalable workloads.
Now, it is not like Sony and MSFT had much choice: Trinity/Richland was stuck on GlobalFoundries' 32nm process, and Steamroller might have been late, or too risky a bet.
Though looking at all the effort MSFT sank into the design, going to great lengths to design its own sound DSP, the ESRAM, etc., I wonder if they could have focused those efforts on more significant architectural tweaks. At least Sony went with off-the-shelf parts.

Truth be told, I'm eager to see how the upcoming ARM CPUs perform (Denver, A57, whatever Qualcomm comes up with) and how the Maxwell GPUs fare. Both consoles might not end up looking that good in perf per watt and per mm2. I'm especially curious about the extent to which the big L2 in those GTX 750s alleviates bandwidth limitations (it seems to have allowed Nvidia to use a more conservative memory setup).
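To put a rough shape on the "eight weak cores" worry, a minimal Amdahl's-law sketch in Python; the per-core throughput ratio and the parallel fractions below are illustrative guesses, not measurements:

[code]
# Amdahl's-law sketch: 8 weak cores vs 4 stronger ones.
# All figures below are illustrative assumptions, not benchmarks.

def speedup(parallel_fraction, cores):
    """Amdahl's law: the serial part of a frame doesn't scale with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

JAGUAR_PER_CORE = 0.5    # assume ~half a desktop core's per-core throughput
DESKTOP_PER_CORE = 1.0

for p in (0.5, 0.8, 0.95):   # fraction of the frame that parallelises
    weak_wide = JAGUAR_PER_CORE * speedup(p, 8)       # 8 weak cores
    strong_narrow = DESKTOP_PER_CORE * speedup(p, 4)  # 4 stronger cores
    print(f"parallel={p:.2f}: 8 weak -> {weak_wide:.2f}x, "
          f"4 strong -> {strong_narrow:.2f}x")
[/code]

The gap only closes as the parallel fraction approaches 1, which is exactly the bet both console designs made.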
 
DF has also shown some interest in those Maxwell GPUs; they are pretty awesome once you realize they use the same process as the next-gen consoles and AMD's GPUs.

Imo those cards make next gen look bad. They are most likely better in both perf per watt and perf per mm2. The higher-end version does what it does with roughly the same bandwidth as the original HD 7790.

The lower-end one still does "great" out of ~80GB/s of bandwidth (most likely well enough to match what the XB1 is doing now). That is a 148mm2 chip with a 55W TDP :80:
If Nvidia is on a winning streak and Denver turns out well, we could see interesting SoCs in the "near" future.
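Putting rough numbers on that (a Python sketch; FP32 throughput computed as 2 x shaders x clock, with die sizes and the TDP as commonly reported at the time; console GPU-only power was never published, so no perf/W figure for those):

[code]
# Back-of-the-envelope perf density. Figures are commonly reported ones.
parts = {
    # name:                      (shaders, GHz,   mm2)
    "GTX 750 (GM107)":           (512,     1.020, 148),
    "XB1 GPU (Bonaire-class)":   (768,     0.853, 160),
    "PS4 GPU (Pitcairn-class)":  (1152,    0.800, 212),
}

for name, (shaders, ghz, mm2) in parts.items():
    gflops = 2 * shaders * ghz   # FP32: 2 ops per shader per clock (FMA)
    print(f"{name:26s} {gflops:5.0f} GFLOPS, {gflops / mm2:4.1f} GFLOPS/mm2")

print(f"GTX 750 perf/W: {2 * 512 * 1.020 / 55:.0f} GFLOPS/W (55W TDP)")
[/code]

Interestingly, raw FLOPS density actually favours the GCN parts; Maxwell's win is in delivered performance per FLOP and especially per watt, which the raw numbers above don't capture.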
 
They are matched by middle-tier PC GPUs because they ARE using middle-tier PC GPUs. And I'll take an i3 over Jaguar any day.
 
Maxwell looks good for a low-priced, competitive PC (Steam) box. Heck, I could get one for my PC and get next-gen for 60W (on top of my PC) and a third of the price. Not that I've any interest. But in a year's time, a silent, potent GPU seems highly possible and almost a throwaway part.
 
They are matched by middle-tier PC GPUs because they ARE using middle-tier PC GPUs. And I'll take an i3 over Jaguar any day.
Mid-tier is disputable; I would say the lower part of mid-tier.
My point is more that both MSFT and Sony dedicated a lot more silicon to their GPUs. The PS4 pretty much embeds a Pitcairn, which is 212mm2 by itself; MSFT a Bonaire, which is 160mm2, plus the massive amount of ESRAM scratchpad. The PS4 also relies on a wider bus and faster memory. These next-gen parts were nothing special, as opposed to last gen, but seeing such an improvement in a product on the same process was unexpected for me; Nvidia achieved quite something here.

As for the Core i3, well, I can't disagree, though that option was never on the table for Sony or MSFT.
 
Where does the WiiU sit in the mix between the PS360 and the PS4/XB1?
 
DF has also shown some interest in those Maxwell GPUs; they are pretty awesome once you realize they use the same process as the next-gen consoles and AMD's GPUs.

Imo those cards make next gen look bad. They are most likely better in both perf per watt and perf per mm2. The higher-end version does what it does with roughly the same bandwidth as the original HD 7790.

The lower-end one still does "great" out of ~80GB/s of bandwidth (most likely well enough to match what the XB1 is doing now). That is a 148mm2 chip with a 55W TDP :80:
If Nvidia is on a winning streak and Denver turns out well, we could see interesting SoCs in the "near" future.

Yes. The first thing I thought was that Maxwell is indeed picture perfect for consoles. A shame, I suppose, that they were too late in the game.

Maybe even MS could have ditched their ESRAM altogether with one of these parts and just scraped by with 68GB/s.
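As a sanity check on the 68GB/s idea, a rough per-frame budget sketch; the buffer layout is a made-up illustration, not ACIV's actual one, and it ignores texturing, shadow maps, overdraw and post, which multiply the traffic severalfold:

[code]
# What does 68 GB/s buy per frame at 1080p60? Layout below is made up.
BANDWIDTH = 68e9                # bytes/s: XB1's DDR3 main memory
FPS = 60
frame_budget = BANDWIDTH / FPS  # ~1.13 GB of traffic available per frame

pixels = 1920 * 1080
# Hypothetical deferred layout: 4 render targets at 4 B/px, each written
# once and read once, plus depth written+read = 10 touches of 4 bytes/px.
gbuffer_traffic = pixels * 4 * 10

print(f"frame budget    : {frame_budget / 1e6:6.0f} MB")
print(f"G-buffer traffic: {gbuffer_traffic / 1e6:6.0f} MB "
      f"({100 * gbuffer_traffic / frame_budget:.0f}% of budget)")
[/code]

The raw budget looks roomy on paper; the catch is that real frames touch memory many more times than this idealized pass, which is exactly what the ESRAM is there to absorb.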
 
Where does the WiiU sit in the mix between the PS360 and the PS4/XB1?

The WiiU has more 1080p@60 games than the XB1 does! So in that measure it's PS4>WiiU>XB1.

But I guess in real terms it's still the little fish in this pond.
 
The WiiU has more 1080p@60 games than the XB1 does! So in that measure it's PS4>WiiU>XB1.

I don't think so...

Some of the Wii U games said to be 1080p are really 720p as well.

Some 1080p games on the One:

Dying Light
EA Sports UFC
Fifa 14
Forza 5
Kinect Sports Rivals
NBA 2K14
NFS Rivals
Lego Marvel Superheroes
Tomb Raider (1080p gameplay, 900p cutscenes)

That's nine off the top of my head; are there that many 1080p Wii U games?

You probably meant "exclusives" and exaggerated :p

Not sure if it'd be true for exclusives either. Like I said, I think Donkey Kong was said to be 1080p but was really 720p. The old "PR flack says 1080p because they don't know internal resolution from scaled output" thing.
 
He said 1080p60 specifically. I haven't kept up to date on those, but definitely one or two in your list are not 60 fps.
 
Yes. The first thing I thought was that Maxwell is indeed picture perfect for consoles. A shame, I suppose, that they were too late in the game.
Maybe they were too late, or neither Sony nor MSFT was willing to pay the extra Nvidia was asking. Back when both companies made their decisions, AMD might have looked way better in both perf per watt and per mm2. AMD's GPUs still fry Nvidia's offerings when it comes to compute performance, and the GPUs used in both the PS4 and XB1 are clocked conservatively, so power efficiency should be honest (though Nvidia just made a nice breakthrough wrt perf per watt and efficient use of available bandwidth).
The picture ain't that grim, and betting on Nvidia delivering this was quite a bet, not to mention that Nvidia would most likely have asked for a premium over AMD. In the grand scheme of things Nvidia has more important matters to deal with.

Maybe even MS could have ditched their ESRAM altogether with one of these parts and just scraped by with 68GB/s.
Imo the issue is more the amount, not the type, of RAM. Reserving ~3 GB of RAM is crazy imo; what do you need 3GB of RAM for while gaming? Seriously, OSes like RT or Windows Phone, or any other OS, run fine with a really low amount of RAM. Anyway, I disagree more and more with MSFT's choices, and that goes further than the hardware, but I think it is better to discuss that in a more appropriate thread, like the one about MSFT's and Sony's business models. :)
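For scale, the reservations as reported around launch (approximate figures; the PS4's reserve was reported to be in the same ballpark):

[code]
# OS RAM reservations, approximate figures as reported at the time.
consoles = {
    "Xbox 360": (0.5, 0.032),  # 512 MB total, ~32 MB OS reserve
    "XB1":      (8.0, 3.0),    # 8 GB total, ~3 GB OS reserve
}
for name, (total_gb, os_gb) in consoles.items():
    print(f"{name}: {total_gb - os_gb:.2f} GB for games "
          f"({100 * os_gb / total_gb:.0f}% reserved for the OS)")
[/code]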
 
Couldn't help but laugh at a GAF comment over the news that the Xbox won this face-off:

misterx is going to have a field day!

:LOL:

Anyway, the game looks really nice even on my 360 (trial version). It seems DF really loved the game, and a few others do too. Kind of want it now. This is a case where next gen, with 1080p60 vs 720p30, would be really nice. Double Helix was really making a name for themselves just as they're snatched away... a shame there aren't snow or jungle levels though, according to IGN...
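(For scale, a quick check on that 1080p60-vs-720p30 gap:)

[code]
# Pixel throughput: 1080p60 vs 720p30.
next_gen = 1920 * 1080 * 60  # ~124.4M pixels/s
last_gen = 1280 * 720 * 30   # ~27.6M pixels/s
print(f"{next_gen / last_gen:.1f}x the pixels per second")  # 4.5x
[/code]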
 