*split* multiplatform console-world problems + Image Quality Debate

Does it imply the same amount of AA and various effects for both versions but different rendering resolutions?

I'd assume the algorithms for AA, shading, etc. will be the same. We won't know until we see it. I'm guessing resolution and framerate will be the major differences, and pixel quality, if you want to call it that, will be the same.
 
a circumstance of a PS4 game being 720p or 900p upscaled to 1080p but at High/Ultra-type settings while the XB1 is set to the same resolution but at Medium settings.

Which is a distinct possibility later in the generation.
 
You do understand that 720p on a native 720p TV looks better than 720p upscaled to 1080p? People keep talking about upscaling as though it is magic and some sort of substitute for real information.

So does a DVD look better on a standard-def TV, or upscaled to 720p on an HDTV?

Furthermore, "some sort of substitute for real information" is an issue of native 1080p versus upscaled 1080p. Native 720p doesn't have more information than a 1080p image upscaled from 720p. The lack of clarity comes from the limited information that the 720p image provides.
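A quick way to see this: take a native 1080p frame, throw away the extra samples by downscaling it to 720p, then upscale it back and compare. Whatever the scaler does, the discarded detail does not come back. A minimal sketch, assuming Pillow and NumPy are installed; the file name is just a placeholder:

[code=python]
# Minimal sketch: detail discarded at 720p cannot be recovered by upscaling.
# Assumes Pillow and NumPy; "frame_1080p.png" is a placeholder file name.
import numpy as np
from PIL import Image

native = Image.open("frame_1080p.png").convert("RGB")   # 1920x1080 source
low    = native.resize((1280, 720), Image.LANCZOS)      # stand-in for rendering at 720p
scaled = low.resize((1920, 1080), Image.LANCZOS)        # what an upscaler hands the TV

diff = np.abs(np.asarray(native, dtype=np.int16) -
              np.asarray(scaled, dtype=np.int16))
print("mean absolute error per channel:", diff.mean())  # non-zero: the detail is gone for good
[/code]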
 
...
In my opinion, 1080p versus 900p upscaled to 1080p versus 720p upscaled to 1080p is a much safer environment for MS to operate in than devs choosing the same native resolution and then upping quality in other metrics, where you have a circumstance of a PS4 game being 720p or 900p upscaled to 1080p but at High/Ultra-type settings while the XB1 is set to the same resolution but at Medium settings.

I totally agree. If you end up with both games at 720p, but the PS4 is pushing GPGPU to do things like cloth or water simulation (for example), then I think it'll be a difference that pretty much all gamers could appreciate.
 
It's definitely too early to call, but I think one thing is becoming clear fairly early on: with the PS4 having 100% more ROPs, 50% more CUs, and all of its main memory at roughly 2.6x the bandwidth, it is going to be an impressive feat if the Xbox One achieves parity on anything, really, and it would be a huge coup by the hardware designers if they manage to find a way for the Move engines, CPU clocks and eSRAM to make up for all that. And while finding specific things where it will be able to outperform the PS4 will still be possible, it's going to be a challenge for sure.
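For reference, those figures fall out of the publicly quoted peak specs. A rough back-of-the-envelope in Python (peak figures only, deliberately ignoring the XB1's 32 MB eSRAM bandwidth, so treat it as illustrative rather than a real performance model):

[code=python]
# Rough peak-spec comparison from the publicly quoted figures.
# Deliberately ignores the XB1's 32 MB eSRAM, so this is illustrative only.
ps4 = {"ROPs": 32, "CUs": 18, "main RAM GB/s": 176.0}   # GDDR5
xb1 = {"ROPs": 16, "CUs": 12, "main RAM GB/s": 68.3}    # DDR3

for key in ps4:
    advantage = (ps4[key] - xb1[key]) / xb1[key] * 100
    print(f"{key}: PS4 ~{advantage:.0f}% higher")
# ROPs ~100%, CUs ~50%, main RAM bandwidth ~158% higher (~2.6x)
[/code]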

Whether any of that really matters is a completely different kind of thing, however - the Xbox One has some really interesting OS-level features, and Kinect is really well provided for at the core level. The Wii basically won last gen early on with nearly a full generation of hardware less than its competition, certainly in terms of the first years and profitability. The scaler in the Xbox One may also make it less important to some.

But future Digital Foundry face-offs at the very least seem like they will remain one-sided for a long, long time.
 
People who think X1 will achieve parity with PS4 on a technical level are just setting themselves up for disappointment. The question is how small they can make the gap, but I think it'll be appreciable on a technical level. That doesn't mean the X1 versions of multi-platform games won't be "good enough" for most people.

The relevance of Digital Foundry comparisons for console warring is pretty much going to be nil, because PS4 is going to win every single time. Now, just doing the comparisons for curiosity's sake could be interesting, so I'll still read them, but I have a feeling the number of hits they receive is going to be a lot smaller. There won't be much for people to argue and bicker about.
 
Also, it makes me wonder if MS would have been better off going with a more PC-like split pool of 8 GB DDR3 / 2 GB GDDR5. eSRAM/eDRAM just seems like a bad solution, especially if they want forward compatibility with future consoles.

The way the eSRAM is being handled appears to abstract it away from the code, beyond some kind of explicit mapping of page table entries.
A future memory subsystem can map those how it deems necessary to memory controllers.
As long as there are enough memory channels, it can probably stripe the data to fit the new architecture as needed.

Future memory interfaces will very likely have more than enough bandwidth, even if there are some efficiency losses.
Latency could be a question mark, but all disclosures point to a reluctance to discuss that aspect. It may not be a big enough problem that a more powerful future platform can't brute-force it, or the latency difference may not be as big as some assume.
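To illustrate the abstraction point with a toy example: if software only ever sees a flat range of pages, the platform is free to stripe those pages across however many channels a future memory subsystem happens to have. This is purely conceptual and not based on any documented console MMU behaviour:

[code=python]
# Toy model of striping a flat page range across N memory channels.
# Purely conceptual: the point is that code which only sees page mappings
# doesn't care how a future memory subsystem lays those pages out.
PAGE_SIZE = 4096

def channel_for_address(addr: int, num_channels: int) -> int:
    """Interleave pages across channels on page granularity."""
    return (addr // PAGE_SIZE) % num_channels

# The same logical buffer lands on different hardware layouts
# without the application changing at all.
buffer_pages = range(0, 8 * PAGE_SIZE, PAGE_SIZE)
for channels in (4, 8, 16):
    layout = [channel_for_address(a, channels) for a in buffer_pages]
    print(f"{channels} channels -> pages striped as {layout}")
[/code]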
 
People who think X1 will achieve parity with PS4 on a technical level are just setting themselves up for disappointment. The question is how small they can make the gap, but I think it'll be appreciable on a technical level. That doesn't mean the X1 versions of multi-platform games won't be "good enough" for most people.

The relevance of Digital Foundry comparisons for console warring is pretty much going to be nil, because PS4 is going to win every single time. Now, just doing the comparisons for curiosity's sake could be interesting, so I'll still read them, but I have a feeling the number of hits they receive is going to be a lot smaller. There won't be much for people to argue and bicker about.

I wouldn't go that far. I think the PS4 will always outshine the XB1 in certain areas. Pixel count being one of them, as more CUs matter (unless devs are forced to occupy those CUs with something other than rendering). Also, the flexibility of having 176 GB/s to all 8 GB of fast memory will make a difference where 32 MB isn't enough and 68 GB/s to the bigger memory pool won't cut it.

But there are still areas where the XB1 might have a hardware advantage, or at least a more competitive playing field. We still haven't been told the ultimate purpose of some of the logic that exists on the HyperTransport I/O bus.

http://www.nvidia.com/content/GTC-2010/pdfs/2152_GTC2010.pdf
http://www.youtube.com/watch?v=7bJ-D1xXEeg

These two links motivate my thinking that some of that logic is part of a transcode pipeline for the advanced texture compression necessary for PRT or tiled resources. If that's true, and the portions of the 47 MB no one can account for are part of that pipeline, then later titles may force the PS4 to devote more of its GPU CUs/CPU cores and RAM bandwidth to accommodate more advanced compression schemes.
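For anyone unfamiliar with the terms: PRT/tiled resources keep only a subset of a huge texture's tiles resident in memory, with a page table redirecting lookups, and any fixed-function transcode block would sit on the miss/fill path decompressing tiles as they stream in. A toy sketch of the residency side only; this is entirely illustrative and not based on any disclosed XB1 interface:

[code=python]
# Toy partially-resident-texture lookup: a page table maps virtual tiles to
# resident physical slots; a miss would queue a stream-in (and, per the
# speculation above, possibly a hardware transcode). Entirely illustrative.
TILE = 128  # texels per tile edge, a common PRT tile footprint

resident = {(5, 3): 0, (5, 4): 1, (6, 3): 2}  # (tile_x, tile_y) -> physical slot

def sample(u_texel: int, v_texel: int) -> str:
    tile = (u_texel // TILE, v_texel // TILE)
    slot = resident.get(tile)
    if slot is None:
        return f"miss on tile {tile}: queue stream-in + decompress"
    return f"hit: slot {slot}, offset ({u_texel % TILE}, {v_texel % TILE})"

print(sample(700, 450))   # hit
print(sample(0, 0))       # miss
[/code]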

But then again, the PS4 might readily stomp the XB1 every time out. And if the eSRAM isn't very exploitable and that I/O logic is simply Kinect-related, then that's likely to happen.
 
From the XB1 architect DF article:

"Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU," Goossen reveals. "Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console."

I shudder to think of the problems associated with performance before THE BOOST. :runaway:

I can't imagine they will, but I wonder if another increase in clock speed is being kicked around? Later titles will get better performance as the tools get better and so on, of course, but since there was so much ado about how the clock boost was going to be a significant win ... ??
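For context, the publicly reported clock bumps were modest ones; expressed as percentages they come out in the single digits:

[code=python]
# The publicly reported Xbox One clock increases, expressed as percentages.
cpu_before, cpu_after = 1.60, 1.75   # GHz
gpu_before, gpu_after = 800, 853     # MHz
print(f"CPU boost: {100 * (cpu_after / cpu_before - 1):.1f}%")  # ~9.4%
print(f"GPU boost: {100 * (gpu_after / gpu_before - 1):.1f}%")  # ~6.6%
[/code]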
 
I think they'll just accept that games will run at lower resolution on X1. I'm sure they knew that was going to happen. It isn't as if the games are going to be terrible looking. They'll look good. Just not as good as PS4.
To some extent, and being honest, what is the point of 1080p from their POV when they plan to have apps snapped to the left or right of the screen?
It makes sense.
 
50% more power does not equate to 50% more graphics; that would actually require an increase in power far greater than 50%.
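Rough numbers make the point, assuming per-pixel shading cost dominates: 50% more throughput buys roughly the jump from 900p to 1080p, or from 30 fps to 45 fps, neither of which reads as "50% better graphics" to the eye:

[code=python]
# What ~50% more GPU throughput roughly buys, assuming per-pixel cost dominates.
pixels_900p  = 1600 * 900     # 1,440,000
pixels_1080p = 1920 * 1080    # 2,073,600
print(f"900p -> 1080p is {pixels_1080p / pixels_900p:.2f}x the pixels")  # ~1.44x
print(f"or 30 fps -> {1.5 * 30:.0f} fps at a fixed resolution")
[/code]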
 
Let me just bring up a somewhat overlooked issue here, which is that so far it's only COD that's significantly underperforming, and there can be some arguments about BF4 (as it's a somewhat less important game and the performance delta is much smaller anyway).

On the other hand, MS still has a very pretty-looking game - at least in visuals, it's far better than what I was expecting. MS also has exclusivity for Titanfall, which also looks kinda cool (definitely better than COD Ghosts), and they do have a couple of promising developments as well.

So I'd say it's a bit too early to just write off graphics on the X1 completely.
 
Well, the problem with the COD performance is that it shouldn't be performing at that level. The game isn't really doing anything that impressive, certainly nothing that stands out as "next-gen". This should be making people wonder what's going to happen when the real next-gen games start showing up over the next couple of years.

The same goes for Killer Instinct, which looks every bit the X360 title (aside from some particle effects that serve no real purpose except to wow)... it's another one that has people scratching their heads over why on earth they weren't able to hit 1080p with it.

It's not the comparison that's the issue... it's that the system is apparently struggling with games that it should be able to practically sleep through.
 
A question about the Digital Foundry BF4 comparison. I think we can agree that the XBO footage is pretty bad, but how about the sharpness? I know that sharpening filters work off edge detection, and a higher-contrast image can provide better results. But is that sharpening filter something DICE added, or is it something that MS added to their scaler? If it's something DICE added, wouldn't it make sense to use it for both versions, since they're both using FXAA on top of scaling to 1080p?
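For what it's worth, whether it lives in the game or in the display hardware, a post-upscale sharpen is usually some variant of an unsharp mask, i.e. boosting the difference between the image and a blurred copy of itself, which is why it keys off edge contrast. A minimal sketch with Pillow; no claim that this matches what DICE or the XB1 scaler actually do, and the file names are placeholders:

[code=python]
# Generic post-upscale sharpen via unsharp mask (Pillow). Not a claim about
# what DICE or the XB1 display planes actually do -- just the usual technique.
from PIL import Image, ImageFilter

frame = Image.open("bf4_720p_frame.png").convert("RGB")   # placeholder file name
upscaled = frame.resize((1920, 1080), Image.BILINEAR)     # naive scale to 1080p

# radius = blur size, percent = edge boost strength, threshold = minimum contrast to touch
sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
sharpened.save("bf4_1080p_sharpened.png")
[/code]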

I've read a post written by a knowledgeable person on the Xbox One's scaler, and he suspects that the scaler is pretty advanced and is capable of allowing the Xbox One to render at 720p and have the pixel quality match or exceed native 1080p. Which to me screams BS; I mean, there is a limit, and I think 2.25x the pixels is past that limit. But I wanted to get your opinions on this train of thought.
 
I've read a post written by a knowledgeable person on the Xbox One's scaler, and he suspects that the scaler is pretty advanced and is capable of allowing the Xbox One to render at 720p and have the pixel quality match or exceed native 1080p.

There is a thing called the Nyquist theorem, which says a 720p source cannot carry as much information as a native 1080p source, let alone more.

In other words, you cannot later regain what you lose when you sample at 720p.
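Putting numbers on it: a 720p render has well under half the samples of a 1080p one, and sampling theory caps the spatial frequencies those samples can represent, so anything finer is gone before the scaler ever sees the frame.

[code=latex]
% Sample counts: 720p carries well under half the samples of native 1080p.
\[
  1280 \times 720 = 921{,}600
  \qquad
  1920 \times 1080 = 2{,}073{,}600
  \qquad
  \frac{2{,}073{,}600}{921{,}600} = 2.25
\]
% Nyquist: a signal sampled at rate $f_s$ can only represent frequencies up to
\[
  f_{\max} \le \frac{f_s}{2}
\]
% so spatial detail above the 720p limit is unrecoverable, no matter the scaler.
[/code]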
 
I've read a post written by a knowledgeable person on the Xbox One's scaler, and he suspects that the scaler is pretty advanced and is capable of allowing the Xbox One to render at 720p and have the pixel quality match or exceed native 1080p.
Bollocks, quite simply. The best non-realtime scalers in the world can't recreate the quality of an original. Anyone suggesting the scaler can exceed the quality of a native master is oozing fanboy honey. Heck, we even see the evidence in games - COD on XB1 doesn't look as good as, let alone better than, the 1080p version!

Having said that, and I stand by it, you have just made me consider whether an upscaler with the geometry buffer still available couldn't do a better job than a conventional upscaler. There are very good upscalers that can deal with solid geometry as if it were vectors, or you could even transform the G-buffer into a vector source that you upscale to preserve line quality. You could, potentially, upscale the geometry to generate clean edges and add back the surface detailing. It's 99.999% certain MS aren't doing this, but it's something to consider as a technology for consoles, like frame interpolation. Games afford far more options than upscaling 2D (3D with motion, time being the third dimension) TV content.
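As a toy version of that thought experiment, the simplest "geometry-aware" upscale would use something like an object-ID or depth buffer to avoid blending colour across a geometry edge, interpolating only within a surface. A hypothetical sketch, nothing like a production scaler:

[code=python]
# Toy "geometry-aware" upscale: use an object-ID buffer so colour is only
# interpolated within a surface, keeping geometry edges crisp.
# Hypothetical sketch, nothing like a production scaler.
import numpy as np

def id_guided_upscale(color, ids, scale):
    """color: (H, W, 3) float array, ids: (H, W) int array, integer scale factor."""
    h, w = ids.shape
    out = np.zeros((h * scale, w * scale, 3), dtype=color.dtype)
    for y in range(h * scale):
        for x in range(w * scale):
            sy, sx = y / scale, x / scale
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            target_id = ids[y0, x0]                      # ID of the covering source texel
            acc, wsum = np.zeros(3), 0.0
            for py, px, wgt in [(y0, x0, (1 - fy) * (1 - fx)), (y0, x1, (1 - fy) * fx),
                                (y1, x0, fy * (1 - fx)),       (y1, x1, fy * fx)]:
                if ids[py, px] == target_id:             # only blend within the same surface
                    acc += wgt * color[py, px]
                    wsum += wgt
            out[y, x] = acc / wsum if wsum > 0 else color[y0, x0]
    return out

# 2x2 source: left column is object 1 (red), right column is object 2 (blue)
ids = np.array([[1, 2], [1, 2]])
color = np.array([[[1.0, 0, 0], [0, 0, 1.0]],
                  [[1.0, 0, 0], [0, 0, 1.0]]])
print(id_guided_upscale(color, ids, scale=3)[0])  # no purple fringe at the edge
[/code]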
 