*split* multiplatform console-world problems + Image Quality Debate

There's still something hidden under the hood

Take a look at what Johan Andersson tweets
This is getting ridiculous. Surely whatever you think is being hidden was already used by repi and his team; they just cannot talk about it. It's more likely that MS has NDAs on everything because they don't want any technical details released, as none of them would really benefit their cause.

Two massive multiplatform games from big publishers, one with a strategic partnership with MS, are both considerably different from the PS4 versions, showing the gulf in power. These secret things are either so secret that MS has hidden them from devs as well, or they simply don't make the difference fanboys are hoping for.

As a betting man, I'd put money on the latter. Your mileage may vary.
 
Our experience with Battlefield 4 demonstrates that you can easily see the visual difference between them. The Xbox One version holds up well given the gulf in resolution, but it doesn't require a pixel counter to tell that the PS4 game is crisper and cleaner either. At last week's Battlefield 4 review event in Stockholm, we noted that the resolution change from one version to the next was obvious to many of the press in attendance, with some even suggesting on-site that the PS4 version was operating at native 1080p when its actual resolution was 1600x900.

People can bury their heads in the sand as much as they want, but that doesn't get around the fact that people will notice the difference even if they don't know what causes it. One does look substantially better than the other.

MS's problem now is what to counter with. They've already lost this battle: the two biggest third-party launch games look amazing on one platform and not so much on the other. It's a done deal.
 
BTW Beyond3D has an official frame rate thread, so apparently people can't tell frame rates either.

That's actually true to an extent. Many people would be hard pressed to see the difference between 60fps and 50fps (assuming screen tearing happened in a part of the screen where they wouldn't notice it). The majority would notice the difference between 60fps and 30fps, though some people don't, which is something I'll never understand. I think the main reason for a thread like that is that you'd otherwise never know a game's framerate until you bought it, and you could end up disappointed. Game trailers are rarely encoded at 60fps, so you can't see a true frame-for-frame representation of the game you're going to buy. The trailer would likely look the same even if one platform was 30fps and the other was 60fps.
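To put rough numbers on that (a quick back-of-envelope sketch, nothing platform-specific): perception tracks frame time, and the 60-to-50fps step is tiny compared with the 60-to-30 one.

```python
# Frame time is the reciprocal of frame rate; perceived smoothness tracks
# the change in frame time more than the raw fps numbers.
for fps in (60, 50, 30):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 -> 16.7 ms, 50 -> 20.0 ms, 30 -> 33.3 ms: dropping from 60 to 50 adds
# only ~3.3 ms per frame, while 60 to 30 doubles the frame time, which is
# why one is hard to spot and the other is obvious to most people.
```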

Screenshots, on the other hand, can give you a pretty good representation of image quality as long as they aren't bullshots. So you know what you're getting, in that regard.

Personally, I watched the BF4 vids on a 1080p screen and I can't see much of a difference except for more pronounced shimmering on the X1. YouTube compression of those vids probably hides some of the detail differences, but they really don't end up looking too far apart even though the PS4 version is pushing way more pixels.

The differences with COD might be a lot more apparent because there's a bigger gap in resolution. If they have a good AA solution, it might not be as noticeable as you'd think, depending on screen size and viewing distance. I think framerate is the bigger difference-maker here: if PS4 is a rock-solid 60fps and X1 is not, then that, to me, is the real performance win. Now, if they'd gone the route of having both at the same resolution but with way more visual complexity on PS4, that would also be an obvious improvement. It doesn't sound like that's what they've done.
 
It's a shame that the resolution difference is difficult to appreciate on YouTube, as there are still people claiming the Xbox One footage is equal to the PS4's. Most of us here can appreciate the difference by simply looking at the numbers, but Joe Gamer will judge from videos alone.

Now I watch YouTube on my phone, which has a 720p resolution, and I'm sure most others are doing much the same. So when I look at the videos, they look essentially the same (pixel crawl aside).

If they were to be compared on two 1080p screens side by side, the difference would be unquestionable.
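Some rough arithmetic backs that up (an illustrative sketch only; the "unique pixels" metric is a simplification I'm using here, and it ignores scaler quality, compression and subpixel detail):

```python
# How many unique source pixels each version can contribute to a display.
sources = {"BF4 on PS4 (1600x900)": 1600 * 900,
           "BF4 on XB1 (1280x720)": 1280 * 720}
displays = {"720p phone": 1280 * 720, "1080p TV": 1920 * 1080}

for src_name, src in sources.items():
    for disp_name, disp in displays.items():
        # A display can't show more unique pixels than the source supplies,
        # and downscaling throws the excess away.
        print(f"{src_name} on {disp_name}: "
              f"{min(src, disp) / disp:.0%} of the screen fed by unique pixels")
# On the 720p phone both versions hit 100%; on the 1080p TV it's 69% vs 44%.
```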

How do you know? From what I've seen, the current footage and pics from BF4 simply need to be thrown out the window.

Crushed blacks from a buggy output, and blur that no one can seem to explain, make BF4 comparisons useless.

I can't accept that the XB1 version won't have the full-RGB issues fixed, nor can I accept that the PS4 will natively render at 1080p but waste that effort with a nasty blur filter. What's the point of native 1080p if it has less clarity than an upscaled 720p image? Because people like smaller HUDs? It's all a giant clusterf*(k.

Furthermore, 1080p vs 720p might not be an argument for long. How long has Durango been in dev kits? Less than a year? So basically DICE has had about 11 months to port what's basically a PC version of BF4 over to the XB1, which has been mostly incomplete and unstable both hardware- and software-wise. If I were a dev, eSRAM and whatever funky or exotic features the XB1 has would have to wait, and a lower resolution with upscaling would be the solution I'd choose to make launch. If MS had a problem with that, it should have presented a more stable platform for launch titles.

The SPEs were rarely well exploited early on, so I expect the same from devs when it comes to eSRAM and the special accelerators on the XB1. Until devs get their heads around the Durango architecture, lower resolutions with upscaling versus the PS4 will be common.
 
You know, I guess all that talk of a "super-charged PC architecture" on Sony's side really seems to be exactly that. MS's talk about "balanced hardware" and whatnot, on the other hand, sounds more and more like plain PR in the light of recent events. I'm not trying to make a fanboy argument here and bash MS (since it seems they're doing a pretty damn good job of that themselves ;)). If anything, I'm more of an MS "believer", if you will. However, they've got people like Penello posting on that-other-forum that they "won't give up a 30%+ performance difference to Sony", "They invented DirectX", etc. At E3 they had presentations that were all about 1080p/60fps for CoD and BF4. And now? It all just seems really unflattering and outright stupid after what we know now.

In fact, it seems the super-charged PC architecture results in easier porting from PC to PS4, because the PS4 is basically "just" (not in a negative way) a beefed-up PC. You've got a single large pool of high-bandwidth RAM that works just as well for high-performance graphics as it most probably does for standard OS and application work. We also have to consider that those anonymous sources stating that "PS4 ports run at 1080p, 90fps, unoptimized" while "X1 ports run at 900p, 20-30fps" were probably true, or not far from reality, after all. And come to think of it, it actually makes a lot of sense: there just isn't as much need to optimize on the PS4 because the system's architecture is straightforward and doesn't require special treatment to run well. That wasn't the case last generation with the 360, and even less so with the PS3's Cell, and it looks like it still isn't the case with the X1. I think that last part is where the problem lies for MS, at least for now.

It was certainly clear to many, given the X1's leaked documents, that the system would have bottlenecks due to the use of DDR3 and the specialized ESRAM. I guess it just wasn't as clear how simple and straightforward the PS4 is by comparison.

However, it's still hard to comprehend how a game as technically unspectacular as CoD: Ghosts isn't able to achieve anything above 720p/60 on X1. The game is based on id Tech 3, although highly adapted and enhanced; as far as I know it's not a deferred-shading engine, so there shouldn't be any problem fitting the framebuffer into the ESRAM (of course, the problem then is still getting the texture data in, which is probably bound by the DDR3 bandwidth). It's also strange when you compare it to something like Forza 5, which is a forward renderer as well, yet is capable of reaching 1080p/60 with certainly good fidelity. Of course the games are quite different, but still, there are some similarities, and I guess there just wasn't enough time to optimize for the requirements of the X1.
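A quick back-of-envelope budget makes that point concrete (a sketch only: 32-bit colour and depth assumed, no MSAA, and the four-target G-buffer is a generic assumption rather than any particular engine's actual layout):

```python
ESRAM_MB = 32  # the XB1's fast on-chip memory

def target_mb(w, h, bytes_per_px=4):
    """Size of one render target at w x h, assuming 32-bit pixels."""
    return w * h * bytes_per_px / 2**20

for w, h in ((1920, 1080), (1600, 900), (1280, 720)):
    colour = depth = target_mb(w, h)
    forward = colour + depth        # forward renderer: colour + depth
    deferred = 4 * colour + depth   # deferred: assumed 4-target G-buffer + depth
    print(f"{w}x{h}: forward {forward:5.1f} MB, "
          f"deferred {deferred:5.1f} MB vs {ESRAM_MB} MB of ESRAM")
# A forward renderer's targets fit in ESRAM comfortably even at 1080p
# (~15.8 MB), whereas a fat 1080p G-buffer (~39.6 MB) would not.
```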

I think the devs will learn to cope with X1's architecture over the next year, and that should allow the gap to close to some degree, at least for multiplats, because looking at first-party titles the gap isn't that big to begin with. However, there's no denying that closing the gap as well as possible will not suddenly make the X1 the more powerful system; at some point the pure spec differences will still prevent parity (graphics- and compute-wise).

The big takeaway from this is probably not that the X1 requires a certain amount of optimization to shine, but probably that the PS4 simply does not.

I highly suspect that MS was caught completely off guard: what originally looked like a brilliant solution based on their predictions became relatively less attractive when Sony managed to implement a direct solution that proved more powerful. Assuming that GDDR5 wouldn't be a cost-efficient solution, MS chose the best custom solution they could think of, one they knew the competition most likely wouldn't come up with. Indeed, what MS claims is true to some extent: they might have had the performance advantage if Sony's bet hadn't borne fruit.

Going back to the leaks, we can suspect that Sony's two most likely solutions were either more DDR3 or less GDDR5.

MS knew they had the cheap, functional, proven and established solution of 8GB DDR3, combined with a custom ESRAM solution to boost performance.
They predicted Sony would either go DDR3 at 8GB max, or GDDR5 with less memory and a lesser custom solution, if any.

Sony was very lucky and got the best-case scenario of 8GB GDDR5, and everyone was surprised. Everyone, including the most knowledgeable forum members, thought this was most likely not feasible considering the costs.
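For reference, the published peak figures behind those bets work out roughly as follows (peak numbers only; sustained, real-world utilisation is lower on both machines):

```python
def peak_gb_s(transfers_per_s, bus_bits):
    """Peak bandwidth = transfer rate x bus width in bytes."""
    return transfers_per_s * bus_bits / 8 / 1e9

print(f"XB1 DDR3-2133, 256-bit bus:  {peak_gb_s(2133e6, 256):.1f} GB/s")
print(f"PS4 GDDR5-5500, 256-bit bus: {peak_gb_s(5500e6, 256):.1f} GB/s")
# ~68 GB/s vs ~176 GB/s. The XB1's 32MB of ESRAM adds on the order of
# another 100-200 GB/s, but only for whatever fits in those 32MB.
```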
 
An unbiased analysis of the tech would conclude that (ignoring a variety of other factors). That's not the same thing as an analysis of how the game looks visually: tech graphics and visuals aren't synonymous. The utility of high-end tech graphics is ONLY to provide the end user with as nice a set of visuals as possible. When pixels are (evidently) no longer the limiting factor for IQ and clarity, other things need to be considered as well (i.e. AA, AF, no tearing, contrast ratios, color spectrum). Do those not count for some reason? The objective answer is that they DO count; we just aren't proficient at quantifying them in our common discourse on the topic.

IQ has been conflated with resolution for years now by common forumites. It's one thing if the assets are held back by the frame res. It's quite another when that's not the case, though, especially in light of the major advantages one version might have in those other areas I noted.

From the Resolutiongate article: The Xbox One version holds up well given the gulf in resolution, but it doesn't require a pixel counter to tell that the PS4 game is crisper and cleaner either.
 
Going back to the leaks, we can suspect that Sony's two most likely solutions were either more DDR3 or less GDDR5.
From Cerny's interviews, they initially considered 128-bit GDDR5 plus internal SRAM versus 256-bit GDDR5. They never considered DDR3. The irony is that they would have been stuck with 4GB with the earlier design, the die would have been too big to allow 18 CUs, and it would have been difficult to code for.
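The trade-off is easy to see in numbers (a sketch assuming the same 5.5 GT/s GDDR5 on both bus widths):

```python
pin_rate = 5.5e9  # GDDR5 transfers per second per pin, assumed for both designs

for bus_bits in (128, 256):
    print(f"{bus_bits}-bit GDDR5: {pin_rate * bus_bits / 8 / 1e9:.0f} GB/s peak")
# 88 GB/s vs 176 GB/s: the narrower bus would have needed fast on-chip
# SRAM to compensate, much like the XB1's ESRAM arrangement.
```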
 
How do you know?

Because the difference between 1920x1080 (or indeed 1600x900) and 1280x720 is huge. We're not talking about the small but still noticeable differences current gen has. We're looking at an enormous 56-125% increase in pixel count. That's getting on for a generational difference.
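The raw counts, for anyone who wants to check the arithmetic:

```python
base = 1280 * 720  # 720p as the baseline

for name, (w, h) in {"720p": (1280, 720), "900p": (1600, 900),
                     "1080p": (1920, 1080)}.items():
    px = w * h
    print(f"{name}: {px:>9,} pixels ({(px / base - 1):+.0%} vs 720p)")
# 900p carries +56% and 1080p +125% more pixels than 720p -- a far wider
# spread than typical current-gen multiplatform differences.
```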

Also, those who have seen the two games have testified to how much better the PS4 version looks. This is not something that can be ignored.

Microsoft have dropped the ball in the worst possible terms. They have fudged the hardware and the messaging from day one, and there's no turning back now. I was excited to see what they had, firmly believing they'd outdo Sony; history has shown that they have in the past. For me, as a gamer that's a bit of a graphics whore, Microsoft have disappointed me to a ridiculous level. I'm not going to pay for a machine that's this underpowered.

Sorry if this shouldn't be said in this thread. Mods, feel free to remove.
 
I think the issue is not really technical; Pachter actually got it right a while ago: it's about the price and how Kinect affects it. Lots of people are not sold on its usefulness; had the XO launched at $399 with Kinect, people would have thought it came for free. As it is, the system is more expensive, lots of people still believe they don't need Kinect or don't care, and the system actually underperforms relative to the PS4.

The performance of the XO is fine; the price is not, and the perceived value may not be there in the eyes of lots of core gamers.
 
I doubt the complexity of ESRAM is really that different from EDRAM. Maybe they have to manage it themselves at the minute, but I'm sure it's not a foreign concept to devs of the standing we're talking about.

Linking it to the problems with Cell is a stretch too far in my opinion, and an attempt to clutch at straws, as is the implied notion that the PS4 is going to stand still.

When devs start hitting the low-level GPU access beyond the DX capabilities, the gap is likely to widen even further.
 
I highly suspect that MS was caught completely off guard: what originally looked like a brilliant solution based on their predictions became relatively less attractive when Sony managed to implement a direct solution that proved more powerful. Assuming that GDDR5 wouldn't be a cost-efficient solution, MS chose the best custom solution they could think of, one they knew the competition most likely wouldn't come up with. Indeed, what MS claims is true to some extent: they might have had the performance advantage if Sony's bet hadn't borne fruit.

Going back to the leaks, we can suspect that Sony's two most likely solutions were either more DDR3 or less GDDR5.

MS knew they had the cheap, functional, proven and established solution of 8GB DDR3, combined with a custom ESRAM solution to boost performance.
They predicted Sony would either go DDR3 at 8GB max, or GDDR5 with less memory and a lesser custom solution, if any.

Sony was very lucky and got the best-case scenario of 8GB GDDR5, and everyone was surprised. Everyone, including the most knowledgeable forum members, thought this was most likely not feasible considering the costs.

I'd like to think it's the tools, as Albert made some pretty strong statements about what we would see at launch, but that doesn't seem to match up with what's being reported by those who've seen the games. Someone should ask him about it on Twitter: does he want to revise his comments, or does he stand behind them?

Going back and rereading the MS articles on DF and their latest piece certainly raises questions. It will be interesting to see what things look like in a year or so, when developers have had time to familiarize themselves with both platforms and are not under as much pressure as they are now, trying to launch games, support legacy platforms and work with incomplete development tools.
 
I highly suspect that MS was caught completely off guard: what originally looked like a brilliant solution based on their predictions became relatively less attractive when Sony managed to implement a direct solution that proved more powerful. Assuming that GDDR5 wouldn't be a cost-efficient solution, MS chose the best custom solution they could think of, one they knew the competition most likely wouldn't come up with. Indeed, what MS claims is true to some extent: they might have had the performance advantage if Sony's bet hadn't borne fruit.

Going back to the leaks, we can suspect that Sony's two most likely solutions were either more DDR3 or less GDDR5.

MS knew they had the cheap, functional, proven and established solution of 8GB DDR3, combined with a custom ESRAM solution to boost performance.
They predicted Sony would either go DDR3 at 8GB max, or GDDR5 with less memory and a lesser custom solution, if any.

Sony was very lucky and got the best-case scenario of 8GB GDDR5, and everyone was surprised. Everyone, including the most knowledgeable forum members, thought this was most likely not feasible considering the costs.
This is speculation, but I don't think 4GB of GDDR5 would have hurt the PS4's ability to render at 1080p, because the move to 8GB was supposedly a last-minute decision and COD was in development longer than that. The extra 4GB is icing for the PS4.

Also, it makes me wonder if MS would have been better off going with a more PC-like split pool of 8GB DDR3 + 2GB GDDR5. ESRAM/EDRAM just seems like a bad solution, especially if they want forward compatibility with future consoles.
 
It's far too early to declare that there's no turning back. Sony was in a very bad-looking situation with the PS3, with a hefty delay to market on top of it. They managed to pick themselves up quite well in the end, although they did fall behind significantly in the US market.

It is also too early to tell whether the current deficiencies in visuals are redeemable later or not.
If from Q4 2014 every multiplatform game looked like KZ:SF on the PS4 and like Ryse on the X1 (so, roughly 1080p vs 900p), then I think everyone could be happy, both the customers and MS itself. Sales would definitely not be affected by such a difference.

If however things remain as they are now with BF, or get worse and we get 720p vs 1080p for the rest of the generation, then there will most likely be consequences. The severity of those is, however, once again hard to judge, seeing how PS3 users were happy enough with their versions of COD, GTA, AC and BF, to name some of the most popular multiplatform titles. Oh, and to be honest, I can't really consider either BF4 or COD: Ghosts to be more than cross-platform upscaled titles anyway.

Not to mention the platform exclusives: we still haven't seen much from Quantum Break or Infamous, and nothing from the new Halo or ND's games. Those can still have a strong effect on which console people choose to go with.
 
This is speculation, but I don't think 4GB of GDDR5 would have hurt the PS4's ability to render at 1080p, because the move to 8GB was supposedly a last-minute decision and COD was in development longer than that. The extra 4GB is icing for the PS4.

Also, it makes me wonder if MS would have been better off going with a more PC-like split pool of 8GB DDR3 + 2GB GDDR5. ESRAM/EDRAM just seems like a bad solution, especially if they want forward compatibility with future consoles.
4GB may not have affected 1080p60, but it may have had other effects, like lower-res textures etc.

As for future compatibility, we have no idea what they may come up with; in fact they may take the ESRAM model further, with faster and more of it, based on what they're seeing three or so years down the line.
I doubt they would base it on launch titles. ;)
 
Because the difference between 1920x1080 (or indeed 1600x900) and 1280x720 is huge. We're not talking about the small but still noticeable differences current gen has. We're looking at an enormous 56-125% increase in pixel count. That's getting on for a generational difference.

Also, those who have seen the two games have testified to how much better the PS4 version looks. This is not something that can be ignored.

Microsoft have dropped the ball in the worst possible terms. They have fudged the hardware and the messaging from day one, and there's no turning back now. I was excited to see what they had, firmly believing they'd outdo Sony; history has shown that they have in the past. For me, as a gamer that's a bit of a graphics whore, Microsoft have disappointed me to a ridiculous level. I'm not going to pay for a machine that's this underpowered.

Sorry if this shouldn't be said in this thread. Mods, feel free to remove.

The pixel difference can be considered huge, but a 50% difference in pixels doesn't mean there's a 50% difference in image quality. Going from 720p to 1080p doesn't turn water into wine. Plus, there won't really be a raw pixel difference on screen: this won't be 1080p vs 720p, but native 1080p vs upscaled 1080p, with a difference in clarity.

Furthermore, it's not a question of how well you and I can appreciate the difference; it's how well the general public does. And for the most part, outside of core gamers, reference images aren't used to qualify image-quality differences.

In my opinion, 1080p versus 900p-upscaled-to-1080p versus 720p-upscaled-to-1080p is a much safer environment for MS to operate in, in contrast to devs choosing the same native resolution and then upping quality in other metrics: a circumstance where the PS4 game runs at 720p or 900p upscaled to 1080p at High/Ultra-type settings while the XB1 runs the same resolution at medium settings.
 
Does it imply the same amount of AA and various effects for both versions, but different rendering resolutions?
 
Plus, there won't really be a raw pixel difference on screen: this won't be 1080p vs 720p, but native 1080p vs upscaled 1080p, with a difference in clarity.

You do understand that 720p on a native 720p TV looks better than 720p upscaled to 1080p? People keep talking about upscaling as though it's magic and some sort of substitute for real information.
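A toy example of that limit (a minimal sketch using nearest-neighbour scaling; real scalers filter more smartly but face the same information barrier):

```python
import numpy as np

# A "native" high-detail signal: a fine one-pixel alternating pattern.
native = np.tile([0, 255], 8)   # 16 samples

# Render at half resolution: sampling every other pixel lands on the dark
# samples only, so the fine pattern aliases away entirely.
low = native[::2]               # 8 samples, all zero

# Upscale back to full size by repeating samples (nearest neighbour).
upscaled = np.repeat(low, 2)    # right pixel count, but the detail is gone

print("native:  ", native)      # 0 255 0 255 ...
print("upscaled:", upscaled)    # all zeros: scaling can't invent lost detail
```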
 
It's far too early to declare that there's no turning back. Sony was in a very bad-looking situation with the PS3, with a hefty delay to market on top of it. They managed to pick themselves up quite well in the end, although they did fall behind significantly in the US market.
When the PS3 launched, we all knew it had a difficult architecture and room to grow based on its first showings. People aren't feeling that with the XB1, which looks a lot more straightforward. Now, I for one think it has a fair bit of headroom versus these launch titles, but I can understand Joe Public thinking there's not enough hardware difference to justify the shortcomings when they're seeing 33% fewer CUs in action. It's not really comparable to the quite different architectures of last gen (I suppose it's not really last gen until Nov 15th, but I'm getting in early).
 