*split* multiplatform console-world problems + Image Quality Debate

I dunno, my impression was that they were mostly just listing issues with the dev environment. They never really mentioned stuff like weak hardware, lack of time, lack of publisher support or such.
 
Yes, you brought up "f**kup". ^_^

They have been rather politically correct in that article.
 
Why does that XO shot have that horrible edge enhancement?? Same as the X1 shots...
The bad AA algorithm results in a bright halo along the edges, e.g. check the inside of the cockpit above the black guy's head.

btw both the PS3 & Xbone look pretty crap IMO. Some people are saying this is the best gfx shown so far; well, that's pretty sad if that's the case.
 
The only surprising part is that the X1 version is 720p and not 900p. They weren't even close to 60fps at 1080p when they tried it, because they had to more than halve the pixels.

Wouldn't be shocking if the X1 version stays closer to 60fps than the PS4, because the X1 should be reeeally locked.

A little bit surprised the PS4 was allowed to go for that..
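
For what it's worth, here's the back-of-the-envelope pixel math behind the "more than halve" point, sketched in Python (standard 16:9 resolutions assumed, not figures from IW):

```python
# Pixel counts at the usual 16:9 resolutions (assumed figures, not quoted from IW).
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.0%} of 1080p)")
# 1080p: 2,073,600 (100%), 900p: 1,440,000 (69%), 720p: 921,600 (44%)
# 720p is well under half of 1080p, hence "more than halve the pixels".
```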
 
The difference in contrast caused by the black crush on the X1 capture makes it pretty much impossible to compare texture quality directly.
 
Crytek is working on Ryse, and yes it's 900p and 30fps, but it's also one of (if not the) most technologically advanced games today. That should make any dev tool issues even more problematic for them compared to the port of the yearly iteration of COD.

Wouldn't it also be fair to say that it is more like certain last-gen stadium/sports games, some of which managed 1080p@60fps? I mean, obviously there are some big differences in terms of effects and such, but it's not exactly an open-world game, and the kind of combat here, and the speed it plays out at, isn't exactly comparable to a Call of Duty game ... ?

Not that any of the recent CoD games ever really impressed me, other than by being 60fps and low-latency.
 
Another article on the 720p fiasco.

http://www.redgamingtech.com/xbox-one-esram-720p-why-its-causing-a-resolution-bottleneck-analysis/

It doesn't seem to say anything new but gets into the numbers more than most articles.

What's interesting to me is whether it was a business decision or an engineering failure that they couldn't hit 1080p/60fps in a game like COD. With all their DirectX knowledge, they must have known they would have trouble with this early on. Did the engineering guys say "we won't do 1080p without high memory bandwidth" and the business and numbers guys say "no GDDR5 for you!", or did the engineering team want to try to be clever and see if they could get it done more efficiently? Where is Dean Takahashi?
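
To give a flavour of the kind of numbers articles like that work through, here's a rough ESRAM budget sketch in Python; the five-render-target deferred layout and 32-bit formats are purely my assumptions, not anything Infinity Ward or the article confirms:

```python
# Back-of-the-envelope ESRAM budget: 4 colour targets + a depth buffer,
# 4 bytes per pixel each (the layout and formats are assumptions).
ESRAM_MB = 32

def render_target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}.items():
    total = 5 * render_target_mb(w, h)
    print(f"{name}: ~{total:.1f} MB of render targets vs {ESRAM_MB} MB of ESRAM")
# 720p: ~17.6 MB fits easily; 900p: ~27.5 MB is tight; 1080p: ~39.6 MB
# doesn't fit without tiling or spilling some targets out to DDR3.
```
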
That is an A-mazing article. It explains everything so well (I also listened to the audio of his YouTube video; well said), it is unbiased, and the guy knows his stuff. It is perhaps the best article on the matter to date, imho.

I honestly think Digital Foundry should hire the guy who wrote that article.
 
If the XB1 ends up being 900/60 vs 1080/60, few would care outside of forum people, but the fact that right now COD is closer to the 360 resolution than to the PS4's makes for a story that's being picked up by every tech publisher and even non-tech publishers. That would be an engineering failure in my mind. If they get there in the 2014 COD, the story might die. I don't know if they'll get there.

Yeah, but that could be just like the early 30fps/60fps games last gen; it really depends on the state of the dev tools. If the early games are largely PC ports, which I figure they will be since PC games are mostly what game makers have been focusing on while patiently waiting for the current generation to wear itself out, then those won't fare well when running purely on DDR3. I'd expect it to get better in the next rev of games, although from what I see on paper the XB1 will always be at a disadvantage to the PS4 regardless; it's just a question of how much. For the audience they are primarily going after, I don't think it will really matter for the most part.


Also, the PC argument isn't a good one to use. You always want to get the best hardware out of anything you purchase. Take the iPad, for instance. It'll never be as good as a PC, but Apple pushes really hard to make its AX chip essentially double in speed each year. And there's no more of a casual system than the iPad.

The PC argument is that if you are a PC gamer who also games on consoles, then graphics on consoles will not be an issue. That's because there is simply no way anything these consoles can do will impress any PC gamer; they are already slower than PC hardware from 2+ years ago. For this type of PC gamer, the console's priorities will be to play certain games with his friends or to get experiences not available to him on PC. The graphics disparity between the XB1 and PS4 won't be a big deal to this gamer.

The more imminent threat, which I presume is partly the one Microsoft is trying to address, is the casuals and hybrid core/casuals. Those guys are being attacked by iPads, tablets and so on; they spend money but aren't overly concerned with graphics. Microsoft is making a good play for them with the types of things that would interest that kind of gamer, yet would totally bore your typical core gamer. If they can just figure out cross-platform support, to where you buy apps/games once and use them on your XB1, tablet, PC, laptop and phone, then they are in a position to offer a value that no one else can. Then, in theory, in a few years while core gamers are still arguing over which console version renders a higher-quality shade of Van Dyke brown, casuals, core/casuals and PC gamers will be talking about how cool it is that they can use apps across all their Microsoft devices. To a large group of people, I'd argue there is far more value there than just talking about resolution.
 
The first footage of Ryse was 1080p, iirc... right or wrong? Not sure about this... but if true, it could point to some unexpected issues...

Don't know about "first" footage, but Crytek has presentations with Ryse imagery that state "1080p on a 7970".
 
Well, that was at the time of E3 or even earlier, right? They probably did not have anything close to final specs or silicon back then...
 
Mark Rubin has explained that, in the end, games are going to look very similar on the PS4 and the Xbox One, and that when MS frees some resources up, the performance of games on the Xbox One could change dramatically.

Dramatically seems to be such a strong word... 10% is not a big deal, imho.

"It's not just hardware physically, the amount of resources that each system is allowing the game developers to use isn't the same," Rubin revealed when questioned on why the Xbox One version of Call of Duty: Ghosts did not hit the same 1080p expectations as the PS4 offering.

"From our standpoint that's something that could change,” he added. “We might get more resources back at one point. And that could make things change dramatically for the Xbox One.

“It's a long complicated road that will take years to develop, and I think at the end we'll have games looking very similar, usually, on both systems."
Read more at:
 
The only surprising part is that the X1 version is 720p and not 900p. They weren't even close to 60fps at 1080p when they tried it, because they had to more than halve the pixels.

Wouldn't be shocking if the X1 version stays closer to 60fps than the PS4, because the X1 should be reeeally locked.

A little bit surprised the PS4 was allowed to go for that..
Seems like people are completely blaming the devs when that's not fair IMO. People did the same with bad PS3 ports, and PS3 owners brought up games like Uncharted as an example of what the system is capable of. Most people would tell PS3 owners that it's not a fair comparison at all, and now I'm seeing essentially the same arguments. The only thing I'm surprised about is that it's 1080p vs 720p. But still, I don't think the blame can be put on the devs alone.
Mark Rubin has explained that, in the end, games are going to look very similar on the PS4 and the Xbox One, and that when MS frees some resources up, the performance of games on the Xbox One could change dramatically.

Dramatically seems to be such a strong word... 10% is not a big deal, imho.

Read more at:
Of course he's going to say that they look very similar... he's not going to shit on MS.

And both systems will most likely free up some resources as the generation goes on.
 
...then they come in and you go, okay, well it needs 3MB of RAM - oh, crap, we only allocated two! You can't just take a MB from anywhere. It's not like there's just tonnes of it just laying there. You have to pull it from something else.

When you have several gigabytes in your system and your examples are in MBs, it leads me to believe that Mark Rubin is talking about ESRAM here. And within those constraints, they had a moving target of available space in that ESRAM. We know GPGPU code is being used for Kinect, and several people here have acknowledged ESRAM's potential usefulness for such applications, so it seems MS is using a good deal of the ESRAM for its own purposes.

source:
http://www.eurogamer.net/articles/2...all-of-duty-ghosts-dev-infinity-ward-responds
 
Isn't it also possible he meant GB instead? As in 3GB of the 8GB RAM that is being used by the OS?

Tommy McClain
 
Isn't it also possible he meant GB instead? As in 3GB of the 8GB RAM that is being used by the OS?

Tommy McClain

I don't think he would have slipped there. And I honestly don't think this game would be affected by having a few GB less of RAM; they already have quite robust streaming systems that work within half a GB, at close to the same resolution, on current gen. That makes me believe their engine would not be affected enough by those changes to warrant a resolution downgrade. That said, a few MB of change inside the 32MB ESRAM space could be much more devastating, depending on how they opted to use it.


edit: Oh, here's a GAF thread that claims COD uses under 2GB on PC. It may not be a valid benchmark but it's telling.
http://www.neogaf.com/forum/showthread.php?t=709124

So it cements my belief that he's talking about ESRAM.
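
To put rough numbers on why a few MB inside that 32MB would hurt far more than a few GB of main RAM, here's a quick sketch (the five 32-bit render targets at 900p are an assumed layout, not something IW has stated):

```python
# Headroom left in the 32 MB after a hypothetical 900p G-buffer
# (5 targets x 4 bytes/pixel -- an assumed layout, for illustration only).
def gbuffer_mb(w, h, targets=5, bytes_per_pixel=4):
    return w * h * bytes_per_pixel * targets / (1024 ** 2)

for available_mb in (32.0, 32.0 - 3.0):   # full ESRAM vs "3MB" taken by something else
    headroom = available_mb - gbuffer_mb(1600, 900)
    print(f"{available_mb:.0f} MB available -> {headroom:.1f} MB of headroom at 900p")
# ~4.5 MB of headroom drops to ~1.5 MB. Losing 3 MB of 32 MB is a ~9% cut,
# whereas the game reportedly uses under 2 GB on PC, so a few GB out of the
# 8 GB of DDR3 leaves plenty of room either way.
```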
 
I agree that it sounds like there is some eSRAM reservation. Would be interested to understand the amount and use case for it. I think the GPU timeslice could easily be given back to the title, with the worst case being a slower frame rate for the running title when the dash is up, which would likely be acceptable to most consumers. Dynamically taking eSRAM from the game would be more problematic, and so while game devs could likely plan for that and reduce resolution or something when another app is snapped via an API/notification, time may not have allowed for such a system to be fully vetted for launch.
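
On the timeslice point, the frame-time arithmetic is small but worth spelling out; the ~10% GPU reservation figure is the commonly reported one, not something confirmed in that interview:

```python
# Rough frame-budget impact of a hypothetical 10% GPU timeslice reservation at 60fps.
frame_ms = 1000.0 / 60.0           # ~16.7 ms per frame
reserved = 0.10                    # the commonly reported figure (assumption)
title_ms = frame_ms * (1.0 - reserved)
print(f"Title GPU budget: {title_ms:.1f} ms of {frame_ms:.1f} ms per frame")
print(f"Getting it back would be roughly a {reserved / (1.0 - reserved):.0%} GPU time boost")
# ~15.0 ms -> ~16.7 ms, i.e. about an 11% boost -- useful, but hardly "dramatic".
```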

Does COD: Ghosts even use deferred rendering? The whole thing is rather curious. Reducing resolution to achieve a target frame rate, which is what IW has claimed, means there are either bandwidth or rasterization constraints. There was a lot of talk about thread allocation in that interview as well, which is bunk in the context of having to lower res to get to 60fps. So they are still dancing around what's really happening, to the extent that I'm not sure the 3MB comment can be directly applied.
 
I agree that it sounds like there is some eSRAM reservation. Would be interested to understand the amount and use case for it.
I don't think so. It makes zero sense. The ESRAM is a working buffer - you'd no more reserve a slice of it for the OS than you'd reserve a slice of L2 CPU cache.
 
Maybe the 10% reservation extends to the eSRAM as well? It could be that the 10% is a full-system reservation: CPU, GPU, and eSRAM.
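
Running that idea against Rubin's "3MB" example, the numbers line up suspiciously well (pure speculation on my part):

```python
# Speculative: does a 10% reservation applied to the 32 MB of ESRAM
# land anywhere near the "3MB" figure Rubin used as an example?
esram_mb = 32
reservation = 0.10
print(f"{reservation:.0%} of {esram_mb} MB = {esram_mb * reservation:.1f} MB")   # 3.2 MB
```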
 
I don't think so. It makes zero sense. The ESRAM is a working buffer - you'd no more reserve a slice of it for the OS than you'd reserve a slice of L2 CPU cache.

Didn't Microsoft previously talk about how useful the eSRAM is in regards to GPGPU, with the example being the Kinect processing they do? Wouldn't that require an eSRAM reserve?
 