I dunno, my impression was that they were mostly just listing issues with the dev environment. They never really mentioned stuff like weak hardware, lack of time, lack of publisher support, or the like.
Bad AA algorithm results in a bright halo along the edges, e.g. check above the black guy's head, inside the cockpit.

Why does that XO shot have that horrible edge enhancement?? Same as the X1 shots...
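For what it's worth, that kind of bright halo is the classic signature of unsharp-mask style sharpening: the filter overshoots on both sides of a hard edge. A toy 1D illustration (my own sketch, not whatever scaler/post chain the XO actually uses):

```python
import numpy as np

# Toy 1D "image": a hard dark-to-bright edge.
signal = np.array([0.0] * 6 + [1.0] * 6)

# Cheap 3-tap box blur, edges clamped.
kernel = np.ones(3) / 3.0
blurred = np.convolve(np.pad(signal, 1, mode="edge"), kernel, mode="valid")

# Unsharp mask: add back the high-frequency difference, scaled up.
amount = 1.5
sharpened = signal + amount * (signal - blurred)

print(sharpened.round(2))
# Values overshoot past 1.0 on the bright side of the edge and dip
# below 0.0 on the dark side -- that over/undershoot is the halo.
```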
Crytek is working on Ryse, and yes it's 900p and 30fps, but it's also one of (if not the) most technologically advanced games today. Which should make any dev tool issues even more problematic for them compared to the port of the yearly iteration of COD.
That is an a-mazing article. It explains everything so well (I also listened to the audio of his YouTube video, well said), it is unbiased and the guy knows his stuff. It is perhaps the best article on the matter to date, imho.

Another article on the 720p fiasco:
http://www.redgamingtech.com/xbox-one-esram-720p-why-its-causing-a-resolution-bottleneck-analysis/
It doesn't seem to say anything new but gets into the numbers more than most articles.
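For anyone who doesn't want to wade through it, the back-of-envelope version of the ESRAM argument goes like this (my numbers, assuming 32-bit render targets and a deferred setup with ~5 surfaces; real engines vary):

```python
BYTES_PER_PIXEL = 4          # assuming 32-bit (e.g. RGBA8) render targets
ESRAM_MB = 32                # Xbox One's ESRAM pool

def targets_mb(width, height, count=1):
    return width * height * BYTES_PER_PIXEL * count / (1024 ** 2)

# Say 4 G-buffer targets plus a depth buffer = 5 surfaces
# (my assumption -- engines differ):
for w, h, label in [(1920, 1080, "1080p"), (1280, 720, "720p")]:
    print(f"{label}: {targets_mb(w, h, count=5):.1f} MB of {ESRAM_MB} MB ESRAM")

# 1080p: ~39.6 MB -> doesn't fit, something spills out to slower DDR3
# 720p:  ~17.6 MB -> fits with room to spare
```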
What's interesting to me is whether failing to hit 1080p/60fps in a game like COD was a business decision or an engineering failure. With all their DirectX knowledge, they must have known early on that they'd have trouble with this. Did the engineering guys say "we won't do 1080p without high memory bandwidth" and the business and numbers guys say "no GDDR5 for you!", or did the engineering team want to try to be clever and see if they could get it done more efficiently? Where is Dean Takahashi?
If the XB1 ends up being 900/60 vs 1080/60, few would care outside of forum people, but the fact that right now COD on the XB1 is closer to the 360's resolution than to the PS4's makes for a story that's being picked up by every tech publication and even non-tech ones. That would be an engineering failure in my mind. If they get there in the 2014 COD, the story might die. I don't know if they'll get there.
Also, the PC argument isn't a good one to use. You always want to get the most out of whatever hardware you purchase. Take the iPad for instance. It'll never be as good as a PC, but Apple pushes really hard to essentially double their AX chip's speed each year. And there's no more casual a system than the iPad.
The first footage of Ryse was 1080p, iirc... right or wrong? Not sure about this... but if true, it could point to some unexpected issues...
"It's not just hardware physically, the amount of resources that each system is allowing the game developers to use isn't the same," Rubin revealed when questioned on why the Xbox One version of Call of Duty: Ghosts did not hit the same 1080p expectations as the PS4 offering.
"From our standpoint that's something that could change,” he added. “We might get more resources back at one point. And that could make things change dramatically for the Xbox One.
“It's a long complicated road that will take years to develop, and I think at the end we'll have games looking very similar, usually, on both systems."
Seems like people are completely blaming the devs when that's not fair IMO. People did the same with bad PS3 ports, and PS3 owners brought up games like Uncharted as an example of what the system is capable of. Most people would tell PS3 owners that it's not a fair comparison at all, and now I'm seeing essentially the same arguments. The only thing I'm surprised about is that it's 1080p vs 720p. But still, I don't think the blame can be put on the devs alone.

The only surprising part is that the X1 version is 720p and not 900p. They weren't even close to 60fps at 1080p when they tried it, because they had to more than halve the pixels (quick arithmetic below). Wouldn't be shocking if the X1 version sticks closer to 60fps than the PS4's, because the X1 version should be reeeally locked. A little bit surprised the PS4 version was allowed to go for that...
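Quick arithmetic on the "more than halve the pixels" point:

```python
full_hd = 1920 * 1080  # 2,073,600 pixels
hd = 1280 * 720        #   921,600 pixels
print(hd / full_hd)    # ~0.444 -- 720p is well under half the pixels of 1080p
```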
Of course he's going to say that they look very similar... he's not going to shit on MS.

Mark Rubin has explained that in the end games are going to look very similar on the PS4 and the Xbox One, and that when MS frees some resources up, the performance of games on the Xbox One could change dramatically.
"Dramatically" seems like such a strong word... 10% is not a big deal, imho.
How is it horrible if it makes some think it's better?
...then they come in and you go, okay, well it needs 3MB of RAM - oh, crap, we only allocated two! You can't just take a MB from anywhere. It's not like there's just tonnes of it just laying there. You have to pull it from something else.
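Whatever the unit, the problem he's describing is fixed up-front budgeting: the pool gets carved into slices at init time, so a late request has to rob another subsystem. A toy illustration (names and numbers made up, not any real SDK API):

```python
class FixedPool:
    """Toy fixed-budget allocator -- purely illustrative."""
    def __init__(self, total_mb):
        self.total = total_mb
        self.used = {}

    def reserve(self, name, mb):
        free = self.total - sum(self.used.values())
        if mb > free:
            raise MemoryError(f"{name}: needs {mb} MB, only {free} MB left")
        self.used[name] = mb

esram = FixedPool(32)
esram.reserve("gbuffer", 24)
esram.reserve("shadow_map", 6)
esram.reserve("new_effect", 3)  # oh, crap -- raises MemoryError, only 2 MB left
```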
Isn't it also possible he meant GB instead? As in 3GB of the 8GB RAM that is being used by the OS?
I don't think so. It makes zero sense. The ESRAM is a working buffer - you'd no more reserve a slice of it for the OS than you'd reserve a slice of L2 CPU cache.

I agree that it sounds like there is some eSRAM reservation. Would be interested to understand the amount and use case for it.