*split* multiplatform console-world problems + Image Quality Debate

That would only be an issue if devs had to keep their front buffers in eDRAM. Besides that sounding absolutely bizarre for a platform that can easily transfer buffers between eDRAM and GDDR3, developers have talked about using back/G-buffering schemes that take up nearly the entire 10MB (e.g. Halo Reach's 96bpp at 1152x720), so I'd be extraordinarily surprised if that were a real problem on 360.
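
For what it's worth, the arithmetic on that Reach example backs this up. A quick sketch below; the 1152x720 and 96bpp figures are from the post above, and the idea that they break down into three 32bpp targets is just an assumption about layout:

```python
# Rough check of the Halo: Reach figure quoted above. The 1152x720 and 96bpp
# numbers come from the post; reading the 96 bits as three 32bpp render
# targets is only an assumption about how they might be laid out.
width, height = 1152, 720
bits_per_pixel = 96

gbuffer_bytes = width * height * bits_per_pixel // 8
edram_bytes = 10 * 1024 * 1024  # Xenos eDRAM: 10 MB

print(f"{gbuffer_bytes / 2**20:.2f} MB of {edram_bytes / 2**20:.0f} MB eDRAM used")
# -> 9.49 MB of 10 MB eDRAM used, i.e. nearly the whole tile memory
```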

The bandwidth gap between the two memories is too large; going back to GDDR3 at render time looks like a big processing bottleneck to me.
 
The bandwidth gap between the two memories is too large; going back to GDDR3 at render time looks like a big processing bottleneck to me.

Huh? You can't read from eDRAM on the 360. To read or display it, you have to copy a buffer to GDDR3 when rendering completes.
 
The bandwidth gap between the two memories is too large; going back to GDDR3 at render time looks like a big processing bottleneck to me.
The FB is rendered in eDRAM. It is resolved (copied) out to GDDR3. The display chip reads from GDDR3. You can store as many framebuffers queued for output as you can fit in RAM; quintuple buffering, should you so choose.
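
Purely as an illustration of that flow, a toy model is below. This is not the real XDK API; every name in it is made up, and it only shows the order of operations being described:

```python
# Toy model of the 360 framebuffer flow described above: the GPU renders into
# the small eDRAM tile, the result is resolved (copied) into GDDR3, and the
# display hardware only ever reads from GDDR3. Names here are invented.
def render_frame(scene):
    return f"colour+depth for {scene}"      # written into the 10 MB eDRAM

def resolve(tile, gddr3_queue):
    gddr3_queue.append(tile)                # eDRAM -> GDDR3 copy at frame end

def scan_out(gddr3_queue):
    return gddr3_queue.pop(0)               # display chip reads GDDR3 only

queue = []                                  # as many queued buffers as fit in RAM
for frame in ("frame0", "frame1", "frame2"):
    resolve(render_frame(frame), queue)     # eDRAM is reused every frame
print(scan_out(queue))                      # scan-out never touches eDRAM
```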
 
MS's problem now is what to counter with. They've already lost this battle, the two biggest third party launch games look amazing on one platform, not so much on the other. It's a done deal.


I recently got to play the One version of Battlefield 4 here at the Metreon. While I agree that the PS4 version looks damn good and is just a tease better than the One version, resolution aside (if that matters to you), the One version looks pretty amazing as well, much like its counterpart. But I wouldn't play up the differences merely to downplay the other.
 
Another article on the 720p fiasco.

http://www.redgamingtech.com/xbox-one-esram-720p-why-its-causing-a-resolution-bottleneck-analysis/

It doesn't seem to say anything new but gets into the numbers more than most articles.
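
For anyone who doesn't want to wade through it, the basic budget math these articles lean on looks roughly like the sketch below, assuming a fairly typical deferred setup of four 32bpp render targets plus a 32bpp depth buffer; that layout is my assumption, not something the article specifies:

```python
# Rough eSRAM budget math, assuming a deferred setup of four 32bpp render
# targets plus a 32bpp depth/stencil buffer (5 x 4 bytes per pixel).
def footprint_mb(width, height, targets=5, bytes_per_target=4):
    return width * height * targets * bytes_per_target / 2**20

esram_mb = 32
for w, h in ((1280, 720), (1600, 900), (1920, 1080)):
    mb = footprint_mb(w, h)
    fits = "fits" if mb <= esram_mb else "does not fit"
    print(f"{w}x{h}: {mb:5.1f} MB -> {fits} in {esram_mb} MB eSRAM")
# 1280x720:  17.6 MB -> fits
# 1600x900:  27.5 MB -> fits
# 1920x1080: 39.6 MB -> does not fit
```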

What's interesting to me is whether this was a business decision or an engineering failure, not being able to display 1080p/60fps in a game like COD. With all their DirectX knowledge, they must have known they would have trouble with this early on. Did the engineering guys say "we won't do 1080p without high memory bandwidth" and the business and numbers guys say "no GDDR5 for you!", or did the engineering team want to try to be clever and see if they could get it done more efficiently? Where is Dean Takahashi?
 
Or they had to cut some corners elsewhere because they were going to spend a certain portion of the BOM on Kinect 2.
 
Or they had to cut some corners elsewhere because they were going to spend a certain portion of the BOM on Kinect 2.
The Kinect is taken into account in the $100 price difference versus the PS4. These are essentially $400 boxes on equal footing. In fact, I doubt Kinect 2 is anywhere near $100 BOM.
 
What's interesting to me is whether this was a business decision or an engineering failure, not being able to display 1080p/60fps in a game like COD. With all their DirectX knowledge, they must have known they would have trouble with this early on. Did the engineering guys say "we won't do 1080p without high memory bandwidth" and the business and numbers guys say "no GDDR5 for you!", or did the engineering team want to try to be clever and see if they could get it done more efficiently? Where is Dean Takahashi?

My guess is that the console dev tools maybe lagged behind, so COD Ghosts was primarily made on PC. Once the console dev tools came up to speed they basically ported the PC version over to the consoles. That would work fine on PS4, but likely didn't sit as well with the XB1, both due to less GPU grunt and due to eSRAM. With little time to re-architect memory use, the only fix possible to make launch was to drop to 720p.

Ultimately I don't think this was an engineering failure on Microsoft's part. They want more of a do-all machine that can break even at launch and scale down well on costs, and they were willing to go with less power this gen to meet that goal. Launch woes aside, the software side should get better. If the situation ended up being 1080p on PS4 and 900p on XB1, I honestly don't think they would mind, so long as they get those disposable-cash-spending casuals and hybrid casual/core people on their box. At the end of the day it's the cash they are chasing, which they have identified as more than just going after core gamers.

When you think about it, casuals and hybrid core/casuals won't be too affected, and graphics whores will be on pc anyways. They risk losing some of the core gamers but presumably they think it's a calculated risk that will financially pay off in the end.
 
Ahh, forgot about the contrast, but yeah that does make it stick out more. Regardless, it's no better than current gen.
Jaggies are a function of resolution. The X1 is at the same resolution, using the same (no) AA solution. Of course it's going to have the same jaggies.

The difference, however, will be in texture resolution, draw distance and pixel effects. In other words, what people call "pixel quality". If you have any trouble telling the difference between the two games, then something is horribly wrong, and it's not the hardware...
 
Jaggies are a function of resolution. The X1 is at the same resolution, using the same (no) AA solution. Of course it's going to have the same jaggies.

The difference, however, will be in texture resolution, draw distance and pixel effects. In other words, what people call "pixel quality". If you have any trouble telling the difference between the two games, then something is horribly wrong, and it's not the hardware...

Current gen is a bit lower at 1208x704 and PS360 use MLAA/FXAA.


I wasn't talking about the overall, just the jaggies.


Why does that XO shot have that horrible edge enhancement?? Same as the X1 shots...

XO= XBONE
 
In fact, I doubt Kinect 2 is anywhere near $100 BOM.
Why? It has a bleeding-edge TOF camera sensor that previously has only been available in devices costing thousands. I don't know what goes into making a TOF camera or device, so maybe it's just as cheap as a normal CCD, but the key issue for me is that I don't know. Thus I trust you have some insider knowledge telling you the tech is cheap! ;)
 
Infinity Ward respond to the Resolutiongate issue.

One of the greatest challenges the engineers have to deal with is memory management, or thread management. There are X number of threads in your CPUs. Where in those threads is the stuff that's Microsoft or Sony? Where does it fall? How does it work? We don't have the SDKs for those features yet, and then they come in and you go, okay, well it needs 3MB of RAM - oh, crap, we only allocated two! You can't just take a MB from anywhere. It's not like there's just tonnes of it just laying there. You have to pull it from something else. And now you have to balance that somewhere.

Mark Rubin: In a way. I don't know if I can point to one particular cause. Early on, we didn't know where exactly the resolution of anything would fall because we didn't have hardware or the software to support it. We tried to focus in on 1080p, and if we felt like we were on borderline of performance somewhere... We tried to make the best decision for each platform that gives you the best-looking game we could get and maintains that 60 frames a second.

It's very possible we can get it to native 1080p. I mean I've seen it working at 1080p native. It's just we couldn't get the frame rate in the neighbourhood we wanted it to be.

I definitely see/hope both platforms will look way better the next time we get a chance at it.

So they are hoping they can get to 1080p native on the XB1 later on in the cycle. He didn't sound very positive that they can. Perhaps it will take MS releasing resources to make this happen? It reads like there is a very real possibility that games that demand more of the system are going to axe Kinect support in order to free enough resources.
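
The budget problem Rubin describes up-thread ("it needs 3MB... we only allocated two") is essentially the situation sketched below. The pool names and sizes are invented purely for illustration:

```python
# Invented illustration of the fixed-budget problem Rubin describes: on a
# console the total is hard-capped, so growing one pool means shrinking
# another and then rebalancing whatever lived in it.
budgets_mb = {"textures": 2048, "geometry": 512, "audio": 256,
              "platform SDK features": 2}       # the "we only allocated two"

def grow(pool, extra_mb, steal_from):
    # There is no spare memory lying around; the extra has to come out of
    # some other pool's allocation.
    budgets_mb[steal_from] -= extra_mb
    budgets_mb[pool] += extra_mb

grow("platform SDK features", 1, steal_from="textures")  # SDK now needs 3 MB
print(budgets_mb)
```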
 