NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

I don't think Microsoft is listening to devs anymore. It's probably true that they have a more casual-market mindset. Just think about it: if they listened to devs, I doubt we would be hearing that Epic Games is dropping their voxel lighting solution. The only reason they'd do that is underwhelming specs. This sucks, because over 70% of high-budget games use Unreal Engine, so if that engine suffers, a lot of games will too.
Surely Epic knew the final specs of Durango 6 months ago when they announced UE4, so why would they make a big deal about voxel global illumination knowing they would have to cut it?
 
Nope. I am talking about providing a PC-like environment for enthusiasts, where the higher SKUs play the same game from the same disc as the lower-end SKUs, just at higher fidelity.

If you haven't noticed, the proposed hardware is catered to smaller form factors like tablets and small laptops, where passive cooling usually suffices.

That said...

Is the console/set-top box world ready for the iPhone model, where there's always a $399 version, a $299 version, and a $149 version, and all SKUs make money? Every two years they come out with a new model to take the place of the top model, and everything gets bumped down one level.

Apple's pricing doesn't change across their entire lineup; they just continually give "more" instead of charging less for their SKUs. Couple this with a subscription subsidy...
 
That voxel-cutting thing is just rumor/speculation. We've heard no actual statement from Epic confirming it to be the case, as far as I know.

I think console launches regularly bring out the worst in the atmosphere surrounding the industry. When the dust settles, both consoles will be getting about the same support from third parties regardless of their power disparity.
 
Surely Epic knew the final specs of Durango 6 months ago when they announced UE4, so why would they make a big deal about voxel global illumination knowing they would have to cut it?

FWIW, I don't think they cut it based on next-gen specs; it's more likely impractical or not worth the cost even on high-end PCs.
If you look at the demos, they were all in very constrained environments.
 
I want to ask you an honest question, publicly, but I really wonder if you will respond. What is your opinion on the differences between Orbis and Durango? Which do you prefer? Why? I understand your unwillingness to talk, but as far as I'm concerned, people deserve a drop in the pond from the best insider in the business :).
I couldn't really comment even if I knew anything about the Sony machine. Things I say that look like they contain real information tend to get appropriated.
 
Well, hopefully they have a better alternative, because the articles I read made it look amazing and revolutionary for devs. Square Enix already licensed UE4; I wonder if they knew about this problem. I'm sure it was part of the agreement. I guess we'll know more at GDC.
 
Can someone help me out here.. so in simple terms, MS opted for more, cheaper RAM, and to offset the bandwidth penalty swapped CUs/ROPs for ESRAM.

Is it reasonable to assume that the CPUs in PS4 and Xbox3 are roughly the same?
And is it also reasonable to assume that the GPU die sizes of PS4 and Xbox3 are roughly the same (guessing here: 32MB ESRAM vs. 6 CUs and 4 ROPs)?

The only real difference from a BOM perspective would be main RAM.

I've seen a number of posts about Sony implementing boutique memory interfaces to bump their system from 4 to 8GB, but that seems impractical.

Isn't Xbox the one trailing here and possibly needing to do more? How difficult would it be for MS to make a late change to 16GB? And could that enable new scenarios/experiences that might be a differentiator?
 
That said...

Is the console/set-top box world ready for the iPhone model, where there's always a $399 version, a $299 version, and a $149 version, and all SKUs make money? Every two years they come out with a new model to take the place of the top model, and everything gets bumped down one level.

Apple's pricing doesn't change across their entire lineup; they just continually give "more" instead of charging less for their SKUs. Couple this with a subscription subsidy...

I think the mainstream will be happy with whatever SKU they purchased, as long as it's supported for a traditional console lifespan (5-6 years). But who among us is going to resist extra hardware that's not a mere peripheral and that bolsters the primary features of gaming: gameplay and visuals? That comes with the caveat that the only fragmentation is in the experience itself, with no loss of plug-and-play, and with the library remaining contiguous.

I really like the idea of a console serving as the base configuration of a platform that provides greater performance through upgrades at an accelerated pace, while still maintaining a 7-8 year transition to totally new hardware, and where the upgrades provide a level of performance that's worth the additional investment. I am all for that reality. Hardware upgrades done in a way that's attractive to us as gamers are the missing revenue that MS and Sony need to create a healthy ecosystem.
 
Can someone help me out here.. so in simple terms, MS opted for more, cheaper RAM, and to offset the bandwidth penalty swapped CUs/ROPs for ESRAM.

Is it reasonable to assume that the CPUs in PS4 and Xbox3 are roughly the same?
And is it also reasonable to assume that the GPU die sizes of PS4 and Xbox3 are roughly the same (guessing here: 32MB ESRAM vs. 6 CUs and 4 ROPs)?

The only real difference from a BOM perspective would be main RAM.

I've seen a number of posts about Sony implementing boutique memory interfaces to bump their system from 4 to 8GB, but that seems impractical.

Isn't Xbox the one trailing here and possibly needing to do more? How difficult would it be for MS to make a late change to 16GB? And could that enable new scenarios/experiences that might be a differentiator?

8GB is already overkill on the 720 (hence the rumored 3GB OS reservation sounding probable); 16GB isn't even remotely logical.
On Orbis, upping it to 8GB might have a bit more merit, as there is probably the bandwidth to utilize it.
Just look at the bandwidth available per frame to understand why.
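
As a rough illustration of the point, here's a back-of-the-envelope sketch in Python. The 170GB/s Durango figure is the rumored aggregate quoted elsewhere in this thread; the 176GB/s Orbis figure is my assumption based on the common rumor, not a confirmed spec:

# How much memory can even be touched in a single frame, given peak bandwidth.
# Both bandwidth figures are rumored/assumed, not confirmed specs.
for name, bw_gbs in [("Durango", 170.0), ("Orbis", 176.0)]:
    for fps in (30, 60):
        print(f"{name} @ {fps}fps: {bw_gbs / fps:.1f} GB per frame")
# At 30fps Durango tops out around ~5.7 GB per frame, so even an 8GB pool
# can never be fully read in one frame; 16GB would be further out of reach.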
 
People probably should stop using "PR metrics" from MS's marketing slides and "performance estimates" given to insiders as a baseline of Durango's performance.

From a pure numbers angle Durango is *not* 6-8x faster. Architecture aside: fillrate is about 3x (4 vs. 12.8 Gpixels), texturing is about 5x (8 vs. 38 Gtexels), flops are about 5x (240 vs. 1200 GFLOPs), triangles are about 3x (500M vs. 1600M triangles/sec), and aggregate bandwidth, counting only the bandwidth to the eDRAM and not the internal bandwidth, is something less than 4x (54 vs. 170GB/s). Memory footprint is over a 6-8x jump; once you count the OS it is about 10x.

Durango is mostly in the 3x-5x range of raw peak performance improvement.

Before the numbers are poo-poo'd with "efficiency of modern GPUs," I would note that Durango also has to perform at 1080p, counter diminishing returns, cope with GPU performance not increasing linearly with raw specs (i.e. workflow issues), pay for more expensive shaders to get better results, etc., all of which balance that out--especially at higher resolutions. So any appeal to "it will look 6-8x better" (whatever THAT means) is easily cut in half by the increased resolution, which brings it back in line with 3x-5x or less in terms of "realized" potential.
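
For reference, a quick Python sketch that reproduces those multipliers from the raw numbers quoted above (all of them rumored figures, not official specs):

# 360 vs. Durango raw peak figures as cited in this post (rumored).
specs = {
    "fillrate (Gpixels/s)":  (4.0,   12.8),
    "texturing (Gtexels/s)": (8.0,   38.0),
    "shader (GFLOPs)":       (240.0, 1200.0),
    "triangles (M/s)":       (500.0, 1600.0),
    "bandwidth (GB/s)":      (54.0,  170.0),
}
for metric, (x360, durango) in specs.items():
    print(f"{metric}: {durango / x360:.1f}x")
# Every line lands in roughly the 3x-5x range -- nowhere near 6-8x.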

The good news is Cape Verde seems quite capable of playing games like BF3, Batman AC, Dirt 3, etc. at relatively high settings (minus AA) near 1080p at 30Hz, so indeed Durango can play this generation's worth of content the way mid-to-low-range PCs currently do.

I don't think you'll see many, if any, 1080p games. Dynamic resolution is looking like it'll be a common feature if the display planes patent has anything to do with this box. They're not gonna waste all the GPU power just to get a 1080p resolution bump over the previous gen, which didn't even hit 720p with consistently stable framerates. I wouldn't cry too much until you actually see the games. That said, PS4 should be noticeably better than Xbox in the visuals department. We'll see if the price is noticeably different as well.
 
Can someone help me out here.. so in simple terms, MS opted for more, cheaper RAM, and to offset the bandwidth penalty swapped CUs/ROPs for ESRAM.

Is it reasonable to assume that the CPUs in PS4 and Xbox3 are roughly the same?
And is it also reasonable to assume that the GPU die sizes of PS4 and Xbox3 are roughly the same (guessing here: 32MB ESRAM vs. 6 CUs and 4 ROPs)?

The only real difference from a BOM perspective would be main RAM.

I've seen a number of posts about Sony implementing boutique memory interfaces to bump their system from 4 to 8GB, but that seems impractical.

Isn't Xbox the one trailing here and possibly needing to do more? How difficult would it be for MS to make a late change to 16GB? And could that enable new scenarios/experiences that might be a differentiator?

16 gigabytes? Are you serious? That's not even remotely logical for a console coming out in 2013, especially with the bandwidth it does have.

And there's no real need for it: considering the GPU, the RAM and bandwidth it has now are perfectly aligned.

Now, Sony could upgrade their specs to 8GB (although that would be pretty expensive), but that at least is more plausible if they want to be the most powerful in all ways. Their GPU is more powerful, and has the ROPs and CUs to utilize it without being starved.

What these specs will be able to do with games in a closed box should make everyone excited for the possibilities coming up.
 
I find this hilarious, because everyone was drooling over Cell, its theoretical TFLOPs, and how it was a supercomputer, yet it was the 360 that had the most relevant technology: it was the first piece of mainstream hardware released with a unified shader GPU, which dominates GPU tech now.

The 360 GPU indeed crossed the generation boundary from specialized to unified shaders.
At the same time, Cell spearheaded the heterogeneous computing that next-gen consoles and AMD's modern GPUs are entering as we speak.
 
You have zero idea how large these next-generation machines are going to be. At the very least, they will be around the size of launch PS3s.

Power consumption is down, probably significantly in the case of Durango compared to the previous gen at launch, so why would the size be at least that of the launch PS3?

I personally don't think Sony will go with 8 GB of RAM; 6 GB of GDDR5 is far more likely.

Well, let's hear your ideas on how you'd set up the 6GB of GDDR5 in the PS4: how many chips, and what bus width?
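
For what it's worth, the arithmetic behind that question looks like this: each GDDR5 chip hangs off a 32-bit channel (16 bits each in clamshell mode), so capacity and bus width are coupled. A hypothetical Python sketch, where the 2Gbit/4Gbit densities are my assumption based on parts common at the time:

# GDDR5 capacity vs. bus-width arithmetic (chip densities are assumptions).
def config(chips, gbit_per_chip, clamshell=False):
    capacity_gb = chips * gbit_per_chip / 8       # gigabits -> gigabytes
    bus_bits = chips * (16 if clamshell else 32)  # clamshell halves I/O per chip
    return capacity_gb, bus_bits

print(config(16, 2))        # 16 x 2Gbit -> (4.0, 256): the rumored 4GB/256-bit setup
print(config(12, 4))        # 12 x 4Gbit -> (6.0, 384): 6GB needs a 384-bit bus
print(config(24, 2, True))  # 24 x 2Gbit clamshell -> (6.0, 384): same 6GB, double the chips
print(config(16, 4, True))  # 16 x 4Gbit clamshell -> (8.0, 256): 8GB on the rumored bus

The catch: going beyond 4GB on the rumored 256-bit bus means either denser chips in clamshell mode or a wider bus, which is presumably why a late capacity bump is non-trivial.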
 
8GB is already overkill on the 720 (hence the rumored 3GB OS reservation sounding probable); 16GB isn't even remotely logical.
On Orbis, upping it to 8GB might have a bit more merit, as there is probably the bandwidth to utilize it.
Just look at the bandwidth available per frame to understand why.

16 gigabytes? Are you serious? That's not even remotely logical for a console coming out in 2013, especially with the bandwidth it does have.

And there's no real need for it: considering the GPU, the RAM and bandwidth it has now are perfectly aligned.

Now, Sony could upgrade their specs to 8GB (although that would be pretty expensive), but that at least is more plausible if they want to be the most powerful in all ways. Their GPU is more powerful, and has the ROPs and CUs to utilize it without being starved.

What these specs will be able to do with games in a closed box should make everyone excited for the possibilities coming up.

I'm confused by this bandwidth-per-frame notion; it does not even make sense to me. And at any rate, Durango has more bandwidth per ROP or flop than Orbis, and even if it didn't, I don't see why more memory requires more bandwidth.

There are lots of scenarios I can imagine that are expensive from a memory-footprint perspective but not from a memory-bandwidth perspective: NUI in particular, in addition to the obvious game-world content and system databases. But I'd rather hear from some devs to see what they think. More is generally better, and while it may go to waste in some multiplatform titles, I'm sure people can come up with all sorts of ways to use it. Perhaps pre-computing data or render targets and storing them for later use, saving clock cycles down the line.
 
The 360 GPU indeed crossed the generation boundary from specialized to unified shaders.
At the same time, Cell spearheaded the heterogeneous computing that next-gen consoles and AMD's modern GPUs are entering as we speak.

Yeah, it spearheaded the "gpcpu" movement (graphical purpose central processing unit). What's pushing heterogeneous processors is the desire to bring GPGPU power closer to the CPU, not the other way around. LOL.

I'm joking. I applaud the PS3; in the end it was no slouch and is the best all-around current-gen console out there. But it wasn't a dominant console and its potential was never fully realized. It was just a little too far ahead of its time.
 
I'm confused by this bandwidth-per-frame notion; it does not even make sense to me. And at any rate, Durango has more bandwidth per ROP or flop than Orbis, and even if it didn't, I don't see why more memory requires more bandwidth.

There are lots of scenarios I can imagine that are expensive from a memory-footprint perspective but not from a memory-bandwidth perspective: NUI in particular, in addition to the obvious game-world content and system databases. But I'd rather hear from some devs to see what they think. More is generally better, and while it may go to waste in some multiplatform titles, I'm sure people can come up with all sorts of ways to use it. Perhaps pre-computing data or render targets and storing them for later use, saving clock cycles down the line.

GPUs tend to be bandwidth bound, not instruction-pipe bound.
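
One way to see this is to compare the arithmetic a GPU can issue against the bytes it can fetch, using the rumored Durango figures quoted earlier in the thread:

# FLOPs available per byte of bandwidth (rumored figures, not official).
flops = 1200e9  # ~1.2 TFLOPs
bw    = 170e9   # aggregate bandwidth in bytes/s
print(f"{flops / bw:.1f} flops per byte")  # ~7.1
# A shader doing fewer than ~7 operations per byte fetched leaves the ALUs
# idle waiting on memory -- i.e. the GPU is bandwidth bound, not ALU bound.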
 
There's no second GPU, and there has never been one in any credible rumour besides the 2010 roadmap leak.

They were going for 6-8x the 360; Durango is at least that - enough said.

I'd bet they explored the idea for a while, but the 2nd GPU was always going to be part of an ARM-based SoC for low-power operations and managing the overlays and notifications. I'm sure they gave up on the idea when they decided it would be too costly to integrate with the rest of the design relative to any advantages over AMD's modern power-saving techniques.

I couldn't really comment even if I knew anything about the Sony machine. Things I say that look like they contain real information tend to get appropriated.

Sure, sure, sure. But about bikes: If you had to choose between a 15 Speed Japanese model with racing wheels or a 10 Speed American bike with a little trailer full of DVDs permanently attached, which would you choose?
 
Yeah, it spearheaded the "gpcpu" movement (graphical purpose central processing unit). What's pushing heterogeneous processors is the desire to bring GPGPU power closer to the CPU, not the other way around.

The goal is to allow the CPU and GPU to share workloads seamlessly, so both will move closer to each other.
 
So what exactly does more RAM buy you in terms of gaming performance or IQ?

Larger levels, potentially. Higher-resolution (hence larger) textures. Perhaps this next generation of consoles won't have textures that turn into a blurry mess as soon as you get within 3 meters of them in game. :p Potential for more complex graphical effects. You can hold more items in memory before having to go to disk. Basically, it gives developers a lot more flexibility in what they can do.

What it doesn't help with are things that are highly computationally bound, such as a large number of players in a multiplayer game, or physics calculations (at least ones that are relevant to gaming).

In a high-end PC, games actually use much less than 3.5 GB, but that is because they are running upscaled console games to begin with. When you're actually designing your game around that RAM amount as the lowest common denominator, a lot changes in terms of design, but I shouldn't have to be the one saying that.

That isn't the only reason, or even the main reason. The main reason is that a large percentage of the PC install base is still running 32-bit versions of Windows. That imposes a hard limit on how much addressable memory a program can have available to it at any given time; by default it is limited to a 2 GB virtual address space. There are some things that can be done to raise that slightly, but they come with their own potential problems.
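
A minimal Python sketch of hitting that ceiling (hypothetical; exact limits depend on the OS, and on Windows the /LARGEADDRESSAWARE linker flag can raise a 32-bit process to roughly 3-4 GB):

# Allocate 64MB blocks until the virtual address space runs out. In a
# 32-bit process this raises MemoryError near the ~2 GB default limit,
# no matter how much physical RAM the machine has.
blocks = []
try:
    while True:
        blocks.append(bytearray(64 * 1024 * 1024))
except MemoryError:
    print(f"exhausted after ~{len(blocks) * 64} MB of allocations")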

It'd be nice if Microsoft stopped releasing 32-bit versions of their OS, but it hasn't happened yet. Hopefully the next version of Windows will drop the 32-bit version. By then, 32-bit-only machines capable of running Windows will hopefully be rare enough that businesses no longer require MS to release 32-bit versions. And consumers can always just stay on the version they are using if they don't have a 64-bit capable CPU.

Regards,
SB
 
Legacy is very strong in the MS world; a 32-bit version of Windows allows you to run Windows 3.1 and DOS apps (!). Running 2000/XP drivers is another feature, at least on 7: you can thus have a modern OS on a modern PC running a 12-year-old driver and 20- to 30-year-old apps.

The 32-bit version of Windows 8, though, doesn't target old hardware the way Windows 7 does. A CPU with the NX bit is required, and it's pretty rare for a CPU to support that yet not support 64-bit. Also, Windows Server has gone 64-bit only, so *maybe* Windows 9 will be 64-bit only.
 