Predict: The Next Generation Console Tech

Status
Not open for further replies.
Well, that's when you use discernment.
The jump from 256 MB to 512 MB cost MSFT a billion, but it also helped them, as otherwise the PS3 would have pulled ahead.
Now we're speaking of a company, Epic, which has to sell its engine, and the main feature they found to push people to adopt their next engine is one which, while it is supposed to lower development costs, requires 8-16 GB of RAM.
In my opinion, looking at the requirements, both the GPU and the amount of RAM, it's not ready for prime time / the technology is still not there; give it a few more years.
Though I wonder if it could still be a win for Epic if the dev kits come with loads of RAM.
Their technique may be used by artists to lower creation time, with the results somehow baked once the artist is pleased with them.
 
Somehow it's quite baffling to read how so many here appear to come to the conclusion that a GPU with the same or even better performance/specs compared to a GTX 680 / HD 7970 would be "too powerful" for "next-gen" at the end of 2013, or maybe even later, and that something like an HD 7770 would already be a "monster upgrade" :???:

Just two examples:

According to the following review:

http://www.anandtech.com/show/5541/amd-radeon-hd-7750-radeon-hd-7770-ghz-edition-review

an HD 7770 appears to manage an [irony]astonishing[/irony] 33 fps at 1920x1200 in "Battlefield 3" and an even more [irony]astonishing[/irony] 20 fps at 1920x1200 in "Crysis: Warhead" :rolleyes:

Seriously, do you want a PS3.5 / Xbox 540 or do you want a PS4 / Xbox 720 (or whatever they are going to be called) :rolleyes::eek::mrgreen:?
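For scale, those review numbers can be turned into a naive required-speedup estimate. A minimal sketch, assuming frame rate scales linearly with GPU throughput (real games only approximate this):

```python
# Naive speedup an HD 7770 would need to hit 60 fps at the review settings.
# Assumes frame rate scales linearly with GPU throughput -- a rough upper bound.
measured_fps = {"Battlefield 3": 33, "Crysis: Warhead": 20}  # at 1920x1200
target_fps = 60

for game, fps in measured_fps.items():
    speedup = target_fps / fps
    print(f"{game}: ~{speedup:.1f}x more GPU needed for {target_fps} fps")
```

So even tripling an HD 7770 would only just reach 60 fps in the heavier title, which is the crux of the "monster upgrade" disagreement.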
 
Next generation will be defined by what MS and Sony put in their boxes, not by Epic or anyone else's demos.
Next gen engines will be scaled to run on those boxes, or Epic won't be selling very many copies.

The days when consoles got ultra-high-end PC parts are over, IMO; power constraints dictate that even if costs don't.

Can stacked / embedded memory make up for power constraints on the CPU/GPU at all, or would lower powered processors just get swamped by unusable bandwidth?
 
How strange it is seeing Dave posting here, where we are arguing about whether it will be an HD 7750 or an HD 8850, when surely he has been selling the chips to both Sony and Microsoft! He must be having a good laugh. A dollar for your thoughts!
 
Do you really think so? How many consoles have included an off-the-shelf part?
 
The ideal specs for the Xbox 720 or Xbox 3 (whichever):

...
- 1080p 60 fps for xbox 360 games.
...
Um, and how would this be achieved? Upscaling and frame doubling? Doesn't sound very useful. Since the game controls its render surface and update speed, it would be very difficult, if not impossible, to change its rendering settings without significantly impacting the rest of the game.
To Acert93: I think there might be a significant difference between Xenos and RSX, not in theoretical numbers but in sustained fillrate and fillrate with blending. RSX does indeed have strong(er) points: more raw pixel-shading power, more texturing power, as well as support for some shadow filtering.
In peak numbers, yes, but on average the XGPU often outperforms the RSX in those areas too. As you yourself put it, the numbers don't tell everything. It's like speakers that claim to be 300 W when in reality they're just 14 W RMS and can, theoretically, for an instant, output 300 W without exploding.
 
Somehow it's quite baffling to read how so many here appear to come to the conclusion that a GPU with the same or even better performance/specs compared to GTX 680 / HD 7970 would be "too powerful" for "next-gen" [...]
It's baffling how some expect GTX-class specs within a reasonable power envelope without moving to a new process.
As for the HD 77xx comparison, it doesn't make sense. People blindly consider FLOPS but forget that the card has half the ROPs and half the bandwidth of the higher-end models.
No matter how I look at it, I see nobody here expecting either Sony or MSFT to come with such a ROP and memory setup, not to mention the number of SIMDs.
Actually, I don't think anybody here would put high odds on that few CUs/SIMDs (10) clocked that high (1 GHz). If MSFT or Sony achieves a miracle, all the better, but let's look at the whole picture instead of FLOPS. Barts was close to a "miracle" in what it achieved vs. the HD 58xx.
I think there is room for a high-performing part based on a clever balance of arithmetic power, ROP throughput and bandwidth.

It's because you look only at FLOPS that you miss the fact that you are making an apples-to-oranges comparison. Whatever the rumors are worth, we have Sony with a 256-bit bus (and most likely a matching number of ROPs, i.e. 32), GDDR5 and lots of bandwidth, and MSFT with embedded RAM. In both cases, a bandwidth/ROP setup more akin to high-end cards.
Other than that, people were already calling the 360 an Xbox 1.5, and based on what? Be my guest: paper FLOPS and a load of bias.
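The "whole picture" point can be made concrete with back-of-the-envelope peak figures. A sketch using approximate public reference specs (shader counts, clocks, ROPs and bus widths below are my own assumed numbers, not anything from this thread):

```python
# Peak-spec comparison: FLOPS alone vs. fill rate and memory bandwidth.
# Reference figures from public spec sheets; treat them as approximate.
cards = {
    # name:    (shaders, core clock GHz, ROPs, bus width bits, Gbps per pin)
    "HD 7770": (640,  1.000, 16, 128, 4.5),
    "HD 7970": (2048, 0.925, 32, 384, 5.5),
}

for name, (sp, clk, rops, bus, pin) in cards.items():
    tflops = sp * 2 * clk / 1000   # 2 FLOPs/clock per shader (multiply-add)
    fill = rops * clk              # peak fill rate, Gpixels/s
    bw = bus / 8 * pin             # memory bandwidth, GB/s
    print(f"{name}: {tflops:.2f} TFLOPS, {fill:.1f} Gpix/s, {bw:.0f} GB/s")
```

On these figures the FLOPS gap (~3x) is smaller than the bandwidth gap (~3.7x), which is exactly why FLOPS-only comparisons flatter the HD 7770.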
 
Somehow it's quite baffling to read how so many here appear to come to the conclusion that a GPU with the same or even better performance/specs compared to GTX 680 / HD 7970 would be "too powerful" for "next-gen" at the end of 2013 or maybe even later.

It's also a little baffling that you don't seem to understand why people are saying that. Those two GPUs run something like 2x hotter than Xenos did in the 360 at launch in 2005, and that was already pushing it. The 680 and 7970 are already at 28 nm, so there likely isn't anything magical happening next year with regard to performance per watt; sure, some improvements, but likely not to the extent that would make something like a 7970 possible in a smallish console, without even mentioning the 384-bit memory bus.

A closed box and large resources in game development will help, and I'm sure the games will look satisfactory, but dreaming of top-end PC components will most likely end in tears. At least you're not talking about 2x 7990 like V3. :)
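A rough power-budget comparison illustrates the point. A sketch with approximate figures (board TDPs are vendor-published numbers; the launch-360 value is a ballpark wall-draw measurement I am assuming, not an official spec):

```python
# Ballpark check: high-end 2012 GPU board power vs. an entire launch console.
# All figures approximate; the point is the ratio, not the exact watts.
launch_360_system_w = 175  # rough wall draw of a launch Xbox 360 under load
gpu_board_tdp_w = {"HD 7770": 80, "GTX 680": 195, "HD 7970": 250}

for name, tdp in gpu_board_tdp_w.items():
    ratio = tdp / launch_360_system_w
    print(f"{name}: {tdp} W board TDP = {ratio:.2f}x a whole launch 360")
```

Even before adding a CPU, RAM and an optical drive, the two high-end boards alone exceed what an entire launch 360 drew at the wall.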
 
As for the HD 77xx comparison, it doesn't make sense.

"almighty" came up with that:

http://forum.beyond3d.com/showpost.php?p=1668961&postcount=14709
http://forum.beyond3d.com/showpost.php?p=1668971&postcount=14713

It's baffling how some expect GTX-class specs within a reasonable power envelope without moving to a new process.
It's also a little baffling that you don't seem to understand why people are saying that. Those two GPUs run something like 2x hotter than Xenos did in the 360 at launch in 2005, and that was already pushing it.

At least according to the following video for example (just randomly found via search engine) it appears to be manageable in a Mini-ITX case:

http://www.youtube.com/watch?v=e7L0GeDaiIw

?

:mrgreen:

So why not in a console :mrgreen:?
 
It's too expensive, too big, too hot. It makes no sense for them to put something like that in the box. This gen was the last one where we saw a tech push comparable to, if not better than, what was on the PC market. GPUs have become much bigger and much more power-hungry, and the high price is a result of that.

They can't make consoles that need two years to break even; technology is moving faster and faster, new markets (tablets, phones, etc.) are opening up, and they can't afford such risks. Who knows what will happen in 6-7 years; maybe these consoles truly are the last, and the biggest push will come from better online and cloud integration with the system rather than graphics technology.

Of course, the leap will still be good, they need something to hook people with, but don't expect high-end PC stuff or better; it's never gonna happen. I guess I would be happy with something like Pitcairn in a console, a couple gigs of RAM, and that's it. Next stop, profit.
 
Well, he uses it as a reference in one post, and I don't think the other one is pushing that hard the idea that HD 7770 performance is on another level than what current systems offer.
The HD 7770 is used in the discussion because its theoretical arithmetic throughput is close to some vague rumor statements about a 1+ TFLOPS GPU. Still, that throughput alone isn't enough to determine a GPU's performance.
Look at this review from Anandtech, for example, and how the HD 8750, with not that much of a FLOPS advantage (~30%), absolutely crushes the HD 7700 at high-quality settings (like 1080p and 4x AA).

I'm confident that next gen (both systems) will outperform an HD 7770; I'm even more confident that neither the PS4 nor the next Xbox will use off-the-shelf parts.

At least according to the following video for example (just randomly found via search engine) it appears to be manageable in a Mini-ITX case:

http://www.youtube.com/watch?v=e7L0GeDaiIw
You noticed the size of the box, the cooler, etc., right?
Not to mention the price of those components.
 

Your first example with the nice Silverstone case has something like 3x the inner volume of the launch 360 and an 18 cm fan with huge holes in the case walls. The second example is only about 2x larger than the launch 360, but uses water cooling on the CPU. The total cost of making something like that happen is likely more than the manufacturers are willing to bear.
 