The Next-gen Situation discussion *spawn

Providing a piece of compelling mass-market software that lights sales charts on fire doesn't necessarily have anything to do with the power of the hardware. You and Rangers should know this, as there are plenty of examples, both in and out of the home console space.
 
It will cost too much money to be financially viable. Or lose too much money.

Where do you get this idea that a console can be as good as a high-end PC? That's outmoded thinking. The PC has accelerated away in terms of power and the consoles cannot compete. It's a ridiculous notion to look to a console to provide that level of performance within what we consider a normal box. But agree or not, there's no point repeating this same old discussion again. You want better GPUs and we've all heard that now. It might happen or it might not. People should stick to talking about thermals, costs and die sizes in making these predictions, and to determining whether XB3 or PS4 will have something akin to a GTX 680 based on those factors, irrespective of what anyone here wants to see in a console.

True of a PC with a $500 video card and other top components.

But next gen consoles will be $300-500. How are the PCs in that price range?
 
It's unlikely that running PC games will get you very close to system TDP. Running something like FurMark on a 560 Ti will draw vastly more power than even peak load in a game, and it's hard to see any game really pushing an Intel CPU. On my overclocked Sandy Bridge, Intel Burn Test can literally double power consumption compared to running a multicore game.

For 360S testing they actually ran a power virus on the CPU and GPU simultaneously and showed the results in the Hot Chips presentation - actual games were causing the chip (the CGPU, presumably) to draw 80+% of what the power virus did. I don't think PCs get nearly that close to the thermal/power ceiling when playing games, even with all the settings bumped up.
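To put that 80+% figure in perspective, here's a trivial headroom calculation (the 100 W worst-case draw is purely illustrative, not a figure from the presentation):

```python
# How much thermal/power headroom is left when games draw 80+% of
# the power-virus (worst-case) load? Figures are illustrative only.
power_virus_watts = 100.0   # assumed worst-case draw for the chip
game_fraction = 0.80        # games reportedly hit 80+% of worst case

game_watts = power_virus_watts * game_fraction
headroom_pct = (1 - game_fraction) * 100
print(f"game load ~ {game_watts:.0f} W, headroom < {headroom_pct:.0f}%")
```

With games sitting that close to the ceiling, the console's cooling has to be specced for near-worst-case draw, unlike a PC where game load typically sits well below synthetic-stress-test load.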
 
I wonder how I ended up here; now there are two threads :)

There's no point in Epic or others asking for a GTX 680 in a console. If there's any hope of the next gen having the power of a GTX 680 (and I would post this worldwide if I were able to), it's for them not to bother with what's currently out.

It's possible to have similar results, but if you're looking for a down-sized model of a high-end chip, it's not going to happen. Things like heat and wattage should be taken into account from day one. Only a new chip design would produce results like that.

Maybe the Core 8 series are the ones... *sigh*. AMD's and Nvidia's new tech can't come soon enough, can it? I yearn for a 16x leap, like the old PS2, Xbox and GC days.
 
360 and PS3 would have shown a much, much bigger leap in graphics over Xbox and PS2 if they hadn't had to render in the 'HD era'; that is where a lot of their power was wasted and why the jump at times seemed so small. The extra power the next generation of consoles have will more than likely be consumed by natively rendering at 1920x1080.
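The pixel arithmetic behind that point is easy to sketch (resolutions only, ignoring everything else that scales with them):

```python
# Pixel counts at each generation's typical native resolution.
sd = 640 * 480          # PS2/Xbox-era output
hd_720 = 1280 * 720     # common 360/PS3 render target
hd_1080 = 1920 * 1080   # full HD

print(hd_720 / sd)        # 3.0  -> 720p pushes 3x the pixels of SD
print(hd_1080 / hd_720)   # 2.25 -> 1080p costs another 2.25x over 720p
```

So a sizeable slice of any generational power increase gets eaten just by the jump in native resolution, before anything on screen actually looks better per pixel.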
I'm curious since I don't participate in the technical threads around here, but how many times has this been brought up?

Me personally? I don't care if all console games are stuck at 1280x720. Take the focus off of resolution and put the graphical power back into the actual games (frame-rate, physics, graphics, etc.) themselves. I think going beyond 720p should be done in the form of upscaling, worrying about several different resolutions will just lead us back to current console problems like screen-tearing, uneven frame-rates, and texture pop-in as trade-offs.

Consoles are about ease of use; the only thing I should be worried about on consoles is changing the control configuration. Integrating a few options here and there is fine, but tweaking at PC level would be too much. The thing that makes me worried about current and next-gen consoles is the focus on online services/games with DRM and proprietary accessories - now that scares the hell out of me.

Oh and then there's backwards compatibility. I know there's talks of incompatible architecture and BC through the cloud, but dammit I want it done on my next-gen console locally!:devilish:
 
I'm curious since I don't participate in the technical threads around here, but how many times has this been brought up?

Me personally? I don't care if all console games are stuck at 1280x720. Take the focus off of resolution and put the graphical power back into the actual games (frame-rate, physics, graphics, etc.) themselves. I think going beyond 720p should be done in the form of upscaling, worrying about several different resolutions will just lead us back to current console problems like screen-tearing, uneven frame-rates, and texture pop-in as trade-offs.

Consoles are about ease of use; the only thing I should be worried about on consoles is changing the control configuration. Integrating a few options here and there is fine, but tweaking at PC level would be too much. The thing that makes me worried about current and next-gen consoles is the focus on online services/games with DRM and proprietary accessories - now that scares the hell out of me.

So you're telling me that setting your console's output resolution and leaving it at that is too hard for you to manage?
 
Sure, but if you're on a video game forum you're already a part of the 1%.
I think you could have a significant power deficit and I don't know if the majority of people would notice.
There is a point below which the disparity is obvious and people vote with their wallets, but I think given where we are with polygon counts and shaders, that disparity may have to be pretty large.
I actually don't think there will be a big disparity; I suspect both sides have similar power constraints, and that will equate to similar overall performance. Though it's still possible we may see very different trade-offs.

Gamers who discuss on video game forums or who read gaming sites for reviews etc are only 1% of total PS3/360 owners?

This gen, all the people who I know who have PS3s, if you asked them why they got a PS3 rather than a 360, the first reason they'd give was because it was more powerful.
And this is either something they'd read on IGN, or some forum or they'd been evangelised by one of their more interested gamer friends.
 
Gamers who discuss on video game forums or who read gaming sites for reviews etc are only 1% of total PS3/360 owners?

This gen, all the people who I know who have PS3s, if you asked them why they got a PS3 rather than a 360, the first reason they'd give was because it was more powerful.
And this is either something they'd read on IGN, or some forum or they'd been evangelised by one of their more interested gamer friends.

Yes, you're completely correct. It couldn't be the "It only does everything" commercials or any of the other mass market media blitzes that Sony did for a long time. It couldn't be they had the previous PS2/PSX console either. It certainly couldn't be brand loyalty. It certainly couldn't be blind fanboy-ism either. It must be online forums. Indeed.
 
Yes, you're completely correct. It couldn't be the "It only does everything" commercials or any of the other mass market media blitzes that Sony did for a long time. It couldn't be they had the previous PS2/PSX console either. It certainly couldn't be brand loyalty. It certainly couldn't be blind fanboy-ism either. It must be online forums. Indeed.

That too - I said earlier that a lot of people drank Sony's Kool-Aid this gen.
And the online fora are the home ground of blind fanboyism.

My point is that Sony's marketing, combined with word of mouth as well as what people read/discussed online, led to the perception of the PS3 being more powerful than the 360, despite this not being true (the two systems are probably the closest in terms of capabilities of any two consoles we've seen).
 
Just hoping that "next-gen" consoles will be able to offer much better visuals than just "Frostbite 2" PC Ultra visuals.

Personally, I'm pretty happy that the 720 seems to have a GPU that will let it run BF3 at 1080p 60fps on Ultra settings.

So I'm not complaining, but I don't think we'll see consoles as powerful, relative to the PCs of their day, as the PS3/360 were when they came out - the market has changed significantly, as numerous people have said.

Not only is casual gaming on smartphones, tablets, PCs etc. competing with consoles, but the global economy is weaker and consumers are less likely to fork out large amounts of money for a console (instead of getting that iPad).
Increased concern for the environment also means it's unlikely that we'll see monster consoles (plus MS doesn't want a repeat of the RROD debacle, and I suspect customers don't particularly want the behemoths that were the original 360 and PS3 sitting in their living rooms).

Plus, MS and Sony are probably looking to pack in Kinect and Move (or similar) which reduce the available budget for the processing hardware.
 
Also:

It's already sad enough that "RSX", for example, apparently wasn't based on the G80 but still has to last for more than six years...

So please do not let such a "catastrophe" happen again...

:mrgreen:;)
Are you prepared to wait until Winter of 2014 or Spring of 2015 for the console to launch?

Did you realize that the complete opposite was suggested with that?

To come back to the example mentioned above:

According to the following Wikipedia page:

http://en.wikipedia.org/wiki/GeForce_8_Series

the G80 was released on November 8, 2006?

And according to the following Wikipedia page:

http://en.wikipedia.org/wiki/PlayStation_3

the date the PS3 became available at retail was November 11, 2006?

So both apparently came out in the same week, but still the RSX (unfortunately) was only based on an old and much less exciting G70/G71?

Why? If they really wanted to, they probably could have based it on the G80, couldn't they?

So it would be nice if next time the hardware would be based on the very latest and greatest technology available at launch, namely GTX 780 / GTX 880 or HD 8970 / HD 9970 (or whatever they are going to be called), instead of the (especially then) quite old and probably much less powerful HD 6870/6950 (as rumored in the other thread), wouldn't it?

Why should gaming consoles lag behind the very latest and greatest technology when they launch?
 
So both apparently came out in the same week, but still the RSX (unfortunately) was only based on an old and much less exciting G70/G71?

Why? If they really wanted to, they probably could have based it on the G80, couldn't they?

Are you serious? The answer is quite easy to come to...

The G80 consumed nearly twice the power under full load and kicked out a lot more heat than the G70; Sony could not have put that into the PS3 without having to delay the machine and redesign the whole console.

It was also a lot more expensive. People already thought the PS3 was really high priced at launch, so can you imagine the cost with a G80 in there?

You seem to be having problems yet again understanding heat output and power draw... RSX/G70 consumed around 80-85 W under full load. Now look at modern PC GPUs: Sony has around 200 W to play with, maybe 250 W at the absolute maximum.

[Chart: maximum power draw of current PC GPUs]


Now do you realise why a GTX 680/670 & HD 7950/7970 won't happen?
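Working through that budget makes the constraint concrete. A minimal sketch, assuming the figures above (~200 W total, Cell taking roughly 100 W under load) and rough public full-load numbers for the two GPUs being compared - all values approximate and illustrative:

```python
# Console power budget sketch (figures approximate, per the discussion).
total_budget_watts = 200
cpu_watts = 100                  # Cell under full load
gpu_budget = total_budget_watts - cpu_watts

# Rough full-load draw of the two candidate GPUs discussed.
candidates = {"G71 (RSX-class)": 85, "G80 (8800 GTX)": 155}
for name, watts in candidates.items():
    verdict = "fits" if watts <= gpu_budget else "over budget"
    print(f"{name}: {watts} W -> {verdict}")
```

Swap in TDP figures for current cards to see which of them a ~200 W console could plausibly accommodate alongside its CPU.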
 
The G80 consumed nearly twice the power under full load and kicked out a lot more heat than the G70; Sony could not have put that into the PS3 without having to delay the machine and redesign the whole console.

They probably could have developed/designed the G80 and a G80 based PS3 alongside each other, couldn't they?

Or do you want to suggest that the first time SCEI heard about the G80 was on November 8, 2006 :rolleyes:?

It was also a lot more expensive. People already thought the PS3 was really high priced at launch, so can you imagine the cost with a G80 in there?

What would you prefer:

Buying a product that would feature the latest and greatest technology on launch?

Or buying a product that would feature already obsolete and much less powerful technology on launch?

Also, who knows how much more expensive it really would have become if they had wanted RSX to be based on the G80? Maybe it wouldn't have been THAT much. Again, who knows.

Also, do you only see drawbacks/disadvantages from that?

Games would probably look much better today, if that actually would have happened, wouldn't they?
 
They probably could have developed/designed the G80 and a G80 based PS3 alongside each other, couldn't they?

Or do you want to suggest that the first time SCEI heard about the G80 was on November 8, 2006 :rolleyes:?

You are still not getting it... are you being a tool on purpose, or are you genuinely that slow?

Sony only had 200 W of power to play with, and Cell consumed the best part of 100 W on its own. Why is power consumption so hard for you to grasp? Sony had a 200 W budget, so they stuffed in there the best hardware that they could. Would you rather have had the PS3 that we have today, or a G80 GPU and a single-core Intel Celeron?

So how would making the PS3 alongside the G80 have any effect whatsoever on the G80's power consumption? I'll tell you: IT WOULD HAVE MADE NO DIFFERENCE AT ALL.

You also have no idea about the cards in question; even at the PS3's launch the G70 was still only used in the very highest end of gaming PCs, i.e. a very small percentage of the global PC install base.

So while the G80 released a few weeks later, the PS3 still had a GPU that was more powerful than practically every PC except for the ultra high-end ones.

And if you even bother reading about the birth and concept of the G80, you'll know that Nvidia was very tight-lipped about it.

Sony again only has a power budget of 200 W for PS4, so why don't you look at my chart and piece together a system that remains within that budget?

And it wouldn't bother me what Sony put in a console as I don't own a PS3 or a 360.

The days of new consoles shipping with the latest and greatest hardware are over, so deal with it.
 