Robbie Bach says this generation will last longer

And how do you expect the console manufacturers to achieve such a market without mutual cooperation?

And how would this benefit consumers more than the market does right now? If you want a cheap console with cheap graphics, a huge library of quality titles and current titles with cheap budgets, you only have to look at the PS2. If you want a more nominally priced console with great graphics, then you have the 360 Arcade. Add $100 more and a $5-a-month fee and you get great graphics with additional services like multiplayer. Add $200 more and you get all those features minus the $5 monthly fee, plus Blu-ray.

Tell me how a market based on less powerful consoles with cheaper dev costs can produce a more diversified market than the one that exists right now.

Today's market might not be the healthiest from a dev or manufacturer point of view, especially if you are Sony or a dev that poured tens of millions into a title that underperformed, but from a consumer standpoint this is literally the best market any generation has ever offered.

You make no sense. When I'm talking about a next-gen console at an affordable price, you name the cheapest 360, which is essentially the exact thing I'm talking about: a $200 to $300 console next generation.

Cheaper development costs for developers = more games, less risk. If they don't have to invest huge budgets into titles, they have more "leg room" to experiment, meaning some titles that are available on XBLA / PSN today may come around as larger full-on releases in the future. The market would have more room to grow to reach more consumers because costs wouldn't be so high.

Consumers buy the 360 and PS3 because they are the best that are available. Obviously the Wii is outpacing BOTH of these consoles worldwide; you'd be foolish to think Sony and Microsoft don't want a chunk of that market, and the only way they'll get it is with an affordable console at launch with new intuitive features that expand interaction and accessibility.
 
You make no sense. When I'm talking about a next-gen console at an affordable price, you name the cheapest 360, which is essentially the exact thing I'm talking about: a $200 to $300 console next generation.

You're saying cheaper consoles, whose cheapness is the effect of using lesser-performing tech, will provide a better market. I asked you how you achieve that without the mutual cooperation of the manufacturers. Natural market forces will always encourage manufacturers to maximize value to the customer by maximizing the graphical prowess of their consoles. Nintendo stepped away from that battle, but it's highly unlikely you will see both Sony and MS step away.

You claim that this market based on cheaper consoles with cheaper development costs will produce a better market for consumers. I asked you how that is better from a consumer's standpoint. It's the high price of the 360 and PS3, and the stagnant price movement of the Wii, that has allowed the PS2 to prosper years into this generation. You literally have a generation where viable consoles have existed at every price point, minus the $179 to $199 range, from the outset. Consoles that are priced lower and fall faster will only serve to suffocate the sales of their earlier brethren faster while only offering a minor upgrade visually.

What we now have is a broad market that offers a plethora of choices to consumers.

Cheaper development costs for developers = more games, less risk. If they don't have to invest huge budgets into titles, they have more "leg room" to experiment, meaning some titles that are available on XBLA / PSN today may come around as larger full-on releases in the future. The market would have more room to grow to reach more consumers because costs wouldn't be so high.

Devs can right now choose to develop cheaper titles with a higher volume of releases. Cheapening the tech might force them into areas they haven't chosen, but those areas are available to them now. It's a given that not everyone can produce high-budget, high-quality titles that sell well. But why do you think it's better to sheepherd the devs who can't reliably produce high-budget, high-quality, well-selling titles by cutting off at the knees those that can, by providing lesser-performing consoles?

There are plenty of devs losing money because consumers have basically told them "you can't get away with spending a lot of money on titles based in common genres with derivative gameplay". The market as we speak is forcing devs to rethink their approach. If they don't, they will die, but the potential profits in the gaming industry will encourage the creation of other developers.

Consumers buy the 360 and PS3 because they are the best that are available. Obviously the Wii is outpacing BOTH of these consoles worldwide; you'd be foolish to think Sony and Microsoft don't want a chunk of that market, and the only way they'll get it is with an affordable console at launch with new intuitive features that expand interaction and accessibility.

The 360 Arcade, which is more visually capable than a Wii, carried only a $50 premium at the Wii's launch, and it's now available for $50 less. The Wii's success is due to its controller and software, not its cheap price. At launch $250 seemed like a cheap price, but as it stands right now Nintendo is on the verge of selling more units at or above $250 than any other console in history. If MS or Sony wants a chunk of that market, they have to offer something other than cheap prices. If you haven't noticed, the price gap has narrowed in favor of the 360 and PS3 while the sales-rate gap has widened in favor of Nintendo.
 
Can they be, for your linear extrapolation on transistors? Keep in mind, yields for the 360 chips were abysmal at launch and for some time after. Their cooling solution left much to be desired, and how many units have failed despite the additional heatpipe in Zephyr?



You're missing the point, or at least not addressing it. As has often been mentioned, the roadmap for transistor scaling becomes muddy beyond 22nm. Future cost reductions are at stake, and the designers cannot bet on good die reductions as they have in the past. Power density, static/leakage power, pads/power supply, analog components... At sub-20nm, designers will be looking at rising importance of quantum effects... i.e. easier said than done.

That's why I looked at designs that work perfectly fine on 65nm. Intel is readying 32nm for 2010, TSMC apparently has 40nm ready, and rumor suggests the refreshes of the 4850/70 are on that process along with ATI's DX11 chips. If a 1.4B transistor chip is available on 65nm, and while it does use lots of energy, you still have a 55nm drop, a 45nm drop and a 40nm drop before the 32nm drop. 1.4B on 40nm shouldn't be a problem.
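
For a rough sense of what that transistor budget means in silicon, here's a hedged back-of-the-envelope sketch. It assumes ideal (node ratio)^2 area scaling, which real shrinks only approximate, and takes roughly 576 mm^2 as the ballpark 65nm GT200 die size:

```python
# Rough sketch: ideal die-area scaling for a ~1.4B-transistor 65nm chip.
# Assumptions: area scales with (new_node/65)^2 (real shrinks fall short of
# this), and the 65nm starting die is roughly 576 mm^2 (ballpark GT200 size).
base_area_mm2 = 576.0
base_node_nm = 65.0

for node_nm in (55, 45, 40, 32):
    scaled = base_area_mm2 * (node_nm / base_node_nm) ** 2
    print(f"{node_nm}nm: ~{scaled:.0f} mm^2 at ideal scaling")
```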

I'm also not suggesting that the chip be 1.4B transistors. I was thinking more of a 900M transistor chip (in line with the 4870, but a DX11 variant based on Xenos) with the other 500M used on eDRAM. If the 10MB of eDRAM in Xenos is 100M transistors, then rough math says you can place 50MB in that 500M transistor space. That should prevent tiling at 1080p with 4x FSAA, but I'm not good with that; someone better with the math can tell us. Regardless, it doesn't have to be 500M transistors; they could go 600M, or even less, or use one of the new forms of RAM.
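
On the tiling question, a hedged back-of-the-envelope using the same 10MB-per-100M-transistor ratio quoted above and a plain 32-bit color + 32-bit Z format per sample (an FP16 color target would roughly double the figure):

```python
# Back-of-the-envelope: eDRAM capacity from a transistor budget, and the size
# of a 1080p 4xAA framebuffer. Assumes the Xenos ratio above (10MB per ~100M
# transistors) and 4 bytes of color plus 4 bytes of Z per sample.
MB_PER_100M_TRANSISTORS = 10

for budget_m in (500, 600):
    capacity = budget_m / 100 * MB_PER_100M_TRANSISTORS
    print(f"{budget_m}M transistors -> ~{capacity:.0f} MB of eDRAM")

pixels = 1920 * 1080                 # 1080p
samples_per_pixel = 4                # 4x MSAA
bytes_per_sample = 4 + 4             # 32-bit color + 32-bit Z
fb_mb = pixels * samples_per_pixel * bytes_per_sample / 1024**2
print(f"1080p 4xAA framebuffer: ~{fb_mb:.0f} MB")   # ~63 MB
```

By that rough math a 50MB pool would still fall a bit short of an untiled 1080p 4xAA target, though compression or a different sample format changes the picture.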

They can make it a daughter die like they did with Xenos.

I also envision that the CPU will be much smaller and less important than the GPU; I would think 500M transistors or so.

32nm and 22nm should provide enough of a drop in power, heat and cost through its lifetime if, as many of us expect, next gen lasts a lot longer than this gen.

Well, you do have the foundry too (on MS's side; not sure about Sony), plus IBM/AMD, and if they are not making a profit right now, soon you will have MS making a profit on each console sold as well. Not so different in the end.

Perhaps. The foundry does make a profit, and I'm sure ATI and IBM get some cash, though I'm sure what IBM and ATI get is much less than what ATI charges the board makers. At some point the 360 will be sold for a profit, but we have no idea when that is; we do know it has been sold at a loss for a long time and may still be. However, on just one process shrink the 360 has already dropped in cost by half.


I am not sure I understand your post, but I do expect an XB3 to cost more than a 360 in 2011/12.

The thing is, if you are to release a console on 32nm, you will want it to cost as little as possible at launch, because it will be hard to reduce cost later (but not impossible).

Anyway, like I showed before, inexpensive HW from today is already quite competitive with a 360.

I'm saying that the GeForce GTX 280 is from today; it's actually almost a year old, I believe, and it's 1.4B transistors on 65nm. I don't see why MS couldn't go with a similar transistor budget on 40nm or even 32nm.

Why would a new generation console limit itself to KZ2 level visuals? While it would probably be easier to reach those with a more powerful machine, there would be pressure to get more out of that machine and you're back to a similar situation to what we have currently.

That's my point. A 2011 system would easily outdo Killzone, and the effort required would be almost non-existent. The amount of time and effort they put into the PS3 to hide the limitations of its texturing capabilities and other limits, to get Killzone to look as good as it does, would be removed from the equation.

1) If you desire to get "easier results" for a game that looks "as good" as KZ2, why even increase the hardware if that's your benchmark? Furthermore, no engine is infinitely scalable, so if you increase the hardware to such a degree, you *have* to redo your framework and create the engine to utilize the hardware, unless you are using some sort of middleware, in which case there is still a lot of work to be done.

Sure, WoW runs on a multitude of cards, but how many of them actually "significantly" improve the visuals of WoW? What do you gain playing a game like WoW on a high end system, aside from some AA and high res/frame rate?

Have you seen what CryEngine 2 is capable of on today's hardware?

These are all user mods, by the way:

http://teamyo.org/xzero/Images/Crysis/Levels/New Realistic Forest/NewForest5.jpg

http://i35.tinypic.com/qodb39.jpg

http://www.youtube.com/watch?v=6djX...d.com/showthread.php?t=48584&highlight=crysis

And let's not forget the Toy Shop demo from ATI, based on the X1800 tech:
http://developer.amd.com/media/gpu_videos/toyshop.html

You don't need an infinitely scaling engine; you just need a good engine that can be modded to take advantage of new features in hardware.
 
That's why I looked at designs that work perfectly fine on 65nm. Intel is readying 32nm for 2010, TSMC apparently has 40nm ready, and rumor suggests the refreshes of the 4850/70 are on that process along with ATI's DX11 chips. If a 1.4B transistor chip is available on 65nm, and while it does use lots of energy, you still have a 55nm drop, a 45nm drop and a 40nm drop before the 32nm drop. 1.4B on 40nm shouldn't be a problem.

I'm also not suggesting that the chip be 1.4B transistors. I was thinking more of a 900M transistor chip (in line with the 4870, but a DX11 variant based on Xenos) with the other 500M used on eDRAM. If the 10MB of eDRAM in Xenos is 100M transistors, then rough math says you can place 50MB in that 500M transistor space. That should prevent tiling at 1080p with 4x FSAA, but I'm not good with that; someone better with the math can tell us. Regardless, it doesn't have to be 500M transistors; they could go 600M, or even less, or use one of the new forms of RAM.

They can make it a daughter die like they did with Xenos.

I also envision that the CPU will be much smaller and less important than the GPU; I would think 500M transistors or so.

32nm and 22nm should provide enough of a drop in power, heat and cost through its lifetime if, as many of us expect, next gen lasts a lot longer than this gen.

Don't overestimate transistor counts; there are many things happening. GDDR5/XDR2 alone will give you lots of BW for 720p 4xAA (both have existed for some time, if I am correct), things like Z-RAM will give 4x the L3 in the same die size, SPUs/P54C (Larrabee)/stream processors if well balanced will give you an edge, accelerators (like the defunct PPU, and the many new Fusion-based ones)...

All of those can give you many times the performance, yet a relatively small increase in price.
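
To put a rough number on the GDDR5 point, here's a hedged sketch; the ~3.6 Gbps per-pin rate and 256-bit bus are assumptions (roughly the HD 4870's launch configuration), and actual parts would vary:

```python
# Peak memory bandwidth for a given per-pin data rate and bus width.
# Assumes ~3.6 Gbps/pin GDDR5 on a 256-bit bus (roughly HD 4870 launch specs).
def peak_bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    return data_rate_gbps_per_pin * bus_width_bits / 8

print(peak_bandwidth_gb_s(3.6, 256))   # ~115 GB/s, several times the 360's ~22 GB/s GDDR3
```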



Perhaps. The foundry does make a profit, and I'm sure ATI and IBM get some cash, though I'm sure what IBM and ATI get is much less than what ATI charges the board makers. At some point the 360 will be sold for a profit, but we have no idea when that is; we do know it has been sold at a loss for a long time and may still be. However, on just one process shrink the 360 has already dropped in cost by half.

Even if they don't make the money on the HW, they make it on SW; it is only an illusion that a console is cheaper (in fact the games/online are quite a bit more expensive).



I'm saying that the GeForce GTX 280 is from today; it's actually almost a year old, I believe, and it's 1.4B transistors on 65nm. I don't see why MS couldn't go with a similar transistor budget on 40nm or even 32nm.
 
Sure, WoW runs on a multitude of cards, but how many of them actually "significantly" improve the visuals of WoW? What do you gain playing a game like WoW on a high end system, aside from some AA and high res/frame rate?

WoW is a very old game; I have no idea where you are trying to go with this rant.

Yes, there are significant improvements between playing WoW on a high-end system and on a low-end system; the differences are rather big. However, WoW still looks like **** because it's old.

Just like pretty much any game out there on the PC, it will look much, much better on a high-end system than on a medium or low-end one. Crysis, for example, looks better than any other game if you own an 8800GTX or better, and looks/runs like crap if you've got something worse.
 
Just a few things;
I think we've hit a point, visually, where the increase simply isn't going to mean as much to the consumer, especially not when the HUGE majority of next generation titles are stylized and not 'photo-realistic'.
I don't think we've quite hit the saturation point yet with graphics. According to Iwata we already had, but I think it's not only too early to call, we're also not certain when that turning point will arrive. Even at this point, there are few titles out or prospective that we can actually call near-flawless in the visual sense or with little desire for improvement. This debate would probably make more sense in 2012, where we're possibly seeing a level of flexibility and clarity in graphics which matches or surpasses pre-rendered CGI of a generation or two ago.
The better question is, if we only see a 2x power increase, how much money would developers save using and tweaking their code for longer than a few years? It could have a significant (positive) impact on the cost of "big budget" games. It could also relieve pressure from the developers' shoulders to get big-budget games out of the door "as soon as possible", which in turn would improve polish, since the budgets would be smaller, engines would be more familiar, and more time could be spent on polish rather than engine assembly.
Don't underestimate the rising trend of outsourcing and licensing of graphics technology. What's to stop any 1st or 3rd party from utilizing CryEngine 4 or id Tech 6 next gen should they choose to spend more time on non-graphical development? Undershooting on hardware isn't the answer to more focus on fundamentally great experiences when new h/w technology can really help drive that. Furthermore, developers and publishers have a responsibility for great quality in software, which goes for anything from DS to PS3; how they go about it is their own choice.
I mean, does absolutely no one see where I am coming from? I understand you all want performance gains, this is a 3D gaming site after all, geared toward technical analysis and 3D modeling, but if I were to be completely honest, I think the constant increase in power is exactly what put us in the PS3 boat in the first place, and gamers pretty much rejected that idea.
I do see where you're coming from, but I disagree about the next couple of statements.

The reason why the PS3 is in the state it's in is that all Kutaragi and co. could see was a multimedia supercomputer, with gaming as just a mere subset of the more general-purpose media computer that PlayStation had become. With that, and an expectation that the previous PS2 installed base would transfer to the PS3 just like that, we're increasingly seeing those same consumers move off to the 360 or Wii.

It wasn't technical advancement in graphics and processing that was the issue; look at the 360. It was everything else, from several non-game-related applications to flawed marketing, that was largely rejected.
 
Don't overestimate transistor counts; there are many things happening. GDDR5/XDR2 alone will give you lots of BW for 720p 4xAA (both have existed for some time, if I am correct), things like Z-RAM will give 4x the L3 in the same die size, SPUs/P54C (Larrabee)/stream processors if well balanced will give you an edge, accelerators (like the defunct PPU, and the many new Fusion-based ones)...

All of those can give you many times the performance, yet a relatively small increase in price.

Oh I agree. I've been interested in other forms of eDRAM or cache for GPUs for a while; I've just been having a hard time finding information. Z-RAM could be a huge deal if it's 4x the amount in the same space as eDRAM. With the same 100M transistors as the eDRAM on Xenos they could put in 40MB.

I still think MS will go with some form of embedded RAM on the GPU again, be it eDRAM, 1T-SRAM, Z-RAM or what have you. The amount of bandwidth offered for the price is a huge deal. They would be able to get away with more, slower GDDR5 by going with Z-RAM or something similar. I'm also thinking they will go with 720p as their target again: 720p with 8x FSAA and 2x supersampling, or whatever ATI's hybrid version of that is. I don't know how much space in the embedded RAM you'd need for it, but I think MS would go with that amount or maybe slightly more, since tiling doesn't seem to be a big win for them this gen.
 
720p with 8x FSAA and 2x supersampling, or whatever ATI's hybrid version of that is. I don't know how much space in the embedded RAM you'd need for it.

16x the framebuffer size. If you meant 2x super sampling in both dimensions, then it is 32x the framebuffer size with 8xMSAA and 4xSSAA. You would be asking for nearly 340MB in the latter with an FP16 format and 32-bit Z @720p. ;)
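
For anyone following the math, a quick sketch of where that figure comes from, under the same assumptions (FP16 color, 32-bit Z, no framebuffer compression):

```python
# ~340MB figure: 720p, 2x supersampling in each dimension (4x the pixels),
# 8x MSAA on top, and 8 bytes of FP16 color + 4 bytes of Z per sample.
pixels = 1280 * 720
ss_pixels = pixels * 2 * 2            # 2x SSAA in both dimensions
samples = ss_pixels * 8               # 8x MSAA
total_bytes = samples * (8 + 4)       # FP16 color + 32-bit Z
print(total_bytes / 1024**2)          # ~337.5 MB
```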
 
Oh I agree. I've been interested in other forms of eDRAM or cache for GPUs for a while; I've just been having a hard time finding information. Z-RAM could be a huge deal if it's 4x the amount in the same space as eDRAM. With the same 100M transistors as the eDRAM on Xenos they could put in 40MB.

Xenos eDRAM already is roughly 1 transistor per bit; Z-RAM may be a replacement for L3 cache. Anyway, it is an interesting thought, having e.g. a Phenom II with more cache than a Phenom at the same price/die size. Even more so if the CPU and GPU are on the same die.
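
A quick sanity check on that transistor-per-bit point, assuming a 1T1C eDRAM cell and ignoring the ROP/interface logic on the daughter die:

```python
# 10MB of eDRAM at one transistor per bit: cell transistors alone.
cell_transistors = 10 * 1024 * 1024 * 8   # bits in 10 MB
print(cell_transistors / 1e6)             # ~83.9M, in line with the ~100M figure once logic is added
```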
 
I think this generation will last longer for many reasons, but I think the biggest reason is the economy. I think the USA is headed towards a very long, severe recession.
 
Xenos eDRAM already is roughly 1 transistor per bit; Z-RAM may be a replacement for L3 cache. Anyway, it is an interesting thought, having e.g. a Phenom II with more cache than a Phenom at the same price/die size. Even more so if the CPU and GPU are on the same die.

Is it only AMD with access to Z-RAM?
 
Is it only AMD with access to Z-RAM?

Anyone who uses SOI can use it, if they license it. Although it is based on an effect of SOI itself, so I guess that others may have some R&D on it too.

Anyway, the only one that I know of is AMD, and for some time now, but see more on the wiki.
 