[Chart: JPR Q3 2011 shipments]

I think it's fair to say that this is no longer the case, and that future APUs (from both Intel and AMD) will make it even less true. At some point, developers will start considering APUs as the primary target for (at least mainstream) games and when that happens, they'll have to choose where to spend the largest amount of effort. Intel will be the biggest player, but with relatively slow graphics, while AMD will come second but with faster integrated GPUs, and much faster discrete GPUs based on the same (or a very similar) architecture. How much time will be left to optimize for NVIDIA?

I don't know if this will become reality in 2012, 2013 or even later, but eventually it seems unavoidable.
Where's the memory technology to make that happen? 128-bit DDR4 isn't really enough and you can't realistically scale beyond that for the mainstream market in the next 5 years. I suppose you could do some fancy things with silicon interposers (or TSV longer-term) inside the CPU package (orthogonal to the socket) but the short-term interest there seems to be power consumption rather than performance. I agree it will happen someday but I am nowhere near as optimistic about the timeframe as you are - we'll see...
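
Just to put rough numbers on that, here's a back-of-the-envelope sketch. The DDR4-3200 figure and the 256-bit 4.2 Gbps GDDR5 card (roughly HD 6870 class) are illustrative assumptions, not predictions:

Code:
# Peak bandwidth in GB/s = (bus width in bytes) * (transfer rate in GT/s).
# DDR4-3200 on a 128-bit bus and 4.2 Gbps GDDR5 on a 256-bit bus are
# assumptions picked only to illustrate the size of the gap.

def bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000

apu_ddr4 = bandwidth_gb_s(128, 3200)        # ~51.2 GB/s, shared with the CPU
discrete_gddr5 = bandwidth_gb_s(256, 4200)  # ~134.4 GB/s, GPU-only

print(f"128-bit DDR4-3200:      {apu_ddr4:.1f} GB/s")
print(f"256-bit 4.2 Gbps GDDR5: {discrete_gddr5:.1f} GB/s")

And the APU's memory has to feed the CPU cores as well, which only makes the comparison worse.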
 
??? The typical desktop PC is connected to a widescreen monitor, not a TV. People typically sit relatively close to their screens and want to be able to play at their monitor's native resolution.

So is 720p adequate in most cases? Sure, as long as you are talking about playing on a TV while sitting several feet away from the screen. I'm not sure how much of the desktop PC market that covers, though...

I don't really see it as orthogonal. Your post was about devs focusing on where the market share is (unless I misinterpreted it) and mobile is where the market share is. http://www.reuters.com/article/2011/11/02/us-rovio-idUSTRE7A137Q20111102

The current GPU arch in Tegra has little in common with its desktop counterparts, but I imagine that will change over time. Nvidia could probably fit 1 Fermi SM minus DP on a 28nm SoC if they really wanted to...

Sure, TV isn't the dominant gaming environment, but neither is 1080p on the desktop. 1600x900 or 1680x1050 will still be the dominant resolutions on the desktop, especially for anyone using an APU for gaming.

Heck, I'd argue that most people using an APU for gaming will be at either 720p (APUs make great HTPCs), 1440x900, 1600x900, or 1680x1050, all of which can probably provide a satisfying gaming experience in most games, although the latter two will obviously require lower settings in enthusiast-type games.

If someone is sporting a 1080p or higher resolution monitor, it's unlikely they'll be using an entry-level GPU anyway, much less an integrated one. And as mentioned, in the case of an HTPC hooked up to a 1080p TV, going to 720p for games results in no image quality loss unless you are uncomfortably close or have a huge screen.
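
For what it's worth, here's a quick sketch of both points: how much rendering load the lower resolutions actually represent relative to 1080p, and roughly how far away you need to sit before 720p on a TV stops being visibly pixel-limited. The 42" screen and the standard ~1 arcminute acuity figure are illustrative assumptions, and this ignores scaler quality and HUD text:

Code:
import math

# Pixel counts: rendering load scales roughly with the number of pixels.
def mp(w, h):
    return w * h / 1e6

for w, h in [(1280, 720), (1680, 1050), (1920, 1080)]:
    print(f"{w}x{h}: {mp(w, h):.2f} MP ({mp(w, h) / mp(1920, 1080):.0%} of 1080p)")

# Viewing distance: once a 720p pixel subtends less than ~1 arcminute
# (a common visual-acuity figure), extra resolution is hard to see.
diag_in = 42.0                                # assumed TV size
width_in = diag_in * 16 / math.hypot(16, 9)   # 16:9 panel width
pixel_pitch_in = width_in / 1280              # 720p pixel pitch
min_dist_ft = pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12
print(f"720p pixels blend together beyond ~{min_dist_ft:.1f} ft on a 42 inch TV")

That works out to roughly 8 feet on a 42" set, which lines up with the "sitting several feet away from the screen" case.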

Regards,
SB
 
I don't really see it as orthogonal. Your post was about devs focusing on where the market share is (unless I misinterpreted it) and mobile is where the market share is. http://www.reuters.com/article/2011/11/02/us-rovio-idUSTRE7A137Q20111102

But in most cases, those are different games. If you're developing the latest WoW expansion, Starcraft 3, Civilization 6 or Crysis 3, what does it matter that lots of people play Angry Birds on Tegra? If they're not going to play the game you are developing, it's none of your concern.

The current GPU arch in Tegra has little in common with its desktop counterparts, but I imagine that will change over time. Nvidia could probably fit 1 Fermi SM minus DP on a 28nm SoC if they really wanted to...

Exactly. They could if they wanted to, but they're not doing it, which means they don't want to. This is either because of area-efficiency or power-efficiency reasons (probably the latter) but in any case, I wouldn't expect any drastic change in this strategy over the coming years.

Plus, this is getting a bit off-topic but although NVIDIA's status as reference design earned them a lot of design wins in the tablet market for the last generation, I'd be pretty surprised if they managed to do it again without being the reference design for Ice Cream Sandwich. I guess we'll see.

Where's the memory technology to make that happen? 128-bit DDR4 isn't really enough and you can't realistically scale beyond that for the mainstream market in the next 5 years. I suppose you could do some fancy things with silicon interposers (or TSV longer-term) inside the CPU package (orthogonal to the socket) but the short-term interest there seems to be power consumption rather than performance. I agree it will happen someday but I am nowhere near as optimistic about the timeframe as you are - we'll see...

That's a good question, but I'd argue that Llano is already pretty decent, and AMD claims a 30% improvement in gaming performance with Trinity. That's pretty good scaling in my opinion. I'm not saying integrated graphics will somehow become as fast as discrete, just that over time, it will become fast enough that fewer and fewer OEMs will bother with discrete graphics. And game developers will naturally adapt to the market.
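
Just to illustrate what that kind of per-generation scaling would add up to if it were sustained (a big "if", and the one-generation-per-year cadence below is my assumption, not AMD's roadmap):

Code:
# Compounding a hypothetical 30% per-generation improvement in integrated
# GPU performance. Purely illustrative: discrete GPUs keep improving too,
# so this only speaks to absolute capability, not to closing the gap.
perf = 1.0
for gen in range(1, 6):
    perf *= 1.30
    print(f"after {gen} generation(s): {perf:.2f}x Llano-class performance")

The point isn't that integrated graphics catch discrete cards, just that "fast enough for mainstream settings" gets closer every year.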

Plus, and it may just be that I haven't been paying enough attention, but I believe most interposer and TSV talk has come from Intel and ARM companies, not much from AMD. It stands to reason that the former would be mostly interested in power savings while the latter might care more about bandwidth, and implement such solutions with that concern in mind.

Sure, TV isn't the dominant gaming environment, but neither is 1080p on the desktop. 1600x900 or 1680x1050 will still be the dominant resolutions on the desktop, especially for anyone using an APU for gaming.

Heck, I'd argue that most people using an APU for gaming will be at either 720p (APUs make great HTPCs), 1440x900, 1600x900, or 1680x1050, all of which can probably provide a satisfying gaming experience in most games, although the latter two will obviously require lower settings in enthusiast-type games.

If someone is sporting a 1080p or higher resolution monitor, it's unlikely they'll be using an entry-level GPU anyway, much less an integrated one. And as mentioned, in the case of an HTPC hooked up to a 1080p TV, going to 720p for games results in no image quality loss unless you are uncomfortably close or have a huge screen.

Regards,
SB

Not to mention that PC gaming is just as much of a laptop phenomenon as a desktop one, perhaps even more so. And if not now, probably in a couple of years. The shift from desktops to laptops in the consumer space is as clear as can be.
 
That's a good question, but I'd argue that Llano is already pretty decent, and AMD claims a 30% improvement in gaming performance with Trinity. That's pretty good scaling in my opinion. I'm not saying integrated graphics will somehow become as fast as discrete, just that over time, it will become fast enough that fewer and fewer OEMs will bother with discrete graphics. And game developers will naturally adapt to the market.
Your argument might have been true if developers hailed PCs as the primary gaming platform, while in fact this is no longer true: consoles are the primary platform for which developers code and optimize.

Next-generation consoles will push the envelope even further, requiring more graphics horsepower from discrete GPUs while putting huge pressure on integrated GPUs, requiring them to ramp up their technical specs to catch up or else fail. And since integrated GPUs are held back by so many technical limitations, their evolution will still not be fast enough for that to happen.

They could catch up eventually if circumstances changed, but that would happen over a long period of time.
 
Your argument might have been true if developers hailed PCs as the primary gaming platform, while in fact this is no longer true: consoles are the primary platform for which developers code and optimize.

Next-generation consoles will push the envelope even further, requiring more graphics horsepower from discrete GPUs while putting huge pressure on integrated GPUs, requiring them to ramp up their technical specs to catch up or else fail. And since integrated GPUs are held back by so many technical limitations, their evolution will still not be fast enough for that to happen.

They could catch up eventually if circumstances changed, but that would happen over a long period of time.

I am ready to go out on a limb and say that the next-gen consoles will not be much faster than the APUs of their time.
 
Silent_Buddha said:
Sure, TV isn't the dominant gaming environment, but neither is 1080p on the desktop. 1600x900 or 1680x1050 will still be the dominant resolutions on the desktop, especially for anyone using an APU for gaming.

Heck, I'd argue that most people using an APU for gaming will be at either 720p (APUs make great HTPCs), 1440x900, 1600x900, or 1680x1050, all of which can probably provide a satisfying gaming experience in most games, although the latter two will obviously require lower settings in enthusiast-type games.

Actual numbers from Steam:

1280x720: 00.74%
1440x900: 10.49%
1680x1050: 17.70%
1920x1080: 14.92%

1920x1080 is growing 6x faster than the rest of those resolutions.

If all Nvidia has to be worried about is people intending to hook up their PC to their TV, then they can sleep easy for a very long time. I wonder what the actual % market share HTPCs account for in the overall PC market.... real numbers would be nice. And I know I really can't imagine playing a PC game on a TV unless it is extremely casual... sure as hell not going to play Starcraft or a competitive FPS without my mouse, keyboard, and a large solid desk.
 
I am ready to go out on a limb and say that the next-gen consoles will not be much faster than the APUs of their time.
How so? They will have more cores, bigger and faster memory, and GPUs that are more in line with contemporary high-end GPUs!

A lowly Xbox 360 (PS3 too) GPU is capable of running an optimized version of Crysis, retaining most of the visual fidelity of the original. Yes, it had lower texture resolution (due to memory constraints) and less shadow precision, but the end result was nevertheless fantastic. What was previously an impossible and unthinkable task is now a reality; imagine what developers could do with more technical horsepower!
 
What was previously an impossible and unthinkable task is now a reality; imagine what developers could do with more technical horsepower!

If only the original Crysis had been designed for a platform that was more powerful than the 360...
 
How so? They will have more cores, bigger and faster memory, and GPUs that are more in line with contemporary high-end GPUs!
Their GPUs will not be anywhere near Maxwell, though they might have a slight edge in CPU performance.
 
Actual numbers from Steam:

1280x720: 00.74%
1440x900: 10.49%
1680x1050: 17.70%
1920x1080: 14.92%

1920x1080 is growing 6x faster than the rest of those resolutions.

If all Nvidia has to be worried about is people intending to hook up their PC to their TV, then they can sleep easy for a very long time. I wonder what the actual % market share HTPCs account for in the overall PC market.... real numbers would be nice. And I know I really can't imagine playing a PC game on a TV unless it is extremely casual... sure as hell not going to play Starcraft or a competitive FPS without my mouse, keyboard, and a large solid desk.

And you just proved my point. 14.92% is a minority of PC gaming. It doesn't matter if it is growing faster; it's still going to be the minority for at least the next half decade, while the majority will be at 1680x1050 or below. In other words, for any game currently in development, if it isn't targeting the enthusiast, they still have to consider those on lower-resolution displays with less-than-enthusiast graphics cards.

How so? They will have more cores, bigger and faster memory, and GPUs that are more in line with contemporary high-end GPUs!

A lowly Xbox 360 (PS3 too) GPU is capable of running an optimized version of Crysis, retaining most of the visual fidelity of the original. Yes, it had lower texture resolution (due to memory constraints) and less shadow precision, but the end result was nevertheless fantastic. What was previously an impossible and unthinkable task is now a reality; imagine what developers could do with more technical horsepower!

Price point. All the manufacturers would love to be able to hit a 399 USD price point or lower for launch. Anything higher and you risk slow adoption (PS3).

Are you really expecting massive multicore CPUs and massive GPUs at a 399 USD price point? Especially considering that neither Sony (still losing money as a company) nor MS (which likes its margins) wants to go with a loss-leading platform again? That likely means the target BOM is around 299 USD to 349 USD.

At most, and I'm being extremely generous here, I'd be looking at 6770/6870 levels of GPU performance, or whatever the Nvidia equivalent is. But there's a good chance it'll be a 6770 or lower level of performance. Likewise, look at the 100 USD or less CPUs on the market to see roughly what level of CPU will be used.

And yes, imagine what developers could do with a static system where no components will be changed for 5-10 years. Where they do not have to test whether their game will work on millions of potential hardware combinations. Smaller development budgets (more for development, less for testing and QA), and the ability to code to a GPU's strengths without having to worry about whether it works on hardware combination #322195 versus #29102. They just have to worry about whether it works on hardware combination #1 or #2 (for multiplatform games) or just combination #1 (console exclusives). It's no surprise they can do more with inferior hardware.

Regards,
SB
 
At most, and I'm being extremely generous here, I'd be looking at 6770/6870 levels of GPU performance, or whatever the Nvidia equivalent is. But there's a good chance it'll be a 6770 or lower level of performance. Likewise, look at the 100 USD or less CPUs on the market to see roughly what level of CPU will be used.
I disagree. At least up until now, that has never happened before. The first Xbox and the Xbox 360 (PS3 too) both used state-of-the-art GPUs that were available at the time, and they also featured more CPU cores than any other consumer CPU out there... and it seems to me that this trend will continue.

The Xbox 360 and PS3 have already taken their sweet time in the market and are nearing a decade in age. Just like the previous console generation, they will be withdrawn in the 2013-2014 time frame, some new powerful hardware will replace them, and the cycle will continue.
 
The first Xbox and the Xbox 360 (PS3 too) both used state-of-the-art GPUs that were available at the time, and they also featured more CPU cores than any other consumer CPU out there... and it seems to me that this trend will continue.

But there were faster GPUs and CPUs out at the time...
 
Silent_Buddha said:
And you just proved my point. 14.92% is a minority of PC gaming. It doesn't matter if it is growing faster; it's still going to be the minority for at least the next half decade, while the majority will be at 1680x1050 or below. In other words, for any game currently in development, if it isn't targeting the enthusiast, they still have to consider those on lower-resolution displays with less-than-enthusiast graphics cards.

Oh look, Steam just updated their data:

1920x1080: 23.73%, and the number 1 overall resolution...

Still waiting on that HTPC market share....
 
I disagree. At least up until now, that has never happened before. The first Xbox and the Xbox 360 (PS3 too) both used state-of-the-art GPUs that were available at the time, and they also featured more CPU cores than any other consumer CPU out there... and it seems to me that this trend will continue.

The Xbox 360 and PS3 have already taken their sweet time in the market and are nearing a decade in age. Just like the previous console generation, they will be withdrawn in the 2013-2014 time frame, some new powerful hardware will replace them, and the cycle will continue.

The G80 was already well into development and released in a similar timeframe to the PS3, yet they went with a G71 derivative that was inferior to the GPU in the X360 despite launching 1 year later. The GPU in the X360 was at least somewhat contemporary with what ATI was capable of at the time, but also much more expensive.

The CPU in the X360 was certainly nowhere near the top of what Intel could produce. And it could be argued either way whether the CELL used in the PS3 was better. It did some things quite well and yet many other things quite a bit worse than even the budget CPUs at the time.

And you seem to conveniently ignore the part where I mentioned that neither company is likely to go with another loss leader for a console.

The BOM for PS3 was significantly higher than the 499 and 599 USD that it launched at and contributed heavily to the company going from profitable to losing money. Even the PS2 before that was a loss leader and sold for a loss.

The BOM for the X360 was likely also higher than the 299 and 399 USD price points that it launched at, but not so high that MS would have been in the hole, given royalties from software sales to shore up the bottom line, had it not been for the RROD issue with the launch units.

Again, I'm expecting both companies to target the 399 USD price point for launch, with a BOM well under that, which does not allow the use of enthusiast-level hardware.

Oh look, Steam just updated their data:

1920x1080: 23.73%, and the number 1 overall resolution...

Still waiting on that HTPC market share....

23.73% is still quite a bit less than the roughly 65% sitting below it (1920x1080, 1920x1200 and 2560x1440 together only add up to 32.76%), which still makes 1080p and above a minority of the PC gaming install base. On the other hand, as I mentioned, 1680x1050 and below still constitutes the majority of the PC gaming install base.

You'll be waiting a long time for those HTPC numbers as I never said I'd provide them. As you'll note, I used them just as a convenient example of a situation where you would not notice a difference between a game at 1080p and the same game at 720p. BTW, most HTPCs will be using that 1920x1080 resolution from the Steam survey, but the people will most likely be gaming at 720p in demanding games, or 1080p in less demanding games, as I do on my budget/integrated GPU. Although I'm sure some are using something beefier than my equivalent of a 5450.

Regards,
SB
 
And you seem to conveniently ignore the part where I mentioned that neither company is likely to go with another loss leader for a console.

Again, I'm expecting both companies to target the 399 USD price point for launch, with a BOM well under that, which does not allow the use of enthusiast-level hardware.
And you seem insistent on ignoring the fact that this will not be something new; the same thing happened in the previous console generation. Sony and MS will exhaust the current generation, probably for 2 or 3 more years, and when the time comes, shiny new hardware will be introduced, just like in the days of the PS2 and Xbox 360.

In fact, I think a 399 USD price is unthinkable right now for a new console; it would cannibalize the sales of the old hardware. This is not how a triple-A console cycle is introduced, and history is the ultimate teacher in this matter.

Also, newer consoles will probably ship with increased emphasis on extra features like 3D and motion sensors, which should push prices up a bit.

The G80 was already well into development and released in a similar timeframe to the PS3, yet they went with a G71 derivative that was inferior to the GPU in the X360 despite launching 1 year later. The GPU in the X360 was at least somewhat contemporary with what ATI was capable of at the time, but also much more expensive.
You don't expect GPU manufacturers to put their development plans on hold just because they released a console version, do you? They have plans for GPUs at least 3 years in advance!

Besides, the PS3 launch was delayed at least once, and its design philosophy was different, with the emphasis on the CPU instead of the GPU.

The CPU in the X360 was certainly nowhere near the top of what Intel could produce. And it could be argued either way whether the CELL used in the PS3 was better. It did some things quite well and yet many other things quite a bit worse than even the budget CPUs at the time.
Yep, but they had a higher thread count than most CPUs on the market at that time, and a frequency right up there with the high-end desktop CPUs, a fact that has helped them stay in decent shape up until now. The CELL can even have graphics workloads offloaded to it!
 
Silent_Buddha said:
You'll be waiting a long time for those HTPC numbers as I never said I'd provide them.
Without market share, your "point" is rather pointless in the context of this conversation. Regards.

Alexko said:
Exactly. They could if they wanted to, but they're not doing it, which means they don't want to.
I wasn't aware Nvidia had released any information regarding their 28 nm mobile chips, so how do you know they aren't doing it?
 
Without market share, your "point" is rather pointless in the context of this conversation. Regards.

I wasn't aware Nvidia had released any information regarding their 28 nm mobile chips, so how do you know they aren't doing it?

Well, they haven't done it so far, and their first 28nm SoC will be Kal-El+, basically a shrink of Kal-El. I suppose they could change their mind in the future (e.g. with Wayne) but I can't really see why.
 