Is the writing on the wall? ATI and NV doomed!

What is going to happen to ATI and NV over the next 5-10 years?

  • No, they will figure out a way to survive!
  • They will probably merge with larger companies like AMD and Intel.
  • ATI and NV will merge together to stay competitive!
  • ATI and NV are the future!
  • PowerVR is the future!
  • I don't care as long as more episodes of RED DWARF get made!

Total voters: 209
Mulciber said:
which is quite obviously not their intent, given the design targets of the Athlon64 and their intention to migrate to multi-core processors.

I don't think going multi-core and increasing clock speed are mutually exclusive.
 
Well, I am a noob in terms of the knowledge you guys have, but all I can say is that, looking at the current trend, I sincerely believe we will be in no hurry to merge CPUs and GPUs together anytime soon. There is still room for processors to increase in performance; AMD seems to be on the right track with their 64-bit Athlons and FXs, and multi-core will be the wave of the future. Then there is the SLI technology being released by Nvidia and the X2 by Alienware, and I am sure ATI will come out with something of their own. Dual-core or even quad-core GPUs might come out, and with PCI Express there is the possibility of running them in parallel; running four such graphics cards is not too hard to imagine. Seems like there is plenty of room for improvement. But I do not think for one moment that software renderers running on CPUs will ever be as fast as hardware; I don't think it makes sense. Then there are people like Tim Sweeney and John Carmack pushing the limits beyond what we are able to imagine. Overall, I think that for at least the next 10 years there are going to be amazing increases in processor horsepower, GPU or CPU. My 2 cents :)
 
Not sure if anyone has mentioned this, but....

The only threat to 3D graphics on the PC is the rather unhealthy state of the PC games market, not any possible move towards general-purpose software rendering. PC game sales are a poor relation to console sales (Doom 3 is currently the exception proving the rule that console games are always at the top of the games charts). Plus, IMHO, PC games are pretty awful. Let's face it, Doom 3 is a pretty awful game; sure, it's the best-looking game ever, but I couldn't manage more than 30 minutes before deciding that it was simply too tedious to play, and nearly everyone I know who has tried it feels the same way.

There's an awful lot of stagnation in the PC games market and far too few genuinely good games (God help us if HL2 turns out to be as krap as Doom 3), and if you throw in the excessive extent to which PC games suffer from piracy, it's not a pretty picture.

I'm not saying PC gaming is dead or anything silly like that (online games are perhaps the key to its future success), but it is a struggling industry and one that I can't really see driving sales of 3D hardware over the medium term to the same extent it has in the past. I think the fact that both NV and ATI are moving to a slower product cycle is in part an admission of this - on the whole, I don't think the demand is really there outside a relatively tiny number of enthusiasts. Likewise, I don't see that Longhorn is really going to drive technology forward - surely current mid- to high-end DX9 cards already have the horsepower required by Longhorn, so all Longhorn will provide impetus for is pushing that technology into the mainstream.

Of course, if Xbox 2 and XNA really kick off, then the PC can be dragged along on their coattails. Anyway, PC 3D tech will obviously progress, and perhaps some of the core tech between consoles and the desktop will merge, but I think the one thing that does not threaten ATI and NV is a shift to doing rendering work on the CPU.
 
caboosemoose said:
I think the fact that both NV and ATI are moving to a slower product cycle is in part an admission of this - on the whole, I don't think the demand is really there outside a relatively tiny number of enthusiasts.

I suppose they pretty much became their own competition, much like Microsoft Windows now, or most CPUs for office machines. The 'old' products are good enough, so people don't really have a compelling reason to upgrade. I guess it's up to NV and ATi to 'invent' a reason to upgrade again. Or else they will have to wait for games to increase detail and demand faster hardware, which may be a bit of a chicken-and-egg problem.

On the other hand, the slower product cycle could also be a reaction to the fact that they couldn't significantly improve their products in the faster cycle. For example, 9700Pro vs 9800Pro/XT is not exactly shocking.
Perhaps the complexity of the chips at this time simply requires longer cycles if they want to achieve any significant improvements.
 
OT:
Plus IMHO, PC games are pretty awful.
Whoa, hang on a sec. I know you say it's your opinion, but that is obviously highly subjective. Console games may have more mass appeal, but there's a big market for PC games too. IMHO most console games are horrible (I'm a strategy game fan for the most part). I don't think the PC games industry is struggling (not any more than the film or music industry anyway; sure, they may whine all day long, but there'll always be some who make it big and some who don't).
 
caboosemoose said:
The only threat to 3D graphics on the PC is the rather unhealthy state of the PC games market, not any possible move towards general-purpose software rendering. PC game sales are a poor relation to console sales (Doom 3 is currently the exception proving the rule that console games are always at the top of the games charts).
1. There's no f'in such thing as the exception proving the rule. It's an idiotic saying with no logical merit. Pisses me off every time I hear it used.

2. PC games have always undersold console games. This is nothing new, but PC gaming still isn't going anywhere. The interface devices are just different from consoles, and so is the hardware. This means that many things that are just not possible on consoles are done on the PC, and it also means there will always be PC gamers who prefer the style of game that you get on the PC to the style you get on consoles.
 
The only threat to 3D graphics on the PC is the rather unhealthy state of the PC games market, not any possible move towards general-purpose software rendering. PC game sales are a poor relation to console sales (Doom 3 is currently the exception proving the rule that console games are always at the top of the games charts). Plus, IMHO, PC games are pretty awful. Let's face it, Doom 3 is a pretty awful game; sure, it's the best-looking game ever, but I couldn't manage more than 30 minutes before deciding that it was simply too tedious to play, and nearly everyone I know who has tried it feels the same way.

Why does Doom3 have to be brought up as a point in just about everything lately? Frankly, it didn't knock me out of my socks either, but if it stank for you on the PC, then it'll stink even more on a console. Why? I'll come to that.


There's an awful lot of stagnation in the PC games market and far too few genuinely good games (God help us if HL2 turns out to be as krap as Doom 3), and if you throw in the excessive extent to which PC games suffer from piracy, it's not a pretty picture.

A good number of the bestsellers of both worlds get ported to the other side anyway. Piracy? Are you trying to convince me that I wouldn't be able (if I wanted) to find ANY console game out there in the form of a pirated copy?

I'm not saying PC gaming is dead or anything silly like that (online games are perhaps the key to its future success), but it is a struggling industry and one that I can't really see driving sales of 3D hardware over the medium term to the same extent it has in the past. I think the fact that both NV and ATI are moving to a slower product cycle is in part an admission of this - on the whole, I don't think the demand is really there outside a relatively tiny number of enthusiasts. Likewise, I don't see that Longhorn is really going to drive technology forward - surely current mid- to high-end DX9 cards already have the horsepower required by Longhorn, so all Longhorn will provide impetus for is pushing that technology into the mainstream.

I think you've thrown everything you could come up with into a pot and jumped to too many vast overgeneralizations.

If sales were decreasing, the numbers would inevitably show it. Besides, the highest sales volumes for the IHVs in the PC graphics market are in the low-end/budget segments, which hardly have anything to do with "enthusiasts".

Real enthusiasts have a damn hard time today finding an ultra-high-end accelerator from either IHV; that fact alone not only speaks volumes about the real demand out there, it's also part of the explanation for why product cycles have to slow down. In reality it only means fewer refresh products; generations have already been taking quite a bit longer than a year for the past few years. Just because the box says GeForce4, as an example, doesn't necessarily mean it's a new product generation, does it?

Of course, if Xbox 2 and XNA really kick off, then the PC can be dragged along on their coattails. Anyway, PC 3D tech will obviously progress, and perhaps some of the core tech between consoles and the desktop will merge, but I think the one thing that does not threaten ATI and NV is a shift to doing rendering work on the CPU.

ROFL... tell you what, I'm pretty confident that if ATI's next-generation R520 delivers as much in performance (not necessarily features) as the Xenon GPU, they'll be in rather deep shit.

....there will always be PC gamers who prefer the style of game that you get on the PC to the style you get on consoles.

Beyond any doubt yes.
 
Chalnoth said:
1. There's no f'in such thing as the exception proving the rule. It's an idiotic saying with no logical merit. Pisses me off every time I hear it used.

FWIW, an old meaning of "prove" was "test", which survives in expressions such as "proving ground", "the proof of the pudding is in the eating", and "the exception that proves the rule." Studying exceptions is a good way to test a rule -- why would anyone think an exception could validate a rule?
 
Regardless of the apparent original meaning of the phrase, it seems to be used in an illogical way today. People seem to use it as a way to, quite literally, discount exceptions: this is an exception, therefore it is not the rule and not the norm, without any measurement of the supposed exception.
 
Would there be any merit, as a transition architecture, in taking the space a 2nd CPU core would use and spending it instead on a dedicated, feature-rich vector unit + a high-performance (but reasonably compact) GPU? So on a single die you would have a fast CPU + VU + GPU + embedded RAM cache, all tied together on a native, high-speed bus. Then you connect that to a unified, high-speed RAM bank (a la nForce). To scale performance, you hang additional iterations of these "modules" off the RAM bank. The one obvious design feature in all of this is that there is now a very direct route between CPU and GPU, instead of what exists now, where they are separated by all sorts of buses, memory architectures, and interfaces.

Additionally, this could open the door to more intimate processing of data between CPU and GPU, where certain tasks lend themselves better to one or the other but still require the other one for "traditional" setup. This is sort of like how some enterprising developers are trying to figure out how to do physics on a GPU. Now the door would be wide open to share data between CPU and GPU any way you wish.

I think maybe the whole philosophy behind all of this is to lift data off main memory once and keep it tied up in processing until you are absolutely finished with it and have a pixel to write back to memory (where a general video processor would skim off frames to throw onto your screen) - the sketch at the end of this post tries to illustrate that flow.

I dunno - it's just a crazy idea born out of "what else could you put there besides a 2nd CPU core?"
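
To make the data flow a bit more concrete, here is a toy C sketch of one such "module" - everything in it (the stage names, the scratchpad size, the pretend math) is invented purely for illustration, not based on any real hardware. The CPU stage loads work into the on-die scratchpad once, the VU stage crunches it in place, and only the finished pixels are ever written back to the shared RAM bank.

```c
/* Hypothetical "module" from the post above: CPU, vector unit and GPU share
 * an on-die scratchpad ("eDRAM"), and only finished pixels are written back
 * to the shared RAM bank.  Names and sizes are invented for illustration. */
#include <stdio.h>

#define EDRAM_WORDS 4096                  /* on-die scratchpad capacity                   */
#define FB_WIDTH    16
#define FB_HEIGHT   16

static float         edram[EDRAM_WORDS];               /* intermediate data stays on-die  */
static unsigned char shared_ram[FB_WIDTH * FB_HEIGHT]; /* unified RAM bank: pixels only   */

/* "CPU" stage: decide what to draw and load source data into the scratchpad once. */
static int cpu_stage(void)
{
    int i;
    for (i = 0; i < FB_WIDTH * FB_HEIGHT; i++)
        edram[i] = (float)i;              /* stand-in for scene/vertex data               */
    return FB_WIDTH * FB_HEIGHT;
}

/* "VU" stage: vector math on the scratchpad data, still on-die. */
static void vu_stage(int count)
{
    int i;
    for (i = 0; i < count; i++)
        edram[i] *= 0.5f;                 /* stand-in for transform/lighting work         */
}

/* "GPU" stage: turn processed data into pixels; the only write to shared RAM. */
static void gpu_stage(int count)
{
    int i;
    for (i = 0; i < count; i++)
        shared_ram[i] = (unsigned char)edram[i];
}

int main(void)
{
    int count = cpu_stage();
    vu_stage(count);
    gpu_stage(count);
    printf("pixel(0,0)=%d, pixel(15,15)=%d\n", shared_ram[0], shared_ram[count - 1]);
    return 0;
}
```

The arithmetic is obviously nonsense; the point is just that the intermediate data never leaves the scratchpad, and the shared RAM bank sees exactly one write per pixel.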
 
randycat99 said:
Would there be any merit, as a transition architecture, in taking the space a 2nd CPU core would use and spending it instead on a dedicated, feature-rich vector unit + a high-performance (but reasonably compact) GPU? So on a single die you would have a fast CPU + VU + GPU + embedded RAM cache, all tied together on a native, high-speed bus.
I'd definitely say so. But it wouldn't be high-performing. This could be one way Intel could go for their value line of processors, for example. But for high-end designs, well, we'd need something drastic to occur before that happens.
 
Actually, I was thinking about something like that... but more or less the opposite.
I mean, what if you had a simple Celeron-ish CPU on the graphics board?
It would be able to do things you now have to do on the regular CPU, but it could do them without the bus bottleneck.
Then you could, for example, upload an entire scene graph and let that CPU handle all the culling and the interpolation setup for animation and such (see the sketch below); it doesn't need to be a really fast CPU for that.
Perhaps it could also reduce the overhead of small batches and things like that.
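
As a rough illustration of the kind of work I mean, here is a small C sketch of scene-graph culling against the view frustum - the sort of cheap, branchy, pointer-chasing job an on-board CPU could do so that only visible batches ever cross the bus. The node layout, the plane test and the numbers are all invented for the example; this isn't taken from any real driver or API.

```c
/* Sketch of scene-graph frustum culling, the kind of work an on-board helper
 * CPU could take over.  Everything here is invented for illustration. */
#include <stdio.h>

typedef struct { float x, y, z, w; } Plane;   /* plane ax + by + cz + d, unit-length normal */

typedef struct Node {
    float        cx, cy, cz, radius;          /* bounding sphere of this node and children  */
    struct Node *child, *next;                /* simple scene-graph links                   */
    const char  *name;
} Node;

/* Returns nonzero if the sphere is at least partly inside all six planes. */
static int sphere_visible(const Node *n, const Plane frustum[6])
{
    int i;
    for (i = 0; i < 6; i++) {
        float dist = frustum[i].x * n->cx + frustum[i].y * n->cy +
                     frustum[i].z * n->cz + frustum[i].w;
        if (dist < -n->radius)
            return 0;                          /* completely behind one plane: culled */
    }
    return 1;
}

/* Recursively visit the graph; invisible subtrees are skipped entirely. */
static void cull_and_submit(const Node *n, const Plane frustum[6])
{
    for (; n != NULL; n = n->next) {
        if (!sphere_visible(n, frustum))
            continue;                          /* whole subtree rejected, nothing sent out */
        printf("submit batch: %s\n", n->name); /* stand-in for building a command buffer   */
        cull_and_submit(n->child, frustum);
    }
}

int main(void)
{
    /* A trivial frustum: a box from -10..10 on each axis, as six inward-facing planes. */
    Plane frustum[6] = {
        { 1, 0, 0, 10 }, { -1, 0, 0, 10 },
        { 0, 1, 0, 10 }, { 0, -1, 0, 10 },
        { 0, 0, 1, 10 }, { 0, 0, -1, 10 },
    };
    Node far_obj  = { 50, 0, 0,  1, NULL,      NULL,     "far object"  };
    Node near_obj = {  0, 0, 0,  1, NULL,      &far_obj, "near object" };
    Node root     = {  0, 0, 0, 60, &near_obj, NULL,     "root"        };

    cull_and_submit(&root, frustum);           /* prints root and near object only */
    return 0;
}
```

Rejecting a whole subtree with a single sphere test is exactly the kind of work a small general-purpose core handles well and a GPU doesn't, which is why it seems like a decent fit for such a helper chip.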
 
Just came across this panel discussion "GPUs and CPUs: The Uneasy Alliance?".

The panel is very impressive:

Mike Doggett (ATI Technologies)
Dave Kirk (NVIDIA)
Adam Lake (Intel Corporation)
Bill Mark (University of Texas at Austin)
Neil Trevett (3Dlabs)

Moderator: Peter N. Glaskowsky (MicroDesign Resources)


Panel Description
Today's high-end 3D chips are fully programmable microprocessors with extraordinary computational power. On suitable tasks--such as those associated with 3D rendering--these graphics processing units (GPUs) are orders of magnitude faster than conventional CPUs. We have invited representatives from the four most important providers of 3D chips, and a leading academic expert on this subject, to discuss how GPUs and CPUs will both cooperate and compete as computational resources within personal computers and workstations.

http://www.cs.unc.edu/Events/Conferences/GP2/program.shtml

I didn't see anyone from IBM on the panel.
 