Competition over?

Just read this over at Legion hardware.
The NVIDIA CEO Jen-Hsun made it quite clear that the graphics industry would no longer be the company's primary focus.

The graphics industry could get quite interesting in the next several years, given what this could mean for the evolution of a new main competitor for ATI. Although it makes me wonder why they even spent money on development of the NV40 if this is the case? ;)
 
Ascended Saiyan said:
Just read this over at Legion hardware.
The NVIDIA CEO Jen-Hsun made it quite clear that the graphics industry would no longer be the company's primary focus.

The graphics industry could get quite interesting in the next several years, given what this could mean for the evolution of a new main competitor for ATI. Although it makes me wonder why they even spent money on development of the NV40 if this is the case? ;)
In my opinion, Jen-Hsun means nVIDIA is more than a graphics company.
 
Ascended Saiyan said:
In my opinion, Jen-Hsun means nVIDIA is more than a graphics company.
Agreed. I think Legion Hardware was applying their own tabloid-style spin on Nvidia's announcements.

They're diversifying, like all companies have to do as they age. This latest fiasco is a prime example of exactly why they have to do that - you can't expect every new GPU or processor design to totally dominate the competition.

When you release a turd, you had better have some other avenues to generate revenue while your R&D team gets back to the drawing board. Otherwise...3dfx, anyone?
 
Interesting how these two business survival paradigms constantly replace one another.

1) Diversify to limit risk

or...

2) Sell off superfluous assets and concentrate on the core competence
 
LoL! I thought this was about the shader competition. BTW, shouldn't this be in the industry forum?
 
Dave Glue said:
Agreed. I think Legion Hardware was applying their own tabloid-style spin on Nvidia's announcements.

They're diversifying, like all companies have to do as they age. This latest fiasco is a prime example of exactly why they have to do that - you can't expect every new GPU or processor design to totally dominate the competition.

When you release a turd, you had better have some other avenues to generate revenue while your R&D team gets back to the drawing board. Otherwise...3dfx, anyone?

I agree as well.

Things don't look too good, IMO, for nVidia in the "high-end discrete GPU" business for the next few years. IBM is having issues with low-k... (isn't better low-k at IBM the supposed reason for nVidia switching the high-end parts to IBM?) ATI has the "in" with Microsoft in terms of the next DirectX via X-Box....

I don't see nVidia having a good shot at redeeming itself in the enthusiast space as a "clear leader" until we are in a post X-Box2 / Longhorn era.

This, of course, doesn't mean that nVidia is doomed as a company...far from it. It does mean that nVidia might be recognizing this, and sending out signals about a potential shift in R&D expenditures and investment for the time being. Which is arguably what they should do if their high-end market is going to take a hit.
 
NVIDIA were always a diverse company, but I think their core business is now under attack (still not defeated IMO - we will see soon if ASUS decide to go with ATI too).

It might also mean, reading between the lines, that NVIDIA do not think NV40 will rescue them from their current predicament either.

NVIDIA has a lot of work to do to be competitive with ATI and the R3xx, and we have no real idea what the R4xx is yet. Brace yourself, NVIDIA.

I can see the future, and it says NVIDIA do badly in their diverse markets (already it seems not many manufacturers are going with nForce3, probably due to the expense of the design), and that in the future NVIDIA release a statement saying they will be concentrating on their core business and selling off some of their other businesses. Well, it did happen to Intel recently. ;)

Of course, that is one version of the future... the other one says NVIDIA do better in the chipset, PDA, etc. markets and leave their current core business.

Another version is NVIDIA do well in their core business and subsidiaries in the future.

.... yea I am covering all the bases here. Hehe, and I too thought this thread was about the shader competition!!!
 
I actually think he meant that they didn't want to put all their eggs in the same basket, that's all, not that they were quitting.
It could also be an elegant way to excuse themselves for the poor DX9 performance of their current cards.
Anyway, in any case, competition is not over; there's still XGI and perhaps S3.
 
CosmoKramer said:
Interesting how these two business survival paradigms constantly replace one another.

1) Diversify to limit risk

or...

2) Sell off superfluous assets and concentrate on the core competence
Well, diversification takes investment, so diversification is what smart companies do when they are becoming successful. nVidia started diversifying a little while ago with the first nForce. This is sure to continue.

Focusing on core competence is something for companies to do when they're not doing as well. This is the reason that companies should diversify, so that they can focus on those things that are making the most money for them.

That is, your statement sounds to me like you think these two strategies are in opposition. I say that they complement one another, and are, in fact, part of the same strategy.
 
Joe DeFuria said:
Things don't look too good, IMO, for nVidia in the "high-end discrete GPU" business for the next few years. IBM is having issues with low-k... (isn't better low-k at IBM the supposed reason for nVidia switching the high-end parts to IBM?) ATI has the "in" with Microsoft in terms of the next DirectX via X-Box....
Having a console deal doesn't mean a company will succeed. While the revenues from X-Box sales have helped nVidia hugely over the past year, it takes a large investment to produce such a processor. Console deals have killed companies' PC parts before, and fingers have been pointed at nVidia's own X-Box deal as having hurt the NV30.

As far as nVidia's relationship with Microsoft is concerned, we'll have to see. Things can change very quickly. But yes, unfortunately nVidia does need a very strong relationship with Microsoft in order to have a part that is considered the enthusiast's GPU.
 
Chalnoth said:
That is, your statement sounds to me like you think these two strategies are in opposition. I say that they complement one another, and are, in fact, part of the same strategy.

Ostensibly that is the most rational strategy. However, I know that many CEOs have an affinity for one or the other of those approaches, regardless of how the company is actually doing.
 
CosmoKramer said:
Ostensibly that is the most rational strategy. However, I know that many CEOs have an affinity for one or the other of those approaches, regardless of how the company is actually doing.
Yep. Which is why I put the "smart" qualifier in there (well, in the first part...guess I could have used that qualifier to better effect...).

The simple reason is: there are a hell of a lot of stupid people out there. I'm very convinced that our school system isn't just "dumbed down" as so many seem to think, but that the method of teaching and the focus are just wrong. People learn to memorize instead of to think. I'm currently the TA in a physics course for bio majors (who largely go on to pre-med and stuff), and it's just scary how resistant they are to actual thought.
 
I'm not entirely sure why ATi having a fairly large hand in DX10 need be the final nail in the coffin for nVidia - it's hardly as if MS doesn't talk to people other than the GPU developer behind the XBox chip at that particular time. Sure, ATi might end up having more influence, but nV (or any other company) aren't going to be shut out. They still have just as much chance as ATi to make a decent DX10 GPU, if they're willing to put the work and effort into it.
 
PaulS said:
I'm not entirely sure why ATi having a fairly large hand in DX10 need be the final nail in the coffin for nVidia -

I didn't say nail in the coffin.

I said that I see it being extremely difficult for nVidia to regain the top technological spot until post-DX10, given ATI's hand in DX10.

Sure, ATi might end up having more influence, but nV (or any other company) aren't going to be shut out.

Nor have I claimed that.

They still have just as much chance as ATi to make a decent DX10 GPU, if they're willing to put the work and effort into it.

Disagree completely.

ATI has a much bigger chance at having the "best" DX10 GPU. (Best defined as a combination of time-to-market, most DX10 feature support, and the LEAST amount of wasted transistors for "non-DX10-supported" features.)

I'm in no way claiming that ATI can't really mess up and drop the ball. Or can't fail out of pure bad luck. Anything can happen in this industry. But to say that nVidia (or anyone but ATI) has just as much of a chance at the leading DX10 GPU is pretty silly, IMO. ;)

In short: ATI clearly has the best chance...though that doesn't guarantee success.
 
Firstly, my points were in response to a general theme which has been in people's posts - I wasn't answering your points directly :)

Regardless - yes, ATi perhaps have a slight advantage, but I'd argue that it's far less than you're making out. ATi might well be able to sculpt DX10 in the direction they're taking already, but - as I said - no one else is going to be shut out. Everyone is going to know the specs well ahead of time, so it's not as if ATi is going to be 6 months ahead because MS decided to get some kind of secrecy pact going.

It'll be easier for ATi than nV, sure, but I don't think - if the effort is put in - that nV need be at much of a disadvantage. The key point is whether they WANT (or are able) to put the effort in now, like they didn't with DX9. If they're still going to push for standards other than DX, then they'll be in trouble.

...More so :D
 
PaulS said:
Firstly, my points were in response to a general theme which has been in people's posts - I wasn't answering your points directly :)

NP ;)

The key point is whether they WANT (or are able) to put the effort in now, like they didn't with DX9.

Who said nVidia didn't put in the effort with DX9? Or that, if nVidia had decided to play closer to the DirectX 9 specs, they wouldn't have been delayed by more than 6 months or a year to revamp their architecture?

That's sort of my point.

Look at DX8...mostly to nVidia spec because of X-Box. ATI was both 6+ months late with their DX8 part, and it came in "over specced".
 
Joe DeFuria said:
Or that, if nVidia had decided to play closer to the DirectX 9 specs, they wouldn't have been delayed by more than 6 months or a year to revamp their architecture?

And that's my point. They chose to direct their efforts elsewhere rather than sticking to DX9. They could do the same again.

Joe DeFuria said:
Look at DX8...mostly to nVidia spec because of X-Box. ATI was both 6+ months late with their DX8 part, and it came in "over specced".

But what's to say that had anything to do with how DX8 development was carved up? Internal problems at ATi (away from engineering, perhaps?) could well have delayed them.

I'm not saying that you're definitely not right, merely that it's a possibility.
 
PaulS said:
And that's my point. They chose to direct their efforts elsewhere rather than sticking to DX9. They could do the same again.
Actually, I think Joe's point was that NVIDIA put in the work, but their vision of what DX9 was going to be wasn't the one that ended up being ratified, so it was either "make do with what ya got" or "redesign and be even more late".
 
PaulS said:
And that's my point. They chose to direct their efforts elsewhere rather than sticking to DX9. They could do the same again.

But why? Because nVidia had already invested in "their vision" of DX9. (Just as they are now investing in "their vision" of DX10.) If MS happens not to choose "your vision", then you are at a disadvantage. And I don't see MS favoring anyone's vision of DX10 over ATI's, due to X-Box.
 