How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of HD5xxx-series?

  • GT300 Performance Preview Articles

    Votes: 29 19.7%
  • New card based on the previous architecture

    Votes: 18 12.2%
  • New and Faster Drivers

    Votes: 6 4.1%
  • Something PhysX related

    Votes: 11 7.5%
  • Powerpoint slides

    Votes: 61 41.5%
  • They'll just sit back and watch

    Votes: 12 8.2%
  • Other (please specify)

    Votes: 10 6.8%

  • Total voters
    147
Status
Not open for further replies.
Scali, DirectX 11 will support DX10 hardware, and even DX9 hardware; DX Compute will have a "Compute-on-10" profile, supposedly compatible with all Nvidia hardware and Radeon 4xxx cards.
http://www.behardware.com/news/10380/1st-directx-compute-driver-from-nvidia.html
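The downlevel story described above can be sketched as a small lookup table (illustrative only, not a real API; which cs_4_x profile a DX10-class part actually exposes depends on the driver):

```python
# Illustrative mapping from Direct3D feature level to the best
# DirectCompute shader profile it can target, per the downlevel
# ("Compute-on-10") scheme described above.
COMPUTE_PROFILES = {
    "10_0": "cs_4_0",   # DX10-class GPUs (e.g. GeForce 8/9/GTX 200)
    "10_1": "cs_4_1",   # DX10.1-class GPUs (e.g. Radeon HD 4xxx)
    "11_0": "cs_5_0",   # full DirectCompute 5.0 on DX11-class GPUs
}

def compute_profile(feature_level):
    """Return the best compute shader profile for a feature level,
    or None if the hardware predates compute support entirely."""
    return COMPUTE_PROFILES.get(feature_level)
```

So a DX9-class part (feature level 9_x) gets no compute profile at all, which is why the split only reaches back as far as DX10 hardware.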

I believe game developers don't really care about DX10 but will be more likely to write for DX11 (fully supported under Vista).
I also believe that a solution which splits the userbase won't get much attention from developers, and will thus only get a handful of games and tech demos (as with PhysX today).
Relevant physics-accelerated games will use DX11. Whether PhysX, OpenCL, Compute level 11 or Compute level 10 becomes the focus of attention, we'll see.

PhysX is probably the most mature and fastest of the lot, but it looks like the black sheep on that list.
As for the argument that consumers will overlook the DX version number and care about PhysX: I believe you overestimate them, and overestimate the mindshare PhysX has outside of hardware forums and hardware news sites. These are the people who bought an AGP GeForce 6200 for their clunker back then (if their knowledge ran that deep), and who will choose their video card (or rather, the computer that holds it) by the model number (if they are knowledgeable enough).
 
Nvidia: DirectX 11 Will Not Catalyze Sales of Graphics Cards.
Well, I guess that means we won't see GT300 anytime soon. And we have a clear answer on how Nvidia is going to counter RV870.

No way on earth are Nvidia going to convince the general public that graphics cards are not about graphics, that DX11 is not important, and that people should just ignore the upcoming graphics benchmarks where the red bars will be much higher than the green bars.

I think this is the first official statement from Nvidia that indicates that GT300 might really be in a lot of trouble.
 
I'm willing to listen to Nvidia's sales pitch but that's about it. No way will I be suckered into upgrading just for GPGPU.
 
How is this possible? How could Nvidia not even have a couple of cheap OEM DX11 chips ready for the Windows 7 launch? A top-of-the-line chip with a ridiculous number of transistors being late is understandable. But nothing even for the bottom-of-the-line Dells and HPs of the world?
 
Wow, did he really just say nobody needs higher framerates or resolution? :LOL:

Though I guess he has a point, considering only 10% of people in the Steam survey run higher than 1680x1050. But still, that's not the sort of thing that should be coming from a graphics giant. Unless Nvidia now fancies itself to be in the "compute" industry...
 
How is this possible? How could Nvidia not even have a couple of cheap OEM DX11 chips ready for the Windows 7 launch? A top-of-the-line chip with a ridiculous number of transistors being late is understandable. But nothing even for the bottom-of-the-line Dells and HPs of the world?

By their own admission, they aren't looking to make the "experience" cheaper or more affordable. Instead they want to add value (more features at a higher rate of return). They see themselves as akin to Intel vs. AMD on the CPU side (9:02). In regard to Batman: Arkham Asylum with PhysX and stereoscopic 3D: "an experience gamers are truly asking for, and when they are asking for it, they have to be willing to pay for it" (9:46).

My favorite comment: "You don't revolutionize it (the market) by making your product cheaper"
 
Yeah that is an effective strategy. But it's completely dependent on consumers perceiving what they're doing as value creation. And of course, it will blow up in their face spectacularly if AMD is able to match them at their compute game while continuing to pursue their "cheap" strategy.

I think Nvidia is barking up the wrong tree trying to position general computing as a competitor to their core graphics business (which is still their bread and butter by far). Yes developers will integrate more and more compute stuff into their engines through DXCS but there's no indication that AMD's hardware will be deficient at that in any way and Microsoft isn't going to favor Nvidia hardware the way Nvidia can with CUDA. And PhysX can only do so much. While I agree that higher framerates probably aren't the focus in this consolized world we live in I'm not really seeing where else they can leverage general compute functionality in games to improve the user experience....
 
What confuses me a bit is that in the discussion, the representative rails against (MCM) offerings that bundle many "old and cheaper" features in an all-in-one package, yet six minutes later touts Tegra as "a fairly good representation of how you would want to build a handheld computing device" and even cites its performance-per-watt ratio, when (again) earlier he drew a distinction that ATI/AMD had chosen that path much earlier and that Nvidia is looking to distinguish itself in a different category. All together though, PR double-talk aside, I'd say it was informative, and for the most part he delivered a sensible representation of Nvidia's goals and outlook.
 
Saying DX11 isn't important is a pretty bad sign for the status of GT300 or their other DX11 chips, at least in the near term. You sure as hell don't talk down DX11 if you plan on releasing DX11 cards soon.
 
Unless they come out with comparable performance, AMD will be sitting pretty market share wise at the end of the year.

Jack of all trades, master of none.
 
These stereoscopic doodads and such are silliness. A lot like Eyefinity...

I agree. Seeing gameplay across six monitors is an impressive demo, but is anyone actually going to put six monitors in their computer room? Likewise, the 3D shutter glasses produce an impressive stereo effect, but who actually wants to wear those dumb plastic toys on their face every time they play a video game? These features don't have mass-market appeal, IMO.
 
Reminds me of NV saying in late '99 that you don't need anti-aliasing, just run at higher resolutions.
Wow dude, you're old! :oops:

;)

I'm sort of stunned speechless. I could almost see Nvidia going this route, since it's pretty much their only choice if they have no new card, but for the life of me I can't see someone saying that with a straight face and honestly believing it if they know anything about the graphics/gaming world. :???:
 
I agree. Seeing gameplay across six monitors is an impressive demo, but is anyone actually going to put six monitors in their computer room? Likewise, the 3D shutter glasses produce an impressive stereo effect, but who actually wants to wear those dumb plastic toys on their face every time they play a video game? These features don't have mass-market appeal, IMO.
I had the H3D glasses, and then the Wicked3D drivers. It was amusing at first, but after a while they went into the closet, and then later got thrown out. I doubt I'll go for 3D glasses in the foreseeable future, even if they have improved.

The three-monitor setup looks interesting. I've seen a three-monitor racing game setup that basically puts you inside the car, and I can picture how peripheral vision could help immersion in flight simulators and even shooters. Still, I can't see that many people going to setups of three or more monitors.
 