NVIDIA shows signs ... [2008 - 2017]

Nah, we don't know that the first one is necessarily true. After all, G80 kicked ass in games even while burdened with all the extra CUDAness. Besides, for all their talk of being focused on games, it's not like AMD has set a high target with RV870 so far. The disappointing scaling in light of the transistor increase isn't getting as much attention due to Nvidia's pitiful execution.
 
The last slides are indeed funny. And what would you expect anyway? Nvidia's CEO pretty much laughed at Larrabee's PowerPoint launch and now gets laughed at for his "wood screws" mock-up card. Sounds fair to me.

It's a shame that Larrabee won't launch before Fermi because that would have been hilarious.
 
I found the slides quite pleasurable to read. The pleasure comes from the change of pace: for once the FUD is being spread from the red side of the fence.

So are we going to see this war of words escalate? Twitter, PDFs, and Nerf battles in the carpark next?
 
And Eyefinity is innovation now (poor Matrox).

So you play all those games on Matrox?
Yeah, poor Matrox for their crappy GPUs.

- Go and try to play all those games AMD showed, flawlessly, at those 7000-wide resolutions on a Matrox.
- LOL. If a game even starts on that Matrox without a bug or crash, it's a superb win. Not even going to talk about the raw performance needed to keep up the pace at those huge resolutions.

Eyefinity is innovation, and that's not even questionable, because it puts together raw power, massive multi-monitor support, and massive resolutions in a flawless way. Just plug and play, with the quality and stability provided by ATI's drivers and hardware.

You simply can't get this from Matrox or Nvidia or any other vendor. Only ATI provides these key elements for massive multi-monitor setups. Being first doesn't mean that anything that comes after is not innovation; otherwise nobody could ever innovate, since the wheel was invented many thousands of years ago.
 
What's even funnier is that, according to AMD's own slides, their Cypress isn't far behind Fermi (if at all) on the GPU compute front.
So what's all this fuss about then?

Are you comparing OpenCL support or actual hardware capabilities? Because Fermi has some big features that Cypress can't touch.
 
Unified Address Space
Generalized Caches
Indirect Branching
Recursion
Pointers
Function Pointers / Virtual Functions
Exception Handling
ECC
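A minimal CUDA sketch of two items from that list, device-side recursion and function pointers. It assumes a Fermi-class part, i.e. compute capability 2.0; the file name and function names are made up for illustration:

Code:
// Minimal sketch: device-side recursion and function pointers, both of
// which need compute capability 2.0 (Fermi) or later.
// Hypothetical build line: nvcc -arch=sm_20 fermi_features.cu
#include <cstdio>

__device__ int factorial(int n) {
    // Recursion in device code is illegal on pre-Fermi (sm_1x) parts.
    return (n <= 1) ? 1 : n * factorial(n - 1);
}

typedef int (*UnaryOp)(int);
__device__ int twice(int x)  { return 2 * x; }
__device__ int square(int x) { return x * x; }

__global__ void demo(int n, int pick) {
    // Indirect call through a device function pointer chosen at run time.
    UnaryOp op = pick ? square : twice;
    // Device-side printf is itself an sm_20 feature.
    printf("factorial(%d) = %d, op(%d) = %d\n", n, factorial(n), n, op(n));
}

int main() {
    demo<<<1, 1>>>(5, 1);
    cudaDeviceSynchronize();
    return 0;
}

Compiled for sm_1x instead, nvcc refuses both the recursion and the indirect call, which is exactly the gap the list above describes.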
 
Nicely done, AMD.
Charlie, I sure hope you will write about AMD's anti-Fermi FUD campaign soon.

What is FUD about it? To quote from Kyle's piece: "I am not going to editorialize a lot here except to say I find very little I can disagree with AMD on this." Stinging criticism there, and PCGH is equally harsh.

Had I seen it before it was on at least two other sites, I would have written it up. Then again, I wouldn't have been any more critical than Kyle or Carsten. As far as I can tell, this presentation is dead on accurate, and they don't say anything that can't be backed up.

Compare and contrast that to something like this:
http://www.theinquirer.net/inquirer/news/1035036/nvidia-spreads-anti-intel
that contains 'facts' that are not only wrong, but easily provable to be such with the merest check on Google. Same with this:
http://www.theinquirer.net/inquirer/news/1004863/nvidia-slides-reveal-masterhood
specifically slide 20. Provably incorrect, and it was at the time. There is a term for that, something like "this puppy is...", no, not that one.

In the end, much as you are frothing about this presentation, it is factually dead on, and does very little if anything that could even be called distasteful.

But, since you brought it up, what parts about it do YOU find objectionable? Please include links to things that back up your arguments, and no, BSN doesn't count.

-Charlie
 
It doesn't matter who I like; it's how they quote: completely out of context and without any background.
And the last slides are funny, yeah. And Eyefinity is innovation now (poor Matrox).

Did Matrox allow for a single surface in Windows on three (or more) monitors? Did Matrox do 6 monitors?

-Charlie
 
The last slides are indeed funny. And what would you expect anyway? Nvidia's CEO pretty much laughed at Larrabee's PowerPoint launch and now gets laughed at for his "wood screws" mock-up card. Sounds fair to me.

It's a shame that Larrabee won't launch before Fermi because that would have been hilarious.

It is a much closer call than people think at the moment. It is a case of whether Intel's driver people get their act together before Nvidia's silicon engineers do. It could go either way, PR stunts notwithstanding.

You could point out however that Intel demo'd real Larrabee silicon before Nvidia did. :)

-Charlie
 
So you play all those games on Matrox?
Yeah, poor Matrox for their crappy GPUs.

Do you even remember when Parhelia came out? You could actually play games of the time on it using multiple screens. Reviews gushed about how spiffy and awesome this triplehead gaming revolution would be. Now I guess Eyefinity sounds better or something.
 
Parhelia did indeed allow a single virtual display surface to be stretched across two or even three monitors. That included the Windows desktop (complete with your start menu across all display devices), and games could do the same.

So to that point, Matrox did indeed beat ATI to the "Eyefinity" punch. I do believe there were some resolution limits, but I can't remember what they were and I'm not immediately finding mention of them online. I'm sure Google has it somewhere...
 
I don't see anything wrong with the slides. It is funny how they use Jen-Hsun's own quotes to point out how inconsistent Nvidia are.

Until Fermi ships, it's in the same state that R600 was in compared to G80: it looked great on paper, but execution and performance at launch were a whole different story.

Fermi looks promising, but as with anything else, I'll wait and see how it actually turns out. Paper specs don't mean much if the hardware doesn't deliver once you have it.

Regards,
SB
 
Parhelia did indeed allow a single virtual display surface to be stretched across two or even three monitors. That included the Windows desktop (complete with your start menu across all display devices), and games could do the same.

So to that point, Matrox did indeed beat ATI to the "Eyefinity" punch. I do believe there were some resolution limits, but I can't remember what they were and I'm not immediately finding mention of them online. I'm sure Google has it somewhere...

I think Matrox blamed the resolution limits on DirectX back then. Don't know if it was true, though.
 
Parhelia did indeed allow a single virtual display surface to be stretched across two or even three monitors. That included the Windows desktop (complete with your start menu across all display devices), and games could do the same.

So to that point, Matrox did indeed beat ATI to the "Eyefinity" punch. I do believe there were some resolution limits, but I can't remember what they were and I'm not immediately finding mention of them online. I'm sure Google has it somewhere...

Eyefinity does add a bit more than just 3 displays and a single virtual display.

For the current shipping cards, it also adds 2x1 + 1 configurations, where one virtual display spans 2 monitors and the third monitor stays independent.

For the future card with 6 display outputs, that configurability goes up: you could have dual 3x1, or triple 2x1, or 4x2 + 2x1, etc. Or just 6 independent displays.
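To make the grid-to-surface mapping concrete, here is a minimal host-side sketch (plain C++, no real AMD API; every name in it is made up) that computes the virtual surface resolution for some of the layouts above, ignoring bezel compensation:

Code:
// Minimal host-side sketch (no real AMD API; names are made up) of how a
// cols x rows Eyefinity-style group maps to one virtual display surface.
#include <cstdio>

struct Surface { int width, height; };

// Identical panels in a cols x rows grid yield one virtual surface whose
// resolution is the panel resolution scaled by the grid shape. Real bezel
// compensation would add hidden pixels between panels; ignored here.
Surface virtualSurface(Surface panel, int cols, int rows) {
    return { panel.width * cols, panel.height * rows };
}

int main() {
    Surface panel = { 1920, 1080 };
    int layouts[][2] = { {3, 1}, {2, 1}, {4, 2} };  // layouts mentioned above
    for (const auto &l : layouts) {
        Surface v = virtualSurface(panel, l[0], l[1]);
        printf("%dx%d group -> %d x %d virtual surface\n",
               l[0], l[1], v.width, v.height);
    }
    return 0;
}

A 3x1 group of 1920x1080 panels, for example, looks to a game like a single 5760x1080 render target, which is the whole trick: the game never has to know how many monitors are behind it.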

I don't think a 6-display card is going to be the bee's knees with gamers, but it is pretty compelling for development environments and data analysis, as well as quite a lot of other professional applications.

Although for something like that, it'd probably be cheaper to just get two 5750s, unless you needed the flexibility of configuring multiple virtual displays. I could see some interesting applications for a 2x1 portrait display alongside a 4x2 landscape virtual display.

Regards,
SB
 