NVIDIA GF100 & Friends speculation

300 euros to 400 euros is a curiously wide range ... has NVIDIA simply not decided yet, or is it just not telling its AIBs yet?
I would hazard a guess that the info is being kept tight and will be revealed by NVIDIA itself sometime before CeBIT is over.

rpg.314 said:
Good luck with getting people to cough up money if you lose in games.

Well, considering there's more than a handful of TWIMTBP titles, plus a few PhysX titles including the upcoming Metro 2033 (which, I'll hazard another guess, will be shown at CeBIT running but with no FPS counter visible) that NVIDIA has heavily supported, along with a few other titles that still have a performance bias towards NVIDIA, I'd suspect they'll show the titles they're strong in, and people will buy somewhat based on those numbers. Not to mention there should be a Folding/benchmarking improvement, brand loyalty, etc.

rpg.314 said:
Direct conversion. Sorry.

No problem at all. Was just asking.

Silus said:
He just took the direct conversions, ~400 USD (300 euros) and ~550 USD (400 euros), added them and divided by 2.

I'm still betting on 399 USD for the GTX 470.

Although, costing 399 USD doesn't exactly mean it will cost 300 euros. Quite the opposite: it will probably still cost 400 euros, as is usually the case where 399 USD simply gets "converted" to 399 euros...
Ah, I see now. I get the basic math of it; I just didn't know if he had looked into a possible VAT difference, along with any other factors, based on cards that are on sale now (that would be a tad closer to real numbers, I'd think). And I'd guess the 470 will be at least $350, with a high chance of it being $400.
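For what it's worth, here's a rough back-of-the-envelope sketch of that USD-to-EUR math. The exchange rate and VAT figures are my own assumptions for illustration (roughly early-2010 values), not anything confirmed:

Code:
# Why a 399 USD MSRP often ends up near 399 EUR on the shelf.
# Assumed figures: EUR/USD around 1.35 (early 2010) and 19% VAT
# (e.g. Germany); both vary by date and country.

USD_MSRP = 399.0          # US price, quoted before sales tax
EUR_PER_USD = 1 / 1.35    # assumed exchange rate
VAT = 0.19                # assumed VAT rate

pre_tax_eur = USD_MSRP * EUR_PER_USD      # ~296 EUR
with_vat_eur = pre_tax_eur * (1 + VAT)    # ~352 EUR

print(f"pre-tax: {pre_tax_eur:.0f} EUR, with VAT: {with_vat_eur:.0f} EUR")
# Distribution margins and currency hedging typically push the shelf
# price the rest of the way toward the 1:1 "399 USD -> 399 EUR" pattern.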

rpg.314 said:
Found that right after posting; I was puzzled about the lack of editing and the explanation for it. I tried to find it before posting and failed. Thank you very much for the extra heads-up, though.
 
Sorry for the off-topic, but I still can't send PMs. Hey Silus! Nice to see someone from Portugal too :)

Hi Picao84! Same here :)

PMs, the edit button and more all require a certain number of posts to become available. Follow the link rpg.314 posted for more info.
 
So, the salvage SKU has its own board design. Does that mean the 512 SP board will be bigger/longer, like the HD 5850 and HD 5870?
 
So, the salvage SKU has its own board design. Does that mean the 512 SP board will be bigger/longer, like the HD 5850 and HD 5870?

That may also have something to do with the delays, at least in some small part. Designing two different PCBs would no doubt take more engineering resources, and I have to assume more time, unless there are two design teams operating independently.
 
So, the salvage SKU has its own board design. Does that mean the 512 SP board will be bigger/longer, like the HD 5850 and HD 5870?

Any PSP/Paint gurus able to take an HD 5800, overlay the pictured 470 on it (matching the PCIe tongue etc.), and possibly a GF200 product too? I would assume NV stuck with the same mounting holes for the cooler/PCB to help alleviate costs to partners.
 
The thing has PCI-E 2.0 as a requirement? Bummer. I was interested, but I certainly won't upgrade my mainboard for a video card.
 
The thing has PCI-E 2.0 as a requirement? Bummer. I was interested, but I certainly won't upgrade my mainboard for a video card.

But isn't PCIe 2.0 inherently backwards compatible with PCIe 1.x? They both deliver 75 W through the slot, right? 2.0 just upped the data rate per pin (the per-lane transfer rate was doubled from 2.5 GT/s to 5.0 GT/s going from 1.0 to 2.0). I mean, it's not like when AGP 1.0 used 3.3 V and 2.0 moved to 1.5 V, with some users inadvertently killing their cards/systems by trying to use them together.
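To put rough numbers on that, here's a quick sketch of the per-lane math. The 2.5/5.0 GT/s rates and the 8b/10b encoding overhead are the standard PCIe 1.x/2.0 figures; the script itself is just illustrative arithmetic:

Code:
# Usable per-lane bandwidth for PCIe 1.x vs 2.0.
# Both generations use 8b/10b encoding, so 10 bits on the wire
# carry 8 bits of payload.

def lane_bandwidth_mbps(transfers_per_sec):
    """Usable bytes per second per lane, after encoding overhead, in MB/s."""
    return transfers_per_sec * (8 / 10) / 8 / 1e6  # bits -> bytes -> MB/s

for name, rate in [("PCIe 1.x", 2.5e9), ("PCIe 2.0", 5.0e9)]:
    per_lane = lane_bandwidth_mbps(rate)
    print(f"{name}: {per_lane:.0f} MB/s per lane, "
          f"x16 = {per_lane * 16 / 1000:.1f} GB/s")

# PCIe 1.x: 250 MB/s per lane, x16 = 4.0 GB/s
# PCIe 2.0: 500 MB/s per lane, x16 = 8.0 GB/s
# The slot's 75 W power budget is the same for both generations.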
 
But isn't PCIe 2.0 inherently backwards compatible with PCIe 1.x? They both deliver 75 W through the slot, right? 2.0 just upped the data rate per pin (the per-lane transfer rate was doubled from 2.5 GT/s to 5.0 GT/s going from 1.0 to 2.0). I mean, it's not like when AGP 1.0 used 3.3 V and 2.0 moved to 1.5 V, with some users inadvertently killing their cards/systems by trying to use them together.

If that's the case, then why put PCI-E 2.0 explicitly on the box as a requirement?
 
Like the way you put that N2O boost system into your stock Smart car?
Aside from PCI-E 2.0, the P35 chipset isn't really missing anything. I'm running a 45 nm Yorkfield quad core OC'ed to 3.4 GHz. Why on earth would I upgrade that? For a few % more CPU performance that I don't even need? The only thing I feel like upgrading is my GTX 260... and I really hope that Fermi doesn't suck.
 