AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks

    Votes: 1 0.6%
  • Within a month

    Votes: 5 3.2%
  • Within couple months

    Votes: 28 18.1%
  • Very late this year

    Votes: 52 33.5%
  • Not until next year

    Votes: 69 44.5%

  • Total voters
    155
  • Poll closed.
Never. I'd never bet on that, because ever since RV100 they've pursued maximal integration, and since the R500 series we haven't even seen the simplest VIVO solutions that were there in the past, not to mention a fully fledged TV tuner. It's a part of the past for ATi, even though there was some announcement of an HD3850-based Wonder card, AFAIR.

If it's really 190W, it's nothing new. The HD2900XT with its 160-170W consumption had the same solution, just like the cheaply made 3870X2 (~190-200W) or the top-end 4870X2. It's really needed for 24/7 stream applications more than for gaming. Still, 190W seems like a big step up from the 150W HD4870, even if it's well below the 300-320W HD4870X2. On the other hand, roughly 100W (~33%) less power consumption for 20% more performance over the 4870X2 shouldn't be overlooked.
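Just as a rough back-of-the-envelope check (assuming the ~190W figure and the claimed ~20% gain over a ~300W 4870X2 both hold; these are rumoured numbers, not measurements), the implied performance-per-watt jump would be close to 2x:

```latex
% Rough perf/W comparison using the rumoured figures above, not measurements
% 4870X2: ~300 W for baseline (1.0x) performance
% new part: ~190 W for ~1.2x performance
\frac{1.2 / 190\,\mathrm{W}}{1.0 / 300\,\mathrm{W}} \approx 1.9
```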

Sure, any company wants to pursue maximal integration when it makes sense for each particular design...

ATI learned the hard way (HD2900) what problems big die size designs can bring...

I assumed that around 350mm2 was their limit (what they want to pursue for die size; think X1900XTX...)

If they can achieve these specs at 330mm2, then fine; I just thought they would need a 350mm2+ die size... (just guessing...)
 
The problem is that no one knows which level/section the benchmark was run on. On some levels a GTX295 paired with an i7 965 Extreme still stays under 30 FPS, while on others, as your post obviously shows, it's well over 30 FPS with a mere Core 2 Duo.

Nehalem memory controller problems :cool: ... well, the unspoken issue is that Intel wanted to comply with the JEDEC DDR3 specs, while various memory manufacturers had a different vision of what to sell as triple-channel kits. Old DDR3 chips that only work at DDR2-style OC voltages like 2.1-2.2V emerged as an issue only 5-6 months after the Nehalem launch, not to mention the pre-sales of those triple-channel kits and the clearing out of stocks of old chips. So I'd bet that's the reason various sites got such low benchmark numbers even with high-end graphics. And usually none of those benchers even compare against the older CPU generation, at least not in that much detail, even if they find some time to spare for it.


I'm sorry, but the way I see it, ATI is starting to adopt Nvidia's blood-sucking strategies, and please don't tell me it's just business, because it's not!

Business is when you launch the 4850 at 150 euros, and you should do the same for the 5850!

Unfortunately, your views on doing business are somewhat different from those of top-notch GPU manufacturers.

In the RV770 era they needed a chip to win back a collapsed market share that the RV670 refresh strategy hadn't brought back to reasonable levels. So in June 2008 they came back to eat up the remaining 25% of the GPU market cake... and they did even more. Today they dominate the mainstream segment with the HD4850/4870 (and almost the HD4770), just like nVidia did with the 7600GT/7900GS series. And then they decided it's time to make some HUUUGE profits on 40nm (even with the HD4770 series, according to that OEM cheat sheet ;)). And they even have an explanation for it... poor yields on 40nm (yeah, right), just as they claimed for their first 40nm child, deliberately capping that small chip to a measly 850MHz so they wouldn't repeat the mistake of the RV350 days (more popularly known as the 9600XT), which overclocked like hell and in fact messed even with their hotly priced 9800 Pro and later 9800XT series.

If the RV740 could have made it to 1000MHz+ at 50-60W, nobody would even think about the HD4890 or now Juniper XT ;) and they'd have a lot of marketing explaining to do for price-oriented customers as to why DX11 is so much better than good old DX10.1.


The big question is: was Fuad right after all, and is the card 60% faster than last gen when compared to the X2/GTX295, rather than the single-chip models like everyone assumed?

It's hype... They need to sell those babies as the fastest cars before everybody figures out what they paid for.

And maybe the Crysis benchmark got some patch optimized for the AMD PII architecture, or the pre-release results are fake as usual :LOL:
 
Shouldn't write and raid in WoW at the same time :LOL:
320mm^2 was what it was supposed to say there

My estimate was around 350mm2 or a little more, so you can see that with the 330mm2 figure (ABT), the difference is not that big...
(I mean, I don't have a clue what the ATI engineers' architectural design strategies/implementation are; we don't have info about specific details like the actual architectural design, xtor density, whether they changed their priorities toward designing for power consumption or for clock speed, etc...)

Like I said a few posts back, if ATI managed to achieve something like that in 330mm2, then for me it is a very good thing.

If the specs are accurate, the performance is acceptable for a DX11 part at $399/$299 (relative to what NV is offering with the GTX285 models...), and AMD needs money in order to survive...
 
[Attached image: fbdsgo.jpg]


Is this Cypress?
 
Is this Cypress?

Well, the PCB ain't red, otherwise we could bet it's RV770 :LOL:

Hmm... doesn't that screenshot of 24 monitors displaying one single image pretty much confirm a non-AFR-based CrossFire mode??

The only bad thing is that even with today's technology we still have to stare at nasty monitor edges. So will we see OLED-based monitors entering the market rapidly in 2011-2012, just to see how seamless a 24-monitor picture could look?
 
What is all the excitement over this multi-monitor talk? The number of gamers who will actually use this is probably .00000000001%.
 
Nice, now all they have to do is bundle 6 LCDs with each board :LOL: I can already see people springing for twenty-four $100 LCDs.

Heh... I wonder just how thin Samsung will be able to make the bezels on their line of Eyefinity LCDs?

Big bezels = annoying if you use it for that purpose.

Regards,
SB
 
What is all the excitement over this multi-monitor talk? The number of gamers who will actually use this is probably .00000000001%.

Well, the fact that it supports 3+ displays is already something that a lot more than .0000001% want. Gaming support is of course another issue completely, but in racing & flying simulators etc., it's definitely great.
 
I already know I'll be using it for Eve Online. I'll be able to put the chat panels, market panels, fitting panels, overview, etc. on the other monitors and just have the glorious visuals on the main panel, although I already do that partially with two monitors. :D And with three monitors that'll still leave me space for a browser, IM client, remote desktop, and other applications.

Now if only ATI would allow setting custom resolutions like Nvidia does, without having to resort to PowerStrip.

Regards,
SB
 