AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 GPU lineup?

  • Within 1 or 2 weeks: 1 vote (0.6%)
  • Within a month: 5 votes (3.2%)
  • Within couple months: 28 votes (18.1%)
  • Very late this year: 52 votes (33.5%)
  • Not until next year: 69 votes (44.5%)

  Total voters: 155. Poll closed.
Well, feel free to create a poll on why anyone would buy a 5870, include an answer for it being the fastest single-chip GPU available for existing games, and I'll tick that one without a second thought.
Too many threads already, let's wait for the launch first before coming to such conclusions. :p
 
Too many threads already, let's wait for the launch first before coming to such conclusions. :p

Well, I realize it doesn't help the pro-DX11 vs. anti-DX11 agenda of either side. You don't sound like the type of guy who would counter me with some idiotic PhysX argument, so I'll take the liberty of considering the 5870 the fastest single-chip GPU, after it's available of course.
 
Anyone who thinks DX11 is not worth it (the better part of an evening of coding) needs to go back to his IHV and ask for a new set of text to spew at this community.

Or maybe they'll change their mind when some numbers "leak" in the not so distant future?
Anyone who thinks the latest PhysX titles bring nothing but "eye candy physics" shouldn't expect much from DX11 in 2010. They should expect less than that, actually.
 
Did anyone say D3D11 titles would flood the market this year? No.
Did anyone say there would be no D3D11 titles this year? Yes, Fuad did, and by extension you did too, on top of saying D3D11 isn't worth it.

Actually what I said was that it didn't make sense.

If you feel that there's more to be said about that, or indeed want to continue the 'spin' thing, I would suggest using a topic other than this one though.
 
Anyone who thinks the latest PhysX titles bring nothing but "eye candy physics" shouldn't expect much from DX11 in 2010. They should expect less than that, actually.
So, we shouldn't expect perf improvements and we shouldn't expect IQ improvements over previous API modes any time soon? OK... :D
 
On the first link's images, I might be imagining it, but I think I read "04" as the last digits on the memory chip, confirming 0.4 ns (or 5.0 Gbps) chips.
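For anyone wanting to sanity-check that marking, here is a quick sketch of the usual GDDR5 convention, where the rated per-pin data rate works out to 2 divided by the cycle-time marking in ns (the helper names are mine, not from any vendor tool, and the 256-bit bus width is just an illustrative assumption):

```python
def gddr5_data_rate_gbps(cycle_time_ns: float) -> float:
    """Convert a GDDR5 speed marking in ns to its per-pin data rate in Gbps.

    GDDR5 parts are commonly rated as data_rate = 2 / cycle_time,
    so a "0.4 ns" marking corresponds to 5.0 Gbps per pin.
    """
    return 2.0 / cycle_time_ns


def bus_bandwidth_gb_s(cycle_time_ns: float, bus_width_bits: int) -> float:
    """Aggregate memory bandwidth in GB/s for a given bus width."""
    return gddr5_data_rate_gbps(cycle_time_ns) * bus_width_bits / 8


print(gddr5_data_rate_gbps(0.4))     # 5.0 Gbps per pin
print(bus_bandwidth_gb_s(0.4, 256))  # 160.0 GB/s on a hypothetical 256-bit bus
```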

There are rumors floating around, though, that the ES and retail boards have memory from different manufacturers, i.e. Hynix might still be in the picture. From memory, I think some 4870 X2 boards had Hynix chips.

Either way, it looks like they have emptied their stock of fire-sale Qimonda memory. Nvidia still seems rather reticent, so AMD will face tricky negotiations trying to keep the price down on its volume alone, given the recent background of increasing memory prices (particularly DDR2). I imagine they would like Elpida and Winbond to also enter the picture as soon as possible.
 
He's probably not far off the mark either. Targeting DX11 this year does not make good business sense.

Seems rather harmless spin compared to some other rumour mongers out there.

http://www.pcgameshardware.com/aid,...of-Pripyat-Dirt-2-and-Alien-vs-Predator/News/
AMD confirms DirectX 11 games: Battleforge, Stalker: Call of Pripyat, Dirt 2 and Alien vs. Predator
AMD offers some insight on DirectX 11 powered games in a recent blog post on their website. According to AMD, Battleforge, Stalker: Call of Pripyat and Dirt 2 will feature DX11, so their Radeon 5000 series will have serious advantages this year.
During the launch of ATI Eyefinity, AMD also talked about games coming out in the next months powered by DirectX 11. There were a lot of rumours surrounding the DX11 games list, and until now just Dirt 2 was a serious contender. Now AMD has lifted the curtain on:
• Battleforge (EA) - around September/October 2009
• Stalker: Call of Pripyat (GSC) - November 2009
• Dirt 2 (Codemasters) - December 2009

Also, it's rather obvious that the fourth game in the DX11 series will be Alien vs. Predator (beginning of 2010) as the Rebellion developers make an appearance in a DX11 video from AMD.
 
You shouldn't expect as much improvement from DX11 in 2010 as from PhysX in the latest GPU PhysX-enabled titles. Hope that's clear.


Oh, you mean Arkham Asylum, where even the CPU can provide partial/adequate PhysX acceleration :rolleyes:


Waiting for Frostbite II's deferred CS rendering; it should be something amazing after X-Ray (with the zillions of bugs it has) and the Dead Space engine.
 
Will Eyefinity require that you saturate the native resolutions of each monitor, or can you render something lower-res and stretch it across?

No, you are not limited to the max supported resolution. The software will enumerate the panels and pick multiples of resolutions that all panels support. If you have 3 panels of all the same type that support 1024x768, 1280x1024, 1920x1080 (for example) the following SLS (or Display Group, as they are called in CCC) resolutions are available:

3x1 Landscape:
3072x768
3840x1024
5760x1080

3x1 Portrait:
2304x1024
3072x1280
3240x1920

1x3 Landscape:
1024x2304
1280x3072
1920x3240

When a display group is defined, then this is "seen" by the OS as a single panel and the OS (or CCC) treats it as such, allowing all the resolution configurations that the display group supports to be used - that is then available to the OS itself and other applications.
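The enumeration described above — intersect each panel's mode list, then tile each common mode across the grid — can be sketched roughly like this (a hypothetical helper; the actual CCC logic isn't public, this just reproduces the listed numbers under those assumptions):

```python
def display_group_modes(panel_modes, cols, rows, portrait=False):
    """Enumerate SLS (display group) resolutions for a cols x rows grid.

    panel_modes: one list of (width, height) mode tuples per panel.
    Only modes supported by every panel are usable; each common mode is
    then tiled across the grid. In portrait orientation each panel is
    rotated, so width and height swap before tiling.
    """
    # Keep only the modes every panel supports.
    common = set(panel_modes[0])
    for modes in panel_modes[1:]:
        common &= set(modes)

    group = []
    for w, h in sorted(common):
        if portrait:
            w, h = h, w
        group.append((w * cols, h * rows))
    return group


# Three identical panels, as in the example above.
modes = [(1024, 768), (1280, 1024), (1920, 1080)]
panels = [modes] * 3

print(display_group_modes(panels, cols=3, rows=1))                 # 3x1 landscape
print(display_group_modes(panels, cols=3, rows=1, portrait=True))  # 3x1 portrait
print(display_group_modes(panels, cols=1, rows=3))                 # 1x3 landscape
```

Running it reproduces the lists above: (3072, 768), (3840, 1024), (5760, 1080) for 3x1 landscape, and so on.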

Sorry, I am slow, but does Eyefinity really render a game at those really high resolutions, or does it involve smart scaling? I think it has to be some scaling trick... Will these tricks make video scaling look better than on DX10 GPUs? But TBH, Eyefinity feels as gimmicky as 3D Vision... maybe worse (no offence, Dave) if it is only useful for game scaling...

As has been mentioned, this isn't scaling; this is rendering more pixels. Additionally, depending on the aspect ratio, this increases the field of view in the title, allowing you to see more of what's around you, and potentially what's coming up.

Check out the view here: http://www.youtube.com/watch?v=teE5wqT2DNU

At around the 1m 15s mark the guy taking the video is just panning to see what is on the left panel, and then all of a sudden a car heaves into view to overtake. If you were playing with just a single panel, you wouldn't know that until it had hit the centre panel; in other racing games, with more cars on track and things closer together, that view could be very handy. Alternatively, think of an FPS, especially one like Left 4 Dead: having that peripheral vision can be very advantageous for reacting to what's coming at you from a field of view you wouldn't have known about with just a single panel.
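To put a rough number on that wider view: assuming the title scales horizontal FOV with aspect ratio (Hor+ behaviour, which not every game uses), the standard projection formula gives roughly 91 degrees on a single 16:9 panel versus about 144 degrees across a 3x1 group at the same vertical FOV. A sketch, not from any AMD material:

```python
import math


def horizontal_fov(vertical_fov_deg, width, height):
    """Horizontal FOV (degrees) for a given vertical FOV and render
    aspect ratio, assuming Hor+ scaling: hFOV = 2*atan(tan(vFOV/2)*AR)."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * width / height))


# Single 1920x1080 panel vs. a 5760x1080 Eyefinity group, both at 60 deg vFOV.
print(round(horizontal_fov(60, 1920, 1080), 1))  # ~91.5
print(round(horizontal_fov(60, 5760, 1080), 1))  # ~144.0
```

In other words, at the same vertical FOV the 3x1 surface shows a bit over half again as much of the scene horizontally, which is exactly the overtaking-car effect in the video.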

No, this is:

:cool:

Hrumph. Cheat! ;)
 
I'm wondering if ATI ever managed to fix PowerPlay when using a multi-monitor setup. I've had both the HD 4850 and GTX 260+ run at full 3D clocks, so the whole low idle power consumption goes out of the window. The problem is that cards have to run at full 3D clocks (especially the memory clock) when driving dual/triple/multi monitors, or else you're met with artifacts, crashes and whatnot.

If this is fixed, my wallet might have a hole burnt in it in the near future :cool:
 
[image: 21239137.jpg]


The RV870 reminds me of the R600 in the image below.


[image: ati_r600_chip1_sm.jpg]
 