ATI/AMD Radeon 58xx series poll.

Which next-gen GPU?

  • Radeon 5870 X2: 11 votes (7.6%)
  • Radeon 5870: 61 votes (42.1%)
  • Radeon 5850: 26 votes (17.9%)
  • Actually, I'll wait for Nvidia's GTX 300 series: 29 votes (20.0%)
  • Other (aka where's my option): 18 votes (12.4%)

  Total voters: 145
  Poll closed.
I have about 11" clearance to the optical drive bays in my new BTX-style Lian Li. I haven't installed any optical drives, but if I have to make a choice between them and a 5870x2, the optical drives can go (outside to an external enclosure).
Wow, so you translate "worth the wait" into "a similar performance delta as GT200 to RV770"? Where did you pull that from? Who in their right mind would consider that worth the wait? :LOL:
Depends on pricing. If they manage to match or beat the going rate for a 5870, it might well be worth it to people interested in the GT300's brand-specific features. Though I'm sure it would make a lot of NV execs cry to have to sell at such a price.
 
Well, looking at the performance I've seen so far, it looks very good. The power/performance ratios are particularly impressive. In raw performance, however, it is often bested by the current-generation two-chip solutions, so it may not be quite up to par for a next-gen part. Still, with the power headroom they have in this design, they can easily release an updated version with significantly higher performance if Nvidia ends up doing better.

Anyway, we'll have to see how it goes. The very impressive power/performance ratio is likely to give this architecture some significant staying power over the next couple of years.

As for myself, I'm still quite skeptical as to the status of ATI's Linux drivers, so I'm definitely going to wait and see. I'm not feeling any pressure to upgrade my GeForce 8800 GTX yet anyway. I may wait until the refresh parts this time around.
 
As for myself, I'm still quite skeptical as to the status of ATI's Linux drivers, so I'm definitely going to wait and see. I'm not feeling any pressure to upgrade my GeForce 8800 GTX yet anyway. I may wait until the refresh parts this time around.

Linux drivers are great (I have a 4870). My only complaint is their limited distro support with their Stream package.
 
Well, performance isn't nearly as important to me as compatibility and reliability in Linux.

I have tested Ubuntu and Gentoo. You will still need to use the command line for multiple displays or CrossFire, but otherwise it works as expected.

Edit: Catalyst 9.9 on Gentoo, directly from Portage at the moment.
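
For reference, the command-line bits are mostly Catalyst's aticonfig tool. Roughly the sort of invocations involved, from memory, so double-check the flag names against aticonfig --help on your driver version:

  # Write an initial dual-head xorg.conf, second screen to the right
  aticonfig --initial=dual-head --screen-layout=right

  # List the outputs the driver detects, then enable the pair you want
  aticonfig --query-monitor
  aticonfig --enable-monitor=tmds1,crt1

  # Toggle CrossFire on or off
  aticonfig --crossfire=on

  # (restart X afterwards for the xorg.conf changes to take effect)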
 
I have tested Ubuntu and Gentoo. You will still need to use the command line for multiple displays or CrossFire, but otherwise it works as expected.

Except when your display is connected over HDMI; then you get underscan with corresponding black borders.
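
Supposedly that can be overridden from the command line; the usual suggestion is something like the following, though I haven't verified it myself and the key name may differ between Catalyst releases:

  # Clear the driver's default HDTV underscan, then restart X
  aticonfig --set-pcs-val=MCIL,DigitalHDTVDefaultUnderscan,0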

They are fast though.

Cheers
 
I had that problem with HDMI on my TV as well, when using Windows Vista with a GeForce 8800 GTS 320.
Setting the 'right' resolution (in my case 720p) made me lose part of the screen; I had to scale it down to make it fit on the TV.
When I connect the TV through VGA, I don't have that problem.

I also use a digital cable TV box with HDTV through HDMI, and that one doesn't have the mapping problem on my TV either (720p just shows up fine). I wonder what causes it.
 
HDMI input on TVs (especially Samsung) is borked. Getting a quality display fixes this.
 
HDMI input on TVs (especially Samsung) is borked.

Samsung TVs out-of-the-box aren't configured to do 1:1 pixel mapping for HDMI/DVI input, but 10 seconds with the remote control will fix this - at least it did on the set I have (LE40A656).

Call this borked if you feel you have to.

There was an additional issue with the ATI drivers which had them default to overscan; this may have gone away since. I swapped my 780G for a GF9400 board and the NVIDIA drivers coped fine.
 
Samsung TVs out-of-the-box aren't configured to do 1:1 pixel mapping for HDMI/DVI input, but 10 seconds with the remote control will fix this - at least it did on the set I have (LE40A656).

What kind of settings would one be looking for exactly? I might be able to find something like that on my TV if I know what to look for, or where to look :)
 
What kind of settings would one be looking for exactly? I might be able to find something like that on my TV if I know what to look for, or where to look :)

On my TV it's configured under Picture Settings -> Size in the config menu. I set it to "just scan" rather than the other picture-size options such as 16:9, wide zoom, 4:3 and so on. This is pretty much Samsung-specific, of course; as I understand it, most of their more recent 1080p sets have this, though I'm not sure about the 720p models. If you search or post at the forums at avforums.com you might find an answer for your specific TV.
 
I have two Samsung TVs here, one 32" and one 47", and both have that option.
Just hit the "P.Size" button on the remote.
 