R700 Inter-GPU Connection Discussion

No, I'm saying it's better to use the performance that we do have on core graphics rather than waste it on insane framerates and resolution/AA.

Insane resolution/AA as judged by whom?

For those of us with 30" monitors, 2560x1600 isn't an insane resolution; it's a day-to-day fact of life. Even a game as graphically simplistic as WoW can struggle at that res: turn up the TSAA and go to the wrong place where alpha textures rule, and you're in Less-Than-Tens-Of-Frames-Per-Second Land. Crysis is just a ROFLfest at that sort of resolution.

And really, when you're talking about $400-600 graphics cards, of which folks will happily buy two just because, a ~$1000-2000 monitor that will last three generations isn't out of whack.
 
Insane resolution/AA as judged by whom?

For those of us with 30" monitors, 2560x1600 isn't an insane resolution; it's a day-to-day fact of life. Even a game as graphically simplistic as WoW can struggle at that res: turn up the TSAA and go to the wrong place where alpha textures rule, and you're in Less-Than-Tens-Of-Frames-Per-Second Land. Crysis is just a ROFLfest at that sort of resolution.

And really, when you're talking about $400-600 graphics cards, of which folks will happily buy two just because, a ~$1000-2000 monitor that will last three generations isn't out of whack.

I play all my games on my 42" 1080P plasma from 8' away. I love AA and high res.

These things always devolve into subjective debates about IQ which simply cannot be "won".
 
I gather The Tech Report didn't find any special sauce with R700 scaling in their preview this weekend.

From HardOCP:

HardOCP said:
The new Radeon HD 4870 X2 uses a newer bridge chip that supports PCIe 2.0 and AMD has also improved the bandwidth between the GPUs. The bandwidth between both GPUs has been bumped from 6GB/sec, as found on the Radeon HD 3870 X2, to 20GB/sec on the Radeon HD 4870 X2. AMD has also beefed up the RAM on the Radeon HD 4870 X2; our samples have 1GB of GDDR5 accessible to each GPU (so 2GB total on the board.) Now, this memory is still not completely shared, the framebuffer is still duplicated and the memory is not combined. However, due to some inherent evolutionary upgrades present in GDDR5 memory modules there is actually a method in place to share some data between modules using that 20GB/sec bus.

I assume the increased bus speed is from the interconnect (right?). Therefore the ability to share some RAM (not yet present in drivers?) should be a direct result of this interconnect. If true, it would make things very interesting.
 
Incorrect, Vista will support DX11, and Windows 7 runs on the same code base as Vista SP1. Below is a quote from an MS developer working on Windows 7.


Yeah, I double checked some articles and found that speculation to be false.

I wonder how people will react when games run better on Vista than on Windows 7, just like the current fiasco with DX9 games on XP vs Vista.

We can agree to disagree even though I'm right. I never said performance was the only factor, just the most important one, as it gives the other factors perspective and determines their worth. Without knowing how it performs, who cares about the other criteria?

Unfortunately, most of the exclusive features available between the two are throwaways or PR fluff, not make-or-break gotta-haves. I think VIVO capability is the big difference today, and that's mostly forgotten.

Speaking of that, what happened to AVIVO?

:LOL:

But your argument is flawed because there are customers out there who don't care at all about performance. Case in point: Intel IGPs.

You're now acknowledging differences between what the IHVs offer (even for something as obvious as SLI/CF). However, whether or not a feature/characteristic is a deal-breaker isn't up to you to call. Different people have different needs and wants, which is why each IHV has its own niche market, since they do not supply the exact same product (as I said, the desktop 3D market is not in perfect competition).

I assume the increased bus speed is from the interconnect (right?).

12GB/s from the PCIe 2.0 bridge and 8GB/s for the new interconnect?
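Just to put rough numbers on that guess (a back-of-the-envelope sketch only; the 12+8 split is the speculation above, and the PCIe figure is just the standard theoretical rate for a 2.0 x16 link):

```python
# Sanity check on the speculated split against HardOCP's 20GB/sec figure.
# The 12 + 8 split is pure speculation from the post above, not a confirmed spec.

# Theoretical per-direction bandwidth of a PCIe 2.0 x16 link:
# 16 lanes * 5 GT/s * 8b/10b encoding, converted to GB/s.
pcie2_x16_per_direction = 16 * 5e9 * (8 / 10) / 8 / 1e9
print(pcie2_x16_per_direction)  # 8.0 GB/s each way (16 GB/s both directions)

speculated_bridge = 12    # GB/s, guessed share for the PCIe 2.0 bridge chip
speculated_sideport = 8   # GB/s, guessed share for the new GPU-to-GPU link
print(speculated_bridge + speculated_sideport)  # 20 GB/s, matching the quoted total
```

Either way, how AMD actually carves the quoted 20GB/sec up between the bridge and the new interconnect isn't something the previews spell out.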
 
I assume the increased bus speed is from the interconnect (right?). Therefore the ability to share some RAM (not yet present in drivers?) should be a direct result of this interconnect. If true, it would make things very interesting.

I'm from Missouri on that one: Show Me and I'll believe... What happened to those 15% faster benchies someone was claiming to have seen?

And, alas, 10.5" again. Though I guess I'm not surprised.
 
I'm from Missouri on that one: Show Me and I'll believe... What happened to those 15% faster benchies someone was claiming to have seen?

Hey I'm not trying to convince you of anything, just throwing out some ideas. :p
 
I play all my games on my 42" 1080P plasma from 8' away. I love AA and high res.

These things always devolve into subjective debates about IQ which simply cannot be "won".

Agreed, it comes down to whether you prefer better core graphics (as you say, the console approach) or better image quality/framerates. Obviously with a PC you can get both, but I prefer more of the former than the latter.

I'm just saying that a game which scales well should allow the user to pick between the two approaches, which can only be a good thing. Crysis on medium, for example, will let you have high res and lots of AA, etc., but IMO it will still look up there with most console games. Haze is a good example of how that type of environment can look on consoles.
 
Rangers said:
One of the R700 review sites even said they were thinking of canning Crysis from their benchmark suite because "it runs like crap on all high end cards" and that's pretty much true.
I remember that. I thought it was a pretty ridiculous thing to say actually.

So they don't like to test games that stress high-end GPUs anymore? They prefer testing games which don't utilise their potential? :rolleyes:
I think the recent objections to Crysis as a benchmark are more because it apparently isn't GPU-limited. If you double the power of the graphics card and Crysis' frame rate remains unchanged (because the bottleneck is something else: CPU, VRAM size, PCIe bandwidth, whatever), then that makes it a pretty useless GPU benchmark.
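To put that argument in concrete terms, here's a toy sketch of the scaling check a reviewer is implicitly doing (the function, its threshold, and the numbers are all hypothetical, not taken from any actual review):

```python
# Toy illustration of the "is it GPU-limited?" argument, with made-up numbers.
# If frame rate barely moves when GPU throughput roughly doubles, the bottleneck
# is something else (CPU, VRAM size, PCIe bandwidth, whatever).

def looks_gpu_limited(fps_slow, fps_fast, gpu_speedup, threshold=0.5):
    """Return True if the observed fps gain is at least `threshold` of the
    theoretical gain implied by the faster GPU."""
    observed_gain = fps_fast / fps_slow - 1.0
    expected_gain = gpu_speedup - 1.0
    return observed_gain >= threshold * expected_gain

# Hypothetical: a card with roughly 2x the shading power gains only 10% fps.
print(looks_gpu_limited(fps_slow=25.0, fps_fast=27.5, gpu_speedup=2.0))  # False
```

If a game fails that kind of check across several cards, it tells you more about the rest of the system than about the GPUs being compared.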
 
I think the recent objections to Crysis as a benchmark are more because it apparently isn't GPU-limited. If you double the power of the graphics card and Crysis' frame rate remains unchanged (because the bottleneck is something else: CPU, VRAM size, PCIe bandwidth, whatever), then that makes it a pretty useless GPU benchmark.

I would agree if that were the case, but it's not. At least not when you max out the settings. I don't get why sites would complain that Crysis doesn't scale with graphical power while not testing it at its highest graphical settings.
 
I'm from Missouri on that one: Show Me and I'll believe... What happened to those 15% faster benchies someone was claiming to have seen?
You mean like this?

http://www.anandtech.com/video/showdoc.aspx?i=3354&p=4

While I personally tend to read The Tech Report first, in this case, their review is kinda odd. I compared a number of previews (all I could find) and TR's results don't make any sense to me. 2x 4870 (CF) is faster than their 4870X2, which even has 2x1GB GDDR5 on-board. Their results differ a lot from what I can read on the other sites.
 
I play all my games on my 42" 1080P plasma from 8' away. I love AA and high res.
Even though I agree with kyleb's assertion that that is excessive resolution for the viewing distance, I also think one can still notice aliasing, so your opinion makes plenty of sense.

For things like crawling edges you don't need to discern adjacent pixels and see distinct points. Aliasing will still affect IQ in images beyond the eye's resolution.
 
Even though I agree with kyleb's assertion that that is excessive resolution for the viewing distance, I also think one can still notice aliasing, so your opinion makes plenty of sense.

For things like crawling edges you don't need to discern adjacent pixels and see distinct points. Aliasing will still affect IQ in images beyond the eye's resolution.

Thank you. That's exactly the sort of subjective IQ difference I was referring to in the aforementioned resolution discussion w/kyleb. I can't see every single pixel at this distance, but you can bet your butt I can still see aliasing, sadly even with 16x CSAA :(
 
Thank you. That's exactly the sort of subjective IQ difference I was referring to in the aforementioned resolution discussion w/kyleb. I can't see every single pixel at this distance, but you can bet your butt I can still see aliasing, sadly even with 16x CSAA :(
And if you remember my comments in that thread, I said I agree with kyleb that 1080p offers no advantage over 768p under your viewing conditions provided that the image isn't aliased, e.g. a TV show or movie.

16xCSAA isn't perfect, and according to some of the IQ comparisons we've seen, some edges can actually be worse. For the most part, though, you're probably seeing shader aliasing, so your experience is not necessarily a justification for higher AA levels.
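For what it's worth, a rough back-of-the-envelope on that viewing setup (a sketch only; the 16:9 panel shape, the 1366-wide figure for "768p", and the ~1 arcminute rule of thumb for 20/20 acuity are my assumptions, not anything stated above):

```python
import math

# Rough angular size of one pixel on a 42" 16:9 panel viewed from 8 feet,
# compared against ~1 arcminute (a common rule of thumb for 20/20 acuity).

def pixel_arcmin(diagonal_in, horiz_pixels, distance_in):
    """Angular width of a single pixel, in arcminutes."""
    panel_width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 aspect ratio
    pixel_pitch_in = panel_width_in / horiz_pixels
    return math.degrees(math.atan(pixel_pitch_in / distance_in)) * 60

viewing_distance_in = 8 * 12  # 8 feet, in inches
print(pixel_arcmin(42, 1920, viewing_distance_in))  # ~0.68 arcmin for 1080p
print(pixel_arcmin(42, 1366, viewing_distance_in))  # ~0.96 arcmin for "768p"
```

By that estimate a 1080p pixel subtends well under an arcminute at 8 feet while a 768p pixel sits right at the threshold, which fits both kyleb's point about resolution and the point that crawling, high-contrast aliasing can still be noticeable below it.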
 
You mean like this?

http://www.anandtech.com/video/showdoc.aspx?i=3354&p=4

While I personally tend to read The Tech Report first, in this case, their review is kinda odd. I compared a number of previews (all I could find) and TR's results don't make any sense to me. 2x 4870 (CF) is faster than their 4870X2, which even has 2x1GB GDDR5 on-board. Their results differ a lot from what I can read on the other sites.

Well, that's a start at least.
 
No, I'm saying it's better to use the performance that we do have on core graphics rather than waste it on insane framerates and resolution/AA.

We have a choice: settle for Bioshock/CoD4 (console) level of graphics at 1920x1200/8xAA/16xAF, or get something that looks like Crysis at a straight-up 1680x1050. Gameplay aside, I'll take the second one any day of the week, especially when you can pump up the res/AA in that same game when more powerful GPUs get released. What do you do with Bioshock/CoD4? Buy a new monitor and run at 2560x1600?

That leaves people who don't upgrade very regularly out in the cold.
Crysis looks much worse and plays worse than Far Cry on my comp (the 7600GT is a bit of a weak link here; I should have got better, but that was a sidegrade in a hurry from a 6800GT AGP). Thankfully I've got Bioshock for some good-looking and playable recent gaming. You can play it at 1920, I can play it at 1024, but I can't play Crysis at all.

I never mind older games with maxed-out IQ either (waiting for Quake Live, 'cos these damn Q3 bots suck!).
I also hope id's Tech 5 (and that interesting Rage game) will run easily on lowly machines, as the Q3 and Doom 3 engines did (and even UE3, it seems).
 
Quote:
Originally Posted by pjbliverpool
That's as long as the visuals match the performance requirement of course, which in Crysis's case they do.

That is Not True. Crysis definitely DOES NOT look as good as it should for its processing power requirements!!!

It's just a BAD ENGINE; anyone trying to defend otherwise is just slow, sorry.
Compare two new games (GRID, COD4) and play them at the highest playable quality; does Crysis then look best?

Hell NO! Why? Because you can't go higher than medium settings at medium resolution with medium antialiasing...

Look at all the benchmarks you want; they will prove my point.

Why do you think the new Crysis has to make all that better, with them saying it will run better and need less performance?
 
@mintmaster
Quote:
And if you remember my comments in that thread, I said I agree with kyleb that 1080p offers no advantage over 768p under your viewing conditions provided that the image isn't aliased, e.g. a TV show or movie.

-----------------------------------------------
Hmmm, on a 42" screen the difference between 768p and 1080p is very visible to me (even at a long distance).
768p ≈ 1 megapixel
1080p ≈ 2 megapixels

Maybe I just have very good eyes :)
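The pixel counts roughly check out (a quick check; 1366x768 for the 768p mode is my assumption):

```python
# Quick check of the "~1 megapixel vs ~2 megapixel" figures above.
print(1366 * 768)   # 1049088 -> ~1.05 megapixels for 768p (assuming 1366 wide)
print(1920 * 1080)  # 2073600 -> ~2.07 megapixels for 1080p
```

So 1080p is pushing roughly twice the pixels, whether or not that difference is resolvable at a given viewing distance.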
 
That leaves people who don't upgrade very regularly out in the cold.
Crysis looks much worse and plays worse than Far Cry on my comp (the 7600GT is a bit of a weak link here; I should have got better, but that was a sidegrade in a hurry from a 6800GT AGP). Thankfully I've got Bioshock for some good-looking and playable recent gaming. You can play it at 1920, I can play it at 1024, but I can't play Crysis at all.

I never mind older games with maxed-out IQ either (waiting for Quake Live, 'cos these damn Q3 bots suck!).
I also hope id's Tech 5 (and that interesting Rage game) will run easily on lowly machines, as the Q3 and Doom 3 engines did (and even UE3, it seems).

By that same reasoning, games today run much worse on my Voodoo 5 5500 than the games that were around when it was released. ;)

I'm not quite sure what point you're trying to make here. Eventually, of course, older hardware just won't be able to run newer games well or with any sort of IQ.

Your analogy of how well Crysis runs on a 7600 GT would be akin to someone with a Radeon 7500 (or even 8500) trying to run Far Cry.

Yes, it's the bane of computer gaming. Older hardware is eventually left out in the cold. With more demanding games it happens much sooner.

Yes, it's unfortunate for those who can only afford to upgrade every 4-5 years, but I'd much rather see continued progress than not, even if sometimes the result isn't a perfectly 100% optimized engine.

Regards,
SB
 
Quote:
Eventually, of course, older hardware just won't be able to run newer games well or with any sort of IQ.
--------------------------------------------------------------

Crysis can't be run with the latest video card at any high resolution, never mind "older" hardware!

:D
 
By that same reasoning, games today run much worse on my Voodoo 5 5500 than the games that were around when it was released. ;)

I'm not quite sure what point you're trying to make here. Eventually, of course, older hardware just won't be able to run newer games well or with any sort of IQ.

Your analogy of how well Crysis runs on a 7600 GT would be akin to someone with a Radeon 7500 (or even 8500) trying to run Far Cry.

Yes, it's the bane of computer gaming. Older hardware is eventually left out in the cold. With more demanding games it happens much sooner.

Yes, it's unfortunate for those who can only afford to upgrade every 4-5 years, but I'd much rather see continued progress than not, even if sometimes the result isn't a perfectly 100% optimized engine.

Regards,
SB

??? He compared Crysis to COD4 and Bioshock and said that the latter looked better at the same performance levels, which has nothing to do with old HW/new SW mismatching. You threw out his whole post to focus on one out-of-context sentence, and only part of that sentence.
 