ATI Multi-VPU up and running. . .

Geo

According to the Inq, at least: http://www.theinquirer.net/?article=21749 Could it be because they couldn't compete? ;)

Actually, looking at that die X-ray in Wavey's article, at the portion of NV40 dedicated to SLI, it ought to be clear to all that SLI is not a spur-of-the-moment tactic. . .you don't --can't-- do that in silicon without some serious lead time!
 
I wonder why they went with the downscaling route (if the Inq is to be believed). Why wouldn't the faster card just take on more of the workload? Certainly sounds more flexible than Nvidia's solution.
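
Purely to illustrate the idea (this isn't ATI's actual scheme; the split-frame approach and the throughput numbers are hypothetical), a faster card taking on more of the workload could be as simple as sizing each card's slice of the frame to its relative speed:

Code:
# Hypothetical sketch: divide a frame's scanlines between two unequal cards
# in proportion to their estimated throughput, instead of a fixed 50/50 split.
def split_scanlines(frame_height, fast_rate, slow_rate):
    fast_share = round(frame_height * fast_rate / (fast_rate + slow_rate))
    return fast_share, frame_height - fast_share

# Made-up example: the newer card is roughly twice as fast as the older one.
fast_lines, slow_lines = split_scanlines(1200, fast_rate=2.0, slow_rate=1.0)
print(fast_lines, slow_lines)  # 800 scanlines to the fast card, 400 to the slow one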
 
"Scalable" prolly means in terms of DX capabilities (or special features like the AA or AF algorithms), rather than "performance". What, exactly, would you downscale on a 512MB R520 with a 512-bit bus to make it "compatible" with an X800XT? The core clock? The pipe-count? The memory bandwidth? It just doesn't make sense.

If MVP really does allow any two ATI PCI Express cards to work together, then it'll be a boon for any enthusiast who upgrades. You no longer have to find a little sister who's willing to let you fiddle with her computer - your old graphics card still has some use in your rig. Uhm, unless you're one of those enthusiasts who really does have to spend vast wodges on 2x the latest and greatest.

I wonder how long it'll take ATI to get the drivers stable? 6 months?

Spotted over on R3D, this new Sapphire mobo looks like an extreme overclocker's wet dream:

http://www.xtremesystems.org/forums/showthread.php?t=55734

The MVP version can't be far behind, can it?

Also, just a thought, do we know that Xbox 360 will only have 1 R500 in it? What if it's architected to have 2?...

Jawed
 
Jawed said:
"Scalable" prolly means in terms of DX capabilities (or special features like the AA or AF algorithms), rather than "performance". What, exactly, would you downscale on a 512MB R520 with a 512-bit bus to make it "compatible" with an X800XT? The core clock? The pipe-count? The memory bandwidth? It just doesn't make sense.

Good point. Doubt many people would want to buy a brand-new card and cripple its new features by running it alongside an older one, though. Might as well just buy a second older card on the cheap.

Jawed said:
Also, just a thought, do we know that Xbox 360 will only have 1 R500 in it? What if it's architected to have 2?...

Can a console title really be that GPU limited? I thought fillrate and bandwidth requirements were minimal compared to the PC.
 
trinibwoy said:
Jawed said:
Also, just a thought, do we know that Xbox 360 will only have 1 R500 in it? What if it's architected to have 2?...

Can a console title really be that GPU limited? I thought fillrate and bandwidth requirements were minimal compared to the PC.

If we're talking about gaming at 1280x720, perhaps not. But what about 1920x1080? I ain't got a clue.
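
For rough scale (just the raw pixel counts, no vendor numbers): 1920x1080 pushes 2.25x as many pixels per frame as 1280x720, so per-frame fillrate and bandwidth demands grow by about the same factor.

Code:
# 1920x1080 vs 1280x720: ratio of raw pixel counts per frame.
print((1920 * 1080) / (1280 * 720))  # 2.25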

I'm still wondering what Xbox 360's going to do with 3 CPUs. Overall the next-gen consoles seem sorta "over-specified" to me. If Xbox is that over-the-top, why not have two R500s? Maybe as a refresh option in 2 years' time?

Jawed
 
Jawed said:
"Scalable" prolly means in terms of DX capabilities (or special features like the AA or AF algorithms), rather than "performance". What, exactly, would you downscale on a 512MB R520 with a 512-bit bus to make it "compatible" with an X800XT? The core clock? The pipe-count? The memory bandwidth? It just doesn't make sense.

Heh-Heh...;) No, what wouldn't make sense in that case is to buy a "512MB R520 with a 512-bit bus" and then buy a dual-slot PCIe motherboard just to pair it with an X800 XT...;) That's probably why no one is likely to do that. In that case the R520 by itself would be sufficient, right? More likely is that someone who already owns an X800 XT will just buy another one to pair with it along with the motherboard, right?

Me--I'll stick with the R520 by itself, because if it is what you say it is, I'll definitely have to buy it--bankruptcy court, here I come!...:D And preferably AGP at that.

If MVP really does allow any two ATI PCI Express cards to work together, then it'll be a boon for any enthusiast who upgrades. You no longer have to find a little sister who's willing to let you fiddle with her computer - your old graphics card still has some use in your rig. Uhm, unless you're one of those enthusiasts who really does have to spend vast wodges on 2x the latest and greatest.

Not to mention there should be no concern about precisely matching cards of the same make and model number. I.e., it seems like an X800 XT by Asus and one made by someone else should work fine, I suppose.

I wonder how long it'll take ATI to get the drivers stable? 6 months?

Dunno, but my Catalysts have been stable for the last two-point-five years, more or less...;)
 
Jawed said:
Also, just a thought, do we know that Xbox 360 will only have 1 R500 in it? What if it's architected to have 2?...

That would be very odd, given the economics involved (assuming duplicate GPUs).
 
MuFu said:
Jawed said:
Also, just a thought, do we know that Xbox 360 will only have 1 R500 in it? What if it's architected to have 2?...

That would be very odd, given the economics involved (assuming duplicate GPUs).

Yeah. So's 3 CPUs. ;)

Jawed
 
Well the first dual-core Intel and AMD CPUs are going to cost a minimum of around $700 a piece, at a guess, in roughly the same time frame, using roughly the same process technology...

Jawed
 
Jawed said:
MuFu said:
Jawed said:
Also, just a thought, do we know that Xbox 360 will only have 1 R500 in it? What if it's architected to have 2?...

That would be very odd, given the economics involved (assuming duplicate GPUs).

Yeah. So's 3 CPUs. ;)

Jawed

No - as already mentioned. They've increased the multithreaded throughput of a derivative CPU architecture by incorporating three dice into one package.

It would make little sense to do the same for a GPU, and even less to have two completely separate chips (unless it became obvious very late in development that it was seriously lacking horsepower in the graphics dept - still, that would have massive implications). Think about the added PCB complexity, control logic, heat/power issues etc - all factors you have the opportunity to precisely regulate when asked to produce a single-chip solution from the ground up.

Jawed said:
Well the first dual-core Intel and AMD CPUs are going to cost a minimum of around $700 a piece, at a guess, in roughly the same time frame, using roughly the same process technology...

How can you compare the two? The ROI characteristics for high-end CPUs vs. consoles are worlds apart. Besides, each Xb2 CPU is not going to contribute anything like $700 to the bill of materials.
 
MuFu said:
Jawed said:
Well the first dual-core Intel and AMD CPUs are going to cost a minimum of around $700 a piece, at a guess, in roughly the same time frame, using roughly the same process technology...

How can you compare the two? The ROI characteristics for high-end CPUs vs. consoles are worlds apart. Besides, each Xb2 CPU is not going to contribute anything like $700 to the bill of materials.

To be fair, the two would most likely cost about the same. It's just that with the Xbox 2, MS isn't looking to profit off the box itself; they'll profit off the royalties from games made for it.

Aside from that, I think you'll see the mid- to high-end CPUs replaced with dual-core CPUs throughout the rest of the year. To me it seems it will be cheaper to move almost entirely to dual-core CPUs instead of single-core ones, since they can be speed-binned from mid to high.
 
Dr. Ffreeze said:
Jawed,

Well the first dual-core Intel and AMD CPUs are going to cost a minimum of around $700 a piece, at a guess

Intel 2.8GHz dual core = $240 or so they say, WOW. =)

Now that sounds like fun! Blimey, what with that and M$ apparently offering a free upgrade to 64-bit WinXP, it seems the Wintel alliance is running scared...

Jawed
 
The main advantage of what is "thought" to be ATI's multi-card implementation would be NOT having to have the exact same card. I just traded in my 6800NU for an X800XL..... two of those puppies - at a cost of about the same as an X850/6800U - should rock. Let's just hope that, along with the freedom of using different video cards, the whole package is much more usable than SLI......
 
It has yet to be seen whether or not ATI's solution has the same limitations as nVidia's with regard to games and such.

At least it's looking like you won't need two of the same card, so that's a point for ATI right there.
 
Some more info about how it works:

source: http://www.xbitlabs.com/articles/editorial/display/cebit2005-2.html

Here is what ATI says to its partners:

- Multiple ATI RADEON X800 XT boards cooperatively rendering a single frame;
- Requires two physical x16 connectors on the mainboard;
- Load balancing and synchronization implemented entirely in software;
- No physical connector required between devices;
- Currently assumes two identical graphics devices installed in both connectors;
- Offers several user-selectable modes of multi-processing;
- Works with any PCIe north bridge.
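
Reading between the lines of the "load balancing and synchronization implemented entirely in software" bullet, a driver could plausibly rebalance a split-frame workload from measured per-GPU frame times. A minimal sketch of that idea (the class, names and numbers are hypothetical, not ATI's actual implementation):

Code:
# Hypothetical sketch of software load balancing for split-frame rendering:
# shift the screen split towards the GPU that finished its portion sooner.
class SplitFrameBalancer:
    def __init__(self, frame_height, step=8):
        self.frame_height = frame_height
        self.split = frame_height // 2   # scanline where the top/bottom split sits
        self.step = step                 # scanlines to shift per frame

    def rebalance(self, top_gpu_ms, bottom_gpu_ms):
        # If the top GPU took longer last frame, hand it fewer scanlines next frame.
        if top_gpu_ms > bottom_gpu_ms:
            self.split = max(self.step, self.split - self.step)
        elif bottom_gpu_ms > top_gpu_ms:
            self.split = min(self.frame_height - self.step, self.split + self.step)
        return self.split

balancer = SplitFrameBalancer(frame_height=1024)
# Made-up per-frame timings: the top GPU is slightly slower, so it loses 8 lines.
print(balancer.rebalance(top_gpu_ms=16.0, bottom_gpu_ms=14.0))  # 504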
 
- Works with any PCIe north bridge.

If true, that would be key. The obvious reason: it should work on any dual-slot motherboard... regardless of manufacturer. Less obvious reason: you could then conceivably make a single-slot AMR board... that works in ANY PCIe mobo.
 