Time to scotch those "R300 isn't fully DX9" rumours?

Aye, I thought that, assuming V means Visual and not, say, Video.

They are also promoting the 9000 Pro as the fastest GPU, aren't they? Which would make sense then.

9000 - fastest Radeon GPU
9700 - fastest Radeon VPU
 
phynicle said:
hmmm don't like the sound of only 9.0

Err, right, so you subscribe to the theory that DX9 will be splintered into DX9 and DX9.1 again, do you? On a DX version that isn't even out yet?

Is that what you mean?
 
If there is a DX9.1, I don't think there will be any big changes over DX9, not like there were for DX8.1 over DX8. DX8 was sort of a failure, with some devs complaining about the limited pixel and vertex shader support, so they brought out DX8.1 to fix that. But I doubt that will happen with DX9. Pixel/Vertex Shader 2.0 will likely be as good as it gets until DX10.
 
But I doubt that will happen with DX9. Pixel/Vertex shader 2.0 will likely be as good as it gets until DX10.

I wouldn't be betting any body parts on that prediction, Darren.
 
*ATI's Radeon 9700 128MB DDR compared with Nvidia Geforce 4 Ti4600 128MB DDR, as measured by Unreal Performance Test v918 and 3DMark 2001 SE.
Tested on the following system: P4-2.4 GHz CPU, Intel 850e (?) chipset 512MB PC800 memory, Windows XP Pro with ATI driver v.Alpha1 and Nvidia driver v2832. Resolution 1024*768 and 1280*1024.

 
I wouldn't be betting any body parts on that prediction, Darren.

Nah, but it just seems sensible to me. MS have to stop somewhere for a decent amount of time, or support for a good form of pixel shaders will never come from devs. The pixel shader tech in DX8 was obviously not good enough in a lot of devs' opinions, so it was changed in DX8.1 quite quickly to make it more adequate. But once 2.0 is out with DX9 it should be more than adequate, in which case it should be left like that at least until DX10.

That way devs don't have a situation where they have one card supporting only 2.0, which they have to make sure runs the pixel shader ops in their game, and another card with 2.1 (or whatever) that they also need to support to get the best out of it. I just think there should be a standard pixel shading technique for a little while, so that devs have a chance to get their heads around it before they suddenly have pixel shader 2.1/2.2/2.3/2.4 thrown at them.
 
1. Higher-level shading languages will make differentiation between hardware designers easier.

2. We don't yet have any evidence as to whether or not the R300 will be truly a "DX9" part, provided you define this as follows:

A true DX9 part is one that is comprehensive, that includes everything truly important about DX9. If ATI has left out anything important that causes developers to just want to use the same codepath for the R300 as they use for DX8 cards, then the R300 is not a DX9 part.

I suspect that the R300 will be considered a fully-DX9 part, but that the NV30 will just be better (and it had better be, considering it's coming out later).

I further suspect that the R300 will be a very high-end part for its lifetime, meaning it will have little impact on the market as a whole for this fall.

Personally, I'm much more interested in the R300's FSAA/anisotropic filtering and PS/VS quality than in its raw performance.
 
Fin said:
*ATI's Radeon 9700 128MB DDR compared with Nvidia Geforce 4 Ti4600 128MB DDR, as measured by Unreal Performance Test v918 and 3DMark 2001 SE.
Tested on the following system: P4-2.4 GHz CPU, Intel 850e (?) chipset 512MB PC800 memory, Windows XP Pro with ATI driver v.Alpha1 and Nvidia driver v2832. Resolution 1024*768 and 1280*1024.

Quote from where?
 
T2k said:
Fin said:
*ATI's Radeon 9700 128MB DDR compared with Nvidia Geforce 4 Ti4600 128MB DDR, as measured by Unreal Performance Test v918 and 3DMark 2001 SE.
Tested on the following system: P4-2.4 GHz CPU, Intel 850e (?) chipset 512MB PC800 memory, Windows XP Pro with ATI driver v.Alpha1 and Nvidia driver v2832. Resolution 1024*768 and 1280*1024.

Quote from where?

The advertisement. It's on the right-hand side. It's the explanation of the asterisk next to "Fastest".
 
I was told that the plan was always to forge ahead with the launch, even if DX9 was delayed; Orton was very keen to beat nVidia to market by a few months. They were confident that any minor problems could be patched over in software after the specification was finalised. I'm sure they have as good an idea of the concrete DX9 specs as anybody though, seeing as they were heavily involved in its development. Same goes for AGP 3.0, which is supposedly still in "draft" form.

MuFu.
 
AGP 3.0 is not in 'draft' form. It is finalised... the KT400 and NF2 are both AGP 3.0 compliant - Intel finalised the spec many moons ago.

For it to be fully DX9.0 compatible, you can assume DX9 is also concrete [in specs at least] too.

Remember the rumours that occurred last year about DX8.0 being late? It was not.
I even remember rumours that DX7.0 was going to be late. It was not.
 
misae said:
AGP 3.0 is not in 'draft' form.

Ah, ok. Sorry, I was just going by this: http://www.intel.com/technology/agp/agp_draft9.htm

This draft version of the AGP Specification 3.0 ("Draft Specification") may be downloaded and reproduced only for your internal review and comment to Intel. This document is NOT the final version of the Specification and is subject to change without notice. A modified, final version of this Specification ("Final Specification") when approved by Intel will be made available for download at this Web Site.

MuFu.
 