Natoma said:Oi.... Just release the damn cards. :?
At least we know when ATI is releasing their stuff. Nvidia? Hello? Paper launched 3 weeks ago and nothing, nadda, el zippo? Gah.
jolle said:While on the subject, were the Albatron site numbers brushed off as nonsense?
Pretty insane specs if people doubt 475MHz..
Ardrid said:Natoma said:Oi.... Just release the damn cards. :?
At least we know when ATI is releasing their stuff. Nvidia? Hello? Paper launched 3 weeks ago and nothing, nadda, el zippo? Gah.
45 days from launch is the official tagline, as I'm sure you already know. Some manufacturers are actually supposed to be releasing early this month, and Tamasi claims you'll definitely be able to get something by Memorial Day.
digitalwanderer said:Natoma said:You're just guessing. I'm looking for an official statement.
I am NOT guessing! I told you, my geek-sense got that special tingle about July....how much more bloody official do you really need than that? :|
Ardrid said:jolle said:While on the subject, were the Albatron site numbers brushed off as nonsense?
Pretty insane specs if people doubt 475MHz..
Yeah, I'm thinking so. The funny thing is they still mention 600MHz on their site.
digitalwanderer said:Anyone else sort of find it funny that they specifically mention how nVidia is going to change the BIOS to adjust their clock speed? It just sort of felt like they wanted to put in an extra technical term or something.
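For context, a BIOS clock change is plausible mechanically: video BIOSes typically carry the board's default core and memory clocks in a small data table that the driver reads at initialization, so a vendor can raise the shipping clock just by flashing a revised image. A minimal sketch of the idea, assuming a made-up table layout (the real NV40 VBIOS format isn't public, and the struct and field names here are invented for illustration):

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical VBIOS clock table -- layout and field names are
 * invented for illustration; the real NV40 BIOS format differs. */
struct vbios_clock_table {
    uint16_t core_clock_mhz;   /* default core clock, e.g. 400 */
    uint16_t memory_clock_mhz; /* default memory clock, e.g. 550 */
};

/* Bumping the shipping clock would just mean flashing a BIOS image
 * whose table carries a higher default -- no silicon change needed. */
int main(void) {
    struct vbios_clock_table old_bios = { 400, 550 };
    struct vbios_clock_table new_bios = old_bios;
    new_bios.core_clock_mhz = 450;  /* the rumoured bump */

    printf("old default: %u MHz core\n", (unsigned)old_bios.core_clock_mhz);
    printf("new default: %u MHz core\n", (unsigned)new_bios.core_clock_mhz);
    return 0;
}
```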
Applying CellMath to math-critical blocks in graphics chips can reduce overall chip area by up to 10%, saving millions of dollars in manufacturing costs for each design. It also improves performance in processor designs to varying degrees, depending on the application.
It may emerge that the whole GeForce 6-series (NV4x family) of graphics processors already uses the CellMath capabilities to boost the efficiency of its integrated circuits; however, NVIDIA has never announced this.
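To put that "up to 10%" in perspective: die cost scales roughly inversely with dies per wafer, so even a modest area shrink compounds across a whole production run. A rough back-of-the-envelope sketch, where every number below is an assumed placeholder rather than real NV40 data:

```c
#include <stdio.h>

/* Back-of-the-envelope: cost per die ~ wafer cost / dies per wafer,
 * and dies per wafer ~ wafer area / die area (ignoring edge loss
 * and yield). All numbers are illustrative, not NV40 figures. */
int main(void) {
    double wafer_cost = 5000.0;   /* assumed cost per wafer, USD */
    double wafer_area = 70685.0;  /* 300mm wafer, mm^2 (pi * 150^2) */
    double die_area   = 280.0;    /* assumed die size, mm^2 */
    double shrink     = 0.10;     /* CellMath's claimed area saving */

    double dies_before = wafer_area / die_area;
    double dies_after  = wafer_area / (die_area * (1.0 - shrink));

    printf("cost/die before: $%.2f\n", wafer_cost / dies_before);
    printf("cost/die after:  $%.2f\n", wafer_cost / dies_after);
    return 0;
}
```

The point is just that 10% less area buys roughly 11% more candidate dies per wafer (1/0.9 ≈ 1.11), which over a design's full run is where the claimed millions in savings would come from.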
Actually, the speculation all along was that nV planned to up the clocks before shipping, especially given that their dev samples (UE3, Vegetto's cousin's uncle's roommate's parrot's) had 475MHz cores (though UE3 showed some artifacting, so stepping down to 450MHz may correct that). And it seems to me the BIOS controls clock speeds, so how else would they modify them? eVGA did the same with their 5900U (then later had to reduce speeds because of problems).
tazdevl said:MuFu said:PatrickL said:It would be nice to stop linking every Inquirer post
Especially ones that echo what has been said on here for weeks.
A lot of what's been echoed has been the release of a new uber card instead of a BIOS-modified Ultra that runs at a faster clock.
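Incidentally, the 475-vs-450MHz back-and-forth above is ordinary clock binning: stress the part, and if it artifacts, step the clock down until it passes. A minimal sketch of that loop, where stress_test_passes() is a hypothetical stand-in for whatever artifact check a vendor actually runs, and the 450MHz pass threshold is assumed from the rumours above, not measured:

```c
#include <stdbool.h>
#include <stdio.h>

/* Stand-in for a real stress test; here we just pretend anything
 * above 450MHz produces artifacts, echoing the rumours above. */
static bool stress_test_passes(int core_mhz) {
    return core_mhz <= 450;  /* assumed threshold, not measured data */
}

/* Step the core clock down from the target until the test passes --
 * the same logic as "475 artifacts, so ship at 450". */
int main(void) {
    int clock = 475;       /* rumoured dev-sample clock */
    const int step = 25;   /* assumed bin granularity */

    while (!stress_test_passes(clock))
        clock -= step;

    printf("shipping clock: %d MHz\n", clock);
    return 0;
}
```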
PaulS said:MuFu said:I'm not sure that the official spec is actually changing.
Up the auto-overclock level?
jolle said:UE3 had artifacting? I never noticed, then again the vid is all I've seen and it's pretty damn lowres..
Where'd you get that from? Never heard it before hehe..
Ardrid said:jolle said:UE3 had artifacting? I never noticed, then again the vid is all I've seen and it's pretty damn lowres..
Where'd you get that from? Never heard it before hehe..
http://www.theinquirer.net/?article=15115
jolle said:Ardrid said:jolle said:UE3 had artifacting? I never noticed, then again the vid is all I've seen and it's pretty damn lowres..
Where'd you get that from? Never heard it before hehe..
http://www.theinquirer.net/?article=15115
oh.. them..
EDIT:
Says UT2k4 there tho.. not UE3.0
PatrickL said:If you look in that forum, Fuad/Ardrid, you should find the posts that made Mark Rein post.
I still say that it's another card. Maybe the GT, unless we have evidence to the contrary. I mean, if you look at cars, GT says "UBER PERFORMANCE MODEL," not "Relatively High-End But We Still Make Faster Ones."
DaveBaumann said:AFAIK the reference clocks of the Ultra do not move.