R420 Hints from people under NDA!

Ailuros said:
It has; history is simply repeating itself. Nothing that I wouldn't have expected whatever the storyline would have looked like.
So you are saying that since history is repeating itself then everything is fine, and we should leave such irrational statements uncommented :rolleyes:
I guess we could also start WWIII now; it will start sometime anyhow...

If you mean to tell me that there aren't any users on these boards with any type of agenda, then I guess yes my comment was inappropriate.
There you go again, degrading my (our) intelligence. Of course there are users with certain agendas, but that was not what I meant. What I meant was that there are some very open-minded people posting here, not to mention the guys who run the site, who IMO give lessons every day on how a tech web site should be run. The overall quality of posters and topics is well beyond that of the original post that was the reason for this thread, and you are in fact defending it using the "they would do the same" excuse. I find that infuriating.


For the record you didn't say anything either about NVIDIA's official reactions either.
I missed that, care to elaborate?

However if there should be something in that observation that is exaggerated or wrong, you could address it as an adult as you say. Shouldn't be too hard either.
I hope I did.
 
Doomtrooper said:
Yes, a growing trend around here: Ailuros, one of the more respected, knowledgeable members, being slammed by a few-post wonder, the same kid who is telling everyone a 5950 is just as good as a 9800XT in PS 2.0.
Every launch is the same.

The amount of respect I have for people on these forums is in no way connected to the number of posts they have made.
 
Doomtrooper said:
Yes, a growing trend around here: Ailuros, one of the more respected, knowledgeable members, being slammed by a few-post wonder, the same kid who is telling everyone a 5950 is just as good as a 9800XT in PS 2.0.
Every launch is the same.
:?
I think you are the only person who has any idea what you are talking about.
 
Re: No indications of valid info on Rage3d

Bouncing Zabaglione Bros. said:
UE3 is going to be a quantum jump. It's not like HL2 with a path designed to run on older hardware. It's going to be a complete break which will simply not work on anything older than NV40/R420, and even then only at its lowest settings.

Not any more of a jump, IMO, than regular Quake 3 was. Most likely less, because they will likely have tons of backwards compatibility.

And btw, on Far Cry perhaps you are right; I did not think about the fact that they used a tiny level. I was simply saying that it did look very nice.

edit:
I find a certain thing ironic. The people who are saying don't buy the 6800 for Unreal 3 (btw, they are right) should have said (and maybe they did) don't buy the 9700/NV30 for HL2 or Doom 3. I cannot comprehend the number of people who bought cards like this specifically, in their words, for those two games, and we are still waiting and waiting. It is always better to buy the card after the game. If someone buys a 6800 before seeing the competition, to me that is no more irrational than buying a 9800XT; those were ridiculously expensive for a mere fraction of improvement over the Pro model.
 
Bjorn said:
The amount of respect I have for people on these forums is in no way connected to the number of posts they have made.

No shit... there are people joining this forum right now for the sole purpose of trolling and attacking members who may have a different view. That is my point.
 
Doomtrooper said:
Yes, a growing trend around here: Ailuros, one of the more respected, knowledgeable members, being slammed by a few-post wonder, the same kid who is telling everyone a 5950 is just as good as a 9800XT in PS 2.0.
Every launch is the same.
Sayeth the ultimate fanboy. :rolleyes:

























j/k! ;)

Sorry, I just can't resist sometimes. :LOL:

Oh yeah, btw....

DemoCoder said:
Owned and Pawned. Muahah. Feel it. Vindicated. Mother. Combat. Boots. Heat Sinks...

Damn it man, have I told you what an absolute bitch it is to clean coffee off the anti-glare screen on my new monitor...you bastage! ;)
 
martrox said:
Jeez, DoS, you've been here a few months, have 70 posts, and want to put the smack on a longtime member like Ailuros? Do you understand the term "chutzpah"? Having spent a few years and plenty of product cycles here, we have seen this kind of thing at every new introduction... It's not going to stop, and denying it's happening is a waste of bandwidth. And if you haven't seen the fanboy BS crap around here, then you are either not checking out other threads or you are blind...

I wouldn't normally reply to this, but I am really interested in understanding your logic. First of all, I didn't know that posters are classified by the number of posts they have; if that's the case then I should probably stop speaking my mind and bow to other members with 2000+ posts. Secondly, FYI, the amount of time that one has been registered may not be directly related to the time he has been following a forum. You also say that we have seen this kind of thing at every new introduction, and I say this cannot be an excuse, damn it. Ailuros, in this case, thought the original post's flawed logic was OK just because "if ATI had launched first the same thing would have happened" and "the other side has been doing it in other forums for ages". Whenever there are similar irrational posts/statements by the "other side", they are discussed in B3D's forums most of the time and people discredit them, and rightfully so. Lastly, I have seen very isolated occurrences of f*nbo* crap here, and mostly from some very specific individuals.
 
Do you think that the X800 (at least the XT part) will be more bandwidth limited than the 6800? It seems that the core will be clocked much higher but not the memory :/ Any hints/information/clues/expectations/theories accepted ;)
 
DemoCoder said:
It's not equivalent. How would you do motion estimation in the shaders efficiently? Only part of the MPEG pipeline is amenable to stream operations. The shaders help with removing artifacts, deblocking, potentially even block transforms, but there's a lot more. There would be no need for the Theater chip otherwise.

For any given chip, both systems have a finite amount of compute ability, so regardless of what happens, some operations in both systems are done on the CPU. One advantage that VideoShader has is that its compute ability scales with 3D performance.

However, I suspect this isn't ATI's only answer to video functionality.
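
Since motion estimation keeps coming up: the reason it's the awkward stage is that it's a search problem, not per-pixel math. Here's a toy full-search block matcher in Python, purely illustrative (no real encoder, let alone VideoShader, works this literally):

[code]
# Toy full-search block matching: for each 16x16 macroblock, scan a
# +/-8 pixel window in the reference frame for the lowest SAD (sum of
# absolute differences). Assumes the block plus search window stays
# inside the frame; frames are 2D lists of luma values.

def sad(cur, ref, bx, by, dx, dy, n=16):
    """Sum of absolute differences between a block and a shifted block."""
    total = 0
    for y in range(n):
        for x in range(n):
            total += abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
    return total

def best_motion_vector(cur, ref, bx, by, search=8):
    """Exhaustive search; the winner is data-dependent, a poor fit for
    pixel shaders that must write every output independently."""
    best, best_score = (0, 0), sad(cur, ref, bx, by, 0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            score = sad(cur, ref, bx, by, dx, dy)
            if score < best_score:
                best_score, best = score, (dx, dy)
    return best, best_score
[/code]

Each macroblock's vector comes from comparing hundreds of candidate scores, which is exactly the kind of data-dependent reduction this generation of shaders handles poorly; deblocking and colour conversion, by contrast, are per-pixel and map nicely.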
 
Re: No indications of valid info on Rage3d

Sxotty said:
Not any more of a jump, IMO, than regular Quake 3 was. Most likely less, because they will likely have tons of backwards compatibility.

They won't. Epic's new engine will be built around tech that takes NV40/R420 performance and features as a minimum requirement. I don't know how it can be any clearer when that's what Tim Sweeney has publicly said.

When you've made that decision, you can't be backwards compatible any more than a DVD is backwards compatible with a VHS player. Once you've decided to build around new tech, your only options are to code the whole thing again for older hardware and not be building the same game, or simply cut loose the backwards compatibility. Epic have chosen the latter.

Sxotty said:
And btw, on Far Cry perhaps you are right; I did not think about the fact that they used a tiny level. I was simply saying that it did look very nice.

Yes it does, but on a top-end machine with an R3x0 running all settings high on that same small level, it will look the same under PS 2.0. It's just a bit of marketing BS on Nvidia's behalf to present a pair of pictures implying they are a PS 2.0 vs PS 3.0 comparison, when actually they are a no-PS vs PS 3.0 comparison.


Sxotty said:
edit:
I find a certain thing ironic. The people who are saying don't buy the 6800 for Unreal 3 (btw, they are right) should have said (and maybe they did) don't buy the 9700/NV30 for HL2 or Doom 3. I cannot comprehend the number of people who bought cards like this specifically, in their words, for those two games, and we are still waiting and waiting. It is always better to buy the card after the game. If someone buys a 6800 before seeing the competition, to me that is no more irrational than buying a 9800XT; those were ridiculously expensive for a mere fraction of improvement over the Pro model.

Yep, it's always the case that if you are looking to upgrade for a specific title, you should wait for that title to actually ship. Not only will there be new and better hardware available, everything else will have gotten cheaper in the interim.

Having said that, there are always those people with lots of money looking for any excuse to spend it on their hobby, regardless of what that is.
 
Doomtrooper said:
I swear as long as I've seen you on the forums DW, I'm sure you must start your day with some Irish Cream in that coffee ;)
Believe it or not, I don't think I've had a drink in years, and it's been at least 6-7 years since I've had anything other than a social drink at X-mas. I used to be a total lush, but "the love of a good woman" and all that rot, and I just haven't wanted to.

Besides, by the time I'm sitting down to cruise the boards in the morning I've already gotten up, fed/cleaned/dressed my son/daughter/wife (well, I get to dress her if I'm lucky... ;) ) and dealt with taking out the 3 dogs and greeting the two cats, so I'm kind of in a state-o-relief-near-hysteria by the time I'm sitting down in front of my 'puter checking out what has gone on from the night before.

Oh, that and I'm a little baked this morning....


But I gotta agree that there is just too much partisan arguing going down before the contestants are even out! This isn't like last time, where one IHV releases a card and the other IHV is screaming, "No, WAIT FOR OURS, IT'S BETTER!"; this is a case of two IHVs releasing within weeks of each other, and we're just in the in-between "silly season from hell to end all silly seasons" that is a first for me.

Ride it out, hang on...it'll pass. I'm spending some time upgrading/tweaking and actually playing some games until it passes. :)
 
Evildeus said:
Do you think that the X800 (at least the XT part) will be more bandwidth limited than the 6800? It seems that the core will be clocked much higher but not the memory :/ Any hints/information/clues/expectations/theories accepted ;)

It'll be more interesting to find out how much that matters. R300 should have been quite bandwidth limited in many cases, and yet you could hardly ever say it had high memory clocks, especially not in comparison to the FX parts. Clearly R300's memory bus was very efficient but not high speed - will R420 still maintain good efficiency whilst being higher performance?
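
To put some very rough numbers on that efficiency point: features like Z and colour compression effectively shrink the traffic that has to cross the bus, so a modest clock can behave like a much faster one. The compression ratios below are made-up illustrative values, not ATI's published figures; only the 9700 Pro clock and bus width are real specs:

[code]
# Back-of-envelope "effective" bandwidth: raw bandwidth divided by the
# fraction of traffic that still crosses the bus after compression.
# Compression ratios here are ILLUSTRATIVE ASSUMPTIONS, not R300 data.

def raw_bandwidth_gbs(mem_clock_mhz, bus_bits, ddr=True):
    bytes_per_clock = bus_bits / 8 * (2 if ddr else 1)
    return mem_clock_mhz * 1e6 * bytes_per_clock / 1e9

def effective_bandwidth_gbs(raw, z_share=0.5, z_ratio=4.0, color_ratio=1.5):
    # Z traffic shrinks by z_ratio, the rest (colour etc.) by color_ratio.
    remaining = z_share / z_ratio + (1 - z_share) / color_ratio
    return raw / remaining

raw = raw_bandwidth_gbs(310, 256)  # 9700 Pro: 310 MHz DDR on a 256-bit bus
print(f"raw {raw:.1f} GB/s, behaves like ~{effective_bandwidth_gbs(raw):.1f} GB/s uncompressed")
[/code]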
 
John Reynolds said:
Guys, get back on topic or I'm locking this thread.
My apologies, John; I posted my previous before I saw this go up. I will abstain from any more personalities/observations and will only post if I have something useful/funny pertinent to the thread in question. :)

Sorry, I just didn't want you to think I was ignoring you...I'm not.
 
Cheers Doomtrooper - you really know how to make a new guy feel welcome ;)

In relation to ATI's VideoShader: I believe I'm correct in stating that it's only used in ATI's own software and the DivX player. It's not used in Media Player or PowerDVD, etc.

When comparing image quality, I can see no difference between ATI's software and Media Player. Admittedly I haven't compared CPU usage, but I have to ask: what's the point?

If I'm watching a movie, I'm not going to be performing a CPU-intensive task such as rendering, so who cares if the CPU is taking on the load?
 
DaveBaumann said:
It'll be more interesting to find out how much that matters. R300 should have been quite bandwidth limited in many cases, and yet you could hardly ever say it had high memory clocks, especially not in comparison to the FX parts. Clearly R300's memory bus was very efficient but not high speed - will R420 still maintain good efficiency whilst being higher performance?
Yeah, but as the R3** (and I suppose the R420) offer 6x (8x?) AA, I think that's the case where it will become interesting. When we look at the 6800U, it seems that most of the time it's CPU limited, which is a pity, and I would certainly say that the R420 will also be CPU limited.

I don't see why the R420 wouldn't be as efficient as the R3**. But if the X800 XT is at 500 MHz as the hints/rumors say, 500-600 MHz memory seems a bit limiting, at least more so than on the 6800U...
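
Quick back-of-envelope on those clocks; the X800 XT figures are just the rumours in this thread, so treat them as assumptions:

[code]
# Bytes of bandwidth available per pixel written. 6800 Ultra figures are
# the launch specs; the X800 XT core/memory clocks are RUMOURED values.

def bandwidth_gbs(mem_mhz, bus_bits=256):
    return mem_mhz * 1e6 * bus_bits / 8 * 2 / 1e9   # x2: DDR/GDDR3

def fillrate_gpix(core_mhz, pipes=16):
    return core_mhz * 1e6 * pipes / 1e9

for name, core, mem in [("6800 Ultra", 400, 550),
                        ("X800 XT (rumoured)", 500, 550)]:
    bw, fill = bandwidth_gbs(mem), fillrate_gpix(core)
    print(f"{name}: {bw:.1f} GB/s, {fill:.1f} Gpix/s, {bw/fill:.1f} bytes/pixel")
[/code]

Same bandwidth but more fillrate means fewer bytes per pixel to play with, which is exactly the "more bandwidth limited" worry.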
 
DaveBaumann said:
It'll be more interesting to find out how much that matters. R300 should have been quite bandwidth limited in many cases, and yet you could hardly ever say it had high memory clocks, especially not in comparison to the FX parts. Clearly R300's memory bus was very efficient but not high speed - will R420 still maintain good efficiency whilst being higher performance?

Surely that's a hint towards Extreme pipes if ever I saw one. I actually think Dave gets a lot of enjoyment out of dropping cryptic hints :oops:
 
DoS, there has been a real rise in the volume of fanboys in the last 2-3 months here..... and there has been an equal rise in responses..... It's happened before, many times...... all the way back to the original GeForce/Voodoo3 arguments. Nothing is going to stop it, so why add to the noise level, heh? There's no need to get personal about any of this..... It would be very nice IF everyone - yes, John, even ME! ;) - would stick to the subject.
 
One thing I am confused about is the video processing capabilities. I was of the understanding that Nvidia now has a separate chip to do it; isn't that kind of like comparing the TV Wonder, or the All-In-Wonder series, to a regular R3xx? And if so, isn't the comparison equally obvious about which is better with regard to video functionality? Furthermore, although I cannot find it now (GRRR), Dave pointed out something like this: "I don't think that ATI will rely solely on the shaders for video functionality in the R420." I swear he did ;) No, really, I think I remember it, and if someone can point it out it would be nice. That says to me maybe they too will decouple it from the GPU, which is a good idea for heat/power and downclocking in 2D.

edit:

BTW, I find it humorous that Nvidia's flub-up has given separate 2D/3D clocks a boost out of necessity for them. I think the 2D/3D clocks and the adjustable fan based on temperature (<-- from the 9800XT, I think, anyway) combined are an excellent thing. If you base fan speed on temp and not on what mode the GPU thinks it is in, you don't get meltdowns from screensavers and the like, as Nvidia almost experienced due to oversights by the designers of the drivers. A minimal sketch of what I mean is below.
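
The sketch, with made-up thresholds: drive the fan from the sensor reading, not from the driver's guess about what mode it's in:

[code]
# Fan control from measured temperature vs. from a 2D/3D mode flag.
# Thresholds and duty cycles are MADE-UP illustrative numbers.

def fan_duty_from_temp(temp_c):
    """Map die temperature to fan duty cycle (0.0-1.0)."""
    if temp_c < 50:
        return 0.30                                  # idle floor
    if temp_c < 80:
        return 0.30 + 0.70 * (temp_c - 50) / 30.0    # linear ramp
    return 1.00                                      # full blast

def fan_duty_from_mode(mode):
    """The fragile alternative: trusts a flag that can simply be wrong."""
    return 1.0 if mode == "3D" else 0.3

# A GPU-heavy screensaver the driver classifies as "2D":
print(fan_duty_from_mode("2D"))   # 0.3 -> cooks the card
print(fan_duty_from_temp(85))     # 1.0 -> saved by the sensor
[/code]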
 