PVR5 to be unveiled on the 18th!

jvd said:
True, but how efficient is Hyper-Z?

When rendering is optimized for front-to-back submission, efficiency is quite high. Back-to-front or random order would theoretically favour a TBDR.
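Just to illustrate the order dependence with a toy example (a minimal sketch, not how Hyper-Z or any real chip is implemented; the depths, layer count and pixel count are made up):

```cpp
// Toy illustration of why early-Z style rejection is order dependent.
// It only counts how much shading work a simple depth test saves for
// different draw orders; nothing here models real hardware.
#include <iostream>
#include <limits>
#include <vector>

int shadedFragments(const std::vector<float>& layerDepths, int pixels) {
    std::vector<float> zbuf(pixels, std::numeric_limits<float>::max());
    int shaded = 0;
    for (float z : layerDepths) {          // each "layer" covers every pixel
        for (int p = 0; p < pixels; ++p) {
            if (z < zbuf[p]) {             // depth test before shading (early Z)
                zbuf[p] = z;
                ++shaded;                  // fragment survives -> gets shaded
            }
        }
    }
    return shaded;
}

int main() {
    const int pixels = 10000;
    std::vector<float> depths = {0.1f, 0.2f, 0.3f, 0.4f};   // 4x overdraw scene

    std::vector<float> frontToBack = depths;                          // nearest first
    std::vector<float> backToFront(depths.rbegin(), depths.rend());   // farthest first

    std::cout << "front-to-back shades " << shadedFragments(frontToBack, pixels)
              << " fragments\n";           // 1x the screen: later layers rejected
    std::cout << "back-to-front shades " << shadedFragments(backToFront, pixels)
              << " fragments\n";           // 4x the screen: nothing rejected
    // A TBDR resolves visibility per tile before shading, so it would shade
    // roughly 1x the screen in either order, which is the point above.
}
```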

One of the advantages of hierarchical Z is that it uses varying tile sizes (macro/micro tiling); since IMG has a fairly old patent for macro/micro tiling, I'd guess it has been used past Series3, in conjunction with other improvements. You'll find references to it on more than one occasion, such as this one:

http://v3.espacenet.com/textdes?DB=EPODOC&IDX=US6839058&F=0&QPN=US6839058
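I obviously can't tell from a patent listing how the actual scheme works, but the general idea behind two-level macro/micro tiling is easy to sketch: bin a primitive against coarse tiles first, and only refine into the fine tiles inside the coarse tiles it actually touches. Purely a guess at the principle; the tile sizes and data layout below are made up:

```cpp
// Rough two-level (macro/micro) binning sketch -- a guess at the general
// idea only, not the patented scheme. Tile sizes here are arbitrary.
#include <iostream>
#include <utility>
#include <vector>

struct Box { int x0, y0, x1, y1; };               // screen-space bounding box

// Collect the micro tiles a primitive's box touches, walking macro tiles first.
std::vector<std::pair<int,int>> binToMicroTiles(const Box& b,
                                                int macro = 64, int micro = 16) {
    std::vector<std::pair<int,int>> hits;
    for (int my = b.y0 / macro; my <= b.y1 / macro; ++my)
        for (int mx = b.x0 / macro; mx <= b.x1 / macro; ++mx)
            // refine only inside the macro tiles the box overlaps
            for (int uy = my * macro; uy < (my + 1) * macro; uy += micro)
                for (int ux = mx * macro; ux < (mx + 1) * macro; ux += micro)
                    if (ux <= b.x1 && ux + micro > b.x0 &&
                        uy <= b.y1 && uy + micro > b.y0)
                        hits.emplace_back(ux / micro, uy / micro);
    return hits;
}

int main() {
    Box tri{30, 30, 90, 50};                       // toy primitive bounds
    std::cout << "touches " << binToMicroTiles(tri).size() << " 16x16 tiles\n";
}
```

The coarse pass keeps the fine-grained work proportional to what the primitive actually covers, which is presumably the attraction of mixing tile sizes in the first place.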

I haven't a single clue which generation the following belong to, but I found them particularly interesting:

http://v3.espacenet.com/textdoc?DB=EPODOC&IDX=WO2004114222&F=0&QPN=WO2004114222

http://v3.espacenet.com/textdoc?DB=EPODOC&IDX=WO2004095376&F=0

What I'm trying to say is that without knowing more details about the exact techniques being employed, most calculations might be off base.

That would also be assuming that the X850 has no overdraw reduction techniques, which is not the case for it, or for any current video card.

True. The initial Radeon already had a hierarchical Z-buffer though, even the SDRAM Radeon256. ATI has obviously evolved its techniques and added numerous new ones since, and I wouldn't expect that nothing has changed on the PowerVR side since Series3 either, quite the contrary.
 
Personally, I don't think we'll be seeing any PowerVR technology on the PC desktop as the market is now dominated by the two behemoths that are ATI and NV.

I'm still interested in the technology itself, however, so I'm looking forward to seeing what they come up with on the arcade side of things. The Naomi boards were highly thought of, so it will be interesting to see what they can do with technology several generations ahead of that. It's also good to see plenty of new design wins with Philips/TI and others for use in mobile phones, PDAs etc.

I still think that the PowerVR tech would be absolutely ideal for PC on-board video, but it looks as though the big two + Intel have this market sewn up as well now. I suppose our only hope is that Intel will license PowerVR for use in a newer (and less crappy) GMA++!
 
Chalnoth said:
For what? To hear nothing of interest to the PC world?

If by that you mean "nothing of consequence to the PC world", then I believe that's entirely correct.

However, I do believe that, if there is an announcement that goes into specifics on their hardware, it will most certainly be interesting!
 
I don't know. For me the two wordings are absolutely identical. After all, the PC is a very different market, and thus a part based upon the same technology but made for the PC market may turn out to be very different indeed.
 
Chalnoth said:
I don't know. For me the two wordings are absolutely identical. After all, the PC is a very different market, and thus a part based upon the same technology but made for the PC market may turn out to be very different indeed.

Yes, but surely any new graphics technology interests you, not just PC graphics chips.
 
I guess what I'm saying is that I don't expect PVR to be presenting anything that we either haven't seen before, or that wouldn't be an obvious convergence of stuff we've seen before combined with process shrinks.
 
Well, I expect that, if it happens (and I'm not convinced it will), they will be announcing a Series5-based product.
 
Sage said:
Well, I expect that, if it happens (and I'm not convinced it will), they will be announcing a Series5-based product.
Which, as I attempted to imply, I don't expect will bring any new technology we haven't seen before, just a tile-based renderer that supports shaders. I seriously doubt PVR will bring anything truly new to the table; the only mildly interesting thing is the confluence of these technologies. But since we can't compare performance, which is the sole interesting thing about bringing these technologies together, it's just uninteresting.
 
But surely a deferred renderer supporting shaders is, in itself, something new and therefore something interesting?

Deferred renderers were interesting in comparison to traditional IMRs in the days before shaders, so what has changed now?
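To put the question concretely: a TBDR resolves visibility per tile before shading, so a pixel shader (however expensive) only ever runs on visible fragments, and shading cost scales with screen pixels rather than overdraw. A rough, completely generic sketch of that flow, not based on anything Series5-specific:

```cpp
// Generic TBDR flow sketch: visibility first, shading second.
// Nothing here is specific to Series5 or any real chip.
#include <iostream>
#include <vector>

struct Fragment { int pixel; float depth; int material; };

// Phase 1: per-tile hidden-surface removal -- keep only the nearest fragment.
std::vector<Fragment> resolveTile(const std::vector<Fragment>& binned, int tilePixels) {
    std::vector<Fragment> visible(tilePixels, Fragment{-1, 1e30f, -1});
    for (const Fragment& f : binned)
        if (f.depth < visible[f.pixel].depth)
            visible[f.pixel] = f;
    return visible;
}

int main() {
    // 4-pixel tile with 3x overdraw: 12 fragments binned, only 4 visible.
    std::vector<Fragment> binned;
    for (int layer = 0; layer < 3; ++layer)
        for (int p = 0; p < 4; ++p)
            binned.push_back({p, 1.0f - 0.1f * layer, layer});

    int shaderInvocations = 0;
    auto expensiveShader = [&](const Fragment&) { ++shaderInvocations; };

    // Phase 2: the pixel shader only runs on fragments that survived
    // visibility resolution, regardless of how deep the overdraw was.
    for (const Fragment& f : resolveTile(binned, 4))
        if (f.pixel >= 0)
            expensiveShader(f);

    std::cout << binned.size() << " fragments binned, "
              << shaderInvocations << " shader invocations\n";   // 12 vs 4
}
```

Whether that advantage still matters with modern early-Z IMRs is exactly the performance question we can't answer from an announcement alone.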

I don't think we'll see anything for the PC space, yet I'm still interested in what they will show (allegedly).
 
I think I understand Chalnoth's point: he (gosh, I'm so used to thinking of Chal as a she, I think it was Tag who told me Chal was a he like a year or two ago) just has this bias against PowerVR and deferred rendering in general. So, of course he wouldn't find it interesting. Just like I wouldn't find it particularly interesting if someone made a new Superman comic book series: to me, it's all the same boring crappy stuff and my personal biases will prevent me from seeing anything more than that.
 
Sage said:
I think I understand Chalnoth's point: he (gosh, I'm so used to thinking of Chal as a she, I think it was Tag who told me Chal was a he like a year or two ago) just has this bias against PowerVR and deferred rendering in general. So, of course he wouldn't find it interesting.

...which is perfectly reasonable, really. If you look at the history of deferred rendering products on the PC, they've always been mid-range or low-end, eventually slumping into mediocrity. Plus, it's always been said that the high end is where the mindshare is, and in the high end nobody has even heard of PowerVR. Until an actual high-end deferred renderer comes out, all these "What if...?" scenarios will stay old and boring.
 
dksuiko said:
...which is perfectly reasonable, really. If you look at the history of deferred rendering products on the PC, they've always been mid-range or low-end, eventually slumping into mediocrity. Plus, it's always been said that the high end is where the mindshare is, and in the high end nobody has even heard of PowerVR. Until an actual high-end deferred renderer comes out, all these "What if...?" scenarios will stay old and boring.

Umm... how long have you been into 3D graphics hardware on the PC? Unreal actually had a dedicated PowerVR path because back then PowerVR was really awesome. And Chal's bias is not only against PowerVR but against deferred rendering altogether. Well, I suspect that if nVidia came out with a deferred renderer he would eventually come around and start liking it.
 
Sage said:
Umm... how long have you been into 3D graphics hardware on the PC? Unreal actually had a dedicated PowerVR path because back then PowerVR was really awesome.

As if it matters any, I've been following 3D graphics hardware since, I don't know, the first 3D graphics-related article in PC Gamer? I was a big fan of that magazine back in the day, but who knows.

Anyway, the fact that Unreal had a dedicated PowerVR path means nothing when it comes to overall high-end performance. How long have you been following PowerVR? Do you not know that they never released a high-end performance part? Do you not know that they never held the performance lead? All of which were my main points. Yeah, the PVR250, Kyro and the rest of the family had cool technologies behind them and whatnot, but when it came to sheer high-end performance, they never showed their face. You can attribute whatever reasons you wish to that fact. But ultimately, the reason they never released a high-end part is less relevant than the fact that they never did. They have a track record, and it's not about high performance.

Well, I suspect that if nVidia came out with a deferred renderer he would eventually come around and start liking it.

I guess. But I say... just let him be.
 
Well, I do have objections to deferred rendering as a robust way of moving forward with 3D rendering, but I'm not going to argue that here.

What I will say is that the interesting thing about merging deferred rendering with modern shader technologies primarily revolves around performance aspects, which, as I've said, you could not extrapolate to PC parts from this upcoming news release (and that would be the only way I could get any sort of feel for how good the technology is). Thus it will be uninteresting (to me). I don't see how this statement actually has anything to do with bias.

Now, on the other hand, it is conceivable that there will be some interesting aspects of the technology, mostly revolving around limits that current hardware has and this technology doesn't. But since these things probably won't be obvious at announcement, once again, the upcoming announcement will be uninteresting.
 