Series 5, Parhelia 2...guys where are they?

Mmm, kinda near Dallas, just on the Oklahoma side of the border actually. And yeah, I get to the third Saturday sale once in a while too. However, I think this time I'm going to end up at the first one in the morning. Got two or three friends I'm taking, and afterwards we're going to go eat and spend some time at one of the Gaming Centers.
 
arjan de lumens said:
Snyder said:
What about Intel? Some form of Series 5 as part of a (Longhorn-capable...) chipset surely looks interesting to me...
Intel already has Longhorn capable 3d acceleration in their upcoming Grantsdale chipset, called Extreme Graphics 3; selling Series5 to Intel is as easy/hard as convincing them that the Extreme Graphics 3 (which AFAIK is a 4-pipeline tiler, albeit a bit less efficient than those of ImgTec) is not good enough for the markets they are trying to push the Grantsdale chipset into. Considering the past successes Intel has had with previous integrated 3d decelerators, I find that very unlikely.

Funny how almost anything that uses a tiled memory system, immediately becomes a tiler these days.

I haven't seen one "capable" integrated chipset from Intel ever, though I'd love to be proven otherwise. And of course I'd love to see the details as to why and where Intel's solution is even a deferred renderer.

In any case, IMHO S5 is way too complex to make it into an integrated chipset that early. The integrated market is a totally different story of its own; however, if Intel should see its integrated market share get seriously threatened, it wouldn't be entirely impossible to think of a license, and no, it doesn't have to be IMG at any price either, and of course it would be beyond Grantsdale. If Intel's graphics department was soooo capable, why didn't they just pop one of their great little tilers into the PDA/mobile market?
 
arjan de lumens said:
Snyder said:
What about Intel? Some form of Series 5 as part of a (Longhorn-capable...) chipset surely looks interesting to me...
Intel already has Longhorn capable 3d acceleration in their upcoming Grantsdale chipset, called Extreme Graphics 3; selling Series5 to Intel is as easy/hard as convincing them that the Extreme Graphics 3 (which AFAIK is a 4-pipeline tiler, albeit a bit less efficient than those of ImgTec) is not good enough for the markets they are trying to push the Grantsdale chipset into. Considering the past successes Intel has had with previous integrated 3d decelerators, I find that very unlikely.

Ah, yes - forgot Grantsdale/XG3.
 
Mmm, Parhelia 2, Pitou. Errr, back to the regularly scheduled specialty markets :(....

Ah well, Matrox, I hardly knew ye.

Hrm, Extreme Graphics 3, what do I want to call ye? Um, well, don't expect miracles out of the part. I seriously doubt Intel would have gone this route if not for Longhorn....
 
I'd be shocked if Intel licensed any core for their integrated chipset. The margins of the graphics portion of the chipset are probably so low it wouldn't make sense to share that small profit with another company. Plus, Intel probably just uses the same design tools they already have so they're not buying design tools just for this integrated chipset. Tools can make up a significant part of the budget for a small design team.
 
Yeah, I don't expect the Intel chipset to be anything you would want to use for gaming, but I think it's great that even value PCs will ship with PS/VS 2.0 capabilities, however small. We're still fighting to bring value computers up above the DX7 level (ugh, GeForce PCX 4000....), and having Intel, the one with the largest total market share, putting PS/VS 2.0 in everyone's hands will allow developers to start writing engines specifically for PS/VS 2.0 instead of bolting it onto a basically DX7 engine.
 
Last I heard, Matrox gave up on 3D. ATI and NVIDIA basically killed everyone else off in 3D and saturated the market, so no one else can make money in it.
 
Ailuros said:
arjan de lumens said:
Snyder said:
What about Intel? Some form of Series 5 as part of a (Longhorn-capable...) chipset surely looks interesting to me...
Intel already has Longhorn capable 3d acceleration in their upcoming Grantsdale chipset, called Extreme Graphics 3; selling Series5 to Intel is as easy/hard as convincing them that the Extreme Graphics 3 (which AFAIK is a 4-pipeline tiler, albeit a bit less efficient than those of ImgTec) is not good enough for the markets they are trying to push the Grantsdale chipset into. Considering the past successes Intel has had with previous integrated 3d decelerators, I find that very unlikely.

Funny how almost anything that uses a tiled memory system, immediately becomes a tiler these days.
The Extreme Graphics 1 & 2 are 'tile-based' in the same sense that the ImgTech chips are; rendering every polygon in a tile before proceeding to the next tile; I don't see why Extreme Graphics 3 would be different. If you don't believe me, look at Intel's papers describing the operation of Extreme Graphics, in particular the section on "Zone Rendering" technology.
 
I always thought all these "extreme shits" were i740-based...
I even remember a review of EG2 somewhere - the guys ran some tests with RightMark, and from the results it was obvious that it's not a "classic tiler".
 
arjan de lumens said:
Ailuros said:
arjan de lumens said:
Snyder said:
What about Intel? Some form of Series 5 as part of a (Longhorn-capable...) chipset surely looks interesting to me...
Intel already has Longhorn capable 3d acceleration in their upcoming Grantsdale chipset, called Extreme Graphics 3; selling Series5 to Intel is as easy/hard as convincing them that the Extreme Graphics 3 (which AFAIK is a 4-pipeline tiler, albeit a bit less efficient than those of ImgTec) is not good enough for the markets they are trying to push the Grantsdale chipset into. Considering the past successes Intel has had with previous integrated 3d decelerators, I find that very unlikely.

Funny how almost anything that uses a tiled memory system, immediately becomes a tiler these days.
The Extreme Graphics 1 & 2 are 'tile-based' in the same sense that the ImgTech chips are; rendering every polygon in a tile before proceeding to the next tile; I don't see why Extreme Graphics 3 would be different. If you don't believe me, look at Intel's papers describing the operation of Extreme Graphics, in particular the section on "Zone Rendering" technology.

I highlighted the quoted sentence from my former post once more; now let's have a look at what Intel says here:

http://www.intel.com/design/graphics2/zr.htm

Zone Rendering 2 Technology is a unique technology developed by Intel for drawing (rendering) 3D graphics scenes. This technology optimizes system memory usage by reducing the required memory bandwidth for the graphics engine.

The 3D graphics engine divides the frame buffer into rectangular zones and then sorts the triangles into memory by zone. The 3D graphics engine then completely processes the zone, writing the pixel data to memory before proceeding to the next zone. By processing only a single zone of the frame buffer at a time, the use of on-chip memory (cache) is highly optimized and each pixel in each scene is drawn only once. As a result, the system memory bandwidth required to render each scene is greatly reduced. This ensures the most efficient system memory usage for optimal graphics and system memory performance.

Zone Rendering 2 Technology increases performance over the original Zone Rendering Technology by enhancing architectural efficiencies in both hardware and software.

Conclusion from the whitepaper:

The Zone Rendering 2 Technology architecture featured in the next-generation Intel GMCH, the Intel 865G chipset, provides improved performance above conventional 3D architectures through efficient memory bandwidth usage and optimal utilization of the render cache. The 865G chipset drivers provide software support for this new architecture for both Direct3D and OpenGL, allowing both existing and future 3D applications using these APIs to use Zone Rendering 2 Technology without the need for modification.

So far I've seen neither better performance compared to other integrated solutions, nor even remotely comparable image quality.

Intel Extreme Graphics 2 (845G):

http://www.extremetech.com/article2/0,1558,1133106,00.asp

More specifically:

http://www.extremetech.com/article2/0,1558,1153842,00.asp
 
Yes, that graphics chipset is NOT a tiler; it just stores its framebuffer memory in a tiled format instead of one contiguous block.
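For what it's worth, "tiled framebuffer memory" on its own just means the pixel addresses are swizzled so that a small screen-space block sits in one contiguous chunk of RAM, instead of whole scanlines being contiguous. Something like this, with an illustrative 8x8 tile size and made-up function names:

#include <cstdint>
#include <cstdio>

// Linear layout: a whole scanline is contiguous in memory.
uint32_t linear_offset(uint32_t x, uint32_t y, uint32_t pitch) {
    return y * pitch + x;
}

// Tiled layout (illustrative 8x8 tiles): the 64 pixels of one small
// screen-space block are contiguous, so nearby pixels tend to land in
// the same DRAM page regardless of direction.
constexpr uint32_t TILE = 8;
uint32_t tiled_offset(uint32_t x, uint32_t y, uint32_t pitch_in_tiles) {
    uint32_t tile_x = x / TILE, tile_y = y / TILE;   // which tile
    uint32_t in_x = x % TILE, in_y = y % TILE;       // position inside it
    return (tile_y * pitch_in_tiles + tile_x) * TILE * TILE + in_y * TILE + in_x;
}

int main() {
    // Two vertically adjacent pixels: 640 apart linearly, only 8 apart tiled.
    std::printf("linear: %u vs %u\n", linear_offset(5, 3, 640), linear_offset(5, 4, 640));
    std::printf("tiled:  %u vs %u\n", tiled_offset(5, 3, 80), tiled_offset(5, 4, 80));
}

That buys DRAM page locality, but by itself says nothing about when or how often pixels get shaded.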


As for the question of whether IMGTEC have Series 5 hardware...

I find it mighty suspicious that they released (late last year) tech demos showing off PS and VS 3 hardware when they 'officially' have no hardware to run or develop it on.

They did this with the KYRO too, releasing demos for it before the card actually came out.

When is the next big PC show thing? (like E3 and such)
 
Dave B(TotalVR) said:
I find it mighty suspicious that they have released (late last year) tech demos showing off PS and VS 3 hardware when the 'officially' have no hardware to run or develop it on.

I don't find it suspicious at all.

I would think emulators would be a fairly standard practice for these development houses...particularly those that license and sell IP.
 
"I don't find it suspicious at all.
I would think emulators would be a fairly standard practice for these development houses...particularly those that license and sell IP."


So why did they release those demos then, Joe?
Also, why did they release them when they did?
 
Dave B(TotalVR) said:
Yes that graphics chipset is NOT a tiler, it just stores its framebuffer memory in a tiled format instead of just a congiguous block.
Uhm, it is. It's a tile-based (or "zone-based") deferred renderer. At least according to the description.
The 3D graphics engine divides the frame buffer into rectangular zones and then sorts the triangles into memory by zone. The 3D graphics engine then completely processes the zone, writing the pixel data to memory before proceeding to the next zone. By processing only a single zone of the frame buffer at a time, the use of on-chip memory (cache) is highly optimized and each pixel in each scene is drawn only once.
 
Yeah, the element I've never been clear on is whether it actually reduces overdraw - I'm not sure it does; I think it just tries to get the benefits of on-chip, regionalised rendering.
 
Xmas.... it divides the framebuffer into zones, not the scene. It then adjusts the order of polygons sent to the GFX card to go by zone, keeping the same page of RAM open for longer.

Good idea, but it's not the TBDR seen in PowerVR.
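To put the distinction in toy code (purely illustrative, not either vendor's actual pipeline): both schemes work a tile/zone at a time, but a PowerVR-style TBDR resolves visibility for the whole tile before shading anything, while the zone-rendering style still shades in submission order and just keeps the overdraw traffic on-chip.

#include <array>
#include <cstdint>
#include <cstdio>
#include <vector>

constexpr int TILE = 8;
// For simplicity every "triangle" covers the whole tile; only depth and color vary.
struct Tri { float z; uint32_t color; };

int main() {
    std::vector<Tri> bin = { {0.9f, 1}, {0.5f, 2}, {0.7f, 3} };   // arbitrary submission order

    // Zone-rendering style: per-pixel Z test, shade in submission order.
    std::array<float, TILE * TILE> zbuf;      zbuf.fill(1.0f);
    std::array<uint32_t, TILE * TILE> tile{};
    int shaded_zone = 0;
    for (const Tri& t : bin)
        for (int i = 0; i < TILE * TILE; ++i)
            if (t.z < zbuf[i]) { zbuf[i] = t.z; tile[i] = t.color; ++shaded_zone; }

    // TBDR style: find the front-most triangle per pixel first, then shade each pixel once.
    std::array<const Tri*, TILE * TILE> winner{};
    for (const Tri& t : bin)
        for (int i = 0; i < TILE * TILE; ++i)
            if (!winner[i] || t.z < winner[i]->z) winner[i] = &t;
    int shaded_tbdr = 0;
    for (int i = 0; i < TILE * TILE; ++i)
        if (winner[i]) { tile[i] = winner[i]->color; ++shaded_tbdr; }

    std::printf("pixels shaded: zone-style %d, TBDR-style %d\n", shaded_zone, shaded_tbdr);
}

With the submission order above, the zone-style loop ends up shading 128 pixels while the TBDR loop shades 64 - which is the overdraw question being argued here.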
 