Who will get there first?

Dave Baumann

I've been involved in these Cell / graphics processor discussions on and off for the past year or so now, trying to feel out what's going on. Given what I do with so much of my spare time (i.e. run Beyond3D), the contact I have with the IHVs, and the time I've spent watching them grow and diversify, I have a slightly different perspective than probably a number of people in here. Fundamentally, I have a fair amount of faith in the 3D IHVs; the adoption of the 3D IHVs by all of the console vendors now kinda reaffirms my position to some degree.

Now, it has always been my belief that if 3D were ready for a fundamental shift to a more general processing architecture, then the 3D vendors would already be looking at it or doing it. From the likes of ATI and NVIDIA we will not see a large development shift from one processing construct to another, but an evolution, and one that is happening already (more so, it appears, on ATI's front right now).

Some people looked to Cell to deliver a "general purpose processing" model good enough for 3D graphics processing as well as other forms of processing. The inclusion of NVIDIA in PS3 would appear to put that one on the backburner for now, but I see Panajev is already looking towards Cell 2 to provide this. However, this raises the next question:

If we are ultimately heading towards a general purpose processing model, who will get there first? Will it be Intel? Will it be Sony / CE electronics vendors? Or will it be the 3D IHVs?

Some might still scoff at the idea of the 3D IHVs being able to go that far. However, we must again look back to the fact that all the console vendors have chosen ATI and NVIDIA for their graphics processing, which both highlights their capabilities and their development vision in this area, and will strengthen both vendors' bottom lines, allowing them further growth and R&D capability. They are also both making strides into other consumer areas (mobile being one, DTV being another) and are now being seen by PC OEMs as one of the primary differentiators, one which could soon replace the CPU as the primary purchasing factor.
 
I suppose when it comes down to it, there's as little reason to bring 3D rendering in under the same hardware architecture as general processing, as there is to bring all software programming in under the same programming language.

Or in other words, none whatsoever.

Or does anyone honestly believe we'll see a fundamental shift there too? It's been decades now and we can't even get rid of Pascal, Fortran or Cobol, much less any of the more recent languages. Quite the opposite, new ones just continue to sprout up...
 
I do believe there will be a shift in architectural design towards a more general purpose approach. Whether it will be an evolution or a revolution (à la CELL, supposedly) has yet to be seen. So far we've been getting there little by little, so it is most likely going to be an evolutionary step. The thing that will also need a shift in architectural design is CPU technology, not just graphics technology. CELL is the one to look out for in terms of a design that may be configurable enough to lend itself well to graphics processing. The parallelism behind it is something Sony was probably looking into incorporating into their own PS3 GPU as a front-end.

With all that being said, I don't think conventional CPUs are on their way out either. While being parallel is ideal in the graphics world, it still pales in comparison to traditional CPUs for general purpose processing, for now. There are various tasks that would bring an architecture such as CELL to its knees. Of course, those types of scenarios may be overcome once programmers, devs and software architects get a grip on the change. I'm willing to bet that many of the people who work on CELL will be one step ahead of those who wait around for this type of processor.

Another thing that has caught my attention is the possibility of the graphics processor taking on a much bigger role within the computer. By that time it wouldn't be just a graphics processor anymore but a full-blown chip with a graphics portion (highly configurable and programmable), actually being used as a supplemental processor for the programs, code and tasks that are unsuitable for the main CPU. This might be an ideal system because, in a sense, everything would be in software mode; it's just that the hardware would be fully capable of delivering graphics the way hardwired GPUs do today.


NVIDIA may have the upper hand when it comes to future graphics architectures becoming general purpose. They have strong ties with IBM, and I've got a feeling it may be about more than just manufacturing chips for NVIDIA cards. NVIDIA might have access to CELL in ways (other than through Sony and PS3) that let it adapt the technology, or at least some of its design elements, to future GPUs.

For any CPU manufacturer to get there, it would probably be AMD sooner than Intel. I have no evidence to support this, just that I think AMD is a little more open to taking bigger risks when it comes to evolving technologies.


I hope my post has been on topic.
 
Happy New Year everybody :)

I am very doubtful the 3D IHVs will shift to a general purpose model. Unless they plan to replace Intel and AMD, they will just concentrate on their purpose-built 3D hardware instead of a general purpose processor. Sure, things might become more flexible and might have spillover effects into other areas, but I doubt they will attempt something like Cell.

Intel, on the other hand, could integrate some 3D hardware into one of their CPUs in the future. Though with the domination they're enjoying now, they're reluctant even to integrate the memory controller, so I don't see them in a rush to integrate 3D hardware.

So overall, unless they have a motive for general purpose processing, I doubt anyone will attempt it.
 
Happy New Year! :D

DaveBaumann said:
they are also both making strides into other consumer areas (mobile being one, DTV being another) and are now being seen by PC OEMs as one of the primary differentiators, one which could soon replace the CPU as the primary purchasing factor.

Unless the GPU somehow affects the result of an MS Office productivity benchmark, the GPU can't replace the CPU.

For Longhorn, MS will recommend DX9 hardware, but it's not on the requirement list. When you are on the Aero GUI in Longhorn, certain tasks are offloaded to the GPU, but it's a very limited use of GPU power. And Longhorn will live at least 4 years if it's anything like Windows XP.

So it's basically a long, long way for them to replace the CPU in the PC. Like a decade, or never.

If the graphics IHVs can build a chip on which an embedded OS can run, with low power consumption and a cheaper price than a solution with separate processors, then there may be some prospects, but I doubt it.
 
I really don't have the answer. I believe the 3D IHVs don't have the financials to get there first, so I expect some big company to acquire or ally with some of the 3D IHVs.
The nice thing is that I was just reading this presentation from the GP2 conference and found these projections by Bill Mark (the guy who designed the Cg language):

2-year predictions:

  • CPUs: multi-core trend accelerates
      • Multicore used by games and HPC
  • GPUs: more powerful streaming model
      • Scatter, gather, conditional streams, reductions, etc.
      • Start to see more success stories for GPGPU
      • But limits of stream model become apparent
  • "Dark Horses" attract increasing attention: CELL and others

6-year predictions:

  • One processing chip for PCs
      • Who makes it?
  • Heterogeneous architecture for this chip:
      • Classical CPU
      • Parallel fine-grained shared memory (pthreads)
      • Parallel stream processor (Brook)
  • Supports ray-casting visibility
  • This architecture emerges in console space first
  • This architecture meets many HPC needs
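(To make the "parallel stream processor (Brook)" item concrete: a minimal C++ analogue of the stream model, where a kernel runs independently on every stream element and a reduction folds a stream to a single value. Brook expresses these natively as kernel/reduce; everything below is just my illustrative sketch, not Brook itself.)

[code]
#include <vector>
#include <cstdio>

// "Kernel": applied independently to each element of the input
// stream(s). No cross-element dependencies, so a stream processor
// is free to evaluate every element in parallel.
float saxpy_kernel(float a, float x, float y) { return a * x + y; }

// "Reduction": folds a whole stream down to one value. Associativity
// is what lets hardware evaluate it as a tree instead of a loop.
float sum_reduce(const std::vector<float>& s) {
    float r = 0.0f;
    for (float v : s) r += v;   // sequential here; log-depth tree on a GPU
    return r;
}

int main() {
    std::vector<float> x = {1, 2, 3, 4}, y = {10, 20, 30, 40}, out(4);
    for (size_t i = 0; i < x.size(); ++i)        // this loop is the
        out[i] = saxpy_kernel(2.0f, x[i], y[i]); // runtime's job, not ours
    std::printf("sum = %f\n", sum_reduce(out));  // prints: sum = 120
    return 0;
}
[/code]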

Obviously this is the question many of us have asked a multitude of times on these boards. I believe it's still too early to answer, but well, we love to speculate too ;)

ciao,
Marco
 
one said:
DaveBaumann said:
they are also both making strides into other consumer areas (mobile being one, DTV being another) and are now being seen by PC OEMs as one of the primary differentiators, one which could soon replace the CPU as the primary purchasing factor.

Unless the GPU somehow affects the result of an MS Office productivity benchmark, the GPU can't replace the CPU.

That particular quote relates to the fact that the graphics processor is already becoming a larger part of people's purchasing decisions, as opposed to the performance/power/capabilities of the CPU. This is probably more so at the consumer end of the market (but it will take on more importance at the corporate end with Longhorn, especially in the mobile sector).

For Longhorn, MS will recommend DX9 hardware, but it's not on the requirement list. When you are on the Aero GUI in Longhorn, certain tasks are offloaded to the GPU, but it's a very limited use of GPU power. And Longhorn will live at least 4 years if it's anything like Windows XP.

It's a requirement if you want any of the enhanced graphics interface at all; anything less than DX9 and the interface defaults to the old Win2K 2D interface. With the 3D interface a lot more is moved across to the graphics hardware than you would first consider. Presently all the window drawing is done by the CPU; very little is done by the VGA engine under the current interface. With Aero, all the window drawing, font anti-aliasing etc. are fully accelerated by the 3D pipeline.
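(A rough sketch of what that hand-off means, my own illustration rather than anything from Microsoft: under a composited desktop each window renders into its own off-screen surface, and "drawing the desktop" reduces to copying/blending those surfaces onto the screen, which a DX9 part handles as trivial textured quads. The software loop below is a stand-in for what the GPU collapses into a few quad draws.)

[code]
#include <vector>
#include <cstdint>

// One off-screen surface per window; with Aero-style composition the
// GPU treats each of these as a texture mapped onto a quad.
struct Surface {
    int x, y, w, h;
    std::vector<uint32_t> argb;   // per-window pixel content
};

// Software stand-in for the compositing pass. On real hardware this
// whole loop collapses to a handful of (alpha-blended) quad draws.
void composite(std::vector<uint32_t>& fb, int fbW, int fbH,
               const std::vector<Surface>& windows) {
    for (const Surface& s : windows)            // back-to-front
        for (int row = 0; row < s.h; ++row)
            for (int col = 0; col < s.w; ++col) {
                int dx = s.x + col, dy = s.y + row;
                if (dx < 0 || dy < 0 || dx >= fbW || dy >= fbH) continue;
                fb[dy * fbW + dx] = s.argb[row * s.w + col]; // opaque copy
            }
}

int main() {
    std::vector<uint32_t> fb(1024 * 768, 0xFF000000);  // cleared desktop
    std::vector<Surface> windows = {
        {100, 100, 300, 200, std::vector<uint32_t>(300 * 200, 0xFFFFFFFF)},
    };
    composite(fb, 1024, 768, windows);
    return 0;
}
[/code]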

And Longhorn will live at least 4 years if it's anything like Windows XP

So it's basically a long, long way for them to replace the CPU in the PC. Like a decade, or never.

Well, this is not just about the PC.
 
DaveBaumann said:
If we are ultimately heading towards a general purpose processing model, who will get there first? Will it be Intel? Will it be Sony / CE electronics vendors? Or will it be the 3D IHVs?
I have to ask a clarifying question: are we headed towards a general processing model, or is that our ultimate goal? I agree we're headed towards more generality, but if we end up at complete generality, Intel, AMD and IBM are already there.

Some might still scoff at the idea of the 3D IHVs being able to go that far. However, we must again look back to the fact that all the console vendors have chosen ATI and NVIDIA for their graphics processing, which both highlights their capabilities and their development vision in this area, and will strengthen both vendors' bottom lines, allowing them further growth and R&D capability...
I should mention, in support of your point, that CPUs are looking more and more like GPUs; it's not just that GPUs are becoming more like CPUs. Dual-core, multi-core, Hyper-Threading, SIMD, etc.
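(For instance, SSE already gives the CPU a small taste of the GPU's way of working: one instruction applied to four floats at once. A minimal sketch with the standard intrinsics:)

[code]
#include <xmmintrin.h>  // SSE intrinsics
#include <cstdio>

int main() {
    // Pack four floats per register and add them with one instruction,
    // the same data-parallel style a GPU applies across whole pixels.
    __m128 a = _mm_set_ps(4.f, 3.f, 2.f, 1.f);     // note: reversed order
    __m128 b = _mm_set_ps(40.f, 30.f, 20.f, 10.f);
    __m128 c = _mm_add_ps(a, b);

    float out[4];
    _mm_storeu_ps(out, c);
    std::printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    // prints: 11 22 33 44
    return 0;
}
[/code]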

But I think that's about where it ends, personally. I can see PCs getting to the place where they all have tight ties between CPU and GPU, like the GC and X2. But merging the two? Not in the immediate future. I think the architectures will be different enough for quite some time.
 
DaveBaumann said:
Unless the GPU somehow affects the result of an MS Office productivity benchmark, the GPU can't replace the CPU.

That particular quote relates to the fact that the graphics processor is already becoming a larger part of people's purchasing decisions, as opposed to the performance/power/capabilities of the CPU. This is probably more so at the consumer end of the market (but it will take on more importance at the corporate end with Longhorn, especially in the mobile sector).

Really? I thought gamers were not the main customers of OEM PCs... even for gaming, some poll done among Half-Life 2 users, IIRC, showed the mainstream is on GeForce4 MX. Some people may choose a non-IGP PC to play games, but they are hardly mainstream. Such people may be increasing, but they are not yet significant and won't be in the near future, as there are game consoles for games. But for mobile phones things will go differently; a nice GPU can attract many customers if the mobile phone can maintain its current pace of evolution.

DaveBaumann said:
For Longhorn, MS will recommend DX9 hardware, but it's not on the requirement list. When you are on the Aero GUI in Longhorn, certain tasks are offloaded to the GPU, but it's a very limited use of GPU power. And Longhorn will live at least 4 years if it's anything like Windows XP.

It's a requirement if you want any of the enhanced graphics interface at all; anything less than DX9 and the interface defaults to the old Win2K 2D interface. With the 3D interface a lot more is moved across to the graphics hardware than you would first consider. Presently all the window drawing is done by the CPU; very little is done by the VGA engine under the current interface. With Aero, all the window drawing, font anti-aliasing etc. are fully accelerated by the 3D pipeline.

Well, I won't argue about how much it uses DX9 functions, since by the time you can actually obtain Longhorn you won't be able to buy a new card without DX9 support. GPUs are commoditised too, just like CPUs reaching a plateau, and Microsoft knows it as well, so they can finally reap the fruit of their DirectX efforts in their core OS business. Older users without DX9 cards may well not upgrade to Longhorn anyway.

DaveBaumann said:
And Longhorn will live at least 4 years if it's anything like Windows XP

So it's basically a long, long way for them to replace the CPU in the PC. Like a decade, or never.

Well, this is not just about the PC.

Yeah, that's why I added a story about a GPU-derived chip running an embedded OS at the end. Can it beat ARM?

At the head of the PowerPoint link nAo posted, the president of the Khronos Group (OpenGL ES) talks about the prospects in the embedded space, and in his view a CPU core and a graphics core will coexist in a SoC for some time.
 
Hmm, I get this from the board, but apparently the post goes through anyway:

General Error

Could not connect to smtp host : 111 : Connection refused

DEBUG MODE

Line : 111
File : /home/beyond3d/public_html/forum/includes/smtp.php
 
one said:
Really? I thought gamers were not the main customers of OEM PCs... even for gaming, some poll done among Half-Life 2 users, IIRC, showed the mainstream is on GeForce4 MX.

Gamers aren't, but that's the point I'm getting at: ordinary consumers looking for PCs are increasingly differentiating by graphics. This is a fairly recent trend by the sounds of it; I've had discussions with some board vendors about the shortage of parts, and they are saying "we don't care what chip they give us, we can sell anything", and that's driven by OEM demand to differentiate by graphics.

Take a listen to an analyst's comments (9:12 AM ET, AM Business with Kim Parlee, The 2004 Semiconductor Review; Jonathan Hykawy, analyst, Fraser Mackenzie).

http://www.robtv.com/shows/past_archive.tv?day=wed

Such people may be increasing, but they are not yet significant and won't be in the near future, as there are game consoles for games.

Like I said, they are already significant, according to the discussions I've had. And this is a shift that appears to have happened this year.

But for mobile phones things will go differently; a nice GPU can attract many customers if the mobile phone can maintain its current pace of evolution.

I was talking about mobile PCs, i.e. good graphics is going to be very important for laptops with Longhorn, as they need to accelerate the interface without dropping to a five-minute battery life.

At the head of the PowerPoint link nAo posted, the president of the Khronos Group (OpenGL ES) talks about the prospects in the embedded space, and in his view a CPU core and a graphics core will coexist in a SoC for some time.

So, I assume you think this is also true for Sony's consumer devices?
 
Who will get there first depends on who solves the software problem.

What needs to happen is more effort on programming languages that are non-sequential. Creating parallel architectures out of transistors is the easy part. With CELL, how much R&D has STI thrown at the software problem? I'm going to guess a lot, and the effort will continue, but really this area of research is the domain of Microsoft.

The future of extreme general purpose parallel architectures is going to be a playground for Microsoft's top researchers. They will purge what isn't needed in the software infrastructure and innovate where it's needed. Unless IBM pulls some software breakthrough from their Blue Gene research hat in the next few years, the game will become follow-the-leader. Bill Gates likes being a leader, it seems.
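(To illustrate why the software side is the hard part, a minimal modern-C++ sketch, nothing STI- or Microsoft-specific: even a trivial parallel sum pushes partitioning and synchronization onto the programmer, bookkeeping a sequential loop never surfaces.)

[code]
#include <numeric>
#include <thread>
#include <vector>
#include <cstdio>

int main() {
    std::vector<int> data(1000000, 1);
    const int nthreads = 4;
    std::vector<long long> partial(nthreads, 0);
    std::vector<std::thread> pool;

    // The burden a sequential language never exposes: splitting the
    // work, keeping threads from sharing state, joining them all.
    size_t chunk = data.size() / nthreads;
    for (int t = 0; t < nthreads; ++t) {
        size_t lo = t * chunk;
        size_t hi = (t == nthreads - 1) ? data.size() : lo + chunk;
        pool.emplace_back([&, lo, hi, t] {
            // each thread writes only its own slot: no locks needed
            partial[t] = std::accumulate(data.begin() + lo,
                                         data.begin() + hi, 0LL);
        });
    }
    for (auto& th : pool) th.join();

    long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
    std::printf("%lld\n", total);   // prints: 1000000
    return 0;
}
[/code]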
 
DaveBaumann said:
presently all the windows drawing is done by the CPU, very little is done by the VGA engine under the current interface.

Not sure what you mean by "current interface", but the current GDI+ used in XP *does* leverage the VGA accelerator quite heavily for many tasks, including window moving and such. That's easily proven by using that whatsitscalled program that can make any window transparent when dragging, and then downclocking the core frequency of the GPU: I notice a big difference in speed, where anything below 250ish MHz core gives lagging movement of larger windows (1440*1080 screen res). I don't know if it can AA text though (Permedia actually features hardware support for that, as you probably know already, though it may not actually be supported by the driver and/or API; typical of Matrox these days)...
 
DaveBaumann said:
But for mobile phones things will go differently; a nice GPU can attract many customers if the mobile phone can maintain its current pace of evolution.

I was talking about mobile PCs, i.e. good graphics is going to be very important for laptops with Longhorn, as they need to accelerate the interface without dropping to a five-minute battery life.

I guess the CPU is also very important in mobile PCs right now: Pentium M, the upcoming mobile Athlon 64, and the (maybe marketwise irrelevant) Efficeon. Surely the mobile GPU for PCs is increasingly getting attention, but there is only competition between 2 vendors (and I don't think Microsoft will support a GPU-derived general purpose processor on a laptop PC anytime soon :p ). In the mobile phone arena, the competition is tougher. Heck, besides the slipped-out (?) PS3 GPU contract, Toshiba is there too.
[image: roadmap.gif]


What laptop PC users want in graphics is not 3D but more media-related features, and that's true on the desktop too. I mean TiVo-like features and video acceleration, or the media features in Windows Media Center Edition. That's related to stream processing rather than 3D-specific pipes. Maybe SSE does well enough in this matter, and Intel has the say in the standardization of mobile PC platforms. In the end the Office productivity benchmark is what matters in the mobile PC too. A PC is a PC, a generic tool, not more than that.

DaveBaumann said:
So, I assume that you think Sony this is also true for consumer devices?

It's said that the Cell processor has the power to process multiple HDTV streams (read: huge), but I've not seen it advertised as a 3D graphics processor in any (non-gaming, I mean) news articles. If you look at the Cell network diagram in the initial patent, the user terminals, except for a PDA type, have something called a "Realizer", which is not Cell.
 
I think we must divide this between the mainstream market and the professional one, because (I'm almost sure) we will never see general purpose HW faster than dedicated HW. But for mainstream I think we will see it: it is a lot easier (for the consumer) to choose one thing than a whole system (remember that most people know nothing about PCs).

Anyway, I think ATI is going this way and is further along, with tech from Intrinsity and Xtensa, more products, unified shaders; I think XB2 will have more general purpose power too.
 
Dave:

I do not see CELL being the right architecture for doing software rendering without the aid of dedicated graphics processing hardware at the rasterization level. I agree with the others that CELL has never really been about graphics processing as far as the back end goes; there will always be dedicated graphics hardware backing it up, whether it's from NVIDIA or elsewhere. I know I am stating the obvious... heh.


If you look at the old SGI visualization systems like Onyx+RealityEngine, RealityEngine2, Onyx InfiniteReality, IR2 and IR3, you see the pipeline broken up into (MIPS) CPUs <> Geometry Engines, then Raster Manager and Image Generator stages.

I see CELL as being good enough to take over two of those stages, the CPU stage and the Geometry Engine stage, but graphics companies like NVIDIA and ATI are still the best at the last two stages, the Raster Manager and Image Generator. Modern PC hardware is still modeled after the SGI pipeline for the most part, aside from newer things like pixel shading.
I don't see CPUs being powerful enough, even with hundreds of GFLOPS (PS3) or a couple of TFLOPS (Cell workstations), to handle all graphics processing.
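(A toy sketch of that split, my own illustration of the classic stage boundary rather than any vendor's actual interface: the geometry stage is wide, independent FP math over vertices, a natural fit for a CELL-class chip, while the raster stage is the fixed-function-heavy per-pixel work where dedicated hardware still wins.)

[code]
#include <vector>

struct Vec3 { float x, y, z; };
struct Triangle { Vec3 v[3]; };

// CPU + Geometry Engine stages: transform and lighting. This is the
// part a CELL-class processor could plausibly absorb: wide FP math
// over fully independent vertices.
std::vector<Triangle> geometry_stage(const std::vector<Triangle>& scene) {
    std::vector<Triangle> out;
    out.reserve(scene.size());
    for (const Triangle& t : scene) {
        Triangle r = t;
        for (Vec3& v : r.v) { v.x *= 0.5f; v.y *= 0.5f; }  // stand-in transform
        out.push_back(r);
    }
    return out;
}

// Raster Manager + Image Generator stages: interpolation, Z-test,
// texture filtering per pixel -- the work where dedicated hardware
// from ATI/NVIDIA still wins by a wide margin (stubbed here).
void raster_stage(const std::vector<Triangle>& tris,
                  std::vector<unsigned>& framebuffer) {
    (void)tris;
    for (unsigned& px : framebuffer) px = 0xFF000000;  // placeholder clear
}

int main() {
    std::vector<Triangle> scene(100);     // value-initialized dummy scene
    std::vector<unsigned> fb(640 * 480);
    raster_stage(geometry_stage(scene), fb);  // CELL-side, then GPU-side
    return 0;
}
[/code]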

(Panajev I still think Cell CPU for PS3 is going to be awesomely powerful)
 
one said:
I guess the CPU is also very important in mobile PCs right now.

Yes, that's why I said graphics processing in mobile PCs will become increasingly important when Longhorn comes around.

one said:
Pentium M, the upcoming mobile Athlon 64, and the (maybe marketwise irrelevant) Efficeon. Surely the mobile GPU for PCs is increasingly getting attention, but there is only competition between 2 vendors

Realistically, that's no different for CPUs in this space.

one said:
(and I don't think Microsoft will support a GPU-derived general purpose processor on a laptop PC anytime soon :p ).

Not yet, no. I have asked whether they are looking to begin moving more tasks to the graphics processor, in essence making it a co-processor, and the rather evasive answers I got make me think they have some research down this path already.

one said:
What laptop PC users want in graphics is not 3D but more media-related features, and that's true on the desktop too. I mean TiVo-like features and video acceleration, or the media features in Windows Media Center Edition. That's related to stream processing rather than 3D-specific pipes. Maybe SSE does well enough in this matter, and Intel has the say in the standardization of mobile PC platforms. In the end the Office productivity benchmark is what matters in the mobile PC too. A PC is a PC, a generic tool, not more than that.

The graphics vendors are increasing the amount of die they dedicate to specific processing of video as well. This has always been the case and will continue to be; if it wasn't for the volume of processing already offloaded from the CPU, current HD content could not play without skipped frames on many current PCs, nor would it look as good (no motion compensation etc.). This also carries across to the mobile space: without graphics accelerators, the power drawn by the CPU to play back a movie would drain the battery before it finished, and the graphics IHVs are looking at ways of ensuring similar playback times can still be achieved when HD discs become the norm. More of the consumer processing that ATI has (XILLEON / Theater 550) will eventually be moved into the graphics cores in order to produce CE-quality movie encode/decode, and presumably NVIDIA will look to similar things as their processors evolve.
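(For a sense of what "motion compensation" actually offloads, a minimal sketch of the idea rather than any codec's real inner loop: each block of the current frame is predicted by copying a motion-shifted block from the previous frame and adding a small residual. It is this per-pixel copy-and-add churn, over every block of every frame, that the video engine soaks up.)

[code]
#include <vector>
#include <cstdint>

// Predict one 8x8 block of the current frame from the reference frame,
// shifted by a motion vector, then add the decoded residual. Assumes
// the shifted block stays inside the frame (no edge handling here).
void motion_compensate_block(const std::vector<uint8_t>& ref, int stride,
                             std::vector<uint8_t>& cur,
                             int bx, int by,          // block origin
                             int mvx, int mvy,        // motion vector
                             const int8_t residual[64]) {
    for (int y = 0; y < 8; ++y)
        for (int x = 0; x < 8; ++x) {
            int src = (by + y + mvy) * stride + (bx + x + mvx);
            int dst = (by + y) * stride + (bx + x);
            int v = ref[src] + residual[y * 8 + x];
            // clamp to valid 8-bit range, as real decoders must
            cur[dst] = static_cast<uint8_t>(v < 0 ? 0 : (v > 255 ? 255 : v));
        }
}
[/code]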

However, this is all about the PC, which is not really the thrust of the discussion.
 
even for gaming, some poll done among Half-Life 2 users, IIRC, showed the mainstream is on GeForce4 MX.

http://www.steampowered.com/status/survey.html

The GF4 MX series was #1 on the survey a while back, but that changed right before HL2 was officially released. The good news is that the most common board in this poll is the Radeon 9800 series (11.89%), and the third most common is the 9600 series (10.12%). There are still a lot of users on older boards, but it does seem that users with quality DX9 cards (i.e. not the FX5200!) are creeping into the mainstream. Obviously this is a fairly narrow poll (i.e. HL2 users), but it does show that a lot of people are buying these cards.

I cannot speak for the entire market, but about 3 years ago I flip-flopped on my philosophy for building computers. I used to get $400 CPUs and $200 video cards, and now I have reversed that, because you will see more of a difference with a $400/$200 GPU/CPU combo than with a $400/$200 CPU/GPU combo. e.g.

Cost    $400          $200
CPU     3500-3700+    3200+
GPU     6800GT        6600GT

Obviously the 6800GT/3200+ combo is going to perform better in 95% of situations compared to a 3500+/6600GT combo. With the CPU you run into so many other bottlenecks (system bus, RAM, MB chipsets, etc...) that the 9% is not the end of the world. $200 for 9% is really poor bang:buck in my book, especially considering most intensive apps, like games, are more GPU dependent. Since I build my family's and friends' systems, this is the same advice I give, as do others I know who build systems. So, I cannot speak for the entire market (and OEMs do things their own way), but I would say an emphasis on GPU performance is not uncommon and is becoming more mainstream. Only time will tell how much influence it has... my guess would be that Longhorn will be the straw that breaks the camel's back. The average consumer will see the difference, there will be much larger demand, and consumers will be more knowledgeable, in general, than they are currently. Just my opinion, of course.
 
My guess would be Intel, after purchasing ATI ;). Intel seems to be too much of a one-trick pony. They have a large market cap that could evaporate extremely quickly. The need for GPU-type processing is growing much faster than the need for CPU-type processing. Without staying in the high-volume consumer space Intel would become another Sun. So I would guess that Intel will 'win' if they buy ATI.
 