What other hardware/technology is on the horizon?

It has been confirmed by several unnamed sources that 'Pitou' is no more and M is about to go back to making business cards. The reasons for this are not known to me.

I am still hoping that, for once, I will be wrong...
 
Joe DeFuria said:
However, I do see room at the low end for at least one more player: Intel.

One more player, Intel?
AFAIK, Intel already has quite a large share of the integrated graphics market. Not that long ago they were one of the companies shipping the most graphics chips, thanks to their low-end integrated parts in loads of OEM boxes.
 
Nappe1 said:
It has been confirmed by several unnamed sources that 'Pitou' is no more and M is about to go back to making business cards. The reasons for this are not known to me.

Ummm, crap. Is it worth me finishing my Parhelia review then?? :-?

Humus said:
AFAIK, Intel already has quite a large share of the integrated graphics market.

That's an understatement. As our news post stated, in Q2 Intel held 62% of the integrated market, which equated to 27% of the overall market, making them the second-largest in terms of graphics market share and pushing ATI into third.
 
PVR-S5: Some sort of fluke will come around and prevent it from seeing the light of day....

That's my biggest fear, even more so since the recent KYRO III flop. But then again, from what I was able to gather, the latter was nothing to really write home about in terms of features.
 
Bjorn said:
I doubt that it's as easy as just "hey, why don't you engineers make a high end card instead of a low end/main stream one. make sure that it's ready by .... ".

Perhaps not, but it certainly seems that management is the one that controls the situation. I'm drawing a parallel to the software engineering industry here. The following is a genuine, true story from my own experience.

CTO: "How's the subsystem X we just picked up from the acquisition?"
Me: "One giant collection of failed science projects. Complete total rubbish. Best to throw it all out the window. Start from the ground up with a design. Not a redesign as they never designed it from the first place. It was run amok for 2 years with no direction, vision or plan."
CTO: "It can't be that bad. Just bandage it up for now. Don't put any effort towards a new design. We'll fix it later."
Me: "We should bite the bullet now before its too late. It's like building a bookcase out of mashed potatoes. Are you sure you only want bandaids?"
CTO: "Yes."
Me: "That's a mistake."
CTO: "Just do it."
Me goes off muttering something about mawfs [management worthless-fs] being incompetent...

[Every single day for the course of 6 months...]
Me: "Now that we may have some breathing time, can we design this subsystem?"
Management: "No, just add the bells & whistles they want."
Me: "We need to design this subsystem. It's less than worthless. It'll cost us more to 'fix it up' then to start from scratch with a good design."
Management: "You heard the CTO, just bandage current system up for now."
Me: "That's very short-sited. That's not going to help us out for the long run. It's causing us more harm then good."
Management: "It can't be that bad if the CTO said to just bandage it up."
Me: "You have no idea..."

[6 months later in a 5 hour demotivating and demoralizing meeting with engineering...]
CTO: "Has anyone tried to use the subsystem X? It's absolutely horrible. We need to fix this immediately. We needed this fixed yesterday. We have to design it. It's the one thing preventing us from having wide-spread sales. Why hasn't anyone done anything about this?"
Me: "With all due respect, You and the rest of management told us not to concern ourselves with it. You said you'd give us the go-ahead to fix it later."

[1 week later...]
CTO: "We're in a financial crisis. Our revenue streams are significantly less than we projected. We haven't closed out last round of venture capitol. We're asking for everyone to go on a two week unpaid leave of absence while we sort things out with the VCs."

[2 weeks later...]
The company lays off 100+ people and downsizes to about 30 people total, 25 of whom are upper-management execs or VPs, plus 2 administrative assistants and 3 software engineers.

Granted, there was a whole series of bad decisions that led to the company's fuxup, such as not doing its due diligence before the acquisition, or the proper market research, but not fixing something before it's too late was a major one. I find that very similar to the hardware industry. If you don't have a proper solution for something beyond the near term, then you are doomed to die.

I imagine the engineers at those companies [PVR-S5/Matrox] do indeed want to build an ass-kicker, but upper management will have nothing to do with it. When that happens, there's no way to pull off a high-end card.

--|BRiT|
 
I imagine the engineers at those companies [PVR-S5/Matrox] do indeed want to build an ass-kicker, but upper management will have nothing to do with it. When that happens, there's no way to pull off a high-end card.

I seriously doubt that the upper management at, e.g., Matrox doesn't want the engineers to make an "ass-kicker" of a card. But it has to be released in time to be an "ass-kicker".

Take the Parhelia as an example. It sure looks like a serious attempt at the high end if you look at the specs. The problem is that it just doesn't perform. At least not as well as it should with regard to its specifications.
Then the R9700 was released, which made things even worse.

I'm not putting all the blame on the engineers, far from it.
But I sure don't think that it's only the managers' fault either.
(Although I do when it comes to the project that I'm currently working on :))
 
Ha. I saw somebody mention "Parhelia2 (Pitou!)".
I had no idea that was the codename; I thought it was the spitting onomatopoeia, more commonly written as 'ptooie!', as in showing disgust.
 
DaveBaumann said:
Nappe1 said:
it has been confirmed by several unnamed sources that 'Pitou' is no more and M is about to go back making business cards. Reasons for this are not in my knowledge.

Ummm, crap. Is it worth me finishing my Parhelia review then?? :-?

It's up to you... I am going to buy my Parhelia retail in any case very soon.
 
Fuz - I agree with everything you said in your post at the start of this thread (I have not read the rest of the thread yet).

I think, though, that if Creative/3DLabs comes back with a consumer card in 2003, it will be a refresh of the P10 with full DX9 or DX9.1 support. A P15 or P20. Something with 8 pipes and a crossbar to take advantage of the 256-bit bus the P10 already has.

I think you're right on though.
 
Does anyone have a copy of the PDF of IMG's shareholder meeting presentation? I can't find it anymore... it had a nice list of potential and actual licensees for all their different designs.
 
What about the revision of the Parhelia? I'll be very happy if the revised card doesn't have the famous banding problems.
 
MfA said:
Does anyone have a copy of the PDF of IMG's shareholder meeting presentation? I can't find it anymore... it had a nice list of potential and actual licensees for all their different designs.

That's probably because they replaced it with the new interim report/presentation. Don't bother, it'll tell you even less than before.

I wonder why Metcalfe even made an announcement about their future plans...... :rolleyes:
 
DaveBaumann said:
That's an understatement. As our news post stated, in Q2 Intel held 62% of the integrated market, which equated to 27% of the overall market, making them the second-largest in terms of graphics market share and pushing ATI into third.

How long do you ppl think it will be before Intel buys either Nvidia or ATI?
Call it a merger if you will. My guess is that Intel and ATI merge, and AMD and Nvidia merge. If any merger did take place between these companies, that would definitely mean the end for the smaller players out there.

I can't recall who said it, but one of the big-shot developers was asked where he thought the graphics industry was headed, and he thought that within 10 years the graphics processor would be integrated with the CPU.

This seems like a logical step to me, seeing that integration is happening all around us.
 
Fuz said:
I can't recall who said it, but one of the big-shot developers was asked where he thought the graphics industry was headed, and he thought that within 10 years the graphics processor would be integrated with the CPU.

This seems like a logical step to me, seeing that integration is happening all around us.

In an article in Wired, Nvidia's CEO said he thought it would be the other way around, with the CPU being integrated into the GPU. For the low end the quoted developer might be correct, but I wonder if that is really any cheaper than having graphics in the chipset.

I'll read the review if you ever finish it, Dave. I had the chance to use one the other day and I really like how the drivers support custom settings per program, so one game can always force AA on while another forces it off.
 
Here is what I think: we will get a processing unit that can act as a general processor, a vertex shader, or a pixel shader as needed. This unit will have a full CPU-like instruction set extended with vector operations, fast complex scalar operations (like vertex shaders have today), and dedicated texture lookup instructions. It will operate using some sort of fine-grained multithreading, changing the active thread every clock cycle, picking from a pool of 10+ threads (in vertex shader mode: 1 thread = 1 vertex; in pixel shader mode: 1 thread = 1 pixel; there will be no SMT or OOO, as these are too expensive once the jump to this kind of multithreading has been made). It will have a mechanism for fast barrier/rendezvous between multiple threads/processing units.

The processor as a whole will just consist of an array of such processing units (each of which should be rather cheap to make), plus a bunch of glue logic for the multithreading and the 3d graphics operation modes.
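
To make the idea concrete, here is a toy C sketch of that kind of fine-grained ("barrel") multithreading: one simple in-order unit steps through a fixed pool of thread contexts, one per clock, so a long texture fetch only idles that one thread's own turns. Everything in it (pool size, latencies, shader length) is made up for illustration, not any real design.

Code:
/* Toy sketch of fine-grained multithreading -- not any real design.
 * A single in-order unit visits a fixed pool of thread contexts
 * round-robin, one per clock ("barrel" scheduling), so a pending
 * texture fetch only idles that thread's own turns. */
#include <stdio.h>

#define NUM_THREADS 12      /* pool of 10+ threads: one pixel/vertex each   */
#define TEX_LATENCY 20      /* pretend a texture lookup takes 20 cycles     */
#define SHADER_LEN  16      /* pretend the shader is 16 instructions long   */

typedef struct {
    int pc;                 /* next instruction to "execute"                */
    int stall_until;        /* cycle at which a pending texture fetch lands */
    int done;
} ThreadCtx;

int main(void)
{
    ThreadCtx pool[NUM_THREADS] = {{0}};
    int cycle = 0, finished = 0;

    while (finished < NUM_THREADS) {
        /* barrel scheduling: the slot owner is simply cycle % NUM_THREADS */
        ThreadCtx *t = &pool[cycle % NUM_THREADS];

        if (!t->done && cycle >= t->stall_until) {
            t->pc++;                              /* run one instruction     */
            if (t->pc % 4 == 0)                   /* every 4th op: tex fetch */
                t->stall_until = cycle + TEX_LATENCY;
            if (t->pc >= SHADER_LEN) {
                t->done = 1;
                finished++;
            }
        }
        cycle++;
    }
    printf("%d threads finished their shaders in %d cycles\n",
           NUM_THREADS, cycle);
    return 0;
}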
 
Fuz, IIRC the developer was Brian Hook, of id Software and 3dfx fame. I think this was in his "Ask Hook" columns on VoodooExtreme. I think the gist of what he said was that the industry will go through a continual cycle:

Step 1) A new breakthrough of a feature is designed with specially dedicated silicon.
Step 2) Eventually general purpose CPUs become powerful enough that a software implementation of a feature is fast enough/faster/more economical than dedicated silicon.
Step 3) Go to step 1.

I couldn't find any links, and couldn't locate it in Google's cache, so if someone has some hard/solid evidence, please post.

--|BRiT|
 
To me, it looks a bit like the 3d graphics industry is running that process in reverse: at times it introduces features that at first can be run adequately in software (T&L and vertex shaders in particular come to mind), and then the 3d hardware evolves faster than CPUs to the point where the feature can no longer be done fast enough on the CPU...

And the general-purpose CPU power needed to do what, say, a Radeon 9700 Pro does at the same level of performance and image quality is mind-boggling. Consider trilinear interpolation: with a hand-optimized software loop, you may be able to texture 1 pixel every 25 or so clocks on a CPU, whereas the 9700 Pro's 8 pixel pipes each filter a pixel per clock. That implies you need 25*8 = 200 times the clock speed of the 9700 Pro, or roughly 65 GHz, to match it on this task alone. And there is no indication that 9700 Pro-level performance will still be 'enough' by the time CPUs do hit 65 GHz.
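
For what it's worth, here is that back-of-envelope calculation written out; the 8-pipe, ~325 MHz figures for the 9700 Pro and the 25-cycle software loop are just the assumptions from the paragraph above, not measurements.

Code:
/* Back-of-envelope check of the numbers above.  The 8 pipes and ~325 MHz
 * core clock for the 9700 Pro and the 25-cycle software loop are the
 * assumptions from the post, not measurements. */
#include <stdio.h>

int main(void)
{
    const double gpu_clock_hz         = 325e6; /* Radeon 9700 Pro core clock */
    const int    gpu_pipes            = 8;     /* trilinear pixels per clock */
    const int    cpu_cycles_per_pixel = 25;    /* hand-optimized CPU loop    */

    double gpu_pixels_per_sec = gpu_clock_hz * gpu_pipes;
    double cpu_hz_needed      = gpu_pixels_per_sec * cpu_cycles_per_pixel;

    printf("GPU filters %.1f Gpixels/s; a CPU would need ~%.0f GHz to match\n",
           gpu_pixels_per_sec / 1e9, cpu_hz_needed / 1e9);
    return 0;
}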
 
3dcgi said:
In an article in Wired, Nvidia's CEO said he thought it would be the other way around, with the CPU being integrated into the GPU. For the low end the quoted developer might be correct, but I wonder if that is really any cheaper than having graphics in the chipset.

I think that eventually, this will be the case. A few things:

1. It appears that, currently, quantum technologies probably won't be useful for general processing. At least, not for quite a while (they will be infinitely useful for specialized processing scenarios...).

2. Processor speeds have been outpacing bus speeds for a long time. This is only natural, and will continue on into the future. Electrodynamics essentially forces this to be true. As this continues, the bus speeds will become the limiting factor.

3. When the bus speeds become enough of a limiting factor, it will be beneficial, in terms of performance, to integrate more and more onto a single die. Eventually the fastest desktop computers, for realtime applications, will all use system-on-a-chip designs. It'll be a while, though.

What will be really interesting is how this is dealt with in the future. I really, really doubt that either nVidia or ATI will merge with Intel or AMD. Since ATI and nVidia are both fabless, we may instead just see "strategic partnerships" with the companies, where, at first, nVidia and ATI will still sell their own graphics chips, but eventually their individual graphics chip sales will be relegated to niche markets, with most of their money coming from royalties. At least, that's what I see happening.
 