Could this be the worst GPU article ever written?

Can we stay on topic please?
BTW... Vista is not that bad per se, but when you have been using Mac OS X for over a year, Vista feels like a cut-down/more resource-hungry/slower version of the original thing (and it's all Steve Jobs's fault if I'm becoming an Apple fanboy)
Please stay on topic now!! :)
 
Aye, the Vista Start Search is something I could not do without anymore. Works wonderfully well and just saves me tons of time on a system with a lot of applications installed. Especially ones that I only use once a week or once a month.

And with multiple monitors there's just no comparison. XP is about 10x slower.

Er, what was the topic again? :)

Regards,
SB
 
I don't think we'll find ourselves in a position any time soon where we've got more performance than needed. Graphics gurus keep figuring out new reasons to render the whole scene a couple more times per frame, just to get the lighting looking a bit 'better' again, even though the general population may not see any difference. Add the insights brought by the real-time ray tracing vs rasterization story, about how real-time indirect lighting is still a long way from solved, and I'd say we can 'waste' :) a whole lot more rendering power than what's available today. In the end, as long as offline rendering can do more than real time, it should be obvious where to use the added muscle.

I see the end of the road coming sooner on the feature side. We did indeed just get Shader Model 4, but what's in there over SM3 that will dramatically change the way real-time rendered scenes look? You can always add features, but as long as the end result - the scene on screen - doesn't really look better, what's the point (except for selling some new graphics cards)?
 
Doesn't SM4.0 offer much easier programming compared to SM3?

Not counting the Geometry Shader, which wasn't in SM3...
 
The programming model for 10 is arguably more complex to grasp (you have to count the GS at some point), but there are certain things that make it easier to work with once you do have a good grasp. IMO, of course.
 
There is still room to improve 3D APIs. We may reach an end on the shader part when we can feed any C/C++ (or any other language) program to a GPU. But there are other things that can be done.

We still have no programmable frame buffer blending or depth/stencil operations. On the other end of the pipeline we may see some kind of programmable object shader or better support for object-based culling.
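
To make the "feed a C/C++ program to a GPU" point concrete, this is roughly what it already looks like on the GPGPU side. A minimal CUDA sketch of my own (a toy SAXPY, not lifted from any SDK sample; all the names and sizes here are just made up for illustration):

// saxpy.cu -- a plain C-style function body executed on the GPU, one thread per element.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host data.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device copies.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);   // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

The kernel itself is just C; the part of the pipeline that still has no such programmability is exactly the blending/depth/stencil end I mentioned above.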
 
The programming model for 10 is arguably more complex to grasp (you have to count the GS at some point), but there are certain things that make it easier to work with once you do have a good grasp. IMO, of course.

Hell, I forgot to include the learning curve in what I said... That always makes a difference.
 
On the other end of the pipeline we may see some kind of programmable object shader
Agreed. I even named one of them (there'll be more than one new shader stage in 11, I'll bet that right now :devilish: ) in a recent article, for the eagle eyed.
 
I can't decide what hurts me more: this or Jon Stokes on raytracing.
<strike>That's pretty uncalled-for.</strike>

Edit: I've thought about it some more, and here's what I have to say about this. I've never been comfortable on the GPU beat because it's not really my area. CPUs are my area, and the reason I was able to do a good job with the PS2 is that the Emotion Engine was just a CPU (albeit a weird one).

Thankfully, the CPU space got interesting again later in 2007 with the 45nm transition, and there are once again some interesting hardware stories there, so I hardly spend any time on GPU coverage at this point. As such, you'll be spared my graphics commentary until one or more of the following things happen: GPGPU starts to look interesting again (it won't); Larrabee launches and we have something concrete to argue over; or info on the next generation of consoles emerges. At that point, I'll handle the GPU beat like I always have: present a mix of informed speculation and sourced reporting to the best of my ability, while openly marking speculation as such and inviting anyone who knows or who thinks they know more about the topic than I do to weigh in via email or the attached discussion thread.

On a related note, if one of the B3D staffers, Tim Murray included, thinks they can do a better job than me at covering GPUs, GPGPU, and HPC for Ars Technica, then send me an email because we're hiring in 2008.

Oh, and one other thing. I've seen some posts here before to the effect that I wrote X, Y, or Z about some GPU or console topic just to drum up page views. Would that it were so. In reality, the traffic generated by posts containing technical information about, speculation on, and analysis of graphics hardware (or CPU hardware, for that matter) is pretty minuscule at a site like Ars. A timely story with an on-target angle and solid analysis of a law- or policy-related topic will draw about ten times the traffic of a blockbuster post on Larrabee. So no, I don't write hardware posts for traffic. I write hardware posts for a much more important reason than "to drum up page views": I write them because they're important to me personally and because they're appreciated by a relatively small but extremely key segment (in terms of brand, identity, focus, etc.) of the Ars audience.
 
On a related note, if one of the B3D staffers, Tim Murray included, thinks they can do a better job than me at covering GPUs, GPGPU, and HPC for Ars Technica, then send me an email because we're hiring in 2008.
I'd count Tim out of the running; I think his upcoming job with nVidia might be a conflict of interest there. ;)
 
<strike>That's pretty uncalled-for.</strike>

Edit: I've thought about it some more, and here's what I have to say about this. I've never been comfortable on the GPU beat because it's not really my area. CPUs are my area, and the reason I was able to do a good job with the PS2 is that the Emotion Engine was just a CPU (albeit a weird one).
I don't think anyone questions that you do a good job with CPUs. I certainly enjoy all your CPU articles and find them quite informative, but I don't think you have enough of a background in graphics (or at least, you didn't, but to be fair that was some months ago) to be able to comment meaningfully on raytracing versus rasterization.

As such, you'll be spared my graphics commentary until one or more of the following things happen: GPGPU starts to look interesting again (it won't);
Well, that depends on where you look, doesn't it? GPGPU is still young. AMD just introduced its own SDK, NV either just reached or is about to reach the point where the dev environment is friendly enough for programmers who aren't in the ultra-high-end to look at it, and now we've got GPGPU support (for CTM from AMD and CUDA from NV) in the main driver revisions. Saying that GPGPU is just not going to look interesting in the next few months is probably very silly. All it takes is one killer app to turn GPGPU from "interesting idea, nobody will ever use it" to "everyone wants to buy a G92/RV670 because this makes Photoshop three times as fast" or whatever.
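
To put a concrete face on the "Photoshop three times as fast" hypothetical: the work involved is embarrassingly parallel per-pixel stuff. A toy CUDA sketch of my own (the kernel name, d_pixels, and the 1.5 gain are all made up; this isn't from Adobe or either IHV's SDK):

// Hypothetical per-pixel brightness kernel -- the sort of data-parallel image
// operation GPGPU handles well. Toy example only.
__global__ void brighten(unsigned char *pixels, int width, int height, float gain)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height)
        return;

    int idx = (y * width + x) * 4;        // RGBA, 8 bits per channel
    for (int c = 0; c < 3; ++c) {         // scale R, G, B; leave alpha alone
        float v = pixels[idx + c] * gain;
        pixels[idx + c] = (unsigned char)(v > 255.0f ? 255.0f : v);
    }
}

// Host-side launch, assuming the image has already been copied to d_pixels on the GPU:
//   dim3 block(16, 16);
//   dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
//   brighten<<<grid, block>>>(d_pixels, width, height, 1.5f);

None of that is hard to write; what's been missing is the killer app wrapped around it and the driver/SDK plumbing being on everyone's machine by default, which is exactly what's changing now.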

As far as Larrabee goes, do you actually think anyone will care about it as a GPU, at least for the first generation? Let's say it's insanely great, the true heir to R300 because it makes everything else look slow and ugly. It won't matter much. First, the shipping drivers will be buggy. Application developers don't always follow the DX specs, and Intel will have to scramble to put in app-specific workarounds just like NV and AMD have been doing for years. But that's only part of the problem! Then they need developers to actually develop and test using Larrabee (running stable drivers) as well as AMD and NV chips. We saw this with R300, and we'll see it with any serious competitor that appears. The second- or third-generation Larrabee is when it could start having a real impact on the graphics market. And realtime ray tracing? By the time it could be in serious competition for use, every card from every IHV will be able to do it and there will be some well-defined portion of Direct3D for it (and the Khronos group will be arguing over some minute portion of it before it can be included in OGL... ;) )

So then the real question is how well it will do in terms of GPGPU--I don't know. Depends on the language it uses (I assume that will be Ct, but nobody's actually said that, and there's always TBB--sigh, why does Intel introduce twenty different things to solve the same problem, I don't want to have to know OpenMP, TBB, and Ct simultaneously), the quality of the compilers, and the ease of the development environment. I don't think that these are just going to appear fully-developed and wonderful upon the release of the hardware. CUDA and the AMD SCSDK will have had significant time to mature and potentially be much friendlier to developers. I honestly think we'll be talking about Larrabee as a GPGPU chip in terms of what Intel actually provides (say, what the Havok acquisition gets them) instead of what third parties actually write.

So, if I've come across as a jerk in previous postings, sorry--not my real intention. I just think you are missing numerous considerations, both technological and historical, in your analyses of Larrabee (at least as a GPU) and realtime ray tracing.

(insert standard note about "these are only my opinions and not those of anyone else" here. I've tried to avoid public speculation or long posts since I accepted the job, partially for fear of "NV shill" accusations or shit like that cluttering up the forums, but I don't start until June and nothing I've signed nor anyone I will eventually work for has said not to post until then, so I will make an exception for this. but yeah, this is basically "I think Hannibal is radically underestimating the effects of GPGPU over the next few years" more than anything specific. is that enough of a disclaimer? I sure hope so!)
 
CPUs are my area, and the reason I was able to do a good job with the PS2 is that the Emotion Engine was just a CPU (albeit a weird one).

That's my favorite Ars article, and the first one I ever read. I haven't taken a look at the site (other than the gaming blog) in a while... Gonna have to go look at it.
 
Can we stay on topic please?
BTW... Vista is not that bad per se, but when you have been using Mac OS X for over a year, Vista feels like a cut-down/more resource-hungry/slower version of the original thing (and it's all Steve Jobs's fault if I'm becoming an Apple fanboy)
Please stay on topic now!! :)

Heh...;) You're kidding, right? I'll bet that if OSX supported even a small fraction of the hardware and software Vista supports (in fact, OSX *does* support a small fraction of what Vista supports--I'm just being cute), it probably wouldn't run at all...:)

I have to fall on the side of people using and liking Vista. I am still running XP at work, and man does it feel and run kludgy by comparison. But that's not entirely fair because the hardware I use sometimes at work is much inferior to what I use at home...;)

All this strange, almost bizarre anti-Vista sentiment reminds me *exactly* of the reception the "pros" gave Windows 95 when it shipped. I thought right off the bat that Win95 was way better than Win3.1, yet I heard and read so many howls of protest that "Win95 was ruining Microsoft's OS" from these people that I can still remember them (are you reading this, Fred Langa?)...;) The problem was that these people were in love with the way they had "mastered" Win3.1, and figured they knew "all" the "tips and tricks" on how to make it run better and better--and then when Microsoft rudely shipped Win95 all these "pros" had to go back to the drawing board because everything was different and everything had changed--and oh, boy--how they hated that...;)

By 1998-99, though, Win9x was thoroughly accepted and these people--having learned a few new tricks and enlarged their horizons--no longer had any objections to rightfully consigning Win3.1 to the ash-heap of history. I predict that the same thing will happen with Vista. Already, I'm quite, quite sure and positive that many, many more people are using Vista and enjoying it than the "press" ever lets on.

Based on my experience over most of the last year with both Vista x86 and Vista x64, I'd have to say that the complainers are complaining about change itself far more than they are about Vista--some of them just haven't realized it yet...;)

I guess I and a few others here had a leg up on using Vista from the start--we *expected* change and differences and were eager to embrace them, whereas I suppose the complainers were expecting Vista to be just a little warmed-over XP...:D

Ah, well. Said my piece...;)
 
I actually hadn't read that far into the thread before making that last post, so I'll try and make up for it by briefly posting my own point of view on a couple of things:

(1) Real-Time ray tracing, etc., for 3d gaming

This is based on Intel-sponsored "research" that Intel generates in order to publicize a new generation of cpu. I first saw this concept written about (with Intel footing the bill for the "research") years ago during the PIII/IV era, and now I see it emerging yet again along with the Core 2 architecture. Basically, I find the idea as funny now as I found it then...;)

Yea, yea, yea--everybody and his mother knows that Intel is cpu-centric and that everything it does revolves around the cpu. It's no surprise, therefore, to see Intel take this dusty old concept out of the closet and parade it around every time the company wants to promote a new cpu. What surprises me, though, very truthfully, is that anyone with any skill or experience in actually doing ray-tracing with cpus might ever take something like this *seriously*...;) That was the real surprise for me. The rest of it is quite tedious and utterly predictable.

Rasterizers exist simply because they can do what cpus cannot. They are designed and engineered specifically to do those things very well that cpus by their nature can do only very poorly. The two complement each other--they do not compete with each other. From whence stemmeth this simple-minded "either-or" concept when it comes to gpus and cpus? Beats me. I mean, I can understand the promotional angle from Intel's point of view--that's easy enough. What surprises me, though, are the people who don't really seem to see Intel's long shadow falling over the entire subject..!

(2) Larabee

(Apologies if I misspelled the acronym.) The legend swirling around this mythical, non-existent product is monumental. I cannot recall ever having seen so much written about so little...;) That about sums up my views on "Larabee" (sp?)

What I don't understand is how perfectly intelligent, sane people can understand how and why many "product roadmaps" of the past (pick the company, it doesn't matter) which promised unbelievable glories to come were in the end discarded like so much smelly chaff, yet on other topics, for instance "Larabee," their eyes are so full of stars that they cannot even see straight...;)

I think we need to coin a new term to describe this phenomenon. A term like "PR blind" comes to mind. What's the old saying, "Dazzle 'em with BS..."...?
 