SEGA Developing MODEL 4 In Conjunction With Saarland University?

Here is more information on the Saarland DRPU (Dynamic Ray Processing
Unit).

http://graphics.cs.uni-sb.de/Publications/svenwoop-drpu-thesis-V1.1.pdf

A working FPGA prototype implementation specified in the developed
hardware description language HWML is presented, which achieves
performance levels comparable to commodity CPUs even though clocked at
a 50 times lower frequency of 66 MHz. The prototype is mapped to a 130nm
CMOS ASIC process that allows precise post layout performance estimates.
These results are then extrapolated to a 90nm version with similar hardware
complexity to current GPUs. It shows that with a similar amount of hardware
resources frame-rates of 80 to 280 frames per second would be possible
even with complex shading at 1024x768 resolution. This would form a good
basis for game play and other real-time applications.
 
Saarland had real time raytracing on an FPGA years ago.
A company called Caustic makes a real-time RT card (also an FPGA).
IBM have also shown real time software raytracing on 3 PS3s.

Real-time raytracing for games is not that far off; that someone might be working on it should be no surprise.
I can do realtime raytracing on a 7 MHz Amiga too. Of course, it's just a cube, but it's realtime, right? ;)

Realtime raytracing at game quality with game effects is not happening any time soon. It has no advantages for what we're wanting to do at the moment in graphics. You only need look at the realities of the industry, that Saarland aren't selling these processors to graphics powerhouses to render their movies in realtime, to know it doesn't work!

Here is more information on the Saarland DRPU (Dynamic Ray Processing
Unit).
I see you quoted the optimistic 'fund our research because it's really promising' line. Now look through the paper. Page 133 shows the example scenes. Look how simplistic they are, how lacking in eye-candy. Then see the table on page 144. Many framerates are <20 fps. The game example, UT2004 assets, runs at 6 fps. Yes, it's only a small 66 MHz processor, but it's also rendering a 512x386 image without any AA, and the scenes do not represent current game complexity to any degree.

They do not have a real-time (30fps) raytracing processor that will produce HD scenes of even current generation quality, let alone some amazing next-gen super-tech that's going to make existing GPUs look obsolete. Anyone believing as much just doesn't understand the issues of raytracing graphics and the limits of silicon-based processors.
 
I can do realtime raytracing on a 7 MHz Amiga too. Of course, it's just a cube, but it's realtime, right? ;)

Realtime raytracing at game quality with game effects is not happening any time soon. It has no advantages for what we're wanting to do at the moment in graphics. You only need look at the realities of the industry, that Saarland aren't selling these processors to graphics powerhouses to render their movies in realtime, to know it doesn't work!

I see you quoted the optimistic 'fund our research because it's really promising' line. Now look through the paper. Page 133 shows the example scenes. Look how simplistic they are, how lacking in eye-candy. Then see the table on page 144. Many framerates are <20 fps. The game example, UT2004 assets, runs at 6 fps. Yes, it's only a small 66 MHz processor, but it's also rendering a 512x386 image without any AA, and the scenes do not represent current game complexity to any degree.

They do not have a real-time (30fps) raytracing processor that will produce HD scenes of even current generation quality, let alone some amazing next-gen super-tech that's going to make existing GPUs look obsolete. Anyone believing as much just doesn't understand the issues of raytracing graphics and the limits of silicon-based processors.

130nm 66MHz.

Now let's imagine multiple DRPUs @ 32nm, 600MHz+ each

I'd love to see what that would be capable of.

Texan, are you responsible for the FGN site? We SEGA zealots can smell our own

...
 
130nm 66MHz.

Now let's imagine multiple DRPUs @ 32nm, 600MHz+ each

I'd love to see what that would be capable of.
Far less than the equivalent number of transistors in an ATi or nVidia GPU :p

If you haven't already done so, take a look at Deano's short article on the pitfalls of raytracing. Then you can read the opening posts of the discussion. It isn't going to happen. There's no point looking for possibilities here. You may just as well say 'SEGA are going to introduce a revolutionary new CPU that can process a petaflop a second'. The laws of physics prevent it. The laws that govern computer graphics mean a realtime raytracer producing good results comparable to a current GPU is not going to happen with the silicon budgets we have. It's not going to happen with a room full of processors. The numbers you are clinging to as apparent evidence of the possibility are for simplified scenes that will not scale linearly. Stick Halo or KZ2 or whatever on that raytracing hardware and it'll keel over as well as looking like a mangy dog by comparison. Unless 1980s raytraced graphics have a particular aesthetic appeal to you ;)
 
Here are some more images I think the new board may be capable of producing in realtime -

[attached images: sub13.jpg, sub10.jpg, sub16.jpg, sub29.jpg, 2005_sub7.jpg, ART_Teaser03.jpg]

I could have searched for and posted some supremely high-end ray-traced images; however, I wanted to be realistic for first-generation hardware.

Still you have to agree even these are head and shoulders beyond what rasterization will ever be capable of.
 
Real-time raytracing isn't going to happen. Even if it could, that same level of performance could be directed into a standard scanline renderer and produce far better-looking results! Ray-tracing is a niche technology with niche applications. For high-performance realtime graphics it's a long way from being a suitable alternative to traditional methods.

Probably worth reading this again Texan...

I like the way the source website's author refers to himself in the third person (a la Doctor Doom).
 
Here are some more images I think the new board may be capable of producing in realtime -

I could have searched for and posted some supremely high-end ray-traced images; however, I wanted to be realistic for first-generation hardware.

Still you have to agree even these are head and shoulders beyond what rasterization will ever be capable of.

Again, you live in fantasy land. None of these are from any arcade machine or console that currently exists or is likely to in the next generation of consoles AND you admit to speculating without fact or evidence to back it up.
 
None of these are from any arcade machine or console that currently exists or is likely to in the next generation of consoles AND you admit to speculating without fact or evidence to back it up.

The images that I have posted thus far are low-end. For first-generation hardware, those are the sort of visuals that one must expect.

I don't expect to see something more advanced, such as real-time global illumination, until, say, a Model 5.

You definitely won't be seeing anything of this sort on the MODEL 4 -

[attached images: one forum attachment, v2s1.jpg]


3D of this quality, I admit, is far off into the future.

Mod : Excesses of images removed.
 
Once again, where is the evidence? Is there any point to this discussion when you're ignoring everything anyone says in favour of a completely unsubstantiated report from a website where the writer can't even be bothered to pony up $20 for a year's worth of hosting?
 
Still you have to agree even these are head and shoulders beyond what rasterization will ever be capable of.

Technically maybe, but they look pretty artificial and lame IMHO.

And anyway, we had reflective floors in Unreal 1 already, the lighting can be baked and in higher quality, and so on... most of the fake techniques are already there and work perfectly too. So why bother with a new architecture to make it realtime?


The fun stuff, like glossy reflections, more complex lighting for moving objects, more complex shaders, well, that stuff on the other hand is far too taxing for realtime for at least another decade.
 
Here are some more images I think the new board may be capable of producing in realtime -
Based on what exactly?! Wishful thinking, that is all. You're providing absolutely no sane argument to support your view. You've read 'ray tracing' and looked at a few numbers, and now have the audacity to pick a few random internet CG images and state...
Still you have to agree even these are head and shoulders beyond what rasterization will ever be capable of.
...as if this is what SEGA will be unveiling shortly.

Hogwash!

It's gobbledegook! When are you actually going to start discussing the technical hurdles and presenting some factual and relevant information to support the notion that raytracing will surpass rasterization in the next few years?

Here's another paper for you to ignore, from Saarland University (PhD thesis), dated June 2007, looking at their DRPU processor. Now you'll have to read past the sensationalist remarks in the opening and conclusion, the pie-in-the-sky claims of 80-280 fps rendering of scenes with complex shading at 1024x768, and actually get to the meat.

Page 132 shows images of the demo scenes being tested. These are the scenes getting single-digit fps at 1024x768 with 0xAA on their 66 MHz processor. There are no secondary rays. There is no global illumination. There is one 'game like' scene consisting of a UT2004 arena with 2 skeleton objects and one morphing object, a total of 85,000 triangles for the scene. There is also a car model, a VW Beetle, which has a Cell comparison where the 'game scene' doesn't.

They have provided their own extrapolations for us to consider. On page 159 is described a multicore DRPU chip, 196 mm^2 at 130nm, 186 mm^2 at 400 MHz (their own clock limit). The following page has a table of results comparing their hypothetical processor to Cell. The VW Beetle, rendered at 40 fps on Cell, would be rendered at 160 fps on their equivalently sized DRPU. Without shadows or secondary rays (the whole flippin' point of RT!). 4x Cell's performance. Now look at that 1024x768 screenshot and tell me, yes, this is the future of gaming graphics! For 720p with antialiasing, necessarily supersampling with raytracing, you'd be increasing the sample count by a factor of two to three (2xAA). You'll need a load more light sources, a load more models, far more changes between scenes screwing with the spatial optimizations, complex shaders (a bit of normal mapping would be nice...), etc.
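
Rough arithmetic, for anyone who wants to check it. This is only a sketch, assuming plain 2x supersampling for the AA and that frame rate scales linearly with sample count, which is being generous to the DRPU (it ignores all the extra lights, models and shaders just mentioned):

Code:
# Back-of-envelope only: scales the quoted DRPU figure by sample count alone.
base_samples = 1024 * 768        # thesis test resolution, no AA
hd_samples = 1280 * 720 * 2      # 720p with 2x supersampling (raytracers AA by casting more rays)

scale = hd_samples / base_samples
print(f"sample count grows by {scale:.2f}x")              # ~2.34x

beetle_fps = 160                 # extrapolated multicore DRPU figure quoted above
print(f"160 fps Beetle -> ~{beetle_fps / scale:.0f} fps") # ~68 fps, before any added scene complexity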

What part of all this research leads you to believe that not only will they leapfrog their current performance and predictions to include all the eye-candy we are used to in current games (which their examples aren't matching), but that they'll also get secondary rays in there, with their exponential increase in processing demands, and give us GI-lit scenes?! And you seriously think that rasterization tech is going to sit still and be overtaken? :oops:

Just randomly finding some raytraced images on the web in the 'low end' and deducing these are a realistic target shows a wanton lack of technical understanding that is unfitting for this technical discussion forum. If you want to propose a realistic case for the introduction of realtime raytracing here, please back it up with more than some random images. Failing that, the thread will be locked on account of being a nonsense debate.
 
It's not going to happen with a room full of processors.

That's a silly thing to say to be fair. Are you saying that Roadrunner could not do it? Look again at what 24 Cells can do through software alone, which IBM stated was totally scalable to the number of processors. I don't believe this report about a SEGA board for one second, but to dismiss the idea completely is naive as there could be many things we simply do not know about. Whether it can be cost effective or not is a different matter, but we are at a point where it can be done.
 
But what if it's true?
Never.

The money in the current arcade market is to port games to home consoles. A realtime raytracing-based arcade system would basically be unportable, in the sense that you'd need to re-engineer your entire game and possibly its art assets as well - in essence reinventing the same wheel at twice the price or more - for it to run on home systems.

There's no way such a rumored Model 4 board could survive economically. Arcade markets in Japan are shrinking and have been for years. In the US they're nearly gone entirely, and in most of Europe they're completely extinct already, and have been for a long time now.

Current quantum mechanics theories predict any substance could spontaneously transmute into another, such as coal becoming gold. I place this rumor at about the same level of probability of actually happening... ;)
 
That's a silly thing to say to be fair. Are you saying that Roadrunner could not do it? Look again at what 24 Cells can do through software alone, which IBM stated was totally scalable to the number of processors.

Yeah, like 24 Cells are a reality for a commercial product... If we could somehow hotwire a thousand of them together, maybe we could also render our scenes in realtime, but what exactly would be the point?

And we're pretty much getting near the end of what the current IC manufacturing processes have to offer, so we can kiss Moore's law goodbye kinda soon. Thus these computing powerhouses are far beyond the horizon for actual products - and the Wii has proven that no customer really needs them anyway. R&D money is better spent on input devices and actual game mechanics instead of an even better rendering system...
 
Never.

The money in the current arcade market is to port games to home consoles.

Then why is Sega's strategy to make arcade games that they don't convert to the home market? Also, based on my own (very limited) observation, the Chinese arcade game market seems to be booming....
 
They have provided their own extrapolations for us to consider. On page 159 is described a multicore DRPU chip, 196 mm^2 at 130nm, 186 mm^2 at 400 MHz (their own clock limit). The following page has a table of results comparing their hypothetical processor to Cell. The VW Beetle, rendered at 40 fps on Cell, would be rendered at 160 fps on their equivalently sized DRPU. Without shadows or secondary rays (the whole flippin' point of RT!). 4x Cell's performance.

That's pretty interesting. So with a 32nm process, we're looking at roughly 16X the transistor density at 400MHz. With a moderate clock bump to say 800MHz, that chip would be quite a beast and wouldn't have much trouble at 720p. Not only that but it would still be a single chip that's not too big in die size either. A board with 4 chips would be a RT monster.
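
Quick sanity check on that density number, as a sketch assuming ideal (feature size) squared scaling from 130nm, which real processes don't quite hit:

Code:
# Idealized node scaling: transistor density ~ (old_feature / new_feature)^2.
# Treat this as an upper bound; actual 130nm -> 32nm shrinks come in below it.
old_nm, new_nm = 130, 32
density_gain = (old_nm / new_nm) ** 2
print(f"ideal density gain, 130nm -> 32nm: ~{density_gain:.1f}x")   # ~16.5x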
 
That's pretty interesting. So with a 32nm process, we're looking at roughly 16X the transistor density at 400MHz. With a moderate clock bump to say 800MHz, that chip would be quite a beast and wouldn't have much trouble at 720p.
Wouldn't have much trouble doing what at 720p? Did you look at the picture to see what quality that Beetle is? If you're thinking something like GT5, or Forza3, you're expecting way too much!

(Edit : Note I'm not saying better isn't possible. Only the sources people are referencing don't support the extrapolations they're making. At SIGGRAPH 2005, Saarland University were discussing their realtime raytracing software on Cell and showed some better results. Realtime on Cell?? Let's look at how realtime raytracing is being used in PS3 to make everything look so wonderful...)

There are no useful performance results for gauging how this DRPU copes with recursive rays - the whole reason for choosing RT over scanline, and the way RT manages its fantastic realism in offline renders - and when you throw secondary rays into the problem, the processing requirement increases exponentially. To get those lovely realistically lit rooms, you're going to need an order of magnitude or two more rays, with a huge increase in processing requirements. Whereas two-year-old GPUs get the same look (played Mirror's Edge at all?) with textures. Why is every single test scene reminiscent of 1990s graphics? Because they are aiming for realtime and sticking to scenes that'll render in a fraction of a second. But they aren't trying realistic contemporary game scenes.
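
To put a toy number on that blow-up - the branching factor below is purely illustrative, not measured from the DRPU or any particular renderer:

Code:
# Toy ray-count model: assume each hit spawns `branch` new rays
# (reflection, refraction, a few shadow rays) down to `depth` bounces.
def rays_per_pixel(branch: int, depth: int) -> int:
    return sum(branch ** d for d in range(depth + 1))

for depth in range(4):
    print(depth, rays_per_pixel(4, depth))   # 1, 5, 21, 85 rays per pixel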

Furthermore, compare what this DRPU is doing in their (theoretical) current configuration to what a GPU of the same size is managing in terms of visual output. Now imagine that raytracing monster alongside a GPU of similar size. Whatever the RT processor is outputting, the scanline GPU (actually a generic vector maths core going forward, that ought to be pretty good at raytracing itself) will achieve better results because scanline is a more efficient use of transistors. It uses more hacks to get better performance.

Until we max out what rasterization can achieve and need to look to RT to fill in effects that require recursive rays (which would still be better in a hybrid renderer), the graphical output per square millimetre of silicon will always favour the GPU. Raytracing is about realism, not speed, and games are all about speed. The idea of photorealistic games rendered with photorealistic mechanics is plain unrealistic.

As someone who started with raytracing in the Amiga days and looked forward to realtime tracing in the future, I've seen every generation of advancement remain slow, as the push for realism has always matched or exceeded the advance of technology. I look with excitement at some of the incredibly clever optimisations maths gurus are finding to simplify the process so the eye remains fooled, but I also understand that a minimum recursion of 3 rays required for colour feedback requires an astronomical amount of processing power, such that realtime in the home PC is a target not on the horizon but somewhere beyond it. And any processor that's buzzing through rays could achieve the same look in a minuscule fraction of the effort with some precalculated textures, freeing up all that processing power to do other useful work.
That's a silly thing to say to be fair...
Okay, sure, it was a quick off-the-cuff remark. I'm sure some room of computers could raytrace stuff in realtime :D. But I was thinking of Pixar etc., who aren't raytracing in realtime to this day. It would be interesting to see what system would be needed to render Toy Story in realtime at 30fps (although RenderMan's not a full raytracing engine) and how that compares to a GPU setup producing the same results in a scanline renderer.
 
That's pretty interesting. So with a 32nm process, we're looking at roughly 16X the transistor density at 400MHz. With a moderate clock bump to say 800MHz, that chip would be quite a beast and wouldn't have much trouble at 720p. Not only that but it would still be a single chip that's not too big in die size either. A board with 4 chips would be a RT monster.

That's what I was trying to get at.

It really isn't decades into the future at all. If somebody wanted to, they could do it within the next 2 years.

If the FGN article is correct, then that somebody may just be SEGA.
 
The money in the current arcade market is to port games to home consoles.

SEGA's business strategy since 2005 has been to keep arcade titles in the arcades. This was done to strengthen their amusements business.

Virtua Fighter 5, Virtua Tennis 3, and Initial D 4 are the three rare exceptions out of the 27 titles released on the LINDBERGH platform.
 