Predict: The Next Generation Console Tech

I thought that was a joke... joke's on me! Just another reinforcement that a budget put toward rasterizing currently delivers an end result many times better than the same budget put toward realtime RT. With all the hacks and engines coming out that offer realtime lighting (direct and indirect with nearly limitless lights), shadowing, etc., the high cost of raytracing makes some of its benefits less appealing.
 
How long (if ever) will it be before we see the cost/benefit of raster converge with the cost/benefit of raytracing? By the time we can calculate 3 bounces in RT without complex material transmission/diffusion, how likely is it that the hacks can produce good-to-great approximations of, say, 6 bounces with subsurface scattering, using equivalent processing resources? Ignoring edge cases.

Dunno, talking out my ass here.
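For a rough feel of the numbers behind that question, here's a minimal back-of-envelope sketch; the resolution, frame rate and sample count are my own illustrative assumptions, not anything quoted in this thread, and shadow rays, antialiasing and scattering would multiply the totals further:

// Back-of-envelope sketch: how many rays per second a given bounce count
// implies at a fixed resolution and frame rate. All figures are assumed
// for illustration, not measured; shadow rays, AA and subsurface
// scattering would multiply the totals further.
#include <cstdio>

int main() {
    const double width   = 1280.0;  // assumed render resolution
    const double height  = 720.0;
    const double fps     = 30.0;    // assumed target frame rate
    const double samples = 1.0;     // primary samples per pixel (no AA)
    const int bounceCounts[] = {1, 3, 6};

    for (int bounces : bounceCounts) {
        // One primary ray plus one secondary ray per bounce, per sample.
        double raysPerFrame  = width * height * samples * (1 + bounces);
        double raysPerSecond = raysPerFrame * fps;
        std::printf("%d bounce(s): %.1f Mrays/frame, %.2f Grays/s\n",
                    bounces, raysPerFrame / 1e6, raysPerSecond / 1e9);
    }
    return 0;
}

Even this crude count only shows the ray budget growing roughly linearly with bounce depth, before any of the per-hit shading cost (complex materials, subsurface scattering) is accounted for.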
 
Meh, about the same time we move over to using REYES for realtime graphics.
 
This was actual raytracing - everything pre-rendered, of course. That's it: "raytracing" was a term you uttered back in the early 90s and everyone would marvel at how awesome it was.


Too bad that infamous game was useless: hit the kick button until your opponent is dead, continue until you're bored - chances are you won't ever see what the third bot looks like. :p
 
I don't see much hope for raytracing hardware in any next-gen console unless we hear about IMGTEC getting a presence in one. But then, I haven't heard much about hardware developments in OpenRL/Caustic, and I don't believe there's even a hardware raytracing option from Imagination yet. Their website says OpenRL will be integrated into future cores.

There's an outside chance of a raytracing reveal in one of the next consoles, but otherwise I don't think that's an avenue worth exploring yet in trying to guess what the next boxes will be packing.
 
They are supposed to show off their ray-tracing IP in a few days at SIGGRAPH 2012:

http://withimagination.imgtec.com/?p=620

Accelerating look development with Rhinoceros interactive ray traced viewports
Wednesday 8th, 14:15 – 15:15

Interactive ray tracing plugins for popular 3D packages (including Autodesk 3ds Max, Autodesk Maya and McNeel & Associates Rhinoceros) are now bringing final-frame photorealism to even the earliest stages of modeling and lighting and in doing so creating exciting new creative opportunities for artists and designers.

Users of Rhinoceros will learn how real-time ray traced viewports help a designer make better-informed creative choices, shorten review cycles and save time by reducing unnecessary and time-consuming preview renders compared to working with traditional OpenGL or Direct3D viewports.

Accelerating look development with Autodesk 3ds Max and Autodesk Maya interactive ray traced viewports
Wednesday 8th, 17:05 – 18:00

Interactive ray tracing plugins for popular 3D packages (including Autodesk 3ds Max, Autodesk Maya and McNeel & Associates Rhinoceros) are now bringing final-frame photorealism to even the earliest stages of modeling and lighting and in doing so creating exciting new creative opportunities for artists and designers.

Users of 3ds Max and Maya will learn how real-time ray traced viewports help an artist or designer make better-informed creative choices, shorten review cycles and save time by reducing unnecessary and time-consuming preview renders compared to working with traditional OpenGL or Direct3D viewports.

Where to find us and other interesting sessions
We will be displaying at Booth 522, so drop by to discover the world’s leading GPU architecture powering the best smartphones, tablets and computing devices. We’ll have our Development Team close by to showcase the new PowerVR Insider SDK and what new features it will deliver to the 32,000 PowerVR Insider members.

The first in-house silicon demonstration for Imagination’s ray tracing acceleration is also a major highlight of the event, together with other OpenRL demos from key developers. Also, our very own James McCombe, Director of R&D for our PowerVR Ray Tracing IP, will be part of a panel of experts for Jon Peddie Research discussing the topic of “Doing More With Multicore” at the annual lunch event.
 
So, they're just talking about replacing the rasterization portion of the rendering pipeline with a raytracer. We're not talking about rays for lighting?
 
Interesting, but it sounds like it's about getting an ugly preview with accurate lights and shadows.

No, to both of you. ;) Currently, while modelling a scene, a rasteriser is used for the realtime interaction. This shows an approximation of the final scene; you only know what it'll look like for real when you render a preview. OpenRL (Caustic) offers a reportedly new approach to raytracing that speeds this up enough to be used during the scene-creation phase, and Imagination are also offering custom hardware to push the process to 'realtime' speeds. What constitutes realtime is an unknown at this point: 0.5 fps rendering of a complex scene, fully lit and shaded, counts as realtime in the offline rendering world, but we'll have to wait and see if the hardware has something to offer the 30 fps game market.
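To make the viewport idea concrete, here's a toy sketch of the progressive-accumulation pattern such plugins are generally built on: each pass shoots one jittered primary ray per pixel into a hard-coded scene and adds the result to a running buffer, so a usable image appears on the first pass and simply refines for as long as the scene is left untouched. The scene, the light, the buffer size and the pass count are all my own assumptions for illustration; this is not OpenRL or Imagination's hardware path, just the general idea.

// Toy sketch of a progressive ray-traced viewport: one jittered primary
// ray per pixel per pass, shaded against a single directional light and
// accumulated. Display would show accum / pass. Everything here (scene,
// light, sizes) is an assumption for illustration, not OpenRL.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray (origin o, normalized direction d) vs. unit sphere centred at c.
static bool hitSphere(Vec o, Vec d, Vec c, double& t) {
    Vec oc = sub(o, c);
    double b = dot(oc, d), disc = b * b - (dot(oc, oc) - 1.0);
    if (disc < 0.0) return false;
    t = -b - std::sqrt(disc);
    return t > 0.0;
}

int main() {
    const int W = 64, H = 32, passes = 16;          // tiny buffer, few passes
    static double accum[H][W] = {};                 // running sample sums
    std::mt19937 rng(1234);
    std::uniform_real_distribution<double> jitter(0.0, 1.0);
    const Vec sphere   = {0.0, 0.0, -3.0};          // assumed scene
    const Vec lightDir = {0.577, 0.577, 0.577};     // assumed light (normalized)

    for (int pass = 1; pass <= passes; ++pass) {
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) {
                // One jittered primary ray per pixel per pass.
                double u = (x + jitter(rng)) / W * 2.0 - 1.0;
                double v = (y + jitter(rng)) / H * 2.0 - 1.0;
                double len = std::sqrt(u * u + v * v + 1.0);
                Vec dir = {u / len, v / len, -1.0 / len};

                double t = 0.0, sample = 0.1;       // background value
                if (hitSphere({0.0, 0.0, 0.0}, dir, sphere, t)) {
                    Vec p = {dir.x * t, dir.y * t, dir.z * t};
                    Vec n = sub(p, sphere);         // unit sphere: p - c is the normal
                    double ndotl = dot(n, lightDir);
                    sample = ndotl > 0.0 ? ndotl : 0.0;
                }
                accum[y][x] += sample;              // display would use accum / pass
            }
        }
        // An editor would redraw here; any camera or scene edit resets accum.
    }
    std::printf("centre pixel after %d passes: %.3f\n",
                passes, accum[H / 2][W / 2] / passes);
    return 0;
}

In a real editor the accumulation buffer is cleared whenever the camera or scene changes, which is why interactivity (time to a usable noisy image) matters more than time to full convergence.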

Hopefully after SIGGRAPH there'll be someone on this board with IMGTEC connections who'll be able to talk. I think we have one or two members who are loosely involved with IMG...
 
This was actual raytracing - everything pre-rendered, of course. That's it: "raytracing" was a term you uttered back in the early 90s and everyone would marvel at how awesome it was.

Too bad that infamous game was useless: hit the kick button until your opponent is dead, continue until you're bored - chances are you won't ever see what the third bot looks like. :p

I remember playing that game on a PC back in the day. Anyway, get Capcom involved and you'd have an awesome robot fighting game.
 
Who are you calling loose?

Spill it!

[Attached image: powervr-rtx.jpg]
 

Oh really!?!?!?!?!? Who ever knew he was being sarcastic? I had no idea!

I know at least one of these guys used to post frequently at STP/DCTP. I wonder whatever happened to that guy. He once had a website with a good working example of VQ texture compression that blew the other formats out of the water. He's also one of the few guys who ever gave any actual good info on the Elan chip. I wonder where he is now... it's not like he ever posts on B3D under the same name he used those years ago. Simon says?

All sarcasm aside, where the heck did Kristof ever go? Is he still on these forums and just never posts any more, or posting under a different name?
 