Historical rendering dead-ends

Discussion in 'Rendering Technology and APIs' started by Concerto, Sep 8, 2018.

  1. Concerto

    Newcomer

    Joined:
    Aug 25, 2017
    Messages:
    13
    Likes Received:
    0
    So on Twitter I came across a thread posted by Richard Mitton (@grumpygiant) asking for any examples of rendering dead-ends, one of his examples being Ecstatica’s ellipsoid technology. I found this interesting and wondered if there is anybody on this forum with experience of that sort of thing.

    Some highlights that stuck out to me from the comments were Microsoft’s Talisman and its features like impostor textures, forward texture mapping and quadratic surfaces (NVIDIA NV1 / Sega Saturn), voxels, as well as some 8-bit computer trickery.
     
  2. milk

    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    2,952
    Likes Received:
    2,514
    First thing that comes to mind is the old arcades that rendered their graphics as the signal was being sent to the monitor's raster, to avoid wasting costly memory on a frame buffer. Racing the beam.
    Going even further back, we can point to older arcades such as Asteroids, which didn't even use raster displays, but rather oscilloscope-like vector monitors where they controlled the beam directly.
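    For illustration, here is a rough sketch of the racing-the-beam idea written as C. Everything in it (video_out, wait_for_hblank, the scene lookups) is made up for the example; the real hardware did this with counters and discrete logic rather than a CPU loop.
    Code:
    /* Rough sketch of "racing the beam": no frame buffer is ever stored;
     * each pixel's colour is computed just before the beam would draw it.
     * All names here (video_out, sprite_at, ...) are hypothetical. */
    #include <stdint.h>

    #define SCANLINES        262
    #define PIXELS_PER_LINE  160

    extern void    video_out(uint8_t colour);         /* hypothetical: push one pixel out */
    extern void    wait_for_hblank(void);             /* hypothetical: sync with the beam */
    extern uint8_t background_colour(int x, int y);   /* hypothetical scene lookup */
    extern uint8_t sprite_at(int x, int y, int *hit); /* hypothetical sprite test */

    void render_field(void)
    {
        for (int y = 0; y < SCANLINES; ++y) {
            wait_for_hblank();                        /* stay in lockstep with the raster */
            for (int x = 0; x < PIXELS_PER_LINE; ++x) {
                int hit = 0;
                uint8_t c = sprite_at(x, y, &hit);    /* sprites generated on the fly... */
                if (!hit)
                    c = background_colour(x, y);      /* ...no stored image anywhere */
                video_out(c);                         /* the pixel leaves immediately */
            }
        }
    }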
     
    Heinrich4 and Zaphod like this.
  3. milk

    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    2,952
    Likes Received:
    2,514
    For modern 3D graphics, the Doom engine and its contemporaries also employed vertical-line raycasting, which, although limited, allowed some form of pseudo-3D rendering at real-time speeds on '90s IBM PCs.
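    For reference, a minimal sketch of that kind of column renderer in C, in the Wolfenstein 3D style: one ray per screen column marched through a 2D grid with a DDA, each hit drawn as a vertical wall slice. The map contents and draw_column() are stand-ins, not any real engine's API.
    Code:
    /* Minimal 2.5D column raycaster sketch (Wolfenstein-style DDA).
     * draw_column() and the map are placeholders. */
    #include <math.h>

    #define MAP_W    8
    #define MAP_H    8
    #define SCREEN_W 320
    #define SCREEN_H 200

    /* 1 = wall, 0 = empty; border cells must be walls so every ray terminates. */
    extern const int map[MAP_H][MAP_W];
    extern void draw_column(int x, int height, int wall_id);  /* hypothetical rasteriser */

    void render_view(double px, double py, double dirx, double diry,
                     double planex, double planey)
    {
        for (int x = 0; x < SCREEN_W; ++x) {
            double camx = 2.0 * x / SCREEN_W - 1.0;           /* -1..1 across the screen */
            double rx = dirx + planex * camx;
            double ry = diry + planey * camx;

            int mx = (int)px, my = (int)py;
            double ddx = (rx == 0.0) ? 1e30 : fabs(1.0 / rx); /* DDA step lengths */
            double ddy = (ry == 0.0) ? 1e30 : fabs(1.0 / ry);
            int stepx = rx < 0 ? -1 : 1, stepy = ry < 0 ? -1 : 1;
            double sx = (rx < 0 ? (px - mx) : (mx + 1.0 - px)) * ddx;
            double sy = (ry < 0 ? (py - my) : (my + 1.0 - py)) * ddy;

            int side = 0;
            while (map[my][mx] == 0) {                        /* march cell by cell in 2D */
                if (sx < sy) { sx += ddx; mx += stepx; side = 0; }
                else         { sy += ddy; my += stepy; side = 1; }
            }
            double dist = side ? sy - ddy : sx - ddx;         /* perpendicular wall distance */
            int h = (int)(SCREEN_H / (dist > 1e-6 ? dist : 1e-6));
            draw_column(x, h, map[my][mx]);                   /* one vertical slice per column */
        }
    }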
     
  4. jlippo

    Veteran Regular

    Joined:
    Oct 7, 2004
    Messages:
    1,329
    Likes Received:
    425
    Location:
    Finland
    Heightmap raycasting, especially the ray-surfing variants which restricted camera rotations.
    Also voxel sprites from games like Blade Runner.
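    For anyone unfamiliar, a hedged sketch of Comanche-style heightmap raycasting in C: march a ray over a height/colour map per screen column, project each sample to the screen, and keep a per-column horizon so nearer terrain occludes farther terrain. The map arrays, draw_vline() and the projection constant are illustrative only.
    Code:
    /* Heightmap ("voxel terrain") raycasting sketch, one ray per screen column. */
    #include <math.h>

    #define SCREEN_W 320
    #define SCREEN_H 200
    #define MAP_SIZE 1024

    extern unsigned char heightmap[MAP_SIZE][MAP_SIZE];   /* placeholder terrain data */
    extern unsigned char colormap[MAP_SIZE][MAP_SIZE];
    extern void draw_vline(int x, int y_top, int y_bottom, unsigned char colour); /* hypothetical */

    void render_terrain(float cam_x, float cam_y, float cam_h,
                        float angle, float horizon, float max_dist)
    {
        float s = sinf(angle), c = cosf(angle);
        for (int col = 0; col < SCREEN_W; ++col) {
            float t = (float)col / SCREEN_W - 0.5f;          /* column offset across the view */
            float dx = c + t * -s, dy = s + t * c;           /* crude per-column ray direction */
            int ybuffer = SCREEN_H;                          /* lowest still-visible pixel */

            for (float z = 1.0f; z < max_dist; z += 1.0f) {  /* front-to-back march */
                int mx = ((int)(cam_x + dx * z)) & (MAP_SIZE - 1);
                int my = ((int)(cam_y + dy * z)) & (MAP_SIZE - 1);
                float h = heightmap[my][mx];
                int y = (int)((cam_h - h) / z * 240.0f + horizon); /* perspective projection */
                if (y < 0) y = 0;
                if (y < ybuffer) {                           /* only draw what isn't hidden */
                    draw_vline(col, y, ybuffer, colormap[my][mx]);
                    ybuffer = y;
                }
            }
        }
    }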

    Loved the idea of image-based rendering when it was introduced as well, but it has pretty much transformed into something we see in advanced impostors rather than on actual geometry.
    Sadly a lot of the research and demos have vanished from the web, but there were things like proper object rotation and the like running on ancient GPUs.
     
    Heinrich4 likes this.
  5. fellix

    Hey, You!
    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,486
    Likes Received:
    397
    Location:
    Varna, Bulgaria
    PowerVR's Infinite Planes primitive type?
     
    Gubbi, Heinrich4 and milk like this.
  6. corysama

    Newcomer

    Joined:
    Jul 10, 2004
    Messages:
    175
    Likes Received:
    122
    Continuous level of detail meshes. It's not worth any amount of PCI bandwidth to edit an index buffer on the GPU. Forsyth's sliding window technique is the only one I've seen that doesn't seem absurd on modern hardware.
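    Roughly, the appeal of the sliding-window idea is that the whole LOD chain lives in one static, pre-ordered index buffer, so changing detail is just picking a different draw range. A much-simplified, discrete sketch of that idea follows (the real technique slides the window per edge collapse; every structure and the draw wrapper below are made up):
    Code:
    /* Simplified sketch of a sliding-window-style continuous LOD selector. */
    #include <stddef.h>

    typedef struct {
        unsigned first_index;   /* start of this LOD's window in the static index buffer */
        unsigned index_count;   /* number of indices in the window */
        float    max_distance;  /* switch threshold chosen offline */
    } LodWindow;

    typedef struct {
        const LodWindow *lods;  /* ordered finest to coarsest */
        size_t           lod_count;
    } SlidingWindowMesh;

    /* Hypothetical wrapper around an indexed draw with an offset and count. */
    extern void draw_indexed_range(unsigned first_index, unsigned index_count);

    void draw_clod_mesh(const SlidingWindowMesh *mesh, float view_distance)
    {
        const LodWindow *pick = &mesh->lods[mesh->lod_count - 1]; /* default: coarsest */
        for (size_t i = 0; i < mesh->lod_count; ++i) {
            if (view_distance <= mesh->lods[i].max_distance) {    /* finest window that covers it */
                pick = &mesh->lods[i];
                break;
            }
        }
        /* No index-buffer edits, no uploads: we just slide the draw range. */
        draw_indexed_range(pick->first_index, pick->index_count);
    }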

    Related to ellipsoid rendering, point-based rendering is ever the future. There's a 4x4-vertex patch-based approach which seems like a not-as-bad compromise. Meanwhile, Assassin's Creed is rendering in GPU-culled clumps of 256(?) vertices.
     
  7. OCASM

    Regular Newcomer

    Joined:
    Nov 12, 2016
    Messages:
    922
    Likes Received:
    881
    Doom 3 style stencil shadows.
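    For the record, a hedged sketch of the depth-fail ("Carmack's reverse") counting pass those were built on, assuming an OpenGL 2.0+ context whose headers/loader expose glStencilOpSeparate and GL_INCR_WRAP; the two scene callbacks are hypothetical.
    Code:
    /* Stencil shadow volume sketch: count volume crossings, then light
     * only the pixels whose counter stayed at zero. */
    #include <GL/gl.h>

    extern void draw_shadow_volumes(void);  /* extruded silhouette geometry for one light */
    extern void draw_lit_geometry(void);    /* scene pass with this light's shading */

    void render_stencil_shadowed_light(void)
    {
        /* Pass 1: count shadow-volume crossings into the stencil buffer only. */
        glClear(GL_STENCIL_BUFFER_BIT);
        glEnable(GL_STENCIL_TEST);
        glEnable(GL_DEPTH_TEST);
        glDepthMask(GL_FALSE);                                   /* depth was laid down earlier */
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);     /* no colour writes */
        glDisable(GL_CULL_FACE);                                 /* need both faces */
        glStencilFunc(GL_ALWAYS, 0, ~0u);
        /* Depth-fail: back faces increment, front faces decrement, where hidden. */
        glStencilOpSeparate(GL_BACK,  GL_KEEP, GL_INCR_WRAP, GL_KEEP);
        glStencilOpSeparate(GL_FRONT, GL_KEEP, GL_DECR_WRAP, GL_KEEP);
        draw_shadow_volumes();

        /* Pass 2: light only where the counter is zero (outside all volumes). */
        glEnable(GL_CULL_FACE);
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glStencilFunc(GL_EQUAL, 0, ~0u);
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE);                             /* additive light accumulation */
        draw_lit_geometry();

        glDisable(GL_BLEND);
        glDisable(GL_STENCIL_TEST);
        glDepthMask(GL_TRUE);
    }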
     
    Heinrich4 likes this.
  8. Theeoo

    Newcomer

    Joined:
    Nov 13, 2017
    Messages:
    132
    Likes Received:
    64
    I believe Voxatron employs raycasting techniques to allow destructible terrain. Plus it runs entirely on the CPU.

     
  9. milk

    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    2,952
    Likes Received:
    2,514
    OK, that's 3D raycasting. The cool thing about Doom, Build, Outlaws and company is that it was basically a 2.5D process.
     
  10. Concerto

    Newcomer

    Joined:
    Aug 25, 2017
    Messages:
    13
    Likes Received:
    0
    I had to read up on the image-based rendering and PowerVR's Infinite Planes techniques. The former is like a cube map on steroids or something? At least that's the idea I got when reading about it.

    The Infinite Planes tech is interesting, but I am left wondering what the real-world application of the technique would be. One video that I did see from Scali's OpenBlog showed some limitations with the original hardware, and it seems to only work well with simple surfaces and shadow volumes.
     
  11. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,491
    Likes Received:
    909
    Euclideon's Unlimited Detail?

    OK, I guess that's too easy.
     
  12. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    74
    Likes Received:
    107
    Parallax mapping.
     
  13. jlippo

    Veteran Regular

    Joined:
    Oct 7, 2004
    Messages:
    1,329
    Likes Received:
    425
    Location:
    Finland
    Variations of parallax mapping are still used in huge amounts of games.
    So while the tech has a lot of limitations, it still is useful.
     
    #13 jlippo, Sep 26, 2018
    Last edited: Sep 26, 2018
    Shifty Geezer, AlBran and milk like this.
  14. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    74
    Likes Received:
    107
    Relief mapping is, but it's fairly different, being a ray cast through a heightfield to find the appropriate texel. As far as I know, the term "parallax mapping" is used exclusively for the algorithm that takes a single height sample and uses it to guess at the proper intersection. Elder Scrolls: Oblivion used parallax mapping (and I can't actually think of anything else that does at all).
    In any case, I expect relief mapping to be gradually replaced by displacement mapping with tessellation. Already, games use half and half for different objects in the scene.
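    To make the distinction concrete, here is the texture-space math of both written as plain C rather than any particular shader language. sample_height(), the scale and the step count are placeholders, and the code assumes a normalized tangent-space view vector with positive z.
    Code:
    /* Parallax (offset) mapping vs. relief mapping, as texture-space math. */
    typedef struct { float x, y; } Vec2;
    typedef struct { float x, y, z; } Vec3;

    extern float sample_height(Vec2 uv);   /* hypothetical heightfield lookup, 0..1 */

    /* Parallax mapping: one height sample, then a guessed UV offset along the
     * view direction. Cheap, but the guess breaks down at steep angles. */
    Vec2 parallax_uv(Vec2 uv, Vec3 view_ts, float scale)
    {
        float h = sample_height(uv);
        Vec2 out = { uv.x + view_ts.x / view_ts.z * h * scale,
                     uv.y + view_ts.y / view_ts.z * h * scale };
        return out;
    }

    /* Relief mapping: actually march the ray through the heightfield in fixed
     * steps until it dips below the surface, then use that UV. */
    Vec2 relief_uv(Vec2 uv, Vec3 view_ts, float scale, int steps)
    {
        float layer = 1.0f / steps;
        Vec2 delta = { view_ts.x / view_ts.z * scale * layer,
                       view_ts.y / view_ts.z * scale * layer };
        float ray_h = 1.0f;                       /* start at the top of the height range */
        Vec2 cur = uv;
        float surf = sample_height(cur);
        for (int i = 0; i < steps && ray_h > surf; ++i) {
            cur.x -= delta.x;                     /* step "into" the surface */
            cur.y -= delta.y;
            ray_h -= layer;
            surf = sample_height(cur);
        }
        return cur;                               /* a real version would refine the hit */
    }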
     
  15. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    74
    Likes Received:
    107
    Concerning voxels, I'd consider them alive and well. For instance, Nvidia has the GVDB framework which, while aimed mostly at professional work (visualizing simulations, rendering fluids in various circumstances), is nevertheless very much in use for rendering today.

    The problem with the term "voxels" is that the line between "voxels" and volumetric rendering in general is muddy, to say the least. The most basic meaning, a rasterized image in 3 dimensions, is straightforward enough, but what about something like distance fields, where you want to create an isosurface from volumetric data? If you call a normal map or a height map a "texture", an SDF can be called "voxels", right? SDFs (combined with marching cubes or surface nets to make triangles or quads) are pretty popular these days, with games like No Man's Sky and (formerly) Subnautica using them for procedural/modifiable terrain. Then of course there are games like Minecraft, which use voxels as a terrain representation but generate triangles for actual rendering.
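    As a small illustration of the SDF-as-voxels view: sample a signed distance function onto a regular grid, then flag the cells whose corner values change sign; those are exactly the cells a mesher like marching cubes or surface nets would turn into triangles or quads. The sphere SDF and grid sizes below are arbitrary examples.
    Code:
    /* Sample an SDF onto a grid and find the cells that cross the isosurface. */
    #include <math.h>
    #include <stdbool.h>

    #define N    32         /* grid resolution per axis */
    #define CELL 0.25f      /* world-space cell size */

    static float sdf_sphere(float x, float y, float z)
    {
        /* signed distance to a sphere of radius 3 centred in the grid */
        float cx = x - N * CELL * 0.5f, cy = y - N * CELL * 0.5f, cz = z - N * CELL * 0.5f;
        return sqrtf(cx * cx + cy * cy + cz * cz) - 3.0f;
    }

    static float field[N + 1][N + 1][N + 1];     /* voxel corners hold signed distances */

    void build_field(void)
    {
        for (int k = 0; k <= N; ++k)
            for (int j = 0; j <= N; ++j)
                for (int i = 0; i <= N; ++i)
                    field[k][j][i] = sdf_sphere(i * CELL, j * CELL, k * CELL);
    }

    /* A cell lies on the isosurface if its eight corners aren't all on one side. */
    bool cell_crosses_surface(int i, int j, int k)
    {
        bool any_neg = false, any_pos = false;
        for (int dk = 0; dk <= 1; ++dk)
            for (int dj = 0; dj <= 1; ++dj)
                for (int di = 0; di <= 1; ++di) {
                    float d = field[k + dk][j + dj][i + di];
                    if (d < 0.0f) any_neg = true; else any_pos = true;
                }
        return any_neg && any_pos;   /* these are the cells a mesher would process */
    }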

    So you can argue what constitutes "true" voxel rendering. Does meshing them into triangles count, or are you expected to directly render them through ray tracing? Most games and such do the former, while GVDB does the latter. What about using task and mesh shaders to generate geometry and render directly from the voxel image?
     
    milk likes this.
  16. Ethatron

    Regular Subscriber

    Joined:
    Jan 24, 2010
    Messages:
    858
    Likes Received:
    260
    It's commonly called Offset Bump Mapping (OBM); it derives the parallax from the xy of the normal map. In the original Parallax Mapping paper the derivatives are calculated from a depth map. Since the normal map is the derivative of some heightfield, and because it's already available for free, it gets adopted instead of the (more expensive) depth-sample version.
     