12 years ago ATI was showcasing ray tracing for the 2xxx cards.
https://streamable.com/973rc
Does anyone remember those?
What exactly is ray traced here? All the lights look distinctly 12 years old.
Here's TU104 if anyone wants to try sizing it.
Source: GN quick tear down video at Gamescom
View attachment 2644
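If anyone wants a starting point, the sizing is just a scale conversion off the photo. Everything in the sketch below is a placeholder -- the pixel counts are made up and you'd substitute your own measurements from the attachment -- and it assumes the usual 14 x 12 mm GDDR6 BGA package as the reference object:

```cpp
// Rough die sizing from a straight-on photo: convert pixels to mm using
// an object of known size (here, a GDDR6 package assumed to measure
// 14 mm on its long edge). All pixel values are hypothetical placeholders.
#include <cstdio>

int main() {
    const double ram_edge_px = 140.0;  // measured long edge of a GDDR6 chip, in pixels
    const double ram_edge_mm = 14.0;   // assumed physical long edge of that package
    const double die_w_px    = 230.0;  // measured die width, in pixels
    const double die_h_px    = 190.0;  // measured die height, in pixels

    const double mm_per_px = ram_edge_mm / ram_edge_px;
    const double w = die_w_px * mm_per_px;
    const double h = die_h_px * mm_per_px;
    std::printf("Die: %.1f x %.1f mm, ~%.0f mm^2\n", w, h, w * h);
}
```

The accuracy lives or dies on the photo being truly straight-on and the RAM package being what you think it is, so treat any result as +/- a fair margin.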
I think the biggest mistake is calling a $1,000 card a "Ti".
And here's another straight-on picture of the PCB, straight from NVIDIA.
This one looks like it has been computer-generated ... the core square is not a real chip, it is an image. But as it is coming from NVIDIA, I think the dimensions are real.
Yup, no Damien review this time
More specifically, it's a sanitized image. That's a real board and substrate; they've just brushed out all of the unique markings on the PCB and RAM, and prettied up the top of the GPU. We've gotten similar images from NVIDIA in prior generations, and they've been accurate.
Supposing that neither AMD's Navi nor Intel's upcoming dGPU has dedicated hardware for ray tracing and NVIDIA remains the only one who has it, how would that affect mass adoption of ray tracing in future games?

No further than GameWorks-type optional features.
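To put the "optional feature" point concretely: in DirectX Raytracing, RT support is a capability the engine has to query at startup, so every title keeps a pure-raster path for hardware that reports no support. A minimal sketch of that check, assuming the DXR-enabled D3D12 headers (device creation and error handling omitted):

```cpp
// Query the D3D12 device for a raytracing tier; anything below
// TIER_1_0 means the engine should take its rasterized fallback path.
#include <windows.h>
#include <d3d12.h>

bool SupportsRaytracing(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

As long as only one vendor ever returns true from that check, the fallback branch is the one publishers have to care about, which is exactly the GameWorks situation.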
I thought this part was interesting:
NVIDIA's Move From GTX to RTX Speaks to Belief in Revolutionary Change in Graphics
https://www.techpowerup.com/246930/...to-belief-in-revolutionary-change-in-graphics
The move from GTX to RTX means NVIDIA is putting its full weight behind the importance of its RTX platform for product iterations and the future of graphics computing. It manifests in a re-imagined pipeline for graphics production, where costly, intricate, but ultimately faked solutions gave way to steady improvements to graphics quality.

That sounds to me like a misconception, and I can already read things here and there about fake resolutions and whatnot. Point is, "rendering" is as fake as it gets; I can't for the life of me see how including some level of ray tracing (physics/optics) in the process makes the thing any faker. Quite the contrary.
Is there any way that the RTX components can be used for accelerating GPU mining? Please, let the answer be no. Otherwise the insane prices will be even higher.

Not unless someone makes a new coin where the mining is based on ray tracing or tensor calculations.
https://wccftech.com/nvidia-turing-gpu-geforce-rtx-2080-ti-ipc-block-diagram-detailed/

The Turing TU102 GPU has 72 Streaming Multiprocessors (SMs) featuring 64 CUDA cores each. The full die features 4608 CUDA cores, while the GeForce RTX 2080 Ti features 4352 cores. The chip has 576 Tensor cores, 72 RT cores, 36 geometry units, 288 texture units (TMUs) and 96 ROPs (raster operation units). In addition to the core specs, the chip has a 384-bit memory interface supporting 7 GHz GDDR6 (14 Gbps effective) DRAM and 2 NVLink channels. The chip features 6 MB of L2 cache too.
...
Lastly, there's a new overclocking feature being talked about which the new Turing GeForce RTX graphics cards will be able to make use of. To be known as Scanner (the final name is still a work in progress), the feature will let the OC utility detect the best clock speeds and voltages for you without the need to do anything. Just run a test through the overclocking utility and you are all set.
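The TU102 numbers quoted above are easy to sanity-check; here's the arithmetic for the full-die core count and the peak memory bandwidth those figures imply:

```cpp
// Derived figures from the quoted TU102 specs: CUDA cores = SMs x
// cores-per-SM, and bandwidth = bus width in bytes x per-pin data rate.
#include <cstdio>

int main() {
    const int    sms       = 72;    // SMs on the full die
    const int    cores_per = 64;    // CUDA cores per SM
    const int    bus_bits  = 384;   // memory interface width
    const double gbps_pin  = 14.0;  // effective GDDR6 data rate per pin

    std::printf("Full-die CUDA cores: %d\n", sms * cores_per);             // 4608
    std::printf("Peak bandwidth: %.0f GB/s\n", bus_bits / 8.0 * gbps_pin); // 672
}
```

That 672 GB/s is for the full 384-bit die; the cut-down 2080 Ti configuration should land somewhat lower.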
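NVIDIA hasn't said how Scanner actually works, but automatic overclockers generally reduce to the same loop: raise the clock offset, run a stress test, and settle just below the first failure. A purely illustrative sketch -- every function here is a hypothetical stub, not a real driver or NVAPI call:

```cpp
#include <cstdio>

// Hypothetical stubs standing in for whatever driver interface the
// finished utility uses -- not real NVAPI entry points. The fake stress
// test simply pretends the card falls over past +120 MHz.
static bool ApplyCoreOffsetMHz(int /*offsetMHz*/) { return true; }
static bool RunStressTest(int offsetMHz)          { return offsetMHz < 120; }

int main() {
    const int step = 15;   // MHz added per iteration (illustrative)
    int offset = 0;

    // Step the core offset up until the stress test first fails.
    while (ApplyCoreOffsetMHz(offset + step) && RunStressTest(offset + step))
        offset += step;

    // Settle one step below the first failure as the "scanned" result.
    ApplyCoreOffsetMHz(offset);
    std::printf("Scanner result: +%d MHz core offset\n", offset);
}
```

The hard part in a real implementation is the stress kernel: it has to catch arithmetic errors before the driver crashes, which is presumably what the built-in test is for.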