X-Silicon Unveils Low-Power C-GPU Architecture Combined With RISC-V CPU

del42sa

https://wccftech.com/x-silicon-unve...ith-risc-v-cpu-open-standard-supports-vulkan/

[Image: X-Silicon C-GPU]
 
This is just Larrabee all over again, except this time they're using the RISC-V architecture instead of the traditional Intel architecture as their design basis, with the potential to add more specialized hardware ...

In order to reinvent the GPU shader core, X-Silicon says it is creating a new scalable RISC-V vector-based unified compute-graphics engine (C-GPU) that can efficiently compute the kinds of next-generation workloads that traditional GPUs were never designed for.

Such applications include AI, HPC, vision, geometric computing, as well as 2D and 3D graphics. The company says its MIMD architecture is uniquely capable of independently running CPU and GPU code on the same core, affording capabilities such as low-memory-footprint execution, bare-metal programming against the hardware registers, high-performance low-power operation, and the replacement of traditional shader programs with an open-source RISC-V ISA for both CPU and GPU, using a single instruction stream.

As such, workloads can be implemented in parallel or pipelined fashion and run simultaneously on a core, as opposed to sequentially as on traditional GPUs. It can also run an operating system on a core, says X-Silicon.
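X-Silicon hasn't published its toolchain or ISA extensions, so purely as an illustration of the "one instruction stream for CPU and GPU work" idea, here's a minimal sketch using the standard RISC-V Vector (RVV 1.0) C intrinsics: a fragment-shader-style alpha blend written as ordinary vector code that the same core running the scalar control logic could execute. The function name and buffers are made up for the example.

```c
#include <riscv_vector.h>   /* RVV 1.0 intrinsics (clang/gcc, -march=rv64gcv) */
#include <stddef.h>

/* Hypothetical "shader" expressed as plain RISC-V vector code:
 *   out[i] = alpha * src[i] + (1 - alpha) * dst[i]
 * On a unified CPU/GPU core this would just be another function in the
 * same instruction stream as the CPU-side control code. */
void blend_rvv(const float *src, const float *dst, float *out,
               float alpha, size_t n)
{
    for (size_t i = 0; i < n;) {
        size_t vl = __riscv_vsetvl_e32m8(n - i);              /* strip-mine the loop  */
        vfloat32m8_t s = __riscv_vle32_v_f32m8(src + i, vl);  /* load source pixels   */
        vfloat32m8_t d = __riscv_vle32_v_f32m8(dst + i, vl);  /* load destination     */
        vfloat32m8_t r = __riscv_vfmul_vf_f32m8(s, alpha, vl);
        r = __riscv_vfmacc_vf_f32m8(r, 1.0f - alpha, d, vl);  /* r += (1-alpha) * d   */
        __riscv_vse32_v_f32m8(out + i, r, vl);                /* store blended result */
        i += vl;
    }
}
```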

A unified ISA for CPU/GPUs and being able to run modern operating systems? Sounds uncannily like Larrabee!
 
Yeah, but unlike Larrabee, RISC-V isn't a 40+-year-old 16-bit ISA held together by glue and duct tape!
 
I don't know about you, but I like having simpler GPU designs either way ...

I don't really fancy the idea of GPU languages such as CUDA or Metal trying to be as close as possible to CPU languages like C++, and I don't think GPUs running modern operating systems is the future either ...
 
What I meant to get across was: x86 should have died off long ago, let alone be extended for GPU code.
 