3D transformation on analogue hardware

Squeak

Recently it has come to my attention that several people in the past have, with some success, experimented with 3D transformation on analogue equipment:

http://scitation.aip.org/getabs/ser...00018000009000819000001&idtype=cvips&gifs=yes (1948)

Barry, C.D., R.A. Ellis, S.M. Graesser and G.R. Marshall, "Display and Manipulation in Three Dimensions", Pertinent Concepts in Computer Graphics, eds. Faiman, M. and J. Nievergelt, Univ. of Ill. Press, Urbana, Ill., 1969.

The main problem with analogue computers, especially in marketable equipment, has been (apart from poor precision) vulnerability to electrical noise and low flexibility.
But with the advent of analog FPGAs and neural net research finally maturing (neural nets have much better resistance to noise), I wonder whether this could be a valid way of dramatically speeding up 3D transformation in realtime apps?

Now, of course I’m not suggesting completely analogue 3D hardware, but a hybrid computer where the analog part delivers the first digits of precision and the digital part “fills in the rest”.
Such a combination has worked well for other applications.
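Just to illustrate the general seed-and-refine pattern I mean (on a small linear system rather than a transform), here is a toy sketch in Python. The "analogue" solver is only simulated here by adding noise, and all the numbers are made up: the analogue side hands over a rough answer, and the digital side only has to compute exact residuals to fill in the remaining digits.

```python
import numpy as np

def analogue_solve(a, b, rel_noise=1e-2):
    """Pretend analogue solver for A x = b: very fast, but each component
    is only good to a couple of digits (simulated with relative noise)."""
    x = np.linalg.solve(a, b)
    return x * (1.0 + rel_noise * np.random.randn(len(b)))

def hybrid_solve(a, b, refinements=3):
    """Analogue seed + digital refinement: the digital side only computes
    exact residuals, the analogue side handles every solve."""
    x = analogue_solve(a, b)
    for _ in range(refinements):
        r = b - a @ x                  # exact residual, computed digitally
        x = x + analogue_solve(a, r)   # coarse correction from the analogue part
    return x

a = np.random.rand(4, 4) + 4 * np.eye(4)   # well-conditioned test matrix
b = np.random.rand(4)
print(np.max(np.abs(hybrid_solve(a, b) - np.linalg.solve(a, b))))  # tiny error
```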

What’s more, it appears we might have to go in that direction eventually:
http://www.eetimes.com/story/OEG19981103S0017

What do you all think?
 
Squeak said:
Now, of course I’m not suggesting completely analogue 3D hardware, but a hybrid computer where the analog part delivers the first digits of precision and the digital part “fills in the rest”.
Such a combination has worked well for other applications.
Could you mention one such application? I can't think of any.

Back in the days of fixed function pixel pipelines and 24 bit being oohh so high end, I was playing with the thought of an analog pixel pipeline. (You'd need an "analog DRAM" too.) But now, with long pixel shaders, render to texture, and using the pixel pipeline for stuff that needs a lot more precision, it seems a lot less interesting.

And doing it for 3D transformation seems like a bad idea. You don't want it anywhere you need to convert it back to digital again. The cost of the A/D converters would negate any benefits from going analog. (And that's even without factoring in the reduced precision.)
 
london-boy said:
OK now i'm totally lost... Analog computers?
Before digital computers became powerful enough, a lot of important control computers, like those in planes, were analog. Like a CCD, they used "buckets" (capacitors) to store charges that represented a level. Analog numbers. And they stored the initial levels (the "program") on things like laser disks or analog tapes.

Large, bulky and with low precision, but the only things good and fast enough to control machinery at that time.

Linky.
 
Basic said:
Could you mention one such application? I can't think of any.
http://en.wikipedia.org/wiki/Hybrid_computer
And doing it for 3D transformation seems like a bad idea. You don't want it anywhere you need to convert it back to digital again. The cost of the A/D converters would negate any benefits from going analog. (And that's even without factoring in the reduced precision.)
Well, looking at our own brains it certainly is possible to do it (otherwise we wouldn't be able to perceive the world, imagine stuff or dream (my dreams seem pretty high def. to me ^_^)).
Maybe if the digital part was a bit-stream computer (no bytes) the A/D conversion wouldn't be needed (just a wild conjecture).
That said, I can't see how the A/D would be that prohibitive, can you explain why? I mean there are many applications where a massive stream of analogue data needs to be rapidly converted.
 
Squeak said:
That said, I can't see how the A/D would be that prohibitive, can you explain why? I mean there are many applications where a massive stream of analogue data needs to be rapidly converted.
While low-precision (~8-10 bits) A/D can be done fairly quickly at a cost that is not too prohibitive for individual A/D converters, once you get above a certain level, the cost of fast A/D conversion (by 'fast', I mean >1 MHz sample rate) increases more or less exponentially with the number of bits of precision you need. Also, analog circuits generally benefit much less from semiconductor miniaturization than purely digital circuits.
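To put a rough number on the "exponential" part: the standard architecture for really fast conversion is a flash ADC, which needs one comparator per quantization threshold, i.e. 2^N - 1 of them for N bits:

```python
# Comparator count for a flash ADC: one per quantization threshold,
# i.e. 2**n - 1 comparators for n bits of precision.
for n in (8, 10, 16, 24):
    print(f"{n:2d} bits: {2 ** n - 1:>10,} comparators")
```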
 
arjan de lumens said:
While low-precision (~8-10 bits) A/D can be done fairly quickly at a cost that is not too prohibitive for individual A/D converters, once you get above a certain level, the cost of fast A/D conversion (by 'fast', I mean >1 MHz sample rate) increases more or less exponentially with the number of bits of precision you need. Also, analog circuits generally benefit much less from semiconductor miniaturization than purely digital circuits.
You can always use two 1-bit converters: add a one when the signal gets higher, and a zero when it stays the same or gets lower for the first, and do the opposite for the second. That gives you two bitstreams that together form a digital representation of the waveform, depending on the clockspeed. And you can get a very high speed for those, if you use bipolar transistors.
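If I read that right, it's essentially delta modulation. A toy software sketch of the two-bitstream idea (step size and clock rate are made up, and I track a running estimate rather than the previous sample):

```python
import numpy as np

def encode(signal, step=0.01):
    """Two complementary 1-bit streams: 'up' fires when the tracked value
    needs to rise, 'down' fires when it needs to fall."""
    up, down, estimate = [], [], 0.0
    for sample in signal:
        if sample > estimate:
            up.append(1); down.append(0); estimate += step
        else:
            up.append(0); down.append(1); estimate -= step
    return up, down

def decode(up, down, step=0.01):
    """Rebuild the waveform by integrating the two bitstreams."""
    estimate, out = 0.0, []
    for u, d in zip(up, down):
        estimate += step * (u - d)
        out.append(estimate)
    return np.array(out)

t = np.linspace(0.0, 1.0, 10_000)             # pretend 10 kHz bit clock
signal = 0.5 * np.sin(2 * np.pi * 5 * t)      # slow 5 Hz test waveform
up, down = encode(signal)
print(np.max(np.abs(decode(up, down) - signal)))  # error limited by step size
```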
 
DiGuru said:
You can always use two 1-bit converters: add a one when the signal gets higher, and a zero when it stays the same or gets lower for the first, and do the opposite for the second. That gives you two bitstreams that together form a digital representation of the waveform, depending on the clockspeed. And you can get a very high speed for those, if you use bipolar transistors.
This would give you a mechanism to trade off accuracy versus sample rate and is presumably useful for capturing high-frequency continuous signals where some falloff of fidelity with increasing frequency is acceptable (sounds like a good idea for an SACD-type application). However, an A/D converter of such a nature does not sound like it would be very useful for the applications discussed in this thread.
 
arjan de lumens said:
This would give you a mechanism to trade off accuracy versus sample rate and is presumably useful for capturing high-frequency continuous signals where some falloff of fidelity with increasing frequency is acceptable (sounds like a good idea for an SACD-type application). However, an A/D converter of such a nature does not sound like it would be very useful for the applications discussed in this thread.
Well, things like that have even been used for CD decoding. So it's definitely working.

But anyway, while the first analog computers were something of an in-between hybrid, nowadays they might be a good and valid alternative for a lot of things, like some of the things DSPs are used for.

And while it's a bad match for CMOS chips, all the interfaces are still analog by nature. So you might incorporate some analog circuitry on chips, like a Fourier transform that runs directly on an output signal.

I think the main problem is that analog is lagging significantly behind digital in just about any complex chip nowadays, and the preferred solution to any on-chip function is a digital one.

Broader use of optics for communication might change that.
 
I remember from listening to a prof talk about the VLSI roadmap that as transistors get smaller, SNR gets lower. If that is the case, then the analog portion of the chip will take up relatively more area and become relatively slower (from having to use larger transistors to keep noise down). So once you add in the D/A and A/D converters, you lose all the benefit of going analog.
 
DiGuru said:
Well, things like that have even been used for CD decoding. So it's definitely working.

No one says it doesn't. But something that would perform all the functions of a current gfx chip would probably be as big as a house, suck up 2 MW of energy and be really, really lame-ass slow.
 
Yup. If you want to use an A/D converter to assist 3D transforms in a modern GPU, it would presumably need to be able to sample its input at several hundred megahertz while at the same time maintaining ~24-bit precision for EVERY sample. This is a performance requirement that goes many orders of magnitude beyond even professional/audiophile-class audio A/Ds.
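A quick back-of-envelope with made-up (but hopefully era-plausible) numbers shows the gap:

```python
# Back-of-envelope: A/D throughput needed to digitize transformed vertices,
# versus a high-end audio converter.  The vertex rate is an assumption.
vertex_rate = 300e6     # transformed vertices per second (assumed)
components  = 4         # x, y, z, w per vertex
needed      = vertex_rate * components      # 24-bit samples per second

audio_adc   = 192e3     # sample rate of a pro-audio 24-bit ADC

print(f"needed: {needed / 1e9:.1f} GS/s at 24 bits")
print(f"roughly {needed / audio_adc:,.0f}x a pro-audio converter")
```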
 
I kinda thought it was obvious that we'd develop analogue technology far beyond digital technology at some point. I wouldn't have a clue of the timescale myself, but as humans are analogue, it seems that the most analogous (sorry! :p) technology for us would be similar. Hugely complex for us at this stage of development, though. :)

I'm at work now, but I'll have a look at those links when I get home - thanks, Squeak. :)
 
I stumbled upon this accidentally and remembered this old thread.
https://www.cs.tcd.ie/Michael.Manzke/fyp2004-2005/MuirisWoulfe.pdf

From what I can gather this proposed physics engine is crazy fast. Fifty vehicles with two spring-suspended axles can be simulated 15.27 times faster than real time!
And this is just something a single man dreamt up in a year, starting from scratch. Imagine what could happen if one of the big hardware manufacturers put their weight behind something like that.
We could have a truly dynamic simulated world.
 