Recently it has come to my attention that several people have, in the past, experimented with some success with 3D transformation on analogue equipment:
http://scitation.aip.org/getabs/ser...00018000009000819000001&idtype=cvips&gifs=yes (1948)
Barry, C.D., R.A. Ellis, S.M. Graesser and G.R. Marshall, "Display and Manipulation in Three Dimensions", Pertinent Concepts in Computer Graphics, eds. Faiman, M. and J. Nievergelt, Univ. of Ill. Press, Urbana, Ill., 1969.
The main problems with analogue computers, especially in marketable equipment, have been (apart from poor precision) vulnerability to electrical noise and low flexibility.
But with the advent of analog FPGAs and neural-net research finally maturing (neural nets have much better resistance to noise), I wonder whether this could be a valid way of dramatically speeding up 3D transformation in realtime apps?
Now, of course I’m not suggesting completely analogue 3D hardware, but a hybrid computer where the analog part delivers the first few digits of precision and the digital part “fills in the rest”.
Such a combination has worked well for other applications.
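To make the split concrete, here’s a quick C simulation of roughly what I have in mind (purely my own toy model, not any real hardware: the 8-bit coefficient precision is an assumption, and analog noise is ignored for simplicity). The “analog” stage is modelled as a 3x3 transform with coarsely quantized coefficients; the “digital” stage restores full precision by adding back the residual, which is tiny and so would only need a narrow datapath:

```c
/*
 * Toy model of a hybrid analog/digital 3D transform: the "analog"
 * stage is simulated as a matrix multiply with coefficients quantized
 * to 8 bits, and the "digital" stage fills in the rest by adding the
 * exact residual.  Assumed parameters, not real hardware behaviour.
 */
#include <stdio.h>
#include <math.h>

/* Quantize a coefficient in [-1, 1] to 'bits' of precision, roughly
 * the way a coarse analog multiplier would limit it. */
static double quantize(double v, int bits)
{
    double scale = (double)(1 << (bits - 1));
    return round(v * scale) / scale;
}

int main(void)
{
    double a = 0.52359877559829887;  /* 30 degrees: rotation about Z */
    double M[3][3] = {
        { cos(a), -sin(a), 0.0 },
        { sin(a),  cos(a), 0.0 },
        { 0.0,     0.0,    1.0 }
    };
    double x[3] = { 1.0, 2.0, 3.0 };
    double y_analog[3] = { 0.0 }, y_hybrid[3] = { 0.0 }, y_exact[3] = { 0.0 };

    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 3; j++) {
            double m_a = quantize(M[i][j], 8);  /* coarse "analog" coefficient */
            double r   = M[i][j] - m_a;         /* small residual, |r| <= 2^-8 */
            y_analog[i] += m_a * x[j];
            y_hybrid[i] += m_a * x[j] + r * x[j];  /* "digital" part fills in */
            y_exact[i]  += M[i][j] * x[j];
        }
    }

    for (int i = 0; i < 3; i++)
        printf("exact % .6f   analog-only % .6f   hybrid % .6f\n",
               y_exact[i], y_analog[i], y_hybrid[i]);
    return 0;
}
```

Running it shows the analog-only result off in the third decimal or so, while the hybrid result matches the exact one to within rounding; the point is that the error the digital part has to correct is bounded by the analog precision, so the digital datapath can stay narrow and cheap.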
What’s more, it appears we might have to go in that direction eventually:
http://www.eetimes.com/story/OEG19981103S0017
What do you all think?