From ENIAC to CELL

j^aws

Veteran
Posted on Tue, Jun. 22, 2004

Computer pioneer Goldstine dies at 90

BY GAYLE RONAN SIMS
Knight Ridder Newspapers


(KRT) - Herman Heine Goldstine, 90, the scientist who persuaded the U.S. military to back the development of the first computer, ENIAC, died Wednesday of Parkinson's disease at home in Bryn Mawr.

Dr. Goldstine's part in ENIAC began in 1942, when he enlisted in the Army. The Army sent the accomplished mathematician to the Ballistic Research Laboratory at Aberdeen Proving Ground in Maryland, where he worked on ordnance projects.

In 1943, he came across a memo from University of Pennsylvania scientists J. Presper Eckert and John Mauchly proposing that a calculating machine could be used to determine "firing tables" used to aim artillery. Those tables - the settings used for directing artillery under varied conditions and taking into account such variables as rounds, weather and atmospheric conditions, and distance to target - took hours to calculate. Mr. Goldstine persuaded the Army brass to fund the two young scientists' project, and the computer age was launched.

Dr. Goldstine understood the sophisticated ideas and principles that Mauchly and Eckert employed in developing digital computers that operated with numerical values expressed as digits as opposed to analogs. Intrigued by their proposal, Dr. Goldstine lobbied the brass at Aberdeen Proving Ground in April 1943, requesting $500,000 to pay for the research. He believed the Army, which was already shipping guns overseas without firing tables, was in a big enough jam to put money on a long shot.

Dr. Goldstine made his case to a committee headed by the mathematician Oswald Veblen. Veblen brought his chair forward with a crash, got up, and said to Col. Leslie E. Simon, director of the Army's Ballistics Laboratory: "Simon, give Goldstine the money."

Dr. Goldstine ran the show for the Army, and a team of scientists and engineers was assembled at Penn's Moore School to build the computer, with Mauchly and Eckert supervising. The project was kept secret as a matter of national security.

ENIAC, an electronic computer that could compute a trajectory in one second, was born in 1945. It was enormous. It was 80 feet long and had an 8-foot-high collection of circuits and 18,000 vacuum tubes. ENIAC operated at 100,000 pulses per second.

It took so long to build that the war ended before it could be used for its original purpose - churning out firing tables used to aim artillery.

ENIAC was publicly announced in 1946 on the first floor of the Moore School. In 1947, most of ENIAC was moved to Aberdeen Proving Ground, where it continued to operate until 1955.

Some original parts remain at Penn, in the basement where it was built.

Dr. Goldstine left the military in 1946, and a year later he was appointed associate director for the electronic computer project at the Institute for Advanced Study in Princeton, N.J., where he collaborated with John von Neumann in designing the second generation of computers, EDVAC (Electronic Discrete Variable Automatic Computer). This computer and its successors were put to scientific and industrial uses.

IBM, realizing the computer's potential, hired Dr. Goldstine away from Princeton in 1958. Within two years, the company dominated the computer business. Dr. Goldstine stayed with IBM for 26 years, serving as director of mathematical sciences in research, director of scientific development for data processing, and director of research. He retired in 1984.

IBM established a Herman Goldstine Fellowship in mathematical sciences. In 1985, he was awarded the National Medal of Science for his work in the invention of the computer.

The author of five books, Dr. Goldstine wrote the widely read "The Computer from Pascal to von Neumann."

In retirement, Dr. Goldstine became executive officer of the American Philosophical Society. During his tenure, he oversaw the construction of Benjamin Franklin Hall on South Fifth Street in Philadelphia.

Arlin M. Adams, a retired judge on the U.S. Court of Appeals for the Third Circuit and a longtime friend of Dr. Goldstine's, said: "Although Herman was known as a mathematician, he was very active in attracting foreign visitors to Philadelphia and the Philosophical Society."

Dr. Goldstine was born in Chicago and earned a bachelor's degree in 1933, a master's degree in 1934 and a doctorate in 1936, all in mathematics, from the University of Chicago. He taught at his alma mater and at the University of Michigan before enlisting in the Army. Always a thin man, he stuffed himself with bananas and milk shakes so he could gain enough weight to pass the military medical entrance exam for World War II.

Mr. Goldstine is survived by his wife of 38 years, Ellen Watson; a son, Jonathan; a daughter, Madlen Goldstine Simon; and four grandchildren. His first wife, Adele Katz, a mathematician who wrote a manual explaining how to program ENIAC, died in 1964 after 23 years of marriage.

A ceremony celebrating Dr. Goldstine's life will be held at the American Philosophical Society in the fall.

Memorial donations may be made to the Herman H. Goldstine Memorial Fund, American Philosophical Society, 104 S. Fifth St., Philadelphia 19106.

Source

Firstly, deepest condolences... :(

From 18K valves to 1 billion transistors...
From 100 kHz to 4 GHz...
From 80 ft x 8 ft to the size of a console...

The last 60 years sure have seen some change, from ENIAC to CELL... What can we realistically expect in the next 60 years for CPU/GPU/RAM/media in consoles? :?:
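
Just for fun, here's a quick back-of-the-envelope in Python on those numbers; the 18K valves, 1 billion transistors, 100 kHz and 4 GHz figures above are only ballpark, so treat the output the same way:

Code:
import math

# Rough growth from ENIAC (1945) to a ~2005-era chip, using the
# ballpark figures quoted above.
years = 60
switch_growth = 1_000_000_000 / 18_000   # valves -> transistors
clock_growth = 4.0e9 / 100e3             # 100 kHz -> 4 GHz

for name, factor in [("switching elements", switch_growth),
                     ("clock rate", clock_growth)]:
    doubling_years = years * math.log(2) / math.log(factor)
    print(f"{name}: about {factor:,.0f}x overall, "
          f"doubling roughly every {doubling_years:.1f} years")

Both work out to a doubling roughly every four years on these crude numbers, which probably says more about how rough the comparison is than about Moore's Law.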
 
In this industry it is not a good idea to predict what will happen even 5 years from now, let alone 60 years. As such, I won't.

Of interest is that ENIAC actually wasn't the world's first electronic computer. During WW2 the British developed their own machine, called 'Colossus', built in 1943 and used for cracking German codes. While it had no electronic storage for the program it ran, the intercepted data was fed in from a punched paper tape spliced into a loop and run on a motor past a light sensor that detected each hole; the faster you could get the tape moving, the faster the system would run (some rough numbers at the end of this post).

The reason you haven't heard much about this before is that it was built by the British secret service, and detailed information about the machine was only released a few years ago! The machines were destroyed and everyone involved had to sign secrecy agreements, so nobody would find out.

As it stands, Colossus was first, but ENIAC was the first known to the public, and all of its technology was developed without knowledge of Colossus, so it still stands as one of the greatest achievements in modern engineering.
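
To put a rough number on that tape-speed point: the read rate usually quoted for Colossus is about 5,000 characters per second, and teleprinter characters were 5 bits each; those figures, and the message length below, are assumptions for the sake of the arithmetic rather than anything from the article above.

Code:
# Rough throughput of Colossus's optically read paper-tape loop,
# assuming the commonly quoted ~5,000 characters/second read rate
# and 5-bit teleprinter characters.
chars_per_second = 5000
bits_per_char = 5
print(f"raw read rate: {chars_per_second * bits_per_char} bits/s")

# A hypothetical intercepted message of 25,000 characters spliced
# into a loop would pass the reader once in:
message_chars = 25_000
print(f"one pass over the loop: {message_chars / chars_per_second:.0f} s")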
 
Jabjabs said:
Of interest is that ENIAC actually wasn't the world's first electronic computer. During WW2 the British developed their own machine, called 'Colossus'. [...]

As it stands, Colossus was first, but ENIAC was the first known to the public, and all of its technology was developed without knowledge of Colossus, so it still stands as one of the greatest achievements in modern engineering.

Yeah, I've heard of Colossus. What amazing technology.
 
Nanotechnology. Although I think the shrinking limits of the materials currently used will become a problem shortly. The next step is finding other materials that can be shrunk to smaller sizes than silicon ever will be.
 
In other words, secrecy killed the British edge in science and technology and handed it to the more transparent Americans.
 
Druga Runda said:
In other words, secrecy killed the British edge in science and technology and handed it to the more transparent Americans.

Maybe that's being a bit overdramatic about it, since ENIAC is a bit more advanced than paper tape spinning around in a machine, but there might be a kernel of truth in there. The same thing happened with jet engines, after all.
 
Jabjabs said:
Of interest is that ENIAC actually wasn't the world's first electronic computer. During WW2 the British developed their own machine, called 'Colossus'.
The Colossus used relays AFAIR, and the spinning paper tape you mentioned. So it was really very much an electromechanical computer, rather than the purer electronic, flip-flop-based static electronics of the ENIAC.
But even before Colossus, there were Konrad Zuse's electromechanical computers from the thirties and forties. http://en.wikipedia.org/wiki/Zuse
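
As an aside, here's a tiny Python sketch of what "flip-flop based" storage means, i.e. why a pair of cross-coupled gates can hold a bit with no moving parts. It's purely illustrative and isn't meant to describe how ENIAC's counters were actually wired:

Code:
# Minimal SR latch built from two cross-coupled NOR gates: once set,
# the feedback loop keeps holding the stored bit after the inputs drop.
def sr_latch(s, r, q=0, qbar=1):
    for _ in range(4):  # iterate the two gates until the outputs settle
        q, qbar = int(not (r or qbar)), int(not (s or q))
    return q, qbar

q, qbar = sr_latch(s=1, r=0)                  # pulse "set"
print(q, qbar)                                # -> 1 0
q, qbar = sr_latch(s=0, r=0, q=q, qbar=qbar)  # inputs released
print(q, qbar)                                # -> 1 0, the bit is remembered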
 
Colossus could crack codes as quickly as modern computers... could ENIAC?

Colossus used valves, though it probably used relays as well...

-dave-
 
davefb said:
Colossus could crack codes as quickly as modern computers... could ENIAC?

Colossus used valves, though it probably used relays as well...

-dave-
http://en.wikipedia.org/wiki/Colossus_computer
Well I'll be... You’re right!
I don't know why I wrote "AFAIR" when I have the whole internet right in front of me? :)
As fast as a modern computer! Who would have thought that?
What an amazing design.
All the early computers had some mechanical parts in them I guess.
 
4004 vs Cell


Intel 4004
1971
4-bit CPU
2,250 transistors
108 kHz
46 instructions
10-micrometer process (10,000 nm)
12 mm² die



[Image: intel-4004.jpg]
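
For the Cell side of the comparison, here's a quick ratio calc in Python. The Cell figures (roughly 234 million transistors, 3.2 GHz, a 90 nm process and a ~221 mm² die) are the numbers widely reported for the first-generation chip, so take them as approximate:

Code:
# Intel 4004 (1971) vs. first-generation Cell, using the 4004 figures
# above and widely reported (approximate) Cell figures.
i4004 = {"transistors": 2_250, "clock_hz": 108e3,
         "process_nm": 10_000, "die_mm2": 12}
cell  = {"transistors": 234e6, "clock_hz": 3.2e9,
         "process_nm": 90,     "die_mm2": 221}

for key in i4004:
    ratio = cell[key] / i4004[key]
    if ratio >= 1:
        print(f"{key}: roughly {ratio:,.0f}x the 4004")
    else:
        print(f"{key}: roughly 1/{1 / ratio:,.0f} of the 4004")

Roughly 100,000x the transistor count in 34 years works out to a doubling about every two years, which is more or less the classic Moore's Law pace.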
 
Guden Oden said:
davefb said:
Colossus could crack codes as quickly as modern computers...

:oops: Surely you must jest? How on EARTH would THAT be possible?
If the design is dedicated and parallel enough, and the task simple, very high speeds can be reached.
For example, you could easily build an analog circuit that could calculate the planets' orbits around the sun as fast as, or faster than, any electronic computer.
 
Squeak said:
If the design is dedicated and parallel enough, and the task simple, very high speeds can be reached.

But NOT with paper tape and relays! That's just a preposterous suggestion.
 
Guden Oden said:
Squeak said:
If the design is dedicated and parallel enough, and the task simple, very high speeds can be reached.

But NOT with paper tape and relays! That's just a preposterous suggestion.

Well Tony Sale is saying so, right there in the Wikipedia article I linked to. Seems like a pretty trustworthy person.

The Colossus was efficient for its purpose. Even in 2004, Tony Sale notes that "Colossus is so fast and parallel that a modern PC programmed to do the same code-breaking task takes as long as Colossus to achieve a result!".
 
Guden Oden said:
I didn't think it would be possible to link Cell with bananas and milkshakes, but Jaws did it! :oops:
:LOL: I'm still working on a Cell link to PS3! ;)

Something for console memory tech in the next 10 years?

IBM, Infineon advance magnetic-memory prototype
Last modified: June 23, 2004, 12:15 PM PDT

By Michael Kanellos
Staff Writer, CNET News.com

          

IBM and German memory maker Infineon unveiled a prototype 16-megabit magnetic-memory chip this week as the struggle to establish new standards of memory rolls on.

The two companies presented the MRAM, or magnetic random access memory, chip at the Very Large Scale Integration Circuits Symposium in Hawaii on Wednesday. Unlike today's mainstream computer memory, MRAM relies on magnetism rather than an electrical charge to store data.

MRAM could significantly advance the state of memory technology, at least according to proponents. Like flash memory, MRAM continues to store data even after its host computer is turned off. It also retrieves data rapidly and can theoretically last forever.

Last June, at the same conference, IBM and Infineon published a paper describing how the companies produced an MRAM chip on the 180-nanometer manufacturing process that held 128 kilobits of data.

At the time, the two companies promised to more fully demonstrate MRAM in early 2004 and predicted that MRAM could go into commercial production by 2005. That deadline, however, has become a tad amorphous. In announcing its prototype, Infineon said MRAM could potentially enter many segments of the memory market "in a few years."

Memory, particularly flash, is expected by many to undergo substantial architectural transformations over the next decade or so because of the technical complexity and cost involved in improving contemporary memory technology a la Moore's Law. Simply put, it's just becoming too expensive and hard to shrink memory chips and add transistors.

MRAM, however, is only one of several contending ideas. Nanochip, a start-up that has received funding from Microsoft, is proposing a flash replacement in which tiny actuators heat microscopic points on a substrate, which can then be read as 1s or 0s. Motorola, meanwhile, says it will incorporate silicon nanocrystals into flash memory, allowing the company to continue to shrink flash chips. ZettaCore, a start-up with several prominent backers, touts designer molecules.

Naturally, each of these companies can explain the virtues of their own approach and problems with the competing ideas. But, as the vague deadline for MRAM's entry into the market indicates, displacing existing technology won't be easy because of the risks and costs involved.

"It is a conservative industry," said Randy Levine, CEO of ZettaCore, "But people are talking about it (the transformation). I don't know a major memory vendor that isn't."

Source...more sublinks here
 