That figure is WAY off. 10^8.214 maybe.

zifnab said: Brain power cannot be calculated in terms of flops because the brain doesn't work that way. Memory capacity has however been estimated and it is thought to be something like 10^8214 bits (I cannot remember the precise number).
Deepak said: So, in layman's language, a genius has a P4/AMD 64 and an average person a 386; a genius has DDR3 and an average person SDRAM.
blakjedi said:layman = 1xCD
genius = L1 cache
And you think our memory can be that easily related to bytes? It doesn't quite work that way with retrieving or storing information. We memorise things at a higher level, like having a set of pictures, but those pictures again are rather a set of information about what's on that picture than an exact representation.

zifnab said: Brain power cannot be calculated in terms of flops because the brain doesn't work that way. Memory capacity has however been estimated and it is thought to be something like 10^8214 bits (I cannot remember the precise number).
I managed to dig up the paper on it. It was published in Brain and Mind, and the group behind the result has calculated the total memory capacity to be on the order of 10^8432 bits (here's a link to a copy of the paper: http://www.enel.ucalgary.ca/IJCINI/ICfCI/JB&M-Vol4-No2-HMC.pdf). Given that we possess approx. 10^11 neurons, you are going to get a much higher number than your proposed figure.

bloodbob said: That figure is WAY off. 10^8.214 maybe.
Please see what I wrote to bloodbob above. While the figure is an estimate, yes, you can certainly equate your memory to bytes so long as you accept you are dealing with an estimate. Each individual neuron in your brain transmits between approx. 50 and 1600 bits/s of information (depending on the neuron). While this isn't the type of approach the people behind the 10^8432 result took, I think this can give you a sense of how memory capacity can be broken down into such figures. Certainly memories work in a complicated fashion, but if we are estimating the total potential capacity of the human brain, and by that we are looking at all the information stored in the brain regardless of what it relates to, then it is a fair estimate. But it is of course an estimate.

Npl said: And you think our memory can be that easily related to bytes? It doesn't quite work that way with retrieving or storing information. We memorise things at a higher level, like having a set of pictures, but those pictures again are rather a set of information about what's on that picture than an exact representation.
It's quite similar to the "FLOPS" of the brain: you won't be able to use a hex editor and remember all the bytes of a 1 KB file. That's not to say you can't store more information than that.
Multiplying two floats typically takes a few minutes, but at the same time you're picking up two separate 2D images with an additional "distance value", and your brain easily works with them as 3D information.
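Purely for scale, here is a hedged back-of-the-envelope sketch of the per-neuron signalling figure mentioned a couple of posts up. The neuron count and bit rates are the thread's rough numbers, and this measures signalling throughput, not stored capacity:

```python
# Back-of-the-envelope check of the per-neuron signalling figures quoted above.
# Assumptions (from the thread): ~10^11 neurons, each transmitting roughly
# 50-1600 bits/s. This is throughput, not storage capacity.

NEURONS = 1e11
LOW_RATE, HIGH_RATE = 50, 1600   # bits/s per neuron, as quoted

print(f"aggregate: {NEURONS * LOW_RATE:.1e} to {NEURONS * HIGH_RATE:.1e} bits/s")
# -> roughly 5.0e+12 to 1.6e+14 bits/s
```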
Deepak said:Who is eDRAM?
zifnab said: I managed to dig up the paper on it. It was published in Brain and Mind, and the group behind the result has calculated the total memory capacity to be on the order of 10^8432 bits (here's a link to a copy of the paper: http://www.enel.ucalgary.ca/IJCINI/ICfCI/JB&M-Vol4-No2-HMC.pdf). Given that we possess approx. 10^11 neurons, you are going to get a much higher number than your proposed figure.
How did you come to that conclusion?

pcchen said: This number is so wrong because there are not enough atoms (or subatomic particles) in the known universe for storing so much data.
If you look at the number of electrons alone, we have seven different energy levels with a respective maximum number of electrons of 2*n^2 (if memory serves me right). So we have 2, 8, 18, 32, 50, 72, 98 as the maximum number of electrons per energy level. The combinatory possibilities here are 2*8*18*...*98 = 3.25*10^9 (approx.), which is about 32 bits. That's still a small drop in the bucket for the numbers we're trying to reach, but then that's barely scratching the surface. I mention it as an example of the things you can look at. There is an ocean of information if you look at other aspects like the positioning of the electrons relative to one another, not to mention various aspects in relation to the neutrons and protons in the atom as well as the quarks that they consist of. The amount of information capacity in an atom is extremely high given that you take such things into account. No doubt greater than 10^8432 bits.

pcchen said: The estimated number of atoms in the known universe is less than 10^100 (actually around 10^80, but we can have a safe margin). Suppose that we can store, say, 10 bits per atom (which is unlikely); we still have "only" 10^101 bits.
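As a quick check of the shell arithmetic above (purely illustrative, using the 2*n^2 shell capacities stated in the post):

```python
from math import log2, prod

# Maximum electrons per energy level n is 2*n^2, for n = 1..7 (as stated above)
shells = [2 * n**2 for n in range(1, 8)]   # [2, 8, 18, 32, 50, 72, 98]

combos = prod(shells)                      # 2*8*18*...*98
print(shells)
print(f"{combos:,} combinations ~= {log2(combos):.1f} bits")
# -> 3,251,404,800 combinations, about 31.6 bits
```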
They have calculated this number by looking at the total number of possible combinations that can be made between the neurons, given that there are 10^11 neurons with each neuron on average having 10^3 connections. In practice you naturally don't have a brain that's dynamic to the extent this might imply, but there are constantly connections dying off as well as new ones being formed.

pcchen said: In the paper, it's argued that the relation between neurons is the basis of memory in the human brain; that's basically correct. However, the computation in the paper is very strange. I don't understand why 10^11 neurons with about 1k connections each can store 10^8432 bits. Suppose that each neuron can store n bits, and each connection can have m bits. Now, each connection can connect to any of the other ~10^11 neurons, which also takes about 30 bits to encode (log 10^11). Combining these (note that many duplications are not considered), we have 10^11 * (n + 10^3 * (m + 30)) = 10^11 * n + 10^14 * m + 30 * 10^14. That's far from the 10^8432 bits number from the paper.
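For scale, here is a tiny sketch of pcchen's counting argument. The per-neuron and per-connection bit counts n and m are placeholder guesses, not values from the paper:

```python
NEURONS = 1e11        # ~10^11 neurons (thread's figure)
CONNECTIONS = 1e3     # ~10^3 connections per neuron (thread's figure)
ADDR_BITS = 30        # ~bits to name a target neuron, as in the post above

def capacity(n_bits_per_neuron, m_bits_per_connection):
    # pcchen's estimate: 10^11 * (n + 10^3 * (m + 30))
    return NEURONS * (n_bits_per_neuron
                      + CONNECTIONS * (m_bits_per_connection + ADDR_BITS))

# Even with generous placeholder values the total stays around 10^15 bits,
# nowhere near 10^8432.
for n, m in [(1, 1), (8, 8), (64, 64)]:
    print(n, m, f"{capacity(n, m):.2e} bits")
```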
You have to take into account how many connections you have per neuron. You can then express it as a combinatorial problem as they have done, which gives you n!/(m!(n-m)!), with n = number of elements (neurons) and m = number of connections.

pcchen said: A simpler way to see this: suppose that every neuron can connect to any other neuron, and there's no limitation on the number of connections per neuron. Then we have n * (n - 1) / 2 connections, which is about 10^22 connections. Still far from 10^8432 bits.
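Incidentally, plugging the thread's round numbers into that binomial coefficient does land in the neighbourhood of the paper's figure. Here is a rough sketch (10^11 neurons and 10^3 connections per neuron are the approximations used above; whether counting connection patterns this way says anything about memory is exactly what's in dispute):

```python
from math import lgamma, log

def log10_binomial(n, k):
    # log10 of C(n, k) = n! / (k! * (n-k)!), via log-gamma to avoid overflow
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)) / log(10)

n = 1e11   # ~number of neurons
k = 1e3    # ~connections per neuron

print(f"log10 C(10^11, 10^3) ~= {log10_binomial(n, k):.0f}")    # ~8432
print(f"all neuron pairs: n*(n-1)/2 ~= {n * (n - 1) / 2:.1e}")  # ~5e21
```

Read that way, 10^8432 is a count of possible connection patterns (states) rather than a number of bits, which is the distinction Npl raises further down.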
zifnab said: I managed to dig up the paper on it. It was published in Brain and Mind, and the group behind the result has calculated the total memory capacity to be on the order of 10^8432 bits (here's a link to a copy of the paper: http://www.enel.ucalgary.ca/IJCINI/ICfCI/JB&M-Vol4-No2-HMC.pdf). Given that we possess approx. 10^11 neurons, you are going to get a much higher number than your proposed figure.

Please see what I wrote to bloodbob above. While the figure is an estimate, yes, you can certainly equate your memory to bytes so long as you accept you are dealing with an estimate. Each individual neuron in your brain transmits between approx. 50 and 1600 bits/s of information (depending on the neuron). While this isn't the type of approach the people behind the 10^8432 result took, I think this can give you a sense of how memory capacity can be broken down into such figures. Certainly memories work in a complicated fashion, but if we are estimating the total potential capacity of the human brain, and by that we are looking at all the information stored in the brain regardless of what it relates to, then it is a fair estimate. But it is of course an estimate.
I suppose the right amount of beer might do the trick.

Deepak said: How can I defrag my memory?
I think so, but it's a long way off IMO. IBM are presently working on a big project that tries to do that for a small (relatively speaking) number of neurons working together. A lot of the low-level stuff is understood, such as the neural signalling that is taking place. How that signalling gets decoded, however, is still an unknown, although there are lots of ideas on it. How the neurons should be pieced together is, for example, also an extremely difficult question given the number of neurons that we have, but there's a lot of research going on and it's all progressing.

Deepak said: Seriously, will we ever be able to emulate the human brain and memory structure?
radeonic2 said:so how long till they can emulate a female brain
AlphaWolf said:I can assure you that by the time they do, she will have changed her mind.
zifnab said: They have calculated this number by looking at the total number of possible combinations that can be made between the neurons, given that there are 10^11 neurons with each neuron on average having 10^3 connections. In practice you naturally don't have a brain that's dynamic to the extent this might imply, but there are constantly connections dying off as well as new ones being formed.
Npl said: And how is this different from estimating your brain's speed in flops? You can't store bits in your brain; it simply doesn't work that way - point.

From what I've read of the article, they assume the relations between neurons represent memory (quote: "The new metaphor perceives that memory and knowledge are represented by the connections between neurons in the brain, rather than the neurons themselves as information containers."). I don't have knowledge on that matter, so I'm not going to argue about that. But...

They then compute an estimation of the number of different states - that 10^8432 number - and suddenly speak of 10^8432 bits, which is orders of magnitude more. 10^8432 states would relate to ld(10)*8432 bits, in other words about 28010 bits.
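A one-liner to check that states-to-bits conversion (ld meaning log base 2):

```python
from math import log2

states_exponent = 8432             # the paper's figure: ~10^8432 possible states
bits = states_exponent * log2(10)  # bits needed to index one of 10^8432 states
print(f"{bits:.0f} bits")          # -> about 28010 bits
```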
pcchen said:Actually, I think they forgot to do a log (as Npl said). But since the number doesn't add up, I don't know.