Has anyone measured the speed of the human brain (in FLOPS)?

Brain power cannot be calculated in terms of FLOPS because the brain doesn't work that way. Memory capacity has, however, been estimated, and it is thought to be something like 10^8214 bits (I cannot remember the precise number).
 
zifnab said:
Brain power cannot be calculated in terms of FLOPS because the brain doesn't work that way. Memory capacity has, however, been estimated, and it is thought to be something like 10^8214 bits (I cannot remember the precise number).
That figure is WAY off. 10^8.214 maybe.
 
zifnab said:
Brain power cannot be calculated in terms of FLOPS because the brain doesn't work that way. Memory capacity has, however, been estimated, and it is thought to be something like 10^8214 bits (I cannot remember the precise number).
And you think our memory can be that easily related to bytes? It doesn't quite work that way, either for retrieving or for storing information. We memorise things at a higher level, like having a set of pictures, but those pictures in turn are more a set of information about what's in the picture than an exact representation.
It's quite similar to the "FLOPS" of the brain: you won't be able to open a 1 KB file in a hex editor and remember all of its bytes. That's not to say you can't store more information than that overall.
Multiplying two floats typically takes you a few minutes, yet at the same time you are picking up two separate 2D images plus a distance value, and your brain effortlessly works with them as 3D information.
 
bloodbob said:
That figure is WAY off. 10^8.214 maybe.
I managed to dig up the paper on it. It was published in Brain and Mind, and the group behind the result calculated the total memory capacity to be on the order of 10^8432 bits (here's a link to a copy of the paper: http://www.enel.ucalgary.ca/IJCINI/ICfCI/JB&M-Vol4-No2-HMC.pdf). Given that we possess approx. 10^11 neurons, you are going to get a much higher number than your proposed figure.

Npl said:
And you think our memory can be that easily related to bytes? It doesn't quite work that way, either for retrieving or for storing information. We memorise things at a higher level, like having a set of pictures, but those pictures in turn are more a set of information about what's in the picture than an exact representation.
It's quite similar to the "FLOPS" of the brain: you won't be able to open a 1 KB file in a hex editor and remember all of its bytes. That's not to say you can't store more information than that overall.
Multiplying two floats typically takes you a few minutes, yet at the same time you are picking up two separate 2D images plus a distance value, and your brain effortlessly works with them as 3D information.
Please see what I wrote to bloodbob above. While the figure is an estimate, yes, you can certainly equate your memory to bytes, so long as you accept you are dealing with an estimate. Each individual neuron in your brain transmits approx. 50-1600 bits/s of information (depending on the neuron). While this isn't the approach the people behind the 10^8432 result took, I think it gives you a sense of how memory capacity can be broken down into such figures. Certainly memories work in a complicated fashion, but if we are estimating the total potential capacity of the human brain, that is, all the information stored in the brain regardless of what it relates to, then it is a fair estimate. But it is of course an estimate.
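To give a rough sense of what those per-neuron rates add up to (a back-of-envelope sketch only, not the method the paper used):
Code:
# Back-of-envelope only: aggregate signalling rate across the brain,
# assuming 10^11 neurons at the 50-1600 bits/s quoted above.
neurons = 10**11
low, high = 50, 1600  # bits/s per neuron (rough estimate)

print(f"{neurons * low:.1e} to {neurons * high:.1e} bits/s")
# 5.0e+12 to 1.6e+14 bits/s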
 
zifnab said:
I managed to dig up the paper on it. It was published in Brain and Mind, and the group behind the result calculated the total memory capacity to be on the order of 10^8432 bits (here's a link to a copy of the paper: http://www.enel.ucalgary.ca/IJCINI/ICfCI/JB&M-Vol4-No2-HMC.pdf). Given that we possess approx. 10^11 neurons, you are going to get a much higher number than your proposed figure.

That number has to be wrong, because there aren't enough atoms (or even subatomic particles) in the known universe to store that much data.
 
The estimated number of atoms in the known universe is less than 10^100 (actually around 10^80, but let's allow a safe margin). Suppose we can store, say, 10 bits per atom (which is unlikely); we still have "only" 10^101 bits.

In the paper it's argued that the relations between neurons are the basis of memory in the human brain; that's basically correct. However, the computation in the paper is very strange. I don't understand how 10^11 neurons with about 1k connections each can store 10^8432 bits. Suppose each neuron can store n bits and each connection carries m bits. Each connection can also point to any of the 10^11 neurons, which takes about 37 bits to address (log2 of 10^11). Combining these (and note that many duplications are not even removed), we have 10^11 * (n + 10^3 * (m + 37)) = 10^11 n + 10^14 m + 37 * 10^14. That's nowhere near the paper's figure of 10^8432 bits.

A simpler way to see this: suppose every neuron can connect to any other neuron, with no limit on the number of connections per neuron. Then we have n * (n - 1) / 2 possible connections, which for n = 10^11 is about 10^22. Even with one on/off bit per possible connection, that's about 10^22 bits, still nowhere near 10^8432.
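For anyone who wants to check the arithmetic, here are both bounds in a few lines of Python (n and m are arbitrary placeholder values, since nobody knows the real per-neuron and per-connection capacities):
Code:
import math

NEURONS = 10**11
CONNS_PER_NEURON = 10**3

# Addressing one neuron out of 10^11 takes log2(10^11) bits.
addr_bits = math.log2(NEURONS)  # ~36.5, i.e. about 37 bits

# n and m are placeholder guesses, just to show the order of magnitude.
n, m = 100, 100
total = NEURONS * (n + CONNS_PER_NEURON * (m + addr_bits))
print(f"per-neuron bound: ~10^{math.log10(total):.0f} bits")  # ~10^16

# Simpler bound: one on/off bit for every possible neuron pair.
pairs = NEURONS * (NEURONS - 1) // 2
print(f"pairwise bound:   ~10^{math.log10(pairs):.0f} bits")  # ~10^22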
 
How can I defrag my memory? ;) Seriously, will we ever be able to emulate the human brain and its memory structure?
 
pcchen said:
The estimated number of atoms in the known universe is less than 10^100 (actually around 10^80, but let's allow a safe margin). Suppose we can store, say, 10 bits per atom (which is unlikely); we still have "only" 10^101 bits.
If you look at the electrons alone, there are seven energy levels, each with a maximum of 2*n^2 electrons (if memory serves me right). So we have 2, 8, 18, 32, 50, 72 and 98 as the maximum numbers of electrons per energy level. The combinatorial possibilities here are 2*8*18*...*98 = 3.25*10^9 (approx.), which is about 32 bits. That's still a small drop in the bucket for the numbers we're trying to reach, but it barely scratches the surface; I mention it as an example of the things you can look at. There is an ocean of information if you look at other aspects, like the positions of the electrons relative to one another, not to mention various aspects of the neutrons and protons in the atom, as well as the quarks they consist of. The information capacity of an atom is extremely high once you take such things into account. No doubt greater than 10^8432 bits.
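A quick check of that figure:
Code:
import math

# Maximum electrons per energy level: 2*n^2 for n = 1..7
levels = [2 * n**2 for n in range(1, 8)]  # [2, 8, 18, 32, 50, 72, 98]
combos = math.prod(levels)
print(combos, math.log2(combos))  # 3251404800 ~ 2^31.6, i.e. ~32 bits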

pcchen said:
In the paper it's argued that the relations between neurons are the basis of memory in the human brain; that's basically correct. However, the computation in the paper is very strange. I don't understand how 10^11 neurons with about 1k connections each can store 10^8432 bits. Suppose each neuron can store n bits and each connection carries m bits. Each connection can also point to any of the 10^11 neurons, which takes about 37 bits to address (log2 of 10^11). Combining these (and note that many duplications are not even removed), we have 10^11 * (n + 10^3 * (m + 37)) = 10^11 n + 10^14 m + 37 * 10^14. That's nowhere near the paper's figure of 10^8432 bits.
They calculated this number by looking at the total number of possible combinations that can be made between the neurons, given that there are 10^11 neurons with each neuron having on average 10^3 connections. In practice your brain naturally isn't dynamic to the extent this might imply, but connections are constantly dying off and new ones are being formed.

pcchen said:
A simpler way to see this: suppose every neuron can connect to any other neuron, with no limit on the number of connections per neuron. Then we have n * (n - 1) / 2 possible connections, which for n = 10^11 is about 10^22. Even with one on/off bit per possible connection, that's about 10^22 bits, still nowhere near 10^8432.
You have to take into account how many connections you have per neuron. You can then express it as a combinatorial problem, as they have done, which gives you the binomial coefficient: C(n, m) = n! / (m! (n - m)!), where n = number of elements (neurons) and m = number of connections.
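You can check that this reproduces their figure (the coefficient is far too large to compute directly, so I take logs via the gamma function):
Code:
import math

n = 10**11  # neurons
m = 10**3   # connections per neuron

# log10 C(n, m) = (lgamma(n+1) - lgamma(m+1) - lgamma(n-m+1)) / ln(10)
log10_C = (math.lgamma(n + 1) - math.lgamma(m + 1)
           - math.lgamma(n - m + 1)) / math.log(10)
print(f"C(10^11, 10^3) ~ 10^{log10_C:.0f}")  # ~10^8432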
 
zifnab said:
I managed to dig up the paper on it. It was published in Brain and Mind, and the group behind the result calculated the total memory capacity to be on the order of 10^8432 bits (here's a link to a copy of the paper: http://www.enel.ucalgary.ca/IJCINI/ICfCI/JB&M-Vol4-No2-HMC.pdf). Given that we possess approx. 10^11 neurons, you are going to get a much higher number than your proposed figure.


Please see what I wrote to bloodbob above. While the figure is an estimate, yes, you can certainly equate your memory to bytes, so long as you accept you are dealing with an estimate. Each individual neuron in your brain transmits approx. 50-1600 bits/s of information (depending on the neuron). While this isn't the approach the people behind the 10^8432 result took, I think it gives you a sense of how memory capacity can be broken down into such figures. Certainly memories work in a complicated fashion, but if we are estimating the total potential capacity of the human brain, that is, all the information stored in the brain regardless of what it relates to, then it is a fair estimate. But it is of course an estimate.

And how is this different from estimating your brain's speed in FLOPS? You can't store bits in your brain; it simply doesn't work that way. Period.
From what I've read of the article, they assume the relations between neurons represent memory (quote: "The new metaphor perceives that memory and knowledge are represented by the connections between neurons in the brain, rather than the neurons themselves as information containers."). I don't have knowledge on that matter, so I'm not going to argue about it. But...
They then compute an estimate of the number of different states - that 10^8432 number - and suddenly speak of 10^8432 bits, which is orders of magnitude more. 10^8432 states would correspond to ld(10) * 8432 bits, in other words about 28010 bits.
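Or in Python:
Code:
import math

# The paper counts ~10^8432 distinct connection patterns (states).
# Encoding which state you are in takes log2(10^8432) bits:
print(8432 * math.log2(10))  # ~28010.5 bits, not 10^8432 bits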
 
Deepak said:
How can I defrag my memory? ;)
I suppose the right amount of beer might do the trick :)

Deepak said:
Seriously, will we ever be able to emulate the human brain and its memory structure?
I think so, but it's a long way off, imo. IBM is presently working on a big project that tries to do just that for a (relatively speaking) small number of neurons working together. A lot of the low-level machinery is understood, such as the neural signalling that takes place. Exactly how that signalling gets decoded, however, is still unknown, although there are lots of ideas about it. How the neurons should be pieced together is, for example, also an extremely difficult question given the number of neurons that we have, but there's a lot of research going on and it's all progressing.
 
zifnab said:
They calculated this number by looking at the total number of possible combinations that can be made between the neurons, given that there are 10^11 neurons with each neuron having on average 10^3 connections. In practice your brain naturally isn't dynamic to the extent this might imply, but connections are constantly dying off and new ones are being formed.

Actually, I think they forgot to take a log (as Npl said). But since the numbers still don't add up, I don't know.
 
Npl said:
And how is this different from estimating your brain's speed in FLOPS? You can't store bits in your brain; it simply doesn't work that way. Period.
From what I've read of the article, they assume the relations between neurons represent memory (quote: "The new metaphor perceives that memory and knowledge are represented by the connections between neurons in the brain, rather than the neurons themselves as information containers."). I don't have knowledge on that matter, so I'm not going to argue about it. But...
They then compute an estimate of the number of different states - that 10^8432 number - and suddenly speak of 10^8432 bits, which is orders of magnitude more. 10^8432 states would correspond to ld(10) * 8432 bits, in other words about 28010 bits.

pcchen said:
Actually, I think they forgot to take a log (as Npl said). But since the numbers still don't add up, I don't know.

I understand your points, and I'll try to explain how and why I believe it is a valid result, which to a large extent is a question of representation. I think we should bear in mind that this paper had several people working on it and has been published in a journal as well as at a conference, and I would be surprised if the whole basis of their paper were so wrong. As I said, though, I can understand where you are coming from, and I'll see if I can explain this.

In their picture of things, there either is a connection between one neuron and another, or there isn't. So they calculate the binomial coefficients, which give them the unordered number of combinations. This is essentially the number of different scenarios with respect to which neurons have connections (as opposed to exactly which neurons they are connected to). Thus they have 10^8432 different possible ways of saying which neurons have connections and which don't. I'll call these scenarios.

Their reason for saying this is equivalent to bits rests partly on their argument at the beginning of the paper. As they say, memory is dynamic, and they are essentially assuming that you can have up to all of these possible scenarios, even though your brain naturally only has one scenario "running" at a time. But, they are saying, since the brain is dynamic you could potentially have the others. Say you had 6 possible scenarios. You could then, for example, use 000001 to express which scenario was actually running. That naturally wouldn't give you 6 bits of information. But if you instead looked at the possible scenarios you could have for your memories at a given moment (for example 001011), then you would have 6 bits. So your memories are, in other words, expressed by the 10^8432 bits that depict the possible neural structures. This is perhaps a strange way of looking at it, but they are after memory capacity, which essentially means decoding the neural structure in a way that maximises the total memory.
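To put my 6-scenario example in code (this is only my reading of their argument):
Code:
import math

scenarios = 6  # toy example from above

# Naming which ONE scenario is running: an index, ceil(log2(6)) bits.
index_bits = math.ceil(math.log2(scenarios))  # 3 bits, e.g. 001

# Their reading: one flag per possible scenario, e.g. 001011 marking
# which structures your memories could occupy.
flag_bits = scenarios  # 6 bits

print(index_bits, flag_bits)  # 3 6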
 
I don't think so. Let's consider a simpler situation. Suppose we have only 10 neurons. If any neuron can connect to any other neuron, how many connections can we have? The answer is 10 * 9 / 2 = 45. That is, if each connection can be either on or off, we have 45 bits.

Now assume an even more restrictive situation: each neuron can only have 5 connections. This is more restrictive, because in the example above each neuron can connect to any number of neurons from 0 to 9. In the paper this case is calculated as C(10, 5) = 252, and they claim it can be used to store 252 bits! How is that possible?
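In code:
Code:
import math

# 10 neurons, every pair may or may not be connected:
pairs = 10 * 9 // 2  # 45 possible connections -> at most 45 bits

# The paper's counting for "each neuron has exactly 5 connections":
patterns = math.comb(10, 5)  # 252 distinct patterns
bits = math.log2(patterns)   # ~7.98 bits, not 252 bits
print(pairs, patterns, bits)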
 