Has anyone measured the speed of the human brain (in FLOPS)?

You guys can argue or discuss how "powerful" our brain is, but in the end it's all speculation. We don't even know exactly how information is stored in the brain, so let's not jump ship and try to calculate "GFLOPS", which in "brain terms" means absolutely nothing. We don't work with numbers; we work with ideas, chemical balances, and God only knows what else, so counting bits and FLOPS is rather useless in my very humble opinion.
 
pcchen said:
I don't think so. Let's consider a simpler situation. Suppose we have only 10 neurons. If any neuron can connect to any other neuron, how many connections do we have? The answer is 10 * 9 / 2 = 45. That is, if each connection can be either on or off, we have 45 bits.

Now, we assume an even more restrictive situation: each neuron can have only 5 connections. This is more restrictive because in the example above, each neuron can connect to any number of neurons, between 0 and 9. In the paper, it's calculated as C(10, 5) = 252, and they claim that it can be used to store 252 bits! How is that possible?

It’s because they aren’t trying to equate the combinations themselves to bits. Let’s take an even simpler example, C(4,2). This gives us the subsets:
{1,2},{1,3},{1,4},{2,3},{2,4},{3,4}. They are saying that for a given set of memories, you could for example have connections on neurons {1,2} and {2,4}. This would then be expressed as: 100010. They can get away with this because the brain is dynamic and will assume different structures depending on the memories it contains, although in practice it is not dynamic to the extent that they are implying. They are also assuming (if I have understood it correctly) that each set of memories can be expressed by more than one structure (as in the example I gave). While I would agree that this is a somewhat strange perspective on memory (since it basically means we are decoding memories on the basis of neural structure), it is a theoretically valid result (as an estimate). As I mentioned before, they are looking at memory capacity, and hence they are going to decode the structure in a way that maximizes the total possible memory.
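The encoding described above can be sketched in a few lines of Python (my own illustration of the counting, not anything from the paper): enumerate the C(4,2) subsets in a fixed order, then emit one indicator bit per subset.

```python
from itertools import combinations

# All C(4,2) = 6 possible two-neuron connection groups, in a fixed order.
subsets = list(combinations(range(1, 5), 2))
print(subsets)  # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]

# Suppose the connections present for some set of memories are {1,2} and {2,4}.
active = {(1, 2), (2, 4)}

# One indicator bit per possible subset, as in the post above.
bits = "".join("1" if s in active else "0" for s in subsets)
print(bits)  # 100010
```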
 
AlphaWolf said:
Estimates vary from a few hundred megabytes to thousands of terabytes. I think there is believed to be a lot of redundancy, so I suppose it depends on how you count that.

I have a RAID 0 array.

Wait what was I saying?
 
Cartoon Corpse said:
so you'd likely have better orgasms if you poked out your eyes and killed some other processes not crucial to the survival kernel.

I might be tired, but that image of someone poking out his eyes and having an orgasm just gave me some serious shivers. Now I'm all freaked out.
 
Think humor, because it's outrageous... well, to most... so I suppose I should include that I'm only kidding with the above post. Do not poke out your eyes in an attempt to improve orgasms, as there will be other consequences you may not prefer to endure... (e.g. you can't see anymore)
 
zifnab said:
It’s because they aren’t trying to equate the combinations themselves to bits. Let’s take an even simpler example, C(4,2). This gives us the subsets:
{1,2},{1,3},{1,4},{2,3},{2,4},{3,4}. They are saying that for a given set of memories, you could for example have connections on neurons {1,2} and {2,4}. This would then be expressed as: 100010.

Actually, this is exactly what I was saying. Note that you use C(n, 2) (= n * (n - 1) / 2) to obtain the number, not C(n, 1000). C(n, 1000) is wrong.

Let's use a smaller example: 6 neurons. All the possible connections are {1,6}, {2,6}, {3,6}, {4,6}, {5,6}, {1,5}, {2,5}, {3,5}, {4,5}, {1,4}, {2,4}, {3,4}, {1,3}, {2,3}, {1,2}; that's 15 different connections (6*5/2, or C(6,2)). So you can say that by enabling/disabling each connection you have 15 bits (plus 6 more bits if the neurons can store bits themselves). However, the paper claims that if each neuron can connect to 3 neurons on average, you can store C(6,3) = 20 bits, more than the number of all possible connections. I really don't understand how that works.
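The two counts being compared can be checked directly (a quick sketch of the arithmetic in this post, nothing more):

```python
from math import comb
from itertools import combinations

n = 6
# Every possible pairwise connection between 6 neurons:
pairs = list(combinations(range(1, n + 1), 2))
print(len(pairs), comb(n, 2))  # 15 15

# The paper's count for "each neuron connects to 3 on average":
print(comb(n, 3))  # 20, more than the 15 physical connections
```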
 
Hmm...I'd consider the validity of the B&M paper dubious at best - especially their (implicit? I didn't find an explicit explanation in the paper) assumption regarding a fairly trivial mapping of interneural connections (weighted by an analogue potential) to bits.
 
pcchen said:
Actually, this is exactly what I was saying. Note that you use C(n, 2) (= n * (n - 1) / 2) to obtain the number, not C(n, 1000). C(n, 1000) is wrong.
No, C(n,2) would give you the number of different groups, with each group depicting which 2 neurons have a connection. That is something different. I’ll try to explain what I mean again below.

pcchen said:
Let's use a smaller example: 6 neurons. All the possible connections are {1,6}, {2,6}, {3,6}, {4,6}, {5,6}, {1,5}, {2,5}, {3,5}, {4,5}, {1,4}, {2,4}, {3,4}, {1,3}, {2,3}, {1,2}; that's 15 different connections (6*5/2, or C(6,2)). So you can say that by enabling/disabling each connection you have 15 bits (plus 6 more bits if the neurons can store bits themselves). However, the paper claims that if each neuron can connect to 3 neurons on average, you can store C(6,3) = 20 bits, more than the number of all possible connections. I really don't understand how that works.
I’m not sure what you mean when you say that C(6,3) is more than the possible number of connections. C(6,3) = 20 means there are 20 different ways of listing 3 neurons that have connections:
{1,2,3},{1,2,4},{1,2,5},{1,2,6},{2,3,4},….
They then assign a bit to each of these groups, in the sense of whether or not each one is a possible instance for a given set of memories.

Snyder said:
Hmm...I'd consider the validity of the B&M paper dubious at best - especially their (implicit? I didn't find an explicit explanation in the paper) assumption regarding a fairly trivial mapping of interneural connections (weighted by an analogue potential) to bits.
While I agree that their approach is strange and not very well explained, it looks to me like a valid estimate. Also, as I mentioned above, this paper was worked on by a group of people, and besides being published in B&M it was also presented at the Second IEEE International Conference on Cognitive Informatics 03, and I would be surprised if the whole foundation of the paper were wrong. I think we just have to keep in mind that it is an estimate.
 
zifnab said:
I’m not sure what you mean when you say that C(6,3) is more than the possible number of connections. C(6,3) = 20 means there are 20 different ways of listing 3 neurons that have connections:
{1,2,3},{1,2,4},{1,2,5},{1,2,6},{2,3,4},….
They then assign a bit to each of these groups, in the sense of whether or not each one is a possible instance for a given set of memories.

You can't assign each of these groups a bit. For example, how do you handle the connection {3,4}, which appears in both {1,3,4} and {2,3,4}? They are different, yes, but you can't make the two happen at the same time, so you can't assign each of them a bit. Unless you claim that there are many different kinds of {3,4} connection and they can all exist at the same time (that would mean 5 different kinds of {3,4} in the 6-neuron case, and many, many more in the human brain, all happening at once), I don't think that's plausible.
 
pcchen said:
You can't assign each of these groups a bit. For example, how do you handle the connection {3,4}, which appears in both {1,3,4} and {2,3,4}? They are different, yes, but you can't make the two happen at the same time, so you can't assign each of them a bit. Unless you claim that there are many different kinds of {3,4} connection and they can all exist at the same time (that would mean 5 different kinds of {3,4} in the 6-neuron case, and many, many more in the human brain, all happening at once), I don't think that's plausible.
I'm not saying that they happen at the same time, just that for a given set of memories the brain can essentially take on one structure or the other. I agree that this is not a plausible memory model (or rather, I'm quite confident it isn't), and I imagine this is why you are (understandably) opposed to it. But this is how I believe the authors got their numbers, and as an estimate of memory capacity I think it is fair enough.
 
zifnab said:
I'm not saying that they happen at the same time, just that for a given set of memories the brain can essentially take on one structure or the other. I agree that this is not a plausible memory model (or rather, I'm quite confident it isn't), and I imagine this is why you are (understandably) opposed to it. But this is how I believe the authors got their numbers, and as an estimate of memory capacity I think it is fair enough.

If you want to assign them individual bits, they have to be able to happen at the same time. For example, if {1,3,4} and {2,3,4} are individual bits, you have to handle the "11" case, in which {1,3,4} and {2,3,4} both happen at the same time.

Mathematically, if there can be only one connection between any two neurons, the number in the paper is dead wrong. Of course, I do not entirely understand what's said in the paper (which is quite vague, and a large part of it just discusses how to compute combinations of very large numbers). However, it's very, very unlikely that the human brain has a memory capacity of 10^80 bits, not to mention 10^8000 bits.
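The arithmetic behind this objection can be sketched as follows (my framing of the mutual-exclusion point, not anything from the paper): if the wiring patterns are mutually exclusive, they can only distinguish as many states as there are patterns, and the information that carries is log2 of the count, not the count itself.

```python
from math import comb, log2

# If the C(6,3) = 20 wiring patterns are mutually exclusive (the brain is in
# exactly one of them at a time), they distinguish only 20 states:
print(log2(comb(6, 3)))  # about 4.32 bits, far fewer than 20 bits

# By contrast, 15 independent on/off connections really do give 2**15 states:
print(log2(2 ** comb(6, 2)))  # 15.0 bits
```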
 
zifnab said:
Brain power cannot be calculated in terms of FLOPS because the brain doesn't work that way. Memory capacity has, however, been estimated, and it is thought to be something like 10^8214 bits (I cannot remember the precise number).

Estimates put the number of protons in the visible universe at ~10^80...
 