Today I was reading an article about an experiment conducted on the brain. The objective was to explain why we feel confused at moments of sudden 'refresh', say, when you wake up after sleeping at a new place and take some time to figure out where you are. The claim is that the brain actually has no confusion. From experiments they carried out on rats, the researchers say the brain takes time to retrieve memory and swaps between two memories: one of your regular sleeping place and one of the new place. They say this swapping happens so fast (about 125 ms per memory) that you take a while to process the situation. The main problem I have with this kind of article is: how can they distinguish between 'you' and 'the brain'? Isn't consciousness a product/alias of the brain? Are the two not the same? If you distinguish 'brain' and 'you', then who are 'you', if not the brain?
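Just to make that swapping picture concrete for myself, here is a toy sketch in Python. Only the 125 ms figure comes from the article; everything else (the evidence step, the threshold, the whole accumulation idea) is my own made-up assumption, not anything the researchers actually modelled:

```python
# Toy sketch of the swapping idea: two candidate "where am I?" memories
# alternate every 125 ms, and evidence for the correct one accumulates a
# little on each visit. EVIDENCE_STEP and THRESHOLD are invented numbers.

SWAP_MS = 125          # per-memory dwell time claimed by the article
EVIDENCE_STEP = 0.15   # assumed: how much each visit reinforces the right memory
THRESHOLD = 0.9        # assumed: confidence needed to "settle" on a memory

def time_to_settle():
    """Return elapsed milliseconds until the new-place memory dominates."""
    confidence = {"regular place": 0.5, "new place": 0.5}
    elapsed = 0
    current = "regular place"
    while confidence["new place"] < THRESHOLD:
        elapsed += SWAP_MS  # dwell on the current memory for one swap period
        if current == "new place":
            # sensory evidence matches this memory, so reinforce it
            confidence["new place"] = min(1.0, confidence["new place"] + EVIDENCE_STEP)
        else:
            # mismatch with what you actually see, so this memory loses ground
            confidence["regular place"] = max(0.0, confidence["regular place"] - EVIDENCE_STEP)
        # swap to the other candidate memory
        current = "new place" if current == "regular place" else "regular place"
    return elapsed

print(f"Settled on 'new place' after ~{time_to_settle()} ms")
```

With these invented numbers it settles after a few swaps (several hundred milliseconds), which at least matches my intuition of that groggy just-woke-up second or two of confusion.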
One more thing I don't like about such articles is the comparison to our digital logic. The article says our brain would need to be as fast as a 100-core processor to distinguish between the swapping of the two memories, whatever that means! I feel the field of neuroscience is under-appreciated mainly because people assume our brain is just a much better computer than the fastest supercomputer we have. But how can we compare Boolean/digital logic with the complexity of the brain? Most people say they are similar because of the 'all or none' (fire/no-fire) behavior of neurons, but that doesn't mean the brain works in the digital domain! For instance, we remember some things 'vaguely'. A computer can never remember things vaguely.

Sometimes I feel quantum theory might have something to say about our thinking process. If I am not wrong, according to quantum theory a particle is in a given place only with some probability, which means it can be there or not there, i.e. it is 'vaguely' there(?). So maybe the probability of a memory being present is low when you say you remember vaguely, and high when you have all the details of the memory. I am just blindly comparing, of course, but you get the idea.
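To make my blind comparison slightly less hand-wavy, here is a toy Python sketch of what 'probabilistic' recall might look like next to a computer's exact lookup. All the memories, details, and probabilities here are invented by me purely for illustration:

```python
import random

# Toy illustration of the "vaguely there" analogy: a memory is stored with a
# confidence value, and each detail comes back only with that probability.

def recall(details, confidence, rng=random.Random(42)):
    """Return the subset of details that come back on this recall attempt."""
    return [d for d in details if rng.random() < confidence]

trip = ["beach", "red umbrella", "name of the hotel", "what we ate"]

print("vivid memory :", recall(trip, confidence=0.95))  # almost everything returns
print("vague memory :", recall(trip, confidence=0.30))  # only fragments return

# A computer's memory, by contrast, is all-or-nothing: a lookup either
# returns the exact stored value or fails outright.
computer_memory = {"trip": trip}
print("computer     :", computer_memory["trip"])  # exact, every single time
```

Of course this is just ordinary probability, not quantum mechanics, but it captures the contrast I am trying to point at: graded, partial recall versus a lookup that either succeeds completely or not at all.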
I also feel the big companies in the semiconductor and software industries (they have some AI fundaes, though I don't think their idea of AI is even remotely connected to real intelligence) should form a group with academia and neuroscience institutes like RIKEN and start a project to understand the brain, along the lines of the Human Genome Project. That might at least give us one breakthrough in this area.