I think it’s fascinating how the brain computes precisely where we should experience each and every sensation that we have. When we open our eyes, everything we see — brightness, color, shape, depth, motion — is the result of real-time computation. We don’t receive these qualities from the world; we project them out into the world. When you see a bird flying on television, you know that every pixel on your TV screen is sitting still the whole time. When you see a printed 3D stereogram, you know you’re still looking at a flat sheet of paper. We say the sky is blue because our vision paints it that way. The color that we project onto any object isn’t fixed to the wavelengths of light coming from that object. It’s the result of a calculation that takes into account the wavelengths of light surrounding the object, among other factors. The photons of light that enter our eyes are public — they can be measured objectively. But our sense of sight is personal — it is of the mind, by the mind, and for the mind.
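The idea that projected color depends on the surround, not just on an object’s own wavelengths, is what vision science calls color constancy. Here is a minimal sketch of one classical approximation to it, the “gray world” correction, which rescales each color channel relative to the scene average. The toy scene and its values are illustrative assumptions, not a model of the brain’s actual computation.

```python
import numpy as np

def gray_world_correct(image):
    """Rescale each channel so the scene-wide average comes out neutral gray."""
    avg = image.reshape(-1, 3).mean(axis=0)   # per-channel mean over the whole scene
    gain = avg.mean() / avg                   # gains that neutralize the color cast
    return image * gain

# A toy 4x4 scene: a uniform patch plus one yellowish corner pixel standing
# in for a tinted surround. (All values here are made up for illustration.)
scene = np.full((4, 4, 3), [0.5, 0.4, 0.3])
scene[0, 0] = [0.9, 0.8, 0.4]

corrected = gray_world_correct(scene)
# After correction, every channel has the same scene-wide mean: the color
# assigned to the patch depends on its surround, not on its raw values alone.
```

The point of the sketch is only that the same raw pixel values come out as different colors under different surrounds, because the gains are computed from the whole scene.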
The same goes for hearing. Our brain analyzes and compares the pressure waves that arrive at our ears, and decides what sound we should hear and where we should hear it. It’s like how TV programs can place subtitles right next to the person who’s speaking. Hearing is like projecting really sophisticated subtitles all around us. What we see can even change what and where we hear. If you play the sound “ba” while watching someone silently mouth “fa,” you’ll hear “fa” with your eyes open and “ba” with your eyes closed (a version of the McGurk effect). And ventriloquists don’t actually throw their voice. Our vision is what projects the sound onto their dummy. Just like we saw with photons and sight, pressure waves are public, but sound is personal.
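One concrete example of “comparing pressure waves that arrive at our ears” is localization by interaural time difference (ITD): a sound to one side reaches the nearer ear slightly earlier, and the delay implies a direction. The sketch below uses the textbook far-field formula; the head width, the example delay, and the simplified geometry are all assumptions for illustration, not how neurons actually do it.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
EAR_SPACING = 0.21       # m, a typical adult ear-to-ear distance (assumed)

def direction_from_itd(itd_seconds):
    """Azimuth (radians) of a distant source from the interaural delay.

    For a far-away source, the extra path length to the far ear is
    itd * c, and that path equals EAR_SPACING * sin(azimuth).
    """
    ratio = itd_seconds * SPEED_OF_SOUND / EAR_SPACING
    return np.arcsin(np.clip(ratio, -1.0, 1.0))   # clip guards against noisy delays

# A wave arriving at the right ear about 0.3 ms before the left implies a
# source roughly 30 degrees to the right; zero delay means straight ahead.
azimuth_deg = np.degrees(direction_from_itd(0.0003))
```

Delays of a fraction of a millisecond are enough to swing the computed direction across the whole frontal field, which is part of why the auditory “subtitles” can be placed so precisely.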
Now what about touch? Touch is the most tangible sense that we have. And yet it, too, is computed and projected. Think about what happens if you cut your finger. Electrochemical signals travel through nerves in your arm, up the spinal cord, and into your brain. If those signals don’t make it to your brain, you don’t feel pain. If they do, and your brain decides that you should feel pain in your finger, it doesn’t need to send any nerve signals back down to the finger, the way it would if you were to actually move it. The sense of pain simply gets projected to your finger. When you feel pain in your finger, it isn’t your finger that actually feels the pain. This might be hard to believe. But consider this: many people who have had limbs amputated continue to feel sensations, sometimes very painful ones, in those limbs. This is known as phantom limb pain. They’re feeling pain in a part of their body that does not physically exist! This gives us a hint of the kind of license our brain has to calculate how and where we should feel our senses.
Most of the time it uses this license very precisely, which effectively hides the fact that this license even exists. Hold up a hand and snap your fingers. Notice how you see your fingers snapping, hear your fingers snapping, and feel your fingers snapping all at the same place, all at the same time. In order for this to happen, the brain has to do some sophisticated work behind the scenes. It has to coordinate different reference maps for sight, sound and touch, and get them to all line up. As a result, we have a unified sensory experience of the world around us. We don’t notice that this experience is being constantly assembled and integrated.
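The finger-snap example can be put in loose computational terms: each sense reports a location in its own coordinate frame (eye-centered for vision, body-centered for touch), and those reports have to be transformed into a common frame before they can agree on one place. This 2D sketch shows the kind of transform involved; the gaze angle, hand position, and planar geometry are made-up illustrative assumptions, an analogy rather than neuroscience.

```python
import numpy as np

def rotate(point, angle_rad):
    """Rotate a 2D point counterclockwise by angle_rad."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([c * point[0] - s * point[1],
                     s * point[0] + c * point[1]])

# Assume the eyes are turned 30 degrees to the left of straight ahead,
# and the snapping fingers sit dead center on the fovea.
gaze_angle = np.deg2rad(30)
seen_in_eye_frame = np.array([1.0, 0.0])

# Map the visual estimate from the eye-centered frame into a shared,
# head-centered frame by undoing the gaze rotation.
seen_in_head_frame = rotate(seen_in_eye_frame, gaze_angle)

# Proprioception independently reports the hand at this head-centered
# location (values chosen so the toy example lines up).
felt_in_head_frame = np.array([np.cos(gaze_angle), np.sin(gaze_angle)])

# Once both estimates live in the same frame, the snap is experienced
# as a single event at a single place.
assert np.allclose(seen_in_head_frame, felt_in_head_frame)
```

The interesting part is what happens when the frames are misaligned: if the transform used the wrong gaze angle, sight and touch would place the snap at different locations, and the unified experience would come apart.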
What are the implications and ramifications of all this? If all of our senses are constructed and projected, what are the limits to what we can construct, and to where we can project? We catch a glimpse of this every night when we go to sleep. When we dream, our senses are freer to roam among their possibilities. This is particularly the case with lucid dreaming, where we know that we’re dreaming while we’re dreaming. We can literally dream up any kind of scenario; we can be anything, and we can do anything.
But even though we can dream up, say, as much food as we want, it won’t nourish our body. In order to survive, we actually have to eat food, not just feel like we’re eating food. These are two different types of reality — eating food is objective reality, feeling full is subjective reality.
The brain seems to be at that very boundary between subjective reality and objective reality, between us and the world, between inside and outside. How does it get to occupy this privileged position? I mean, it’s all well and good to say that neural networks calculate how and where we should feel. But then how does that calculation actually become a sight or a smell, or any other sensation? How does the public become personal?
All the logistics of creating a particular perception seem to involve a very physical brain, that’s got all sorts of structures and connections. But the effortless perceiving of that perception seems to involve a very subtle mind, that doesn’t seem to be weighed down by any kind of mass.
How can we try to understand this? Is there anything that can help us put it into some kind of context? Well, let’s see. Photons don’t have any mass, but they can be absorbed and re-emitted by matter in various ways. Perhaps there is some kind of “photon of perception” that is absorbed and re-emitted in different ways by different brain regions. Maybe these brain regions end up shaped like particularly good antennas that can transmit and receive these signals.
Or perhaps it’s like the relationship between electricity and magnetism. If you run electrical current through a wire, it induces a magnetic field around the wire. And if you wave a magnet across a wire in a circuit, it induces electrical current through the wire. Perhaps certain behaviors of neural networks induce perceptive fields. And vice versa, maybe perceptive fields induce certain behaviors in neural networks. I don’t know.
Science has come a long way, and it will continue to make progress. But we don’t have to use science in isolation. We can also consider things introspectively. Lots of philosophical writings have been devoted to making sense of the relationship between subjective reality and objective reality. One such teaching is that there is a nondual reality that transcends and unifies subject and object. It’s the very foundation of both, and is said to be knowable even if it defies all manner of description. Meditation and other practices are encouraged to facilitate this awareness.
I think that science can help us refine our appreciation of spiritual practices, and that doing these practices can, in turn, transform our understanding of science. It’s a virtuous circle that can help us converge on better questions and better answers about reality, whatever reality happens to be.
What are your questions? What are your answers? Let me know in the comments.