Two news stories about the human brain are worth passing along this week.

The first is a potential breakthrough by Australian researchers in the treatment of Alzheimer’s disease. One of the primary causes of Alzheimer’s is the build-up of what’s known as amyloid plaques between the nerve cells of the brain (neurons), which interfere with the transmission of signals. Drug-based treatments for Alzheimer’s have had limited success, partly because the body’s own blood-brain barrier, meant to protect the brain from invaders, does a good job of keeping out helpful chemicals too. The new treatment uses focused therapeutic ultrasound—gentle sound waves that nudge the blood-brain barrier open so the body’s clean-up crew (microglial cells) can remove the offending plaques. So far, test results have been very promising, fully restoring memory function in 75 percent of the mice treated without causing damage. That’s still a long way from curing humans, true. But having witnessed the ravages of dementia up close in the last years of my mother’s life, I appreciate any good news in the fight against it.

The second story involves a new calculation of the data processing capability of the human brain. Signals pass between neurons via special structures called synapses, so it’s not hard to understand that a neuron’s ability to pass signals will be affected by the number and size of the synapses it has. It was thought that there are only a few different types and sizes of synapses in our brains, but when a team led by the Salk Institute reconstructed a piece of brain tissue at the nanometre scale for the first time, they discovered that synapses change according to how they’re used, and how often. The Salk scientists and their partners calculated that there could be as many as twenty-six different categories of synapses, adjusting themselves as needed, which goes a long way toward explaining how the brain accomplishes so much using so little energy (the power consumption of a dim light bulb, they say). But it also means that our brain’s capacity for storing information could be ten times greater than previously thought—possibly in the range of a petabyte (a million gigabytes), roughly the equivalent of all the storage on the World Wide Web.

Are you feeling impressed with yourself yet?

A long-lived urban myth suggested that we only use about ten percent of the capacity of our brains. That claim has been thoroughly discredited by neuroscientists, but the Salk Institute findings have to make you wonder if there isn’t a way we could somehow make even better use of all the brainpower we have. It’s a vast amount of storage, yes, but what if we could improve our information processing, filing, and retrieval systems? For one thing, we might never again lose our car keys (!), but we might also have less and less need for digital computers. Since synapses respond to need, picture flicking a mental switch to turn on “mathematics mode” or “language mode” to temporarily divert cognitive resources to a specific task. We could be specialized geniuses on demand! The idea of someday using a human brain to store secret data archives (like in the TV show Chuck or the movie Johnny Mnemonic) seems more plausible too.

But doesn’t it also make you wonder if there aren’t other potential capabilities within our brains that we’ve either forgotten how to use or just haven’t learned yet? Most scientists would scoff at the idea of so-called “paranormal” powers like telekinesis and telepathy, and most science fiction writers relegate such things to fantasy instead of SF. I’m not so sure. For one thing, there may be dimensions of existence that we don’t currently perceive, yet mathematicians and physicists readily include them in their theories about the universe. And I can accept that the universe includes an underlying level of information, call it what you will. Human beings don’t normally tap into such things because we haven’t needed to for our survival, but that doesn’t mean it’s beyond our ability, if only we knew how.

I believe that each new discovery about the human body and mind in physical terms leads to a deeper understanding of ourselves in a holistic and even philosophical context. So news stories like these reinforce my conviction that the human adventure is far from over and there are many wonders yet to come.


The invention of written language was a game-changer in human history. For the first time, we didn’t have to trust our memory, and that of others in our tribe, to preserve important knowledge. We could write it down. Others, at a later time or in another place, could read it. That provided a framework for enormous progress. Access to personal computers, and then the internet, has also been a huge leap ahead in terms of the availability of knowledge and other forms of what could generally be called “problem solving”, from math calculations to determining a location on a map to keeping track of appointments.

These days we joke about our phones being smarter than we are. And I predict that, within the next half-century, technological capabilities much greater than those of our smartphones will be part of customizable brain “augments” that will interface directly with our own biological grey matter. But a recent article at The Conversation got me thinking about that. Some recent neuroscience studies appear to show that our brains selectively forget some information in favour of newer, similar data. That’s a good thing: who wants to remember the PIN of a bank card you lost months ago when you’re trying hard to recall the new one? And while certain deep-brain structures like the hippocampus may be crucial for memory storage, it looks like the prefrontal cortex determines which remembered data is most relevant to a desired action. Think of it as being like the Google algorithms that show you search results appropriate to your location, previous searches, and other personal data, rather than just any random answer that matches your search keywords. Even with that help, you know how hard it can sometimes be to find what you’re really looking for (instead of a list of porn sites just as your boss is looking over your shoulder).

When we do have brain augments, something—biological or mechanical—will have to act as a similar filter, coordinating their functions and retrieving the right information. A significant amount of brainpower might have to be allocated to this. Your smartphone probably has a dozen apps you never use, but if we do the same thing with brain augments, the result will be needless mental overload.

So what kinds of brain augmentation would you most want?

Extra storage capacity, the better to remember all of those special moments in perfect detail (and where you left your car keys)? Well, don’t forget that the bigger the hard drive, the longer it takes to categorize and locate specific data. Your recall might be total, but slow. Cloud storage would bring its own mix of benefits and drawbacks.

How about better facial recognition, tied to the correct names and relevant data? I could go for that (great with faces, terrible with names). And it would be pure gold for politicians and sales reps.

Social media, instant messaging, and chat functions could take on an almost telepathic quality (although would all of your Facebook friends really be welcome right in your head?).

A GPS and mapping function would make sure you could never get lost, or, even more exciting, never lose your car in the mall parking lot.

The possibilities are many, but let’s not forget that our brains do forget, very deliberately. Not only do they discard old information in favour of what’s currently in greater demand, but neural pathways that are no longer used eventually disappear. So with every regular brain function that we replace with a digital equivalent, we might eventually lose the ability to do that task on our own (try solving a multi-part math equation without your calculator sometime).

Customizable brain augments will come, but before they do, let’s give some thought to exactly what we want from them, while we’ve still got practice at thinking “outside” the digital box.


Fantasy writers can make up all the details of their fictional world. Authors who write about real life can observe what goes on around them. For the rest of us, there’s research. The research I’m doing for the novel I’ve just begun to write (my fourth) is about consciousness, and how the brain works. And it’s enough to make my brain stop working.

Considering we’ve all got a brain (yes, even that @#$*&X driver in front of you), and we think we know how we think, there’s been an awful lot of ink spilled in attempts to explain it. The French philosopher Descartes is famous for saying, “I think, therefore I am.” And one of his main beliefs about how we think has become deeply ingrained in our collective knowledge, because it fits what we intuitively believe about the process of our thoughts. It seems evident that, for every bit of information taken in by our senses, there’s a specific moment when we become conscious of that information and the details surrounding it (“Oh, look, there’s a very solid baseball coming straight at my face at high speed!”). So each moment of awareness is kind of like an image projected onto a screen in our heads (what some philosophers call the Cartesian theatre). But that forces the question: who’s looking at the screen? It presupposes there’s an inner mind, a deeper you or me, who sees the projections on the screen and then does something about them—presumably a central spot in the brain where consciousness happens and decisions are made.

Neuroscientists have never found such a place. Philosophers now discredit the idea of the Cartesian theatre and will fill a book with thought exercises to show you why it’s wrong (a book so thick it qualifies as physical exercise just to lift it). Prevailing theories suggest that consciousness is more like a stream of activity with information coming in constantly, being processed, gaining priority and triggering action, or failing to achieve priority and being discarded. If you think it’s hard to wrap your head around a concept like that, imagine trying to absorb and retain it, along with dozens of other facts about neuroanatomy, brain-scanning technology, cognitive evolution, and more, while you try to write a good old-fashioned yarn about average people and the trouble they get into.

I guess the real question is: was I conscious when I decided to make a living this way?