Make the most of your brain

Date: 15 February 2015

By Caroline Williams

The human mind is the most complex information processing system we know. It has all sorts of useful design features, but also many glitches and weaknesses. The problem is, it doesn’t come with a user’s manual. You just have to plug and play.

But if anyone knows how to get the best out of our brains, it’s neuroscientists. So we asked some of the best to explain how the human brain performs many of its most useful functions and how to use them to the max.

1 ATTENTION

Almost every useful feature of your brain begins with attention. Attention determines what you are conscious of at any given moment, and so controlling it is just about the most important thing that the brain can do.

To make any sense of the world around us we need to filter out almost everything and focus solely on what is relevant. Not only that, but focused attention is essential for learning or memorising. So it follows that if you can boost your ability to pay attention, you can improve at almost anything.

In simple terms, the brain has two attention systems. One, the “bottom-up” system, automatically snaps awareness to potentially important new information, such as moving objects, sudden noises or sensations of touch. This system is fast, unconscious and always on (at least when you are awake).

The other, the “top-down” system, is deliberate, focused attention, which zooms in on whatever we need to think about and, hopefully, stays there long enough to get the job done. This is the form of attention that is useful for doing tasks that require concentration. Unfortunately distractibility comes as both a bug and a design feature. Top-down attention requires effort and so is prone to losing focus, or being rudely interrupted by the bottom-up system.

The good news is that we can tweak our attention settings to stay focused more easily. As well as cutting down on bottom-up distractions by turning off email notifications, putting your phone on silent and so on, Nilli Lavie, a cognitive neuroscientist at University College London, suggests actually giving your brain more to do.

Lavie’s work has shown that better control of top-down attention comes not by reducing the number of inputs, but by increasing them. Her load theory says that, once the brain reaches its limit of sensory processing, it can’t take anything else in, including distractions.

This seems to work for both distractions and mind wandering, says Lavie. In real life, she suggests thinking about adding visual aspects to a task that make it more attention-grabbing without making it more difficult – putting a colourful border around a blank document and making the bit you are working on purple, perhaps. It works with all the senses, she says, so choosing somewhere with a bit of background noise might help.

There are also signs that cognitive training might help. Researchers working with people with attention-deficit hyperactivity disorder (ADHD) and brain injuries have found that cognitive training, combined with non-invasive magnetic brain stimulation, can improve focus on a task that needs sustained attention (Frontiers in Human Neuroscience, vol 4, p 60).

Wider studies are under way, and initial results seem to suggest that the right kind of brain training could help more or less anyone.

While we wait, the next best option is learning to chill out in exactly the right way. Long-term meditators have been shown to have thicker parts of the brain associated with attention, while other studies have found that attention test scores improved after a short course of meditation. So learning to focus better may be as simple as making time to sit still and focus on not very much.


2 WORKING MEMORY

Like attention, working memory is one of the brain’s most crucial front-line functions. Everything you know and remember, whether it’s an event, a skill or a fascinating fact, started its journey into storage by going through your working memory.

But working memory is much more than just a clearing house for long-term memories. It has been described as the brain’s scratch pad: the place where information is held and manipulated. If you are doing anything that requires effortful, focused thought, you are using your working memory.

In the 1970s, Alan Baddeley and Graham Hitch of the University of York, UK, came up with an influential model to explain how the system works. The main component is the central executive, which runs the show by focusing your attention on the relevant information.

It also kicks “slave” systems into action. One of these holds up to four pieces of visual information at a time; another can memorise about 2 seconds of sound, especially spoken words, which it loops over and over again (think of mentally repeating a phone number while you search for a pen). The third is the episodic buffer, which adds relevant information from long-term memory.
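The capacity limits described above can be sketched as a toy data structure. This is purely illustrative – the class, its names and the eviction rules are assumptions for the sake of the sketch, not part of the Baddeley-Hitch model itself; only the rough figures (four visual items, about 2 seconds of sound) come from the model:

```python
from collections import deque

class WorkingMemory:
    """Toy sketch of the capacity limits above; everything but the
    rough figures (4 visual items, ~2 s of sound) is illustrative."""
    VISUAL_CAPACITY = 4   # items the visual store holds at once
    LOOP_SECONDS = 2.0    # roughly how much sound the loop can rehearse

    def __init__(self):
        self.visual = deque(maxlen=self.VISUAL_CAPACITY)  # oldest item drops out
        self.loop = []  # list of (word, duration_in_seconds)

    def see(self, item):
        self.visual.append(item)

    def hear(self, word, seconds):
        self.loop.append((word, seconds))
        # Once the loop holds more than ~2 s of sound, the oldest words fade.
        while sum(s for _, s in self.loop) > self.LOOP_SECONDS:
            self.loop.pop(0)

wm = WorkingMemory()
for item in ["cat", "dog", "cup", "pen", "map"]:
    wm.see(item)                       # only the last four items survive
for word in ["zero", "seven", "seven", "zero", "nine"]:
    wm.hear(word, 0.5)                 # 2.5 s spoken: the first word fades
```

The `deque(maxlen=…)` does the forgetting automatically: pushing a fifth item silently evicts the oldest, much as new visual input displaces what the slave system was holding.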

A weakness of this model is that working memory doesn’t occupy a discrete brain area that can be watched in action in a brain scanner. Because of this, some cognitive neuroscientists have suggested that it might not be a separate system at all, but just the part of long-term memory that we are currently paying attention to. Whatever it is, working memory comes as standard in the human brain, but some people have better working memories than others. Working memory capacity is a better predictor of academic success than IQ, so getting the most out of it is useful.

The good news is that the system can probably be upgraded. Some studies have shown that brain training programmes aimed specifically at working memory can produce improvements, and there are even a handful of training packages on the market. But it’s not clear whether they make you better at anything other than working memory tests.

Cognitive neuroscientist Jason Chein of Temple University in Philadelphia, Pennsylvania, who studies working memory, says there seems to be evidence of improvements in other cognitive skills, although any changes are quite small. “A small effect may still be important in the sense that even modest gains can have a meaningful impact on everyday cognition,” he says.


3 LOGIC

We like to think of ourselves as rational and logical creatures. And so we can be – but not without some effort.

Logical thought requires us to behave like a microprocessor, executing stepwise operations on information using the rules of logic. This doesn’t come naturally to most people, requiring outside instruction to learn and lengthy training to master. Even then, we struggle to maintain a purely rational perspective.

It turns out that there is a kernel of truth in the popular wisdom that “left brain equals logic”. Imaging studies have shown that the left prefrontal cortex is needed to make logical trains of thought happen and, a lot of the time, no input is needed from the right. But when there is conflict between what seems logical and beliefs we already hold, the right side of the prefrontal cortex kicks in to help sort out the confusion (Brain Research, vol 1428, p 24). Unfortunately, the right hemisphere usually wins. Study after study has shown that where new information conflicts with existing beliefs, our brains bend over backwards to keep beliefs intact rather than revise them.

Another surprise is that, contrary to popular wisdom, emotions aren’t necessarily the enemy of rationality. People who have damage to the part of the prefrontal cortex that processes emotions struggle to make decisions at all, especially when there is no logical advantage to either option (Cerebral Cortex, vol 10, p 295).

So embracing our not-particularly logical gut feelings about decisions might actually help us make more rational choices. But not always: other studies have shown that strong emotions can interfere with making rational decisions, particularly when they concern people we love.

Other than hard graft – and an appreciation of the role of belief and emotion – is there anything we can do to become more logical?

Vinod Goel, a cognitive psychologist at York University in Toronto, Canada, says that a zap to the head might one day help. “Brain stimulation techniques may eventually offer a route to improving reasoning,” he says. His team recently used a similar approach to enhance creative thought and, he says, “One can imagine the same techniques being used to enhance our ability for logical reasoning.” As yet, though, there is no shortcut. For now, he says, practice is your best option. Recent studies have shown that a few months of training in rational thought, as part of a law degree, increased the number of connections between frontal and parietal lobes and between the two hemispheres (Frontiers in Neuroanatomy, vol 6, p 32). The catch is that, without regular practice, this effect would almost certainly fade a few months after the course ended.


4 LEARNING

Learning is what your brain does naturally. In fact, it has been doing it every waking minute since about a month before you were born. It is the process by which you acquire and store useful (and useless) information and skills. Can you make it more efficient?

The answer lies in what happens physically as we learn. As it processes information, the brain makes and breaks connections, growing and strengthening the synapses that connect neurons to their neighbours, or shrinking them back. When we are actively learning, the making of new connections outweighs the breaking of old ones. Studies in rats have shown that this rewiring process can happen very quickly – within hours of learning a skill such as reaching through a hole to get a food reward. And in some parts of the brain, notably the hippocampus, the brain grows new brain cells as it learns.

But once a circuit is in place, it needs to be used if it is going to stick. This largely comes down to myelination – the process whereby a circuit that is stimulated enough times grows a coat of fatty membrane. This membrane increases conduction speed, making the circuit work more efficiently.

What, then, is the best way to learn things and retain them? The answer won’t come as a huge surprise to anyone who has been to school: focus attention, engage working memory and then, a bit later, actively try to recall it.

Alan Baddeley of the University of York, UK, says it is a good idea to test yourself in this way as it causes your brain to strengthen the new connection. He also suggests consciously trying to link new bits of information to what you already know. That makes the connection more stable in the brain and less likely to waste away through underuse.

The learning process carries on for life, so why is it so much harder to learn when we reach adulthood? The good news is that there seems to be no physiological reason for the slowdown. Instead, it seems to be a lot to do with the fact that we simply spend less time learning new stuff, and when we do, we don’t do it with the same potent mix of enthusiasm and attention as the average child.

Part of the problem seems to be that adults know too much. Research by Gabriele Wulf at the University of Nevada, Las Vegas, has shown that adults tend to learn a physical skill, such as hitting a golf ball, by focusing on the details of the movement. Children, however, don’t sweat the details, but experiment in getting the ball to go where they want. When Wulf taught adults to learn more like kids, they picked up skills much faster.

This also seems to be true for learning information. As adults we have a vast store of mental shortcuts that allow us to skip over details. But we still have the capacity to learn new things in the same way as children, which suggests that, if we could resist the temptation to cut corners, we would probably learn a lot more.

A more tried and tested method is to keep active. Ageing leads to the loss of brain tissue, but this may have a lot to do with how little we hare about compared to youngsters. With a little exercise, the brain can spring back to life. In one study, 40 minutes of exercise three times a week for a year increased the size of the hippocampus – which is crucial for learning and memory. It also improved connectivity across the brain, making it easier for new things to stick (PNAS, vol 108, p 3017).


5 KNOWLEDGE

One of the brain’s most useful features is the ability to absorb pieces of information and make connections between them. Knowledge really is power: a little can be a dangerous thing, and the more you know, the better equipped you are to deal with life.

But what exactly is knowledge? How are facts stored, organised and recalled when needed?

Knowledge obviously relies on memory – in particular the type of memory that stores general information about objects, places, facts and people, known as semantic memory. This is the part of memory that knows that Paris is the capital of France, a constitutional republic in western Europe – but not the part that stores memories of a weekend break there.

Knowledge isn’t so much about what information you store as how you organise it to create a rich and detailed understanding of the world that connects everything you know.

The sight of a dog, for example, automatically activates other bits of information about dogs: how they look, smell, sound and move, the fact that they are domesticated wolves, the names of similar dogs you know, and your feelings about dogs. How the brain achieves this gargantuan feat is far from clear. A recent proposal is that it has a “hub” which tags categories to everything we know and encounter, allowing us to connect related things.

In 2003, Tim Rogers, a cognitive psychologist now at the University of Wisconsin-Madison, proposed the anterior temporal lobe (ATL) as the hub (Nature Reviews Neuroscience, vol 4, p 310). The ATL is badly affected in people with semantic dementia, who progressively lose their knowledge of the meanings of words and objects but retain their skills and autobiographical memories. Experiments since then have backed this up – when the ATL is temporarily knocked out by a small electromagnetic pulse, people lose the ability to name objects and understand the meanings of words.

Rogers says that, without this system, we would spend a lot of time being confused about how things fit together. “How would you infer, for instance, that when making a collage with your kids, if you run out of sticky tape you can use the glue stick instead?” he says. “The tape is not similar to the glue stick in its shape, colour or how you use it. You need a representation that specifies similarity of kind.”

The good news is that there seems to be no limit to the knowledge that can fit into a brain. As far as we know, no one has ever run out of storage space. But it seems you can know too much.

Michael Ramscar at Tübingen University in Germany reckons that anyone who lives long enough eventually hits that point just by virtue of a lifetime’s knowledge. He suggests that cognitive skills slow down with age not because the brain withers, but because it is so full. And that – like an overused hard drive – takes longer to sift through.
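Ramscar’s argument is, at heart, one about retrieval: the more items a store holds, the longer an exhaustive search through it takes. A minimal sketch of that idea – the toy vocabulary and the naive linear search are illustrative assumptions, not his actual model:

```python
def comparisons_to_find(vocabulary, target):
    """Count how many stored items a naive linear search examines
    before hitting the target -- a toy stand-in for retrieval effort."""
    for i, word in enumerate(vocabulary, start=1):
        if word == target:
            return i
    return len(vocabulary)

# A small "young" vocabulary versus a large "old" one.
young = [f"word{i}" for i in range(1_000)]
old = [f"word{i}" for i in range(50_000)]

# Average effort to retrieve a sample of known words from each store:
avg_young = sum(comparisons_to_find(young, w) for w in young[::100]) / len(young[::100])
avg_old = sum(comparisons_to_find(old, w) for w in old[::1000]) / len(old[::1000])

# Nothing has "withered" -- the bigger store simply takes longer to sift through.
```

The per-item machinery is identical in both cases; only the size of the store differs, which is the crux of the “full, not failing” interpretation.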


6 CREATIVITY

JK Rowling has said that the idea for Harry Potter popped into her head while she was stuck on a very delayed train. We have all had similar – although probably less lucrative – “aha” moments, when a flash of inspiration comes along out of the blue. Where do they come from? And is there any way to order them on demand?

Experiments led by John Kounios, a neuroscientist at Drexel University in Philadelphia, suggest that the reason we aren’t all millionaire authors is that some brains come better set up for creativity than others. EEG measurements taken while people were thinking about nothing in particular revealed naturally higher levels of right hemisphere activity in the temporal lobes of people who solved problems using insight rather than logic (Neuropsychologia, vol 46, p 281). Kounios says recent work hints that this brain feature might be inherited, but even if you happen to have a more focused, less creative brain, there are plenty of general tips on how to get it into creative mode.

Boringly, the first is to put in the groundwork to build up a good store of information so that the unconscious has something to work with. Studies on subliminal learning have poured cold water on the idea that knowledge can drift into the brain without any conscious effort, so it pays to focus intently on the details of the problem until all the facts are safely stored. At this stage, anything that helps with focus, such as caffeine, should help.

Once that’s taken care of, it’s time to cultivate a more relaxed, positive mood by taking a break to do something completely different – like watching a few entertaining cat videos. Studies where people have either watched a comedy film or a thriller before coming up with new ideas have shown that a relaxed and happy mood is far more conducive to ideas than a tense and anxious one (Psychological Science, vol 21, p 1770). Not only that, but it pays to turn down the focus knob a little, and the easiest way to do that is to look for ideas when your brain is too tired to focus properly.

A 2011 study showed that morning people had their most creative ideas late at night, while night owls had theirs early in the morning (Thinking & Reasoning, vol 17, p 387).

Mental exhaustion might be a more realistic state of mind than relaxation when an important deadline is looming, but if the ideas are still refusing to come there may one day be an easier solution. Brain stimulation studies, in which activity was boosted in the right temporal lobe and suppressed in the left, increased the rate of problem-solving by 40 per cent (Neuroscience Letters, vol 515, p 121). So the stressed creative of the future might be able to pop on a “thinking cap” to help those juices flow.


7 INTELLIGENCE

Intelligence has always been tricky to quantify, not least because it seems to involve most of the brain and so is almost certainly not one “thing”. Even so, scores across different kinds of IQ tests have long shown that people who do particularly well – or badly – on one seem to do similarly on all. This can be crunched into a single general intelligence factor, or “g”, which correlates pretty well with academic success, income, health and lifespan. So more intelligence is clearly a good thing, but where does it come from? A large part of the answer seems to be genetics.

In 1990, the first twin studies showed that the IQ scores of identical twins raised apart are more similar to each other than those of non-identical twins raised together (Science, vol 250, p 223). Since then a few genes have been linked to IQ, but all of them seem to have a tiny effect and there are probably thousands of genes involved. That doesn’t mean the environment plays no part, at least in childhood. While the brain is developing, everything from diet to education and stimulation plays a huge part in developing the brain structures needed for intelligent thought. Children with a bad diet and poor education may never fulfil their genetic potential.

But even for educated and well-fed children, the effects of environment wear off over time. By adulthood, genes account for 60 to 80 per cent of the variance in intelligence scores, compared with less than 30 per cent in young children. Whether we like it or not, we get more like our close family members the older we get.

So if genes play such a big part, is there anything adults can do to improve IQ? The good news is that one type of intelligence keeps on improving throughout life. Most researchers distinguish between fluid intelligence, which measures the ability to reason, learn and spot patterns, and crystallised intelligence, the sum of all our knowledge so far. Fluid intelligence slows down with age, but crystallised intelligence doesn’t. So while we all get a little slower to the party as we get older, we can rest assured that we are still getting cleverer.


8 TIMING

The brain is a fickle beast – at some times as sharp as a tack, at others like a fuzzy ball of wool. At least some of that variation can be explained by fluctuations in circadian rhythms, which means that, in theory, if you do the right kind of task at the right time of day, life should run a little more smoothly.

The exact timing of these fluctuations varies by about 2 hours between morning and evening types, so it is difficult to give any one-size-fits-all advice. Nevertheless there are a few rules that it’s worth bearing in mind whatever your natural waking time.

It’s a good idea not to do too much that involves razor-sharp focus in the first couple of hours after waking up. Depending on how much sleep you have had, it can take anything from 30 minutes to 4 hours to shake off sleep inertia – also known as morning grogginess. If you want to think creatively, though, groggy can be good (see “Creativity”).

If hard work can’t wait, though, the good news is that researchers have backed up what most of us already know – a dose of caffeine helps you shake off sleep inertia and get on with some work (Perceptual and Motor Skills, vol 116, p 280).

Another tip is to time your mental gymnastics to coincide with fluctuations in body temperature. Studies measuring variation in everything from attention and verbal reasoning to reaction times have shown that when our core temperature dips below 37 °C the brain isn’t at its best.

By this measure, the worst time to do anything involving thinking is, unsurprisingly, between midnight and 6 am. It is almost as bad in the afternoon slump between 2 pm and 4 pm, which has more to do with body temperature than lunch – studies show that people who skip lunch, or have only a small one, suffer the same dip. All in all, the best time to get stuck in is between mid-morning and noon, and then again between 4 pm and 10 pm.

There may be a way to hack the system, though. Studies have shown that body temperature changes and alertness also work independently of the internal clock, so a well-timed bit of exercise or hot shower can work wonders.

Competitive sports, though, are worth leaving until the end of the day. Studies have shown that reaction times and hand-eye co-ordination get progressively better throughout the day, reaching a peak at around 8 pm.

After that, there’s time for a little more focused energy before the body cools down, the brain slows and there’s nothing more to do with it but dream.
