
Neuroscientist Laments Tech-Driven Brain Changes
Human identity, the idea that defines each and every one of us, could be facing an unprecedented crisis.
It is a crisis that would threaten long-held notions of who we are, what we do and how we behave. It goes right to the heart – or the head – of us all.
This crisis could reshape how we interact with each other, alter what makes us happy, and modify our capacity for reaching our full potential as individuals.
And it’s caused by one simple fact: the human brain, that most sensitive of organs, is under threat from the modern world.
Video games are weakening the ability to think for ourselves.
Unless we wake up to the damage that the gadget-filled, pharmaceutically-enhanced 21st century is doing to our brains, we could be sleepwalking towards a future in which neuro-chip technology blurs the line between living and non-living machines, and between our bodies and the outside world.
It would be a world where such devices could enhance our muscle power, or our senses, beyond the norm, and where we all take a daily cocktail of drugs to control our moods and performance.
Already, an electronic chip is being developed that could allow a paralysed patient to move a robotic limb just by thinking about it.
As for drug-manipulated moods, they’re already with us – although so far only to a medically prescribed extent.
Increasing numbers of people already take Prozac for depression, Paxil as an antidote for shyness, and give Ritalin to children to improve their concentration.
But what if there were still more pills to enhance or “correct” a range of other specific mental functions?
What would such aspirations to be “perfect” or “better” do to our notions of identity, and what would it do to those who could not get their hands on the pills? Would some finally have become more equal than others, as George Orwell always feared?
Of course, there are benefits from technical progress – but there are great dangers as well, and I believe that we are seeing some of those today.
Every experience changes your brain. Without exception. The more important question becomes: are those changes positive or negative, and who decides?

The above excerpt demonstrates how Susan Greenfield’s new book ID: The Quest for Identity in the 21st Century tends toward an unscientific techno-alarmism. A review of the book at New Humanist gives the nod to Greenfield’s credentials, but lambastes her lack of philosophical sophistication. It often seems that a scientific education is not enough to prevent fear of change from overwhelming reason. Like others who get worked up over screen violence, Greenfield seems to ignore positive aspects of games, such as motor skills and team building, and the idea that working out violent fantasies virtually could prevent them from being enacted in real life. The only favorable angle (mentioned in the Amazon capsule review) was her acknowledgment of the harmful effects of fundamentalism on the mind. But her broadside against video games and other entertainment options ignores the benefits of the sea change in which former consumers enter the creative community through participatory feedback.
For a more realistic and hopeful view of this trend, I’m looking forward to checking out Clay Shirky’s Here Comes Everybody, which discusses the expansive prospect of self-organizing communities built out of formerly passive content consumers. This new cognitive army has the potential to generate thousands of Wikipedia-like spontaneous open-source initiatives.
Greenfield also badly needs to read Ray Kurzweil’s The Singularity is Near. Barring a global catastrophe, most of the changes to our brains she laments will certainly happen – and then some – and I would argue that’s a good thing. We need to dump the sentimental notion that our unaltered humanity is somehow worth preserving. It’s the same deep well of fear that causes some people to look askance at genetic engineering. Technological development is on an irreversible course toward physical and mental enhancement, and toward an interconnectedness we can’t even fathom. It represents the next step in human evolution, except this time we will boldly decide our own direction based on our individual and social priorities.
I’m continually amazed at how Luddites cling to ignorance and tradition. While technological progress certainly has its pitfalls, these must be weighed against the risks of failing to act. Our planet is beset with both severe structural problems and a burgeoning population. The same technology enabling changes to our brains also promises to revolutionize food and energy production and to stabilize greenhouse gas levels. Inaction or technological relinquishment would guarantee ever-worsening humanitarian crises, and could never be enforced in any case. Whatever can be done in terms of human enhancement will be done. There will be accidents and mistakes, as with any new endeavor; we cannot eliminate risk. But we need to press on bravely into the terra incognita.
Sadly, technophobes spin every foray into these areas as some sort of existential threat. We should ignore them. The first salvo in this neo-Luddite rebellion was fired by Bill Joy in his infamous 2000 article Why the Future Doesn’t Need Us. Greenfield mines the same rich vein of technophobia plundered by Joy. From the excerpt at least, she makes no new arguments, and shows no evidence of understanding even the concept of the Singularity. She seems to have a puritanical streak, fretting that we might be getting addicted to our machines or that (horrors) we might learn to derive direct pleasure from them and spiral down into a hedonistic cultural collapse. Sounds to me like an electronic version of the “Reefer Madness” hysteria.
We must come to terms with the fact that humans are absolutely nothing but very sophisticated machines, something Dr. Greenfield must understand. We are beginning to unravel the mystery of how those machines work and how to make them better. In the process we might also merge with our artificial intelligence and become smarter and happier, with greater capacity for experience and pleasure. I’m always confounded that someone manages to turn that into a “bad thing.” In the coming decades, each of us will be faced with two choices: ride the wave or become obsolete.
14 comments
Oh oh, sounds like gramma’s a tad grumpy… time to change that diaper.
Just because they don’t think ‘EXACTLY’ like her is more like it.
There is a fantastic rant by Richard Bartle on this very point:
HERE!
I had to give her book a one-star rating on Amazon. Pity I have to buy the book to also give it a review…
;)
I personally think that sounds awesome. I’m looking forward to that kind of thing.
The Luddites and pro-deathers have almost as limited a concept of “nature” as they do of what defines our identity as humans.
Our intelligence is our species’ natural armament for survival, just as much as the shell of a tortoise or the speed of a gazelle.
It’s in the nature of intelligence that it creates tools and improves them. It’s natural that human intelligence creates technological progress over time, and that this progress accelerates.
The achievement of the Singularity will be just as much a part of the natural development of humanity as the transformation into a butterfly is of the natural development of a caterpillar.
I have a debate about this going on at my own site right now. One interlocutor said that it makes no sense to fix things that are not broken, and that the natural cycle of life and death (and the current level of our intelligence) are not “broken”.
Well, a caterpillar is not “broken”, but to try to halt its transformation into a butterfly and keep it a caterpillar forever would be a hideous and grotesque violation of its nature, if anything would be.
Hm, I’d respond to the caterpillar metaphor by noting that we’ve had the chance to see caterpillars change into moths and butterflies before – show a child this process for the first time and they’re as often dismayed as delighted.
We obviously have never seen a technological singularity happen to another civilization, so all hopes and fears are speculation, usually based on very narrow wedges of the whole picture. It’s fairly reasonable to believe in a third possibility: that our problems and advantages will be completely unique, and that we’ll have to decide which is which in rather short order.
There is also a really interesting conversation between Clay Shirky and Daniel Goleman (author of Emotional Intelligence) called ‘Socially Intelligent Computing’, which looks at neurological intersections with technology. There are free samples of the dialogue that you can listen to on the publisher’s website: http://www.morethansound.net
She’s worried that a microchip will allow people without limbs to manipulate their prostheses by thought alone…
And the Misanthropic Sociopath of the Week Award goes to!
The problem with trying to change consciousness is that we don’t as yet fully understand what it is or how it arises. Can consciousness even exist except by alteration? Is there such a thing as a state of pure beingness unaffected by perception and interaction with the environment?
I have not read this author’s book, and I think she is throwing out unintentional red herrings in these excerpts, but some of her fears are not groundless. Black Sun wrote in a post some time ago that, in his experience, more women than men are opposed to the coming Singularity. Perhaps it is because women have held the crucible of life so long within their bodies that they don’t want to relinquish it, or because they have a deeper intuitive respect for wild things.
The problem with trying to change consciousness is that we don’t as yet fully understand what it is or how it arises.
Well, not yet. By the time we can seriously embark on radical expansion of human intelligence or on mind uploading, we will – because those things will only be possible when we understand the brain thoroughly.
I know I’ll be in line for a memory chip. Ah, to have a terabit or two of easily accessible memory in my head. How can that be a bad thing?
I would blame social networking sites and MMORPGs that encourage fantasy over reality for any lack of proper decision-making and cognition. The video games she criticizes are played with friends, and encourage creativity and better problem-solving skills (usually).
What will the next generation be like?
Every generation has had something to fear, some cause of the imminent decline of civilization: from the rudeness of youngsters in ancient Greece, through public lending libraries in the 18th century and beatniks in the 50s, to, of course, video games today. And yet civilization progresses, in fits and starts. What does that tell you?
June #10 —
I wonder if you’ve even played an MMORPG or any other kind of MMO if you think they cause a “lack of proper decision-making and cognition”. As a not-so-random example (I play it), check out EVE Online and see if it rewards poor decision-making skills. In my admittedly anecdotal experience, self-described video gamers tend to be very good decision-makers, which includes me and most of my friends.
Just because a particular game is set in a fantastic (read: non-real) environment doesn’t mean it can’t be a rewarding and challenging mental experience. I also think most problem-solving skills learned from video games are very useful in day-to-day modern life, as is being comfortable with technology.
I’m trying to hold back a rant, but suffice it to say that the stereotypes of video games “numbing minds” or destroying attention spans, or of regular gamers being “stupid” or whatever else, are really, really annoying to me.
I have to say my heart sank immediately when I began reading the excerpt. I couldn’t believe I was hearing such things from a neuroscientist, of all people!
In regards to video games: I am now almost 24 years old, and this year marks ten years as a chronic cannabis user; I have also overcome a separate (and almost fatal) addiction to amphetamines.
Anyway, I’m no scientist and certainly don’t consider my opinions authoritative in any way whatsoever, but I do believe that playing video games since the age of 6, for at least 3 hours of almost every day since, has had a tremendous impact on reversing the toll such indulgent drug habits have taken on my brain. My short-term memory is certainly irreparably damaged, but I also possess a high IQ, which, once I realized it, gave me the encouragement I needed to pursue a proper university education.