Editor’s Note: This commentary is part of our special focus on Summer Reading for the month of May.
When the summer wind comes blowin’ in, try your best NOT to let life go on as normal. For a few months, allow the warm air to suffuse your body in an intoxicating oxygen cocktail, creating just the right environment to make you healthier and happier, more aware, pensive, and inquisitive. Even more importantly, don’t waste the brain-enhancing gifts of summer on trashy reading. Step it up. Here’s a playful look at five compelling books that offer fascinating, big-thinking reading for hammocks, beach towels, starry nights, and anywhere else the gentle breezes blow.
The appearance of writing was an extraordinary development in human history, but consider for a moment that literacy may not have been the positive, civilizing force it’s generally cracked up to be.
WTF? you say.
Well, think about preliterate society—agricultural, oriented to the changing of the seasons, devoutly religious and mythical (Earth Mother, gods of wind and sea, moon and planets, etc.), charged with animistic life force. Homer’s writing, for example, gives us a vivid image of humans and gods interacting with casual aplomb, and of the way rocks, trees, volcanoes, and bubbling springs were alive. To stay in good stead with it all—healthy and safe—mortals would engage in a certain amount of prayer, temple worship, sacrifice, and the like. There was violence and mayhem, to be sure, but not organized and destructive to the degree of the Crusades, regional warfare, and genocides of the last millennium.
Based on the evidence at hand, archaeologists generally agree that ancient humans were fairly peaceful and minded their own business. But then, around 2500 BCE, letters, numbers, and words were invented and writing commenced. Did the ability to read and write gradually push humanity out of a right-brain, visual approach into a left-brain, logical/analytical/verbal perspective?
The Dyslexic Brain
In Seeing What Others Cannot See: The Hidden Advantages of Visual Thinkers and Differently Wired Brains (Prometheus Books), Thomas G. West explores the fascinating research delving into right-brain-like visual thinking traits, and how visual brains (common to dyslexics) seem predisposed to sensational creative gifts. Yes, dyslexia, the kiss of death in the standard, whitebread classrooms of yore, where spelling, memorization, reading aloud, arithmetic, and other ultra-difficult-for-the-dyslexic skills were valued. Consequently, throughout history, the great majority of dyslexics struggled through school. In a word, they were thought to be slow, or worse, and until very recently, no educators even considered testing for the learning disorder (learning “difference” is now preferred).
Galileo, Leonardo da Vinci, Einstein. Nikola Tesla, Pablo Picasso, Thomas Edison. Steve Jobs, Richard Branson, Steven Spielberg. Whoopi Goldberg, Keira Knightley, Jerry Hall. All dyslexic. Judging by that list of names, it would seem creative, out-of-the-box genius is a side effect of dyslexia, and West absolutely believes thinking in images and pictures, as many dyslexics do, is an either-or proposition. In other words, if you don’t have dyslexia, you aren’t privileged with visual thinking, and you don’t have the stuff needed for great visual achievements.
West is also a leading digital futurist and sees untold breakthroughs coming to fruition as visual thinkers fully utilize the imaging powers of computers.
But we are equally intrigued with the preliterate brains of our Mediterranean ancestors for other reasons. To wit: Julian Jaynes’s delightful theory of bicameralism, which speculates that until about 1000 BCE, the right and left hemispheres of the brain were unable to keep track of each other. Consequently, the right’s dreams, hallucinations, memories, and experiences ended up being interpreted on the left side as unidentifiable voices or commandments. Likely, the hearers would have assumed the words were spoken by a god.
In modern parlance, these voices, this lack of self-awareness, would have resembled schizophrenia. Jaynes, author of The Origin of Consciousness in the Breakdown of the Bicameral Mind, offers a fascinating theory, but it might underestimate our ancestors’ brains. Why can’t we just as reasonably assume they were, indeed, speaking to the gods, as we see in Homer’s writing? Athena, Circe, Heracles, Calypso, and so many other higher beings might have gotten their jollies toying and taunting and tempting the simpletons on Earth, and then lost interest when human brains got too rigid and literal.
These little footnotes are difficult to prove, of course. Archaeologists aren’t likely to come up with an image of Homer and Athena schmoozing over sangiovese on a newly discovered cave wall.
History is being made every day; that’s the greedy, monopolistic side deal it has with time. So the list of things we don’t know gets longer by the millisecond, and the only way for a respectable history buff to manage is to continually read, as well as answer every other question at cocktail parties with a forthright “sorry, but I don’t know.”
The Little Book of Big History: The Story of the Universe, Human Civilization and Everything in Between (Pegasus Books) offers readers the alpine skiing equivalent to the long, meandering Green- and Blue-rated runs that aren’t especially difficult yet cover an extraordinary amount of terrain, as opposed to the Double Black Diamonds that run deep and narrow down steep mountain faces. Ian Crofton and Jeremy Black, each with several acclaimed history books to their name, set out to tell the whole story—the Big Bang till now—utilizing language and methods from physics, biology, astronomy, and history. Yes, all those scholarly branches are closely related and The Little Book’s narrative arc makes for a seamless history lesson, one that provides a “framework for all knowledge.”
We do know our ancient, illiterate ancestors—both Neanderthals and modern humans—seem to have developed the gift of gab, according to forensic archaeologists. To understand how things progressed from there, we turn to a perfect example of Crofton and Black’s deliberate, considered approach, under a subhead titled Early Religion: “We will never know what our prehistoric ancestors believed, as they left no written records. … It is likely that the increase in the brain size of early humans around half a million years ago gave them the capacity for abstract thought. Imagining something that does not yet exist is essential for developing tools, as is a grasp of cause and effect. These are prerequisites for religious belief, but plainly do not constitute religious belief itself. It is unlikely that religious belief could have appeared in any recognizable form until the emergence of symbolic communication, especially complex speech—and this only appeared with the emergence of Neanderthals and modern humans.
There are indications that Neanderthals disposed of their dead in a ritual fashion. Neanderthal remains dating from 65,000 to 35,000 years ago found in the Shanidar caves in Iraqi Kurdistan include the body of a man who may have been buried on a bed of flowers.”
No, we will never know what our ancient ancestors believed, the sounds of the first words they created, or what they thought of the full moon. But we do descend from them. We certainly retain the remnants of many if not most of their traits, instincts, desires, and personalities. And, at times, perhaps the best of times, we even think and act like Neanderthals. So, one wonders, how might a Neanderthal comprehend a creek holding a few elusive brook trout? For our next project, let’s turn to one of the best nature writers working today.
In Our Nature
Before very long that tiny stream had grown into much more profound dimensions. It had grown enough to completely fill a fisherman’s mind. Soon I knew only the stream, the sound of flowing water, the play of sunlight and shadow, and the search for the jeweled creatures that lived there.
Something else lived there as well. Something to be found if I paid attention. I thought about it as I huffed and puffed back up the side of the canyon that evening. What I had found was what I had missed from the high bridge as I first looked down on the tiny sliver of water, had missed even from its banks as I first stood beside it, seeing only how small and shallow it was.
What I had found, I realized, was exactly what we human beings are constantly looking for—something called significance. It is the quality of having one’s mind filled, of being so rich with an experience that time and all other concerns cease to exist for a while. It is this search for the Significant Experience, the Significant Person or Place or Idea that leads us into libraries and bookstores and Internet searches, churches and schools, careers and marriages, down highways and through airports, that impels us on toward something we cannot name or explain but that exerts an invisible gravitational pull.
And sometimes we find it, this experience of significance. When it is discovered, it colors and reshapes our lives, sometimes even providing a different sense of purpose and meaning to all that comes after. To truly see a trout stream, and to feel it, in all its dimensions, or a wilderness lake, or a last stand of old timber, or a glaciated, rocky point or an island, is often to fall in love with it, to have it become a part of who we are, a rivulet in our consciousness.
This lengthy excerpt from Douglas Wood’s new memoir, Deep Woods, Wild Waters (University of Minnesota Press), points to the experience many of us seek in our wanderings in nature. The author of thirty-plus books and a vastly knowledgeable wilderness guide and naturalist, Wood is endowed with the skill to truly see a trout stream. We should all be so lucky. But to read these essays, combined with frequent visits alone to wild places, is the best way forward for all of us. Intriguingly, Wood is also passionate about fishing and hunting, and his writing on those subjects—featured in several of the thirty-seven essays in the collection—is alone worth the price of admission.
Cancer and Modernity
Many, many hundreds of thousands of years before Neanderthals and Homo sapiens walked the earth, our lineage split off from the ancestors of chimpanzees, and the genus Homo eventually began the human juggernaut. (No doubt, Mother Earth now harbors some misgivings about permitting that particular evolutionary leap to take place.) So our human ancestry has had a lengthy run of about 2.6 million years to work out genetic kinks and metabolic systems based on human dietary and lifestyle preferences. What’s striking to realize is that humans only started including grains, legumes, and dairy products in our species’ diet less than 10,000 years ago—just 400 generations or so. A decent amount of time, sure, but barely a blip on the cellular level where our mitochondria are actively converting the food we eat into energy for the body to run on—the basic definition of metabolism.
Here we turn to The Metabolic Approach to Cancer (Chelsea Green Publishing), where we find a dot-connecting explanation for the startling increase in cancer over the past 150 years: “At the beginning of the nineteenth century, only one person in twenty was diagnosed with cancer. … Today half of all men and over a third of all women in the United States will develop cancer in their lifetime.”
In their introduction, co-authors Dr. Nasha Winters and Jess Higgins Kelley point out that as long as one hundred years ago, it was understood that the root cause of cancer was damaged mitochondria, and just a cursory look at modern life shows an astounding number of potentially damage-causing factors: antibiotics, high-fructose corn syrup, artificial sweeteners, artificial preservatives, pesticides, refined foods, synthetic fats, and emulsifiers, just to name a few, in addition to profound changes to our lifestyles and environment: electricity, cell phones, the pervasiveness of chemicals, digital media, and chronic stress. Those are all new factors for our ancient genome to deal with, and based on cancer statistics, it doesn’t seem to be going very well.
Back to the intro: “What we explain in this book is that while most modern diets and lifestyles are largely responsible for cancer-causing mitochondrial damage, deep nutrition, therapeutic diets (low glycemic, fasting, and ketogenic), and nontoxic lifestyle approaches can provide the repairs.”
For the 2.5 million years prior to farming grains and raising animals for milk and cheese, humans ate plants and wild animals. Our current reliance on grains steers us right into cancer’s wheelhouse once you understand the metabolic theory of cancer, which relies on “the proven fact that cancer cells are fueled by sugar and that altered mitochondrial metabolism is the ultimate cause of cancer. In fact, a December 2016 meta-analysis research paper assessed more than two hundred studies conducted between 1934 and 2016 and concluded that the most important difference between normal cells and cancer cells is how they respire, or create energy. Cancer cells use a primitive process of fermentation to inefficiently convert glucose from carbohydrates into energy needed to sustain their rapid growth … But the most important finding is that fatty acids (dietary fats) cannot be fermented by cancer cells, which makes a ketogenic diet the most powerful dietary approach to cancer identified to date.”
Lastly, let’s turn to one of humanity’s greatest artists and thinkers.
There is no doubt the very early years of the 19th century were commanded by Napoleon Bonaparte. But this period was also fresh off the French Revolution, in the heyday of romanticism, with the Enlightenment still casting a soft glow on Europe’s burgeoning intellectual scene. And, yeah, some decent music was being made too. Between 1800 and 1824, Beethoven completed nine sensational symphonies, and you would be wrong to assume the era’s social, political, and intellectual developments weren’t hugely influential on his music. Napoleon, especially.
The brilliant, ruthless general enthralled Beethoven, who, with his Eroica Symphony, attempted to forge a link with the “historical figure of Napoleon,” and also to express a “sense of Napoleonic heroism.” All of which we learn in Beethoven’s Symphonies: Nine Approaches to Art and Ideas (University of Chicago Press), Martin Geck’s deeply researched, enchanting look at the great composer.
By 1803 or so, “Beethoven was so dissatisfied with his previous work that he decided to strike out in a ‘new direction,’” one that captured the intellectual significance of the day. Geck explains the difficulty of such a mindset and the true nature of Beethoven’s genius when he writes, “Any composer wanting to conduct a philosophical debate through the medium of music will find it hard to do so without breaking down the conventions of the genre, conventions that state that a symphony must obey certain formal patterns.”
A professor emeritus of musicology at the Technical University of Dortmund, Germany, Geck brings a classical background and a knowledge of music of the highest order—a fact both thrilling and daunting for the music layman.
Big ideas, fascinating history, fabulous writing: what more can anyone ask of summer books?
Matt Sutherland is Editor In Chief at Foreword Reviews. You can e-mail him at firstname.lastname@example.org.