A New York Times and Amazon.com bestseller

Get the crowd to innovate for you!

Cooperative tools and exponential technologies are reshaping our globe. You no longer have to sit on the sidelines and wait for the future to happen. You are now empowered to get involved to change the world. If you’re sick of the doom and gloom and ready to get in the game, explore the resources below. Here are some great crowdsourcing and collaboration tools on the web:

  • CoFundos (cofundos.org): an inexpensive and effective platform for the development of open-source software.
  • Genius Rocket (geniusrocket.com): a solid crowdsourced creative design agency composed solely of vetted video production professionals, producing content at a fraction of the cost of a traditional ad agency.
  • Amazon Mechanical Turk (mturk.com): a popular and powerful crowdsourcing platform for simple tasks that computers cannot (yet) perform, such as transcribing podcasts or editing text. There are also companies, like CrowdFlower, that leverage Mechanical Turk (and similar tools) for even more elegant solutions.
  • Innocentive (www.innocentive.com): one of today’s best online platforms for open innovation, crowdsourcing and innovation contests. This is where organizations access the world’s brightest problem solvers.
  • UTest (http://www.utest.com): the world’s largest marketplace for software testing services.
  • IdeaConnection (www.ideaconnection.com): open innovation challenge site for new inventions, innovations and products.
  • NineSigma (www.ninesigma.com): open innovation service provider, connecting clients with a global innovation network of experts.
  • Ennovent (www.ennovent.com): worldwide expert platform seeking solutions for sustainable development in energy, food, water, health and education in rural India.
  • TopCoder (www.topcoder.com): the world’s largest competitive software development and creative design community, with over 200,000 members at your fingertips.
  • CrowdRise (www.crowdrise.com): an innovative, crowdsourced community of volunteers and online fundraisers who have come together to support online fundraising for charities, events and special projects, turning participants and supporters into effective online fundraisers.
  • Kickstarter (www.Kickstarter.com): the world’s largest funding platform for creative projects. In 2011 the platform raised over $100 million for projects from the worlds of music, film, art, technology, design, food, publishing and other creative fields. Uniquely, a Kickstarter project must reach its funding goal before time runs out or no money changes hands: an “all or nothing” model.
  • IndieGoGo (www.indiegogo.com): on IndieGoGo you can create a funding campaign to raise money quickly and securely. This trusted platform has helped raise millions of dollars for over 65,000 campaigns across 211 countries.

High as Hell: The Evolution of our Gambling Addiction

Americans love a gamble; any gamble, apparently. While there’s no accounting for wins and losses at private poker games, we do know that the US gaming industry takes in almost 66 billion dollars a year—which, by comparison, equals the gross national product of Bangladesh. Las Vegas draws 36.7 million visitors a year, and while only 5 percent of those visitors claim they’re coming to Sin City for the thrill of the dice, an astounding 87 percent of them end up rolling them anyway. Did you ever wonder why?

To understand our desire to gamble, we first need to understand a bit of evolutionary biology. While there were no slot machines out on the African veldt, there was hunger. And it was the need to find our next meal that helped shape our need to play the lotto. For millions of years, our progenitors lived in a state of constant threat, taking exceptionally big risks primarily in pursuit of food and sex. Those whose big bets paid off in extra calories became our forebears; those whose didn’t died off. As psychologist and Psychology Today contributing editor Nando Pelusi points out, “risk-taking behavior began with foraging—and foraging is all about pattern recognition and pattern attribution.”

Pattern recognition is the term cognitive neuroscientists use for the brain’s ability to lump like with like, allowing us to remember that flipping over chunky rocks often reveals tasty grubs, while flat stones can hide poisonous snakes. This is an attribute that helps us make sense of all of our experiences. It is a capacity that, as NYU professor of neurology Elkhonon Goldberg points out in his book on the subject, The Wisdom Paradox, “is fundamental to our mental world . . . Without this ability, every object and every problem would be a totally de novo encounter and we would be unable to bring any of our prior experience to bear on how we deal with these objects or problems. The work by Nobel laureate Herbert Simon and others has shown that pattern recognition is among the most powerful, perhaps the foremost mechanism of successful problem solving.”

So fundamental is the need for pattern recognition that it’s tied to the body’s need/reward system. When we recognize patterns our brain releases a chemical that makes us feel a little better, so that the next time we confront the same patterns we’ll remember them. It is this system that accounts for things like the tiny rush of pleasure that comes from noticing that the dealer’s latest up card pushes him over 21. And the pleasure chemical in question is one of the brain’s primary feel-good drugs: the neurotransmitter dopamine.

To give you an idea of how pleasurable that dopamine rush feels, we need only turn to cocaine. The rush that users get from snorting Bolivian marching powder is actually dopamine. What cocaine really does to the brain is cause dopamine to be released and then block the receptor sites that allow for its reuptake (much in the way that antidepressants like Prozac block the reuptake of serotonin). So when the comic Robin Williams said “Coke makes me feel like a new man, and the new man wants some also,” it was dopamine that was conferring that amazing feeling.

So amazing is that feeling that 50 years ago neurobiologist Jim Olds found that if he put an electrode in the dopamine-releasing pleasure center of a rat’s brain, connected it via wires to an electric current generator, and gave the animal a switch to stimulate its own brain, it would do so without pause. The rats would neglect all other activities—including eating—for this little rush. Rats would rather starve to death than walk away from dopamine.

It is for this reason that scientists long believed that dopamine was pure pleasure. It was thought of as the reward portion of the body’s need/reward system. You wanted something fundamental to survival—like a next meal or a sexual partner—and when you got that thing the brain released a little dopamine so the next time you were faced with a similar situation (like being hungry) you would remember that feeding yourself felt damn good. But lately, thanks to the work of people like Emory University associate professor of psychiatry and behavioral sciences Greg Berns, we now know that dopamine is not released after you’ve gotten the thing you so desired, but rather when you take the risk to do the thing that gets you what you desire.

“Dopamine helps you learn by association; it helps you associate risk with reward,” says Berns. And associating risk with reward is a gambler’s bread and butter. So much so that, in 2005, Mayo Clinic psychiatrist M. Leann Dodd found that eleven of her Parkinson’s disease patients had developed pathological gambling addictions. The cause, she discovered, was the anti-tremor medicine pramipexole. Since dopamine also helps the body coordinate motor function, the drug works by directly stimulating the brain’s dopamine receptors. Unfortunately, that extra chemical boost was all it took to push these Parkinson’s patients over that big-bet edge.

Dopamine further works against gamblers because of something called “the gambler’s fallacy”: the idea that, odds being odds, a long losing streak can only mean the next hand has to be a winner. But odds don’t work that way. Every spin of the roulette wheel is independent of every other, but the brain’s pattern recognition system is built around short-term gains, not long-term prediction. “The brain is good at understanding sudden environmental changes,” continues Berns, “but bad at lingering, slow developments—which is why people and animals always overestimate a longshot (thus we buy lottery tickets despite odds against winning which are often roughly 100 million to one).”
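The independence Berns is describing is easy to check numerically. Here is a minimal simulation (my sketch, not from the text) of a simplified even-odds wheel; the 50/50 payout and the five-spin streak length are illustrative assumptions, not real casino odds:

```python
import random

def win_rate_after_losing_streak(trials=1_000_000, streak=5, seed=42):
    """Estimate P(win | the previous `streak` spins were all losses)
    on a simplified 50/50 wheel. Each spin is independent."""
    rng = random.Random(seed)
    losses_in_a_row = 0
    opportunities = wins = 0
    for _ in range(trials):
        won = rng.random() < 0.5          # one fair, independent spin
        if losses_in_a_row >= streak:     # the moment we feel "due"
            opportunities += 1
            wins += won
        losses_in_a_row = 0 if won else losses_in_a_row + 1
    return wins / opportunities

# The fallacy predicts a rate well above 0.5 after a losing streak;
# independence keeps it pinned at ~0.5, no matter the streak length.
print(win_rate_after_losing_streak())
```

A real roulette wheel is worse still: the zero (and, in America, double zero) keeps every bet below even odds regardless of what came before.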

But that doesn’t change the fact that when you’re contemplating that next big bet, you’re both utilizing your pattern recognition system and gearing yourself up to take a risk that could bring a big reward—two of the need/reward system’s basic functions. The truth of the matter, as every gambler eventually learns the hard way, is just because you’re feeling lucky doesn’t mean you really are lucky.

What it actually means is you’re feeling dopamine.


The Neuroscience of Mystical Experience: Part I

The following is excerpted from West of Jesus. It is a detailed breakdown of the psychological, physiological and philosophical effects of modernity on our mythological tendencies:

Scholars use the words mythos and logos to describe the two main ways humans evolved to arrive at knowledge. Logos, meaning logic, is information of the no-nonsense variety: practical, clinical, scientific, secular. It is a way of thinking that helps us function best in society. In the world of logos, it doesn’t matter that fire was a gift from the gods, what matters is that fire is hot and you don’t want to stick your hand into it. Where this becomes difficult is when that same fire gets out of control and burns down a village. In the face of such sweeping tragedy, a larger context is often desired, but the cold calculus of logos—fire burns hot—merely left survivors befuddled. This used to be where mythos came in.

Mythos was a way of giving meaning to events that existed beyond easy context. For good reason, all of history’s fantastical stories—from Biblical tales to Maori myths—fall into this category. Mythos was a way of reminding people that life’s point was spiritual, eternal, deeper, greater, whatever. In the last two hundred years, perhaps nothing has troubled secularists more than mankind’s clear co-dependence on the mythological; perhaps nothing, that is, except the fact that, from a biological perspective, scientists are now finding this kind of reminder exceptionally helpful.

In the past few decades hundreds of studies done by hundreds of researchers have shown that spiritual people live longer, healthier lives than non-spiritual people. Studies have shown that some kind of participatory faith lowers the risk of drug addiction, suicide, cancer, high blood pressure, stress, stroke, heart disease and a host of other serious ailments. So many of these studies have now been done that in the past five years researchers have begun doing meta-studies of the studies. One such analysis, conducted by the Mayo Clinic in 2001, reviewed nearly 350 studies of physical health and 850 studies of mental health, all of which used religious and spiritual variables. These researchers found that religious involvement and spirituality are consistently associated with “better health outcomes.” They noted 18 longevity studies, conducted during the past three decades, all of which found that the spiritual outlive the non-spiritual by 20-30 percent. In plainer terms, as Dr. Harold Koenig of Duke University Medical Center recently pointed out to The New Republic, “Lack of religious involvement has an effect on mortality that is equivalent to forty years of smoking one pack of cigarettes a day.”

For years, researchers attributed these health benefits to socio-economic, environmental and psychological factors, but a number of these meta-analyses have been designed to remove all these elements and the outcomes stay the same. Beyond that, scientists looking into this phenomenon have begun utilizing more advanced personality indexes, like Washington University School of Medicine psychiatrist Robert Cloninger’s well-respected Temperament and Character Inventory (TCI). The TCI is an internationally used personality questionnaire—240 true-or-false questions designed to assess the seven dimensions of personality. Among those seven is a trait known as self-transcendence.

“Self-transcendence is a term used to describe spiritual feelings that are independent of traditional religiousness,” writes National Institutes of Health geneticist Dean Hamer in his book The God Gene. “It is not based on belief in a particular God, frequency of prayer, or other orthodox religious doctrines or practices. Instead, it gets to the heart of spiritual belief: the nature of the universe and our place in it. Self-transcendent individuals tend to see everything, including themselves, as part of a great totality. They have a strong sense of ‘at-one-ness’—of the connections between people, places and things. Non-self-transcendent people, on the other hand, tend to have a more self-centered viewpoint. They focus on differences and discrepancies between people, places and things, rather than similarities and interrelationships.”

By definition, self-transcendence is composed of three distinct but related components of spirituality, the first of which is self-forgetfulness. Those of us who are more self-forgetful have an easier time losing ourselves in the moment, in music and sport and work, achieving what has become known as “the zone,” or what psychologist Mihaly Csikszentmihalyi calls a “flow state.” Inside a flow state, time and space and self vanish; in their place, creativity and originality and insight often appear. Robert Cloninger, who designed the TCI and is widely recognized as a brilliant scientist not known to back away from his work, writing about the insight that often accompanies a flow state in his book Feeling Good: The Science of Well-Being, says: “Dualistic reasoning is an incomplete basis for the dignity and wisdom that emerges from self-transcendent consciousness,” before pointing out Blaise Pascal’s 1662 observation: “Reason’s last step is the recognition that there are an infinite number of things that are beyond it.”

The other two components that fill out the self-transcendent troika are transpersonal identification and mysticism. Transpersonal identification is a form of empathy writ large—one’s willingness to identify with plants and animals and nature—stretching from the admiration of a flower to a rapture worthy of Whitman. Mysticism is exactly what it sounds like: a measure of one’s willingness to take an interest in Pascal’s infinite things that cannot be explained by rationality and reason. Cloninger found that people who score high in mysticism also score high for creativity, and for this reason he has come to think of mysticism as, among other things, a measure of one’s intuitiveness.

One of the other things Cloninger learned was that, when tested for accuracy, the three components of self-transcendence hang together, meaning that if someone scores high for one trait they usually score high for the others. “Even more impressive,” writes Hamer, “when factor analysis was applied to the whole TCI, the three subscales of self-transcendence were clearly separate from all other temperament and character traits…In other words, self-transcendence is as distinct from other parts of the personality as eye and hair coloration are from size.” The reason this is interesting is that people who score very high for self-transcendence are the same people who go in for mythos, and those are the same people who are outliving the rest of us by years at a stretch.

Which may explain why the need for mythos stretches beyond cultural boundaries. In the latter half of the nineteenth century, German anthropologist Adolf Bastian made a global study of myths and noticed that they all seemed to be built on the same core ideas. The Swiss psychiatrist Carl Jung named these core ideas archetypes and extended Bastian’s work by arguing that archetypes were not only the building blocks of an individual’s unconscious mind, but also the cornerstone of a shared collective unconscious, a slippery term that Jung defined as a “storehouse of latent memory traces inherited from man’s ancestral past,” then broadened that ancestral past beyond the border of our species, into our “pre-human or animal ancestry as well.”

Jung reached these conclusions after completing a global study of dreams and noticing that no matter where he was on the planet or whose dreams he was examining, the same symbols kept cropping up. Oftentimes these symbols meant little to the dreamer, but they still carried an enormous emotional charge. In other words, everyone, whether New Guinea tribesman or New York lawyer, was dreaming the same potent symbols, even if they didn’t know what those symbols meant.

He then examined the symbols categorically and found that most represented broad, cross-cultural mythological concepts, such as father or mother or shadow or hero. The reason for this, at least according to Jung, was that these archetypes existed in our genes. Jung felt this helped explain why people from different cultures could always enjoy each other’s myths—because those myths reflected a shared heritage. He thought this was why myths and dreams and odd encounters and less-than-random coincidences always caught our attention—they were the bleed through. It wasn’t that these things were trying to tell us something, they were telling us something.

In his 1949 book The Hero with a Thousand Faces, Joseph Campbell took matters one step further, arguing that all of humankind’s stories are actually one story. Campbell named this story “the hero’s journey” and summarized it with the adage “all religions are true, but none are literal.” He meant that in religion, if you strip away the what of what happened (the logos) and focus strictly on the why of what happened (the mythos), you unearth the exact same ideas.

But in the past three hundred years, we have forgotten Campbell’s adage. From the industrial revolution forward, as humankind got better and better at science, mythos—as a functioning rubric—lost its luster. As former nun and religious scholar Karen Armstrong writes in her account of the rise of fundamentalism, The Battle For God:

In the pre-modern world, both mythos and logos were regarded as indispensable. Each would be impoverished without the other . . . . By the eighteenth century, however, the people of Europe and America had achieved such astonishing success in science and technology that they began to think that logos was the only means to truth and began to discount mythos as false and superstitious. It is also true that the new world they were creating contradicted the old mythical spirituality. Our religious experience in the modern world has changed, and because an increasing number of people regard scientific rationalism alone as true, they have often tried to turn the mythos of their faith into logos.

Quite literally, rationalism shamed mythos out of sight. In its wake, new logocentric ideas—like scientific secularism and fundamentalist inerrancy—have risen to fill the vacuum. But one of the interesting by-products of this rise was that the so-called soft sciences, fields like psychology and anthropology, where the study of myth seems crucial, began looking for ways to become harder. Scientists who once looked at mythos as a representation of our inner world began distrusting the validity of that inner world. Subjectivity was out; objectivity was in.

Among those who turned this way was Claude Levi-Strauss. Trained as a philosopher, Levi-Strauss switched to anthropology in the 1930s because he was interested in a more rigorous approach to facts. His early work involved a worldwide study of marriage customs. Levi-Strauss reasoned that if marriage customs were strictly cultural then they would vary from place to place and from society to society. In other words, they would be arbitrary and thus not likely to be duplicated. But Levi-Strauss found the exact opposite. He found that in every culture certain fundamental ideas surrounding marriage repeated themselves, with a prohibition against incest being first and foremost.

The incest taboo had been studied for a while. Sir James Frazer (wrongly) demonstrated its universality in his 1910 study Totemism and Exogamy, while Freud extended this in Totem and Taboo by arguing that the totems and taboos of tribal religions were the product of primitive peoples projecting the contents of their mostly primitive minds onto the real world: “The projection of their own evil impulses into demons is only one portion of the Weltanschauung (world view) of primitive peoples, and which we shall come to know as ‘animism.’” Much of that work focused on the Aborigines of Australia, who practiced exogamy—marriage outside of the immediate tribe—which Freud linked to the incest taboo and later extended into his famous Oedipus complex. But it was Levi-Strauss who further elevated the incest taboo from one of the basic components of our subconscious minds to the “fundamental step because of which, by which, but above all in which, the transition from nature to culture is accomplished.”

Levi-Strauss called this incest taboo a deep structure, meaning that it was unvarying and ubiquitous. The reason for this was simple: incest is bad for the gene pool. Sleep with your brothers and sisters and pretty soon mutations arise. If that pattern of intimate relations with intimate relations continues for more than a few generations, pregnancy becomes impossible. The line dies out. Incest isn’t just a cultural taboo; it’s a biological taboo.

But more than that, Levi-Strauss realized the prohibition forced us to breed outside the nest, and this co-mingling of families provided society with its most basic building blocks. For these reasons, much like Chomsky’s universal grammar, Levi-Strauss thought these deep structures were hard-wired into our brains. He deduced that we are a species of mythologists because our myths contain the rules for our survival.

In the 1970s, ethnologist Charles Laughlin and neurology-trained psychiatrist and anthropologist Eugene d’Aquili began looking for ways to marry Levi-Strauss’s structuralist anthropology with evolutionary theory. One of the first things they set out to do was tear down the theoretical wall between humans and animals, reasoning that if the need for myth was coded into human brains, then its vestiges must be present in the lower orders as well. Since questioning the animal kingdom about why it did certain things proved problematic, they instead chose to study the pattern of behavior that surrounded those things. They looked at ritual.

In nature, ritual is everywhere. Whales breach, peacocks display, bees dance. Laughlin and d’Aquili wanted to know, from a biological perspective, what purpose all of this serves. Evolutionary theory teaches us that the brain’s primary function is to keep an organism alive and reproducing, and, in turn, everything from love to hunger is fundamentally an expression of this primary function. The proximity of a viable sexual partner produces lust much in the way that a shortage of glucose in the bloodstream produces hunger. Sex and eating, by satisfying these needs, produce an accompanying pleasure response. Without this response, we would stop mating and stop eating.

Laughlin and d’Aquili reasoned that as our brain evolved, this chain of command lengthened. Eating became associated with cooking, which became associated with hunting, and so forth. In this chain of association, it wasn’t just the act of eating that produced pleasure; it was the ritual surrounding eating that produced pleasure. The reasoning behind this ritual was that, as species evolved and grew in size, their nutritional needs grew alongside them. No longer could we anchor ourselves to a rock like a barnacle and feed on whatever floated by. If a wolf only ate the animals that wandered into its mouth, it would be dead within a week. To sustain all that body mass, wolves had to figure out how to hunt. And it was wolves that Laughlin and d’Aquili decided to study.

In 1979, they published The Spectrum of Ritual, outlining their basic ideas. Before hunting, wolves go through a ceremonial tail-wagging, group-howling session. Since wolves often hunt animals considerably larger than themselves, coordination ensures both a greater chance of success and a minimization of danger. This ceremonial tail-wagging, group-howling session—an outgrowth of this exact same need/response pattern—establishes order and rank for the coming battle. From this, Laughlin and d’Aquili deduced that ritual serves two important biological functions: it coordinates the brain to allow for group action and it teaches the young how to behave. This is why ritual is found everywhere in nature: it is part of the engine that drives nature forward. But this explanation leaves one critical question unanswered: in humans, why did ritual become intertwined with myth?

Psychologist Frederic Bartlett was the first to realize that memory creates patterns in our brain. These patterns, which he called schemas, provide the mental framework for understanding and remembering information. The most basic schema surfaces when humans encounter something they don’t recognize. Since the drive to procreate is among our most fundamental, the question we always ask when encountering the unknown is “is this thing like me or not like me?” The reason we ask is obvious: if this new thing is like me then maybe I can breed with it, and if it’s not like me maybe I should run away from it.

But as humans evolved, the things we encountered grew in number. We started asking more questions and developing more and more schemas. One of those schemas was the need to know why something happened. Laughlin and d’Aquili argue that this need has become a “cognitive imperative.” By helping us quickly adapt to our environment, this imperative contributes much to our success. But humans often encounter illness, death, odd coincidences, mysterious occurrences—things that do not allow for easy understanding. Yet evolution designed our brain to detect meaning, and this mechanism doesn’t just shut down when easy answers aren’t readily forthcoming. Hence the need to invent meaning—gods, demons, supernatural forces—is based on an automatic process. Mythos is how humankind resolves the irresolvable. And because we adapt physically as well as mentally, ritualized action has become associated with mythologized meaning. Biologically, people who are better-functioning members of society have a better chance of passing on their genes, meaning this tendency toward myth and ritual was a trait selected for as part of what helped the fittest survive. Steven Pinker, author and director of MIT’s Center for Cognitive Neuroscience, summarizes this nicely when he says “religion is a technique for success.”

But, as Armstrong points out, the modern world is bereft of traditional mythologies, and since our need for myth is part of our fundamental biology—somehow tied both to the immune system and to our neurochemistry—we are, as such, shorting out our body’s need/reward system. When myth is denied to us in customary forms—since humans don’t much enjoy suffering—we look for new sources. These days those sources are often found in fiction, poetry, cinema, sports and television. But this is myth at a distance, and myth at a distance, as Daniel Pinchbeck points out in his excellent Breaking Open the Head, is a shift not without certain consequences:

Modernism caused a profound shift in the way we use our senses. In his book Myth and Meaning, Levi-Strauss admitted his initial shock when he discovered Indian tribesmen were able to see the planet Venus in daylight, with the naked eye—“something that to me would be utterly impossible and incredible.” But he learned from astronomers that it was feasible, and he found ancient accounts of Western navigators with the ability. “Today we use less and we use more of our mental capacity than we did in the past,” he realized. We have sacrificed perceptual capabilities for other mental abilities—to concentrate on a computer screen while sitting in a cubicle for many hours at a stretch (something those Indians would find “utterly impossible and incredible”), or to shut off multiple levels of awareness as we drive a car in heavy traffic. In other words, we are brought up within a system that teaches us to postpone, defer, and eliminate most incoming sense data in favor of a future reward. We live in a feedback loop of perpetual postponement. For the most part, we are not even aware of what we have lost.

On average, the human brain takes in 400 billion bits of information a second, but only 2,000 of those bits make it up to our consciousness. Those 2,000 bits, at least for most of us, represent the end limit of our processing capacity. When Levi-Strauss writes that “today we use less and we use more of our mental capacity than we did in the past,” what he means is that we now see a different 2,000 bits of information a second than we saw in the past. Part of this is straightforward use-it-or-lose-it atrophy, and some of it happens because, as research done by Eric Kandel at Columbia and Candace Pert at Georgetown demonstrates down to the molecular level, our emotions help regulate our perceptions. Pert, one of the chief scientists responsible for the discovery of the brain’s neurochemistry, and thus for much of what we think of when we think of modern neuroscience, explains how this works in her book Molecules of Emotion:

Emotions are constantly regulating what we experience as “reality.” The decision about what sensory information travels to your brain and what gets filtered out depends on what signals the receptors are receiving from the peptides (Pert calls peptides “the molecules of emotion”). There is a plethora of elegant neurophysiological data suggesting that the nervous system is not capable of taking in everything, but can only scan the outer world for material that it is prepared to find by virtue of its wiring hookups, its own internal patterns, and its past experience.

This means, at least on some level, that what we believe governs what we see—though nobody is yet certain how much or how little of this is going on. Which means that belief governs perception, which shapes reality; or, as some consciousness researchers now believe, perception is reality, and thus our reality—what we think of as the real world—is, quite literally, nothing beyond what we believe. Which raises the question: what kind of effect does it have on our version of reality when a society stops believing in the mythological?
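Taking the chapter’s figures at face value—400 billion bits in, 2,000 bits to consciousness—the scale of the filtering is easy to put in numbers. A quick back-of-the-envelope sketch (the figures are the text’s, treated here as illustrative rather than exact):

```python
# Figures as given in the text; illustrative, not precise measurements.
incoming_bits_per_sec = 400e9   # total sensory input per second
conscious_bits_per_sec = 2_000  # what reaches conscious awareness

fraction = conscious_bits_per_sec / incoming_bits_per_sec
ratio = incoming_bits_per_sec / conscious_bits_per_sec

print(f"fraction reaching consciousness: {fraction:.0e}")   # 5e-09
print(f"one conscious bit per {ratio:,.0f} incoming bits")  # 200,000,000
```

In other words, on these numbers consciousness keeps roughly one bit in every two hundred million, which is why two brains can receive the same scene and “see” entirely different slices of it.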

Survival of the Trippiest: Animals and Psychedelics

When the moon is in the seventh house and Mars collides with Jupiter, Andrew Weil was spending taxpayer dollars trying to figure out why people like to get high. This was roughly 1967 through 1972, and Weil was an NIH researcher extending a line of official inquiry dating back to the early 1950s, when the CIA first began playing Manchurian Candidate with LSD. Their hallucinogenic mind-control experiments, code-named MK-ULTRA, claimed 10 percent of the agency’s operating budget—an estimated 10 million dollars—in 1953 alone. The money didn’t help. Turns out LSD is lousy at making anyone do anything except follow the Grateful Dead. The agency moved on to heroin, morphine, temazepam, mescaline, psilocybin, scopolamine, marijuana, alcohol and sodium pentothal, and a testing program spread across 30 different major institutions. Most of this research came to a screeching halt when Nixon started the drug war, but not before Weil pointed out that this was a war we couldn’t win. Children love to alter their consciousness by spinning in circles and hyperventilating, and adults love to do the same thing with booze and drugs; in his book The Natural Mind, Weil points out there’s probably a pretty good reason for this.

Nixon’s drug war also got a friend of Ronald K. Siegel’s arrested for marijuana possession. Siegel is now a psychopharmacologist at UCLA and one of the world’s leading experts on drug use; back then he was a psychology graduate student who knew nothing about marijuana. But his friend was in trouble, so he decided to learn. Not knowing what else to do, Siegel started by trying to get a pigeon stoned. When it worked, he wondered what else would work. The drug war had closed the door to most laboratory research, so Siegel sought his answers outside. He spent two decades studying intoxication in nature. It took that long for a reason.

In October 2006, National Public Radio’s All Things Considered considered Lady, a Cocker Spaniel spending a suspicious amount of time down by the backyard pond. “Lady would wander the area, disoriented and withdrawn, soporific and glassy-eyed,” Laura Mirsch, Lady’s owner, told NPR. There was that one night, after being out a long while, when Lady wouldn’t even come back in. Eventually, she staggered over from the cattails. “She looked up at me,” recounts Mirsch, “leaned her head over and opened her mouth like she was going to throw up, and out popped this disgusting toad.” It turned out the toad was Bufo alvarius, the Colorado River toad, whose skin contains two different tryptamines—the same class of psychoactives found in “magic mushrooms.” Licking Bufo produces heady hallucinations.

And toad-tripping dogs are just the beginning. Everywhere Siegel looked, he found animals who loved to party. Bees stoned on orchid nectar, goats gobbling magic mushrooms, birds chomping marijuana seeds, rats—along with mice, lizards, flies, spiders and cockroaches—on opium, elephants drunk on anything they can find (usually fermented fruit in a bog hole, though they’re known to raid breweries in India as well), felines crazy for catnip, cows loco for locoweed, moths preferring the incredibly hallucinogenic datura flower, mandrills taking the even stronger iboga root. From an evolutionary perspective, this is some difficult behavior to explain.

“The pursuit of intoxication by animals seems as purposeless as it is passionate,” writes Siegel in his 1989 Intoxication: The Universal Drive for Mind-Altering Substances. “Many animals engage these plants, or their manufactured allies, despite the danger of toxic or poisonous effects. The stupefied bees quickly become victims of predation. The carcasses of ‘drunken’ birds litter the highways. Cats pay for their addiction to pleasure plants with brain damage. Cows poisoned with range weeds may eventually die. Inebriated elephants destroy much property and the lives of other animals. Disoriented monkeys ignore their young and wander from the safety of the troop. Humans are no different.”

In weighing the evidence, Siegel shared Weil’s suspicions but reached an even stronger conclusion: “The pursuit of intoxication with drugs is a primary motivational force in the behavior of organisms.” Humans, in other words, aren’t the only species wired for this want; our pets are wired as well. He believes the taste for intoxication is acquired rather than inborn—though once acquired, look out. “Unlike other acquired motives, intoxication functions with the strengths of a primary drive in its ability to steer the behavior of individuals, societies and species. Like sex, hunger, and thirst, the fourth drive, to pursue intoxication, can never be repressed. It is biologically inevitable.”

Siegel’s research has since been validated, and something of a consensus has been reached as to why a pursuit so obviously dangerous would become biologically inevitable. According to Italian ethnobotanist Giorgio Samorini, in his 2001 Animals and Psychedelics, intoxication promotes what psychologist Edward de Bono called lateral thinking—problem-solving through indirect and creative approaches. Lateral thinking is thinking outside the box, without which a species would be unable to come up with new solutions to old problems—without which a species would be unable to survive. De Bono considers intoxication an important “liberating device,” freeing us from the “rigidity of established ideas, schemes, divisions, categories and classifications.” Both Siegel and Samorini think animals use intoxication for this same reason—and that they do so knowingly.

Hallucinogens are chemical defenses—toxins manufactured by plants to avoid predation. Fungi, among our most prolific sources of psychedelics, evolved 600 million years ago, not coincidentally at the same time as plant-eating animals. Herbivores may have first ingested these psychoactives when the threat of starvation gave them no other choice, but later sought them out for different rewards. “For example,” writes Siegel, “morning glories, which contain the same alkaloid as ergot (the psychoactive basis for LSD), are eaten by rats which feed regularly on the plant’s vines and fruits. The rodents tend to avoid the larger concentrations of alkaloids in the seeds. Yet, when disturbed by severe weather conditions, a rat will occasionally snack on a single seed, then display the characteristic head-twitching of intoxication.”

Siegel also saw a mongoose chewing morning glory seeds, not as a routine part of his diet but as a reaction to the death of his mate. “Morning glory seeds are used by modern Mexican Indians to console themselves in times of trouble; perhaps the animals were doing the same,” he says. Mandrills eat the hallucinogenic iboga root and then wait two hours for the effects to kick in before picking a territory fight with a rival. Even Lady knew what she was doing. After her initial spate of toad-licking addiction, she learned to party only on the weekends.

Tune in, turn on, drop back even further, and we can thank the animal planet for the Age of Aquarius. The animals taught us to trip and, to borrow a phrase from Oscar Wilde, “we never had the courtesy to thank them for it.” In Mexico, the Huichol Indians often use the same word for peyote as for deer—which also explains the fourth-century ceramic pipe found in Guatemala, in the shape of a deer with a peyote button between its teeth. The shamans of the Russian steppe, from whom the word “shaman” descends, have a fondness for Amanita muscaria—a serious trip of a mushroom that the reindeer turned them on to. From watching reindeer eat piss-soaked snow, these shamans also learned to drink urine after taking mushrooms to boost the high.

Oddly, A. muscaria is red and white and looks like a chubby bearded guy poured into a mushroom costume. Scholars have repeatedly pointed out that Santa Claus, flying reindeer, pine trees, and the giving of gifts were the original components of a mushroom harvest and consumption festival. Christmas may have become Christ’s birthday, but it began as a Siberian Woodstock—except, you know, with no Jimi Hendrix and plenty of reindeer.

Meanwhile, jaguars in the Amazon chew the bark and leaves of the yaje vine, better known as ayahuasca, which contains DMT, arguably the most powerful hallucinogen on earth. Yaje also makes you puke violently—so why did anyone bother following this example? “Shaman,” writes Siegel, “teach that by using the vine they too will be transformed into jaguar.” Which means the animals taught us to trip, and we tripped to become animals.

We tripped for other reasons as well. At the turn of the twentieth century, William James pointed out that humanity’s instinct for reality “has always held the world to be essentially a theater for heroism.” In 1974, psychologist Ernest Becker won the Pulitzer Prize for figuring out why. In The Denial of Death, Becker argues that the basic motivation for all human behavior is mortal terror—the biological need to manage our anxiety about death. In 1963, Aldous Huxley asked for an injection of LSD on his deathbed, believing the drug could facilitate a “good death.” The next year, Stanislav Grof found that a number of psychedelics reduced existential anxiety in late-stage cancer patients. In 1966, Harvard researchers Walter Pahnke and William A. Richards identified “the experience of an undifferentiated unity” as the hallmark of LSD’s “mystical experience.” Psychedelics solve death by proving the perennial philosophy: if you’re one with everything, death isn’t much of a concern. But what’s really interesting is that it’s not just humans who have this problem.

In her Coming of Age with Elephants, biologist Joyce Poole describes a mother elephant grieving for a stillborn baby—crying, slumped over, days on end spent desperately trying to revive her child. On another occasion she saw a troop moving through the forest when one of its members fell over and died. The elephants spent a long time trying to revive their companion before moving off into the jungle, only to return the next day for further ceremony. Chimpanzees, too, go through elaborate, multi-day rituals with the corpses of dead relatives—though they casually discard those relatives once they start to rot. In 2008, the internet was flooded with photos of Gana, an 11-year-old gorilla at the Münster Zoo in Germany, who refused to let go of the dead body of her infant son for several days, prompting New York Times science writer Natalie Angier to say: “Gorillas, and probably a lot of other animals as well, have a grasp on their mortality and will grieve for their dead and are really just like us.”

And just like us, they take specific drugs for specific reasons. Among the Navajo, the bear is revered for giving them osha, a root effective against stomach pains and bacterial infections. Wild carrot, as we learned from birds, repels mites. Horses in pain will hunt for willow stems, the original source of aspirin. In Gombe National Park in Tanzania, chimps swallow sunflower leaves whole. When Michael Huffman, a pioneer in zoopharmacognosy from Kyoto University in Japan, took a closer look, he found that sunflower leaves are hairy and that those hairs scrape worms from digestive tracts. Thirty different species swallow leaves for this benefit, including humans. These days, when companies like Shaman Pharmaceuticals send researchers into the Amazon to study the “old ways,” what they’re really after is the medical information originally gleaned from watching animals. So maybe that mongoose wasn’t just trying to assuage his grief with morning glory seeds; maybe he was trying to nullify it completely—seeking in psychedelics the same thing we seek in psychedelics: proof of membership in an infinite collective, proof that death is not the end.

The Godfather: Mihaly Csikszentmihalyi on Flow

Years ago, when West of Jesus first came out, I was doing an NPR radio interview and absolutely butchered the pronunciation of Csikszentmihalyi’s name. This, I have learned, is not an uncommon experience. But help is at hand. That day, a woman called in, berated me for my error, and gave me the following mnemonic: CHICK-SENT-ME-HIGH. I find this both useful and appropriate and thought I’d pass it along.

In other news, if you haven’t seen Csikszentmihalyi’s great TED talk on flow, here’s a look.

Searching for a Better Way: The Very Early History of Flow Research

A lot of questions come my way about the origins of flow-state research. It’s not the easiest question to answer. For starters—with the obvious variable being one of definition—there are great arguments to be made in a great many directions. While many are probably uncomfortable with the idea, one of the points of origin has to be those Stone Age hunters tripping on psychedelics or Roman gladiators drunk on herbal stimulants. The point being: from the moment we started trying to hack ultimate human performance (with drugs being a very easy way of doing this), we started to hack flow.

I also feel, primarily because of the deep similarities (both biological and phenomenological) between flow states and mystical experiences (a topic I cover pretty thoroughly in the later chapters of West of Jesus), that one could also trace the origins back to those early meditators—a practice (if you’re talking about the use of repetitive, rhythmic chanting to alter consciousness) that also dates back to prehistoric times.

That said, my personal belief is that the real origins of flow research date to a trio of groundbreaking discoveries made between 1871 and 1916. These discoveries are also my origin story for the science of both optimal and ultimate human performance (the difference being that optimal means doing your best, while ultimate means doing your best in any situation where an error can kill you).

Here’s the overview:

In 1871, Swiss geologist Albert Heim (who did pioneering work on the structure of the Alps) made a survey of mountain climbers involved in near-fatal accidents and discovered that life-threatening situations had a tendency to produce profoundly altered states of consciousness (including so-called “mystical experiences”).

In 1904, American psychologist William James realized that one of these altered states—which he dubbed “peak experiences”—had two key features. The first was that they significantly heightened human performance; the second was that they were often “mystical” in nature—they included what we might call “divine qualia” and were accompanied by “miraculous” events (such as spontaneous healing from disease).

Then, in 1916, American physiologist Walter Bradford Cannon realized that there were key physiological changes underlying peak performance. While Cannon was primarily focused on the “fight or flight” reaction, the core idea—that biology underpinned altered states of consciousness and peak performance (and by extension flow)—was big news.

With these three lines of development, all the main pieces were in place for what later became the science of flow.

Here’s why: Heim’s discovery highlighted three key pieces of data. The first was that high-risk environments can alter consciousness. The second was that this alteration bears startling similarities to so-called “mystical experiences” (Heim is credited with the first published scientific discussion of near-death experiences). The third is a little more subtle: the psychological changes Heim described (his book, by the way, is called Remarks on Fatal Falls) seem to be robust—meaning people who have experienced this risk-created altered state are often permanently changed afterward.

William James upped the ante. Some of this was pedigree. Heim was a respected geologist, not a psychologist or sociologist (to be fair, neither field really existed at the time), but there was still a gap in expertise. James, though, was a Harvard professor, a philosopher and psychologist with a background in medicine. Moreover, James extended Heim’s ideas considerably, coining the phrase “peak experience,” elaborating on its effects, and—inadvertently—publishing a very “flow-like” account of a woman using meditation (and the resulting flow experience) to cure herself of chronic disease.

I went to bed immediately, and my husband wished to send for the doctor. But I told him that I would rather wait until morning and see how I felt. Then followed one of the most beautiful experiences of my life. I cannot express it in any other way than to say that I did “lie down in the stream of life and let it flow over me.” I gave up all fear of any impending disease; I was perfectly willing and obedient. There was no intellectual effort, no train of thought. My dominant idea was: “Behold the handmaid of the Lord: be it unto me even as thou wilt,” and a perfect confidence that all would be well, that all was well. The creative life was flowing into me every instant, and I felt myself allied with the Infinite, in harmony, and full of the peace that passeth understanding…I do not know how long this state lasted, nor when I fell asleep; but when I woke up in the morning, I was well. (James, 1902/1958, p. 106)

This is, of course, only one example. James wrote volumes on the subject. I like this one because the woman uses the word “flow” and, with the “no intellectual effort, no train of thought” portion, gives a great early description of the subconscious nature of the state.

Because he was so thorough, many like to give William James all the credit for early flow work, but my personal feeling is that Walter Bradford Cannon had the real key insight. By proving that biology and physiology underpin altered states (and extreme human performance), he got God far enough out of the equation that we could actually start making scientific progress. This is not to dismiss the divine qualia that accompany these experiences, but Cannon created enough breathing room that real research could be done. It may be a case of what Edward Albee meant when he said: “Sometimes it’s necessary to go a long distance out of your way to return a short distance correctly.”

© Copyright 2012 Abundance the Book - by Peter Diamandis and Steven Kotler. All Rights Reserved.