

Produce First, Sharpen Second: What Dylan’s Vomit Teaches Us About Creativity

For Dylan, “Like a Rolling Stone” began as a long piece of vomit – at least, that’s what he told two reporters back in 1965. As the story goes, Dylan, who was at the tail end of a grueling tour that took his pre-electric act across the United States and into Europe, decided to quit music and move to a small cabin in upstate New York to rethink his creative direction. He was sick of answering the same questions over and over again. He was sick of singing the same song over and over again. He wanted to liberate his mind.

This is why “Like a Rolling Stone” began as a twenty-page ramble. It was, as Dylan described it, a regurgitation of dissatisfactions and curiosities. What came next was Dylan’s true talent. Like a wood sculptor, he whittled away at his rough draft. He cherry-picked the good parts and threw away the bad. He dissected his words to work out what his message was. Eventually, Dylan headed to the studio with a clearer vision, and today “Like a Rolling Stone” stands as one of the most celebrated songs in rock history.

What’s interesting is how Dylan approached the writing process. The song started as a splattering of ideas. Dylan wasn’t even trying to write a song; initially, he didn’t care about verses or choruses. He compared the writing process to vomiting because he was trying to expel an idea that had infected his thinking – to get it from the inside of his body to the outside.

His strategy isn’t unique. In fact, it resembles the approach of many other artists throughout history. For example, in the Fall 1975 issue of The Paris Review, the Pulitzer Prize winner and Nobel laureate John Steinbeck gave this piece of advice about writing: “Write freely and as rapidly as possible and throw the whole thing on paper. Never correct or rewrite until the whole thing is down. Rewrite in process is usually found to be an excuse for not going on. It also interferes with flow and rhythm which can only come from a kind of unconscious association with the material.” As the saying goes, perfection is achieved not when there is nothing left to add, but when there is nothing left to take away.

This principle doesn’t just show itself in art. Economies, too, succeed through continuous innovation and wealth creation, and fail through stale ideas and bankruptcies. The Austrian economist Joseph Schumpeter popularized the term creative destruction to describe the simultaneous accumulation and annihilation of wealth under capitalism. As Schumpeter saw it, for every successful entrepreneur there were dozens of failures. But this was a good thing; capitalism was to be understood as an evolutionary process in which good ideas prevail over bad ones.

With these thoughts in mind, consider a study released this month by Simone Ritter of Radboud University in the Netherlands, conducted with Rick B. van Baaren and Ap Dijksterhuis. In the first experiment, the scientists recruited 112 university students and gave them two minutes to come up with creative ideas for relatively harmless problems (e.g., improving the experience of waiting in line at a supermarket). The subjects were divided into two groups: the first went straight to work, while the second first spent two minutes on an unrelated task meant to distract the conscious mind.

The first thing the psychologists found wasn’t too eye-opening: both groups – conscious and distracted – generated the same number of ideas. But the second finding was more intriguing. Here’s Jonah Lehrer describing the results:

After writing down as many ideas as they could think of, both groups were asked to choose which of their ideas were the most creative. Although there was no difference in idea generation, giving the unconscious a few minutes now proved to be a big advantage, as those who had been distracted were much better at identifying their best ideas. (An independent panel of experts scored all of the ideas.) While those in the conscious condition only picked their most innovative concepts about 20 percent of the time — they confused their genius with their mediocrity — those who had been distracted located their best ideas about 55 percent of the time. In other words, they were twice as good at figuring out which concepts deserved more attention.

When it comes to writing an essay for college, pitching a business plan or creating a work of art, we are hardwired to believe that our output is above average. As a result, we are blind to what needs improvement. It’s not just that we can’t see the holes and errors; we don’t think they exist. What’s interesting about Ritter’s findings is that they give us a strategy for overcoming this overconfidence. The lesson from her research is that in order to recognize our imperfections we must step back and play the dilettante. In other words, get distracted and don’t marry the first draft.

And this brings me back to Dylan’s vomit and Steinbeck’s advice. The reason we should “never correct or rewrite until the whole thing is down” is that we initially don’t know which of our ideas are worthwhile. It’s only after we get everything down that we are able to separate what works from what doesn’t. This is the lesson from Ritter’s research: we need to give the unconscious mind time to mull things over so it can convince the conscious mind to make adjustments. Or, as Nietzsche said in Human, All Too Human: “The imagination of the good artist or thinker produces continuously good, mediocre or bad things, but his judgment, trained and sharpened to a fine point, rejects, selects, connects…. All great artists and thinkers are great workers, indefatigable not only in inventing, but also in rejecting, sifting, transforming, ordering.”


The Future Of Religion

Religious people – that is, people who say that religion is important in their lives – have, on average, higher subjective well-being. They report a greater sense of purpose or meaning, are connected to stronger social circles, and live longer, healthier lives. Why, then, are so many dropping out of organized religion?

Last year a team of researchers led by Ed Diener tried to answer this question. They found that economically developed nations are much less likely to be religious. On the other hand, religion is widespread in countries with more difficult circumstances. “Thus,” the authors conclude, “it appears that the benefits of religion for social relationships and subjective well-being depend on the characteristics of the society.” People of developed nations are dropping out of organized religion, then, because they are finding meaning and wellness elsewhere.

The real paradox is America, where Nietzsche’s anti-theistic proclamation went unheard. Eighty-three percent of Americans identify with a religious denomination, most say that religion is “very important” in their lives, and, according to Sam Harris, 44 percent “of the American population is convinced that Jesus will return to judge the living and the dead sometime in the next fifty years.” In fact, a recent study even showed that atheists are largely seen as untrustworthy compared to Christians and Muslims.

Why does the United States, one of the most economically developed countries in the world, deviate from the correlation between religion and wealth? One answer is that trends always contain outliers. As Nigel Barber explains in an article: “The connection between affluence and the decline of religious belief is as well-established as any such finding in the social sciences…. [and] no researcher ever expects every case to fit exactly on the line… If they did, something would be seriously wrong.”

Whatever the reasons, a recent article by David Campbell and Robert Putnam suggests that Americans are catching up to their non-believing European counterparts. According to Campbell and Putnam, the number of “nones” – those who report no religious affiliation – has dramatically increased in the last two decades. “Historically,” Campbell and Putnam explain, “this category made up a constant 5-7 percent of the American population… in the early 1990s, however, just as the God gap widened in politics, the percentage of nones began to shoot up. By the mid-1990s, nones made up 12 percent of the population. By 2011, they were 19 percent. In demographic terms, this shift was huge.”

A study by Daniel Mochon, Michael Norton and Dan Ariely fits well with this observation. They discovered that, “while fervent believers benefit from their involvement, those with weaker beliefs are actually less happy than those who do not ascribe to any religion – atheists and agnostics.” It’s possible the “nones” Campbell and Putnam speak of are motivated to abandon their belief by a desire to be happier and less conflicted about their lives. This might be too speculative, but there are plenty of stories, especially in the wake of the New Atheist movement, of people who describe their change of faith as a dramatic improvement in their emotional lives. In a recent interview with Sam Harris, for example, Tim Prowse, a United Methodist pastor for almost 20 years, described leaving his faith as a great relief. “The lie was over, I was free,” he said, “…I’m healthier now than I’ve been in years and tomorrow looks bright.”

What does this say about the future of atheism? Hitchens and others suggest that a standoff between believers and non-believers may be inevitable: “It’s going to be a choice between civilization and religion,” Hitchens says. However, grandiose predictions about the future of the human race are almost always off the mark, and it’s likely that the decline in religion will remain slow and steady. It’s important to keep in mind that this decline is a recent phenomenon. It wasn’t until the 17th century, the so-called Age of Reason, that writers, thinkers and some politicians began to insist that societies are better off when they give their citizens the political right to communicate their ideas. This was a key intellectual development, and in the context of the history of civilization, a very recent one.

To be sure, radical ideologies will always exist; religion, Marx suggested, is the opiate of the people. But the trend towards empiricism, logic and reason is undeniable and unavoidable. Titles including God Is Not Great and The God Delusion are bestsellers for a reason. And if Prowse’s testimony as well as Campbell and Putnam’s data are indicative, there is a clear shift in the zeitgeist.

Why Atheists Should Be Allowed To Cherry Pick From Religion

Ever since Darwin published the Origin of Species, Nietzsche declared the death of God and Hitchens argued that religion poisons everything, atheists have struggled with atheism. Some deny the supernatural but are “spiritual”; some deny the historical credibility of the Bible, the Torah or the Quran but value their principles; some don’t believe in anything that cannot be explained by science yet maintain that humans possess an intangible essence or that there is an afterlife. I’ve even met folks who call themselves “atheists who believe in God.”

It’s easy to dismiss such beliefs as inconsistent or incompatible: how can someone both believe and not believe in God? Be scientific and religious? But this attitude ignores a truth that doesn’t get said enough: atheism is diverse.

The repetitive, attention-grabbing debates between fundamentalists and non-believers are one reason this is forgotten. It’s easy to assume that only two opinions exist when searching “atheism” on YouTube or Google returns talks and articles only from the likes of William Lane Craig or Christopher Hitchens.

But most atheists know that the worldview of the fundamentalist and the staunch non-believer inaccurately portrays religious belief as black and white. These more mainstream atheists know that there is a fairly large middle ground where religion and atheism can exist simultaneously to promote human flourishing. Religious people can believe in natural selection and be pro-choice even though many texts suggest otherwise, while atheists have no problem being moral and giving to charity even though they never went to Sunday school.

When it comes to scientific claims, Hitchens and Dawkins are right: the world wasn’t created in a few days; natural selection is an observable phenomenon; God probably doesn’t exist; one can be moral without religion. But when it comes to how we ought to behave and what we ought to value, the great religious texts got a few things correct. The problem is that hardcore atheists don’t let the mainstream cherry-pick the good parts of religion without criticizing them for being inconsistent or intellectually lazy. We have to allow atheism to incorporate those religious practices and principles that we know contribute to human flourishing.

My conviction is a reminder not only that atheism is more diverse than some make it out to be, but also that atheism can be improved if it draws on the right religious themes.

In a recent TED lecture, Alain de Botton takes a similar position. He explains:

I am interested in a kind of constituency that thinks something along these lines… I can’t believe in any of this stuff. I can’t believe in the doctrines… but – and this is a very important but – I love Christmas carols! I really like the art of Mantegna, I really like looking at old churches and I really like turning the pages of the Old Testament. Whatever it may be, you know the kind of thing I am talking about: people who are attracted to the ritualistic side, the moralistic, communal side of religion but can’t bear the doctrine. Until now these people have faced an unpleasant choice: either accept the doctrine and have all the nice stuff or reject the doctrine and live in a spiritual wasteland… I don’t think we have to make that choice… there’s nothing wrong with picking and mixing, with taking out the best sides of religion. To me atheism 2.0 is about a respectful and impious way of going through religions and saying, what could we use? The secular world is full of holes… a thorough study of religion can give us all sorts of insights into areas of life that are not going too well.

The good news is, I think, that most people agree. The problem is that these moderate voices don’t get the coverage.

At the risk of stating the obvious, let’s remember that knowing how to live the best possible life requires drawing on humanistic ideals as well as ideals from many of the great religions. As Jonathan Haidt concludes in his enjoyable book The Happiness Hypothesis, “by drawing on wisdom that is balanced – ancient and new, Eastern and Western, even liberal and conservative – we can choose directions in life that will lead to satisfaction, happiness, and a sense of meaning.”
