
Why I’m Optimistic About The Future

The history of Earth is mostly a rocky and lifeless story. The first signs of life emerged about a billion years after our planet’s formation. They weren’t much either: mostly single-celled organisms that resembled today’s bacteria. Land animals emerged out of the oceans as recently as 500 million years ago, and the genus Homo came onto the scene a mere 2.5 million years ago. Complex life on Earth is the new kid on the block; natural selection spent most of its time keeping species the same, not changing them.

We humans are a different story. 200,000 years ago a few tens of thousands of us dotted the African plains. But then something happened. We spread across the globe, creating cities and villages along the way. Language evolved, and with it culture and societies. We began living longer, healthier lives, and our population skyrocketed as a result.

What’s peculiar about the rise of humans is that biologically speaking, nothing changed; the same genes that constituted our hunter-gatherer ancestors constitute us. But somewhere along the line a small change led to profound differences in our behavior within a short period of time. Whereas Homo erectus and the Neanderthals spent hundreds of thousands of years making the same tools over and over again, we were able to understand and organize the world better.

Whatever the genetic change was, we eventually gained the ability to learn from others. This was hugely important. Anthropologists call this cultural or social learning, and it not only describes our tendency to copy and imitate by watching others, it highlights our unique ability to recognize the best of a number of alternatives and attempt to improve on it. Many animals can learn, but only humans can learn and improve. As evolutionary biologist Mark Pagel explains, “even if there were a chimpanzee-Einstein, its ideas would almost certainly die with it, because others would be no more likely to copy it than a chimpanzee-dunce.”

What’s more is our ability to put ourselves inside the minds of others – what philosophers term a theory of mind. It helps us assign reason, purpose and intentionality to objects and people, which moreover allows us to understand things as being part of a bigger picture. Without a theory of mind we would probably still be using the same tools as we did 200,000 years ago.

In addition, theory of mind gives rise to emotions like empathy and sympathy, which give us the capacity to cooperate with people and groups outside of our kin. The other great apes – chimps, gorillas and orangutans – do not exhibit this type of behavior (bonobos are a possible exception). To borrow a thought experiment from the anthropologist Sarah Hrdy, imagine if you were on a 747 filled with chimps and a baby started to cry. In Hrdy’s words: “any one of us would be lucky to disembark with all their fingers and toes still attached, with the baby still breathing and unmaimed.” Recent psychological research confirms that our species’ ability to cooperate is partially innate. As bleak as our current headlines are, it appears we humans are wired with at least a minimal ability to get along with each other and have a sense of justice. We’re not perfect, but no chimp would donate to charity, and certainly no group of chimps could set up a charity.

This is important for many reasons. The most obvious is that economics is impossible without the means to cooperate with strangers. This is why, according to Matt Ridley, one of the key developments in our species’ history took place when we “started to do something to and with each other that in effect began to build a collective intelligence… [we] started, for the very first time, to exchange things between unrelated, unmarried individuals; to share, swap, barter and trade.” The effect of trade was specialization, which gave rise to innovation, which in turn improved technologies, and so on. Well before Smith hypothesized the invisible hand and Ricardo thought about how England and Portugal could efficiently trade wine, we had already begun to understand that communities were better off when their members honed their skills, pursued their self-interest and traded with other communities.

This is a simplified and incomplete story but you get the idea: humans flourished because they were able to learn from and cooperate with each other. It’s unclear what happened biologically, but the consequences were obviously vast.

What’s interesting is that the same cognitive mechanisms that allowed our species to prosper on the African savannah are also responsible for globalization in the 21st century. The difference is that face-to-face interaction has been joined by communication over the web.

In a 2010 TED lecture Chris Anderson addressed this point by exploring how web video powers global innovation. He explained the following:

A while after TED Talks started taking off we noticed that speakers were starting to spend a lot more time in preparation… [the previous speakers raised] the bar for the next generation of speakers… it’s not as if [speakers] ended their talks saying ‘step your game up,’ but they might as well have… you have these cycles of improvement apparently driven by people watching web videos.

Anderson terms this phenomenon “crowd accelerated innovation,” and uses it to explain not just how TED Talks are improving in quality, but how everything is. He is making the same general point as Pagel and Ridley: humans learn and innovate by watching and stealing ideas from others. But what’s unique about Anderson’s point is that it describes how the Internet is facilitating this ability. And the exciting part is that people will learn and imitate even faster with YouTube, Wikipedia, Google Books and the many other online services that focus on the distribution of content. As Anderson says, “this is the technology that is going to allow the rest of the world’s talents to be shared digitally, thereby launching a whole new cycle of… innovation.”

Whereas a famine could have easily wiped out the only community that knew how to harvest a certain crop, build a certain type of boat or make a certain type of tool – what anthropologists call random drift – the Internet not only preserves our collective knowledge, it makes it widely accessible, something the printing press wasn’t able to achieve to the same degree. This is why I’m optimistic about the future: the Internet will only accelerate our ability and desire to improve upon the ideas of others.

TED lectures over the years give us plenty of concrete examples to be hopeful about: Hans Rosling illustrated the global rise in GDP and decrease in poverty over the last several decades; Steven Pinker demonstrated the drastic decline in violence; Ridley and Pagel spoke about the benefits of cultural and economic cooperation; and most recently, Peter Diamandis argued that we will be able to solve many of the problems that darken our vision of the future. And because all this research is coming to us via the web, the next round of ideas will be even better. More importantly, it will inspire a generation of young Internet users who are looking to change the world for the better.

Why Atheists Should Be Allowed To Cherry Pick From Religion

Ever since Darwin published the Origin of Species, Nietzsche declared the death of God and Hitchens argued that religion poisons everything, atheists have struggled with atheism. Some deny the supernatural but are “spiritual”; some deny the historical credibility of the Bible, Torah or Quran but value their principles; some don’t believe in anything that cannot be explained by science yet maintain that humans possess an intangible essence or that there is an afterlife. I’ve even met folks who call themselves “atheists who believe in God.”

It’s easy to dismiss such beliefs as inconsistent or incompatible: how can someone both believe and not believe in God? Be scientific and religious? But this attitude ignores a truth that doesn’t get said enough: atheism is diverse.

The repetitive, attention-grabbing debates between fundamentalists and non-believers are one reason this is forgotten. It’s easy to assume that only two opinions exist when searching “atheism” on YouTube or Google returns talks and articles only from William Lane Craig or Christopher Hitchens.

But most atheists know that the worldviews of the fundamentalist and the staunch non-believer inaccurately portray religious belief as black and white. These more mainstream atheists know that there is a fairly large middle ground where religion and atheism can exist simultaneously to promote human flourishing. Religious people can believe in natural selection and be pro-choice even though many texts suggest otherwise, while atheists have no problem being moral and giving to charity even though they never went to Sunday school.

When it comes to scientific claims, Hitchens and Dawkins are right: the world wasn’t created in a few days; natural selection is an observable phenomenon; God probably doesn’t exist; one can be moral without religion. But when it comes to how we ought to behave and what we ought to value the great religious texts got a few things correct. The problem is that hardcore atheists don’t let the mainstream cherry pick the good parts of religion without criticizing them for being inconsistent or intellectually lazy. We have to allow atheism to incorporate those religious practices and principles that we know contribute to human flourishing.

My conviction is not only a reminder that atheism is more diverse than some make it out to be, but also that atheism can be improved if it considers the right religious themes.

In a recent TED lecture Alain de Botton assumes a similar position. He explains:

I am interested in a kind of constituency that thinks something along these lines… I can’t believe in any of this stuff. I can’t believe in the doctrines… but – and this is a very important but – I love Christmas carols! I really like the art of Mantegna, I really like looking at old churches and I really like learning the pages of the Old Testament. Whatever it may be you know the kind of thing I am talking about: people who are attracted to the ritualistic side, the moralistic communal side of religion but can’t bear the doctrine. Until now these people have faced an unpleasant choice: either accept the doctrine and have all the nice stuff or reject the doctrine and live in a spiritual wasteland…  I don’t think we have to make that choice… there’s nothing wrong with picking and mixing, with taking out the best sides of religion. To me atheism 2.0 is about a respectful and impious way going through religions and saying what could we use. The secular world is full of holes… a thorough study of religion can give us all sorts of insights into areas of life that are not going too well.

The good news is, I think, that most people agree. The problem is that their views don’t get the coverage.

At the risk of stating the obvious, let’s remember that knowing how to live the best possible life requires both humanistic ideals as well as ideals from many of the great religions. As Jonathan Haidt concludes his enjoyable book The Happiness Hypothesis, “by drawing on wisdom that is balanced – ancient and new, Eastern and Western, even liberal and conservative – we can choose directions in life that will lead to satisfaction, happiness, and a sense of meaning.”

The Evolution of Water

In a recent TED Talk, MIT cognitive scientist Deb Roy gave a presentation entitled “The Birth of a Word,” based on a remarkable study he conducted over the last three years. Roy’s study was motivated by his interest in language acquisition. Specifically, he wanted to know how infants learn individual words over the course of their development. To do so, Roy wired his house with audio/video equipment and recorded 90,000 hours of video and 140,000 hours of audio (200 terabytes) to track how his son acquired English. Here is a clip that compiles all the instances of his son trying to say the word “water” over a six-month stretch (listen for the gradual yet sudden transition from “gaga” to “water”).

Roy’s work is part of a growing body of research that tries to understand language acquisition, a hotly debated topic still in its infancy. In my last post I briefly touched on it by explaining why language is so metaphoric and interconnected. However, I realize that if we want to understand how language is acquired, it is fruitless to study syntax alone. It is like trying to understand planetary motion without mathematical equations; it’s easy to say that planets travel around the sun, but it is entirely different to explain why they do so.

Unfortunately, there isn’t a Kepler for language acquisition, so I can’t offer anything novel here. However, it is worth contextualizing the two contemporary theories from which the language acquisition debate has spawned. The first comes to us from Steven Pinker, who, in his 1994 book The Language Instinct, suggests that the ability to acquire language is genetic. He doesn’t explicitly state that there is a gene for language – “for any grammar gene that exists in every human being, there is currently no way to verify its existence” (322) – but he does say that it is “a distinct piece of the biological makeup of our brains” (18). So he is advocating that language is genetic in the same way that puberty or yawning is; there certainly isn’t “a gene” for either, but they are part of our genetic code nonetheless.

Part of Pinker’s project aims to figure out whether all human languages are unified by some sort of “universal grammar” (as Chomsky calls it), which holds that all “children must innately be equipped with a plan common to the grammars of all languages… that tells them how to distill the syntactic patterns out of the speech of their parents” (Pinker, 22). The universal grammar debate is key for the language instinct hypothesis because if language really is instinctual, then you would expect it to manifest similarly regardless of culture. Chomsky and Pinker have gone to great lengths to prove just that, but I will not get into the details for the sake of space (read Pinker’s book if you’re really interested).

In contrast, Stephen Jay Gould believed that thinking was an exaptation (a trait evolved for one purpose that later serves another) that led to language. In his words, “natural selection made the human brain big, but most of our mental properties and potentials may be spandrels – that is, nonadaptive side consequences of building a device with such structural complexity” (The Pleasures of Pluralism, p. 11). Evolutionarily speaking, Gould’s theory fits a bit better than Pinker’s, which faces the Wallace problem: how could the Neolithic Revolution have happened only recently if our brains had achieved their modern size roughly a million years earlier? Simply put, language is too recent a phenomenon to be explained by natural selection, which is a much more gradual process. Gould accounts for this by saying that language is a byproduct of cognitive abilities that already existed.

Who is correct? Like so many things, the Pinker-Gould debate falls victim to dichotomy-fever, i.e., our tendency to categorize things as either this or that. But that is not to say it isn’t helpful. As V.S. Ramachandran explains, “neither of them is [correct], although there is a grain of truth to each… the competence to acquire [language] rules is innate, but exposure is needed to pick up the actual rules” (171). Like others, Ramachandran believes that brains are language-ready, but do not have language built-in. This capacity has been referred to as the still-unidentified “language acquisition device,” and Ramachandran uses it to resolve Pinker and Gould’s contrasting views.

The next challenge is to understand exactly how genetics and experience work together in language acquisition. This is part of Deb Roy’s project, which I think will turn out to be vitally important in the years to come. It is also a goal of many neuroscience labs today. And because they are all showing the brain to be more and more dynamic, it appears that an understanding of language acquisition will involve an understanding of the brain from a bottom-up perspective. Unfortunately, that is a long way off.
