Our Need for Need

It is a trite, well-established truth that people like being useful. But there’s more to it than that, or rather, there’s a stronger version of that claim. People do like being useful, but “useful” is a very broad term. Stocking shelves at a Walmart is useful, in the narrow sense that it has a use and needs to be done. And it’s true that some people may in fact actively enjoy a job stocking shelves at a Walmart. But on the whole, it’s not something most people would consider particularly enjoyable, and it’s certainly not something most would consider fulfilling.

Let us then upgrade the word “useful” to the word “needed”: people like to be needed. While stocking shelves at a Walmart is useful, the person doing it is fundamentally replaceable. There are millions of others around the world perfectly capable of doing the same job, and there are probably thousands of them just within the immediate town or city. If our fictional stocker were to suddenly vanish one day, management would have no trouble hiring somebody else to fill their shoes. The world would go on. Walmart would survive.

Now this is all well and good, but I would argue that there is an even stronger version of this claim: people don’t just like to be needed, people actively need to be needed. Over a decade ago, Paul Graham wrote an essay called Why Nerds are Unpopular; it’s a long essay with a number of different points, but there is one thread running through it that in my opinion has gotten far too little attention: “[Teenagers’] craziness is the craziness of the idle everywhere”.

The important thing to note about this (and Graham does so, in a roundabout sort of way) is that teenagers in a modern high school are not exactly idle. They have class, and homework, and soccer practice or band practice or chess club; they play games and listen to music and do all the sort of things that teenagers do. They just don’t have a purpose. They are literally unneeded, shut away in a brick building memorizing facts they’ll probably never use, mostly to get them out of the way of the adults doing real work.

This obviously sucks, and Graham stops there, making the assumption that the adult world, at least, has enough purpose to go around. Teenagers, and in particular nerds, just have to wait until they’re allowed into the real world and voila, life will sort itself out. And it’s true that for some, this is the case. A scientist doing ground-breaking research doesn’t need to worry about their purpose; they know that the work they are doing is needed, and has the potential to change lives. Unfortunately, a Walmart stocker does not.

To anyone who has been following the broad path of the news over the last decade, this probably doesn’t come as a surprise. It seems like every other day we are confronted by another article suggesting that people are becoming less happy and more depressed, and that modern technology is to blame. Occasionally it is also noted that this is weird. We live in a world of wealth and plenty. The poorest among us are healthier, better-fed, and more secure than the richest of kings only a few centuries past. What is causing this malaise?

The simple answer is that we are making ourselves obsolete. People need to be needed, sure, but nobody wants to need. Independence is the American dream, chased and prized through the modern Western world. Needing someone else is seen as weakness, as vulnerability, and so we strive to be self-sufficient, to protect ourselves from the possibility of being hurt. But in doing so, we hurt others. We take from them our need, and leave them more alone than ever before.

Of course, Western independence as a philosophy has been growing for near on three centuries now, and modern unhappiness is a much more recent phenomenon. There are two reasons for this, one obvious and the other a bit more subtle. To start with, our modern wealth does count for something. A small amount of social decohesion can trade off against an entire industrial revolution’s worth of progress and security with no alarm bells going off. But there is a deeper trick at play, and that is specialization.

In traditional hunter-gatherer bands, generally everybody was needed. The tribe could usually survive the loss of a few members of course – it had to – but not easily. Every member had a job, a purpose, a needed skill. That there were only a handful of needed skills really didn’t matter; there just weren’t that many people in any given tribe.

As civilization flourished, the number of people in a given community grew exponentially. Tribes of hundreds were replaced by cities of thousands, and for a time this was OK. Certainly, there was no room in a city of thousands for half the adult men to be hunters; it was both ecologically and sociologically unsustainable. But in a city of that size there was suddenly room for tailors and coopers and cobblers and masons and a million other specialized jobs that let humanity preserve this sense of being needed. If it was fine to be one of the handful of hunters providing food for your tribe, it was just as fine to be one of the handful of cobblers providing shoes for your town.

To a certain extent, specialization continued to scale right through the mid-twentieth century, just not as well. In addition to coopers and masons we also (or instead) got engineers and architects, chemists and botanists, marketers and economists. But somewhere in the late twentieth century, that process peaked. Specialization still adds the occasional occupation (e.g. software developer), but much more frequently modern technology takes them away instead. Automation lets one person do the work of thousands.

Even worse than this trend is the growth of the so-called “global village”. I, personally, am a software developer in a city of roughly one million people. Software development is highly specialized, and arguably the most modern profession in the world. At the end of the day however, I too am replaceable. Even if I were only one of the handful of developers in my city (I’m not), modern technology – both airplanes and the internet – has broadened the potential search pool for my replacement to nearly the entire world. My position is fundamentally no different from that of the Walmart stocker – I would not be missed.

At the end of the day, humanity is coming to the crossroads of our need for need. Obsessed with individuality, we refuse to depend on anyone. Women’s liberation is slowly freeing nearly half of the world’s population from economic dependence. Technological progress, automation, and global travel are all nibbling away at the number of specialized occupations, and at the replacement cost of the ones that remain. The future is one where we all live like the teenagers in Paul Graham’s essay: neurotic lapdogs, striving to find meaning where fundamentally none exists. Teenagers, at least, can simply grow up and go looking for meaning in the real world.

How is humanity going to grow up?

Other Opinions #48 – Intelligence Equals Isolation

http://tvtropes.org/pmwiki/pmwiki.php/Main/IntelligenceEqualsIsolation

Disclaimer: I don’t necessarily agree with or endorse everything that I link to. I link to things that are interesting and/or thought-provoking. Caveat lector.

If TV Tropes has a trope for literally everything, is anything really a trope anymore?

Pessimism and Emotional Hedging

In Greek mythology, Cassandra was given a dual gift and curse: that she would accurately predict the future, but that nobody would believe her prophecies. She became a tragic figure when her prophecies of disaster went unheeded. In modern usage, a Cassandra is usually just a pessimist: somebody who predicts doom and gloom, whether people pay attention to them or not.

We know that people are generally rubbish at accurately predicting risk; they seem to constantly over-estimate just how often things will work out. This is usually due to either the planning fallacy or optimism bias (or both; they’re very closely related). However, while that is by far the most common mistake, and certainly the one that’s gotten all the attention, the opposite is also possible. Yesterday I caught myself doing just that.

I was considering an upcoming sports game and found myself instinctively betting against the team I typically cheer for (that is, I predicted they would lose the game). However, when I took a step back I couldn’t immediately justify that prediction. The obvious prior probability was around 50/50 – both teams had been playing well, neither with a strong advantage – and I am certainly not knowledgeable enough about that sport or about sports psychology in general to confidently move the needle far from that mark.

And yet, my brain was telling me that my team had only maybe a 25% chance of winning. After much contemplation, I realized that by lowering my prediction, I was actually hedging against my own emotions. By predicting a loss, I was guaranteed an emotional payout in either scenario: if my team won, then that was a happy occasion in itself, but if they lost then I could claim to have made an accurate prediction; it feels nice to be right.
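The hedge described above can be made concrete with a toy expected-utility model. All of the numbers here are made up purely for illustration – the point is the shape of the payoffs, not their values. With a 50/50 prior and symmetric “being right” bonuses, both predictions have the same expected emotional payout, but predicting a loss shrinks the spread between best and worst case, which is exactly what a hedge does:

```python
# Toy model of emotional hedging on a sports prediction.
# All utility numbers are invented for illustration only.
P_WIN = 0.5          # honest prior: the teams are evenly matched

JOY_OF_WIN = 10      # how it feels when my team wins
PAIN_OF_LOSS = -10   # how it feels when my team loses
RIGHT_BONUS = 4      # "it feels nice to be right"
WRONG_PENALTY = -4   # ...and correspondingly bad to be wrong

def outcomes(predict_win: bool) -> tuple[int, int]:
    """(feeling if my team wins, feeling if my team loses),
    given which outcome I publicly predicted."""
    if predict_win:
        return (JOY_OF_WIN + RIGHT_BONUS, PAIN_OF_LOSS + WRONG_PENALTY)
    return (JOY_OF_WIN + WRONG_PENALTY, PAIN_OF_LOSS + RIGHT_BONUS)

for predict_win in (True, False):
    win_feel, lose_feel = outcomes(predict_win)
    mean = P_WIN * win_feel + (1 - P_WIN) * lose_feel
    print(f"predict_win={predict_win}: expected={mean}, "
          f"worst_case={min(win_feel, lose_feel)}")
# Both predictions have the same expected feeling (0.0), but predicting
# a loss raises the worst case from -14 to -6: trading away some upside
# for a smaller downside, which is what hedging means.
```

A risk-averse (or loss-averse) brain prefers the lower-variance option even at equal expectation, which would explain the instinct to predict against one’s own team.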

With this new source of bias properly articulated I was able to pick out a few other past instances of it in my life. It’s obviously not applicable in every scenario, but in cases where you’re emotionally attached to a particular outcome (sports, politics, etc.) it can definitely play a role, at least for me. I don’t know if it’s enough to cancel out the natural optimism bias in these scenarios, but it certainly helps.

The naming of biases is a bit confusing: I suppose this could just be lumped in with the existing pessimism bias, but I rather like the idea of calling it the Cassandra bias.

Wrapping up on God – Final Notes and Errata on “An Atheist’s Flowchart”

Over the last six philosophy posts (my “Atheist’s Flowchart” series) I’ve wandered through a fairly thorough exploration of the arguments underlying my personal atheism. Now that they’ve had some time to settle, I’ve gone back and re-read them and noticed all sorts of things that were confusing, or that I simply forgot to include. This post is a scattershot collection of notes, clarifications, and errata for that series.

Here we go:

In The Many Faces of God, I wrote “[from] the whole pantheons found in many versions of Hinduism, to the more pantheistic view favoured by Spinoza and Einstein”, which in hindsight is kind of confusing. I blame the English language. A pantheon (apart from the specific temple in Rome) is a collection of many distinct gods. A pantheistic view, confusingly, does not involve a pantheon but is in fact (quoting Wikipedia): “the belief that all reality is identical with divinity, or that everything composes an all-encompassing, immanent god”. Beliefs that actually involve a pantheon are called polytheistic instead.


The first piece of my argument, in two parts, ended up being long and fairly convoluted and still didn’t do a great job of explaining the core idea. One of the things that I failed to explain was this key phrase from the Less Wrong page on Occam’s Razor: “just because we all know what it means doesn’t mean the concept is simple”. I gestured confusingly in the direction of the claim that “god is a super-complicated concept” but I suspect that, unless you’re already well-versed in formal information theory, I wasn’t very convincing. Allow me to gesture some more.

Science explains nearly everything we can observe in a beautiful system of interlocking formulas that, while scary and complex to a layman, are still simple enough to be run on a computer. God cannot be run on any computer I know of. Many gods are, by definition, ineffable – complex beyond any possible human understanding. Even those that are hypothetically effable [is this a word?] are not currently effed [this one definitely isn’t] in nearly the same way we understand gravity, or chemical reactions, or the human heart.


In the third part of my argument, I mentioned briefly without explanation that none of the common logical arguments for god derive from my core axioms. It would have been helpful if I’d given some examples. I did not, because I am lazy. I am still lazy, and after poking around for a while cannot find a good example of something that I can work through in a reasonable amount of space.

If anybody wants to construct a formal logical argument from my nine axioms to the existence of god, please send it to me and I promise I will give it an entire post all to itself.


At the end of my fourth part, I linked to a t-shirt design which has already been removed from the internet. It was a snippet of this comic from Dresden Codak, specifically the panel in the third row with the text “I will do science to it”. It’s not really related to anything, but Dresden Codak is well worth reading.


In my fifth part I actually made a mistake and made a weak version of the argument I was aiming for. The better version, in brief:

  1. Science explains why people believe in god.
  2. You believe in science, even if you think you don’t.
  3. If god’s existence was the reason that people believed in god, that would contradict #1.

Therefore either god doesn’t exist at all, or the fact that millions of people believe is a coincidence of mind-boggling proportions which defies Occam’s Razor.

Other Opinions #46 – There’s No Real Difference Between Online Espionage and Online Attack

This one is a couple of years old but still relevant, especially with the recent ransomware attacks. We’re used to thinking in terms of human actors, where an informant is a very different kind of asset from an undercover operative. The former is a passive conduit of information while the latter is an active force for change. In technological conflict there is no such difference. Both activities require the ability to execute code on the remote machine, and once that is achieved it can be used for any end, passive or active.

https://www.theatlantic.com/technology/archive/2014/03/theres-no-real-difference-between-online-espionage-and-online-attack/284233/

And of course any vulnerability, once discovered, can be used by whichever criminal claims it first.

Disclaimer: I don’t necessarily agree with or endorse everything that I link to. I link to things that are interesting and/or thought-provoking. Caveat lector.