Unsung Hero: Hans Asperger

Hans Asperger saved autistic children from Nazi death camps. Steve Silberman related the story:

The children in Asperger’s clinic immediately became targets of the Nazi eugenic programs and, in fact, one of Asperger’s former colleagues was actually the leader of a secret extermination program against disabled children that became the dry run for the Holocaust. So the Nazis actually developed methods of mass killing by practicing on disabled children and children with hereditary conditions like autism (even though it didn’t have a name yet), epilepsy, schizophrenia. So immediately Asperger had to figure out ways of protecting the children in his clinic. … One of the ways he did that was to present to the Nazis in the very first public talk on autism in history his “most promising cases” and that is where the idea of so-called high functioning versus low-functioning autistic people comes from really — it comes from Asperger’s attempt to save the lives of the children in his clinic. …

In fact, the Gestapo came to his clinic three times to arrest Asperger and to ship the children in his clinic off to concentration camps or kill them at a so-called children’s killing ward. But [the Gestapo officer] had affection for Asperger, he thought he was very good at what he did, so he saved Asperger’s life and so that’s how Asperger survived the war.


A vanishing comment? That’s, that’s chaos theory.

Today, I encountered a blog entry posted to /r/philosophy, a community whose standards are high enough that anything short of a well-thought-out, cleverly argued piece is likely to be rejected. I was going to leave a comment there, but unsurprisingly, the entry in question didn’t hold enough water and was removed before long.

Unable to comment on the Reddit post, I went directly to the blog entry and wrote a brief response there. Later on, I checked back to find that my comment had mysteriously vanished without a trace. Whether it was a technical glitch or someone’s dislike of critique, I was miffed enough to bring my response here, where it can’t be deleted.

I simply pointed out that the author’s “DVD theory,” which I’ll refer to here as “cinematic playback theory,” was really nothing more than a variation of Penrose’s cyclic cosmology with its endlessly repeating Big Bangs, and thus not terribly original.

In the entry, the author made an argument about each Big Bang cycle commencing from a singularity that was an exact copy of its predecessor singularity, which would result in the same universe being created again and again (i.e., the cinematic playback) ad infinitum:

Considering that it is beginning from the very same starting conditions as it did before, could we not expect that, then, the universe would play out the same way again?
With everything and everyone following the same course of actions, through all of history, again and again and again?

The flaw in that argument is the assumption that the singularities across Big Bang cycles would all be identical. What if, I pondered, one cycle’s contraction caused the singularity seeding the next cycle to differ ever so slightly? Because chaos theory posits sensitivity to initial conditions, even infinitesimal changes to a singularity would bring about a radically different universe.
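As an illustration (my own sketch, not from the original entry), the logistic map, a textbook chaotic system, shows the point numerically: perturb the “initial singularity” by one part in a trillion, and the two trajectories decorrelate completely within a few dozen iterations.

```python
def logistic_trajectory(x0, r=4.0, steps=60):
    """Return the orbit of the logistic map x -> r*x*(1-x) from x0.

    At r = 4.0 the map is chaotic: nearby starting points diverge
    exponentially (positive Lyapunov exponent, roughly ln 2 per step).
    """
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)            # one "universe"
b = logistic_trajectory(0.3 + 1e-12)    # an infinitesimally perturbed one

# Early on the orbits are indistinguishable; later they bear no resemblance.
divergence = max(abs(x - y) for x, y in zip(a, b))
```

After the first iteration the two orbits still agree to nine decimal places, yet `divergence` ends up of order one: a trillionth of a difference in the “starting conditions” yields an entirely different playback.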

Update (December 3, 2017). When the author posted the blog entry to Reddit, it was framed as chaos theory questioning free will (so that a write-up ostensibly about cosmology would have some relevance to a philosophy community). If the cinematic playback theory, as formulated by the author, is true, there’d be bigger things to deal with than free will. For example, if the same universe is played back again and again, and you as a person would be born and die again and again, there’d be questions regarding the self and the value of human existence. Spoiler alert. Kinda like how in the movie Groundhog Day, the everyday experience of Phil (Bill Murray) was devalued because of its exact repetition.

The Coming Era of Post-Work

The gutting of manufacturing jobs by robotics during the last few decades is a portent (and microcosm) of what’s to come for the wider global economy in the coming years. For instance, with autonomous vehicles coming to fruition, I expect jobs in the driving professions to be mostly obliterated within twenty years. Progressive governments will need to manage the transition to post-work—a McKinsey report calls for a second Marshall Plan—but much of global politics is trending towards reactionary, neo-fascist regimes. If changes due to AI and robotics are mismanaged by ignorant, tech-illiterate governments, we may see income inequality on a scale beyond imagination, perhaps even revolution-worthy income inequality.

To deal with the transition, the analysts say many countries may have to launch “an initiative on the scale of the Marshall Plan, involving sustained investment, new training models, programs to ease worker transitions, income support, and collaboration between the public and private sectors.”

The options for income support could include initiatives such as more comprehensive minimum-wage policies, wage gains tied to productivity growth, or universal basic income.

Unfortunately, McKinsey notes that “investment and policies to support the workforce have eroded” over the past few decades.

Robots vs. jobs: Report says automation will displace up to 375M workers by 2030

The Secondary Integration of Jim Morrison

To quote the immortal Jim Morrison:

The most important kind of freedom is to be what you really are. You trade in your reality for a role. You trade in your sense for an act. You give up your ability to feel, and in exchange, put on a mask. There can’t be any large-scale revolution until there’s a personal revolution, on an individual level. It’s got to happen inside first.

He makes the case for a core self, but is there a self? Eastern philosophy has long posited that there is no actual self, and science is increasingly coming to the same conclusion. The self—or the ego with a narrative that thinks it matters—is a construct created by natural selection to further the goal of gene propagation.

That aside, I think Morrison’s main point is that the “cruft” of socialization obstructs who you can be. He uses the words “role,” “act,” and “mask,” which only make sense in the context of you and others, of subject and object. “Cruft” is a term we use in the software world to refer to superfluous code or architecture. You accumulate social cruft as you interact with others.

Picture it… You’re at the office water cooler, listening to John and Jane talk about a third party, Fred, who isn’t around. The typical gossip scenario. They talk negatively about something Fred did or said, which just happens to be something you do or say occasionally too. In response, you curtail your behavior or speech, or you carry on but try to hide what you do or say from John and Jane. Either way, you’ve just sacrificed a bit of your authenticity at the altar of socialization. It’s unavoidable, and you most likely do it unconsciously. After all, evolution “designed” us to be social creatures.

But as much as that “social cruft”—or the psychological artifacts experienced by an individual, in response to norms—acts as a social glue for blending in with others, it also puts up a barrier to the kind of ultimate freedom to which Morrison refers. To be what you can be, to be authentic, requires the Herculean, Nietzschean, and heroic act of rising above the herd instinct that’s been preprogrammed into each and every one of us. The Polish psychologist Kazimierz Dąbrowski named this oftentimes painful path “Positive Disintegration,” which entails “disintegrating” through multiple levels until a higher self emerges¹, the attainment of which is called “Secondary Integration.” Something akin to enlightenment in Buddhism? Perhaps. Few have achieved either, but it’s the journey that matters.

  1. Positive Disintegration can also be thought of as a series of cognitive dissonances, frequently brought about by neuroticism, whose resolutions result in the ascent of a ladder of authenticity and Aristotelian virtue by the individual.

The Bit

There inevitably comes a time when, as a Bible-believing Christian, you encounter the Book of Revelation and do one of two things: accept it provisionally with the hope of a forthcoming explanation, or reject it altogether as the delusions of an author too enamoured with an ancient narcotic. A beast with seven heads and ten horns? Four apocalyptic horsemen? A lake of fire? A purported vision of the future, written by a questionable follower of some guru who may not have even existed. Modern Christians handle the cognitive dissonance between Revelation’s fantastic imagery and today’s science by insisting that said imagery is metaphorical. Oh yes, the seven-headed beast is a stand-in for a “world government,” some kind of resuscitated Roman Empire. Sure, whatever you say, Christian apologist two thousand years removed from the context in which antiquity’s equivalent of The Lord of the Rings was authored by its likewise equivalent of Tolkien.

All of this is a lead-in to a revelation I had the other day, a microcosm of the psychedelia experienced by John, of the tiny island of Patmos in the Aegean Sea, sometime before 100 CE. The Bit—a great, heavy song superbly performed by Mastodon—was actually conceived by the Melvins for their 1996 album, Stag. Mind equals blown. I’m usually good at metal trivia, having been immersed in the subculture for as long as I have, but this one caught me off guard. Guess I should’ve been paying more attention to the musical brilliance of the Melvins all along.

Here’s a 2008 performance of Mastodon and the Melvins sharing the stage and deftly belting out The Bit:

Featured image source: theMELVINS.net

The Polymath’s Return

The twentieth and twenty-first centuries have seen a steady progression of increasing specialization, a consequence, no doubt, of the concomitant rise in the complexity of various fields in the sciences and humanities. The problem has become so bad that there are specializations within specializations. Members of the skilled or educated classes frequently find themselves employed in niche areas without much horizontal mobility (i.e., being able to move into another speciality within the same overall field). Specialization within specialization, or “recursive specialization,” is particularly endemic to information technology. It comes as no surprise, then, that while specialists have multiplied, generalists have declined. Generalists are known by many names: polymaths, multipotentialites, scanners, and Renaissance men; that last one is strikingly poignant, as the Renaissance is when such multitalented individuals last flourished (e.g., Leonardo da Vinci, the prototypical polymath).

It was in this context that I was heartened to read How Elon Musk Learns Faster And Better Than Everyone Else by Michael Simmons on Medium. Is Elon Musk bringing the polymath back into vogue? I like to think so.

Simmons, on the cross-disciplinary advantage:

Learning across multiple fields provides an information advantage (and therefore an innovation advantage) because most people focus on just one field.

For example, if you’re in the tech industry and everyone else is just reading tech publications, but you also know a lot about biology, you have the ability to come up with ideas that almost no one else could.

Ideas such as the genetic algorithms of evolutionary computation. Could siloed computer scientists or biologists, operating independently, have invented them? Unlikely.
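As a toy sketch of that cross-pollination (my own, with illustrative names, not drawn from Simmons’s article), here is a minimal genetic algorithm solving the classic “OneMax” problem, evolving a bit-string toward all ones using the selection, crossover, and mutation operators lifted straight from biology:

```python
import random

def evolve(length=20, pop_size=30, generations=100, mutation_rate=0.02, seed=1):
    """Evolve a population of bit-strings toward the all-ones string."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)  # OneMax: count of 1-bits

    # Random initial population.
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

    for _ in range(generations):
        def pick():
            # Tournament selection: the fitter of two random individuals wins.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b

        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Point mutation: flip each bit with small probability.
            child = [1 - bit if rng.random() < mutation_rate else bit
                     for bit in child]
            next_pop.append(child)
        pop = next_pop

    return max(pop, key=fitness)

best = evolve()
```

Nothing here is computer science alone: selection pressure, recombination, and mutation are all ideas imported from evolutionary biology, which is precisely the expert-generalist combination the quote describes.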

Each new field we learn that is unfamiliar to others in our field gives us the ability to make combinations that they can’t. This is the expert-generalist advantage.

Exactly. Specialization leads to siloed thinking, which doesn’t lead at all. It quickly results in a terminus of originality. If creativity is thinking outside the box, siloed thinking is relishing the constraints of the box.

At the deepest level, what we can learn from Elon Musk’s story is that we shouldn’t accept the dogma that specialization is the best or only path toward career success and impact. Legendary expert-generalist Buckminster Fuller summarizes a shift in thinking we should all consider. He shared it decades ago, but it’s just as relevant today:

“We are in an age that assumes the narrowing trends of specialization to be logical, natural, and desirable… In the meantime, humanity has been deprived of comprehensive understanding. Specialization has bred feelings of isolation, futility, and confusion in individuals. It has also resulted in the individual’s leaving responsibility for thinking and social action to others. Specialization breeds biases that ultimately aggregate as international and ideological discord, which in turn leads to war.”

As a kid, you were most likely asked, “What do you want to be when you grow up?” A policeman. A firefighter. An astronaut. The expectation was always a single answer. Specialization is, indeed, a dogma that gets drilled into us when we’re young. And the message doesn’t let up as we develop into adults: Follow your passion. (Note the singular noun.) But some of us—current and aspiring polymaths—aren’t content to tread the same banal path as everyone else; we want to blaze our own trails, which is especially true for the creatives amongst us.

Featured image source: GeekWire