Sunday, June 25, 2017

How to Read More: Episode 1 of 3 - Understanding the Naive Phase

5 Minute Read

Whatever you're working on right now, whatever problem you're struggling with, is probably addressed in some book somewhere by someone a lot smarter than you. Save yourself the trouble of learning from trial and error. 

-Ryan Holiday

Ah, the benefits of reading. In addition to objectively making us smarter, habitual reading has also been shown to reduce stress, improve memory and analytical thinking, increase vocabulary, and even stave off Alzheimer's. All this with no negative side effects. Put another way, reading makes everything better.

Despite these undeniable benefits, we don't read much. We claim we're too tired, we don't want to spend money on books, and we just don't have the time. Then, we plop down on the couch and watch an average of nearly three hours of television while shopping online. In fact, despite not having time to read a book, 95% of texts are read within three minutes of being sent.

Welcome to a three-part journey (my first miniseries!) where we'll be exploring our peculiar relationship with books (Part I), an audiobooks manifesto of sorts (Part II), and a dramatic conclusion that will make you say, "today begins the new me!" as a musical montage swells in the background.

The Naive Phase

Understanding our relationship with books requires a quick trip to the beginning of our lives and to the dawn of our relationship with the recorded words of others.

In his 2012 book Mastery (which is certain to make my top five books of this year), Robert Greene explains that compared to other animals, we humans enter the world remarkably weak and helpless. While baby birds fly after just a few days and infant giraffes can walk within a half hour and run within a day (!), baby Homo Sapiens are weak, vulnerable, and comparatively helpless for anywhere from 12 to 18 years (give or take) before we can truly function on our own.

This extended phase of dependence serves a vital purpose, Greene asserts, as it gives us time to develop our most powerful weapon--the human brain. This period comes at a peculiar price, however, as our childhoods involve idealizing our parents, teachers, and anyone in an authority role whose strength and reliability we depend on.

We are corralled into classrooms, mandatory books are imposed on us (which is Bad Idea Jeans), and we are instructed to carefully read every page because there might be some irrelevant detail in there that will show up on some bullshit scantron test. Then we're in big trouble, as our grade would be affected and our future employment prospects would certainly suffer. Scare tactics, baby!

As if this weren't torturous enough, anyone who went to elementary school in the 80s and 90s certainly had to deal with the judgment-fest that was popcorn reading. If you want to make a 9-year-old with a speech impediment (me in 1990) hate reading for 25ish years, force him to popcorn read in front of his peers.

Love the movie, but this scene haunts me

As a result of this Stalin-like approach to our literacy, we feel sensitive and vulnerable about our reading abilities. Books become associated with chores, nerds, and feeling like we don't measure up to some arbitrary ideal, and many of us end up with contempt for books and a bizarre pride in how much we don't read. This is Greene's Naive Perspective to a T, and many of us (including the leader of the free world) never outgrow it.


As we enter adulthood, this naive approach ceases to make sense. There are no tests on our reading anymore, it doesn't matter if we skim or skip a few pages (or chapters), and there's absolutely no reason to feel any negative emotions whatsoever toward books.

Considering the availability of the works of the greatest thinkers and leaders in human history (you can access them using the very object you're looking at right now!!), failure to move past this phase has undeniable consequences. As Greene explains, our naivete subtly drains us of curiosity and replaces it with conformity to social norms, pursuit of leisure and immediate pleasures, and a general propensity toward a mechanical, robot-like existence. 


If you're still with me at this point, congratulations! You've recognized that the naive approach is for cowards, and you're ready to change your relationship with reading. Next week, we will explore several techniques for upgrading our approach to books. These include, but are not limited to, the following:
  • Audiobooks are not cheating
    • Neuroscience proves the impact on the brain is identical to traditional reading
  • The art of nonfinishing
    • Develop the habit of nonfinishing that which is boring or unproductive
  • You don't have to start at page one
    • It's your book
  • Talk to the text
    • Write notes and make highlights
    • It's your book
Until next week!

Tuesday, May 16, 2017

Our Evolutionary Need for Silence

4 Minute Read

Saying nothing sometimes says the most.
-Emily Dickinson

We're now at a point where noise is almost unavoidable. As I type this in my mother's backyard, I hear pleasant sounds of nature: birds singing, leaves rustling in the breeze, and the occasional bumblebee. In addition, the M-14 expressway, about a mile from where I sit, constantly hums in the background. Two houses down, a power saw has been in intermittent use for the past two hours, and two neighbors just began mowing their lawns. To top it off, some lady set off her own car alarm and apparently doesn't know how to disengage it.

A few months ago, I wrote of learning to not let car alarms infuriate me. More recently, however, I've learned I don't really have a choice in the matter. The human body has evolved to constantly scan the environment for dangers and opportunities, thus it subconsciously reacts to every stimulus it encounters. In the case of noise, sound waves vibrate tiny bones in the ear, and these vibrations fire electric signals directly into the auditory cortex of the brain.

Even while we sleep, environmental noise causes stress hormones to be released, and recent studies have found that people who live in cities experience chronically high levels of these hormones--particularly cortisol. Cortisol increases blood pressure, which contributes to heart disease and cellular damage, and has a debilitating effect on the immune system. As Michael Finkel writes in The Stranger in the Woods: "Noise harms your body and boils your brain."

Noise is pretty much global now

For evidence of this brain-boiling, look no further than a recent global survey by the World Health Organization, which concluded that people in wealthy countries--equipped with expressways, power tools, and Steve Miller's Abracadabra--suffer from depression at as much as eight times the rate of people in poor countries where the sounds of nature are more common. Further, suicide rates have been shown to increase alongside economic prosperity. To paraphrase Sebastian Junger, rather than buffering people from depression, increased wealth in society seems to foster it.

Silence is Beyond Golden

You don't need to voyage to a Tibetan monastery to win your brain back from lawnmowers and car alarms, but you do need to regularly seek out a soothing environment. Researchers have found that a daily fifteen-minute walk in the woods causes significant decreases in cortisol, and that two hours of silence per day prompts cell development in the hippocampus--the region of the brain responsible for memory.

Other studies, including one conducted at the University of Michigan titled Your Brain In The Woods vs. Your Brain on Asphalt, have found that time immersed in nature makes us calmer, less depressed and anxious, with improved cognition and memory. In other words, silence makes us smarter and happier. Evolutionary psychologists believe our bodies relax in quiet, natural environments because our senses evolved there and thus remain calibrated to them.

The burgeoning world of neuroscience agrees. An experiment at New York University placed 20+ Buddhist monks and nuns inside MRI machines, tracking brain activity while they meditated. The result? When we choose to sit in silence, the brain is as active as ever. What changes is where the activity occurs. While talking, texting, and keeping busy, the activity occurs in our cerebral cortex--which is a thin layer on the brain's surface. When we experience silence, the cerebral cortex rests, allowing deeper and more ancient brain structures to come alive. Those of us living busy, noisy and distracted lives almost never get access to these areas. As Finkel brilliantly writes, "Silence, it appears, is not the opposite of sound. It is another world altogether, literally offering a deeper level of thought, a journey to the bedrock of the self."

Afraid of Silence?

Despite the obvious benefits, few of us seek out silence. In fact, a University of Virginia study recently found more than half of men and about a third of women would rather endure painful electric shocks than sit in silence for 15 minutes. How can this be? Too much technology? Too many obligations? When we were hunter-gatherers, the average Homo Sapiens needed about 4,000 calories of energy to get through the day. This was pretty much food and a daily campfire. Today, Americans burn an average of 228,000 calories per person per day to feed our stomachs, propel our cars, charge our devices, and keep our homes between 68 and 72 degrees (I'll be damned if I put on a sweatshirt or open a window!). We're burning nearly 60 times more energy than our ancestors. Is life 60 times better?

Perhaps this is all a result of the disease of more. In our modern culture, as my man Yuval Noah Harari points out, the idea seems to be that if we have a problem we probably need more stuff. To get more stuff, we must produce more.

We don't have to buy into this idea. 2,500 years ago a Chinese fella named Lao-tzu wrote the Tao Te Ching, and there's a reason it remains one of the most widely read books in the history of the world (next to the Bible). "Those with less become content," says the Tao, "those with more become confused."

Sunday, April 30, 2017

Lawns: American Dream or Waste of Time?

6 Minute Read

It is not enough to be busy. So are the ants. The question is, what are we busy about?
-Henry David Thoreau

Out for a few jogs recently, I've observed massive amounts of manual labor in pursuit of beautiful lawns. Pausing to take note of the machinery, fuel consumption, man hours, and monetary considerations, I've become increasingly curious for the following reasons:

  • Aside from being nice to look at, lawns are objectively worthless. We don't eat them, we don't graze animals on them (for fear of trampling and ruining aesthetic value), we don't harvest our grass clippings and sell them at a farmer's market, and many lawn owners forbid people to even set foot on them.
These policies are often enforced by snipers

  • I'm fairly certain our hunter-gatherer ancestors never gave a rat's ass about lawns, nor did anyone have a lawn around the entrance to their cave.
  • At some point in human history, we collectively agreed that a well-kept lawn was to be desired, and that a poorly-maintained lawn was good reason to believe ol' Gil Carruthers had been laid off from the hat factory again, and that his wife probably left him for good this time.
  • So where did all this come from? Why do lawns dominate the landscape despite a bewildering amount of upkeep and an equally bewildering lack of objective value?
History of Lawns

The concept of beautiful, worthless lawns at the entrances to private residences and public buildings dates back to the late Middle Ages in France and England. Back then, no peasant could afford to waste land, time or energy on grass, so lawns were reserved for aristocrats interested in showing everybody up. A rich, manicured lawn in front of the chateau shouted to every guest and passerby, "check me out, bro. I ball so hard I can just waste a bunch of land and have my serfs do all the labor."

Built in the early 1500s, Chateau de Chambord (roughly 200 km south of Paris) is credited with originating the lawn as we know it.

During the ensuing centuries, we Homo Sapiens grew to associate lawns with power, wealth, and social status. As the Industrial Revolution gave rise to the middle class and provided it with mowing machines and automatic sprinkler systems, suddenly millions of people could afford to cultivate a home field and let all the neighbors know what time it was (metaphorically).

King Francis I? Nope. Kenny Powers.

We tend to think of the destruction of animal habitats as caused by industry, agriculture, oil pipelines, and Republicans, but don't lawns do the same thing? According to this article in the Chicago Tribune, roughly 95% of land in the lower 48 United States has been developed into cities, suburbs, golf courses, and farmland. Lawns account for more land usage (over 40 million acres) than corn, wheat, and fruit combined. In fact, a whopping 20% of the land area of Massachusetts and New Jersey is lawn.

Meanwhile, the honey bees that pollinate our food are dying off at troubling rates, monarch butterflies are nearing extinction, and the global ecological balance grows increasingly out of whack with every new subdivision.

Some more fun facts:

To be fair, I'm aware that there are people who pride themselves on their lawns, who genuinely enjoy lawn maintenance as a meditative hobby, and who possess limitless wisdom when it comes to various sods, fertilizers, and edging techniques. If this makes them happy, of course they are free to mow.

But it is also true that one in five Americans rates lawn maintenance as his/her least enjoyable chore--below raking leaves, folding laundry, and even emptying the dishwasher (easily the worst one). Perhaps Washington Post columnist Christopher Ingraham was on to something when he wrote, "Lawns are a soul-crushing timesuck and most of us would be better off without them."

But what would go in their place? Field turf? Japanese rock garden? Having spent the majority of my life around homes in cities and suburbs of the American Midwest, I've had a hard time picturing a front yard without a lawn. What would that even look like? Out for a jog earlier today in my native Plymouth and Canton, Michigan, I found two contrasting properties:

Hanford Rd., Canton, Michigan

Gyde Rd., Canton, Michigan

Which of these properties is more visually appealing and which is a more desirable place to live are both debatable. However, there is no debate when it comes to which requires more time, money, and energy to maintain. The average American spends 70 hours per year on lawn care, and that average includes people who live in apartments AND those who don't have that household responsibility. For those in charge of their lawns, that number is certainly much higher.

Time is a non-renewable resource. Are lawns worth it?

Saturday, April 15, 2017

Watch Your Language

3 Minute Read

Nothing either good or bad, but thinking makes it so.
-William Shakespeare

Behold! Marcel Duchamp's 1917 work of art, Fountain:

Here's the backstory: 100 years ago (almost to the day), my man Duchamp purchased this mass-produced urinal, signed it with the pseudonym R. Mutt, and popped it into a museum. Initially, the Society of Independent Artists rejected Fountain and refused to display it at an exhibit. 87 years later (2004), a group of 500 British art world professionals named Fountain the most influential artwork of the 20th century--ahead of works by guys like Picasso and Matisse.

Is it art? As it turns out, Fountain simply exists as an objective thing and it's up to us as individuals to determine its meaning--if there is one at all.

Art and urinals aside, what about the rest of our world and all of our opinions about it? Which foods we prefer, whether or not today's weather is convenient, whether or not we are successful, our emotions, on and on. Who decides what we call "good" and "bad"?

For example, Simon Sinek (so hot right now in the intellectual community) used to be nervous before speaking in front of large groups. Then he realized that the symptoms that accompany nervousness--increased heart rate, perspiration, anticipation of what's coming, etc.--are exactly the same as those that come along with excitement. In short, nervousness and excitement are biochemically identical; the difference is in how we choose to interpret these sensations.

Nervousness and excitement aren't the only emotions we make mistakes about. Countless psychological studies have revealed that perceived "good" and "bad" emotions are so similar to one another that we often can't tell the difference. People mistake fear for romantic arousal, and often find jokes to be hilarious even though they can't understand what's so funny.

Is this love or am I terrified?

We often talk of weather as pleasant or unpleasant, but how can this be? Isn't weather just an objective reality? Can we choose to enjoy a cloudy day and call it good?

Defining Success

In addition to our difficulty in defining art and determining how we feel, we also seem to drive ourselves nuts trying to figure out if we are successful or not. Evolution has hard-wired us to want to fit in with the tribe, to compare ourselves to others, and to define our worldly success accordingly.

Now that we no longer live in tribes, any success we achieve is usually followed by meeting other people who are more successful and who make us feel insignificant. Then, we subconsciously define them as successful and we are (by default) striving for success again.

Up to Us

At the end of the day, it's up to us. There's the objective world--weather, the opinions of other people, and urinals--that is not ours to change, and there's our opinion of that world, which is entirely up to us. When defining success, are we using an inner or outer scorecard? As billionaire investor Warren Buffett explains, if we base our self-worth on what we perceive the outside world thinks of us, we're setting ourselves up for disaster.

By directing attention inward, however, we take control and decide what everything means. We decide if life is a urinal or a work of art.

Sunday, April 2, 2017

Sugar is the New Stalin

7 Minute Read

Emancipate yourselves from mental slavery, none but ourselves can free our minds.
-Bob Marley, Redemption Song

You probably haven't recently thought to yourself, "I wonder what, if any, is the overlap between Stalin's tyrannical rule of the 20th century Soviet Union and the sugar industry's relationship to the current obesity epidemic?" Don't worry, I've got you covered. I'll even weave in some March Madness for entertainment purposes.

Here we go:

If there were a March Madness-style office pool for "worst human ever," the most popular pick would probably be Hitler. Hitler's atrocities are well known and he had a trademark look (swastika and ridiculous mustache), much like North Carolina's basketball pedigree is well-documented. UNC frequently plays in primetime against Duke and other longtime rivals, and celebrated alums such as Michael Jordan and Rasheed Wallace are often in attendance.

Stalin, who I've lampooned before, is more of a Gonzaga. Yes, he's a #1 seed and not exactly on a surprise run, but there is less certainty around Joe Joe. The scale of his atrocities (such as how many Ukrainians he starved) is still debated, and the facts that do exist came to light long after his death. Similarly, even long-time college basketball enthusiasts such as myself couldn't tell you what conference Gonzaga plays in or who their arch rival is. Stalin's atrocities (much like sugar's) live in a Gonzaga-like uncertainty.

By the way, if such a bracket pool were to come up, I'm going with Uday Hussein as a sleeper five-seed to take the crown.

OK enough hoops. Let's get down to it.

Stalin ruled the then-Soviet Union from 1922 to 1953 (before just kinda mysteriously dying) and was pretty much a cheat and a liar the whole time. A few highlights:

  • Stalin is ultimately responsible for the Holodomor, a man-made mass famine in Ukraine from 1932-1933 that starved millions of Ukrainians. Soviet officials initially denied it even happened; early estimates then ranged from 1.8 million to 12 million deaths. Quite a range. More recent estimates are between 7 and 10 million.
  • Stalin's Gulag prison camps were home to actual criminals and anyone who disagreed with Stalin on anything...which was also a crime. These labor camps existed throughout the Soviet Union and were probably home to 15 million different people at one point or another. It's impossible to estimate how many people were killed as a result of this.
  • The one that really caught my eye is known as the Katyn Massacre: After ordering up the executions of over 25,000 Polish POWs in early 1940, Stalin personally told a Polish general that they'd "lost track" of the POWs somewhere in Manchuria. A year later, Polish rail workers found the mass grave of POWs, so Stalin blamed the Germans until the day he died. It wasn't until 1990 that the Soviets took responsibility.
At any rate, Stalin was able to pull this off largely because of the cult of personality he created early on. "Cult of personality" refers to the use of propaganda, mass media control, and political thuggery to create a larger-than-life, often worshiped image. Soviet children grew up believing Stalin was their protector, their source of anything good, and that he was to be adored.

Oh but he seems so grandfatherly!

This cultish brainwashing was so unavoidable for Soviet children that when they became adults, they were unable to know any other reality. So there's 31 years of Stalin for you.

Let's pivot and look at sugar:

Again to deliver a March Madness perspective, let's look at cigarettes and sugar. In terms of damage to your body, cigarettes are Hitler and North Carolina, and sugar is Stalin and Gonzaga. We're well aware of how bad cigarettes are, but sugar has been living in a confusing Stalin-zone since 1967, when the Sugar Research Foundation paid Harvard scientists about $50,000 (in today's dollars) to conduct BS studies that would find saturated fat--not sugar--responsible for heart disease and obesity.

As obesity related illnesses have annually killed millions of people in the 50 years since the Harvard study, big sugar companies like Coca-Cola and Nestle have thrown millions upon millions of dollars at various public health organizations and studies to--at the very least--keep us confused as to what's making us fat.

Look no further than the food nutrition label on anything in your cabinet to see the Stalinesque brainwashing in action. Today, when you look at the nutrition facts, you'll find government-recommended daily amounts for each nutrient. You will not find a daily recommendation for sugar:

This is called 'Pullin' a Stalin'!

It may be news to some that sugar is the worst thing to happen to public health since the plague, but countless recent studies have found that eating too much fat doesn't necessarily make us fat, while eating too much sugar definitely does. Looking at the top ten causes of death in the US, a high-sugar diet (aka the typical American diet) is a direct contributor to at least six (heart disease, cancer, stroke, Alzheimer's, diabetes, and kidney disease), and you wouldn't necessarily be considered a lunatic if you argued sugar has at least an indirect hand in the other four (lower respiratory disease, accidents, influenza/pneumonia, and suicide).

Sugar's tyranny developed a cult of personality and turned full-blown Stalin in 1977, and continues today. Consider the early life of a middle-class American child:

These are real

  • Once a child is able to operate (and crave) an iPad, YouTube soon follows, and children barely a year old are peppered with advertisements for sugar nonsense. These ads, displaying Stalin-like contempt for human life, often feature adored movie and television characters--further twisting young imaginations and creating a nation of addicts:

Whatever it takes 

  • With mind-bending ads continuing in the background, children are then exposed to American cultural events such as holidays. These annual celebrations often began as religious and national days of remembrance, but sugar has rendered them almost unrecognizable:
    • Valentine's Day - While still sort of about love, it's the only time you'll ever buy a random box of chocolates and put yourself through the torture that is eating those chalky candy hearts with the writing on them.
    • Easter - Once a celebration of the resurrection of the son of God, today's child is exposed more to pursuits of sugar-filled plastic eggs, chocolate bunnies, jelly beans, and the somehow-still-a-thing sugarbombs known as Peeps.
    • Fourth of July - Although we've definitely lost sight of the meaning of the holiday (the USA's first day as a place), American culture seems more focused on explosives, cheap beer, and assorted meats than sugar.
    • Halloween - Do yourself a favor and try to explain this one to a Slovenian. I did this once, and it went something like this: "Well, alright, so I think it used to be about being scary or something, but now it's kinda morphed into kids dressing up like various cartoon characters and going door to door to get sugary items from strangers. Then, when these kids grow up, the ladies dress like scary prostitutes and the men use it as a reason to get hammered."
    • Throw in Thanksgiving pies and a month of Christmas cookies and you see the point I'm trying to make.

One striking difference between these two ruthless killers is that many of Stalin's kills involved horrifyingly slow starvations, while sugar operates in the first era of human history in which there is too much food and overconsumption is the problem. For those who enjoy graphs as I do, I grabbed a few from this health blog (worth reading):

To bring this puppy full circle, I used to run two marathons per year (2013-2016) while eating whatever was in front of me and "doing my best" to eat vegetables now and again. I weighed well over two bills and just assumed that's how things were supposed to be until a nasty case of plantar fasciitis forced me to stop running for almost a year. During that time, I read a few books and decided to stop eating sugar and carbs in an effort to offset the new sedentary lifestyle. In three months of inaction, I lost thirty pounds, had to buy new jeans, and I found myself with more energy and focus throughout the day. 

Which brings us back to Stalin. It's weird to think about the life of a Soviet peasant, and the type of courage it probably took to even think there could be another way. Whereas Stalin died before communism fell, other communist dictators weren't so fortunate. 

Nicolae Ceaușescu (chow-CHESS-koo), Romania's mustache-less version of Stalin from 1965 to 1989, was essentially overthrown during a speech in the middle of Bucharest on December 21st, 1989. You can see it in the video below: at about the 2:15 mark, homeboy is delivering a seemingly routine speech complete with Trump-like statements such as "I believe in the best for Romania!" and "Hey, this whole me-being-in-charge thing is gonna continue, and thus everything will continue to be rad!" Then, a small group (probably aware of the recent fall of the Berlin Wall and the revolutions in other Eastern Bloc countries) starts to boo and jeer. Please, do yourself a favor and watch the look on his face when this happens:

For those curious as to what happened to Ceaușescu: after 80,000 Romanians collectively decided they didn't buy his bullshit anymore AND that there were way more of them than there were of him and his cronies, they forced their way into Ceaușescu Tower (or whatever), he escaped via helicopter, then was captured, tried, and executed (with his wife) four days later.

Here's my point with all this: If Romania can collectively shift their mindsets and overthrow their Stalin, is the same thing possible for us with Big Sugar? How can we collectively become aware of this imaginary order and Ceaușescu the sugar industry?

I'll tell you how. Take the Fed Up 10-day challenge, Ceaușescu your personal sugar industry for a week and a half, and see how you feel. I blogged about my experience about six months ago when I was eight days into it. A sip of soda is a sip of Stalin. It doesn't have to be this way!

Sunday, March 19, 2017

The Conscious Decision to Want Less

7 Minute Read

All of us waste precious life doing things we don't like, to prove ourselves to people we don't respect, and to get things we don't want. Why do we do this?
-Ryan Holiday, Ego is the Enemy

We Homo Sapiens are a finicky bunch. It seems as though once we've attained whatever pleasures we seek, it isn't long before we want more. As bestselling author and history professor Yuval Noah Harari explains, nobody is ever made directly happy by getting a promotion, winning the lottery, or even falling in love. These events have the ability to make us happy by triggering pleasant sensations in our bodies, and those sensations alone are what make us happy. 

The bad news about these sensations is that millions of years of evolution have created a condition in our minds that causes them to wear off relatively quickly, leaving us with the desire to experience them again and again. For thousands upon thousands of generations, our pleasure/pain system evolved to increase our chances of survival and reproduction, not our happiness.

Think about it: what if some rare mutation had created a hunter-gatherer who, after enjoying a delicious antelope and a blissful night with a love interest, enjoyed an everlasting sensation of happiness and contentment? Who knows--a million years ago this may have happened. If it did, this hunter-gatherer would have enjoyed an extremely happy and short life, and his genes wouldn't have gotten very far. Conversely, his rivals, who were designed to pursue more antelope and more mates, had a much better chance of surviving and passing their genes to the next generation.

Reflecting on his experience coaching Magic Johnson, Kareem Abdul-Jabbar, and the Showtime Los Angeles Lakers of the 1980s, Pat Riley famously coined the phrase The Disease of More, explaining that "success is often the first step toward disaster." After winning the 1980 NBA title, the following season's Lakers played like a collection of individuals, each looking for his own version of more--more playing time, more money, more media attention, etc. The '81 Lakers lost in the first round of the playoffs. Some title defense.

The 1980 Championship led to the Disease of More

And so it goes for us. As we pursue whatever it is we're after--lucrative jobs, attractive mates, big houses--the deeper parts of our minds only understand that we are pursuing pleasant sensations. These sensations are designed to be fleeting and, if we're not deliberate about what we're after, we have no choice but to pursue them constantly.

By Design or By Default?

Take a moment to consider what's on your calendar for the day and week ahead. Are these commitments a result of what's important to you, or what's important to someone else? The fact is, if we don't prioritize our lives, someone else will. In addition to our predisposition to want more, we're also hard-wired to desire social acceptance; thus, we often make decisions by comparing ourselves to our peers.

In other words, when we don't have a clear sense of what we're pursuing, we fill the void with our own social games based on comparing ourselves to others and pursuing what we think others want. We overvalue nonessentials like new cars and big houses, and we pay attention to trivial intangibles like how many Twitter followers we have and how many likes we get on a Facebook post. Considering the opportunity cost--that time and attention could be going to our loved ones, our health, etc.--this poses a real problem. 

As Captain Ahab pursues Moby Dick over the course of 822 pages, it becomes clear that Ahab is chasing the whale for reasons he doesn't even understand anymore. He's simply hell-bent on winning the game.

Wait, what am I doing here?

Decision Fatigue

Can anyone remember what it's like to be bored? It's rare these days. Not long ago, when there was a line at the grocery checkout, a friend was running late to meet for lunch, or the flight was delayed, we had to wait. Now, however, a staggering amount of information, entertainment, and distraction is at our fingertips at all times. I'm not suggesting humans necessarily need to be bored all the time, but this abolition of time spent alone and in thought is certain to have consequences.

According to most psychologists, our ability to make decisions (also known as willpower) is like the muscles in our bodies in that it wears down when used over and over again. Every decision we make is like another rep in the gym. As we burn decisions on things that don't matter, willpower fatigues, and we begin to make decisions by default.

Grocery chains are well aware of this, and have structured their stores accordingly: our willpower is strongest when we first enter the store, so the healthy food (produce) is right there up front. As we proceed through and willpower diminishes, there are the cookies, candy, and ice cream. By the time we reach the checkout--exhausted from all the decision making--that's where we'll find alcohol, tobacco, and gossip magazines (who is buying those things!?). Now we're out of willpower, and we default to the store's agenda.

Outside the grocery store, default can often mean giving in to the world around us, and that world is almost constantly pushing us to want more. Get a job that pays more so that you can spend more, get more, and keep the cycle going. You'll notice that the outside world implores us to take exotic vacations, dine at fancy restaurants, and buy new electronics, but we never seem to be encouraged to go for a walk with Mom, sit by a pond, or visit a National Park. 

Photo from a recent trip to Del Norte campground at Channel Islands National Park (that's my shadow). Unlike Vegas and Disney, National Parks do not advertise.

How to Want Less

Perhaps Tyler Durden, Brad Pitt's character in Fight Club, said it best, "Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don't need."

According to several recent studies, the number one regret of the dying is that they never pursued their own dreams and aspirations, opting instead to live up to the expectations of others. With this in mind, perhaps it's time to take Ryan Holiday's advice: To take time out, figure out what's important, and take steps to forsake the rest.

To find that space, we may have to say no to certain people and commitments. Jeff Weiner, CEO of LinkedIn, schedules two hours of empty space into his calendar every day in order to process what's happening around him. Bill Gates famously goes a step further with a biannual Think Week--a full week off simply to think and read.

Our ancient biological desire for more no longer makes sense in a world saturated with stuff and the opinions of others. Whether it's two hours per day, two weeks per year, or 10 minutes each morning, it's imperative to deliberately create space to want less and to do less.

If this is all there is, it is more than enough.

Sunday, March 12, 2017

The Puzzling Story of Daylight Saving Time

5 Minute Read

We've all obeyed this dictum for a hundred years, and no one can really understand why.
-Michael Downing

If one thing is clear about Daylight Saving Time (DST), it's that it is unclear.

A recent informal survey of some of the smartest people I know regarding the reasoning behind this biannual jet lag infusion produced the following hypotheses:

Uh, I think it's the farmers. Isn't it?
I heard it's for school buses and, like, kids.
Isn't it for, uh, like TV?

These absurd answers, combined with my own lack of knowledge on the issue, have prompted this blog post. What are the origins of DST, and why do we still do it? Here's what I found out:

First let's dispel a few myths:

1. Daylight Saving Time has absolutely nothing to do with farmers. In fact, considering their rise-with-the-rooster lifestyle, farmers were probably among the last Homo Sapiens to own clocks.

2. Ben Franklin didn't seriously want to do it. Although he was the first to bring it up, he was certainly joking. During his time as American ambassador to France, Franklin wrote a letter spoofing the lazy French and making them aware of how much daylight they slept through.

The Real History
While the idea had been attempted in a few local communities, credit for implementation at the national level goes to none other than the bold, innovative, and sometimes genocidal early 20th century Germans! This is what most of them looked like:

Here's how it went down: shortly after making the bold (and unprecedented) decision to fight a war against THE WORLD, Kaiser Wilhelm, Paul von Hindenburg (of blimp disaster fame), and other jerks in spiked helmets realized they had no plan for any sort of wartime economy, and that they would have to improvise. Here's what they came up with:

1915: Start rationing bread and hope that does the trick. This was a short-term success.

1916: Slaughter millions of pigs, eat them, and free up whatever grains the pigs were gonna eat--then eat those too. While another short-term success, this had obvious consequences after they ate everything and the war continued. The threat of civilian starvation became constant from this point forward.

1916: Ok let's have everyone eat turnips for a while. As widespread malnutrition set in, there was virtually only one food left and it was ordinarily used as animal feed. Turnip Winter is well-known in German culture today, and is almost certainly invoked by scores of contemporary German parents as part of vain efforts to get kids to eat unsavory vegetables and such.

1916: Maybe we can conserve kerosene if we get everyone to move their clocks ahead an hour? This kinda makes sense if you think about it. The world was lit by kerosene lamps and candles at the time, and if people utilized natural light for an "extra" hour each evening, it could add up to something over time. So the Germans implemented it and sparked a wartime fashion trend. Like early 90s suburban middle schoolers and Girbauds, soon all of Europe wanted to be down with DST (yeah you know me!).

Was it effective? According to Michael Downing's Spring Forward: The Annual Madness of Daylight Saving Time, it's impossible to know. Energy consumption varies widely with the weather from year to year, and that's under ordinary circumstances. 100 years ago with the world at war, I'm assuming a mass data project of such little consequence was not a high priority. The world pressed on with DST based on a hunch.

1918 DST Promotional Material

American Response

By the time the United States jumped into WWI in the spring of 1917, the DST domino effect had run its course in Europe. American advocates, channeling some British dude named William Willett, proposed American DST, threw out a baseless prediction of $25 million in annual savings, and proceeded to confuse a relatively simple-minded nation with an unprecedented concept.

Those in favor of DST believed working parents would be able to play with their kids for another hour, working women could walk home safely during daylight, and that DST would lead to an increase in social welfare (somehow).

Those opposed employed similar irrationality, arguing DST would prevent people from enjoying leisurely mornings and would directly cause overcrowding of transit lines (somehow). Ultimately, in 1918, America decided to give it a shot. And then? A comedy of errors!

  • 1918 - America implements Daylight Saving Time
  • 1919 - Everyone hates it, America repeals DST and replaces it with nothing
  • 1920 - 1941 - Everything is weird, but people can get along ok.
  • Spring 1930 - Josef Stalin declares DST to be a thing in the Soviet Union and vows to personally devour anyone who doesn't think his idea is awesome.
  • Fall 1930 - Stalin, bless his heart, hilariously forgets to order everyone to fall back later that year, resulting in all clocks in every Russian time zone being off by an hour for the next 61 years. Oh, Josef Stalin, I had you pegged as a savage and murderous tyrant, but forgetful!? C'mon Man!
  • 1941 - 1945 - FDR tells everybody we're shifting to "war time" (DST). Everyone likes the idea and it's back on for a while.
  • 1945 - 1966 - Everything is weird again, and our prosperous lifestyles are making things complicated. No federal law exists and cities/towns/communities can just do whatever. 23 different start/end dates to DST exist in Iowa alone, and even the Twin Cities of Minneapolis and St. Paul can't seem to get on the same page. People are missing meetings, late for sock hops, and an hour early for church.

Minnesota Newspaper - 1965
  • 1966 - The Uniform Time Act of 1966 is passed. Everyone is mercifully forced onto the same page, and that page happens to include DST.
  • 1980s - Clorox and 7-Eleven fund the Daylight Saving Time Coalition in an effort to extend DST beyond 1987. They convince both Idaho senators to vote for it based on the theory that during DST, fast-food restaurants sell more French fries, and those are made from Idaho potatoes.

Modern Daylight Saving Time

Today, DST plods along despite a glaring lack of convincing evidence that it does any good for anybody. In fact, recent studies have concluded the annual "spring forward" leads to an immediate increase in auto accidents, workplace injuries, and even heart attacks (!).

Before conducting research for this blog, I believed that people belonged in one of two camps: Those in favor of DST abolition and the ignorant. Now, however, I understand there is a third school of thought on this issue.

If we were to get rid of Daylight Saving Time, what goes in its place? Nothing? Do we leave the clocks sprung forward or fallen back? Meet halfway? Are we ready to have this debate? When the US toppled Saddam Hussein in 2003, they knew they were getting rid of a dickhead. What they did not prepare for, however, were the unintended consequences of overthrowing an iron-fisted dictator who was able to maintain order by way of assholery...and now we have ISIS.

The similarities between Saddam Hussein and Daylight Saving Time are uncanny

In my recently-informed opinion, Daylight Saving Time--much like Saddam Hussein--is bad for the world. But the alternative--lengthy debates based on personal preferences while we certainly have bigger fish to fry--could be worse. So we're stuck with it.

DST across the globe today. 1/4 of the world population is affected by DST.

Saturday, March 4, 2017

Troublesome '17: Differentiating Signal From Noise

5 Minute Read
We face danger whenever information growth outpaces our understanding of how to process it.
-Nate Silver

In addition to our status as the only animals able to think about the future, we Homo Sapiens are the only earthly beings able to ask questions. In The Upright Thinkers, Leonard Mlodinow writes, "Chimpanzees and bonobos can learn to use rudimentary signing to communicate with their trainers, and even to answer questions, but they never ask them."

For all other earth residents, things are just what they are. We, however, are in constant pursuit of why and how.

Recording Information

Roughly 10,000 years ago, we figured out a couple of nature's secrets and acquired the ability to farm and to domesticate animals. As a result, we no longer needed to hunt and gather our food and could begin to hunt and gather knowledge. This was a big step.

About 5,000 years ago (3,000 BCish), the defining trait of human civilization appeared--the written word. Once we could write shit down, we could build upon the knowledge of those who came before us and, ultimately, outgrow the limitations of individual knowledge and memories. Awesome!

500 or so years after that, we decided to create a profession dedicated to passing on knowledge, and we have evidence of the first schools in Mesopotamia. So to all you teachers out there, your profession is roughly 4,500 years old. Not quite as ancient as farmers and prostitutes, but a cornerstone of humanity nonetheless!

The First Books

Jumping ahead a couple millennia, we see the emergence of Homer's Odyssey, the Bible, and works by Plato and Aristotle that you may have heard of. But even then, books were luxury items produced one at a time by scribes, and would have cost roughly $20,000 apiece in today's money.

The pursuit of knowledge, although technically possible, remained impractical until 1440, when a German fella named Johannes Gutenberg invented the printing press. Now books could be mass-produced, and that $20,000 copy of the Bible you had your eye on quickly became $70. Game changer.

Information quickly became ubiquitous, but competing authors and publications often contradicted each other. This had consequences: Martin Luther's Ninety-five Theses (1517) sold over 300,000 copies (bestseller!), and promptly dragged Europe into a series of wars that lasted over a century (1524 - 1648).

Gutenberg's printing press also allowed for mass production of errors. Look no further than what we now refer to as the Wicked Bible, published in 1631, which omitted one measly word and ended up causing all sorts of tomfoolery in London:

The "Wicked Bible" 1631

Regardless of whether you're more offended by 124 years of butchering each other in the name of God or by "honey, I won't repeat myself. God commands me to bang whores, and whores I shall bang!" Gutenberg's invention ultimately revealed a human characteristic we were previously unaware of: When presented with an overwhelming amount of information, we instinctively take a shortcut. We digest what appeals to us, disregard the rest, and proceed to high-five those who share our views. Compounding this, we are embarrassingly bad at separating truth from misinformation. Or, as Nate Silver puts it, differentiating signal from noise.

Modern Times

Quick refresher: 
  • 2 million years ago, we stood upright. First humans.
  • 40,000 years ago, we developed language, curiosity, and the ability to think about the future
  • 10,000 years ago, agriculture, pursuit of knowledge beyond "how do I stay alive?"
  • 5,000 years ago, written language, humanity begins to build collective knowledge
  • 577 years ago, printing press, mass production of information and misinformation
  • 20 years ago, the internet becomes a thing
  • 10 years ago, the internet is unavoidable, Homo Sapiens exist in perpetual state of distraction

Today, more information is created in a single day than one of us can consume in a lifetime. Every minute, the world receives:
  • 400 hours of new YouTube video
  • 350,000 Tweets
  • 3 million Facebook posts
  • 4 million emails
Our primitive brains may not be ready for this

Our brains, wired to detect patterns and draw conclusions, are dangerously overmatched against all this noise. Consider this: According to Sapiens author Yuval Noah Harari, terrorists killed a total of 7,697 people in 2010 across the globe. Don't get me wrong, that blows. But comparing that to the 3 million people killed by obesity-related illnesses the same year puts things into perspective. Here's what that looks like on a standard bar graph:

Why isn't everybody talking about this? While we squabble about ISIS and immigrants, sugar quietly kills us all.
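The bar graph says it all, but the arithmetic is easy to check for yourself. Here's a quick back-of-the-envelope sketch using the two figures Harari cites (the variable names are mine):

```python
# Quick sanity check of the scale gap between the two 2010 death tolls
# cited above: 7,697 terrorism deaths vs. roughly 3 million
# obesity-related deaths (figures per Harari's Sapiens).
terrorism_deaths = 7_697
obesity_deaths = 3_000_000

ratio = obesity_deaths / terrorism_deaths
print(f"Obesity-related illness killed roughly {round(ratio)}x as many "
      f"people as terrorism in 2010.")
```

That's roughly a 390-to-1 gap, which is why the terrorism bar barely registers on a standard scale.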

While it doesn't look like we'll be putting the genie back in the bottle on this one, we Sapiens are equipped with methods to deal with this information onslaught:

1. We get to decide what matters
Considering the bewildering lengths companies will go to in order to acquire our attention, this can be difficult. Exercise, meditation, and/or a simple stroll through nature can all help with this.

2. We get to be ruthless to what doesn't
Put another way, you don't have to have an opinion on issues you've deemed irrelevant.

For what does matter, let's get our Nate Silver on and identify the difference between hedgehogs and foxes:
  • Hedgehogs are type A personalities who believe in Big Ideas--in governing principles about the world that behave as though they were physical laws and undergird virtually every interaction in society. Think Karl Marx and class struggle or Sigmund Freud and the unconscious.
  • Foxes, on the contrary, are scrappy creatures who believe in a plethora of little ideas and in taking a multitude of approaches toward a problem. They tend to be more tolerant of nuance, uncertainty, complexity, and dissenting opinion.
Hedgehogs are hunters, always out for the big kill, whereas foxes are the gatherers. Let's be foxes.

Thursday, February 23, 2017

How To Be Invincible

3 Minute Read

You don’t have to turn this into something. It doesn’t have to upset you.
-Marcus Aurelius

90 days of meditation doesn't make me an expert. But I can say with confidence that I am experiencing undeniable benefits, and the above quote from Rich Homie Marcus sums it up nicely.

Meditation, often known as strength training for the mind, has been an impulse-control bench press session for your boy these past three months. 

Here’s what I mean:

All my fellow bros out there understand that muscles are strengthened by lifting, holding, and repetition. Meditation works the same way (minus protein shakes). By focusing on our breath and getting a feel for our senses, we’re able to find center and to exist in the present moment. Without fail, our minds will wander off and get distracted, and that’s the whole point (at least it is for me). We notice this, gently rein in the impulse, and return our attention to the breath and the present moment. This happens repeatedly over the course of a ten-minute sesh and--Whoomp there it is--you’ve racked up mental reps and you’re on the road to invincibility!

Symptoms of Invincibility

Anyone who knows me well is aware of the blind fury I’m capable of flying into when a car alarm interrupts an otherwise peaceful Sunday afternoon. Impulse takes over, my mood sours, and I venomously rant about the ineffective irritant, often continuing well after the noise has ceased.

Meditation practice has helped me to realize that the car alarm itself doesn’t directly make me insane with rage. I had been choosing that response. Nowadays, the car alarm serves as a reminder of an important lesson I’ve picked up recently: We don’t control the world around us, but we do control our response to that world and how we choose to interpret life's events. We don’t control the existence of outrageous political blogs, but we do decide whether or not we will acknowledge them. We don't decide if the iPhone screen cracks or if she doesn't text back, but we do decide if those things matter.

In The 48 Laws of Power, Robert Greene writes, “Contempt is the prerogative of the king. Where his eyes turn, what he decides to see is reality. What he ignores and turns his back on is dead.”

In other words, we choose to let things bother us. Just as easily, we can choose to not notice them and to consider them unworthy of our attention.

So do your worst, actual and metaphoric car alarms...

U can't touch me.