
The Participation Trophy Generation

You’ve heard it. I’ve heard it. “Millennials are entitled. I didn’t get a participation trophy when I was a kid. Young people are killing _____ industry.” Let’s pull this apart from a few different angles, shall we? There are economic, societal, and psychological forces at play that make this more complicated than a simple blame game. And if you’ve read my previous post about demanding more than “common sense,” you know I like to dig in past the billboard, headline, or 140-character perspective to tease out what’s going on underneath.

I am 28 years old, born in 1989 just two weeks after the Berlin Wall came down. This puts me squarely in the usual range given for GenY/Millennials of approximately 1980-1995. So of course this entire post could be tossed aside as some sort of “too sensitive” self-defense, but I hope you stick around a little longer than that to see what I have to say and perhaps explore some of these perspectives a little further for yourself!


For starters, a quick look at some economic data from the US Economic Policy Institute for the past few decades (money means numbers means chart-friendly trends and figures!) shows that the average CEO-to-employee wage ratio has skyrocketed since 1980, when many of our parents were entering the workforce. It used to be 20-30x (where economists recommend it should be) but by 2013 had climbed to the ridiculous range of 300-500x. Not only that, but most of the new income from innovation, development, and productivity increases has been going to the already-absurdly-wealthy in our country, with the earnings of the top 1% far outpacing the rest of US workers and income concentration worse now than at any point since before the Great Depression.


These are very clear data points, and they spell out an economic and work-culture trend that has been hitting most Americans age 20-40 very hard. There’s a 1stBank ad on I-25 coming up from Denver to Boulder that really pisses me off: it shows an individual aged 25-30 with the caption “I use my parents’ HBO account” and the bank’s tagline, Saving is in style. What a joke. Hell no, we aren’t trying to find a bunch of ways to save money because we’re frugal, selfish, or cool. We aren’t switching to locally-sourced foods or ethically-manufactured clothing because of “style”; it’s because we see things that we can’t afford or can’t sustain in the old models. We aren’t killing industries at whim, we’re pursuing new ways of behaving (not just new looks) that we find more ethical, more sustainable, or more economical. Take a look at the Netflix documentary A New Economy for several examples and a lengthier discussion of this cultural shift. And although it’s not just Millennials engaging in these pursuits, most of these realizations are decidedly not the concerns that were common at a time when working full-time on the factory floor could support a family of 4 (or when a minimum wage summer job could put you through college debt-free). We aren’t being “stylish,” we’re barely scraping by because—despite being better educated, better aware, and better equipped with tools that have phenomenally increased our productivity—Millennials in the workforce make roughly 20% less than Baby Boomers were making at the same age.

Now you might be thinking, “Well that’s entitlement right there… they got $n so you deserve $n, too, huh.” But when a young person in the 2010s looks at salary information for a job they’re considering, they aren’t asking “What do I deserve?” but rather “What should I expect?” Salary reports take millions of current jobs into account and show what the market is actually paying for equivalent job titles and backgrounds, and it follows the basics of supply and demand. A salary doesn’t come free; it’s not an “entitlement” or a handout… it’s what a solid, dependable, educated, and educatable worker is being paid by local employers. Not to mention someone who has proven themselves to deliver far more than their original duties, or someone who is likely far more skilled with computers, software, and automation than the managers and executives who are hiring them.


This “I deserve” misconception and similar notions feed the accusation of “entitled” that is ceaselessly leveled at me and my [age] peers. But when we look closely, it should probably seem entirely reasonable for a young person to have believed what was constantly and universally told to them by their parents, teachers, and advisers (basically anyone older, trusted, and respected). Namely, that “If you do X, Y will happen.” So millions of young people actually did X, but—almost just as universally—do not see Y, and if they do it’s years later than anticipated or budgeted. Meanwhile we watch many of our peers skip X altogether and instead lie, cheat, and steal their way into millionairehood (did you watch Zuckerberg’s cringe-worthy “hearing” before Congress?) as US regulatory agencies continue to ignore—and in many instances actually encourage—tax loopholes, employee abuse, and other unethical business practices.

Millennials inherited a lot of problems in the world, and for the most part are trying desperately to solve them. US population has jumped by 30% since 1980, with world population increasing more than twice as fast. Environmental abuses are changing climates and sending species into extinction at a mind-boggling rate. Economic inequality is the worst it’s been in a century. And on top of all these real-world problems, we’re more connected to the rest of the world and more aware of issues than any generation before us, leading to a higher rate of stress-related social, psychological, and physical disorders. This isn’t to say that this is an unusual situation: every generation says “What’s the world coming to?” as it gets older, and thinks the next generation is a bunch of softies. I’m simply illustrating how the rise of the internet in our teens and tweens led to an explosive dissemination of information that has utterly changed how young Americans develop as adults and see ourselves in the world.


I mean, if there’s any truth to the “Millennials are entitled” cliché, it’s that we’re really the shock generation… in between our cushy-computerless-desk-job Boomer parents and the learned-climate-modeling-software-in-high-school GenZ. Millennials weren’t trained to fix society, and we weren’t raised with constant connection via internet and smartphones. We’re in shock as we come into [what would have been] our own [were we born 30 years earlier] and realize the massive amount of additional work that needs to be done to feel useful and content. Boomers didn’t have to do that work (though they did have, and still have, their own struggles), and GenZ has the benefit of being raised in it—being aware of environmental, economic, and social issues almost from birth, essentially knowing that they’ll have to be ready for it in adulthood. There won’t be many little league games at all if they can’t even provide clean air for their children.

And speaking of, the “Well I didn’t get a participation trophy when I was a kid” retort that Boomers and GenX give? You’re right. Kids growing up in the 1950s got just as sad as kids growing up in the 1990s when they lost the big game. But they had parents raised in and hardened by the realities of global conflict, economic depression, and cultural instability. When the kids in the ’50s cried, their parents said “Jeez, man up,” or “Too bad, work harder next time.” And if they did, maybe they pulled out a W. An entire generation learned throughout their upbringing that they could achieve almost anything by keeping their heads down and working hard enough—often without the help of their distanced households (parents born in the 1920s-30s weren’t called the Silent Generation for nothing). So imagine that those kids grow up, get married, have their own kids, and those kids are, well, not naturally great at baseball. The parents have been programmed their whole lives to believe they can work harder to get better results, and they might naturally apply that attitude to their kids as well… cheer harder, shout louder, get meaner, and maybe eventually even insult the little league coach when all that hard work doesn’t actually make their kids better catchers, pitchers, or hitters.

Trust me, every kid who’s lost at something wished they got a reward just for trying. But I sure never got a “Nice try!” award. I don’t know what the hell people are talking about with “participation trophies,” and I played baseball, basketball, ran track, and competed in artistic and musical competitions from 1998 to 2008. Nonetheless, even if that stereotype were accurate, every kid in every generation has hated losing! There’s nothing special about Millennials in that regard. However, it takes quite a conceited “I worked hard for this” parent to think that their kid is so special that they deserve a trophy despite losing. To shout and yell at the umpires. To write angry letters (and emails, if you had dial-up yet) to the league administrators.

But as harsh as that may seem, I’m not trying to single out our parents for doing everything wrong. As we’ve already discussed, every generation has its own weaknesses, many born of the weaknesses of the previous generation. Priorities shift and concerns change with technology, art, and industry. I can only hope that the economic, cultural, and psychological shock and turbulence experienced by my generation as we witness the power and speed of social media, artificial intelligence, climate change, and mass market global production will show the next generation how very far there is still to go. And I wonder—with hope fringed by fear—what they might be inheriting from us.

A Confession of Conviction

This blog post originally appeared on Facebook on 6/10/2017. It has been edited for grammar and formatting.


Several people have asked me the same or a similar question over the past year or so: Why are you so hostile to religion these days? Weren’t you brought up Christian? What gives?

What follows is as short an answer as I can honestly give.


It is not that I don’t want to believe in God anymore. I can’t believe in God anymore, same as I find I am incapable of convincing myself that the sky is green, the earth is flat, or that gravity actually pulls things apart. I no longer consider theism a legitimate, credible conclusion from any reasonable or desirable perspective.

I was not “corrupted by the world” as friends or family have suggested. I did not have atheist friends who talked and tempted me out of faith. I was not led astray by a dogmatic, anti-Christianity professor. In fact, until the day I admitted to myself that the religion in which I was brought up no longer made (and in fact had never made) any real sense, I had not read a single piece of atheist or humanist literature, nor encountered legitimate arguments for a secular worldview against a religious one. Every “science vs faith” debate I had ever witnessed in my childhood had been from the point of view of faith: the non-religious participant was cast as a God-hating heretic, and I accepted this perspective without much question. I had no philosophy courses in high school or college and was in fact attending both Bible study and a local church on a weekly basis. I even kept going to church after I had graduated, but had to call it quits in order to be honest with myself and maintain my personal growth.

I spoke of “the day I admitted” my atheism, but in reality I have no recollection of a specific date beyond mid-2015. I just know eventually “atheism”—that is, the absence of belief in a deity—was the term I realized best described my philosophy. But if there was an event I had to point to that, at least in memory, had perhaps the biggest impact on my transition out of theism, it occurred at church in 2011. As I said I was a weekly attendee at two Bible studies and my church, and was also reading my Bible, Science & Faith (which argued in favor of specifically Christianity’s compatibility with the scientific process and data thereby obtained), and the masterful C.S. Lewis. I was in a very mindful state, having plowed through a half-dozen Lewis books (including The Problem of Pain, The Great Divorce, and Mere Christianity) in a matter of months, and we found ourselves in the college Sunday school class discussing hell and damnation.

By that time I had already started feeling decidedly uncomfortable with the American Christian concept of eternal damnation for billions of people simply because they were born and died in the wrong country, where society, parents, school, and every single friend already believed in the wrong god. I was trying to square this sense of deep injustice with my faith in Jesus, and I voiced my concern. I wouldn’t say anything so dramatic as “the room was stunned,” but I was clearly not among sympathetic listeners. At some point I brought up the end of Narnia’s The Last Battle, where Aslan seems to welcome to paradise those of all religions, suggesting that simply following through on the divine urge (by worshiping whatever god your society reveres) is worthy of salvation—Aslan greets the false prophet of Tash, “Beloved, unless thy desire had been for me thou wouldst not have sought so long and so truly. For all find what they truly seek.”

Lewis himself admits that many elements of his fantasy (and even religious) works are based more on imagination than theology, and I believe perhaps he let his inner compassion show too much here, failing to follow through and cement it only for fear—possibly subconscious—of how an open embrace of universalism would be received.

I believe that in that classroom I encountered the very reactions he may have been trying to avoid: chiding condescension, righteous indignation, self-assured correction—almost all accompanied by disappointed looks and raised eyebrows. I do not remember the full dialog and so will not try to recreate it, but rest assured that many verses were quoted at me with the goal of admonishing me and convincing me that only faith in the singular, literal Jesus Christ was enough to avoid eternal torture in hell and separation from God, including but not limited to “Whoever is not with me is against me…”.

I was certainly upset, and shaken that my seemingly friendly compatriots and holy brethren were so sure of their own state of salvation, of having followed all the right rules, that they were quite evidently disgusted at the very possibility of people who did not think exactly as they did getting to enjoy the very best possible existence. I had had a similar though muted reaction from a member of my own family when I had brought up the subject several months earlier, where I even referenced early church writers who had the gall to hope for a similar universal salvation sometime in the post-Revelation era… a look of horror, or perhaps shame, as though she were embarrassed and worried on my behalf.

I imagine this reaction is similar to the sense of outrage one might feel at the grocery store when perhaps only 2 clerks are ringing up customers and each line is 7 or 8 people long. Suppose I had waited in line for 20 minutes and finally got to the front when a 3rd lane was opened and a woman just coming out of an aisle gets instant service. I would probably laugh, or shake my head, or even moan with a sense of injustice at her good fortune considering my own inconvenience. After all, I followed the rules. I made the effort. I endured the hardship. Why should she get her needs addressed the same as me without paying the same cost?

But I would not demand that she get back in the seven-person line behind me, nor shout that she shouldn’t be able to check out at all. How much less should we begrudge the eternal joy of our fellow human beings? (Even Jesus himself admonishes this selfish urge in his parable of the workers.)


There was perhaps a 6-month phase 2 years later during which I considered myself somewhat of an “areligious Christian,” in that I loved and tried to follow the words of Christ, but disregarded the entirety of the Old Testament, the church’s interpretation of scripture, and all the writings of Paul—who, it seemed to me, had stolen the leadership role of the Early Church from the actual apostles (to whom Christ directly gave authority in this world) in order to establish his own religion that frequently disagrees with Christ’s teachings. I felt most pastors and congregations should call themselves Paulian instead of Christian since they seem to prefer the obsessive, judgmental, anti-women, anti-homosexual works of Paul over the literally-everyone-is-welcome acts and words of Jesus.

But once I snapped out of it and admitted that picking and choosing was an invalid way to approach an all-or-nothing doctrine, only then, with a sense of loss and instability, did I seek out the literature, philosophy, and company that might help make sense of my newly secular situation.

Perhaps I have spent too much time focusing on that day in Sunday School, but it’s the best I can do to come up with a singular event that seems to have played a significant role in my conversion, or deconversion as some unnecessarily put it. As I stated before there was not a guiding influence, character, or conversation wholly responsible for my transition; my problems began and continued while surrounding myself with faith-based people and practices.

In the years since I have come to see a host of incriminating aspects of not just a single interpretation of Christianity, but religion as a whole. Regarding the faith of my upbringing, it is now clear to me that my acceptance of it was entirely predicated on having been “trained up in the way he should go” (also known as “indoctrination”) and that any intellectually honest, curious, and critical individual coming from the outside cannot fail to see it as such: a cult, or cultural delusion and obsession. There is no evidence for a global flood—there are trees older than the supposed age of the earth. There is no proof of Hebrew slaves in captivity in Egypt—no remnant Hebrew artifacts, culture, or language, nor any record of them in the expansive Egyptian engravings (which actually suggest the wonders of Egypt were designed, engineered, and built by well-paid skilled laborers). There is no foundation for even the existence, much less the words and miracles, of the person of Jesus Christ, all records of his life and deeds having been generated decades after their supposed occurrence—extraordinary claims require extraordinary evidence, and the best evidence for Jesus is inferior even to evidence for similar 21st-century characters (for instance consider Indian guru Sathya Sai Baba, 1926-2011, who claimed virgin birth, reincarnation, and numerous Christ-like miracles that have literally thousands of documented, living supposed eyewitnesses, yet clearly qualifies as the subject of a “shared delusion” or perhaps illusion). In the vein of Occam’s Razor, it is wise to accept the most plausible solution as the best—is it more likely that the God-man Jesus existed, escaping all contemporary accounts including the thorough Roman records (nearly every aspect of his birth and childhood in the Gospels, like King Herod’s reign and the “every man to his home town” census, directly contradicting all available historical, anthropological, and archeological evidence); or that the character of Jesus was manufactured in the first century AD, a winding and plot-hole-ridden story crafted to check a prophetic list of incompatible messianic boxes?


If you are religious, it is quite possible that you have responses to one or more of these concerns. However, it seems to me that quite literally every argument in favor of religion, spirituality, and certainly the Abrahamic deity Jehovah and his Only Begotten can be dismantled through rational discourse—yes, including the claim that, being supernatural, He is above our ability to understand (it’s involved, hit me up if you’re curious). From the pain of innocents in the world, to the concept of Original Sin, to hostility toward homosexuality, to the fact that the religion a person is most likely to adhere to depends almost entirely on the date and location of their birth (consider: there have been over 4,000 recorded deities in human history)… there is nothing about it that supports itself or stands up under scrutiny. If religious society is to maintain any hope of honesty and consistency, it must abandon the attempt to compete (or claim compatibility) with rational secular pursuits of understanding.

I want to live a better life. It is becoming increasingly clear to me that Christianity has nothing to provide on these grounds considering its history of selling women as property, burning scientists as heretics, drowning witches, lynching black men and women, attempting to stone gays, and on and on. There is no context in which it has proven itself a steadfast foundation for an ethical society. In fact, religious fervor is directly correlated (with tremendous statistical significance) to a decrease in happiness, health, equality, and level of education, and an increase in racism, violence, corruption, and abuse. This applies regardless of whether you are looking within a given nation or comparing across the entire world. The sooner we can eject from the decrepit craft of religion, the sooner we can float steadily onward in our pursuit of understanding the human condition and conferring on the least of these our brothers and sisters an existence superior to that dealt to them by their genetic makeup, disease-ridden environment, or violent society.


Post Script

I have decided to issue a short addendum in response to a comment (basically that “Christians ruin Christianity”) that a friend made after reading my post. However, I don’t believe we need to look any further than Christ himself for that.

As I said in my original essay, for a while I basically tossed out everything that wasn’t the direct words and life of Jesus. And although an honest reading of his deeds undoubtedly paints a very different character from the one that American Conservative Christianity would have us believe in (he was basically your stereotypical liberal hippy who paid his taxes, decried war and injustice, and tried to help everyone), we tend to ignore his actions and teachings that don’t sit right with us.

First, he lied frequently, from claims like “some of you will not die before the Second Coming” to “anything you ask for in my name I will do.”

Second, he supported barbaric social standards instead of pushing the Truth. He would have already known what things were right and wrong, but although he hung out with lepers and whores and tax collectors, he tolerated slavery and even used it as an analogy for our relationship with God, casting God as the slave-owner in his parables. Plus he condoned racism and nationalism within Israel and supported the Old Testament commands to murder kids who talk back to their parents.

Third, he was a serial hypocrite, often giving completely contradictory sets of commands; for instance “let your good deeds shine” vs “do not be righteous in front of others.” He also gave numerous conflicting instructions on how to inherit eternal life, including simply loving God and others, partaking in communion, and even hating everything EXCEPT God, plus another dozen or so contradictory requirements.

Fourth, last for now and perhaps most abhorrent to me, he treated sickness and disease as the works of Satan (or worse, the works of God). This is almost too frequent to reference completely, but the biggest instances are the epileptic boy in Matthew Ch. 17 from whom Jesus supposedly cast out a demon (and the possessed men in Ch. 8, whose demons went into a herd of pigs and killed them) and the blind man in John Ch. 9 who was sightless “so that the works of God might be displayed in him.” Why not teach people even a little about germs, infections, viruses, or genetics instead of pretending it was all supernatural in order to manipulate people? He could easily have avoided the next couple thousand years of medical barbarism, witch burning, and faith healing scam artists with a dozen verses about health and hygiene. He even had the gall to claim that “nothing that enters a person from the outside can defile them… in saying this, Jesus declared all foods clean.”

The fact that we tend to overlook or dismiss these things just confirms our superior human capacity for morality and decency compared to the lessons given us by “God” two millennia ago. The 10 Commandments, the O.T. laws, the Pauline letters, and even Christ himself cannot be considered the moral foundation of any healthy society, and again raise the question: Is it more likely that the character Jesus Christ as portrayed in the Scriptures was the actual, literal, perfect Son of God? Or that his story was manufactured decades later by normal, errant, first-century humans whose knowledge of reality, illness, and ethics can only be described as barbaric when compared to our current state of understanding?

Are there beautiful, compassionate, and inspiring messages to be taken from the Gospels? Of course. But treating it as an all-or-nothing Divine Truth cannot in my opinion be considered honest, rational, or moral.

Desirable | Useful | True

More and more, I’ve been thinking that pieces of information can be scored according to these three categories… a claim can be placed somewhere on the scale that runs between desirable and undesirable, between true and untrue, and between useful and useless.

And these three aspects must not be confused.

For instance, telling a child that a monster will come and take them away if they don’t clean their room may rank as highly useful, but it is both untrue and undesirable (we don’t actually wish that monsters existed and had free rein to abduct messy kids).

Similarly, telling the child “I before E except after C” could be useful for teaching rudimentary spelling and simple words, and although this would be a convenient rule to follow, it is not actually true once they advance to an elementary-school-level vocabulary (just 44 words in the English language obey the rule while at least 923 break it).

As another example, telling the child that a deity watches over them every day and loves them may be a desirable situation (don’t we all want to be loved and cared for?), but that does not have any bearing on the usefulness or truth of the statement. It may actually be much more useful to tell the child that when they are mean to someone they need to make it right with the person, not with a god. It may be more useful for the child to value their relationships with family and classmates over their relationship with an imaginary friend. It may plant the seed of a more productive member of society for the child to believe not that abusive people will face consequences after death, but that preventing them from preying on the weak is something we should strive for in this life.

Our president and his administration have chosen to define “fact” as those things that are desirable or useful to them, and unceasingly ignore or attack information that is true and accurate. This is the hallmark of liars, narcissists, abusers, and dictators.

Don’t get me wrong: desirable claims may sometimes have their own sort of usefulness, regardless of reality or practicality. One might call these “placebo” claims, or “fake it til you make it” claims, which carry out some desirable function simply because they are themselves desirable. But if we prefer reality to delusion, this is the lowest form of value we should ascribe to any statement—if we create a value scale for examining statements, desirability is worth, say, one point.

Useful statements can also be powerful: if my car is low on wiper fluid, has a broken tail light, needs some air in the tires, and the engine was stolen, when the tow guy comes up and asks me “What’s wrong with it?” the correct answer—although true—is not “A lot of things.” The information that will get the most done in this situation is that THERE’S NO ENGINE. Say we give usefulness a value of three points.

However, if we appreciate reality and honesty over fantasy and deceit, we must give the most value—five points on our arbitrary valuation scale—to those claims which are true: statements that agree from every angle of detailed inspection with the physical, historical, calculable reality of our world. This is not snap judgments, it is not first glances, it is not laymen’s terms, and it is certainly not common sense (see my essay on the dangers of “common sense”). It is persistent, continuous, meticulous inspection from individuals dedicated to discovering the facts, regardless of usefulness or desirability.
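For the code-minded, here is a minimal, purely illustrative sketch (in Python) of how that arbitrary 1/3/5 scale might be tallied; the function name and example scores are my own invention, not a formal system.

```python
# Purely illustrative sketch of the 1 / 3 / 5 valuation scale described above.
# The weights are the post's toy example, not a formal scoring system.

def score_claim(desirable: bool, useful: bool, true: bool) -> int:
    """Return 1 point for desirability, 3 for usefulness, 5 for truth."""
    return (1 if desirable else 0) + (3 if useful else 0) + (5 if true else 0)

# The room-cleaning monster: useful, but neither true nor desirable.
print(score_claim(desirable=False, useful=True, true=False))  # 3

# "I before E except after C": pleasant and handy at first, but not true.
print(score_claim(desirable=True, useful=True, true=False))   # 4

# A hard, unwelcome fact: true, even if neither pleasant nor immediately handy.
print(score_claim(desirable=False, useful=False, true=True))  # 5
```

Notice that, just as argued below, a claim that is merely desirable and useful (4 points) still falls short of one that is simply true (5).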

Truth is hard. It goes against instincts and wishes, it goes against appearances and assumptions, it rarely fits a billboard, tweet, or news chyron. Sometimes you can express the idea in a form that is useful but also mostly true (they say education is the process of telling smaller and smaller lies), but the greatest information of all is that which is true AND useful AND desirable. This is a rare discovery indeed.

From that perspective it can be seen that desirability is really just icing on the cake and must be ready for sacrifice the moment self-satisfaction gets in the way of real accomplishment or illumination. Usefulness is much harder to sideline, and sometimes even usefulness plus desirability can come pretty close to trumping truth. But in my experience these are almost exclusively in sensitive social situations because we deal in words and emotions and expression and relationships and culture and art and history… I think the typical mundane example is, “Does this dress make me look fat?” but it is not hard to find more substantive instances where a straightforward answer may do more harm than good (consider: “A meteor the size of Africa came out of nowhere and will impact the earth in an hour wiping out all life instantly without warning” is a true statement—would it be more merciful to let everyone end their lives in peace, or expose them to hysteria, fear, sadness, etc. since there is nothing to be done about it?).

But truth is what put footprints on the moon. Truth is what eradicated polio from the United States. Observation, calculation, doubt, verification, exploration, evidence. The hard work of discovering the mechanisms of our physical, social, political, cosmological, quark-riddled universe uncovers the truth of reality from which all practical information is distilled. Truth is behind every major social improvement, medical breakthrough, and scientific discovery throughout human history, and must be the grounds upon which all legislative efforts are based. Applying the label “fake news” to dogged inquiry and tireless investigation while sharing on social media the latest David Avocado Wolfe nonsense is exactly the same as lying. It is adding your feet to the masses already trampling out the fires of progress and opportunity.

If we each closely examine the things that we hold to be true, we will doubtless uncover many ideas that we cling to because we wish them to be true, or because we feel they will get something done. The hard task is to admit when these do not also match the reality of our existence.

And so these three remain: desire, use, and truth. But the greatest of these is truth.

Scientifically Speaking, “Just A Theory” Is High Praise Indeed.

Today on social media I witnessed yet another confusion regarding the differences between the terms “law” vs “theory” vs “hypothesis.” These are thoroughly well-defined concepts in the world of scientific thought and research, but they don’t exactly match our day-to-day interpretations of the words. So let’s take a quick look at them and break down the relationships between the three as they are used in professional publications and academic endeavors.

First, when you make a guess or a suggestion about how you think something might work, you have generated a HYPOTHESIS. Something along the lines of: “Considering an apple falls to the ground when I let go of it, I bet everything in the universe gets pulled towards each other in a similar manner.” A good hypothesis can be thought of as a potential explanation, sometimes vague and undefined, for the reality that we observe. It can be examined, deconstructed, analyzed, and tested to be proven right or wrong—upheld or discredited. If this examination uncovers significant evidence from multiple sources, methods, or institutions confirming it, it can spawn laws…

Something is a LAW only if it is a specific statement that is constantly seen to be true and can be expressed using mathematics. Something like: “F = G(m1 × m2)/r^2.” That little discovery is known as the Law of Gravity, and has over and over (almost without exception) been observed to match how two objects in space are pulled together based on their mass and distance. The hypothesis of the apple falling to the ground was thoroughly examined and applied to other scenarios in astronomy and physics, and this law is a direct result of that analysis. Combining the expanded hypothesis with this law forms the backbone of a very important theory…
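As a quick illustration (my own toy example, not part of the law’s original statement), those symbols translate directly into a few lines of Python; the Earth and Moon figures below are standard textbook values.

```python
# Illustrative sketch of Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
# The Earth/Moon values are standard textbook figures, used here only as an example.

G = 6.674e-11  # gravitational constant, N·m²/kg²

def gravitational_force(m1: float, m2: float, r: float) -> float:
    """Force in newtons between masses m1 and m2 (kg) separated by distance r (m)."""
    return G * m1 * m2 / r**2

earth_mass = 5.972e24  # kg
moon_mass = 7.348e22   # kg
distance = 3.844e8     # m (average Earth-Moon distance)

print(f"{gravitational_force(earth_mass, moon_mass, distance):.2e} N")  # ~1.98e20 N
```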

The highest form of scientific understanding is when a concept becomes a THEORY. A theory is a defining system of explanations that rose out of a strong hypothesis, supported by laws and consistent empirical data. In very simple terms, Newton’s Theory of Gravitation can be expressed: “We see apples fall, planets orbit, and gases collect around suns. There is a universal force that pulls all physical objects towards each other. The strength of this force changes depending on the relationship between the objects.” A theory can be disproved, improved, or even replaced, but it is the Final Evolution of an idea. It does not ever somehow become a law or fact (scientists don’t typically like to use the term “fact,” but we’ll talk about that later).

A law is not stronger or better than a theory; it is a compact and calculated statement, often an ingredient of a theory. Evolution by natural selection will never become a law. Evolution itself is already expressed in multiple laws (see Mendel’s Law of Segregation and the Law of Independent Assortment; generally these laws express how the genetic composition of offspring differs from that of their parents) and the Theory of Natural Selection explains how this happens again and again to change a population. This theory has been tested, attacked, and scanned thoroughly for problems across many decades, but it is currently our best model for describing the origin of species. It does not explain every single detail—which makes sense given the incredible scale of time required for it to operate—but every competing theory has been shown to be worse at explaining what we see in the physical world.

The word “fact” is usually only used by a scientist when expressing these things to laymen, since the general public often has trouble with these three terms. But no respected scientist will claim to another scientist that they have arrived at “fact.” They do their best to discover reality by coming up with hypotheses, calculating laws, and developing theories—all the while replacing erroneous information or filling in the gaps when better ideas are put forth (Newton’s 1687 Theory of Gravitation was for the most part supplanted by Einstein’s 1915 Theory of General Relativity, which more thoroughly accounts for our observations of the universe).

So remember: When someone says, “I have a theory about that,” the scientist in you should hear, “I have a hypothesis about that.” And when someone says, “Evolution is just a theory,” the scientist in you should hear, “Evolution is just a robust, logical, scientific model that is currently the most accurate way to explain our observable reality.”

Let the scientist in you speak out!

Common Sense is all too common

Many people seem to believe that “common sense” will solve all the world’s problems if people would just use it more. They post about it, talk to their friends about it, they shake their heads and politic. But left to common sense, humanity would not have achieved many of the wonderful advances we have made over the centuries.

Look at birds. They are small, light, and feather-winged. Common sense tells us that humans cannot fly.

Look at the moon. A distant object hundreds of thousands of miles above us. Common sense tells us that we cannot walk on it.

Look at any celestial body, in fact—sun, stars, retrograding planets. What possible clues to the physics and histories of such far flung entities does common sense provide?


The answer: Not much. Common sense is not a great tool for making progress; it is a good tool for surviving. That’s where it comes from after all. Our mammoth-hunting ancestors who ignored the rustling bushes or assumed it was the wind were eaten when the giant sabertooth emerged; it became common sense to assume an individual—especially a dangerous creature—was behind the noise. A malicious mind behind the unknown.

It was a great tool for a long time. But common sense does not tell us anything about airfoils, combustion, or cosmology. No, these types of knowledge require something more. They require curiosity. They require someone brave enough to point his spear at the tall grass and advance, ready for anything. At times, it was a killer feline on its evening prowl. At others, it hinted at a mechanism of reality in the pattern of wind rushing through the valley as the sun set and the air cooled on the eastern slopes.

These realizations also require data. They require in-depth investigations and drawings of the curvature of avian wings, repeated observation of materials that are susceptible to heat and flame with explosive results, numerous calculations and models regarding the behavior of heavenly bodies. It takes teamwork and mulish determination to understand what’s taking place behind the scenes. It takes a rejection of appearances to instead ask, “Why? How can that be?” When common sense gives a simple answer, inquisitiveness tries to find what is really going on, and it’s very rarely simple.

While calling something “common” is usually to say that it is simple, found frequently, or shared widely, an alternate definition as worded by Merriam-Webster is one I feel better befits the word as used in this phrase: “falling below ordinary standards; lacking refinement.” In this sense, common sense is all too common… it provides a basic conclusion to a situation without really helping us understand the breadth and depth of what is going on. It is often contrary to reality, and ignoring the facts found through a thorough investigation in favor of “common sense” does far more harm than good.


Applying common sense to illegal immigration from Mexico, for instance, would reasonably bring one to the conclusion that a border wall and increased militarization of the patrol forces would decrease the number of undocumented entrants. After all, that’s an added deterrent, a tougher obstacle. But this is an issue for which there are many years of verifiable data—and it tells a different story. According to Douglas Massey, professor of sociology at Princeton who has been tracking and analyzing this dynamic for decades, the periods of strictest control along the Mexican-US border have been years of the greatest net illegal immigration (inflow greater than outflow). In more relaxed times Mexican men would head north to take advantage of seasonal work that many Americans didn’t like or saw as beneath them (such as picking fruit), then take the earnings home and live fairly well the rest of the year. It was a benefit for both countries as American farmers had laborers, American stores had full shelves, American citizens paid low prices, and Mexican citizens improved their own lives and those of their families.

In periods of stricter border control, however, since it took more effort to get north, Mexicans would be reluctant to head back home when the picking season was over. They had gone to all that work and spent all that money to get here; they didn’t want to have to do it again next year. They’d stay and become a burden on the American infrastructure—once picking season ended they had to find jobs that students and uneducated Americans often took temporarily. They were also unable to pay into civil welfare programs like Social Security that would have benefited both legal immigrants and US-born citizens.

These are easily verifiable data points that can be tracked and trended over the years. But they are certainly not common sense. They are facts, found through exploring reality, asking why, and accepting the findings even when they didn’t seem to follow the narrative being told.


In these matters and others, common sense is not the intricate tool required to discover the truth. It is a blunt instrument, capable only of applying clumsy force to rudimentary concepts.

Unless we’re ready to live in a world without space flight, without GPS or astronomy, and without strong economies, we need to be willing to use uncommon sense.