What If We Don't

I hesitate to publish something that interrupts the flow of our ongoing serialized novel, but this has been a pervasive thought for some time now and I don’t think Facebook or any other social media is the best place for it. Please allow me this one indulgence as I momentarily direct our attention to more urgent matters.


Anxiety has risen around when we’re going to break free of the COVID-19-related shutdowns and “get back to normal.” While the US president is pushing for an unrealistic May 1 date for businesses to reopen, other experts are projecting much longer time periods. One bioethicist predicts it could be autumn of 2021 before large crowd gatherings such as concerts and sporting events can resume. The underlying question on everyone’s mind is, “When are we going to get back to normal?”

But what if we don’t?

What happens if “normal” as we knew it on January 1 of this year never returns? What could that look like? Could we create a better society for everyone if we don’t allow normal to come back? I don’t think anyone would say that our world was perfect before the pandemic struck. There’s absolutely nothing that says we have to go back to the way things were. This is our opportunity to build something new, something better.

What if we don’t return to a society where people are segregated socially, financially, educationally, and in opportunity and perception by race, religion, gender, sexuality, or any other arbitrary denominator based on traditions of hate, jealousy, and outright stupidity?

What if we don’t return to an education system that is demonstrably better for those in some neighborhoods, cities, and towns than it is in others, leaving many undereducated and lacking the skills they need to survive and/or hopelessly in debt for the majority of their adult lives?

What if we don’t return to a financial system that preys on the poorest of the poor, denying credit to those who need it most, charging fees to those who can least afford them, and rewarding those who hoard the most wealth with opportunities and resources the majority can never achieve?

What if we don’t return to a workforce that undervalues people we now see as critical to everyone’s survival: grocery store workers, food service employees, delivery drivers, postal service workers, first responders, pharmacy technicians and assistants, warehouse workers, and others?

What if we don’t return to a healthcare system that can deny care to anyone because they don’t meet a list of arbitrary and unnecessary qualifications such as insurance, or pre-existing conditions, or ability to pay, or where they live, or their chances of surviving, or their age, or the gender by which they identify?

What if we don’t return to a political system that denies anyone over 18 the right to vote because they don’t live in the right place, don’t have the right ID in their wallet, can’t physically get to the poll, were once in jail, didn’t meet a deadline for registering, or haven’t jumped through all the restrictive hoops?

What if we don’t return to churches, synagogues, and mosques that teach divisiveness, elitism, racial separation, retaliation, warmongering, theocracy, bigotry, sexism, xenophobia, disregard for science and medicine, authoritarianism, and complete disregard for the entire LGBTQ+ spectrum of people?

What if we don’t return to a disregard for climate and other evidence-based sciences, underfunded medical research, the obliteration of our natural resources, complete destruction of entire ecosystems, willful ignorance of climate change, underfunded science education, and pay-for-play publication systems?

What if we don’t return to an entertainment industry that makes its fortunes by exploiting the worst qualities of humanity, finding humor in our ignorance, celebrating irrational stereotypes, greed, corruption, nepotism, class warfare, racial disparity, injustice, and blatant misrepresentation of history and people groups?

What if we don’t return to a music industry that steals songs from songwriters, exploits performers, promotes live-or-die competitions, makes live music inaccessible for the masses, creates profit for labels over musicians, minimizes the role of women, and replaces talent with gimmicks?

What if we don’t return to an art industry that relies too heavily upon a system of corrupt curators and collectors hoarding art and controlling access to galleries and museums, diminishes the role of indigenous arts and gives unwarranted preference to eurocentric elitists, denigrates illustration and graphic design to lesser class status, and blocks access to financial stability for artists?

What if we don’t return to a world where more than 700 million people are food insecure, where 78% of workers live paycheck-to-paycheck—struggling to provide basic necessities, where as much as half of the world’s population does not make a living wage despite endless hours of work, and where workers’ rights are continually diminished?

What if we don’t return to a world where taxes are imposed on those with the least to give while billionaires escape with no taxes at all, where the efficacy of representation depends on the size of one’s political donation, and the voice of corporations dominates over the voice of individuals?

What if we don’t return to a world where any form of sex is illicit, where nudity is prohibited, where personal forms of pleasure are shamed, where professional sex workers have no legal protection, where protection against sexually-transmitted infections is arbitrary and optional, and where individual choice is superseded by antiquated laws based on unjust morality?

What if we simply refuse to return to the dysfunction that previously defined normal? What if we refuse to participate in something that is broken, inept, and unsustainable? What if we say no? What if we consider the possibilities of our own actions, collectively and individually, to change the world and create a new normal?

What if we take this opportunity to disrupt the political systems of the world, to demand more open and honest elections for everyone, to destroy the very concept of party restrictions and the misrepresentation inherent to their existence, to recognize the interdependence and cooperative necessity of every individual on this planet?

This is our opportunity to take control. We don’t have to accept the ineptness of our politicians. We can say no. We can demand resignations where resignations need to happen. We can refuse to support an economy built on corporate greed. We can demand more. 

We can create a new normal—something better, something lasting, something sustainable. All the cards are on the table. What do you choose to do?

The Old Man in the rain


Exploring Creativity

Reminder for those just joining us: We don’t underline links. Anything in bold italics is probably something you can click for more information. Usually.

My version of Adobe® Creative Cloud updated last week. Creative Cloud is the bundle of applications photographers and designers and directors and videographers and artists and everyone else use for everything from video editing to product design to the photographs you see here. Central to my interests, this means Photoshop updated. To say that Photoshop is a behemoth of an application is an understatement. One could take classes for years and still not be proficient in everything Photoshop does. Very few pieces of software dominate an industry to the extent Photoshop does the whole of creative arts.

Of course, when Photoshop updates, the emphasis is typically on all the new features that have been added because, for all the program can do, we want it to do more and we want it to do everything faster. The problem is that in order to achieve that goal, developers are at a point now where they have to leave some older functionality out. This aspect doesn’t get as much attention, and unless one wants to go through all the fine print of the release notes, one isn’t likely to discover what has been omitted until needing to use something that is no longer there.

This time around, Photoshop seems to have dropped support for the older (free) version of a set of plugins I have used extensively [late note: a colleague says it’s still supported, but with extra work. I haven’t had time to explore that possibility yet]. From a development perspective, the omission is reasonable. The plugins are several years old and a newer standalone version is available that doesn’t leech off Photoshop’s resources. The problem from a practical perspective is that the new version is no longer free. It costs $150, which is more than I had planned to spend on software upgrades this month. Or any month.

Ah, but the beauty is that the plugins didn’t do anything that wasn’t already available in the main application; the attraction is that they do it much more efficiently than one could on their own. You’ll find the images that fueled this entire line of thought by clicking here.

All this turmoil has me thinking about what it means to be creative, how the reality is far more complicated than the end result would make it out to be, how being creative requires flirting with insanity, and the degree to which no one cares about the process, just the end result. Come take a walk with me through my world for a bit. This can get scary. Bring your own alcohol.

What Does It Mean To Be Creative?

What does it mean to be creative

We are constantly asking ourselves whether something is or is not art. That argument has gone to its furthest extreme of “if someone says it’s art, it is,” and puts any conversation about quality or talent on the defensive. I’m not sure we’re doing society or artists any favors by being too accepting.

What we’re less likely to discuss is what it means to be creative. Being creative doesn’t just apply to what we might traditionally consider art. Creativity is involved in all manner of science and engineering as well. Where a new discovery comes as the result of a person trying something different or approaching a question from a unique direction, creativity was involved. That means that being creative does not make one artistic. Perhaps, just maybe, the inverse is true as well. Is being artistic always creative? Does writing an essay or taking a picture or finding a new algorithm for calculating the density of peanut butter mean that one is gifted or have we simply learned how to manipulate the elements from which new things are composed or composited?

In his article Being Special Isn’t So Special, Mark Manson makes the argument that if you’re not setting the world on fire with awe-inspiring art or world-changing inventions, you shouldn’t be too hard on yourself. After examining the progression and complications of contemporary Western society, Manson comes to the following conclusion:

As they say, wherever you go, there you are. Being special isn’t so special. You will still feel frustrated. You will still feel lonely. You will still feel like you could have done more.

Don’t sell yourself out for the sake of attention and false glory. Not that attention and glory are wrong, but they should not be prime motivators that drive your life.


Instead, focus on simplicity. On nuance. Slow down. Breathe. Smile. You don’t need to prove anything to anybody. Including yourself. Think about that for a minute and let it sink in:

You don’t have to prove anything to anybody, including yourself.

I’ll admit, there are days, weeks, months, even years where that “it’s okay, you don’t have to be Da Vinci” attitude has gotten me through some low points. However, as I get older, that attitude, especially over prolonged periods, risks being too defeatist to entertain. Okay, so not every picture I take has to be wonderful. Shouldn’t I at least try to make every photograph eye-popping? Trying and not succeeding is one thing. Not trying at all, however, is quite another. I’m hard-pressed to consider as creative the person for whom hum-drum and ordinary is the goal. 

There is an ad campaign that uses the tag, “for when being ‘okay’ isn’t okay.”  “Okay” meets only the most basic goals; it ticks the fewest boxes possible to be considered complete. “Okay” is life’s C-; sure it’s passing, but it’s a meaningless high school diploma that hangs alone on a wall where nothing else of note was ever accomplished.

I think part of what has to be separated is the act of creativity from the act of performance or presentation. For example, as I’m writing this paragraph (painfully struggling over everything except participles) I’m listening to a portion of Alfred Schnittke’s Faust Cantata (It Came To Pass, if you’re really that interested). Where is the greater creativity: in the act of composing by Schnittke or the interpretation by Maestro James DePriest performed by the Malmö Symphony Chorus? There’s no question that there’s immense talent on the part of everyone involved, but where, exactly, is the greater creativity demonstrated? Are the soloists as creative as the conductor? Is the maestro any less creative than the composer? Can degrees of creativity even be adequately measured?

Into this stream of steaming consciousness comes a new study that suggests there are two types of creativity. Experimental creatives build off their experience, bringing years of trial and error to bear before delivering a seminal, perhaps final work that defines the whole of their career. Conceptual thinkers work from abstract principles, chasing raw thought and following it through to its creative outcome. What’s interesting about this study is that it is generally age-definitive. Conceptual creatives tend to be younger, primarily people in their 20s who don’t have the life experience that might hold them back from chasing new ideas. Experimental thinkers are more likely to be over 50, have experienced some disappointments in their careers, maybe even changed careers multiple times, before reaching an intricately formed and detailed result.

There’s something to be said for both approaches and it is entirely possible for a person to fall into both categories at different points in their lives. I look at musicians, especially. Lady Gaga raised a bit of a ruckus with her “little monsters” when she tweeted that she doesn’t remember her album ARTPOP. Looking at the quality of the music on that album, comparing it to what came before and what was created after, it’s reasonable that the album falls between the conceptual success of “Born This Way” and the more introspective and perhaps experimental sounds of “Joanne,” but the artist is still quite young and may yet develop a different sound as her voice matures.

Comparatively, not everyone who is successful at an early age tops that first big explosion. Consider T. S. Eliot and Pablo Picasso, whose best works (arguably, of course) came when they were young. By contrast, Virginia Woolf and Charles Darwin had a whole lifetime of experience behind the works for which they are best known. One of my favorite examples is Matisse, whose early works are exceptional on their own but have absolutely no relation to the work from his later life that demands to be a topic in every art history course ever taught.

That doesn’t define, though, what it means to be creative, so let’s toss something even more convoluted into the mix. Adobe, the massive software company whose products directly target creatives, teamed up with the creative agency Anyways and writer/researcher Carolyn Gregoire to create the eight distinctive creative personalities: ‘The Artist’; ‘The Thinker’; ‘The Adventurer’; ‘The Maker’; ‘The Producer’; ‘The Dreamer’; ‘The Innovator’; ‘The Visionary’. The test is based on the Myers-Briggs personality exam, which almost everyone on the planet has taken. Using their relatively short testing process, I’m apparently the Dreamer, which lists its strengths as being connected to emotions and imagination, empathy and sensitivity. If you want to take the test for yourself, you can do so here. However, at the end of the exercise, I don’t see the test as definitive of creativity any more than I find the Myers-Briggs anything more than a personality snapshot, a single point on an extended timeline. One can fit any of the artistic personality types and still be perfectly satisfied with their life sitting on a couch doing nothing. Personality is a filter that colors our actions, not necessarily a motivator that leads one to act.

Perhaps the end result is that what it means to be creative is as undefinable as attempting to determine what is or is not art. If that is the case, how do we begin quantifying our creative lives? If there is no “this is, that isn’t” determination, then on what basis do we justify people investing in, paying attention to, or distantly regarding our work? Volume? Quality? External perception by peers or “critics”? If some people like the work of Sibelius or Gustav Klimt, why are they enthused by those works while others consider both trash?

As hard as I look at the topic, I keep finding more questions than I do answers.

What Is The Source Of Creativity, Anyway?

Creative Sources

Ask a thousand people a question, get a thousand answers to fuel a thousand frustrations. I’m half-tempted to ask why we need to ask this question in the first place. Does it really matter what the source of creativity is as long as there is creativity? Creativity isn’t a shared resource where one has to worry about their idea being polluted by someone or something further upstream. Or is it? And there’s the answer to the question of why we need to ask the question. Understanding the source of creativity does not make the ideas come any faster or make them any better, but it helps us understand the shared space that creatives occupy, that portion of the universe that plants seeds in our brains and waits for them to grow.

Right from the start, however, one runs into a problem determining the source of creativity in that there is no consensus. There are those who look at creativity as an abstract that “lies deep within the soul of man” (really, someone wrote that). Then, there are those who look at creativity as a matter of brain function or, at least, keep making that attempt. Each of those approaches carries with it a lot of evidence based on the observation of what happens when someone is in the act of being creative. What was someone doing/thinking/eating/experiencing when engaged in a creative activity? Based on one’s perspective, the answers can be rather diverse and, at times, even contradictory, leading one to the conclusion that, no, we really don’t understand the source of creativity.

First, let’s get out of the way the concept that creativity is linked to intelligence. Yeah, sure, you may have read that somewhere, and it may be that most of the creatives you know are also intelligent people. However, one does not necessarily imply the other. Dr. Rex Jung, assistant professor of neurosurgery at the University of New Mexico, said in an APA interview,

“ … some people have found correlations between creativity and intelligence. They’re usually pretty low, this association. And some people make a lot of that, this low association. But usually, because this association between creativity and intelligence is low, it means that you don’t necessarily have to be intelligent to be creative (source).”

Okay, that’s not the hard break some might have liked. Anecdotally, it often appears that intelligence and creativity are linked, especially if we are looking at scientific forms of creativity, where a lack of knowledge of a specific area of study precludes being creative in that field at all. Someone like me, who despite all my efforts still does not understand algebra, is not likely to have a seminal moment where I solve some math problem that five minutes ago I didn’t realize existed. However, there remain plenty of areas where pre-existing expertise is not requisite to the creative process and, at times, an overabundance of knowledge in certain areas, or even access to excessive information in an area, can stand in the way of creativity.

Point of fact: following the rabbit trails of research on a topic can cause me to spend a lot of time reading rather than actually writing the article. However, in that case, the intelligence getting in the way is not mine, is it? One can hardly blame the author of an article if they’ve done well enough that I find the words compelling. 

One of those rabbit trails, however, led me to a 1965 article in a now-defunct scholarly magazine called Social Science. In the article (source, registration required), Alfred W. Munk, who was at the time Professor of Philosophy and Chairman of the Department at Albion College, posits that there are three primary sources of creativity. He alleges that,

“Nature, by virtue of its vastness, its order, its beauty, and its challenges to man, constitutes a source of creativity. Man himself, however, in terms of his higher capacities, represents a higher source of creativity. Yet, if man is to develop and to become creative, he needs the kind of society which is most conducive to the development of his potentialities.”

American poet Walt Whitman would have underscored the influence of Nature. A decade after the Civil War had ended, Whitman mused in his diaries, later published as the collection Specimen Days, on the importance of communicating with trees.

One lesson from affiliating a tree — perhaps the greatest moral lesson anyhow from earth, rocks, animals, is that same lesson of inherency, of what is, without the least regard to what the looker on (the critic) supposes or says, or whether he likes or dislikes. What worse — what more general malady pervades each and all of us, our literature, education, attitude toward each other, (even toward ourselves,) than a morbid trouble about seems, (generally temporarily seems too,) and no trouble at all, or hardly any, about the sane, slow-growing, perennial, real parts of character, books, friendship, marriage — humanity’s invisible foundations and hold-together? (As the all-basis, the nerve, the great-sympathetic, the plenum within humanity, giving stamp to everything, is necessarily invisible.)

No, it’s not an easy read more than a century and a half out from its creation, but Whitman was channeling a communion with nature that was itself introduced by English author Ralph Austen all the way back in 1653 (source). In fact, the period between the late 19th century and early 20th, prior to World War I, saw a global movement in naturism and contemplating gardens and trees and lying about naked among them. This is the atmosphere that raised great photographers such as Horst P. Horst. 

Neither is the concept new that humanity itself, one’s own existence and experience, breeds creativity within oneself. The entire rationale of Mindfulness and its related practices, such as many forms of yoga, underscores and supports the concept that the answers and creativity lie within the self and flow forth most freely as one becomes “in tune” with the self. This is part of ancient traditions going back at least as far as the 15th century.

Where Munk may be unique, and tragically unheard, however, is in the premise that society has an obligation and need to foster creativity. He repeats the philosophical question of whether Newton would have been equally as creative in the Stone Age, in a society where he might have been seen as a magician rather than a man of science. After fussing around with the history of philosophical ponderings, Munk makes a final charge.

“Although it is impossible to predict clearly and precisely the basic characteristics of the kind of society most conducive to the production of geniuses, at least three things are possible. First, from a negative standpoint, it is clear that not less than four types tend to stifle creativity: primitive societies; modern totalitarian states; stagnant, traditionalistic and archaic cultures; and any society that is unstable to the point of chaos. The second is simply the fact that any society that aims at maximum creativity must find its way between totalitarianism and authoritarianism, on the one hand, and instability and chaos on the other. The third is the fact that the creative society must be engaged in creative interaction with other societies. There is no instance of any great nation or civilization in isolation.”

Remember, Munk is speaking from the perspective of a society that was still attempting, and had not yet succeeded at, landing a human on the moon when he writes, “… it is well to point to the hope that, while we are on the brink of chaos and disaster, we may also be on the verge of the greatest period of creativity that mankind has ever known.” Given all that has happened over the past 50-plus years, Munk seems to have nailed that prediction.

As I read and ponder all these things I’m still not satisfied that we, collectively, especially from a societal perspective, understand creativity in its purest form or even recognize it when it occurs either within ourselves or, most especially, within others. I worry that far too much of the creative element is only recognized in hindsight, which leads me to the next section of the discussion.

How Are We Defining Creativity?

Go ahead, define creativity

Over the course of this week, when not chasing down the infinite distractions of this topic, or preparing meals for children who are perpetually hungry, or trying to make a dent in the ever-growing mountain of laundry [seriously, how do we have so many clothes?], or troubleshooting an uncooperative computer program, I’ve been processing a set of erotic images with the intention of submitting at least one of them for inclusion in next year’s art shows. The work has been at times tedious and enjoyable and, on some emotional level, both exhausting and exhilarating, as the production of these ten images has dominated my focus for the week.

What bothers me about investing so much creative capital into a set of pictures is the constant concern that, short of me standing right next to the observer explaining what they are seeing, they will neither understand nor appreciate what they are viewing. I know that I’m not alone in harboring that fear, either. We have all been pelted with stories of artists and scientists and creatives of various kinds whose work was completely ignored until after their deaths. At times during the educational process, there seemed to be subliminal messaging that to be creative is to doom oneself to obscurity in this lifetime and fame only after our name has been forgotten.

One prime example that has received a fair amount of attention only in the past few years is the fact that it was women, specifically black women such as Katherine Johnson, Dorothy Vaughan, Miriam Mann, Christine Darden, and Annie Easley, whose work, largely unheralded before the release of a movie about their contributions, was responsible for many of the creative advances in both science and art through the latter part of the 20th century and into the beginning of this one (source). What if the movie Hidden Figures had never been made? Would anyone outside their most immediate family have recognized their creativity before their deaths?

I am thoroughly convinced that a lot of people sit on creative thoughts and ideas, never sharing them or pursuing them to any degree, for fear of being ridiculed, told their ideas are silly, or being told they’re wasting their time. The problem starts when we’re young. Parents and preschool teachers who have a lot on their minds find it too easy to push aside a child whose creative bantering is disruptive. As children enter school, they’re told to sit down, be quiet, let someone else do the talking. By the time they’re teenagers, even those with immense talent in specific public areas of art and entertainment are told they shouldn’t hum while reading, or drum their fingers on the desk, or doodle on their test papers. It is the rare individual who survives this system into adulthood with their creativity fully intact.

Yet, I am fully aware that there is a perfectly legitimate and authoritative argument that knowledge within a particular standardized framework is necessary to develop creativity in more rigid areas of study, such as math, economics, and physics. Economist Tim Leunig argues that creativity is born of skills that are developed in the classroom and cites the manner in which Thomas Newcomen invented the steam engine as evidence. As mentioned previously, there are certain forms of creativity that can only come with a specific amount of knowledge already in place. Leunig and others refer to it as a creative form of literacy without which creativity has difficulty establishing a foothold (source).

Part of the challenge is that creativity in a field such as mathematics is not the same as creativity in the arts. A painter might come up with an elegant manner of expressing a math problem, be completely and utterly wrong about the math problem, and it still is art. If a mathematician were to express the same incorrect problem within the language common to that field, they would be ridiculed, scorned, and possibly driven out of business. 

Julian Astle, the former director of Creative and Learning Development for the RSA, has written that “Creativity is not a single thing, but in fact a whole collection of similar, but different, processes.” Hence, we have difficulty recognizing creativity at different levels and in different fields because we’re looking in too narrow a zone. 

For example, if we’re looking at an Ansel Adams photograph of the American desert, the tendency is to appreciate it for its framing, for the way in which Adams captures light at just the right angle to make the image aesthetically astonishing. What we often miss, however, is Adams’ genius in calculating precisely when that light was going to appear and the conditions that had to exist for it to appear at all. What is often praised for its aesthetic creativity is perhaps more astonishing for its scientific creativity and use of knowledge to create something visually pleasing. While there is no question that the photographer had a creative vision, he also had a creative application of knowledge that facilitated that vision. To fully appreciate the photograph, then, we have to consider not only what was captured but how it was captured and even the manner in which the photograph was processed.

Inversely, the presence of artistic skill does not guarantee creative ability. The Suzuki Method of teaching music, for example, is often criticized for producing musical automatons. Yes, the four-year-old knows how to play Mozart with technical precision, but the aesthetic value is lacking. Music requires more than just an iteration of notes and sounds in a specific order. A digital machine can reproduce the pure sound just as easily as the four-year-old can. However, there is still a noticeable difference between the child’s performance and that of a master such as Yo-Yo Ma. The child is reciting notes on an instrument much as they might recite “Mary had a little lamb.” Ma is creating something new, something different, every time he picks up his cello, even if the notes on the page are exactly the same.

At this point, I have to insert the existence of composer John Cage (1912-1992). Cage was to contemporary Western music what Marcel Duchamp (1887-1968) was to contemporary art. The fact that the two avant-garde artists were friends set up one of the greatest events of public art in the 20th century [you’ll have to read more about that here]. As a composer, however, Cage’s perspective on creativity and music and sound was unique, influenced not only by Dadaism and his fascination with music theory but by Zen Buddhism and the concept of silence.

When, in the 1940s, the Muzak Corporation began piping music into offices everywhere as well as subway platforms and department store elevators, Cage led the revolt by composing the piece 4’33”. He asserted that silence was as important to music as sound, and the premiere performance of the piece in 1952 went something like this:

  • Pianist David Tudor walked on to the stage at a chamber music hall in Woodstock, New York (yes, that Woodstock).
  • Tudor sat at the piano and propped up six black pieces of paper.
  • He shut the lid to the piano.
  • He clicked a stopwatch.
  • At the 30-second mark, Tudor opened the piano lid, paused, then shut it again.
  • Rain began to fall (Cage had nothing to do with that … I think).
  • Tudor repeated his actions after two minutes and 23 seconds.
  • Audience members began to leave.
  • One minute and 40 seconds later, Tudor opened the piano lid, stood up, and bowed. The performance was over.

The audience was livid to the point that some wanted to run Tudor and Cage out of town. The response from every “respectable” music critic in the country ferociously declared that 4’33” was insulting to audiences and to the music community. Even Cage’s own mother told him the work was trash. 

Not everyone saw it as a waste, however. Musicians such as John Lennon and Frank Zappa would later hail it as one of the greatest pieces of music ever written (source). 

Abstract painter Willem de Kooning was once (perhaps apocryphally) debating art with Cage when de Kooning made a rectangle with his fingers and placed it around a scattering of bread crumbs on the table. “If I put a frame around these bread crumbs, that isn’t art,” de Kooning said.

Cage disagreed. “The frame is everything,” he said. 

Out of context, everything is just noise. The sound of wind rustling through the leaves. The whir of a finely tuned car engine. A violin playing a lone melody. All nothing more than irritants until they are provided a frame, a context that reveals the genius of creativity. Suddenly, we see and hear and understand things in a different light, we appreciate their beauty, we place value on their existence.

With that understanding, or at least from that perspective, perhaps it makes sense to say that creativity on its own is just noise. If I write a song, something I did once upon a time, but no one ever hears it, or the people for whom it is played are unable to understand it, what was created holds little value. Sure, I might like it (I rarely do) but is it enough to create for our own understanding or our own pleasure? If we do not create to the benefit of someone or something outside ourselves, is there value to creativity at all? The answer seems to depend on whom one asks.

Who Owns Creative Property?

Who owns this mess

If there is value to creativity, and let’s assume for the moment that there is, if for no other reason than that the deepened depression that comes with the alternative is debilitating, then there is an inevitability to the question of who owns that value. Normally, I would reference some piece of law at this point, but when it comes to the overall survey of creativity, the law only serves to confuse and discourage us even more. This topic is a real-world nightmare that does nothing more than make millionaires of lawyers who spend years arguing without end. We have constructed this mess by attempting to hold the value of creativity to something that can be bought, sold, traded, franchised, and licensed. None of it makes a damn lick of sense and it only serves those whose understanding of creativity is completely self-serving.

A significant portion of the week has had the perils of Taylor Swift filling my Twitter feed. The country-turned-pop diva left her label, Big Machine, because of alleged improprieties on the part of Scooter Braun, one of the company’s bigwigs. No, it’s not because it’s impossible to take seriously anyone named Scooter. This runs deep and has its own legal issues taking place somewhere else. This week’s particular challenge is that, in exchange for spending millions of dollars building Ms. Swift’s career, Big Machine owns the rights to all the songs she recorded during that period, even if she wrote them herself, which applies to a large portion of her back catalog. Scooter was not part of Big Machine while Ms. Swift was under contract there. He bought the label after Ms. Swift had left. Because of their previous legal difficulties, everyone knew it was just a matter of time before this became nasty.

This week, Ms. Swift claimed that Big Machine was refusing to allow her to perform any of her old songs at the upcoming American Music Awards. Their alleged justification was that doing so amounted to re-recording the songs (because the show is taped) and Swift isn’t allowed to do that until next year.

Scooter says, “Did not, she’s just trying to get me in trouble.” Okay, those weren’t the exact words, but the explanation issued on Friday reminded me far too much of the arguments between children when a parent was not present to witness the alleged grievance. The whole mess is missing any substantial evidence on the part of either party and, quite honestly, the best response might be to send all parties to their rooms without any dinner.

What the ongoing argument does, however, is highlight the perils and, often, the futility creatives face when attempting to monetize their creations. Every form of copyright and patent law upholds the rights of the creator to claim ownership of the created—sort of. If one discovers something or creates something of value while in the employment of another entity who might benefit from that discovery or creation, then the employer may own the rights to what was created. Check the small print of your employment contract. This is just the tip of a very big iceberg where the matter of creative rights depends on the specific circumstances around the how, where, when, and why of creation, complicated by whether it was sold, how it was sold, and whether the person doing the selling had the rights to sell in the first place. Yes, the whole mess is muddy and discouraging.

There are basically three general areas of protection: patent, copyright, and license. The most simple breakdown goes something like this:

  • Patents apply to physical objects or processes involving physical objects or the plan/concept for physical objects.
  • Copyright applies to any item created through the general artistic process, regardless of the medium or the manner in which the item might be presented.
  • License is the means through which a patent or copyright holder allows someone else to utilize, perform, display, or otherwise make use of that protected property.

Seems simple enough, doesn’t it? But, of course, nothing ever is as easy as we’d like and there are more crooks and crevices within intellectual property law than one could adequately cover in a dozen books. 

One of the most significant problems comes when one tries to sell something they’ve created. For centuries, especially within the field of the arts, once something was sold, whether a song or a photo or a sculpture, ownership moved from the creator to the buyer. The buyer was then free to do whatever they wished with the object, even to the point of destroying it. Creatives often felt left out when the buyer would then go on to make a fortune re-selling their creation. I cannot help but think of this every time I see a painting selling at auction for millions of dollars. Be sure, the artist isn’t making a freaking dime from that resale.

Licensing was developed as a way for creatives to continue making money off their creation as the value of that creation grows. For example, if the Associated Press called me up and asked to use one of my photos of the Vice President, I would likely sell them a limited use license that allows for a specific manner of distribution while maintaining the copyright in my own possession. I could then enter into a similar agreement with another media entity if someone else asked to use the same photo. 

The problem with licensing is that it may work too well. When the concept was developed in the 1920s, it centered primarily on intangible assets. However, with the advent of computers, software companies such as Microsoft utilized the concept of selling licenses so that they could re-sell and simultaneously limit the use of their software, creating different rules and pricing for differing circumstances. As more and more of the creative world has moved to the use of digital tools, we’re finding that many of those tools require individual licensing.

For example, not only do I have to license Photoshop in order to process my photographs, but I also have to license fonts for various type treatments, brushes and patterns for various effects, and even some specific color palettes. This drives up the cost of every image I process. I have the choice, then, to either absorb the license fees as a cost of doing business or attempt to reclaim them by adding them on to the price of images that are sold.

I don’t especially like the licensing system, though. Imagine if the same philosophy were applied to building a house. I might license the lumber from Home Depot, my hammer from Stanley, my saws from Stihl, and my nails from someone else. Obviously, I would factor the cost of those licenses into the price of the house, but what happens if, in the middle of the project, Stanley decides to discontinue the license for the hammer I’m using? I’m supposed to return the hammer and obtain a new model which, big surprise, costs twice as much. This impacts the cost of building the house, but the person buying the house is likely to be quite upset and may even cancel the contract if I go back mid-project and try to raise the price.

Another sore spot in the area of digital licensing is that many products are licensed based on a subscription. Maintain the subscription and the license is in force. Drop the subscription and one can no longer use the product. Never mind that the real value of the product is considerably less than the accumulated subscription cost; continuing to use the tools without the subscription is a copyright violation.

Yet, the people who created those tools deserve to be justly compensated, do they not? And because a digital product is intangible, it is subject to licensing where products such as lumber and hammers and saws are not. The situation exists because so many of the creatives involved are freelance, part of a gig economy that leaves fair payment for one’s creativity up to an ungrateful end user who thinks they should get everything for free, including the end product. Instead of supporting creatives by buying products and services outright, the society that should be nurturing creativity in all forms instead starves it to death with inappropriate payment systems that keep us all on proverbial street corners looking for handouts.

And that leads us to the final thought.

Are Creatives Crazy Or Are Crazy People Creative?

who are you calling crazy

Honestly, I don’t know creative people in any field who haven’t had their bouts with mental illness of one form or another. I sit here almost every Saturday questioning my value, wondering if I’m the only one who thinks my work has value, and questioning my worth as a person. Plenty of others have it worse, fighting suicidal thoughts on a regular basis and dealing with urges of self-harm. We may make jokes about van Gogh cutting off his ear, but the number of creatives across every field who hide scars with long sleeves or, more recently, heavily inked tattoos, is higher than anyone can accurately measure. Not only do we suffer, but most also suffer in complete silence.

As I’ve looked at this subject in sometimes painful detail, I have found it interesting how many psychopathological challenges have been found to be common among creatives:

  • Depression
  • Anxiety
  • Bipolar Disorder
  • Schizophrenia
  • Manic ideations
  • Suicide

Every study seems to have its favorite malady and plenty of famous anecdotal subjects who conveniently fit the diagnosis of that particular psychopathology despite not being available to participate in an actual study, usually due to having been dead for a hundred years or so.

On the surface, it’s easy enough to accept such studies because of our own need to explain the mood swings, the sudden outburst of anger followed by uncontrollable crying, hearing voices when no one else is in the room, and the persistent urge to drive one’s head into a wall, among other symptoms. 

The fly in this seemingly obvious ointment is Albert Rothenberg’s book Flight from Wonder: An Investigation of Scientific Creativity. In preparation for this book, Dr. Rothenberg interviewed 45 Nobel Laureates and failed to find a single instance of a psychiatric disorder. None. Zero. Some of the most creative people in the world and they don’t exhibit any of the plagues that seem to haunt the minds of others. That kind of puts a pin in all the other studies that looked at more “average” creatives.

Maybe part of the problem is that we’re not reaching our creative potential and that is making us crazy? There’s certainly an argument for that, but there is no hard scientific evidence in support of the theory. 

What does seem almost certain is that Cognitive Disinhibition plays a role in what is at the very least considered artistic eccentricity. Cognitive Disinhibition is the inability to ignore the things we would be better off ignoring. You know, like constantly chasing rabbit trails instead of sticking to the research one needs to do. For anyone who has Cognitive Disinhibition, the Internet and especially social media are like death traps. The overabundance of information, constantly changing and being updated, feeds that inability to filter out information we don’t really need to know (source).

Where does that leave us? A 2013 study says this:

“Reduced cognitive filtering could explain the tendency of highly creative people to focus intensely on the content of their inner world at the expense of social or even self-care needs. (Beethoven, for example, had difficulty tending to his own cleanliness.) When conscious awareness is overpopulated with unusual and unfiltered stimuli, it is difficult not to focus attention on that inner universe.”

That might explain how many creative people end up seeming antisocial or having difficulty participating in social events. The same researcher says in a similar study:

“In all of our studies and analyses, high IQ, when combined with low LI, was associated with increased creative achievement. These results are particularly stunning in the analysis of eminent achievers and high-functioning controls. High IQ clearly appeared to augment the tendency toward high creative achievement characteristic of low-LI individuals.

These results lend support to the theory that there may be qualitative (e.g., failure to filter out irrelevant stimuli) as well as quantitative (e.g., high IQ) differences in the processes underlying creative versus normal cognition.”

Just for clarity, LI in this instance stands for latent inhibition, “the varying capacity of the brain to screen from current attentional focus stimuli previously experienced as irrelevant.” So, to summarize, intelligent people who are easily distracted are also more likely to be more creative. That’s nice to know, I suppose, but it doesn’t explain why so many creatives are happy taking a handful of sleeping pills and never waking up.

Hold on, Dr. Carson isn’t done. In yet another article she and her colleagues write:

“…These results also support the theory that highly creative individuals and psychotic-prone individuals may possess neurobiological similarities, perhaps genetically determined, that present either as psychotic predisposition on the one hand or as unusual creative potential on the other on the basis of the presence of moderating cognitive factors such as high IQ (e.g., Berenbaum & Fujita, 1994; Dykes & McGhie, 1976; Eysenck, 1995). These moderating factors may allow an individual to override a “deficit” in early selective attentional processing with a high-functioning mechanism at a later, more controlled level of selective processing. The highly creative individual may be privileged to access a greater inventory of unfiltered stimuli during early processing, thereby increasing the odds of original recombinant ideation. Thus, a deficit that is generally associated with pathology may well impart a creative advantage in the presence of other cognitive strengths such as high IQ.”

Translation: The whole matter may be one of genetics. The same genes that result in mental incapacities in some people may create “unusual creative potential” in others, with the possibility that a person can shift back and forth between the two. In short: we’re born this way, baby.

Oh, but this gets way crazier. If we recognize that there’s a problem, we have to try and solve it, right? Famously, Timothy Leary and others tried using LSD and other drugs, and while it might have made them more creative for a period, it also made any mental issues worse. So, we’ve all been told to stay away from psychedelic drugs.

Until a couple of years ago. Microdosing. Are you familiar with the term? It’s when a drug is administered at levels significantly lower than the norm. One of its most common uses is in hormone therapy where it’s shown significant promise. Now, apply that to psychedelic drugs, specifically LSD.

A 2018 study showed that people who microdose LSD and mushrooms score higher on wisdom, creativity, and open-mindedness while scoring lower on dysfunctional attitudes and negative emotionality. While this is far from being any kind of a cure, it is some sign that there are at least options that might momentarily mute some of the more negative symptoms that creatives regularly endure.

Pardon Me While I Soak My Head

I'm done

Seriously, my head is throbbing. It’s now late Saturday night, stress has created a pain at the base of my skull, and I’m trying to find a way to wrap up this bitch of an article so I can take a hit of scotch and go to bed. I’m not convinced that all this research this week has actually solved anything except that I have a lot more information in my head now to contribute to all the Cognitive Disinhibition. 

Here’s where my brain is at for the moment.

  1. Those of us who are genuinely creative are damn lucky. There are a lot of people who work in creative-related areas who can’t actually produce a damn thing but have been led to believe that they are creatives. Their frustration is significantly higher than that of the rest of us and many end up in mental institutions … doing art therapy.
  2. Creativity has a mind of its own and shows up whenever, wherever, and for whatever reason it wants. There are a thousand ways to stimulate the creative mind and no, not all of them are healthy, but when every molecule in your brain is telling you that you have to create something then consequences be damned, we’re going to create. Something.
  3. Creativity can be the answer to a math problem no one else can figure out or a smattering of bread crumbs on a table or the cacophony of a dozen ring tones smashed together and punctuated with rhythmic silence. What matters is the frame, the context, how one allows others to experience their work. If you think you’ve made a freaking masterpiece then show it off like a freaking masterpiece, not in your mother’s garage.
  4. What you create is always a part of you even if it is no longer with you. Possession is an illusion. If you create something, it is yours. If someone else can riff off what you created, let them, because in doing so you celebrate the creativity you both share. Nothing worthwhile deserves to be locked away by any means physical, contractual, or digital. Sing your songs. Make your art. Discover new worlds. Let no one tell you no.
  5. It’s not being creative that produces mental illness; it’s the pressure, whether internal or external, to create that drives us right smack over the edge. Creatives are under constant pressure to produce more and, as we do, it is supposed to be different and better and more astonishing than what we did last time. Feel free to call bullshit on that whole scenario.
  6. Someone needs to be taking care of creatives because, for the most part, we do a lousy job taking care of ourselves. We’re a mess, y’all. And while we should embrace the mess that we are, let’s get real and appreciate that there are probably days/weeks/months that we shouldn’t be left alone in a room where there are sharp objects. We need people to check on us and not believe us when we say that we’re fine. We’re creatives. We’re not “fine.”
  7. We all need more sleep.

There is a long-haired orange tabby kitten peering over the edge of my laptop, most likely wondering if I’m going to get anything to eat and, if I do, whether he can mooch some of it. He gets his balls lopped off on Monday. We are removing an element of creativity from him.

Too many days I feel as though I’ve had my creative balls lopped off. I go back over the questions I’ve asked here and, despite all the research, I can’t answer any of them. Then, a poem comes to mind from the pen of Alfred, Lord Tennyson, whose depression and exhaustion drove him into a manner of solitude. He wrote, in part,

“Forward, the Light Brigade!”
Was there a man dismayed?
Not though the soldier knew
   Someone had blundered.
   Theirs not to make reply,   
Theirs not to reason why,
   Theirs but to do and die.
   Into the valley of Death
   Rode the six hundred.


Cannon to right of them,
Cannon to left of them,
Cannon behind them
   Volleyed and thundered;
Stormed at with shot and shell,
While horse and hero fell.
They that had fought so well
Came through the jaws of Death,
Back from the mouth of hell,
All that was left of them,
   Left of six hundred.


When can their glory fade?
O the wild charge they made!
All the world wondered
Honour the charge they made!
Honour the Light Brigade,
Noble six hundred!

My creative friends, we are the six hundred. Charge on.

Climate Change Requires A Radical Response

Rhetoric is irrelevant; climate change now threatens everything, eliminating the opportunity for a measured response.

Note: I realize that since we’ve spent 20 weeks with the novel there haven’t been a large number of links in what we’ve written. Our links are not underlined or colored, they’re bold italic. When you see anything in that format, click it for more information.

Not that anyone will. Our click rate is something like 0.001 percent. Either you trust me too much or you don’t care. This is part of the problem. We all need to click those links.

Eagle Creek State Park Ooze

Didn’t We Already Talk About This?

Driving across town recently, I found myself increasingly frustrated by how quickly the needle was descending on my gas gauge. Traffic was horrid, people were weaving in and out of lanes with little regard for safety, and I was late. In conditions such as these, I find myself thinking that there has to be a better way. We’re on the cusp of 2020, after all. All the 20th-century science fiction promised us something better by now. Why aren’t we there?

Then, a sports car passes doing nothing short of 90 miles per hour, black smoke belching from the exhaust, swerving dangerously through traffic, at times crossing four lanes and then back, cutting off a semi whose driver has to brake hard to prevent a significant accident. Words similar to “fucking idiot” come out of my mouth. This happens far too often and it always surprises me how many people I see driving like this. I’m both angry and disappointed.

Seven minutes later (yes, I checked), I’m sitting at a stoplight. I look over at the car on my left and guess what: it’s the same speeding dude who passed earlier. All that noise, pollution, and danger at high speed, and it got him to the same place at the same time as driving slower got me. I look at him and glare, hoping maybe he’ll look my direction. He doesn’t. The light changes and he leaves a trail of rubber as he speeds off.

As I watch his trail of pollution disappear in front of me (for the distance of two more stoplights where I’m in front of him this time), it occurs to me that drivers like him are the reason we don’t have flying cars. People drive badly enough on the ground. Can you imagine the chaos and disaster that would occur if we allowed them to take flight? Getting people into autonomous cars is likely to be one of the greatest life-saving events in vehicular history.

What bothers me more, though, is that it’s almost the end of another decade and as I’m driving across this midsized midwestern city I can see a blue/pink haze hovering around the city’s skyline. This is mid-October. We don’t have the extreme heat to blame for creating an “ozone action day.” There are no longer big factories downtown belching black smoke into the sky. The horizon should be clear, but it’s not. Once again, I’m prompted to ask why this is happening.

There’s little question that people are to blame. This haze is caused by too many vehicles with bad exhaust, people still mowing their lawns, burning leaves in the backyard, greasy exhaust from commercial kitchens filtering into the air and catching dust and other particles, and other seemingly innocuous elements of everyday life that all add up to create an environment that is not only bad for our own lungs but is destroying the planet as well.

We hear a lot about climate change and global warming today as a political issue more than a scientific matter because the world is at a tipping point. If we don’t initiate significant change quickly, the effects could become irreversible within the next 30 years. After that point, if we’ve not significantly reduced CO2 emissions, the planet starts fast-tracking its way toward being uninhabitable and there is nothing we can do to stop it.

What bothers me most is not the ridiculous denial on the part of short-sighted people with a lot of money and power, but the fact that we’ve been aware of the problem for almost two hundred years and have done next to nothing to stop it. Seriously. This is so not a new issue that had we responded appropriately at the first alarm, a half dozen generations could have been raised never knowing there was a problem. Science historian Spencer Weart compiled this short history of how our knowledge of climate change has developed.

  • 1824 – Joseph Fourier discovered the greenhouse effect.
  • 1859 – John Tyndall discovered that H2O and CO2 absorb infrared, confirming Fourier’s greenhouse effect.
  • 1896 – Svante Arrhenius proposed that human CO2 emissions would prevent the earth from entering the next ice age (challenged in 1906).
  • 1950s – Guy Callendar found that H2O and CO2 absorption did not overlap across all spectral bands, so warming from CO2 was to be expected (countering the 1906 objections against Arrhenius).
  • 1955 – Hans Suess identified the isotopic signature of industrial CO2 emissions.
  • 1956 – Gilbert Plass calculated that adding CO2 would significantly change the radiation balance.
  • 1957 – Revelle and Suess suggested the oceans would absorb less CO2 than expected, causing more global warming than predicted.
  • 1958/1960s – Charles David Keeling proved CO2 was increasing in the atmosphere.
  • 1970s/80s – Syukuro Manabe and James Hansen began modeling climate projections.
  • Current – NCAR, GISS, Hadley, CRU, RSS TLT, UAH, MSU, glacier melt, sea level rise, and latitudinal shift data all confirm the models.

Mind you, that’s the short version. Weart offers a little more depth in his book, The Discovery Of Global Warming. The amount of science supporting and providing evidence of this cataclysmic problem is ponderous. So, why the hell are we so incredibly slow to do anything about such an obvious problem? The answer lies within the foundations of human character in the 21st century: We are lazy and we are cheap.

Numerous solutions have been available between 1824 and now. We’ve had plenty of opportunities to avoid this last-minute panic. Yet, we are a society that celebrates a culture of procrastination, starting in school when we wait until the last minute to finish a project or cram for a test, and that refuses to buy anything that isn’t on sale for less than it costs to produce. As a result, we have simultaneously eroded not only the environment but the retail economy as well.

Because of our procrastination, we have reached a level of emergency where the solutions still available to us are going to require billions, perhaps trillions, more dollars and an even greater, more drastic change to our lifestyles and cultures than could ever be considered comfortable. If we are going to survive, however, we have no choice. We have to be willing to make sacrifices and piss off people in power in order to actually get something done, even if it means working outside the permission and purview of governments. As a society, we can no longer wait for governments to lead the way. We must go around them, or over them, in order to maintain human viability on this planet. Hold on tight, this is going to get ugly.

The Truth Is More Radical Than We Realized

Eagle Creek Park Oil Slick

When former Vice President Al Gore presented the concept of severe climate change under the banner of An Inconvenient Truth in 2006, he did so with the deft touch of an experienced politician: He played it soft. He knew far too well that Americans wouldn’t be able to handle the enormity and seriousness of the problem had he gone full-tilt with all the alarming facts available to him. Even soft balling it, he was still called a radical and a fear-monger by just about everyone in any position of power. What Americans didn’t see is that Mr. Gore’s actions scared the living shit out of those who control the money and, by extension, the economy of the United States. He presented to them a problem whose only solution required an extensive overhaul of investments, one that would produce less return and therefore less profit. Theoretically, they could have embraced their role and responsibility and, if so, we probably wouldn’t be having the conversation we are now. Instead, they got mad, painted Mr. Gore as a liar and radical leftist (as though there’s anything wrong with being a radical leftist), and invested hard-core into climate change denial.

The other challenge standing in the way of easy acceptance of the severity of climate change is the fact that all the genuinely informative and factual studies are written in academic science language, something the average person doesn’t understand, doesn’t see the importance in understanding, and therefore holds the summaries suspect because they don’t understand a damn thing the paper just said. Let me try and help you out there a bit.

Last year (2018), the Intergovernmental Panel on Climate Change issued a report on Global Warming of 1.5℃. Right there, in the title, they lost the vast majority of Americans who might, depending on their age, have been taught in school how to convert between Celsius and Fahrenheit (hint: you multiply the Celsius temperature by 1.8, then add 32; for a temperature change such as 1.5℃, though, only the 1.8 multiplier applies, so we’re talking about a 2.7℉ rise). If we’re going to get alarmed over what seems to be a relatively low amount, we need to understand what that increase means.
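If the conversion math is the sticking point, here’s a minimal sketch of the arithmetic. The 15℃ input below is only an illustrative round number for a global mean temperature, not a figure from the report.

```python
# Celsius/Fahrenheit arithmetic, spelled out.
# "Multiply by 1.8, then add 32" converts a temperature reading;
# a temperature *change* needs only the 1.8 factor.

def c_to_f(temp_c):
    """Convert an absolute Celsius reading to Fahrenheit."""
    return temp_c * 1.8 + 32

def c_change_to_f_change(delta_c):
    """Convert a Celsius difference to a Fahrenheit difference."""
    return delta_c * 1.8

print(round(c_to_f(15), 1))                 # 59.0, an illustrative round number for a global mean
print(round(c_change_to_f_change(1.5), 1))  # 2.7, what a 1.5 degree Celsius rise means in Fahrenheit
```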

For example, in 1980, the mean temperature for the planet was around 57℉. By 2015, that had risen to 61℉ and that’s when we saw scientists begin to scream, “Oh shit!” and start throwing major conferences on just how severe the problem has become. If we take the 2015 number and even add one full degree (which is where we were this past summer), the conditions become rather worrisome.

Why get so upset over one degree? Because it only takes as little as five degrees difference to take the planet from nice, reasonably livable conditions to being buried under thousands of feet of snow or, if it goes the other direction, a complete desert with no surface water available anywhere.

Spinning your little head a bit? I know, on the surface it doesn’t appear to make sense because we see more than five degrees fluctuation in a single day, especially this time of year. In the Midwestern United States, it’s not the least bit unusual for some days to see a thirty-degree shift between morning and evening temperatures. If we can endure that with no problem, how is complete devastation possible because of only five degrees?

Our friends at the NASA Earth Observatory explain it like this:

The global temperature record represents an average over the entire surface of the planet. The temperatures we experience locally and in short periods can fluctuate significantly due to predictable cyclical events (night and day, summer and winter) and hard-to-predict wind and precipitation patterns. But the global temperature mainly depends on how much energy the planet receives from the Sun and how much it radiates back into space—quantities that change very little. The amount of energy radiated by the Earth depends significantly on the chemical composition of the atmosphere, particularly the amount of heat-trapping greenhouse gases.

While land temperatures fluctuate wildly, the rapid warming of the earth is taking us quickly toward a condition where human life is no longer sustainable. And how hot is too hot?

1.5℃ above pre-industrial levels. Spoiler alert: we were already 0.79 degrees warmer in 1980, and the earth’s temperature hasn’t gone down since.

Now that we understand why the title of this report is alarming, let’s look at some of its findings. 

  • Human activities are estimated to have caused approximately 1.0°C of global warming above pre-industrial levels, with a likely range of 0.8°C to 1.2°C. Global warming is likely to reach 1.5°C between 2030 and 2052 if it continues to increase at the current rate. 
  • Climate models project robust differences in regional climate characteristics between present-day and global warming of 1.5°C, and between 1.5°C and 2°C.
  • Estimates of the global emissions outcome of current nationally stated mitigation ambitions as submitted under the Paris Agreement would lead to global greenhouse gas emissions in 2030 of 52–58 GtCO2eq yr−1 (medium confidence). Pathways reflecting these ambitions would not limit global warming to 1.5°C, even if supplemented by very challenging increases in the scale and ambition of emissions reductions after 2030 (high confidence). Avoiding overshoot and reliance on future large-scale deployment of carbon dioxide removal (CDR) can only be achieved if global CO2 emissions start to decline well before 2030 (high confidence).

Now, let’s break this down to a third-grade reading level. The first point is one we’ve heard, and denied, for 30 years. Human activity is causing the earth to warm. 1.5℃ warmer is when bad things start to happen. Those bad things cannot be reversed. Nothing here is new; we’ve just argued over it so long that it’s now an emergency.

The second point is that 1.5℃ is the LOW end of the scale. Regionally, some areas of the planet will see warming to 2℃. This is bad. This is very bad. A 2℃ increase means people cannot live there. People will have to move. Global migration increases. Global food supplies are not enough. Some animal species will die out completely. Global resources are too small to handle those changes. 

The third point is, perhaps, scarier. Even if everyone followed the Paris Climate Agreement like they’re supposed to, it’s not going to be enough to prevent the planet from warming to 1.5°C. The “solutions” we have now are not enough even if everyone played along and the US isn’t playing at all. Our government is going in the opposite direction as quickly as possible.

Let’s talk more like grownups again. Our ridiculous arguments over whether the science is real have cost us dearly in terms of the time available to find and enact an appropriate solution. The fact that climate change is even a question in anyone’s mind reflects a depth of ignorance and/or stubbornness that may have to be declared criminal if we are to avoid the complete devastation of the planet.

Even among those who do accept that climate change is happening, there has not been enough alarm over how severe the consequences are going to be within the next ten or so years. Let me say that again: ten years. 2030 sounds distant to many people, but it isn’t anymore. We’re not looking at only the loss of every major coastal seaport and a redefining of beachfront property by several miles, we’re looking at massive drops in food production. As sea levels rise, saltwater intrudes into coastal freshwater supplies, making them undrinkable. Production rates for crops such as wheat, rice, potato, soybean, sugar beet, alfalfa, cotton, tree and vine crops, and most vegetable crops decrease as CO2 levels and temperatures rise (a long and scientific explanation of why can be found at The National Academies of Sciences, Engineering, and Medicine).

Not everything is going to wait ten years before becoming problematic, either. Global migration is already an issue and is only going to worsen as more areas of the world become uninhabitable. Europe is already feeling the pain, and migration there is expected to triple over the next ten years. The World Bank Group estimates that 140 million people from sub-Saharan Africa, South Asia, and Latin America will be displaced by 2050. As that migration takes place, political, cultural, and social strains easily result in outbreaks of violence as bigotry, racism, and discrimination fueled by rampant Nationalism become more of a problem than they already are.

A 2016 presidential memorandum addressed the extent to which climate change presents a threat to national security in the United States. That memorandum said, in part, 

“Extended drought, more frequent and severe weather events, heat waves, warming and acidifying ocean waters, catastrophic wildfires, and rising sea levels all have compounding effects on people’s health and well-being. Flooding and water scarcity can negatively affect food and energy production. Energy infrastructure, essential for supporting other key sectors, is already vulnerable to extreme weather and may be further compromised.” 

However, the current administration revoked that and other climate-change-related memos, choosing to completely ignore the severe danger. The administration’s opinion seems to be that if it’s not making money, it’s not important. Such an incredibly ignorant and short-sighted approach doesn’t merely threaten the economy and the stock market the president seems so worried about, but also the lives and well-being of every person in the United States.

What we’re looking at is an ecological and economic disaster of a magnitude far greater than that of the Great Depression nearly a century ago. The less that is done, not only by the United States but by every government across the planet, the greater the risk that we hit that 1.5℃ mark and blow right past it. If we wait for the natural order of politics to provide change, we inevitably find ourselves facing a situation where we can no longer focus on prevention and instead are forced to find more radical ways to respond to the crisis.

A Desperate Situation Requires A Radical Response

Dead Conch

The days for a moderate, careful response to climate change passed thirty years ago. We are now in a situation where mass migration, drought, new deserts, food shortages, severe coastal flooding, agricultural failure, economic inflation, and all the social unrest that goes with those conditions are inevitable unless we make dramatic and uncomfortable changes. Those changes inevitably mean upsetting the status quo, thereby defying the powers that be and making at least half the population angry. We know that before ever starting.

In her new book  “On Fire: The Burning Case for a Green New Deal,” Naomi Klein compares the modern situation and “radical” proposals to the era that prompted Franklin Roosevelt’s New Deal. She writes:

The skepticism is understandable. The idea that societies could collectively decide to embrace rapid foundational changes to transportation, housing, energy, agriculture, forestry, and more— precisely what is needed to avert climate breakdown—is not something for which most of us have any living reference. We have grown up bombarded with the message that there is no alternative to the crappy system that is destabilizing the planet and hoarding vast wealth at the top. 

From the start, elite critics derided FDR’s plans as everything from creeping fascism to closet communism. In the 1933 equivalent of “They’re coming for your hamburgers!” Republican senator Henry D. Hatfield of West Virginia wrote to a colleague, “This is despotism, this is tyranny, this is the annihilation of liberty. The ordinary American is thus reduced to the status of a robot.” A former DuPont executive complained that with the government offering decent-paying jobs, “five negroes on my place in South Carolina refused work this spring . . . and a cook on my houseboat in Fort Myers quit because the government was paying him a dollar an hour as a painter.”

Far-right militias formed; there was even a sloppy plot by a group of bankers to overthrow FDR.

Self-styled centrists took a more subtle tack: In newspaper editorials and op-eds, they cautioned FDR to slow down and scale back.

The rhetoric of nearly 100 years ago hasn’t changed. As Americans, dramatic change scares us. Being told that we might have to be temporarily inconvenienced in order to make things better makes us angry. Consider the typical response to large expanses of road construction. We fuss and fume about the detours and the heavy traffic and the inevitable delays. We deride construction workers for not moving fast enough. We curse at the long lines. Yet, when the work is done and the roads are smooth, there’s no denying that, as uncomfortable as the construction period was, it was necessary to keep the entire road from falling apart.

Our environment is at exactly that same stage. We are on the verge of having the entire planet crumble underneath our feet. If we are to have any hope of preventing total collapse we have to begin work right now and accept the fact that some very basic elements of life and economics in the United States and around the world have to change. 

Painful truth: change is going to happen one way or another. Either we can take steps in an attempt to control at least some of that change, or we can let it happen to us and suffer the consequences. All the bad things possible will happen if we sit on our ever-expanding backsides and do nothing.

An all-too-perfect example is the United Kingdom’s decision three years ago to leave the European Union. When the vote first passed, the UK government had the time and opportunity to craft a workable departure that would have minimized the economic impact. Parliament made the decision not to do that. They fussed. They argued. They refused to cooperate with anyone under any circumstances. Those who wanted to stay in the EU dug in their heels and refused to consider any compromise. As a result, they are now at a point where they’re having to consider significantly more dramatic and uncomfortable actions to keep the country from leaving without the benefit of trade or any other treaties, a failure that would not only upend the UK economy but could send the entire global economy into a downward spiral.

Stubbornness and commitment to petty ideals have been the death of many solutions that could have already saved us from being in this frightful situation. We have reached a point where politicians can no longer be trusted to lead on environmental issues. Instead, our best option is to appeal directly to state and local governments, private corporations, and non-profits to take the actions federal governments will not and make changes even in defiance of federal regulations.

Another example: In July of this year, automakers Ford, Volkswagen, Honda, and BMW openly defied the federal government’s rollback of fuel emission standards for new vehicles by signing on to a California deal that decreases greenhouse gas emissions by 3.7 percent each year through 2026. Yes, the change will increase the price of new vehicles, but the long-term benefit to the environment is far greater. The president’s objection reflects fear from the oil and gas industry, as the new vehicles also improve fuel economy to as much as 50 miles per gallon, potentially putting a severe dent in industry profits.

At this point, however, any argument against improvement of the environment is irrelevant no matter who is doing the arguing. To defend the status quo on the basis of one industry’s profit or loss is unconscionable. If the planet overheats by 1.5℃, the net effect is going to be severe enough to crash all economies on its own and at that point, there is nothing the federal government can do to stop it.

Change Deliberately Or Consequentially

All the denial and arguing in the world isn’t going to stop the warming from happening. By 2030 either we’ve taken the dramatic steps necessary to slow the warming (completely stopping it at this point probably isn’t an option) or we pretend to act surprised when all the things about which we’ve been warned become severe enough we can no longer deny their consequences. Either we care about the sustainability of life on this planet or we don’t. If we do care, we’re going to have to take some dramatic steps quickly. 

What steps make the most difference? The ones that are the least comfortable. Walk with me here.

Agriculture

We have to change the way we’ve been farming. Sure, it’s been productive—the United States provides food for more people than any other country in the world and employs some 827,000 people. However, agriculture is also the fourth-highest source of greenhouse gas emissions in the country. Oops. We’ve been talking about using more sustainable farming methods since I was a kid and some farmers have made moderate changes.

Get ready to be upset, though, because all that “organic” nonsense that everyone’s been screaming about the past ten years? It’s got to go. Organic farming increases greenhouse gas emissions. Full stop. Those benefits you think you’re getting are not worth losing the entire planet. 

Better animal breeding practices could reduce methane emissions by 10-20% and better-pasturing techniques could double that. However, feed alternatives are where a lot of reduction can take place, as much as 52% in some studies. Dietary oils are key and there are several other feeding methods that show promise.

Improving crop rotation practices and manure-holding procedures, reducing the amount of fallow ground, and switching from fossil-fueled to electric pumps and motors are all things some farms have started, but the process is expensive and smaller farms need financial assistance in making those changes. The difference, however, is worth whatever financial investment is necessary.

Transportation

This one’s going to hurt. The problem is not only that we drive too much but that the vehicles we use when we do drive are amazingly inefficient and are made more so by the inferior condition of roads and highways. Everyone’s been screaming about infrastructure investment for years, but the money still hasn’t shown up, and where it has, the funding has been focused on propping up bad systems rather than replacing them.

First, we need to ditch vehicles using fossil fuels ASAP. The most recent studies show that newer electric-powered vehicles (not the ones from ten years ago) reduce CO2 emissions by as much as half and the technology is only improving. Here’s the thing: we can’t wait for everyone to buy a new electric car in the natural course of individual car buying. Department of Transportation figures show that it takes 11 years on average to get a car off the road. We don’t have that much time. That means we have to eliminate used car sales for non-electric vehicles and provide tax incentives, subsidies, and vehicle buy-back programs to encourage the purchase of new electric vehicles.

Even that move, as drastic as it is, falls short of what we need to get CO2 emissions back in line. We still need to drive less and we also need to reduce the number of airplanes in the sky. Whether hauling people or cargo, the average commercial airplane produces a little over 53 pounds of CO2 per mile. One 2010 study shows that over 10,000 people are killed each year just from the pollution that planes emit. The most readily available solution to both is investing in high-speed rail systems that utilize electric power. Localized high-speed rail systems in major cities, combined with severe reductions in individual car use (most likely implemented by changes in driving laws), would not only reduce carbon dioxide emissions but could save lives by reducing road fatalities.
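To put that per-mile figure in perspective, here’s a rough sketch of what it means for a single long flight. The 2,450-mile New York-to-Los Angeles distance is my own illustrative assumption, not a number from the study.

```python
# Rough scale check using the ~53 lb of CO2 per mile figure cited above.
# The 2,450-mile New York-to-Los Angeles distance is an illustrative
# assumption, not a figure from the study.

LB_CO2_PER_MILE = 53            # cited above
LB_PER_METRIC_TON = 2204.6

def flight_co2_tons(miles):
    """Approximate CO2 for one flight, in metric tons."""
    return miles * LB_CO2_PER_MILE / LB_PER_METRIC_TON

print(round(flight_co2_tons(2450), 1))  # roughly 59 metric tons for one cross-country flight
```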

Yes, it’s expensive. Yes, it’s a move that upsets the current economy and shifts power away from traditional sources, but we have to make these moves if we’re going to continue living on the planet.

Elimination Of Fossil Fuel Use

Talk about upsetting the status quo: we’ve had the means to wean ourselves away from our dependence on fossil fuels for at least 30 years and we’re afraid to make the move because of this prevailing myth, pushed by big oil and related industries, that the effect on the economy would be devastating. It’s all bullshit. We’re talking about eliminating a source of fuel, not the demand. Therefore, the economic impact only hurts those companies who are unwilling to make the shift. Already, big oil has started investing in renewable energy sources and European producers are doing so at a significantly faster rate than US and Asian producers. If governments shift oil subsidies to renewable sources, the same dollars stay in the market, continue employing high numbers of people at higher-than-average wages, and the economy benefits. This is not an economic issue but a power issue. Given the corruption we’ve seen in the fossil fuel industry, a power shift isn’t a bad thing.

We can provide all the power needs for the entire planet with solar panels covering 0.3 percent of the earth’s land surface (source). Yes, that’s a lot of land, but since solar isn’t our only choice we can reduce the land use significantly and still make sure the entire planet has more than enough energy not only for current needs but for the increasing needs of the future.
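For the skeptical, here’s a back-of-envelope check of that land-area figure. Every input below is a ballpark assumption of my own rather than a number from the linked source: roughly 18 terawatts of average global energy demand, about 200 watts per square meter of year-round average sunlight in sunnier regions, 20 percent panel efficiency, and roughly 149 million square kilometers of land.

```python
# Back-of-envelope check of the "0.3 percent of land" claim.
# All inputs are rough ballpark assumptions, not figures from the source above.

GLOBAL_DEMAND_W = 18e12          # ~18 TW average global energy use (assumption)
AVG_SUNLIGHT_W_PER_M2 = 200      # year-round average in sunnier regions (assumption)
PANEL_EFFICIENCY = 0.20          # typical modern panel (assumption)
EARTH_LAND_M2 = 149e6 * 1e6      # ~149 million square kilometers of land

panel_area_m2 = GLOBAL_DEMAND_W / (AVG_SUNLIGHT_W_PER_M2 * PANEL_EFFICIENCY)
land_fraction = panel_area_m2 / EARTH_LAND_M2

print(f"{panel_area_m2 / 1e6:,.0f} square kilometers of panels")  # ~450,000
print(f"{land_fraction:.2%} of Earth's land")                     # ~0.30%
```

Under those assumptions the answer lands right around the 0.3 percent figure; plug in gloomier numbers and it grows, but it remains a small fraction of the planet’s land.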

Conservatives and those financed by the oil, gas, and coal industries want you to believe that moving away from fossil fuels is a bad thing. No, it’s not. At the worst, it might mean more people have solar panels on their roofs. If solar panels on your roof are what saves the planet isn’t that a reasonable trade-off?

Rethinking Plastics

We use a LOT of plastic and much of it is for very necessary things, especially in regard to medical supplies. So, completely eliminating plastics, which are traditionally made from fossil fuels, requires a strong and flexible alternative. We’ve been hearing the call to reduce our dependency on plastic for over 30 years. How did we respond? We started using it to store and sell water, causing every environmentalist on the planet to do a hard face-palm.

Plastics such as polyethylene (PE), the type used most, have a carbon footprint equivalent to burning 2 kg of oil for every 1 kg of plastic. 1 kg of plastic is roughly the weight of five plastic shopping bags. Put it all together and plastics represent the fourth-largest contributor to greenhouse gases, and that’s before we fail to recycle them and they end up being the trash that pollutes everything.
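If it helps to see that in greenhouse gas terms, here’s a minimal sketch. The 2 kg of oil per 1 kg of plastic is the figure above; the roughly 85 percent carbon content of oil is my own ballpark assumption, and 44/12 is simply the molecular weight of CO2 divided by the atomic weight of carbon.

```python
# What "burning 2 kg of oil per 1 kg of plastic" means in CO2 terms.
# The 85 percent carbon content of oil is a ballpark assumption; 44/12 is the
# molecular weight of CO2 over the atomic weight of carbon.

OIL_KG_PER_KG_PLASTIC = 2.0       # per the figure cited above
CARBON_FRACTION_OF_OIL = 0.85     # rough assumption
CO2_KG_PER_KG_CARBON = 44 / 12    # each kg of carbon burned becomes ~3.67 kg of CO2

co2_per_kg_plastic = OIL_KG_PER_KG_PLASTIC * CARBON_FRACTION_OF_OIL * CO2_KG_PER_KG_CARBON
print(round(co2_per_kg_plastic, 1), "kg of CO2 per kg of plastic")  # ~6.2
```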

The good news here is that technology is rapidly bringing us to the point where bioplastics, especially those produced from hemp, offer the possibility of at least making plastics carbon neutral, meaning they absorb as much carbon as they emit. As of this writing, there are still some parts of the production process that use oil, and the biodegradability claim is challenging to fully support, keeping bioplastics from being the perfect solution. However, the reduction in CO2 that a complete switch to bioplastics would provide is a significant boost toward halting the warming of the planet.

The bad news? Big Oil is only too happy to sponsor arguments against bioplastics claiming they’re not fully biodegradable. Biodegradability is certainly something that would help, but the far greater CO2 emissions from plastic happen during the creation process. Arguing over biodegradability is, at best, a distraction to keep any improvements from actually happening. Bioplastics are proven to reduce greenhouse gas emissions and right now that has to remain the top priority. We can worry about biodegradability when we’re sure we’re not all going to die.

Technology

The fourth Industrial Revolution is here and technology is in the driver’s seat. Already, our reliance on technology has grown 100-fold over the past ten years, but in order to save ourselves from the damage we’ve done, we have to go further than we tend to find comfortable. Right off the bat, technology is the fundamental resource that makes all other solutions possible. Still, there is more that it can do and we need to get comfortable with making the investments that utilize technology to its fullest extent.

For example, as scary as autonomous vehicles sound, they provide more fuel-efficient transportation, even in electric vehicles. Humans are not efficient when they drive. We speed, we brake wrong, we rubber-neck like crazy, and all of those bad habits result in using more energy than is necessary to get us from point A to point B.

Technology also offers the opportunity to convert CO2 into fertilizer, turn CO2 into liquid fuel, use CO2 to create hybrid membranes for medical use, and a plethora of other changes that help eliminate the use of fossil fuels and other materials that leave large amounts of carbon in the atmosphere. One of the most critical may be in developing new fabrics for clothing, eliminating the need to grow cotton, a crop that ruins the land on which it is grown.

There are plenty of options but what they all need is a dramatic level of investment to get them out of labs and into everyday use. One obvious source of investment funds would be to completely eliminate oil and gas subsidies and put those same funds toward planet-saving technologies. 

Economic Redistribution

If ever there was a strong argument for economic equality, saving the planet is it. The reasons are rather obvious.

  1. The poor suffer the most from environmental damage.
  2. Economic inequality drives environmental damage.
  3. The richest 10 percent are responsible for 50 percent of global emissions.

Equitable distribution of funds and resources allows poorer countries to invest in technologies and methods that reduce greenhouse emissions. Pollution in countries that have greater economic balance is significantly less than in countries with severe gaps between rich and poor. What’s more, as reliance on fossil fuels and their related industries has to be eliminated, people employed in those fields are more likely to experience a severe reduction in wealth as they are not necessarily skilled to transition into the most high-demand fields of employment. 

In order to combat this problem, we need to come to grips with the need for some very uncomfortable economic changes.

  1. Significantly taxing the richest one percent
  2. Significantly taxing corporations, especially those involved in industries that contribute to greenhouse gas emissions
  3. Greater public investment in global education
  4. Significantly higher wage minimums 
  5. Tighter control of housing and food costs

I can hear the screaming from here. Let’s face the facts, though. Trickle-down economics only benefits the rich. Inflation in the housing market has created a crisis. More public funds are necessary to combat warming and there’s no legitimate reason to put that burden on those who can least afford it. Personal wealth and corporate profits have to take a back seat to the sustainability of the planet.

Planetary Problems Require Global Solutions

Bird tracks in the mud

Global warming and the resulting climate change are not problems unique to the United States. Granted, we’re the largest country not making a concerted effort to find a solution, but the problem is universal. For us to avoid reaching the 1.5℃ mark or worse in ten years, every country on the planet has to participate in solutions. When the Paris Climate Agreement was signed in 2016, 197 countries, including the United States, signed on. However, not everyone has been able to sell the agreement at home and the current closed-minded administration pulled the US out altogether. We’re not alone, though. In addition to the US, there’s an interesting list of countries that have not ratified the agreement.

  • Turkey
  • Iran
  • Angola
  • Eritrea
  • Iraq
  • Kyrgyzstan
  • Lebanon
  • Libya
  • South Sudan
  • Yemen

What all of those countries share with the United States is an authoritarian leader (though not necessarily authoritarian government) whose focus is on maintaining a tight grip on the rule of their country. Such an inward “me first” focus is detrimental to addressing climate change. Leaders have to actually care about the welfare of the people they govern in order to support solutions that involve international cooperation. 

Such Nationalistic tendencies are more a reflection of the leader’s psychosis than of the nation’s true attitude. Notice that some of the world’s most infamously dictatorial leaders, including Russia’s Putin, China’s Xi, and North Korea’s Kim, all recognized the need for their countries to cooperate. For a leader to not adequately address the emergency of global warming is to demonstrate utter disregard for the people most likely to be directly affected by the crisis: the people they rule.

While most of the countries that have not ratified the agreement are small, and some, such as Kyrgyzstan, barely have a carbon footprint at all compared to other countries, the United States is the world’s second-largest contributor to CO2 emissions behind China. For the US to not address the challenge dooms not only Americans but the entire planet. We, as a country, fail to provide the most critical leadership and in doing so effectively sign the death warrants of millions of people.

Yes, I realize that sounds alarmist, but this is the reality.

About Those Consequences

Tree stump in a withering lake

Remember when the IPCC report mentioned overshoot and gave it “high confidence”? That means they don’t expect the world to be able to limit warming to 1.5℃. In fact, they go ahead and admit that some regions will see an increase of 2℃ or higher. So, what happens if we completely fail and blow right past that half-degree increase from where we are now?

It’s not pretty. If greenhouse gas emissions remain at their current level, here are some of the effects.

  • Temperature records will continue to be broken. Considering this past summer already saw the hottest months on record, it’s safe to assume severe drought patterns across places that normally do not have a problem. [source]
  • The amount of land destroyed by wildfires would more than double to approximately 5.3 million acres annually. This would continue to grow in severity, putting people at risk who have never needed to worry before. [source]
  • Severe drought across 40% of all land on the planet. Say goodbye to normal crop production. Should we stay at the status quo, rates of hunger and starvation would spike as prices for the food that remains available shoot upward. [source]
  • Reduced nutritional value of existing food would result in a food-security crisis for some 821 million people (estimating conservatively). While sources decline to predict morbidity rates, there’s little question the death toll would be considerably higher than it is now. [source]
  • If we reach 2℃ of warming above the pre-industrial level, the result is a climatological feedback loop that would cause temperatures to jump 4-5℃. There are currently no reliable models for how severe the effect could be. [source]
  • Warming water expands; seas rising 2-3 feet above current levels would displace approximately 700 million climate refugees. [source 1, source 2, source 3]
  • More frequent and more intense hurricanes. We’re talking multiple Category 5 and stronger storms with an expanded hurricane season. We’ve already seen how devastating multiple storms in a single season can be. Imagine those storms on steroids. [source]
  • 60% of all coral reefs will be highly or critically threatened. Millions of people would lose their primary food source. Whole fish groups would go extinct and disease would infect those that remain. This alone could cause global markets to completely crumble. [source]

And those, dear friends, are just the tip of the proverbial rapidly-melting iceberg. There’s no way of estimating what could happen as a result of the severe migration. The social/political unrest could topple entire governments and result in unchecked war and genocide. No country is immune from the potential fallout. Humans have never knowingly faced a greater threat to the whole planet and our very survival.

Radical Solutions Require A Radical Response

Birds gather around the little water that remains in a dying reservoir

The current US president is fond of calling those intelligent enough to acknowledge the challenge of climate change radicals. He calls their proposals radical and thinks that alone is sufficient reason for ignoring them. He’s right in that the only solutions left to us now genuinely are radical. They are upsetting to the status quo and require changing some of the most fundamental aspects of our lives. There’s no harm in admitting that the whole thing is just a little bit scary.

Where the president and his supporters are wrong, though, is in thinking that if they yell loud enough, ignore hard enough, and bully scientists long enough, it will all go away and they’ll escape the coming disaster without any consequences. They are wrong, and there’s absolutely nothing they or anyone else can do to stop the disaster if we do nothing.

Here’s the thing: all those little things like switching the kind of straw one uses and recycling their plastic water bottles and all the other little tasks one does individually provide a false sense of accomplishment. Those things only help if the larger players are doing their part. Household recycling only helps if those materials are actually being recycled through means that are environmentally helpful. Straw use only matters if material from landfills stops ending up in waterways. If the big guys aren’t in the game, individual household participation is irrelevant. There’s nothing you or I can do on our own to stop the inevitable.

That means you and I have to become radicals as well. We have to vote, starting at the local level, for city council members and mayors who support clean air initiatives in our own towns. We have to get radical in voting with our pocketbooks, paying attention to how everything we buy is made and not purchasing from companies that are not doing their best to offset their own carbon footprint. We have to get radical in pressuring our elected representatives to take governmental action, even in the face of an ignorant and incendiary president. That pressure has to come hard and continuously, and it needs to unseat anyone who doesn’t get with the plan.

When it comes to climate change, there is no such thing as being too radical. Yes, it’s going to be uncomfortable. Yes, it’s going to mean changing the way we do things. But the alternative?

We die. 

The whole planet dies.

Not kidding. Not fear-mongering. This is the reality. 

Time to get radical.

Petty Annoyances

I am a grumpy old man. I fully embrace that reality and don’t apologize for being who and what I am. I have worked hard to get here and have no intention of changing any time soon.

That being said, I have to acknowledge the fact that I did not become the curmudgeon that I am without some help from everyone else on the planet. The fact of the matter is that I am grumpy largely because everyone else behaves in such a stupid and illogical manner. Not that I expect everyone to be a Vulcan in their approach to life, but applying a bit of reason and thoughtfulness to one’s daily activities, especially where it applies to interacting with others, would certainly go a long way toward making me a slightly less unpleasant person.

My expectation is that people who interact with me in person on a regular basis are well aware of when their actions are annoying. I’m not one to hide either my feelings or my opinion. If you’re standing next to me and doing something stupid, I’m probably going to address the matter right then and there. Ask my children about this; I don’t mind embarrassing them a bit if it means they stop acting inappropriately.

People who only interact with me online, or don’t interact with me at all, are less likely to realize the degree to which I find them annoying. If it were just me that they were annoying, I wouldn’t expect anyone to actually care. If one is not directly interacting with me, and especially if they don’t even know I exist, then there’s no reason for them to have any compunction about the degree to which I find them annoying.

However, when someone is doing something that I find annoying, chances are pretty high that I’m not the only person with that opinion. Take, for example, the president of the United States, please. Almost everything he does is annoying, to say the least. There are some days when his level of annoyance becomes so great that I have little choice but to ignore him completely and, to the limited extent possible, pretend he doesn’t exist. If I were the only person on the planet who finds the president annoying, that would be on me. However, I’m far from being alone in my opinions on this sexist, bigoted, profane, lying, homophobic, con artist. The number of Americans that find him annoying is in the hundreds of millions and if we expanded that opinion globally I’ve no doubt the number would be well over a billion.

While the president might be an extreme example, given that few people are actually so continually annoying with everything they do, he does serve to highlight the problem that comes from holding ill-formed opinions and engaging in thoughtless activities that affect other people. We all tend to be a bit selfish by nature and that is, to some degree, understandable. Problems arise, though, when the effect of our actions on other people causes them discomfort. To the degree that a person fails to realize the consequences of their actions, they become exceedingly annoying.

Take former vice-president Joe Biden, for example. On a general basis, I kinda like Joe; he’s that compassionate grandfatherly figure who has his faults but nothing so big one can’t excuse him. Joe is a touchy-feely kind of guy. I get it. He is a part of that generation that was taught to connect with other people by touching them—a hand on a shoulder to convey support, holding a hand to show compassion, patting a knee to communicate that one has heard what the other is saying. There were once books and conferences that taught this method of physically relating to other people. Whether or not Biden read those books or took those courses I don’t know, but he was influenced by them as a lot of other successful people were.

Then comes the era of #MeToo, a movement long overdue, where people, especially women, are speaking up about the many things others do, especially men, that are hurtful, offensive, and annoying. High on that list: unwanted touching. Much to the surprise of some, there are a lot of people, not just women, who do not appreciate people touching them in any manner without permission. The discomfort has been present for decades but only now, emboldened by the changing social climate, are people feeling free to speak out.

So, Joe gets called out for his frequent and well-documented habit of touching people, kissing the top of heads, putting an arm around someone he doesn’t actually know, putting a hand on a shoulder and giving a squeeze. Did the vice president intend to do any harm? No, absolutely not. But like many people, especially men, he has been tone deaf to the level of annoyance his actions cause other people, especially women. Our society has finally decided that such actions, especially against women, are no longer tolerable and we’re taking the sometimes painful steps of correcting that behavior.

Not every annoyance is as critical as how and when one person touches another, of course. Most annoyances are smaller, less significant actions that we do without giving anything a second thought. Those are the activities I want to address for a moment. While they don’t have an impact that requires a separate #MeToo movement, they’re almost certainly things that bother a lot of people and no one has felt emboldened enough to say anything. I’m not especially bold, just grumpy enough to go charging on into these topics without necessarily caring if I step on a toe or two. These are actions that need to stop.

Asking guests to remove their shoes


Wearing shoes is one of those strange acts that seems to bring us joy in some moments, pain in others, and pure frustration for many. All of my children have issues wearing shoes. While the Marine doesn’t have a choice, the others are quick to shuck their footwear the moment they hit the back door. There are times I’m fairly certain my daughter has hers halfway off before she breaches the threshold.

I am the exact opposite, however, and I’ve known several others like me. We wear shoes almost all day and taking them off can, at times, be a source of extreme discomfort. So, when we visit someone who insists that all their guests doff their shoes at the front door, I’m often tempted to turn around and leave. At the very least, the request spoils my mood for the remainder of the visit.

Beyond my own physical level of discomfort, the soles of my feet being extremely sensitive to everything they touch, I’ve never understood people’s reasoning for asking guests to remove their footwear in the first place. Granted, it’s natural with children who would likely live naked until they reach a point of personal awareness where they’d prefer to hide their bodies. For adults, though, there are other issues to consider.

Let’s talk about the spread of fungal infections. I’m not just talking about Athlete’s Foot here. There are several different types of fungal disease that can spread through bare feet. Once a fungus is in a receptive environment, like a carpet, it’s not eliminated the next time one vacuums. Getting rid of some fungi is more difficult than trying to get rid of pesky mold. Making this more of a challenge is that one is not likely to know that the carpet is the source of the problem, allowing infections to recur.

Going barefoot is also a health risk for anyone who has any type of circulation issue in their lower extremities. Diabetics often have to deal with this matter. When one’s feet lose circulation they don’t feel small pains, such as the prick of a needle or wayward tack that was hiding in that deep pile carpet. Many diabetics can sustain significant foot injury and never realize that anything has happened until they see blood on the floor.

There’s also the problem of foot odor and no, it’s not always associated with poor hygiene. Certain prescription drugs can cause foot odor as a side effect. Keeping one’s shoes on helps control the smell and prevents one from being offensive.

Arguments about not wanting to put undue wear on the carpet are silly. Modern carpets are far from delicate no matter how deep and lush the pile. In fact, new carpet fibers are so heavily treated that the chemicals in the carpet can cause an allergic reaction on bare skin, another good reason to keep one’s shoes on their feet.

If one is seriously worried about guests tracking mud or other dirt onto their pristine floors, then consider providing mats and wipes one can use on their footwear rather than insisting that they remove their shoes. If someone in your family has an autoimmune disease that requires a high level of cleanliness, consider providing disposable foot covers, which are not only more welcoming but also safer for your family member.

Personal comfort is the only good reason for removing one’s shoes at any home other than one’s own and comfort is not something one can mandate to other people. If one genuinely values their guests, they’ll allow them to keep their shoes on and enjoy their visit without being annoyed by having to look at everyone’s funky toes.

Referring to your fad as a lifestyle


Bile rises into my throat and I want to vomit every time I hear someone use the phrase, “It’s not a __________, it’s a lifestyle!” No, Karen, your obsession with 30-year-old Beanie Babies is a symptom of your psychosis and you really should seek professional help. Fads are not a lifestyle. The fact that something is popular enough to consume every waking moment is not enough to make it a lifestyle. Furthermore, it’s annoying as hell for one to treat it as such.

A large number of fads can be time-consuming. Rabid fans of k-pop, for example, often go full-on into cosplay and merchandise hoarding and fiscal irresponsibility in the name of their fandom. That does not make k-pop itself a lifestyle; it simply means that desperate people are so out of touch with their identity that they feel compelled to latch onto something larger. Strict religious adherents suffer from the same malady.

Some fads can also be healthy in moderation. Obese people, of which the United States has an excessive supply, can often find at least short-term benefits in certain fad diets. However, those benefits are often short-lived and may also lead to additional unexpected health issues. Mythologies around nutrition are unsound and frequently dangerous. We need balance for our bodies to function at their optimum capacity, and fads, by their very nature, pull one away from moderation in any form.

When someone refers to a fad or a movement as a lifestyle it demonstrates a lack of understanding as to all that a lifestyle encompasses. One can be dedicated to something that is not a lifestyle. One can benefit from things that are not lifestyles. Lifestyles, however, are multifaceted and often complicated matters that typically involve large groups of people.

Before one goes running to an online dictionary in an attempt to prove me wrong, not that anyone would ever do that, let’s look at one of the most complete definitions I’ve found, oddly enough in the Business Dictionary.

Lifestyle: a way of living of individuals, families (households), and societies, which they manifest in coping with their physical, psychological, social, and economic environments on a day-to-day basis. Lifestyle is expressed in both work and leisure behavior patterns and (on an individual basis) in activities, attitudes, interests, opinions, values, and allocation of income. It also reflects people’s self-image or self-concept; the way they see themselves and believe they are seen by the others. Lifestyle is a composite of motivations, needs, and wants and is influenced by factors such as culture, family, reference groups, and social class.

Lifestyles are the combination of many elements, not the obsession over just one. Attitudes can be part of a lifestyle but are not a lifestyle unto themselves. Activities are often associated with specific lifestyles but are not lifestyles on their own.

When we think of what constitutes a lifestyle, we need to think in larger terms than one specific element such as a diet or a fashion choice. Being urban, or rural, or country can be lifestyles because they involve not only a specific attitude but also activities, socio-economic settings, employment opportunities, and moral outlook. Luxury is a lifestyle that many people try to mimic but only a few attain because of the economic requirements for that lifestyle. Tribal lifestyles incorporate the whole reality of existence within a limited group of people who share a common ancestry and culture.

Compare those examples to frequent misuses of the term and one can see how a diet is not a lifestyle, a hobby is not a lifestyle, and sexual proclivities do not constitute a lifestyle. Lifestyles are broad, complex, and, perhaps most importantly, involve an economic factor that limits or helps define participation in that lifestyle. Calling one’s multi-level marketing scam a lifestyle in an effort to elevate its importance is essentially lying and definitely part of the con job inherent to such schemes. Don’t try to make more of your interests than they deserve. Enjoy what you do, but don’t overinflate its value.

Failing to vaccinate your family


To some degree, I shouldn’t need to include vaccinations in this list. I argued with myself about whether it was necessary. After all, I’ve expressed my opinion on the matter previously, most extensively in the article 10 Horrible Deaths Awaiting Offspring of Anti-Vaxxers. I include it again because there are perhaps some reading this time who didn’t see the previous mentions and because, quite frustratingly, we’re seeing an increase in the occurrence of diseases that had all but been eliminated.

Let me be extremely clear: anyone who does not vaccinate their children is a goddamned fool and don’t expect me or any other reasonable person to backtrack from that opinion. What has changed is that I’m far from being the only one annoyed with this situation and as cases of measles and mumps have begun cropping up across the nation, city health departments and governments are letting anti-vaxxers know that they’ve had enough and are no longer welcome. In fact, while the Constitutionality of such acts is still questionable, more cities are attempting to make it illegal for non-vaccinated persons to be out in public spaces where they run the risk of infecting everyone else.

Scream and shout about personal freedom all you want, but public health has to come first and quite honestly we’re done with your children infecting the rest of us. No, we don’t want our kids suffering from your idiocy. No, the vaccinations absolutely, positively, without question DO NOT cause autism.

We are at that point where I support the public shaming of people who do not vaccinate their children. This is no longer a choice anyone should have. We let it go for too long and now we’re seeing a resurgence of measles and other childhood diseases that should be going the way of the Dodo. On any other level, hurting other people is a crime. When you don’t vaccinate your children you’re hurting other people. Period. That’s a crime by any standard of morality ever conceived.

Understand, I do not come to this position easily. My preference and that of most people is to allow plenty of leeway for others to hold and express beliefs that are different from the mainstream. If you want to believe that having crystals in your house brings you good luck, fine, run with that. If you want to believe that essential oils do something beyond making everything and everyone greasy, cool, be greasy.

What I need to get through the thick head of every anti-vaccination person in the world is that YOU’RE HURTING OTHER PEOPLE AND IT’S NOT OKAY. At the point that your belief system, no matter what it is, begins doing harm to other people, it needs to go away. Permanently.

Denying established and proven matters of science


Following closely on the heels of people who refuse to vaccinate their family are people who deny established and proven matters of science. Again, this has always been one of those areas where people like me look at science deniers, roll their eyes, and go on. For the most part, science deniers are harmless as long as they don’t breed too often.  

Then, we elected an idiot for a president who can barely say the word science correctly and has absolutely no understanding of anything going on in the field. First, he attempts to deny climate change. Then, he questions the efficacy of the Food and Drug Administration and the Centers for Disease Control and Prevention. Now, just in the past week, this monkey made the unbelievable statement that the noise from windmills causes cancer. Every science authority on the planet looked up and collectively asked, “Are you fucking crazy?”

The answer is yes, he is.

Once again, the problem with science deniers, and especially the problem with having them in places of authority, is that they’re beginning to make decisions that ultimately hurt the entire planet. From increasing the threshold for carbon dioxide emissions to reducing funding for alternative energy sources, science deniers are bringing a level of devastation on this planet that we’ve likely not seen in the past 100,000 years, or at least since the last ice age, which ended approximately 11,700 years ago at the close of the Pleistocene epoch. This is not a good thing. This goes so far beyond annoying as to be ridiculous.

Fundamental to this problem is that a lot of people don’t seem to understand how science works. So, let this chart explain it to you.

[Chart: how the scientific method works]

The scientific method is involved and meticulous, and on many topics it becomes extremely complex. What’s important, though, is that on the most critical matters the findings of a study are not considered legitimate until they’ve been peer-reviewed and, ideally, the experiments or procedures have been repeated with the same results. Scientists are smart enough to know strange things can happen during a single study or experiment. Anomalies might appear that skew the results. So, having someone else look over the data and repeat the experiment is critical to proving the hypothesis.

At the end of the scientific method, however, once there is agreement on a large scale as the result of multiple repetitions of the same process, those results are considered fact. To disprove those results, one would need an equal number of equally rigorous studies, following the same scientific principles, producing a different result. One cannot simply formulate a hypothesis and claim it is fact. The hypothesis has to be tested and re-tested or else it is worthless.

When we deny science that says we are damaging our planet, we are not only endangering ourselves but every person who might live at some point in the future. Science matters on levels that are often difficult to explain yet remain absolutely critical to the very survival of our species. When scientists tell us we need bees and butterflies to survive, they’re not just saying that because they like bees and butterflies. FOOD STOPS GROWING without bees and butterflies, and since we need food to survive, it becomes rather critical that we pay attention to that warning.

Furthermore, for anyone who is not a scientist to question the findings of scientists is like asking a five-year-old to inspect a Boeing 737 Max 8. Amateurs are not qualified to question the results of a thoroughly vetted scientific finding, even when that finding doesn’t align with their worldview. The opinion of a random naysayer does absolutely nothing to prevent disaster the next time a 737 takes off full of people.

We are well past the point where people need to understand that while science may not always be exact, and our understanding of certain “givens” may change from time to time, we still have to trust those findings unless we are holding empirical and indisputable findings that prove otherwise. Climate change is real. Windmills do not cause cancer, but coal ash does. And yes, if the bees and butterflies disappear, we are going to starve. To believe anything else is foolish.

Claiming public funds are “my money”


One of the most misleading things politicians have ever done is give average citizens the impression that government funds are still their money. The whole concept that “you’re paying for that wall” is flawed and lacks logic. Therefore, to complain that one doesn’t want “my money” going toward a certain event or program or cause is dramatically ignorant and grossly demonstrates one’s inability to grasp basic economic principles.

The example that I think explains the situation best is this:

Let’s say one walks into a store and drops a quarter into one of those bubble gum machines and receives a gumball. One puts in the required currency in the appropriate form and receives that for which they paid. End of transaction, right? One doesn’t get to object to the color of the gumball they receive. One doesn’t get to change the flavor of the gumballs in that machine. The fact that one has purchased their own gumball doesn’t give them the right to deny gumballs to others or to require increased payment for other people to get their own gumball. Put in a quarter, get a gumball, leave. If one wants different results one has to put in another quarter.

Do you understand the parallels? Relative to the size of the entire federal budget, what any one of us pays in taxes amounts to about as much as that quarter, even less once you factor in borrowing to cover the largest deficit ever. We put our quarter in, we get government back. Our taxes don’t change the flavor of the government; to do that one has to vote, making oneself a minority shareholder in the company that makes the gumballs. Our taxes don’t allow us to dictate who gets what; that’s the responsibility of Congress. Our taxes don’t give us the right to deny access to government to anyone. The Fourteenth Amendment to the Constitution guarantees everyone equal protection, and with it equal access to the government and everything the government does.
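To put a rough number on just how small that individual “quarter” really is, here’s a back-of-the-envelope sketch in Python. Both figures are assumptions chosen purely for illustration, a household federal tax bill of about $10,000 and a federal budget of roughly $4.4 trillion, not data from any particular year.

# A rough sketch; every figure below is an assumption for illustration only.
household_federal_tax = 10_000            # assumed: roughly $10,000 paid by one household
federal_budget = 4_400_000_000_000        # assumed: roughly $4.4 trillion in total spending

share = household_federal_tax / federal_budget
print(f"One household's share of the budget: {share:.10f}")   # about 0.0000000023
print(f"As a percentage: {share * 100:.7f}%")                  # about 0.0000002%

In other words, one taxpayer’s “quarter” amounts to a couple of billionths of the machine; it buys a gumball, not a vote on the recipe.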

When we pay our taxes, whether through payroll deduction or by writing a check to the IRS every April 15, that money is no longer ours, just as the quarter we put in the gumball machine is no longer our quarter. That quarter now belongs to the person who owns the gumball machine, right? It stopped being your quarter the moment you turned the handle. The same applies to our tax dollars. Once we send them to the IRS, they no longer belong to us, not even in the existential sense that we are the government. We are not the part of the government that makes financial decisions. The only means through which we get a say in how those taxes are spent is by electing people to Congress and to the presidency who hold the same values we do.

Extending this metaphor further, saying that “my money” is paying for anything, for convenience let’s say road repair, is like saying that your quarter paid for all the corn syrup used to make gumballs. No, your quarter didn’t pay for shit because it stopped being your quarter when you turned the handle on the gumball machine! The company’s money paid for that corn syrup and the federal government is paying for those road repairs. That’s not your money. You don’t get to decide what brand of corn syrup the gumball company uses any more than you get to decide which potholes get filled on any given Tuesday.

Yes, that means there are times we might not like how the government chooses to spend its money. There are also times we get flavors of gumballs we don’t like. Personally, I hate licorice, and it would frustrate me as a child to put my money in the machine and see that horrible black gumball come rolling out. This wasn’t what I wanted! However, the fact that I don’t like licorice gumballs does not give me the right to insist that the gumball company stop making them. The company has a responsibility to make flavors of gumball for everyone, not just me. Since a significant number of people like licorice, the company needs to make licorice gumballs. Same with the government. Its job and purpose is not to cater to the whims of your desires but to do what is best for everyone in the country, and it doesn’t really matter whether you like it, want it, or use it.

Now, if we suspect that the gumball company is lacing the gumballs with arsenic, then we have the right and responsibility to do something through whatever legal remedies might be appropriate. The same applies to the government. When Congress chooses to spend in ways contrary to the best interest of the nation, we have legal recourse: vote. Elect someone else who will make different choices. Even then, however, we need to realize that our elected representatives are more like proxy voters in a stockholders’ meeting. While our proxy might vote for one program and against another, the rest of the “board,” the other stockholders’ proxies, can overrule and outvote ours. That’s the way a representational democracy works.

The concept that we have any right to take any level of ownership over specific parts of the federal budget is ridiculous and was invented by conservative politicians in an attempt to create discord and dissatisfaction so as to influence the outcome of the elections. Unfortunately, there are enough people who bought that ridiculous notion to allow that plan to work.

Stop. The government, at any level, is not using “your money” on potholes or anything else. Get that stupid notion out of your head and we’ll all lead more peaceful lives.



I could go on, of course. There are plenty of things that annoy me, such as people using unnecessary abbreviations or neighborhood associations. What I’ve learned over the years is that dumping a full load of complaints in one sitting is counterproductive. So, let’s work on this small group first and when we’ve fixed those we’ll move on to something else.

Pizza Toppings, Neanderthals, And The Ability To Chill

Gravity is the root of lightness; stillness, the ruler of movement.
Tao te Ching, 26

Neanderthals would probably not have liked pineapple on their pizza, and that’s why they’re extinct. Sort of. Not exactly, mind you. Neanderthals did presumably occupy parts of what is now Italy, so I suppose it’s always possible they had something remotely like pizza, though it wouldn’t have had a crust and they certainly didn’t have access to pineapples. This is where middle-of-the-night extrapolation takes us. There’s no way to actually prove whether Neanderthals would have enjoyed pineapple even if someone had dumped a load of the fruit in the middle of their colony. Brain exercises are fun to a point, but beyond that point, they just hurt.

I cannot speak with complete accuracy on anything that occurred outside my lifetime. I can presume, assume, and extrapolate based on scientific evidence, but to some degree the fuller we try to make our minds, the emptier they become. At the least, we become more aware of the expanse of that emptiness. We might start the day not knowing what we don’t know, but the more enlightened we become, the more we realize the vastness of our ignorance. Realizing how much we don’t know is frightening for some, but as we grow we also begin to discern what matters and what doesn’t. Not to know the things that are fundamental is truly disturbing. Not to know matters of trivia merely makes one a bad partner for game night at the pub.

Where we run into problems is when what appears to be important is actually trivia and what we consider trivia becomes important. I might have had one of those moments this week, one that had me double-checking my medication to see if someone had slipped some acid into one of the bottles. I’ve always considered that, within the realm of paleontology and other studies of ancient things and beings, the study of Neanderthals was relatively trivial. Sure, it’s kind of fun to know about this long-extinct species, and maybe there are some lessons to be learned from understanding why they became extinct, but Neanderthals died out some 40,000 years ago. About the only things that haven’t changed over that expanse of time are cockroaches. They’ll still find a way into even the most well-built structure. I have always considered Neanderthal studies something of use only on extreme nerd trivia night.

I should have known I was wrong, of course. There wouldn’t be millions of dollars spent on paleontological studies if it were nothing more than nerd trivia, would there? Well, maybe, but as it turns out all those scientists keep finding more connections between Neanderthals and modern life. Evolution and the law of natural selection cause things to change and adapt, but for all that change, pieces of that ancient history still exist and are still a part of who we are today. That little bit of knowledge once again reveals just how much we do not know about ourselves and how the habits of our ancestors continue to affect us.

Of course, there are wise words from the Tao te Ching that we might apply here. Part 27 says:

In the same way the sage is always skilful at saving men, and so he does not cast away any man; he is always skilful at saving things, and so he does not cast away anything. This is called ‘Hiding the light of his procedure.’

Be careful, dear dudes, the Tao is not suggesting that we all become hoarders.  Garbage is still garbage and it has its place but that place is not in your home. Please.

What comes to bear from those wise words, though, is the admonition that no knowledge should be thrown away. Information that seems trivial to us at first may later become the invaluable missing piece to a puzzle we’ve been trying to complete. So, we don’t throw away research on Neanderthals because, as it turns out, not only does that research explain why some like pineapple on their pizza and others don’t, it also helps explain why some have more difficulty being chill and abiding than others.

The sciencey stuff is about to get deep in here, dudes, so try to hang with me as best you can. Before the new stuff can make any sense, we have to understand what was already known. Don’t worry, I’ll keep this part as short as possible. Follow the links if you feel like chasing the rabbits.

First, understand that humans (Homo sapiens) and Neanderthals are two very different species, but they do share a common ancestor. For reasons we’ve yet to understand, the Neanderthal line ventured out of Africa and roamed across most of Eurasia. They evolved differently from humans, with differences in everything from what they ate to the color and amount of hair covering their bodies. Previous studies show that the presence of Neanderthal DNA accounts for things like one’s ability to clot blood more easily, immunity from certain diseases, and also some addictive tendencies and susceptibility to depression. We also already know that unless one’s ancestry is 100% African, one is not a “pure” human. For non-African people, roughly 2-4% of your DNA is from Neanderthals. If you’re really worried as to whether you have Neanderthal DNA, now might be a good time to indulge in some genetic testing. The more non-African your ancestry, the more likely some remnant of Neanderthal DNA exists. Interesting, isn’t it?

With that information already in our back pocket, along come a couple of new studies that I had the joy of exploring this week. I have to admit, the academic language of these things makes my head hurt after a couple of hours. What they ultimately say, though, is worth the mental endurance course to get to the conclusions. As more genomes are thoroughly mapped, the information we are discovering about why we are the way we are is invigorating to the point it almost has me squirming in my chair. Or maybe that’s the coffee. I’m never quite sure.

photo credit: charles i. letbetter

The First Study

First up comes the study of a bone fragment labeled Vindija 19, 23. The bone fragment was found in a cave in Croatia and comes from a female Neanderthal who lived 52,000 years ago. This is important because the best previously mapped Neanderthal DNA came from an individual who died about 122,000 years ago in the Altai Mountains of Siberia. The similarities between the two are striking. Consider that the fragments are separated by nearly 4,000 miles and some 70,000 years. Under normal evolutionary expectations, there should have been a shit ton of differences between the two. Nope. Barely any difference at all.

What does that tell us? For starters, that Neanderthals lived in very small communities or tribes with almost zero genetic diversity. Without that diversity, one bad flu season wipes out an entire tribe. While the Vindija woman’s parents were not as closely related as the Altai individual’s parents, they were still very, very close. These were communities that did not venture far so long as there was sufficient food to be found. They would have been extremely conservative, relied heavily on tradition, and had a culture based largely on fear of the unknown, and almost everything in their world represented an unknown. That exclusivity and refusal to trust anything, or anyone, from outside their community was likely a significant contributor to the extinction of the species.

The Vindija fragment also opens some doors to our understanding of how that remaining Neanderthal DNA might still influence modern humans. The research shows that Neanderthal DNA makes us more susceptible to rheumatoid arthritis, schizophrenia, and eating disorders, as well as influencing things like how we respond to psychotropic drugs (whether or not you can handle the shrooms, man), our retention of “bad” cholesterol, and even that beer gut that never seems to go away despite the fact we don’t really drink that much beer anymore. Mind you, that doesn’t mean you can blame any of those issues on your Neanderthal DNA. The DNA is merely an influencer, not a cause. You are still responsible for your own problems, dude.

At this point, I look at myself and think I’ve got to be pegging the high end of that four percent Neanderthal DNA. That would make an easy explanation for so many of my problems, even though, again, that DNA cannot be blamed for anything more than contributing to one’s natural susceptibility to certain issues. Still, dude, there’s no way I’m all human.

What I find interesting in this study is that those traits still exist despite the fact that Neanderthals have been extinct longer than we’ve been able to record our own history. That their DNA would still have any influence at all shows both resilience and longevity. If the DNA of a species that has been non-existent for 40 millennia still influences a completely different and highly evolved species, what influences might we pass on to whatever species follows us several thousand years from now? Not that the question is necessarily one to cause worry, mind you, but it is something to ponder.

Then Came Study #2

Next up comes a study that compares the DNA of the Altai Neanderthal (the 122,000-year-old sample) to 100,000 living people whose DNA is part of the UK Biobank. The results of this study were similar to those of the Vindija study in that they show a correlation, not causation, between Neanderthal DNA and any number of psychological and neurological issues, especially an inclination toward depression. That makes sense if we stop and think about it. If Neanderthals lived in small, isolated communities that were largely inbred and not very adventurous, they didn’t have a great deal going for them to make them happy. I’m not even sure that shagging, which they seemed to do in abundance, made them happy. Rather, it was more a matter of fulfilling a primal need. As that trait persisted for thousands of generations, it became a part of their core identity, one that continues to be handed down long after they shagged a bunch of humans who happened to be wandering through.

Oh, and I found this interesting: Neanderthal DNA influences whether one is more likely to be a night owl. We can only speculate why that might be the case. Were they night hunters? We can create scenarios where that makes sense. Sneaking up on prey in the dark would give them some advantage. Or perhaps they stayed up all night to prevent other predators from sneaking up on them. Either way, it’s mind-boggling that such traits could still influence our biorhythms and our internal clocks today.

The third aspect of this study also matched the Vindija study in showing a link to addictive behaviors. This study zeroed in on smoking for some reason, though there’s absolutely no evidence that Neanderthals even had access to tobacco. The smoking addiction was prevalent in the modern humans, though, and the genomic inference is that the presence of addictive traits in one’s DNA facilitates an inclination toward such behavior and the difficulty some have in quitting.

100,000 people is a huge sample, so the findings seem rather solid. Perhaps most striking, though, is the influence of DNA on skin tone and hair color. Now, we have to be careful here because, again, there’s a lot of different evolution at play, but what the research shows is a link between variations of Neanderthal DNA and one’s blondness and fair skin or dark hair and darker complexion. What that means is that the blond-haired, blue-eyed ideal of Hitler’s “master race” depended in part on being less human and part Neanderthal. Feel free to remind white supremacists of that the next time they open their dirty mouths.

Oh, and this same research also confirms that there were, in all probability, zero red-headed Neanderthals. None. They didn’t exist. Which means we still don’t have a clue where those gingers came from. They may be even less human than the rest of us. I’m kidding, of course. Maybe.

But Wait, There’s More!

As if all that weren’t fun enough, a third study published this week looks at early humans from somewhere around 34,000 years ago. This is important for our current discussion because of how close in time these early humans were to the only recently extinct Neanderthals. The DNA studied comes from three individuals, an adult male and two children, as well as from a hollowed-out thigh bone buried with the children. The two children were buried head-to-head in the same grave, and the adult was buried nearby. From the proximity of the graves, by modern standards, one could assume that the three were related, but full DNA mapping revealed that not to be the case. Not even the children were related to each other.

Not only were none of the people tested related, they were much more genetically diverse than one might have expected. Now, this is a bit difficult to wrap one’s head around, but to produce individuals as diverse as those sampled, at least 300 genetically diverse people would have had to have been having sex with one another! Stop and just try to think about that for a moment. 300 people, over time, of course, coming from all over the freakin’ place, ultimately responsible for the three people in these graves.

What does this mean? What lessons might the sex habits of our earliest human ancestors have for us? Dude, this is getting heavy. You might want to take a moment to roll one and take a puff or two.

According to the Danish researchers involved in this study, this discovery supports the hypothesis that ancient humans had incredibly complex social structures involving mate swapping within small groups that were part of a larger social network, a network that exchanged ideas as well as sex. What we would now consider the nuclear family, two adults and their children, had yet to form. Instead, within the larger social network of what we might consider a decent-sized town, smaller groups cohabited regardless of whether they were directly related, with groups of adults overseeing the development of a group of children who might or might not have been their own. Such diversity and social grouping are likely a primary reason why humans survived where Neanderthals didn’t.

In short, being horny little bastards has its advantages when it comes to the perpetuation of the species.

There’s more than that present in this study, though. We also see that these early humans were the product of extreme migration, having come up from the African continent by way of Persia (specifically Iran) through what we now recognize as Greece, Hungary, France, Malta, and Romania before settling in the western part of what is now Russia. There were many more places involved, of course, but that gives you a general idea of how wide-ranging the diversity of these people was. At each step along the way, they encountered other communities of humans and enjoyed procreational activities within those communities before traveling on. As all these various humans traveled hither and yon, they scattered their genetic material far and wide, creating a diversity throughout the entire species that weeded out weaknesses and enhanced shared strengths.

We also see in this research a heightened sense of adventure. These early humans would almost certainly have put pineapple on their pizza at least once. In fact, they were likely down for trying just about anything someone put in front of them. Of course, there was the designated tester, because no one likes food poisoning and back then almost any illness would have been fatal. Still, one of the distinctions between these early humans and their extinct Neanderthal counterparts was their willingness to travel to new places and try new things. Had these habits, especially the enlarged social networks, not been in place, humans, as a species, might not have survived, either.

photo credit: charles i. letbetter

What We Learn From Neanderthals And Humans

So, we’ve done all this reading and studying and research and have acquired all this knowledge. Now, the question in front of us is what wisdom do we obtain from all this frantic activity of ours? Knowledge for knowledge’s sake can be fun but is ultimately useless if we don’t have any place to put it. Remember how you first felt when some teacher introduced you to calculus? That pondering of “why in the world do I need to know this stuff?” is exactly the question we have to ask now.

I find in all this research at least three takeaways which we can apply to our attempts to abide in peace.

  1. What we, collectively, do now impacts humanity at a genetic level longer than our minds can possibly imagine.
  2. Being fearful and conservative dooms us; being adventurous and liberal helps us thrive.
  3. Diversity strengthens us and is critical to our survival.

Taking those three truths, let us consider how we might apply them to our lives. Of course, this is just, like, my opinion, man. You are free to draw whatever conclusions you think you can support based on the evidence available.

Chill begets chill.

Traits repeated from generation to generation eventually become encoded into our DNA which is then passed down practically forever. I mean, if we still have the DNA of Neanderthals causing us problems and making us susceptible to some pretty serious problems, imagine the impact if we take it the other direction and pass down a genetic influence that makes it easier to abide! Human DNA does not change quickly but once a change occurs it takes a lot to unseat it.

This creates in us a responsibility to teach our children to chill. I gotta be honest, I’m not sure exactly how to do that. As I’m writing, the two little heathens who co-occupy this house have been running back and forth screaming at each other, as siblings often do. They’ve both been diagnosed with ADHD, and one is on the autism spectrum. We’ve yet to figure out how to get them to sit still for more than five seconds, let alone totally chill. There’s also that pesky matter of free will, which dictates that one has to choose to chill. Chill cannot be forced upon anyone.

Therefore, the best we can do is set an example that our children and others want to emulate, to show that abiding is a better alternative to all this running around like the proverbial chicken with its head cut off. They must see us abiding in the face of ridiculousness and chaos. We can teach them the words of the Tao but it is how we live, not what we read, that changes humanity at a genetic level.

One thing of which I am convinced is that, as the Jesus told his dudes, the chill shall inherit the earth. The survival of our species depends on those of us who do not see war as an option and who understand the value of shrugging our shoulders and walking away. We do this, we teach our children and grandchildren to do this, and we save humanity, at least for a while longer.

Bowl with the other hand.

Survival of the species depends on us moving around, not staying in one place, and having large social networks. It also depends on us trying new and different things. We tend to think that everything edible has already been found, but the fact is that overpopulation of the planet puts us in a position where we need to find new sources of protein and other foods that can be generated to meet heavy demand. Somewhere along the line, that’s going to mean putting something on our pizza that is more foreign than pineapple. Not everything will work, mind you, and we don’t have to keep doing things that don’t work for us. What is important is that we try.

This also means not cutting ourselves off from those who are different from us, whatever that might mean. The fact that many of us still carry Neanderthal DNA is evidence that, on more than a few occasions, Neanderthals and humans did some interspecies banging. Yes, I’m talking about coitus, dude. Billions of us, Caucasian, Indian, Asian, Persian, and every blend that is not purely African, have some Neanderthal DNA lingering. We exist because humans dared to leave the African continent and hook up with someone fundamentally different from them. When we cut ourselves off from other people, regardless of the reason, we weaken the species.

Our longevity as humans depends on us doing things differently. For example, I’m a right-handed bowler. I’m pretty much a right-handed everything. A lot of people are the same way. One advantage that many left-handed people have, though, is that they tend to be more ambidextrous than those of us who have a dominant right hand. That means they can adapt to certain situations more easily. So, what happens when we, say, bowl every fourth game or so with our “weak” hand? Over time, that hand becomes stronger, more able to not only bowl but handle other tasks as well. We become more adaptable and better capable of handling adversity. The better we adapt, the longer our species survives.

Hook it on up, dude

Obviously, we have to be a bit careful in drawing correlations between the sex lives of early humans and our sex lives today, but they had one thing right: diversity is good. Contemporary humans have this silly notion, perpetuated by our mythologies, that we should only hook up with people who are like us. Yet, had our ancestors taken that approach, chances are pretty fucking high that you and I wouldn’t be here today.

This gets a bit tricky. On one hand, do we really need to add another soul to this over-crowded planet? We’ve already reached a level where the planet cannot continue to sustain life in its current form. All biological and environmental markers indicate that we’re headed toward a mass extinction event the likes of which we’ve not seen in at least 20,000 years. We’re already killing off other species at the rate of about 200 per day. So, the justification of making more people, no matter how noble the cause, is a bit strained.

We also have to consider that pesky problem of sexually transmitted disease, especially if we start getting really diverse and numerous in our selection of sexual partners. One of the children in the third study had malformed hips and legs. At first, the thought was that inbreeding might have led to such a deformity, but with the proof of extreme genetic diversity, it seems more likely that the child was a victim of a parent’s STD.

Still, if any form of humanity is going to survive, that survival requires a more diverse attitude toward procreation than humans have held for the past two thousand years or so. By diverse, I don’t mean going out and banging the first person of color one finds, either. There’s not a great deal of genetic diversity between a black person in Virginia and an Asian person in Maryland. Getting genetic diversity means someone is going to have to do some traveling. We’re talking intercontinental hookups, dude. As with the humans in the third study, those who survive into the next hundred thousand years need genetic markers in their DNA from all over the world, not from a 10-square-mile patch of white-bread Iowa.

Concluding Thoughts

I always reach this point wondering whether I’ve sufficiently explained the key complexities of the research brought to bear. There really is a great deal of information and extrapolation to be derived from everything I’ve tried to consume this week, and I worry that I’ve done little more than confuse you. If that is the case, I deeply and sincerely apologize. There’s also the chance that I’ve misinterpreted everything. If that’s the case, I’ll apologize for that as well.

What seems obvious to me, though, is that now, more than ever, humanity needs more people who understand how to abide. Long-term survival is not easy and if we think we have it all figured out we’re kidding ourselves. How many times does the Tao teach us that the more we know the more ignorant we become? Even now, I’m sitting here with thousands of questions in my head that all the research may never answer.

Survival of the species hinges on backing away from violence, racism, and conservatism and embracing the ability to abide: to take no offense at the casual actions of others that do us no harm, to share the good stuff when we have it, and to abide in peace with one another all over the world. You and I are participants in that survival. There’s no magic, no mantra, merely abiding.

May many others join us.

Abide in Peace,
The Old Man



end of the world

A Christian researcher has blended theories about Planet X and the eclipse and decided that the world ends Saturday

Source: This Saturday, an invisible rogue planet will bring about the Rapture . . . maybe – Salon.com

There are fools everywhere, and there is always someone pushing for the end of the world because they think it will get them to heaven or nirvana or whatever other blissful paradise all the sooner. This has been the case ever since ancient Hebrew literature introduced its end-times prophecies somewhere around 600 BCE. For over 2,500 years, gullible and desperate people have been falling for the latest declaration that everything is about to come to an end.

And they’ve been wrong every time. We’re still here. We’re still doing well, thank you. No big deal. I mean, like, whatever, man.

This latest prognostication doesn’t set a date for the end of the world but rather for the “rapture,” an event pre-millennialist Christians believe predates the second coming of Jesus and the eventual end of the world. There’s a thing about a 1,000-year buffer in there, but end-times believers have never been able to agree on that part. The rapture has long been a part of fundamentalist theology, though it lay mostly dormant until the revivalist era at the end of the 19th century. That movement eventually evolved into the televangelists we see now, with their dire warnings that we’re all doomed to hell if we don’t repent and send them five dollars a month.

Rapture theology even captured a spot in religious fiction. The Left Behind series, authored by the late Tim LaHaye and Jerry Jenkins, spent an amazing amount of time at the top of religious best-seller lists. The concept that, without warning, all of the Christians are suddenly snatched off the planet opens the door to some wild and imaginative consequences, depending on who those Christians are.

Oh, and don’t forget that line in the Bible that says, “The dead in Christ shall rise first.” Apparently St. Paul saw the zombie apocalypse coming before anyone else did.

Were this latest prediction to be true, and it isn’t, the earth’s encounter with an invisible planet that doesn’t exist is what sets off the Rapture. I guess Jesus is hiding behind this invisible planet, just waiting to jump out and shout, “GOTCHA!” Then, he’ll grab a bunch of people and take off.

Note: there are a lot of Fox News watchers who are falling for this invisible planet idea. Of course, it has been thoroughly debunked, but that doesn’t seem to matter for all these people with a death wish. Still, just for the sake of argument, hear the dude from NASA talk a minute.

Isn’t science amazing? We don’t have to just trust blind speculation that someone comes up with to get attention. There are these wonderful things called facts and we can rely on those in place of mythologies and fairy tales.

Even if all the “eligible” Christians did suddenly disappear, and you’ll hear them argue over who is and isn’t eligible, one has to ask whether anyone would actually notice. After all, over a million people “disappear” each year, and how often do you hear anything about that on the news? Children, especially, seem to vanish at an alarming rate. By some counts, a child goes missing every 40 seconds. That’s a lot of kids! One would think that someone would notice, that the world might get just a little upset by that.
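Just to make “a lot of kids” concrete, here’s the quick arithmetic behind that every-40-seconds figure, sketched in Python.

# Quick arithmetic on the "one child every 40 seconds" figure cited above.
seconds_per_day = 24 * 60 * 60            # 86,400 seconds in a day
interval = 40                             # one disappearance every 40 seconds

per_day = seconds_per_day / interval      # 2,160 per day
per_year = per_day * 365                  # roughly 788,000 per year

print(f"About {per_day:,.0f} children per day, or {per_year:,.0f} per year.")

That figure for children alone sits comfortably inside the “over a million people” total for disappearances overall.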

We don’t, though, do we? Sure, you would get upset if it were your kid. You’d call the cops, put out an Amber Alert, post pictures all over the place. But would the child’s disappearance disrupt world order in any way? Probably not.

So, maybe this whole Rapture thing isn’t that big a deal even if it does happen, which it won’t. My late father, the Southern Baptist preacher, always speculated that if the Rapture did happen, a lot fewer people would be taken than anticipated. “Not everyone who says ‘Lord, Lord,’ shall enter the kingdom of heaven.” Imagine how embarrassing it could be on Sunday morning if the Rapture happens on Saturday and the crowd at church is still the same size and the preacher is still there. Yeah, Joel Osteen, I’m looking at you. Awkward.

These predictions come and go with such frequency now that I’m shocked anyone pays any attention. Had the Salon article not popped up in my newsfeed while I was waiting for the Armani show this morning, I wouldn’t have noticed at all.

I’m looking at Saturday’s schedule. I have Blumarine, Missoni, and Ferragamo to cover. I’m not expecting any interruptions. They’ll walk, I’ll write, and then everyone will go eat. No big deal.

R.E.M. covered this topic back in the late ’80s, you know. “It’s the end of the world as we know it, and I feel fine.”

While the world may not be ready to end, this article certainly is. Armani is calling. Here, watch a video.

Abide in peace, dudes.
-the Old Man