The Neurocycle: Day 19 of 63

Rupicapra rupicapra

When you hear the word chamois (which I pronounce “shammy” without a clue as to how correct that is), what do you think of? A leatherlike absorbent cloth that dries your car whilst you wring it out over and over? A pair of padded bicycle shorts necessary for comfort on long-distance rides? A goat-antelope?

Originally the word had the last of those three meanings, though over time the other two came to apply as well. I had no idea. A chamois is actually a goat-like antelope, or maybe an antelope-like goat, I’m not sure which. Chamois are small: the largest stand maybe 28 inches at the shoulder and weigh around 130 pounds. They live in the mountain ranges of southern Europe, from the Basque country in the west all the way to Turkey in the east. Apparently they are neither overly abundant nor endangered through most of their range.

Chamois leather is prized for being soft and supple. Consequently, humans have used it for a number of products over the centuries…everything from gloves to drying towels to the crotch padding of bicycle shorts has been made of chamois leather…so much so that the item itself is now often called a “chamois” or “shammy” many decades after it stopped being made from the creature’s hide.

So the next time you suit up for a long bicycle ride with a pair of chamois shorts or pants, remember the humble goat-antelope from Europe who originally made such things as padded shorts possible. Who knows, someday fossil fuels will no longer rule our world, there will be shortages of petrochemicals, and chamois shorts will once again be truly chamois.

Here endeth the lesson on the chamois.

The Neurocycle: Day 18 of 63

The Color Blue

Most people today can describe the color blue (it happens to be my favorite) and have incorporated it into their daily expressions: a blue sky, the deep blue sea, blue jeans, etc. So it came as a surprise to me when I read that many ancient cultures had no separate word for blue (as evidenced in the works of Homer and others), and indeed some tribes today think of blue as just one of the many shades of green.

Now, we know the color blue exists as a physical fact. It corresponds to a distinct band of wavelengths in the visible spectrum of light (roughly 450 to 495 nanometers). So why, even though most other colors such as red, yellow, green, brown, black, and white are mentioned in ancient Greek texts hundreds of times, is there not a single mention of blue in them as its own color? The answer, at best, is elusive and debatable. We know that it doesn’t come down to biology…even cats are capable of seeing colors, after all, and humans have been able to for their entire existence. Apparently it’s a societal phenomenon rather than a natural one.

From what we can tell today, most cultures did not develop words for all the colors at once…instead the words appear over thousands of years in a sort of sequence. Words for black and for white appear earliest, which makes sense as this is probably the most fundamental way of distinguishing two different objects (the darker one and the lighter one). Red receives its own word very early, which also makes sense since it is the color of blood, and recognizing blood can be the difference between life and death (or between eating and not eating, if you are a hunter tracking wounded game). Yellow and green tend to receive words next, followed by the more compound colors orange, brown and pink. The most recent words for colors tend to be violet, purple, and of course blue.

This leads to the question: What natural phenomena exist today that we don’t yet have words for, but that future people will? It’s hard to know, but one thing I found interesting in my reading (because I didn’t know it) is that there is a physical difference between the colors violet and purple. I couldn’t tell you what that difference is, but I’m guessing in a few hundred years any average Joe could point it out as easily as I can distinguish blue from green today.

Here endeth the lesson on the color blue.

The Neurocycle: Day 17 of 63

The Sleeper Must Awaken

Although I normally avoid reviews of things on my blog as much as possible, today’s post is sort of a review of the new Dune movie. I’ll cut to the chase and say it’s definitely worth watching, but if you read the book first you’ll get so much more out of it. Luckily the movie only covers about two-thirds of the first Dune novel by Frank Herbert, so you don’t necessarily have to read the whole thing.

I read the novel at the age of 13, in anticipation of the David Lynch version of the movie (which you may remember was hyped up to a ridiculous degree, if you’re old like me). I enjoyed that movie immensely, and still do, but it’s the lessons of the book that have stuck with me over the years.

One of the big themes of the novel (and the movies) is that for many of us, our true potential for greatness sleeps inside. A mantra repeated in the novel is “the sleeper must awaken.” The awakening is rarely easy or without consequence; that’s why most of us choose to remain asleep. But, if we want to achieve greatness, the awakening is necessary.

Paul, the protagonist of Dune, is the son of a great Duke of House Atreides of the Landsraad, and of the Duke’s Lady, a member of the Bene Gesserit sisterhood. This gives him access to a number of advantages in his teaching and training; to say he is privileged would be an understatement. But you don’t have to be wealthy or noble-born to access your will and your desire to make a difference.

Sometimes you choose the ways in which you will serve the universe and its inhabitants. Sometimes the universe makes choices for you. Knowing when to resist and when to let go can be incredibly difficult, but it can be the thing that leads to a fulfilling, even if not-always-comfortable, life.

Here endeth the lesson on awakening the Sleeper.

The Neurocycle: Day 16 of 63

The Uncertainty Principle

In the 1920s the physicist Werner Heisenberg began publishing work on a concept that has become central to our understanding of quantum mechanics: the notion that we can never simultaneously know, with absolute precision, both the position and the momentum of a particle.
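For the mathematically inclined, the principle is usually written as a simple inequality (this is just the standard textbook form, not a quote from anything I read):

\[
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
\]

Here Δx is the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant; squeeze one down and the other must grow.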

This concept flew in the face of classical physics, which assumed that every aspect of the physical world could be measured and known to an absolute certainty. This had always been assumed to be true, at least in theory…there might be limitations on the equipment available to make these measurements, sure, but given powerful enough equipment all of these things could be known and definite.

But the work of Heisenberg and others showed that, no, the position and momentum of a particle are not things we can know right now with absolute certainty. An important distinction is that we can track and determine with certainty a particle’s past position and momentum…what we cannot know definitely is its position and momentum in the present, and we cannot predict its future attributes with certainty.

As the last hundred years have rolled away, the uncertainty principle has become more and more accepted in the scientific community. If it is true that we can never know with absolute accuracy and precision some of the most basic facts about basic matter in our universe, then how much more does that truth apply to our lives, to people, and to our interactions with those people? I’m not saying we should always shrug our shoulders and answer that we can’t truly know anything, but I am saying the moments when we find ourselves the most cocky and confident in our knowledge are the same moments we should be mixing in healthy doses of self-scrutiny and self-skepticism.

Here endeth the lesson on the uncertainty principle.

The Neurocycle: Day 15 of 63

Telling a Captivating Story

One of the ways I’m trying to improve is in my storytelling. Anyone can tell a story, and to some people it comes quite naturally, but telling a story that people engage with, are moved by, and remember is something else altogether.

A couple of good, quick pieces of advice I recently read:

First, the story must involve change. The quote I read went something like, “An airplane takes off, travels through the air, and lands somewhere else. A good story does the same.” The change doesn’t have to be for the better, and it doesn’t have to involve the self-improvement of the protagonist. But without change a story is just an anecdote and nothing more.

Second, a character’s change must come down to a moment of truth lasting five seconds or less. This can be the moment they fall in love, hit rock bottom, have an epiphany, or experience some other event that can be pinpointed in time. This may not be entirely realistic in terms of how change happens in real life, but a good story must have a discrete turning point. Otherwise, the audience will get bored trying to intuit and track what exactly is going on with the character internally. This five-second moment of truth usually happens at or near the end of the story, but it doesn’t have to.

With these two tips I hope to get better at storytelling, whether written or verbal.

Here endeth the lesson on telling a captivating story.

The Neurocycle: Day 14 of 63

The Ghosts of Halloween Past, Present, and Future

Halloween is admittedly one of my favorite holidays. As a youngster I gravitated toward T-shirts with skulls and other “scary” designs (much to my mother’s chagrin), and I loved horror films and candy. So of course I enjoyed the one day of the year when all these things were considered normal and in fact encouraged.

Halloween has always been a festival of fun in my lifetime. But as anyone who has done any light Googling on the topic can tell you, Halloween was not always a fun holiday. For the Celts, Samhain was taken very seriously, since knowledge of the coming season’s weather and harvests was critical to survival. The ancient Romans, and later the Roman Catholic Church, similarly took this time of year seriously, with religious rites and observances meant to ward off evil spirits and honor the souls of the departed. The turning of the seasons toward winter (in the Northern Hemisphere, at least) was a time for anxiety and fear, and rightly so, since people tended to die of malnutrition, cold exposure, and the diseases brought on by them.

Now Halloween has become more a time for merriment, with only a tiny sprinkling of mischief and the macabre. Lately the costumes, worn only by adults in the distant past and, within my lifetime, mostly by children, are being put on our pets, who are at best blissfully unaware of anything other than the heightened degree of attention they are getting. The costumes themselves are more a reflection of what has transpired in popular culture over the previous year than any kind of otherworldly masquerade. You are much more likely to see a comical or heroic character from a recent movie or TV show than any ghost, goblin, or monster.

I’m fascinated by the way Halloween evolved, from a serious life-and-death ritual where we tried to consult with good ghosts and scare away bad ones, to a celebration of (and perhaps a commentary on) popular culture and consumerism. It makes me wonder how humans of the future will treat the things that give us angst today. Will future people celebrate Tax Day with parties, feasts, and costumes? And what will the Halloween of 2071 look like? Will our pets finally be spared from the involuntary costume wearing, and the holiday become all about dressing our robots in statement-making attire? Will drones trick-or-treat for batteries?

Here endeth the lesson on Halloween past, present, and future.

The Neurocycle: Day 13 of 63

Pumpkin Power

‘Tis the season to argue about whether pumpkin spice-flavored treats like lattes are delicious or gross. People seem to either love them or hate them, with nobody on the fence about it. But there’s one thing that can’t be denied: pumpkin (the fruit itself, not the flavoring) is packed with health benefits and is about as high-nutrient and low-calorie as food gets.

Pumpkin is very high in vitamin A and a good source of vitamin C, both of which carry a long laundry list of benefits, including a boost to the immune system. Pumpkin also supplies immunity-strengthening vitamin E, along with iron and folate.

Pumpkin is also high in fiber and in lutein, along with a whole host of other nutrients shown to support healthy skin and eyesight. The seeds are packed with most of these good things too, plus protein.

Just remember, as with most foods, pumpkin is best for you when fresh. Frozen is a close second, and canned a distant third. Sugary foods and drinks that are simply pumpkin-flavored are really not good for you, so don’t lump those in with pumpkin.

Fresh pumpkin can be cut into small chunks and oven roasted by itself or with other fall veggies, or stir fried in a tasty curry or other dish. It can be quickly pureed in a blender and made into a soup that will warm you up on cold fall days and be healthier than something from a can. This year, that old jack-o-lantern (if not too old) could have a second life in your saucepan or Instant Pot.

Here endeth the lesson on pumpkin power.

The Neurocycle: Day 12 of 63

Doing It Exactly Wrong

The quote “When you do something exactly wrong, it always turns up something” is attributed to the artist Andy Warhol, and I find it intriguing. Obviously the “wrong” in this quote does not mean morally or ethically wrong, and it doesn’t necessarily mean incorrectly either. It’s something I’ve found difficult to put into words…the key to the whole thing, for me, is the modifier “exactly”.

I interpret “exactly wrong” in this sense to mean in a way that rejects or ignores society’s norms. And to put a finer point on it, the norms being rejected or ignored are the ones followed just for the sake of following norms. For example, back when children were taught to write in cursive, the norm was that the letters always had to slant to the right. There was no reason for this whatsoever, other than to mandate and instill conformity. Left-handed children found it especially challenging. But what if a child rejected that norm and slanted their handwriting to the left? What if a child chose to block-print their letters and never write in cursive at all?

The “what if” is what the “it always turns up something” part of the quote refers to. In a word, it’s about discovery. If you do something differently or “exactly wrong” you will likely end up discovering things that would remain unknown to the world had you continued down the mainstream path of the millions or billions before you. In Warhol’s case, he discovered a lot of new art techniques, notably the over-inking and paper blotting of prints (something graphic artists in the 1950s were taught not to do). This became a cornerstone of his signature style.

Why not do something exactly wrong once in a while? It probably won’t make you a pop art legend or a celebrity of any kind like it did for Andy Warhol. But there’s a good chance you will discover something about yourself or the thing you’ve done exactly wrong. At worst, it will fail and you’ll have learned a valuable lesson. At best, you could end up unlocking doors to worlds you didn’t know existed.

Here endeth the lesson on doing it exactly wrong.

The Neurocycle: Day 11 of 63

(Prehistoric) Life Imitates Art

Remember the movie Jurassic Park, where scientists extract dinosaur DNA from mosquitoes preserved in amber and splice it with other animal DNA to re-introduce dinosaurs to Earth? Some scientists are now working on doing just that, not with dinosaurs but with wooly mammoths, with a plan to use CRISPR gene-editing technology and an artificial uterus.

The first question one might have upon hearing this, which is the first question I had, is why? Well, the answer these scientists give is that re-introducing wooly mammoths to arctic and subarctic ecosystems will help reverse the trends of climate change. The argument is that wooly mammoths scrape away layers of snow while foraging for food, exposing the ground to the cold arctic air. Without mammoths, the snow is left intact to insulate the ground, keeping it warmer and speeding the thaw of permafrost, which releases greenhouse gases.

As true as this may be, it seems like a pretty roundabout and feeble way to fight climate change. Even during the prime of their existence on Earth, it is doubtful at best that wooly mammoths scraped away more than 1% of the entire arctic ground snow layer. So using wooly mammoths as a weapon against climate change seems a lot like bringing a rock to a gunfight.

After reading the articles I gathered during some light Googling, I’m convinced the real answer is twofold. First, there’s the guilt humans feel over driving untold thousands of species to extinction, a fact that is real and unforgiving. Mounting evidence in the fossil record suggests those species include the wooly mammoth itself, hunted down by our distant ancestors. Second, and probably the more direct answer: we’re doing it because we can.

Which raises the question: what exactly is the “it” that we’re doing? Producing a living wooly mammoth individual is one thing, but re-establishing the species as a dynamic part of a functioning ecosystem is quite another. The earth is not nearly the same as it was during the prime of the wooly mammoth, and this is especially true of arctic and subarctic regions. Also, based on what I know of elephants and other large mammals, the vast majority of their behavior is not instinctual but learned, from parents and other adults in the social group. Without a herd of wooly mammoths to learn from, the first “new” wooly mammoth is going to be pretty clueless, and at best unlikely to survive in the wild. True, the animal could potentially be released into a modern elephant herd, but even assuming the herd would adopt it, aren’t we then just producing a modern elephant that happens to be bigger and hairier than its adopted cousins? As one scientist put it, if that’s what we’re doing, we should just put our energy into saving the modern elephant…that seems much more direct and more likely to succeed.

Here endeth the lesson on life imitating art.

The Neurocycle: Day 10 of 63

The Biggest Living Thing

What is currently the Earth’s largest living organism by mass? If you didn’t know, you might picture a mammal such as a blue whale or an elephant. Or you might imagine a single large tree such as a giant sequoia. The truth is, it’s a clonal colony of quaking aspen (Populus tremuloides) made up of tens of thousands of tree stems covering more than a hundred acres in the Fishlake National Forest in Utah. This genetic individual has been named Pando.

Because quaking aspen can reproduce clonally via lateral roots that send up additional stems (new shoots known as “suckers”), the aspen trunks that make up Pando are genetically identical. And those trunks are connected underground by a single massive root network. This is why Pando is considered one living thing rather than thousands.

Estimates vary, but most scientists put Pando’s age at around 14,000 years, which means it was born about the time the last Ice Age ended and the glaciers retreated from the area. Individual aspen trunks only live to be about 130 years old, but a quaking aspen clone can replace old trunks with new stems via suckering, and thus live a very long time.

Unfortunately, Pando has not been producing new stems fast enough to offset the aging and dying of its existing trunks. This decline is thought to be driven by factors tied to Euro-American settlement of the area, namely fire suppression, cattle grazing, and browsing by mule deer. Hopefully the trend can be reversed before Pando is gone forever.

Now you can tell your friends you know what the biggest living thing on Earth is. It’s Pando, who is basically a forest. Here endeth the lesson.