The Neurocycle: Day 49 of 63

Unlimited iPhone backup in iOS 15

I’m straying from my usual attempts to be relatively timeless in my writing. I realize this post will be meaningless in a year or so, but I want to share this cool trick in iOS 15 I just learned. You can perform a full cloud backup of your phone, regardless of whether you have enough available space to do it in iCloud.

In iOS 15 there is a feature to help people prepare to transfer to a new iPhone, even those who don’t have a big iCloud storage plan. Don’t worry, you don’t need to actually have or buy a new iPhone to use it, and the backup lasts 30 days. To get there, open Settings, tap General, then tap Transfer or Reset iPhone. Then tap Get Started.

If it turns out you have enough iCloud space and your iPhone is fully backed up already, the process will be over quickly, since nothing more is needed. If not, your iPhone will back up to iCloud…don’t start this during a call or other important use, as it will occupy your phone for several minutes. When it’s done, it will show instructions for transferring to your new iPhone (which, as I’ve said, isn’t necessary because the backup exists at this point). Exit out of Settings.

The backup lasts 30 days, so in theory as long as you do this every 30 days (which you probably should do anyway) you will have a free cloud backup of your iPhone. Here endeth the lesson on unlimited iPhone backup in iOS 15.

The Neurocycle: Day 48 of 63

Conspiracy Theories

I grew up on conspiracy theories. I was born in the time of The Manchurian Candidate, when theories about the Kennedy assassinations flowed freely, and shortly before Watergate would give rise to another set of theories.

I like a good conspiracy theory as much as anyone else. I love The Matrix franchise. There are three big problems with most conspiracy theories, though.

1. Conspiracy theories are only interesting when they are theories, and they lose their mystique as soon as the truth comes out. I need only cite one case to support my point. Last year, when fear of Covid gripped the world, the US Government announced very clearly and bluntly that it had lied in the past about not having secret UFO info. What’s more, it had kept a massive backlog of unresolved UFO encounters secret, but it would be secret no more. The result? Nobody cared, not the Far Right, not the Far Left, not the Far Out. The fact that there seemed to be much more to worry about at the time should not matter, if the conspiracy were as dark and dastardly as conspiracists had painted it to be for decades. It turns out it just didn’t matter much to folks, and still doesn’t.

2. Conspiracy theories rely on the lack of information about them as proof that they exist; they have to. But a false theory would also lack supporting information, whether anyone has thought of it yet or not. So the logic of “there’s no info only because it’s been covered up” is flawed.

3. You and I simply aren’t important or threatening enough to conspire against. The world is run by a few very rich, very powerful people, it’s true. It’s been true since ancient times. But unless you’re Julius Caesar or someone of similar stature in the known world, your existence (our existence) as consumers and human statistics doesn’t threaten the fabric of the system. We are the system: cops, criminals, soldiers, dissidents, the whistle blowers and the blissfully unaware all eat food and drink water and go online and use highways and generate waste and all of that. Yes, somewhere on servers there is likely every single fact about you…but nobody in power cares enough about you to use it individually. The only people interested in you are common thieves. That may be disappointing news for some, but to me it’s comforting. I am still on guard against thieves, but not against my government. That remains true for me whether the President is named Clinton or Trump or West or anything else.

Here endeth the lesson on conspiracy theories.

The Neurocycle: Day 47 of 63

Fat Biking Snow, Beach, or Trail

As recently as seven years ago, fat tire biking was widely considered a passing fad, if it was considered at all. Now most mountain bike companies have a fat tire version, and a few companies even focus on fat bike models.

The best and maybe most surprising thing about fat biking is the wide variety of terrain you can have fun on. Because of the low ground pressure and big contact patch of fat tires, you can ride places you wouldn’t dream of trying on a “normal” mountain bike.

Snow is one of the most exclusive and scenic surfaces. Before you go, make sure the snow trail is packed/groomed. Not even a fat bike can float on loose snow. Also, know the temperature limitations of your chain lube and other fluids, as well as your own body’s limitations. The last thing you want is to be in a cold, remote place you can’t get back from, with temps below freezing and help unlikely.

The beach is another scenic place you can ride. As with snow, make sure you can always get back to safety…know the local tides and conditions, especially if you’re targeting a hard-to-get-to scenic spot around rising tides. Salt in sand and water can ruin a bike quicker than anything else, so take extra care to rinse your bike off thoroughly right after your beach ride.

And, of course, you can enjoy a fat bike on good old-fashioned trail mud. Be aware that while fat tires provide better traction and “float” thanks to having more tire in contact with the ground at any given time, this big contact patch also means you typically can’t turn on a dime as you could on most mountain bikes. Know your bike and what it can/can’t do, before you take a downhill switchback too fast and find yourself in involuntary flight mode.

With just a little prep you can enjoy snow, sand, silt and pretty much anything else. Here endeth the lesson on fat biking.

The Neurocycle: Day 46 of 63

Ascribing Malicious Intent

One of the worst habits we can get into is reading too much into the actions or words of others. In particular, assuming without evidence that another’s action was specifically intended to do us harm is an especially nasty habit, known to cause many a needless rift or battle between souls who ought to be more or less in harmony. Even when the assumption doesn’t break into an all-out war, the stress, anger and anxiety that result are still quite damaging to the assumer.

That being said, how do you break yourself of the habit? The first step is learning to recognize when you are making such an assumption. It may be an especially ugly feeling that you get, or the forming of accusations, or a sort of self-preparation for a confrontation with the one who allegedly set out to harm you. Whatever the signs are for you, learning to recognize them is key to modifying this behavior.

Second, once you recognize you are starting down this dark road, tell yourself to stop and give yourself time to objectively assess things. Third, once you are in a calm, objective space, evaluate the situation looking only at the cold hard evidence. Don’t rely on hearsay, and don’t read a tone into any written communication. Fourth, if you are still feeling like the target of malice, seek advice from a trusted and compassionate ally. If such an ally is not close at hand, give yourself advice but imagine it coming from someone who is wise and cares about you. Fifth, always remember that just as you are the hero of your own narrative, so is everyone else, and they don’t think about you (positively or negatively) anywhere near as much as you think they do…they are each too busy looking after the hero of their own story, themselves!

With time and practice, you will be able to break yourself of this damaging habit. If nothing else, you should be able to get into a neutral enough emotional space to ask the other person calmly what they meant when they said or did the offending thing. 99 times out of 100, it was not meant to offend. On the rare occasion that it was, the offender is typically neither powerful nor important enough for their intentions to be worth your worry.

Here endeth the lesson on ascribing malicious intent.

The Neurocycle: Day 45 of 63

Bullet Journaling

Want a low-tech organizer that is 100% customizable and includes calendars, to-do lists, notes, musings and reminders? And never needs any batteries or Internet? Try a bullet journal.

All you need is a little notebook and a pen. Start by making the first few pages the Index, and by numbering every page. The key to the bullet journal is to be able to find entries later by date.

You can make monthly or yearly calendars, if you want to, on your early journal pages. I used to do this all the time in my bullet journal but over time I’ve become more focused on daily entries.

You can use any system you invent that works for you, but I’ve found there are three basic types of entries or “bullets”: Tasks, Events, and Notes. For me, Tasks are always symbolized by a dot or “true bullet point”, which I mark with an X through it when I complete the task (I don’t strike out the words; this is important, as I’ll explain below). Notes are always symbolized with a dash. Events have a variety of symbols depending on the type: I have one symbol for a phone call, another for a video chat, another for an in-person meeting, another for a web conference, and so on.

One thing I’ve learned is to strike through words only when they become no longer relevant. A completed task is relevant as a success; don’t dismiss a success. That’s why I simply put an X over the task’s bullet point when I complete it. If a task is urgent I put a “!” next to it, if it gets extended I put a “>” next to it, and I have other profession-specific symbols. The important thing is to play around with symbols and systems so the bullet journal works for you. You are the one and only person the journal is for. If you find it too much of a chore and/or you literally never refer back to what you’ve written, then you are probably wasting more of your time than the journal is worth. I’ve been bullet journaling for three years now, and while I obviously still use electronics, my bullet journal is like a second brain that focuses and organizes all the ephemera of my first brain.
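If you’d rather prototype a legend digitally before committing it to paper, here is a minimal sketch of the three entry types and flags described above. The symbols and names in the code are illustrative stand-ins for my own paper system, not part of any official bullet journal specification.

```python
# Hypothetical digital analog of the paper legend described above; all names
# and symbols are illustrative, not an official bullet journal spec.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Kind(Enum):
    TASK = "."    # a dot ("true bullet point"); an X is drawn over it when done
    NOTE = "-"    # a dash
    EVENT = "o"   # stand-in; on paper each event type gets its own symbol

@dataclass
class Bullet:
    kind: Kind
    text: str
    day: date
    done: bool = False                          # tasks only; the words are never struck out
    flags: list = field(default_factory=list)   # "!" = urgent, ">" = extended

# One day's entries
log = [
    Bullet(Kind.TASK, "Renew library card", date(2021, 11, 26), flags=["!"]),
    Bullet(Kind.NOTE, "Fat bike tire pressure: 6 psi worked well", date(2021, 11, 26)),
]
for b in log:
    print(b.kind.value, " ".join(b.flags), b.text)
```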

Here endeth the lesson on the bullet journal.

The Neurocycle: Day 44 of 63

Meditation

There are numerous medical benefits associated with meditation. At the same time it can be an important pathway to achievement, for self, others, and communities. According to Dr. James R. Doty, MD, author of Into the Magic Shop, a meaningful and fulfilling meditation practice involves four steps:

1. Think of your intention. It might be just to relax, unwind and heal, or it might be to help others, or it might be a larger goal for yourself. Put that intention at the front of your mind to start.

2. Relax your body. Start at your toes and slowly untense all the way up to your scalp. Envision your heart muscle relaxing, beating slowly and pumping efficiently. The heart, brain and body are solidly connected and communicating.

3. Gain control of your mind. Contrary to popular belief, this doesn’t mean that no thoughts are allowed. But when distracting thoughts arise, embrace them briefly and let them go.

4. Allow your heart to open. Think on someone who loves you and/or you love unconditionally and fill your heart with that feeling. If you want to, shift your focus to someone with whom the relationship is more complex, and concentrate compassion and empathy on them.

Towards the end of your session, give yourself some uninterrupted time to transition back into normal life, recalling your original intention but otherwise undistracted by thought. Hopefully you achieve your goal, or, if it’s a larger goal, some more of the puzzle pieces fall into place. But don’t expect to be great at it immediately. Like anything, it takes a lot of practice to get good at meditating.

Here endeth the lesson on meditation.

The Neurocycle: Day 43 of 63

The Trichotomy of Control?

Ancient Stoic philosophers such as Epictetus taught that there are two and only two types of things in this world: things we can control, and things we cannot control. We ought to focus our energies with diligence on the things we can control, and ought to learn to accept with grace the things we cannot control. We should neither worry about the things outside our control, nor dally about improving the things within our control. This teaching has come to be known as the dichotomy of control.

Sounds quite simple and appealing, doesn’t it? And indeed it has been adopted (or at least something like it has been formulated) by a wide variety of groups, from Greek philosophy to Zen Buddhism to AA to Christianity. It is an unparalleled system for setting priorities and learning how to be happy, or at least not excessively unhappy.

In the real world, though, there are gray areas in between the two types prescribed by the dichotomy of control. There are things for which, while the ultimate outcome is not our decision directly, we do have some degree of influence over the outcome and can choose whether or not to apply that influence. Here’s an example: A good-paying position opens up in a friend’s department in the company you work for. Your friend trusts your judgment and would listen to any recommendations you might have, although she is not actively seeking your input, and the ultimate hiring decision is hers alone. You know of someone who would be good for the position, knows about the opening, and needs a job…you have a choice whether to recommend this someone to your friend for hiring. You also know that your friend may not end up hiring this someone, even if you give a glowing recommendation for them, depending on who else applies for the job.

You could look at this situation in either of two ways under the dichotomy of control: you could parse out the choice of whether to recommend, or you could look at the hiring process as a whole. If you do the former, you should definitely recommend the someone you know, as it is within your control and the right thing to do. If you do the latter, you should not worry one bit about the position and be perfectly at peace with doing nothing, as the ultimate hiring decision is not yours to make. Which way is the right way to look at this situation? The dichotomy of control, by itself, provides little if any clarification.

This is why I propose a third category to add to the dichotomy of control: things over which you have influence but not complete control. This category is a bit more elusive to define than the two classic categories, but I submit that it captures a great number of aspects of our existence. For example, our physical health and fitness are a function of both genetics (which we certainly can’t control) and our diet and exercise habits (which we certainly can control to a degree). Since we can’t control our genetics, we should focus on diet and exercise. If we eat healthy and exercise well, we have done what we could, and should accept that the ultimate outcome (our weight/appearance) is not completely ours to control. The key is in recognizing the precise point at which our degree of control diminishes such that it isn’t worth worrying about anymore. This point can admittedly be difficult or impossible to identify.
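If you like to tinker, the trichotomy can even be sketched as a tiny decision helper. The code below is only an illustration of the three categories and the advice attached to each; the names (Category, advice) are my own invention, not anyone’s established framework.

```python
# A minimal sketch of the trichotomy as a decision helper; names are illustrative.
from enum import Enum, auto

class Category(Enum):
    CONTROL = auto()      # fully yours: act with diligence
    INFLUENCE = auto()    # partly yours: apply your influence, then let go
    NO_CONTROL = auto()   # not yours: accept with grace

def advice(category: Category) -> str:
    return {
        Category.CONTROL: "Act on it; don't dally.",
        Category.INFLUENCE: "Do your part (e.g. make the recommendation), then release the outcome.",
        Category.NO_CONTROL: "Accept it; worrying changes nothing.",
    }[category]

# The hiring example above: the recommendation is within your control to give,
# but the hiring decision itself is only within your influence at best.
print(advice(Category.INFLUENCE))
```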

The life advice, however many categories you see in the world, luckily remains the same: Do what you can, where and when you can, and then find a way to accept the rest.

Here endeth the lesson on the trichotomy of control.

The Neurocycle: Day 42 of 63

Earth, the Kettle of Life

In the 1950s, the theory that life began when lightning struck a solution of simple compounds (the primordial soup), creating amino acids, gained rapid acceptance. The Miller-Urey experiment tested this theory at a small scale, passing a sparking electrical charge through a sealed glass apparatus. The labware, which contained a solution of simple organic and inorganic compounds mimicking what might have been found in a young Earth’s oceans, was borosilicate glass, much like the Pyrex pans popular for cooking.

The results of the Miller-Urey experiment corroborated the primordial soup theory to an astounding degree. Amino acids, the building blocks of life on Earth, were created in the lab spontaneously. At the time this was hailed as a major breakthrough in science, a perfectly controlled experiment in an inert environment that could not influence the result. Or was it?

It turns out that borosilicate glassware, when in contact with a highly alkaline solution like the one used in the Miller-Urey experiment, can act as a catalyst for chemical reactions. In other words, it can make things happen at the molecular level that otherwise wouldn’t happen. We don’t worry about this when cooking with Pyrex because, with rare exceptions, the food we cook and eat is mostly water at a roughly neutral pH.

However, before discounting the results of the Miller-Urey experiment entirely, scientists recently repeated it in three environments: one of borosilicate glass, one of Teflon (considered more inert than glass), and one of Teflon with a small amount of glass added. The borosilicate glass environment produced the most amino acids by far, and the Teflon environment produced relatively few. This seems to confirm the fears scientists had about the Miller-Urey experiment, that the outcome was inordinately influenced by the labware, but the third environment’s results raised eyebrows. The Teflon containing small amounts of borosilicate glass produced far more amino acids than the pure Teflon, though still fewer than the pure borosilicate glass.

What does this mean? The most common minerals on this rock we call Earth are silicates, and silicon is second only to oxygen in crustal abundance. If lightning struck the primordial soup of a young Earth the way scientists theorize, then there is a good chance the “kettle” containing the soup was composed at least partly of silicate rock. So, while the container in the Miller-Urey experiment influenced the outcome, that influence itself probably mimics, at least in part, the conditions in which the spark of life first appeared on Earth. The theory remains viable.

Here endeth the lesson on Earth, the kettle of life.

The Neurocycle: Day 41 of 63

Predictions for Black Friday 2046

Despite what purveyors of prognostication may tell you, it’s all but impossible to know the future with any precision…even with sophisticated models of weather’s chaotic patterns, a few days is the pathetic limit of what we can forecast with real accuracy. That said, there are a couple of tricks for making very general predictions about the future. The first is to look backward in time and observe trends…absent strong evidence to the contrary, these trends are likely to continue on their current trajectories into the future. The second is to look to works of art such as science fiction for clues, since as we all know life imitates art sooner or later.

Using these two tricks, I’ve come up with seven general predictions for what Black Friday will be like in the US, 25 years from now. Time will tell whether I was right or wrong…and in most cases I very much hope to be wrong.

  1. It probably won’t be called Black Friday anymore. Twenty-five years ago, few people outside the retail industry knew this day as Black Friday (a name popularized in 1960s Philadelphia, where police and store workers were overwhelmed, overworked and exasperated on this day every year). Most everyone just called it the Day After Thanksgiving until the era of 9/11. There’s no evidence to indicate the Black Friday name will stick for much longer, although it has already outlasted the 2000s’ flash-in-the-pan Cyber Monday. Who knows which nicknames will have staying power or why, but if I had to guess there will be a more positive-sounding moniker for the weekend as a whole, such as “Holiday Kickoff” or something of that nature. Tough to know.
  2. The day and weekend will still be a celebration of modern consumerism. One thing you can bet on, however, is that this weekend is about one simple act in the Western world: buying stuff. It has been all about consumer spending since the end of WWII, as our entire economy has been about consumer spending. This is a close-to-80-year era so far, with no signs of ending soon.
  3. Late November will more or less mark the end of the worst of hurricane and wildfire season (most years), but the beginning of the worst of blizzard and flooding season (most years). If you think “freakishly” bad weather is a historical blip limited in scope to the 20-teens and 2020s, I have a bridge in Brooklyn and a miracle weight loss pill I want to sell you. Extreme weather events/trends are the new normal. Thanks to human-caused climate change the weather will get worse before it gets better, and even the very most optimistic timeframe estimates for getting better are in the 2100s.
  4. Standards of living will be better, but only for about the upper 5-25% of incomes; for the other 75-95% life will be worse than today despite technological advances. The trend over the past 30 or so years, for both nation and globe, is that wealth continues to both grow and be concentrated in ever-fewer individuals. This means that although the world will be richer overall, much more than half of the population will be poorer. The middle class will continue to shrink in both number and importance. Today’s babies born to middle class, college educated parents can generally expect living standards and financial security much lower than what their parents have today, which already isn’t great. The saddest part is that one-third to two-thirds of the nation’s working poor will continue to fight for slack regulations on capitalism in direct opposition to their own well-being…hey, gotta own them libs, ain’t it?
  5. With a few exceptions the shopping will be online, and many products will be completely digital and meant only for a person’s online avatar. Facebook, errr, Meta, and others are betting heavily that AR, VR, and online personae are the future for consumers. These Big Tech titans are not stupid, nor are they poorly informed. We live increasingly online, even when traveling, even when on the toilet, even during our most private and intimate moments. The devices will surely change but the need to be online won’t. I’m a big fan of the movie Ready Player One, but even if I weren’t I would recognize that it is a reflection of both today and tomorrow…if you had the power to create a better life online than you have IRL, why wouldn’t you? And so you shall…you, me, and most of the 75-95% with ever-crappier IRL lives. Parties, gifts, concerts, plays, intimate gatherings and massive cultural events…these will happen IRL too, for sure, but more and more will be online. The Covid pandemic accelerated this movement…speaking of which, most survivors of Covid will speak shockingly little about it even 10 years from now, judging from the behavior of Spanish Flu survivors post-pandemic. The disease will still exist, but just as Influenza existed between 1918 and the present, Covid will exist and a number of people will die of it every year, and no one will really care or pay too much attention.
  6. Society, while on average more tolerant of races, cultures, beliefs, and genders, will still be run by white cis men. Also, there will continue to be examples of extreme hate toward those with darker skin, with beliefs other than Protestant Christianity, and with identities other than “straight woman/straight man”. Think about just how much you have heard and seen advocating for tolerance over the last 10 years. Now think about just how much real-world impact that all had. The amount of advocacy will continue to increase, and change will continue to be slow.
  7. The divide between progressive and conservative political identities will continue to deepen. This divide in the US has its roots in the time before the Depression Era of the 1930s. It’s not going away anytime soon. At some point it will culminate in warfare; in some respects it already has. Unfortunately I believe it will take millions of (more) deaths to make people realize how extraordinarily stupid and pointless identity politics are. We are so wrapped up in them we can’t see the forest for the trees. Whether all-out warfare will begin before or after 2046 I couldn’t say, but identity politics have been a continuous and increasing part of American life for decades…that won’t just end on its own, sadly.

Here endeth the lesson on what to expect 25 Black Fridays from now.

The Neurocycle: Day 40 of 63

Herd Behavior

Herd behavior refers to the way groups of animals gather into a mass, and to how that mass starts to behave as if it were a single organism with one brain and one will, rather than a collection of individuals, each with its own will and its own decision-making process. In the past, many believed this behavior came about to benefit the herd or the species. This may be true in some ways and to some extent…mutual protection against predators and the use of collective forage information are two potential herd benefits. But more recently scientists have theorized that herd behavior is driven at least partly by individual self-interest, each member in competition with the others. Two behaviors in particular have been called out.

First, gathering into a herd, by itself, does not in fact offer protection against predators…a herd is much more easily found, stalked, hunted and attacked than an individual animal…that is, unless you happen to be occupying the center of the herd, protected by the simple geometry that means the animals at the periphery will be attacked long before (i.e. instead of) you. It is this desire for competitive advantage over one’s neighbors that consolidates the mass of individuals. This makes sense, since without some kind of pull toward the center of the herd, the mass would disperse and dissipate over time. The weaker and less able members of the herd find themselves at the herd’s edges, unable to effectively compete for the coveted inner slots. These individuals, easily located, identified, and picked off by predators thanks to the herd’s large size and aggregation of sight/sound/scent markers, are put at a distinct disadvantage by herd behavior. The herd is in effect self-culling.
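To make the geometry concrete, here is a minimal simulation sketch of that center-seeking dynamic. It is illustrative only, under simple assumptions I’ve chosen (random starting positions, a fixed step toward the centroid, and a predator that always takes the most peripheral animal); it is not a model drawn from the literature.

```python
# A minimal sketch of the "edge toward the center" dynamic described above.
# Assumptions (mine): 2D positions, fixed step toward the centroid,
# predator always picks the animal farthest from the centroid.
import random

def centroid(herd):
    xs, ys = zip(*herd)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def step_toward_center(herd, step=0.1):
    cx, cy = centroid(herd)
    # Each individual selfishly edges toward the middle of the mass.
    return [(x + step * (cx - x), y + step * (cy - y)) for x, y in herd]

def predator_picks(herd):
    cx, cy = centroid(herd)
    # The most peripheral animal is the easiest target.
    return max(herd, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)

herd = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(30)]
for _ in range(20):
    herd = step_toward_center(herd)
print("Most exposed individual:", predator_picks(herd))
```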

Second, the use of and reliance on herd information also demonstrates individual self-interest. The idea is that, by relying on information supplied by others rather than spending time and energy finding good forage oneself, the individual is better equipped to compete for the best microforage locations once the forage area is reached. By following the herd rather than conducting individual scouting expeditions, one becomes at least as competitive for forage as the other herd members, if not more so.

These two behaviors, while potentially benefiting some herd members (“survival of the fittest”), can become problematic especially when adopted by a decidedly non-herd animal (Homo sapiens). The competitive predator avoidance behavior relies heavily on the assumption that the predator (or other danger) is limited in its hunting capacities…in other words “they can’t get all of us”. In reality this often isn’t true. And if they indeed can get all of you, you were only making things much easier for your pursuer by gathering yourselves into a big group. Turning to the herd foraging behavior, it relies heavily on the assumption that the herd’s forage information is, well, reliable. In reality this often isn’t true either. If the herd follows itself to a forage area that can’t support the whole population, many individuals could die of starvation or malnutrition, when they would have survived had they spent time and energy up front scouting forage on their own. Those that do survive may well be undernourished and still put in danger by competing heavily for scarce resources in the forage area of the herd’s choosing.

Sometimes when humans exhibit herd behavior, bad things happen…bad for many individuals and bad for the human herd collectively. Stock market bubbles and tourist-trap thieves are just a couple of examples. Maybe the herd isn’t always right. Maybe it’s better to escape the herd sometimes.

Here endeth the lesson on herd behavior.