About This Author
Come closer.
Carrion Luggage


Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates rising columns of air and spirals within them to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.


September 15, 2025 at 8:36am
#1097412
Okay. Fine. I give up.

I ran the RNG as usual this morning, but when I went to the corresponding saved link, I found it had been paywalled. Too bad; it was an interesting story about the origins of a cryptid. But I couldn't find a non-paywalled version, so, ploink, into the trash.
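The morning routine described here amounts to pulling a random index from a saved list of links. A minimal sketch, with hypothetical link names (the actual saved-link list and its format are not described in the entry):

```python
import random

def pick_article(saved_links, rng=random):
    """Pick one saved link at random, like a pull from the
    metaphorical hat. `saved_links` is a hypothetical list of URLs."""
    if not saved_links:
        raise ValueError("no saved links to choose from")
    return rng.choice(saved_links)

# Hypothetical stand-ins for the saved links mentioned above.
links = [
    "https://example.com/cryptid-origins",
    "https://example.com/medieval-cosmos",
    "https://example.com/free-article",
]
todays_pick = pick_article(links)
```

Of course, no amount of randomness helps if the link you draw turns out to be paywalled.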

Then I pulled another number out of the metaphorical hat and, behold: another paywall. This one would have been about the conception of the universe in the Middle Ages. But no, we can't have nice things.

Just to be clear, I'm not expecting everything for free. If I'm interested enough in a website, I'll subscribe. But paywalling does put a damper on sharing articles. Now they've missed out on not only my subscription, but those of my legions of fans. I'm sure they're wailing and moaning right now about having three fewer subscribers.

People who do the work deserve to be paid for it. Even writers. Traditionally, the main source of income for article writers is, ultimately, advertising. But no one seems to be able to make non-annoying ads: when I find something to read on my phone (which doesn't have ad-block), the hidden click-through points and constantly shifting floating ads render the site essentially unreadable.

Hence my use of ad-block, which, look, I know that makes me part of the problem, but there's a war on and I need my defenses.

Anyway, from what I've been hearing, ad revenue is drying up (possibly due to the proliferation of ad-blockers). Likely, this has prompted more sites to go to a paid subscription model. That can work if the site has decent content on a regular basis, but again, I can't share those articles here.

The existential problem with the advertising model, though, is that the advertiser gets to control your content, at least to some extent. Publish something controversial? Ads get pulled. Show a political bias different from the advertiser's? Ads get pulled. Dare to show a bare breast? Ads (and maybe other things) get pulled.

It's the golden rule once again: they who have the gold make the rules.

I'm not saying I'm changing the way I do blog entries. Just taking a break today to express my annoyance at the festering pile of bantha fodder most of the internet has become.
September 14, 2025 at 10:47am
#1097342
Phun with philosophy today, from Aeon:

    Reality is evil
Everything eats and is eaten. Everything destroys and is destroyed. It is our moral duty to strike back at the Universe


You know, every so often, Gnosticism pops up with different clothing and a fake nose. One of its basic beliefs is that the world is evil and one must work to find whatever "real" reality there is.

So, just based on the headlines, I figured here was another Gnostic in disguise.

Reality is not what you think it is.

One way to control you is to convince you that what is obviously reality is not reality, but the real reality is hidden away. This is, I believe, a form of gaslighting.

Like everything else that exists – stars, microbes, oil, dolphins, shadows, dust and cities – we are nothing more than cups destined to shatter endlessly through time until there is nothing left to break. This, according to the conclusions of scientists over the past two centuries, is the quiet horror that structures existence itself.

I suppose, from some point of view, it's horror. From another point of view, perhaps there's some comfort to be had in knowing that everything faces the same fate. No one is privileged.

Reality, as we now understand, does not tend towards existential flourishing and eternal becoming. Instead, systems collapse, things break down, and time tends irreversibly towards disorder and eventual annihilation.

And? It's not going to happen for billions of years. As I've always said: there's no such thing as a happy ending; there's only stories that end too soon.

We must start by admitting that the Universe is finite and will eventually end. Moreover, we must accept that the function of the Universe is to hasten this extinction.

I've said before that it very well may be that if life has a purpose, it's to accelerate entropy, so I've already done that admitting and accepting.

A metaphysics that responds to the full scope of the thermodynamic revolution needs to acknowledge the dissipative and destructive function lying behind the ‘generative’ force seemingly at work within reality. To do so requires moving from the classical optimistic metaphysics of becoming to a much more pessimistic metaphysics of absolute finitude and inescapable unbecoming: a metaphysics that reconceives of beings as nothing more than dissipative cogs in an annihilative machine.

Wow, I bet this guy's fun at parties. I should know; I'm fun at parties.

There's a lot more, of course. And just to summarize my own thoughts: I start with the same facts this author does, but I come to a somewhat different conclusion. My conclusion is this: Sure, the universe is trying to kill us. All the more reason to laugh in its face.
September 13, 2025 at 9:19am
#1097273
After yesterday's gaze into the abyss, it's only fair that I share today's article, from Slate, which talks about something everyone knows: zucchini.

    The Vegetable That Wants to Die
Zucchini doesn’t even like itself, yet every summer, we pretend it’s worth growing and cooking. Is there any way to actually make it taste good?


Except not everyone knows it, do they? A good chunk of English speakers, as well as most Francophones, call it courgette. Just to add to the confusion, "squash" refers to a British drink that isn't beer or tea.

Why we refer to it by its Italian name is interesting, too. All squash is native to the Americas. But the particular variety with the green skin was bred in Milan less than two centuries ago. I can only assume that the French called it something else because they're French and needed to distinguish their cuisine from that of Italy. ("Zucchini" is plural; the singular form is zucchino, which I played with in the title today.)

A few summers ago, while I was visiting family in western Pennsylvania, my parents’ neighbor sauntered over and “gifted” us some garden zucchini... I was annoyed. Our neighbor hadn’t gifted us anything, he’d encumbered us with tough, water-logged, flavorless vegetable mass.

I'll admit, I'm not a big fan of zucchini. Partly, this is because, when I was a kid, we were that neighbor; there was entirely too much of it in our garden for us to eat. Partly, it's because my mom couldn't cook it worth a damn, and that sort of thing sticks with you. But to call it "flavorless" is a bit of a stretch, in my opinion. It has a flavor; just one I don't particularly like.

To be clear, I still consider it food. This is not the case with, say, eggplant (aubergine), which I consider not-food.

Tastes, however, vary, and I know from experience on both sides how hard it is to get someone to like something they just don't.

Because zucchini is a culinary pain in the ass, and people are running out of ways to cook it.

It can also be a literal pain in the ass, if you know what I mean.

It’s a scourge of a plant that grows fast and is prolific.

Can't argue with that. While it's been decades since I grew zucchini (or anything else; ever since I left the farm, every useful plant in my vicinity commits suicide), I still have nightmares about the bushels of the stuff I'd have to pick.

The hard fact is that among summer produce, zucchini just isn’t desirable, and it’s not an ingredient that I particularly like to cook with, either.

I don't think of it as an ingredient. I think of it as a side dish.

The traditional cooking methods that most recipes bring to bear don’t really do zucchini many favors (or flavors). The internet is overgrown with zucchini bread recipes (which, guys, is just spice cake with weird, wet fiber smuggled inside).

A friend of mine once made a zucchini pie. Like, a sweet pie, not a savory one, more like apple pie. Well, she might have done it more than once, but I was there for it once. Best I can say about it is that it was edible.

Sautéed zucchini and yellow squash is a classic side dish, but generally just reminds me of the boring meat/potatoes/vegetable plates you’d encounter at a middle-of-the-road steakhouse.

I think the problem with being a food writer is that you get jaded pretty quickly, and always seek out the new and shiny over the tried and true classics. Me? I'm not a food writer; I just sometimes write about food. So rather than go seeking out a new way to cook something, I prefer to master the old standbys.

Could it be that zucchini is a good or, dare I say, even great vegetable that’s been the victim of passionless flavors and ideas? Maybe zucchini isn’t the problem. Maybe I am.

No comment.

More flavorful solutions for zucchini abound. A proper Indian or Thai zucchini curry is wonderful, and the squash swells with deliciously flavorful aromatics like ginger and cumin.

And this is where I started to get interested. Not in zucchini, mind you: I'm of the considered opinion that if you have to do too much to a food to make it palatable, it becomes not worth it unless you're in survival mode. But, as I noted above, zucchini is a squash bred in Italy from stock that's native to the Americas. And now, here we are, with south Asians incorporating it into their dishes, on the entire other side of the planet from its origins.

That's the thing about food: once it's out there, it's out there, and it's fascinating to me how this works on a global scale.

Which is to say, zucchini is definitely work. It needs to be cared for, both in the garden and in the kitchen. Understandably, that may be more effort than the average person is willing to put in.

And way more effort than I, a distinctly below-average person, am willing to put in.

But I'm perfectly willing to try the product of others' creativity and hard work. I might even compliment it. As long as no eggplant is involved.
September 12, 2025 at 7:59am
#1097212
This Nautilus article is fairly old, and long, and honestly may not be of interest to anyone but me.

    The Man Who Tried to Redeem the World with Logic
Walter Pitts rose from the streets to MIT, but couldn’t escape himself.


And no, he wasn't a Vulcan.

Not wanting to risk another run-in that night, Pitts stayed hidden until the library closed for the evening. Alone, he wandered through the stacks of books until he came across Principia Mathematica, a three-volume tome written by Bertrand Russell and Alfred Whitehead between 1910 and 1913, which attempted to reduce all of mathematics to pure logic.

While I'd never heard of Pitts before this, the Principia Mathematica is something I'd known about. A lot of work went into that ponderous set of tomes before Russell apparently came to the conclusion that it can't be done.

But this is more than a story about a fruitful research collaboration. It is also about the bonds of friendship, the fragility of the mind, and the limits of logic’s ability to redeem a messy and imperfect world.

One of the lessons of the story of PM itself is that there are some things that can never be proven within a logical framework. Another is that it certainly seems weird even to arithmophobes that it took over 350 pages to show that 1+1=2.

McCulloch, 42 years old when he met Pitts, was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m. Pitts, 18, was small and shy...

And yet, the two died within half a year of each other.

The moment they spoke, they realized they shared a hero in common: Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library.

Leibniz is better known as Newton's rival in mathematics. The two developed calculus independently.

McCulloch, Pitts, and Lettvin were all poets at heart and in practice, and McCulloch and Lettvin regularly published their verse.

I'm just including this bit to show how you don't have to be a Writer to be a writer.

There is, of course, a lot more at the article, and it delves into the development of neuroscience and computer science, among other tough subjects. So, like I said, I understand if it's a little overwhelming. But personally, I like reading about the people doing the science, because they are, after all, people, with all the illogical and poetic crud that implies.
September 11, 2025 at 9:19am
#1097110
Discussions like this one in Big Think have been going on for a long time, and will continue.

    What exactly is “life”? Astrobiologists still have more questions than answers
Our Earthbound definitions of life could leave us blind to the Universe’s strangest forms.


Okay, but I'd think that's inevitable, since we don't and can't know everything.

Defining exactly what we mean by “life” — in all its varied forms — has long been a formidable challenge.

Some things resist easy categorization.

Physicist Erwin Schrödinger wrote a book titled What is Life? in 1944.

I haven't read the book, so I wonder: is the title question asked in a tone like a little kid asking their parents the Big Questions?

By the traditional dictionary definition, “life” requires metabolism, growth, replication, and adaptation to the environment. Most scientists, therefore, don’t consider viruses alive because they can’t reproduce and grow by themselves and do not metabolize. Yet they possess a genetic mechanism that enables them to reproduce, with the help of a living cell.

So the problem here, as I see it, is one of categorization, not of science. Not to mention "traditional dictionary definitions" aren't the same thing as scientific definitions. But mostly, we like to label things and put them in clearly-divided cubbies. Unfortunately for us, things are rarely that neat. For another, simpler example, consider the uproar when the definition of "planet" was set up, and the definition excluded Pluto. But people can make up their own set boundaries (they just won't be accepted by science). If you want to expand the definition of "life" to include any reproducing system, you can do that. It may not comport with what we colloquially know as "life," but you can do it.

Because astrobiologists think not only about life as we know it but also life as we might find it, some of them gravitated toward the broad definition of life proposed by NASA: “A self-sustaining chemical system capable of Darwinian evolution.”

I'm sure a great deal of thought went into that definition, but it seems pretty broad from my outside point of view. The article questions it, too.

As usual, when considering such questions, we’re hampered by our limited state of knowledge.

Well, yeah. If we didn't lack the knowledge, we wouldn't be seeking it, would we? You can use "hampered," but I might have picked "challenged."

There is also the N=1 problem. How can we expect to arrive at a good definition of life when we have only one example: life on Earth?

Well, see, we can't even arrive at a good definition for life on Earth, as the article shows. If we find stuff on another planet that quacks like life and waddles like life, we'll call it "life," and modify the definition accordingly.

That's not how science works, I know, but again, the categorization question isn't really a science question, but a philosophy question that needs to be informed by science.

Maybe it’s partly a linguistic question. Grammatically speaking, “life” is a noun. But in biological terms, it’s more like a verb — more of a process than a thing. Defining life is something like defining wind, which describes air in motion — a state of being rather than a specific object. Wind molecules are the same as those of air, but their dynamic state is what defines them.

Okay, but there are processes that we don't define as life: the hydrologic cycle, for instance; or fire.

Maybe we should be consulting philosophers.

Finally.

I'm not sure if I was really clear above, so I'll reiterate this: Sure, we don't know. As I noted a few days ago, the universe is large and, practically, no one can explore it completely; consequently, there will always be things we don't know. And to me, that's a good thing; it means we can still learn.
September 10, 2025 at 8:06am
#1097052
Here's a source I've never linked before: Zocalo Public Square.

    What Happens When You Trade Doomscrolling for Hopescrolling
Our Phones Have Become Misery Machines. Sharing Positive News Can Change That


I have other addictions, so I'm not trying to shame anyone here, but: how hard is it to just... not? From what I've been seeing in non-scrolling media, it would be easier to quit smokes or smack.

Their attention has been captured, but not by anything in particular, not really, they say. Like a lot of us, my students are chronic doomscrollers.

You know what it reminds me of? Back in the days when it was cable or nothing (and I chose "nothing"), I'd go visit someone or I'd be sharing a hotel room, and the remote comes out, the TV fires up, and then it's click... click... click... click... click... click... just constant shuffling through channels, without resting on any of them long enough to absorb any information, just click... click... click... click... I seriously considered trying to invent and market a device that scrolled through cable TV channels by itself, hovering on each one for some user-defined interval between 15 and 120 seconds, but that would take away the clicker's illusion of control.

And, like a lot of us, they’re miserable as a result.

Are they? Or were they miserable before, and doomscrolling helps? It's important to point causality in the proper direction.

And those misery machines are hard to turn off, by design.

I'm pretty sure that's meant metaphorically, but my phone enraged me a couple of months back. I went to literally turn it off, which is usually a long-press of a side button. You know what those forkwads did? They reprogrammed the long-press to summon the AI "assistant."

Yes, I was able to change it back in Settings. But I shouldn't have to.

Developers engineer our phones and apps to capture and keep our attention, to make us “lose time” by mindlessly moving from app to app. This design attacks us where we’re most vulnerable by taking advantage of our innate need to scan our environment for threats.

Hm. Perhaps it really is today's version of channel-surfing. I have some skepticism about the implied evolutionary psychology interpretation, though.

Spending all of that time on our misery machines “cultivates” our reality, making us think that the world itself is miserable—which is what media scholars like George Gerbner call “Mean World Syndrome.”

Thus leading to a barrage of misinformation like "crime rates are up" when they're actually down, which in turn leads to more misery as the military moves in to control everyone.

But there may be hope. The best way to disrupt the recursive loop of doomscrolling is to be more intentional about our media use.

Or, you know, figure out how to turn the phone off.

Instead of doomscrolling, we should hopescroll—looking for positive news, not threats. I asked my communication and journalism students to try it this year by creating class social media accounts devoted to sharing positive news.

I prefer notscrolling, but if it helps, it helps.

Unfortunately, most positive news these days reads like orphan-crushing machine stories.

The engagement with our accounts—across all social media platforms—was so pathetic, in fact, that I considered canceling the assignment. Stories of progress and problem-solving don’t get a lot of attention or engagement, alas.

Stevie Wonder could have seen that coming.

But you gotta try, or you're just being cynical like me. Don't be cynical like me. Be cynical like yourself.

Many students reported that they shared what they learned with their roommates, on their family chat groups, and in conversations with random folks throughout their week. They liked having something positive to talk about, and they found that folks wanted to hear about the good news.

So, they wanted to hear the good news in meatspace, but continued to doomscroll in cyberspace? People are weird.

One student reported that shifting their attention away from “institutions that benefit from people’s fear” and toward “those who aim to heal” made them feel more resilient. Several students noted that they saw a shift in their moods that surprised them: “Honestly, I did not expect that much would change, however, after reading about communities working together for a large cause, individuals trying to make a difference in their own way, and new innovations being made in hopes of creating a better future, it readjusted my perspective that not all is bad and/or lost in the world.”

Cynical or not, I can appreciate that.

In keeping with the article's theme, it concludes with a what-you-can-do-about-it section.

Look, I'm not trying to add to the doom. Not today, anyway. Nor am I ragging on the article or its premise. I hope that it helps someone.
September 9, 2025 at 10:08am
#1096997
Sometimes I find articles actually related to writing. This is one of them, from Big Think:

    Why Tolkien thought “sub-creation” was the secret to great fantasy and science fiction
According to Tolkien, fantasy requires a deep imagination known as “sub-creation.” And the genre reflects a fundamental truth of being human.


People sometimes look down on fantasy — not the prize-winning, metaphorical magical realism kind, but the kind of fantasy that has swords, sorcery, and dragons.

I contend that all fantasy is metaphorical.

The snobbery of those who look down on fantasy has a long pedigree — so much so that, in 1947, J.R.R. Tolkien felt the need to defend the genre in his work, “On Fairy-Stories.”

I can understand, to some extent, the snobbery. The popular stories could be legitimately bad. But the true lit-snobs don't even give it a chance.

To enter Faërie is not to enter a world of simple make-believe; instead, we perform an act of “sub-creation,” in which we form a world within our wider “reality.”

So that's what the slightly clickbaity title refers to. A bit disappointing it's not about making the perfect hoagie. I suspect that to many writers (even me), that's old news with a new headline.

When we sub-create a world, we “make a Secondary World which the mind can enter.” This world has its own internal logic, laws, and systems.

Well, I just always called that "using one's imagination," but if it helps to think about it that way, why not?

We see, feel, and live in this world in a way far beyond the words on a page can alone provide. We color in background details and add sights, smells, and wonders that go beyond the narrow bounds of the words in the book.

Yes, that's worldbuilding. Most writers do that to some extent; at the very least, they're creating a world much like ours but which doesn't contain the same characters. Fantasy / SF writers take it to extremes.

Then there's a section about Beowulf, but I'll only comment on the last bit of it:

It says that no matter what monsters we face, we shall overcome and live on. We shall not be defeated.

And that's part of what I meant up there when I said all fantasy is metaphorical. Even the pulpiest fiction can be metaphorical—if often clichéd.

Books of all sorts are escapist. Fictional narratives and made-up characters define a novel.

The article goes into this a bit, but I don't think "escapist" should be a dirty word.
September 8, 2025 at 8:25am
#1096931
September 8, today, is Star Trek Day, but I swear this article came up at random from about 40 possibilities. From Atlas Obscura:

    How to Take the Ultimate American Stargazing Road Trip
Head out West for dark skies and a Milky Way so bright it casts a shadow.


Okay, so it's a different kind of star trek, and you're going, boldly or not, where others have gone before. But it's still cool, because I'm also a fan of road trips.

In the West, where I grew up, there is rarely such a thing as a short drive. If you want to get outside beyond a city like Denver—and much of the reason for living in Denver is the chance to get outside—you have to get in a car, and then spend a lot of time there.

Even in Denver, there's no such thing as a short drive. The traffic sucks.

And the ability to drive at night, if you can, offers an unparalleled chance to explore the stars. If you are the passenger, or your kids are in the backseat, even better.

Better yet is being unburdened by small humans with pressing needs. There's a reason Wesley Crusher was almost universally reviled, and it wasn't because he was smart.

I recommend starting your journey in the Centennial State, Colorado, for the beginning of a multi-state tour through the Colorado Plateau, a high-altitude region of tablelands and canyons running through four states.

Perhaps a bit of bias, there. I'd personally recommend the Sierra Nevada, except that everything after that might be a letdown. And that one didn't even make the list.

Book your tickets to arrive toward the middle of a lunar cycle. This way, you will be ensured darkness in the early evenings for up to two weeks, before the waxing crescent moon starts to dominate the nightscape.

This may not be as clear as it should be. It also may not have the intended effect. By "middle of a lunar cycle," I take it she means around the New Moon, though, as a cycle, it can begin and end at any phase. I think the intent here is that a cycle is full to full.

But even that can be misleading. Starting with the New Moon, the waxing Moon sets later and later on subsequent evenings. If you're limiting yourself to a time period of about sunset to midnight, your best bet is to start with the waning half-moon, or the confusingly-named "last quarter." This phase rises at around midnight, so you have a few hours of glare-free evening stargazing. For the next week, it rises later and later until it becomes a New Moon.

Maybe that's what the author is actually saying but, like I said, it can be confusing. To keep it simple, the best time to see the stars is when the Moon's not in the sky, and there are apps and calendars for that.
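The last-quarter rule of thumb sketched above can be roughed out in code. This is a back-of-the-envelope illustration, not an almanac: it assumes a mean synodic month and one well-known reference new moon, and real phases drift from this average by up to a day or so.

```python
from datetime import date

SYNODIC_MONTH = 29.530589      # mean length of a lunation, in days
KNOWN_NEW_MOON = date(2000, 1, 6)  # a commonly used reference new moon

def moon_age(d):
    """Approximate days since the last new moon.
    0 = new moon, ~14.8 = full moon, ~22.1 = last quarter."""
    return (d - KNOWN_NEW_MOON).days % SYNODIC_MONTH

def good_evening_for_stars(d):
    """Crude version of the rule above: from last quarter (moon rises
    around midnight) through just past new moon, evenings stay dark."""
    age = moon_age(d)
    return age >= SYNODIC_MONTH * 0.75 or age < 1.5
```

Or, as the entry says, just use an app or a calendar; they account for the drift this sketch ignores.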

Sheesh. Other planets don't have these problems. We had to go and get ourselves an oversized satellite.

Anyway. No need to quote more of the article; after this, it suggests locations where one can see the stars without too much interference from human-produced light.

In short (and for the benefit of people from other countries, or Americans who may want to cross the equator to see an entirely different sky), the best way to see the stars from Earth, with or without a telescope, is:

1) The Moon should be on the other side of the planet, so as not to overwhelm the rest of the celestial light show;

2) Far away from terrestrial light sources such as cities or highways;

3) As high an elevation as you can manage;

4) Unfortunately, stargazing is better when it's cold out;

5) And, oh, yeah, don't go when it's cloudy and/or raining. Snow is right out.

There's a reason they put the great big expensive telescopes on remote desert mountains.

You may not have the time to do a proper stargazing trip until you retire, so it's probably best if you live long and prosper.
September 7, 2025 at 1:54am
#1096846
Oh, boy, here we go.

“The surest sign that intelligent life exists elsewhere in the universe is that it has never tried to contact us.” – Bill Watterson

Let's start with this: I've got nothing bad to say about Watterson. Calvin and Hobbes is arguably the greatest comic strip ever produced (though I will also listen to arguments for The Far Side and Pogo). One of my most prized possessions is a hardcover set of its complete run.

With that out of the way, that should be enough to clue a reader in to the high likelihood that he was making a joke with that quote.

But, joke or not, I get really damn tired of hearing statements like that.

Yes, I've gone into this before. Dozens of times. Maybe even hundreds. I've quoted the Drake Equation. I've discussed why the related Fermi Paradox isn't a paradox. I've railed against the implication that humanity is that terrible. I've come down hard against using the term "intelligent" in this context.
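For reference, the Drake Equation in its usual form:

```latex
N = R_* \cdot f_p \cdot n_e \cdot f_\ell \cdot f_i \cdot f_c \cdot L
```

where N is the number of detectable civilizations in our galaxy, R_* the rate of star formation, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per such system, f_l the fraction of those that develop life, f_i the fraction of those that develop intelligence, f_c the fraction of those that produce detectable technology, and L the lifetime of that detectable phase. Most of those factors are, as noted, pure speculation with a sample size of 1.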

But it all comes down to speculation, because when considering the existence of technologically-capable life, we have a sample size of exactly 1. It thus borders on religious beliefs: either "How can we be so arrogant as to believe that we're the only tech-capable life in the Universe?" or "How can we be so arrogant as to believe that alien life would be anything like us?"

Actual religious people have argued against the existence of tech-using aliens on the grounds that God supposedly made the entire Universe just for us. I reject that "reasoning," even if I come to the same conclusion.

Now, I've also said this before, but it's important to the point I'm trying to make: the Universe is a big place. As another funny guy, Douglas Adams, once wrote: “Space is big. You just won't believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist's, but that's just peanuts to space.” It is so big that some people think it may be infinite in extent. It is so incredibly huge that I'd be very surprised to find out that there's no other tech-using life out there. But that doesn't matter, because it's also so incredibly huge that there's no plausible way that humanity will ever be able to explore more than an infinitesimal fraction of it before it expands beyond all reach, crunches back into oblivion, or simply runs out of the capability for energy transfer (whichever universe-ending scenario is ascendant in astrophysical circles right now). In other words, we can never know everything that's out there.

But Adams had another quote that I think is relevant: “Isn't it enough to see that a garden is beautiful without having to believe that there are fairies at the bottom of it too?” Of course, he wrote both of those lines in a story whose entire premise revolved around being unable to swing your arms in space without hitting some sort of sentient being, so take that as you will.

Now, if you asked me (which you didn't) "So you don't think there's life outside Earth?" I'd scoff at that, too. Life seems to have gotten a foothold pretty early on in Earth's history, and there's no reason to believe that we're unique in that respect. What I take issue with is the implicit bias we have that evolution must necessarily produce a species with the traits necessary to create things like rockets or radios (which would be things that, if they existed in our galactic neighborhood, we might have a chance of detecting). Evolution doesn't work like that. It's not a steady march of progress from bacterium to rocket scientist. Plenty of other species on Earth do just fine here without having mobile phones or Moon landers, and they'd be doing just fine (in some cases, better) if we weren't here.

That idea is an example of the arrogance I alluded to above: that we represent some sort of end product of evolution, or that the entire planet (or even the Universe) exists to serve us. From what I understand, even our evolution was kind of touch-and-go for a while; a single tweak in a different direction, and we wouldn't be here to marvel at it.

So, no, Watterson's quote doesn't amuse me. It doesn't tickle my confirmation bias. It's anti-humanist and ignores the vast majority of people who aren't actively trying to do harm.

"But, Waltz, isn't it hypocritical of you to sing humanity's praises like that while simultaneously believing that the coming global apocalypse is inevitable?" No. No, it's not. Because I also subscribe to Lone Asshole Theory: if you have a million people, and one of them is an asshole, the asshole can ruin everything for the other 999,999 people.

We don't have a million people, though; we have over 8 billion, and I'd venture to guess that considerably more than one in a million is an asshole. But all it will take is one mistake, or one willful gesture of contempt for the planet, and boom.

So, while I'm not above being hypocritical, I can easily justify my seemingly contradictory beliefs in this case.

In conclusion, the most likely reason, in my view, that we haven't been contacted isn't because we're sinners, but because there's no one nearby with the capability to detect us or get here quickly. That's it. That simple.

Could I be wrong? Of course. I'd need real proof, though, not "I saw strange lights in the sky" or "I was abducted and anal-probed" (the latter of which is consistent with undiagnosed sleep paralysis). There's always the possibility that there's a fleet of warships on their way here right now at something close to light speed, ready to do to us what Vader did to Alderaan.

But we have more immediate problems to worry about.


Notes:
September 6, 2025 at 1:02pm
#1096788
Well, today's "Blog Week Birthday Bastion 2025" [E] prompt is a bit different from what I'm used to. But I've always said I can write about anything. However, I never promised it'd be good.

The full prompt is below, in the dropnote, but it's basically: do two reviews.

This doesn't give me much room, so today, I'll just share the reviews I did, which are both for entries related to yesterday's prompt. The reviews are public anyway, so I'm not sharing some sort of secret information or anything like that.


"Birthday Bastion 2025 | Day Five" from "\\ Exurgency //" [18+] by LdyPhoenix

A great illustration of the tension between what we want and what we need. That is, to give up (or mostly give up, in this case) something we love to achieve better health or some other desired outcome: it’s a real trade-off.

The song in the video prompt is a humorous ode to coffee, and this entry is mostly about the subject matter of the song rather than the video itself. This is not a criticism; it’s an acknowledgement that we can take prompts in all kinds of different directions.

While I’ve never developed a taste for coffee myself, I do understand missing something that you used to enjoy. Could I give up something I love in exchange for the possibility (not the guarantee) of better health and/or a few more days of life? Probably not, but I enjoyed reading this entry anyway.

The only thing I might suggest is to go into more detail about “coffee might disappear” and “environmental impact,” but I realize that this could end up in controversy, and I probably wouldn’t go there, either.


"Those Little Things" from "Racing Through Life" [18+] by Kit

We all have our favorite things, and other things we don’t enjoy. You say this is “an interesting challenge” because the prompt is about things that you don’t partake in, but in this entry, you rose to the challenge very well.

Like you, I prefer tea to coffee, which probably made this video prompt less relatable. But turning it into an entry about the things you do enjoy is a good way to address the prompt. One might say it goes off on a tangent, but there’s absolutely no reason not to do that.

One thing I appreciated about this entry was the organization, with bolded section headers. This helps make it easy to follow. And while not everyone is going to share your preferences, you do well in explaining what it is about these things that you enjoy. Not that we’re owed such an explanation, but it makes things more relatable: while everyone likes different things, we mostly like those different things for similar reasons.


Notes:
September 5, 2025 at 1:32am
#1096675
I'm going to start by addressing the whole "needing to use a VPN to trick YouTube into thinking I'm in Europe" bit. With the VPN, that's mildly annoying. Without it, it would be rage-inducing. (I set it to Netherlands, just in case, and it worked.)

And if you don't know what I'm talking about, it's this video right here, which is today's Blog Week prompt:



Getting a "not available in your country" message? Yeah, fuck that.

Since not everyone has a VPN, though, all I'll say about the video is that music should stand on its own, not require lighting, effects, dancing, costumes, or other gimmicks.

But if there's one thing I've learned in life, it's that my relationship to music is not a popular one, and my opinion on the subject is definitely in the minority. Will I change my opinion to better fit in? Hell, no.

Part of it is that I've always been frustrated by my own lack of musical talent and ability, despite many years of lessons in piano, violin, voice, and guitar. There's something about making music that I Just Don't Get. You know how some people Just Don't Get math? That's me with music. The difference, I think, is that while arithmophobes recoil in abject terror at the very thought of having to add or subtract, I absolutely love music.

Well, most music. Well, some music, anyway. Opera, for example, can bite my ass. I understand the talent and work that goes into it, and if you like it, great; for me, it's like shoving an ice pick in my ear.

Another thing that makes me different is that while for most people, their musical taste ossifies around the onset of adulthood, there is newer music that I like. Not all of it, of course. But I didn't like all the music that was around in my childhood, either. The bad stuff didn't last: just look at any week's Top 40 chart from when you were a kid. In my own research along those lines, maybe one or two of them stood the test of time. I don't even remember most of the crap they played back then.

I also like some music that came before my time. While I don't subscribe to the idea that music can be divided into decades, it's useful to know when a particular song was produced, just like it's useful to know when a book was published or a movie was released. Technology changes, sometimes for the better, and sometimes for the worse.

Autotune, for example. You get some performer who looks good and can dance but can't really carry a tune, and boom, autotune fixes that. Except it doesn't, because autotune is clearly a misuse of technology, much like biological weapons or shining lasers at aircraft. Some of my favorite music, though, was made by people who weren't, or aren't, beautiful—but they had brilliant voices, or at least a knack for songwriting.

The very first song played on MTV when it started, back when they only played music videos, was "Video Killed the Radio Star."

Again, I recognize I'm in the minority here. When it comes to music, I'm a minority of one.

Perhaps we all are.


Notes:
September 4, 2025 at 12:53am
#1096575
“Poets have been mysteriously silent on the subject of cheese.”

That's from G.K. Chesterton. Chesterton lived a hundred years ago. Since that time, things have changed. Poets have changed. The nature of mysterious silence has changed. Most importantly, cheese has changed.

Well, okay, not really, unless you count the introduction of industrial chemical "cheese," which as far as I'm concerned is a legitimate counterargument against the usefulness of technology. Oh, sure, it melts more evenly, and it's cheaper, but it's not cheese. It barely even qualifies as food.

But, mostly, there's a good reason for not waxing (pun intended: fake cheese looks and tastes like wax) poetic about fermented dairy products: poets have no sense of humor, and cheese is inherently funny.

"But Waltz, lots of poets write funny poems." No, comedians write funny verses; poets have way too much angst to transcend themselves by writing limericks or senryu.

Which is not to say I don't appreciate poetry. I can do angst. I have a fondness for melodrama, and melodrama verges on comedy. But rare is the poem that transports my psyche the way a good comedy act can.

So, of course, I looked for modern poems on the subject of cheese, and I found this one, but I can't tell if the poet meant to be funny but missed the mark, or shot for seriousness and landed on humor.

And then there's this, which is firmly and decisively all about cheese, and not even the plastic kind. But the strict rhythm and rhyme make me believe it, too, was meant to be funny. Or maybe not; like I said, cheese is inherently funny.

Another one comes from Reddit, though: far from being a loving ode to spoiled milk, it expresses the poet's hatred of one particular cheese style (one which, say what you will about it, at least isn't Kraft Singles).

So, in short, Chesterton's proclamation (itself a prime example of dry British humour) is outdated, superseded by those who, perhaps to spite Chesterton, have given us the artistic expressions of their souls on the subject of delicious cheese.

But no poem, certainly not the ones I found for this discourse, can ever truly capture the magic of cheese, any more than writing about beer can give us the sublime experience of actually drinking the magic brew. Perhaps that's why it took so long to write any: while love, the traditional subject of a poet's pen, is simple enough to be transcribed, described, and inscribed, the glory of cheese is not.


Notes:
September 3, 2025 at 1:08am
#1096452
A few years ago, I drove through Nebraska, and stayed in a hotel there overnight.

This may not seem important to anyone. It was certainly boring for me. Nebraska isn't exactly the most exciting state in the US. It's primarily known for two things: corn and insurance.

The only significance to my visit was that it checked #48 off of the list of US states I've visited. The only ones left now are Alaska, for the obvious reason that it's cold and far away; and Michigan, which can also be cold, but isn't so far away.

The thing about Michigan is that it's not on the way to anywhere, at least not for me. That's the only reason I ever visited Nebraska: it was on the way to elsewhere (in this case, visiting a friend in Utah).

And it's not like the state doesn't have reasons to visit. For me, those reasons are breweries. I simply haven't gotten around to it yet.

Alaska's a different story. On maps that focus on the US, like the one in today's prompt (which you should be able to see if you expand the "Notes" below), it's never in the right place and rarely the right size. Based on that map, an uninformed person might believe that it's an island off the southwestern corner of the country, and that it's smaller than Texas.

Of course, we all know better, but that sort of thing confused me as a kid. That's what parents and teachers are for. But every once in a while, I'll see something on the internet about someone thinking Alaska's a big island somewhere, one that just happens to possess, in part, a long, straight coastline (where, in consensus reality, the state borders Canada). Some people are unteachable, I suppose. Or there are a lot of trolls. Or both.

Point is, though, there's almost nothing in Alaska that interests me. I'm sure there are breweries, which is reason enough to go even if I didn't have all 50 states on my fuck-it list. I have this vague idea that at some point in the relatively near future, I'll drive across the country again, this time looping up into Michigan for the hell of it (that's a pun, see, because there's an actual place in Michigan called Hell and, yes, it periodically freezes over). But the plan is to end up in Seattle, get on one of those cruise ships that I've never been on, and let it take me to Alaska. One day in Juneau or Anchorage (Fairbanks is way too far inland), and I can say I've been there. Maybe see some whales from the ship; I don't know.

That way, I can also tick "cruise ship" off my list, something I want to do but have been avoiding, because they're basically giant Petri dishes, perfect breeding grounds for microscopic pests of many varieties.

I'd better get started actually planning that, though. If the most optimistic of my friends are right, there won't be a 50-state union after the next year or so, rendering my list largely moot. If the least optimistic are right, I'll have to dodge radioactive craters on the way, because of the coming inevitable global apocalypse.


Notes:
September 2, 2025 at 12:51am
#1096359
As we acknowledge Writing.com's 25 years of existence this week, I'll be blogging about that instead of the usual stuff I find.

Today's prompt has to do with AI. Artificial Intelligence: bane or blessing?

Yes.

But first of all, I'd like to clear something up: AI isn't really artificial intelligence. Certainly the argument can be made that what we call AI is artificial, but I subscribe to the philosophy that, since we are part of nature, anything we create or modify is natural, including food dyes, nuclear weapons, microplastics, and computers with their programs.

This isn't a very useful philosophy, though, except insofar as it reminds me that not everything we make is "bad" and not everything we find in the wild is "good." So I'll continue to use "artificial" to refer to something some human made.

It's the "intelligence" part I have a real problem with. It's hard enough to define that for humans. It's even harder to define it for nonhuman animals, such as dogs or housecats, neither of which would exist in their current form without human intervention, and can thus be considered "artificial" in a way.

Since we don't know what intelligence really is, labeling a complex computer program thus is questionable at best. And I should also note that AI has been around in some form since the early days of computing. We gamers have dealt with various levels of AI in the form of game NPCs, and let's not forget they programmed computers to play chess, a game once considered winnable only by an intelligent entity.

I'm splitting hairs, probably. But what we call something matters. You can call your dictatorship a "People's Republic," or your fascist political party "socialist," but that's just propaganda. A lot of the hype surrounding AI is propaganda of another sort.

Many of us have been using AI as writers for a while, now. Spellcheck is a rudimentary AI; grammar checkers, a more advanced one. I never want to be dependent on either, because I'd rather internalize rules and styles for myself, but I've used them.

I've also, obviously, used what we call AI for graphics (notably above in this blog), mostly because I have no artistic talent whatsoever. What I've never done is have a Large Language Model write for me. I mean, sure, I've played with them a bit, but only to satisfy my curiosity; none of their output has made it into my writing here.

As for whether it's a good thing or not, well, we hardly ever get to see things in black and white, ones and zeros, all or nothing. What we call AI is technology, and like almost all technology (and a lot of "natural" things), it can be used for good or evil or anything in between. You know, like nuclear fission can produce relatively clean energy, but it can also be used to blow shit up real good.

I don't trust any report on it that sings its praises. I also don't trust any report on it that concentrates solely on the downsides.

It has its problems, absolutely. Like any tool, it depends on how we use it. You can use a hammer to build, or smash someone's head. Since its current form is pretty new, though, and people don't like change, you get a lot of fear surrounding it. It's like how in the early days of civilian GPS, people freaked out about it getting them lost, as if no one had ever gotten turned around following a paper map.

Thing is, like it or not, it's here, and it's not going anywhere until the power goes out in the coming inevitable global apocalypse. What I'd urge everyone to remember is that you have no control over what other people do with it; you can only control what you do with it.


Notes:
September 1, 2025 at 9:36am
#1096293
As we acknowledge Writing.com's 25 years of existence this week, I'll be blogging about that instead of the usual stuff I find.

Back in 2004, when I joined, the internet was a very different place. Social media wasn't really a thing; we used IRC and other platforms to chat and meet people. Not everything was measured, tracked, monetized, optimized, advertised, capitalized, and homogenized.

I've been a writer for most of my life. While my fellow students groaned and rolled their eyes at having to write 500 words for this or that class, I was puzzled: 500 words is easy, except for how in the hell can I say everything I want to say in such a short piece? Didn't matter whether it was fiction or nonfiction.

Then I went into engineering school, which didn't emphasize writing as much. Which is unfortunate, because engineers have to write things like technical documents and reports, and for those, it's important to have some skill in putting words together good. Not what you'd call creative writing, though. Engineers get creative in other ways.

So it was that, when I joined here, I finally felt like I had a chance to share my more fictional and expressive side. So I did. Joining four years after the platform's origin, I did feel like an upstart and an outsider, and in some ways, going on 21 years later, I still do.

That's right, next week, my account will be old enough to order drinks in the US.

I have this worldview that life runs in 7-year cycles. I don't talk about it much, but the idea is always there, lurking in the background like someone tapping on my shoulder to get my attention. While my 21 years here don't neatly overlap the 7-year cycles in my life, it's made a kind of sub-cycle.

For the first seven years, I was pretty active here, writing mostly stories and some poems, though I took advantage of blogging from nearly the beginning.

After that, I was less active for seven years. I still did the two newsletters I've been editing since 2007 or so: Comedy and Fantasy. And I remained an active judge at Writer's Cramp, and did Moderator stuff. But I didn't blog much (some years, not at all) for those seven years. This was largely the result of shifting my focus from writing and community to dealing with some personal issues: my father had died (my mother passed before I joined), I retired, I traveled quite a bit, and I processed my divorce. I never took an actual hiatus, but I certainly wasn't as much of a presence here as I'd been in the beginning.

Around seven years ago, then, I started to become more active again. My current daily blogging streak is going on six years, between the previous blog and this one, but even before then, I'd started writing stuff again. I also got more into activities here, notably the October Novel Prep Challenge. So things are different now, I'm different now, but sometimes, I look back at an older item and marvel at how great it was.

If the seven-year cycle thing holds, I don't know what the next group of years will bring. But I'm pretty sure I'm here until I die, or the site goes away, or the internet is destroyed in the coming inevitable global apocalypse.


Notes:
August 31, 2025 at 10:53am
#1096227
Technology to solve technology problems, from an article in Slate:

    Coating satellites with super-dark Vantablack paint could help fight light pollution crisis
Light streaks caused by passing satellites mar images taken by the world's most expensive telescopes. The problem is set to get worse.


It's not just the world's most expensive telescopes, either. I've seen these damn things through personal telescopes (not mine; I don't own one).

A new type of super-black, highly resistant satellite paint promises an affordable fix to the satellite light pollution problem that has marred astronomical research since the recent advent of low-Earth-orbit megaconstellations.

"New" and "recent" may be exaggerations. Vantablack has been around for over 10 years, and Starlink for over five. Matter of perspective, I guess.

The constellation's thousands of spacecraft orbit so low that the sunlight they reflect outshines many stars from our perspective on Earth.

I could also take issue with their use of "spacecraft" for these small (but shiny) devices, but they are in space and someone crafted them, so, okay.

When the $1.9 billion Vera Rubin Observatory opens its telescopic eyes to the sky later this month...

They're open now. I did an entry on that observatory a while back: "Hey Rubin"

...astronomers expect that up to 40% of its images will be degraded or completely ruined by satellite streaks.

That does seem like a lot.

But a new paint being developed in conjunction with astronomers might help. The paint, called Vantablack 310, could reduce the amount of light reflected by satellites in orbit down to just 2% of what is reflected by uncoated satellites...

Pretty sure the original Vantablack got licensed exclusively to artist Anish Kapoor. I guess the exclusivity doesn't apply to newer formulations.

According to Noelia Noël...

...whose parents should be arrested and imprisoned...

...an astrophysicist at the University of Surrey, these satellite streaks will significantly reduce the scientific return on investment that the taxpayer-funded Vera Rubin telescope represents.

I think it's mostly British taxpayers, which I guess doesn't much matter for this.

The partnership has now produced a new type of blacker-than-black space paint, which reflects less light than available alternatives and can be easily applied by satellite makers in their clean rooms.

That's cool and all, but I wonder about the dozens, or hundreds, of brightly reflective satellites up there already.

The new coating is based on a proprietary blend of carbon black, a soot-like form of carbon, mixed with special binders that make the paint resistant against the harsh conditions in near-Earth space.

Carbon: Is there anything it can't do?

There's more at the link, and let's just hope Kapoor doesn't get his hands on this stuff.
August 30, 2025 at 9:37am
#1096170
An article about reading, from Cracked:

    People Were Apparently Reading ‘Welcome Back, Kotter’ Novelizations in the ‘70s
Possibly the most surprising thing about the Welcome Back, Kotter series of tie-in novels is that they… kinda sound good?


Now, Cracked has declined from its peak. Most of their articles are about celebrities, which I have no interest in reading about, and besides, you can get that anywhere. But even in the website's heyday, its target demographic was much younger than I am. That still seems to be the case. That's okay; not everything is about me, but those are the main reasons I rarely link to them anymore.

I still get their newsletter, though, for the occasional article of relevance to me, like this one. Relevant, because I'm of an age where I saw "Welcome Back, Kotter" when it first aired (contemporary, if I recall correctly, with shows like M*A*S*H and Happy Days), and because... I read the books.

In my defense, I was a kid at the time.

Ken Jennings has brought us many things: the secrets to Jeopardy!, an increasingly confusing series of patterned suits, and now, awareness of the existence of a series of Welcome Back, Kotter paperback novelizations.

On the other side of things, I don't think I'd ever heard of Ken Jennings before this. Game show host? Okay, that's fair; I haven't watched game shows since the WBK era.

All things considered, possibly the most surprising thing about the Welcome Back, Kotter series of tie-in novels is that they… kinda sound good?

I can't weigh in on their quality. I know that other novel spinoffs of TV shows vary widely in quality, from utter trash (Quantum Leap) to pure gold (a few of the Star Trek novels). But I read them way too long ago to have an informed opinion on them. Certainly, I liked them at the time, but I cringe now at some of the other stuff I liked as a preteen.

What they did do, even if they were trash, was keep me reading, which led to me wanting to write. There were other books, of course, mostly science fiction, but those somehow stuck in my head—if not the content, then at least their existence.

It’s unclear how well these novels sold, but they ended a good two years before the series did, portending harsh realities the inhabitants of James Buchanan High School couldn’t conceive of at the height of their success.

Well, the inherent problem in running a show about a high school (or about kids in general) is that, if it's not a cartoon like South Park or The Simpsons, it has a short lifespan.

And, honestly, I don't know what it is about that show (and the books) that made it memorable in the first place. Perhaps it was the diversity of the cast/characters, which I didn't really notice at the time, but, in hindsight, might have been groundbreaking for TV (though, as usual, Star Trek did it first; Kotter, at least, was set in some version of the present, not some idealistic future). Maybe it was just the quality of the writing, which, again, it was too long ago for me to say anything about.

Or maybe it was because Travolta went on to be a major actor. In my headcanon, his character Vincent Vega from Pulp Fiction was actually Kotter kid Vinny Barbarino, grown up and turned to a life of crime (hence his surname change).

That's my story, and I'm sticking to it.
August 29, 2025 at 10:47am
#1096120
I've suspected this for a while now. Good to have something that supports my suspicions, so I don't come across as a complete conspiracy nut. From Vox:

    How the “Grim Reaper effect” stops our government from saving lives
When curing disease is bad for the federal budget.


Well, perhaps our (this is a US article about the US) government's job isn't to save lives.

There's a bit of background I'll skip just to get to the point:

Simply put: Curing hep C means people live longer, which means they spend more years collecting Social Security, Medicare, and other benefits. That could mean that whatever cost savings the actual hep C treatment produces might be wiped out by the fact that the people whose lives are being saved will be cashing retirement checks for longer.

Yes. They only need us as long as we're productive. After that, just die already so we can stop paying you.

Put together, the deficit and the elder-biased composition of federal spending implies something that is equally important and macabre: Helping people live longer lives will, all else being equal, be bad for the federal budget.

I've sometimes speculated (being a writer and all) on what would happen if, by some technology or magic, all human diseases were suddenly eliminated. That's a good thing, right? Yeah, from one point of view. From another, it would cause utter chaos. Worse if we could also eliminate aging and death: that would be reserved for the secret cabal of elite rich overlords.

Okay, maybe I'm a bit of a conspiracy nut.

I don’t have an easy fix for the situation, but it feels important to at least understand.

I'm not even sure it can be fixed, except for leaving the government out of it entirely, which practically nobody in either major party wants to do (perhaps because they are the government).

There are a couple more examples, including cigarette taxes and covid, and then they had to go and make a literary reference:

It all reminds one of Logan’s Run, in which people are killed off upon hitting age 30 lest they take up too many of society’s resources. That movie is a dystopia — but as a budget proposal, it’d score very well.

Okay, look. I get that more people watch movies than read. Hell, I watch more movies than I read books, these days. But Logan's Run was a film adaptation of a book, and, as is usually the case, the book was far superior. One of the many things they changed in the movie was that you hit Lastday at 30. In the book, it was 21. Yes, you read that right. They did keep the idea of Runners; that is, people who rejected the cutoff and tried to escape.

Now, it's been a while since I've either seen the movie or read the book, so I may be misremembering some details, but that much, I'm sure of (I just verified it on Wikipedia, I mean).

But that's not the important point; just an illustration of why you don't, for example, quote the movie Frankenstein when you're trying to make a point about the book Frankenstein. (Quoting Young Frankenstein is, of course, always appropriate.) No, what I'm trying to say is that the dystopia created for Logan's Run (either version) was that the birth rate got too high, skewing the population young. (This was a big talking point in the 60s, when the novel came out.) The particular dystopia we're suffering through right now is the polar opposite of that.

The economists and agencies doing this math are, of course, only doing their jobs. We need to know what government programs will cost over the near- and long-run.

I have an intense distrust of any version of the phrase "only doing their jobs."

But the fact that increased human longevity on its own worsens the budget picture should lead to some reflection. For one thing, it suggests that sometimes we should embrace policies simply because they’re the right thing to do, even if they don’t pay for themselves.

Not something any branch of the government is known for.

Lots of things the government does cost money. The military doesn’t pay for itself. K-12 schools don’t pay for themselves. Smithsonian Museums don’t pay for themselves. That doesn’t mean those aren’t important functions that it makes sense to put some of our tax dollars toward.

I could argue some of those examples, but I'm not an economist; I just see that the intangible benefits of these things far outweigh the monetary costs. We can also argue about how much of the budget should be spent on them (and I'd add space exploration to the list), and that's okay; that's what we should be doing in a representative democracy. The government shouldn't be run like a business. It should provide those services that the free market finds too unprofitable to consider.

But, of course, people can disagree about that, too.

There is no law of nature saying the US has to weigh its priorities that way. As long as we do, the numbers will imply that it’s better for the budget for people to die before they get old.

I can't be mad at an article that, after talking about US government policies, ends by paraphrasing a song by the very British band The Who.

There is, of course, a lot more at the link; I just hit the points I most wanted to address. I'd suggest reading it for yourself, even if you're not in the US, because some of it might be applicable to other governments, as well.
August 28, 2025 at 10:42am
#1096044
I vaguely remember discussing this pronunciation in the previous blog, but not recently and not this article. From Mental Floss:

    The Right Way to Pronounce ‘Gyro’
It’s a notoriously tricky one, so don’t feel too bad if you haven’t been getting it right.


And by "gyro" they mean the food, not the spinny stabilizer, which most people seem to get right (insofar as anything is "right" when it comes to pronunciation).

Alongside philosophy, democracy, and the Olympics, the gyro is one of the most famous—and delicious—things invented by the Greeks.

Matter of personal taste, of course, but I don't find most Greek food all that appealing. The gyro is an exception. I know this because there's a pretty significant Greek-origin population near me, and they used to do a festival every year. Maybe they still do; I don't know.

I'm not ragging on it, mind you. Lots of people, Greek or not, love it. Like I said, matter of personal taste.

But like I said, the gyro is an exception for me, and I want to pronounce it right when I order it.

But if you grew up outside its nation of origin, you may have a hard time pronouncing the food item the next time you order one. So is it “jee-roh”, “jye-roh”, or “yee-roh”?

I've been told it's actually khee-roh, with something like the guttural kh sound found in languages as diverse as Hebrew and Scots Gaelic. But I wasn't aware that Greek was one of those languages. I don't consider my source to be perfect on this point: he was a Brooklynite of Italian ancestry.

Gyros consist of a pita wrap containing meat (usually pork and beef in Greece, while lamb is more common in the U.S.) sliced off a vertical rotisserie.

The rotisserie thing is probably why it shares a spelling with that other gyro.

From 1965 to 1980, the United States experienced a wave of immigration from Greece. The largest number of immigrants ultimately settled in New York, many in the neighborhood of Astoria, Queens.

Which is why I don't completely dismiss the opinion of the guy from Brooklyn out of hand.

Part of the problem arises from the transliteration of the Greek gamma, or γ. Gamma generally represents the “g” sound in the Greek alphabet, pronounced like the “g” in “gift.” When gamma comes before “ee” and “eh” sounds, however, like the one in gyro, that hard “g” sound turns into more of a rough “y.” Hence the word is “year-oh” instead of “gee-roh.”

Still no gutturals involved, though.

When in doubt, one might just point with one's finger at the menu or whatever, like one does at East Asian restaurants. Or, like, there's this ubiquitous Thai beer called Singha, which Americans usually pronounce every letter of, but I've been informed it's actually just "sing." But when I tried ordering a "sing" at a Thai place, the server said (in a Thai accent), "You mean sing-ha?"

So I'm still not sure. What I am sure of is that pronunciation of a written word can be difficult, which is one reason you still get people wrongly pronouncing .gif like jif.
August 27, 2025 at 10:10am
#1096000
From MIT Press Reader, an article with a title that caught my attention without being overly clickbaity.

    Flat Earthers on a Cruise
How evolution wired us to act against our own best interests.


It is, fair warning, a book ad. As I've repeated numerous times, though I hate ads, I tolerate movie ads before movies and book ads on a site devoted to writers and readers.

Now, before I get into the text of the article, I want to try to explain why the picture in the header pissed me off. To understand what I'm saying, you'd have to click on the link to view the picture; embedding it here would be too much work.

In brief, the photo's a take on the famous March of Progress artwork, which has, in fairness, been parodied quite a lot in the 60 years since its creation. At the "head of the pack," as it were, is a dude immersed in his mobile phone.

I did say the illustration is 60 years old. In those 60 years, we have learned a great deal about evolution in general, and human evolution in particular. We have learned enough to render that illustration obsolete. So the first part of what pisses me off is that, apparently, people still see human evolution as a linear path, which it absolutely was (and is) not.

The second thing that annoys me about it is that it seems to be mocking the idea of "ascension" (which, again, has been refuted by science) by the cell-phone guy assuming a posture similar to one of humanity's ancestors. This, of course, ignores all of the great human achievements that enabled the production of mobile phones in the first place. Okay, fine, I get mocking things; I do it quite a lot. But the implication, at least in my interpretation, is that we're "devolving," which is utter nonsense, as evolution doesn't have a direction.

And, finally, I'm goddamn sick and tired of people complaining about other people using their phones. Okay, sure, if someone's carrying on a loud conversation on one in a public place, or watching TokTik without an earpiece, complain away. But a person absorbed in what they're doing has no obligation to look at you, or even acknowledge your presence, so leave them in peace.

Whew. Anyway.

The article doesn't start out by improving my mood:

We have long regarded humans as the most rational of animals.

Snort.

But as polymath Bertrand Russell noted, we spend our lives looking for evidence of that claim and find little.

The relevant thing about Russell wasn't that he was a polymath (though that's cool). The relevant thing is that he devoted a huge chunk of his life to attempting to ground everything in a self-contained logical system, one in which every truth could be derived from first principles. That project famously foundered: Russell himself found a paradox at its foundations, and Gödel later proved that no such complete, consistent system can exist. So, my take on this? No, we're not rational. We cannot be rational.

We blame others for our mistakes, rationalize after the fact, and make impulsive choices even when patience would yield better rewards.

Well, whose fault is that? Certainly not mine.

Also, while it may sometimes be true that patience can yield better rewards, humans tend to die at an alarming rate, and what's the point of waiting for something maybe-better when you could get hit by falling space debris tonight?

Some behavioral imperfections appear uniquely human. One is what the evolutionist Bill Hamilton referred to as the nonadaptive strategy of malevolence: harming others with no form of benefit for oneself.

Like many things that we once thought were "uniquely human," I'm pretty sure some nonhuman animals do that, too. It's just that we don't know as much about their motivations, so we can't say for sure.

After all, only humans insult strangers online or back incompetent leaders out of blind loyalty.

Despite what some might believe, the reason "only humans insult strangers online" is that there are only two kinds of entities online: humans, and human-programmed scripts.

Though we behave like know-it-alls, we are easily manipulated and taken in by charlatans of all kinds.

Overly generalized.

We prefer a product that is 80 percent lean to one that is 20 percent fat, and an unnecessary item that costs $9.99 seems cheaper than one that costs $10.

My all-time favorite example of that is when a fast food chain came out with a 1/3-pound burger at the same price as a quarter-pounder. Turned out people didn't want to pay the same price for less meat. Yes, I meant to type that; they honestly thought 1/3 was less than 1/4, because 3 is less than 4, and "why do I have to learn this math stuff that I'll never use?" I didn't believe it myself, at first, but then I looked it up, and it seems that's really what happened (though I suspect there was a secondary effect caused by "quarter pounder" being a much more fun thing to say than "one-third pound burger").
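If you want to check that math the pedantic way (this snippet is my own illustration, not anything from the article), exact fraction arithmetic settles it in a couple of lines:

```python
from fractions import Fraction

# A third-pounder versus a quarter-pounder, compared exactly.
third = Fraction(1, 3)
quarter = Fraction(1, 4)

print(third > quarter)    # the third-pounder has more meat, despite 3 < 4
print(third - quarter)    # the difference: 1/12 of a pound
```

One-twelfth of a pound of beef, lost to fear of fractions.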

We are willing to get into our cars, stand in lines for hours, and squish into horrendous shopping centers to save a pittance on a special offer for snacks dripping with sugar and fat.

Oh, fuck right off with that "we" bullshit. Not all of us do that. Come on, I'd only do that for a special release of beer.

The article devolves (pun intended) into a questionable abyss of evolutionary psychology, during which:

A great deal of the data from developmental psychology, anthropology, and neuroscience confirms that, for adaptive reasons that no longer exist, our minds have evolved a strong tendency to distinguish between inert entities, such as physical objects, and entities of a psychological nature, like animate agents. We thus are dualists and animists by nature. As a result, we attribute purposes and intentions to things, even when none exist, and imagine hidden motives and conspiracies where there are none. For us, stories always have a purpose, which can be evident or hidden.

This is not the sick burn some might think it is.

We are, in short, belief machines, and we manufacture a lot of those beliefs. And when belief comforts us or helps us make sense of a chaotic world, we cling to it, no matter how irrational. We’re even willing to endure ridicule, as in the case of flat-earthers who set out on a cruise to reach the ends of the earth.

Thus is the article title explained. I'd been wondering about that.

The rest of the article/ad/excerpt is fairly brief, and I've already taken up too much space on this. In summary, two things:

1) I wouldn't take anything here as absolute fact;
2) I've come to the conclusion that humanity is neither good nor evil, but we contain multitudes of both with everything in between;
3) No, I'm not always rational or logical, like when I expect to have two things in a summary and end up with three.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
