About This Author
Come closer.
Carrion Luggage
![Traveling Vulture blog header image](/main/images/action/display/ver/1741870325/item_id/2336297.jpg)
Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.
This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air, and spirals within them in order to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it is circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.
It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.
It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."
I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
December 15, 2025 at 8:34am
This PopSci piece is nearly a month old, which matters when you're talking about transient space phenomena. Still, I'm sure most people remember the subject.
As I've been saying: It's not aliens.
Provisionally.
It's obviously "alien" in the sense that it comes from a whole 'nother part of space. Few would doubt that, and those few would be in the same category as young-earth creationists and flat-earthers: complete deniers of piles and piles of evidence.
The only "controversy" - mostly manufactured - was whether it was the product of alien sentience. The problem with any "sentient alien" hypothesis, though, is the same as the problem with Bigfoot: we can't prove Bigfoot doesn't exist; we can only continue to not find evidence of her existence.
During a press conference on November 19, NASA confirmed the icy rock poses no danger to Earth, and contrary to certain conspiracy theories, is not an alien spacecraft.
The "alien" people weren't necessarily spouting conspiracy theories, though. Just wishful thinking and projection. Any true conspiracy theorist would take one look at NASA's denial, and consider it proof that they're hiding something.
"It expanded people's brains to think about how magical the universe could be," said Dr. Tom Statler, lead scientist for solar system small bodies, during the livestream announcement.
I remember when I was a kid, fascinated by astronomy, there was talk about comets or other visitors from other star systems. Much like with the detection of extrasolar planets, though, it was only recently that we actually confirmed their existence.
The universe is strange enough, but people are strange enough to have to try to make it even stranger. I actually kind of love that about people. It's only when they take it too far and replace reality with one of their own that I get disgusted.
There's more at the link, including actual images from actual telescopes, but fair warning: the images aren't exactly breathtaking, not like the famous pictures of nebulas and such from Webb or Hubble.
Mostly I just wanted to reiterate that it's not aliens.
It's still pretty cool, though.
December 14, 2025 at 8:32am
Here's a rare occasion when I talk about sex, thanks to Nautilus.
And already I have issues.
The headline may seem neutral enough, but then you get to the subhead, and it uses "fidelity" as a synonym, which conveys an implicit bias due to the positive connotations of "fidelity." And then you get to the evolution part, and wonder about direction of causality: did sexual practices shape our evolution (other than in the obvious sense of enabling evolution to continue), or were our sexual practices shaped by evolution? Or some synergy between them?
Well, substitute "I" for "you" in that paragraph. You know what I mean.
And let's not undersell the worst implicit assumption there: the primacy of heterosexuality.
Across cultures and millennia, humans have embraced a diversity of sexual and marital arrangements - for instance, around 85 percent of human societies in the anthropological record have allowed men to have more than one wife.
"Allowed?"
And yes, it's almost never the other way around.
Still, remember what Oscar Wilde said: "Bigamy is having one wife too many. Monogamy is the same."
Anyway. If that 85% figure is correct, and I have no facts to contradict it, then we should be considering polygamy—not monogamy, not polyandry, not any other mutually agreed-upon relationship—to be the default for humans.
The problem with polygamy as a cultural norm, though, apart from Wilde's quip, is math. The proportions just don't work out, unless you send a lot of your young men off to die in war. Which, of course, a lot of cultures did. Or unless you also accept polyandry, which, given the patriarchal, hierarchical nature of most societies, ain't gonna happen.
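To put rough numbers on that "the math doesn't work" point, here's a toy back-of-the-envelope sketch. It's my own illustration with made-up figures, not anything from the article:

```python
# Toy arithmetic: with a roughly even sex ratio, every extra wife one man
# takes is a wife some other man never gets. All numbers are hypothetical.
men = women = 1000            # assume a 50/50 sex ratio for illustration
polygamists = 100             # hypothetical count of men with multiple wives
wives_per_polygamist = 3      # hypothetical

wives_taken = polygamists * wives_per_polygamist
women_left = women - wives_taken
monogamous_men = min(men - polygamists, women_left)
unmatched_men = men - polygamists - monogamous_men

print(f"{unmatched_men} of {men} men end up with no wife at all")
# -> 200 of 1000 men end up with no wife at all
```

Scale the made-up numbers however you like; the leftover men don't go away.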
I'd be remiss if I didn't note that "monogamy" itself, the word, literally means "one wife," while "polygamy" obviously means "multiple wives," thus reinforcing the male-primacy point of view. But words can change meaning over time, and for the sake of convenience, just assume that whenever I use a word like that, I'm referring to sexual partners of any gender.
But in the broader evolutionary picture, some researchers have argued that monogamy played a dominant role in Homo sapiens' evolution, enabling greater social cooperation.
"Some researchers." Right. It couldn't be the ones with an agenda to push, could it?
This theory aligns with research on mammals, birds, and insects, which hints that cooperative breeding systems - where offspring receive care not just from parents, but from other group members - are more prevalent among monogamous species.
I'm not sure we should be labeling it a "theory" just yet.
To decipher how monogamous humans actually have been over our evolutionary history, and compare our reproductive habits to other species, University of Cambridge evolutionary anthropologist Mark Dyble collected genetic and ethnographic data from a total of 103 human societies around the world going back 7,000 years. He then compared this against genetic data from 34 non-human mammal species. With this information, Dyble traced the proportion of full versus half siblings throughout history and across all 35 species - after all, higher levels of monogamy are linked with more full siblings, while the opposite is true in more polygamous or promiscuous contexts.
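To make that sibling logic concrete before I pick at it, here's a toy simulation. It's mine, not Dyble's actual method, and every parameter is invented:

```python
import random

def full_sibling_fraction(n_mothers=1000, kids_per_mother=4,
                          father_pool=5, monogamous=True):
    """Toy model: what fraction of sibling pairs share both parents?

    Under strict monogamy, every child of a given mother has the same
    father, so all sibling pairs are full siblings. Under promiscuity,
    each child's father is drawn independently from a small pool.
    """
    full = total = 0
    for _ in range(n_mothers):
        if monogamous:
            fathers = [0] * kids_per_mother
        else:
            fathers = [random.randrange(father_pool) for _ in range(kids_per_mother)]
        for i in range(kids_per_mother):
            for j in range(i + 1, kids_per_mother):
                total += 1
                full += fathers[i] == fathers[j]
    return full / total

print(full_sibling_fraction(monogamous=True))   # -> 1.0
print(full_sibling_fraction(monogamous=False))  # -> about 0.2 with a pool of 5
```

The study runs that inference in reverse: more full siblings in the genetic record gets read as more monogamy.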
I have many questions about this methodology, not the least of which is this: humans don't have sex for procreation. We have it for recreation. Procreation is a byproduct, not a purpose, most (not all) of the time. A study like that, concentrating on births, completely ignores the purely social aspect of multiple partners.
As support for my assertion there, I present the bonobos, our closest primate relatives, who engage in recreational sex all the time.
Most species don't seem to use sex for recreation, though I'm hardly an expert in that regard. This makes humans (and some other apes) different from the birds and the bees, and using other animals as models for the ideal human behavior is a good example of the naturalistic fallacy.
Point is, I submit that using live births as the sole indicator of degree of polygamy is just plain wrong, and will lead to incorrect conclusions.
"There is a premier league of monogamy, in which humans sit comfortably, while the vast majority of other mammals take a far more promiscuous approach to mating," Dyble said in a statement, comparing the rankings to those of a professional soccer league in England.
And with that, he betrays his bias.
This doesn't mean that the study doesn't have merit, mind you. It might be useful for drawing other conclusions. I just don't think it means what he, and the article, claim it means.
Meanwhile, our primate relatives mostly sit near the bottom of the list, including several species of macaque monkeys and the common chimpanzee.
Heh heh she said macaque.
Let's not forget another important thing: a species will almost always have the reproductive strategy that works for that species. It could be pair-bonding. It could be complete promiscuity. It could be something in between. Whatever works for that niche.
Personally, I'd hypothesize that humans fall somewhere in the middle for the simple reason that we live for drama, and what's a better source of drama than who's doinking who? Hell, it's the basis for at least half of our mythology, and all of our soap operas.
There's more at the article, obviously. I just want to close with this:
Unlike other humans, I don't care who's doinking who. The only thing I care about is that everyone involved be consenting or, better yet, eager. You want to be monogamous? Find another monogamist. Poly? Find other polys. Single? Get some good hand lotion. I. Don't. Care.
But if you agree to a particular lifestyle, whatever that may be, and then you go behind your partner's or partners' backs? That's what I consider wrong. Not, like, on the same level as murder or theft or wearing socks with sandals, but still, it's something to be ashamed of.
December 13, 2025 at 9:18am
More support for my "science fiction is the most important genre of literature" hypothesis, from a not-so-happy Medium:
Because those are references to Nineteen Eighty-Four and Brave New World, respectively, and, despite being Very Serious Literature studied by Very Serious Literature Students, both are indisputably science fiction.
Everyone's worried about 1984, right?
Well, no. 1984 was, as the article footnotes, a damn good year. They mention Ghostbusters from that year, but it's also the year Born in the USA came out, and we were still near the beginning of our long downward slide. Michael Jackson amused us all by lighting himself on fire, This is Spinal Tap turned everyone up to eleven, and the first American chick to do a spacewalk did a spacewalk. The greatness of the year was only marred by the re-election of Reagan, which was the direct proximate cause of most of our current problems.
Horrible authoritarian government that monitors its citizens every second of the day and controls them with a tight fist. You know the drill.
Yes, and I wish more people would actually read it instead of pretending to have read it.
And sure, the authoritarian world of 1984 is something we want to avoid. I think we can all agree on that.
For various definitions of "we," sure. I can think of several people right off the top of my head that would love it.
Dictatorships are a wonderful form of government. For the dictator.
But thereâs another style of dystopia that weâre heading towards instead. One that we should be more worried about.
The Fallout universe? No, that would at least provide some level of amusement for the survivors.
It's a track the internet started us down, and generative AI is accelerating us down.
Oh no. No no no no. Don't blame the internet.
For those who haven't read "Brave New World" by Aldous Huxley, it paints the picture of society where every country is united in a giant World State.
Okay, full disclosure, I last read both of those novels sometime before the year 1984. But I did read them.
Society has strictly defined and enforced social classes, where the upper classes live luxurious lives, while the lower classes do all the hard work for less reward.
As I'm sure you know, we're already in that part of the book. The internet didn't cause that. Reagan did. (Don't give me shit about that. That's the actual predictable outcome of Reaganomics.)
What makes the World State interesting is how they keep the lower classes in check. In 1984, the people are controlled through fear.
I mean, it's effective.
In "Brave New World", the government encourages everyone to take a drug called "Soma", a drug that is "euphoric, narcotic, pleasantly hallucinant."
If you're happy and you don't know any better, that can feel like true freedom.
It's one reason I've ragged on the idea that happiness is a worthwhile goal, by itself, to pursue.
Soma, the fictional drug, creates a sense of comfortable complacency. It doesn't make people happy, but it distracts them from being bored or sad, and keeps them just satisfied enough to keep doing their jobs.
If that book were written today, the drug would also have to boost productivity.
It's hard to see Soma now without comparing it to the modern internet. The explicit goal of TikTok, YouTube, Instagram, Reddit, and basically every other part of the internet is to give you enough of a dopamine spike to keep you addicted to them.
Ehhhhh... I'm skeptical. While I avoid all of those (except the occasional hit of YouTube), other articles have ranted about how those platforms, and others, exist to keep people out- and en- raged. But definitely also engaged. (There's no such thing as outgaged, is there?) Engagement doesn't require pleasure spikes. And rage is, like, the opposite of a dopamine effect.
Video games, now, I could see the dopamine thing. But nothing's going to get me to say a bad thing about video games, unless they're the free kind that come with commercials, or the other kind where you have to keep paying to play or get cool in-game stuff.
The article goes on to do a fairly run-of-the-mill rant against generative AI and its companies, and you can read it if you want, but I'm fairly certain that my intelligent, perspicacious, well-informed and remarkably good-looking readers have already heard something similar.
Here's the thing about dystopias. They're only bleak for the people on the bottom. There's always a group on top who wins out.
I just said that, up there.
But then people ask me "Okay, but what do I do about it? How do we prevent this dystopian future?"
One of the most insidious aspects of mass media is that it almost always gives us the hope that there are things that we, personally, can do to prevent [whatever catastrophe]. Sometimes, they might be right. But not always. And it just leads to people stressing out more. What can we do about that? I don't know, and if I did, I wouldn't want to add to your stress by stating it.
It's like "The missiles are coming. But there's still hope! Get to a bomb shelter in the next 10 minutes!"
Or, "An asteroid is on collision course with Earth. What can you do to save yourself? You can try to leave Earth now if you can."
To summarize, their "prevention" is: don't use AI. And I can understand that. But the problem is, it's not going to happen, just like the solution to the problems of social media is "don't use social media" and people do it anyway. Or how the solution to terrible working conditions in the preparation for the World Cup was "don't watch the World Cup," but people did anyway, and advertisers got the message: doesn't matter how authoritarian the regime is, doesn't matter how many enslaved or conscripted workers they had, people will put eyes on our ads because it's a sportsball championship.
At this point, not using AI is like "you can help fight global warming by not having a car." It works for some people (hell, it worked for me for over a year), but you're fighting an uphill battle suggesting it. Your individual contribution is like a bucket of piss in the ocean. Nothing more. And you can't do anything at all about the billions of other buckets of piss, or the several thousand Colorado-sized swimming pools full of piss which, in this disgusting metaphor, represents the big companies.
Don't get me wrong; I'm not a fan of most AI (I have to admit I'm amused by the image-creating ones) or the tech companies that are shoving it down our throats. But I do think the article is dead wrong about one thing: it assumes that it's either 1984 OR BNW, when the reality is... we're heading for both at the same time.
December 12, 2025 at 9:43am
Ah, yes. December. An appropriate time to think about Cheeses, our Savour.
After three rants in a row, it's nice to get back to a safe topic like delicious ch-
Wait.
"TikTok hack?"
If you are adding grated Parm to a pasta sauce or really need it to melt into your dish, freshly grated is the way to go.
That's true, but, and hear me out here, the easiest way to grate Parmesan is to buy it pre-grated.
I admit it's not as tasty as the fresh stuff, but all I'm doing is adding a bit of extra flavor to my sad, lonely, bachelor microwave pasta dish.
This hack leaves all the work up to one staple kitchen appliance: the blender.
Okay, so, let me get this straight.
You think it's easier to grate the cheese in a blender than to use a handheld cheese grater?
You're not thinking this through. You have to factor in cleaning. Cleaning a handheld grater is dead easy if you have a dishwasher: throw it in there. Cleaning a blender, on the other hand, is a chore.
Start with a wedge of your favorite Parmesan cheese. Cut the cheese into several cubes. No special knife skills needed!
Ah, yes. And then there's this additional step. And you do need special knife skills; Parmesan is a hard cheese, and unless you know what you're doing, you could add fingertip to your pasta topping.
Why is Freshly Grated Cheese Better than Pre-Packaged Products?
Because the Bulk Cheese people paid us and the Pre-Packaged Cheese people didn't.
...okay, no, I agree that freshly grated is best. But sometimes laziness is better than not-laziness.
The article does go into some semi-technical reasons why it's best to Make Yourself Grate Again, and you can go there if you want to know.
Anyway, that's all for today. No deep philosophical insights, no questionable scientific studies, no articles chock-full of confirmation bias. Just some nice, cheesy jokes.
December 11, 2025 at 9:46am
What am I doing on a site called VegOut? Glad you asked. Just questioning the validity of this article.
*taps cane*
I'm telling you. Kids these days.
...shall we start with the obvious? If this really applied to everyone, or even to the majority, in my cohort, then that's the polar opposite of "set you apart."
So even with the headline, my inner cynic is poking out of my pants, looking around, and going, "Yeah, right."
Growing up in the 1960s or 70s wasn't just a different era - it was a different psychological environment entirely. No smartphones. No constant supervision. No algorithm telling you what to think. You were shaped by freedoms, challenges, and cultural norms that simply don't exist for today's kids.
And spankings. Lots and lots of spankings.
(I'm not saying that was right, mind you. It's just the way most parents did it back then, because they, too, were shaped by their pasts.)
And psychologists agree: the way your brain developed during those formative years left you with cognitive patterns, emotional habits, and mental strengths that set you apart - for better and for worse.
I can't disagree that experience during one's formative years is a big factor in one's personality. Without getting into the age-old debate of nature vs. nurture, both have some role to play. What makes my skepticism rise is the idea that we all, or most of us anyway, had experiences similar enough to make sweeping generalizations.
My own upbringing was unique. Others were even more rural. Many were suburban, or urban. I feel like those environments mold people differently. And I'm not even going to get into how the article is US-centric; let's just acknowledge that it is, and run with it.
About the only thing I can think of was what we all, indeed, had in common: TV was three commercial channels, plus PBS, broadcast over airwaves and received by antennas. Radio was our only source of music, and, for me at least, they had both kinds: country AND western. Worst of all, no video games, at least not until the late 70s, which also featured disco, so kind of a mixed blessing there.
I'm not reminiscing about this out of some nostalgia, mind you. There was a lot to dislike about the situation. Let's not forget how the military action in Vietnam affected all of us at the time.
Point is, I feel like a lot of these things do, in fact, apply to me. But that doesn't make my skepticism go limp.
As someone who writes about psychology, mindfulness...
Normally, I'd Stop Reading Right There.
...and the changing dynamics between generations, I've noticed something fascinating: people who grew up in the 60s or 70s often share certain traits that younger generations don't naturally develop.
Okay, but a statement that general, couldn't it apply to every "generation," however you want to define that?
The article gets into the specifics of the "certain traits," but swap those out, and you're back to making sweeping generalizations about any generation.
1. You developed resilience through boredom, not stimulation
Kids today are never bored - they have a screen in their pocket that delivers instant entertainment on demand. But in the 60s and 70s, boredom was your constant companion.
I remember being bored on occasion. I always found something to do, though (living on a farm'll do that). If nothing else, I loved reading books. And there weren't a lot of other kids around; the ones that were, were delinquents (one of them is in prison for murder one right now).
I'm not saying I was an angel, but I was an only child in a rural area and I had to get creative. But I'm certain boredom wasn't a "constant companion."
This is why so many older adults are naturally more grounded and less dependent on external stimulation. Your childhood literally trained your brain to be comfortable sitting with your own thoughts.
It couldn't be just because older adults of any generation generally become more grounded and less thrill-seeking. Could it?
2. You learned independence because no one was tracking your every move
On this point, I provisionally agree. I'm not sure how that's better, though. There's something to be said for training your kids to grow up in a surveillance environment. Makes 'em more paranoid and less likely to do some of the shit I got away with. This is what Elf on a Shelf is for, by the way: letting kids internalize that someone, somewhere, is always watching.
3. You became socially adaptable long before the internet existed
Growing up in a pre-digital world meant you had to socialize the old-fashioned way:
Face to face.
At school.
On the street.
On the phone attached to the kitchen wall with a cord that could barely reach the hallway.
I'm not convinced this is better, despite having lived it. Different to today? Sure. Better? To use the catchphrase of my generation: Meh. Whatever.
4. You experienced consequences directly, not digitally
When today's kids make a mistake, they might get a notification, a screen warning, or a parent stepping in immediately.
Much as I, like anyone else, am tempted to think how they grew up was "the way things ought to be" and "better," again... not convinced.
For instance, there is something to be said for being able to chat in real-time with people all over the globe. The potential is there to acquire a wider perspective, not just be limited to your own neighborhood, region, or country.
5. You grew up with scarcity, so your brain understands value
Whether you grew up comfortable or not, the 60s and 70s were an era before excess, convenience, and instant gratification.
This actually made me laugh out loud. Not because I felt like we were rich or anything, but because my parents were older, and lived through the actual Great Depression. We didn't have scarcity. We had convenience. Sure, we weren't ordering crap from the then-nonexistent internet, but, and I can't stress this enough, we lived like royalty compared to the shit my parents went through when they were young.
Not that there weren't people my age living in poverty, but you can say that about today's situation, too.
6. You developed patience and attention span from slower living
And promptly threw them out the window as soon as I possibly could.
There's more at the link, but my attention span is already overtaxed.
The one thing I'll say that stands out to me: when I was a kid, we only had to find a way to find the fit in / stand out balance with maybe 100-1000 other kids, at school or whatever. Call it 500 for argument's sake. You can find a way to stand out against a crowd of 500. Maybe you're the best musician. Maybe you're the best artist. Maybe you're the best at math, or science, or bike-riding, or delinquency. Or you excel at football, or have the nicest tits. (For me, it was being a comedian. Class clown five years running, baby!) Now, kids aren't competing against their school group, but among millions of other kids from all over the world. How do you stand out in a crowd like that? Some do, of course.
The rest? Meh. Whatever.
December 10, 2025 at 9:48am
I don't expect this BBC article to be of interest to everyone. But, as someone who was adopted, I have Opinions.
I will note, for those not wanting to click on the link (but come on, it's the BBC, not some malware site), that the article opens with a big pic of Tom Hiddleston as Loki from the MCU.
Hollywood blockbusters and horror films frequently using adopted children as psychopaths and villains causes harm in real life, adoptees have said.
Okay. I'm not going to argue that such a thing can't cause, or hasn't caused, harm. I don't know. What I do know is that the same can be said for any minority: villain-casting has the potential to reinforce stereotypes if the writers are careless. Though the same can be said for hero-casting them.
James Evans, 23, was two-and-a-half months old when he was removed from his birth family due to their inability to parent and harmful behaviour.
Now with a masters degree in scriptwriting, James said films such as Thor, Annabelle and The Conjuring: The Devil Made Me Do It, among many others, made him "frustratingly uncomfortable" at how adoptees are depicted.
I didn't see those last two, but they sound like horror movies. No one in a horror movie, generally, is portrayed in a great light.
He was fostered by two families before Ruth and Andrew Evans adopted him when he was two and said no film or TV series had ever made him feel "properly seen".
Well... if he feels that way, he feels that way. I suppose there might be unresolved trauma, given his unfortunate first two years. My situation was different (adopted as an infant and kept in a stable, if not perfect, home). (What family is perfect?)
One of the most high-profile adoptees in cinema is the Norse god of mischief Loki in the Marvel films.
This bit, though, I can say something about.
First of all, in that movie, Thor himself starts out as a massive, throbbing cock, and he's the one who's the biological (or however it works in Asgard) offspring. The whole movie is his redemption arc. Loki is... complicated, I'd say more a foil than a villain, and really shows up as a major villain in The Avengers.
But what the article doesn't talk about, and may not know, is that Loki gets his own redemption arc, in the form of the series that bears his name. I'm not going to spoil it, but damn, I can't think of a more powerful redemption arc in all of literature.
Yes, I said literature. Shut up.
These stories reinforce damaging stereotypes of adopted people as imposters or "devil children" where trauma is used as a "lazy" plot device for evil, he said.
Okay, like I said, stereotyping is bad. But writers use trauma as a background for villainy with bio children, as well.
The other end of the spectrum is the "grateful adoptee", when a child's adoption is seen as a fairy tale ending, such as Miss Honey taking in Matilda in the Roald Dahl book and subsequent films.
Another one I'm not familiar with, but having read other Dahl, I find it difficult to believe it's truly a fairy tale ending.
This ignores "the loss and grief" of children being taken away from their birth parents, James said.
Sheesh, talk about stereotypes.
I've no doubt there are Issues involved there. But, again, Issues can stem from a lot of childhood trauma, not just having been adopted.
While James has been "loved and cared for" and has "the best support system" in parents Ruth and Andrew, he said just because his trauma was invisible, does not mean he did not need help.
To be clear, I am not trying to minimize or mock his lived experience. Again, though, lots of people need help.
James said the portrayal of adoptees through the fairy tale lens was as damaging as being presented as villains as it tells society they were ungrateful if they behave outside this stereotype.
You think that's bad? Try being a fairy-tale stepmother.
"If an adopted child's parents are parenting them, they are their real parents.
"They are the ones who are there every day fighting for their child and that is real parenting. Biology isn't fundamentally what defines parenting, it's what you do."
On that, I am completely in agreement. My real parents are the ones who changed my diapers, kissed skinned knees, and put up with my teenage bullshit. Not the ones who happened to share five minutes of fun.
Despite all this, James and Susie highlighted some good portrayals.
And yet, the ones they highlighted don't paint the whole picture.
See, there's another adoptee in literature. Probably the most famous one. The one that is the most canonically not a villain, but the absolute polar opposite thereof, despite his trauma.
I'm talking about the guy in the blue suit and red cape.
Now, I'm not saying they can't do better. That we can't do better, as writers. But I object to the idea that adopted people should never be villains. Just like with anyone else, they have agency. We're not angels; we're not monsters. We're just people. It would be like saying "We can't show this Black dude as a gangster, because that would make people think that all Black dudes are criminals."
As for me, if I want to see a positive role model for my own experience, all I have to do is look. Up in the sky.
December 9, 2025 at 8:28am
I generally prefer to share the less mainstream (at least in the US) material in here, but this one from CNN angered me enough to make me save it.
Just yesterday, I was text-ranting to a friend about this very thing, not knowing that this one would come up at random the very next morning. Here is, in part, what I said (with maybe some clarifications and autocorrect decorrections):
Look, lending has so far managed to mostly cover up just how POOR most people are.
And it's worse than that.
By making something affordable [through loans], sellers adjust prices upwards faster than otherwise.
College tuition used to be pricey but not outrageous. Then student loans. Then it shot into orbit.
Cars, also faster than inflation.
I'm not judging people for going into debt. Mostly.
I'm judging the fucking system.
They want us to stop owning.
They want us to rent everything.
They want subscriptions, not sales.
We're meat for the table.
This isn't capitalism. This is feudalism. Neo-, maybe.
Not bad for an off-the-cuff text rant, I guess, but not as much thought as I usually put into these blog entries.
Hence, the article.
Stubborn inflation continues to make the cost of living unbearable for many Americans.
Is it inflation? Or is it no one wanting to pay them?
A number of inventive solutions have emerged - but with a common theme: putting consumers deeper into debt.
I don't think I have to tell you what I think of that, after the above.
This week's 50-year mortgage proposal from the Trump administration is the latest example of the trend.
I swear to all the gods, I don't know how people survive. I don't have the best memory, I know, but I do remember seeing 60-year mortgages being offered a while back. Doesn't anyone else remember that? And by "a while back," I mean "just before the entire house of cards collapsed in 2008."
They were bad ideas then, and they're bad ideas now (there's not a lot of difference between 50 and 60).
I swear I'm not getting political, here. They're bad ideas no matter who's running the show, or pretending to.
The potential for a 50-year mortgage comes as the auto industry has been pushing seven-year car loans, which have become an increasingly popular option with the average price of a new car hitting a new record of more than $50,000.
Don't you get it? The prices will simply adjust upwards again to compensate for the increased demand.
And the explosion of buy now, pay later options online and at brick-and-mortar retailers has normalized taking on longer-term debt for purchases as small as food delivery.
Come to think of it, I haven't heard much noise about predatory payday lenders recently. Did someone finally put a band-aid on that chainsaw wound?
Case in point: While a 50-year mortgage could lower monthly payments, the amount of interest a borrower would pay over 50 years could be double what would be paid at current rates over 30 years, the traditional length of most mortgages.
Oh, but it's worse than that. Way worse. I can't be arsed to pull up an amortization calculator right now, but in very general terms, earlier mortgage payments are mostly interest, not much principal. It takes, if I recall correctly, about 20 years or so for a 30-year mortgage without pay-aheads to get to the point where one month's payment is half-interest, half-principal.
That amortization would be even less favorable with a longer-term mortgage. You could be paying for years, and not build significant equity in your house.
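Since I claimed I couldn't be arsed to pull up an amortization calculator, here's a quick sketch of one. The formula is the standard one; the price and rate are round numbers I made up, not anything from the article:

```python
# Standard amortization math; the 6.5% rate and $400k principal are
# hypothetical round figures, chosen only to show the shape of the problem.
def mortgage(principal, annual_rate, years):
    r, n = annual_rate / 12, years * 12          # monthly rate, number of payments
    payment = principal * r / (1 - (1 + r) ** -n)
    return payment, payment * n - principal      # monthly payment, lifetime interest

for years in (30, 50):
    payment, total_interest = mortgage(400_000, 0.065, years)
    print(f"{years}-year: ${payment:,.0f}/month, ${total_interest:,.0f} total interest")

# 30-year: ~$2,528/month, ~$510k total interest
# 50-year: ~$2,255/month, ~$953k total interest
```

Stretching the term saves a couple hundred bucks a month and nearly doubles the lifetime interest, and since early payments are mostly interest either way, the equity crawls.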
You're basically renting the damn thing from a bank. A bank who, if someone fat-fingers something or just loses some paperwork, can just take it right away from you.
Not to mention that I question the ability of recently-built houses to even last 50 years.
There's more at the link, but honestly, I'm too riled-up to rant any more about it today. You may be wondering why this pisses me off so much, considering that I'm retired with a fully-paid-for house and car. Well, it's simple: I don't just think of myself, and I wish other people would do the same.
Just remember: if you can't pay cash for it, you can't afford it. Yes, this includes houses, though there's usually good reason to accept debt for a house. And if you're thinking, "But, Waltz, I couldn't afford anything if I couldn't take out a loan for it," then you may be beginning to see my point here: that we're not free agents, but locked into perpetual servitude to our Overlords.
Like I said. Neo-feudalism.
EDIT:
So, after some feedback, I realized I might not have been as clear as I thought.
This still might not make things clear, but I'm going to try:
Whether to rent or buy a home/condo depends on many factors, including personal emotional ones.
But in general, the longer you expect to stay in a place, the more sense it probably makes to get a mortgage.
BUT with a nonstandard, >30 year mortgage, you're paying more up front in interest, making it not much different from renting (though circumstances may vary).
I say "if you can't pay cash for it, you can't afford it." But there are circumstances, such as buying a house, where it's okay that you can't afford it.
Cars are kind of a gray area.
Finally, when I rant against "the system," what I mean specifically isn't standard mortgage loans, but them making it easier and easier to borrow for other things, from education right down to everyday items such as groceries and that GameBoy you've had your eye on (in other words, consumer goods that are both necessities and not-so-necessities). This can give the false impression that you can afford something, when you actually cannot.
Now, I could be *wrong* about any of these things. I'm not quite infallible. But I hope that clears up some of what I said up there.
December 8, 2025 at 11:13am
Today, we have Mental Floss with another blurb about how fun English can be.
Well, I bloody well have now. (Or at least seen them type it.)
The English language is certainly bizarre in the best way.
For alternative definitions of "best."
Some of it is totally run-of-the-mill, and some of it is full of words that only seem to appear in one extremely specific situation.
I haven't seen MF try to explain idioms like run-of-the-mill (they might have and I missed it), but nothing about our language is truly ordinary, once you really look at it.
So let's take a little stroll through eight words that only show up in one weirdly specific context.
You know, there's another way to look at this, too. What we call a "word" is pretty arbitrary. You can take two words and mush them together, like "homework" or "housework" (which are, absurdly, not the same thing at all). So another perspective is that these featured phrases are actually words that just happen to have a space inserted in them somewhere.
Inclement (Weather)
For that matter, no one ever describes the weather, or anything else, as "clement."
Bode (Well/Ill)
Bode is a free agent in theory, but let's be honest: you've only ever seen it next to "well" or "ill."
Free agent? Nah, it's in a threesome.
Hermetically (Sealed)
Now, this word is a little "underground," if you will. Hermetically sealed sounds like something out of a sci-fi lab, but it mostly refers to food packaging...
This is where I get to rant: if it's being used for food packaging, that's- well, not wrong, per se, but a distortion of its original intent.
It's a bit complicated, but despite Hermes being the Greek equivalent of Mercury, it has nothing to do with the god Mercury, the planet Mercury, or the element mercury. Instead, it's related to the Egyptian god of wisdom and knowledge, Thoth (I told you language was weird.)
Thoth was said to have invented a magic seal that could keep vessels airtight (presumably useful in certain Egyptian interment procedures). But he's best known for being the mythical founder of certain occult orders, and the adjective "hermetic" originally referred to these spiritual practices, and was basically a synonym for "occult."
So if you use, or see, "hermetically sealed," just remember you're communing with deep, ancient spirits. Treat them with respect.
Pyrrhic (Victory)
There's good reason for this one, which is derived from some ancient general who won a battle, but at too great a cost. It's not like you could have a Pyrrhic loss. Or a Pyrrhic anything else at all.
Contiguous (United States)
I'm going to quibble with this one; I'm pretty sure I've heard the word used in math contexts.
More at the link if you're interested. Mostly I just wanted to rant about the "hermetically" entry.
December 7, 2025 at 10:24am
I found another rare (these days) article from Cracked which isn't all about celebrities or movies.
It's not that I have some sort of snobbish disdain for the entertainment industry. I like shows and movies. But I don't give a damn about their actors' personal lives, usually, and I'd simply rather talk about what I consider to be more important things, like the pooping from yesterday.
So the website is kind of a pale echo of its former self, though I still get highlights from them. This was interesting enough to share. And wonky enough to quibble about.
We learn more and more about our universe all the time, but there's recently been a huge spike in stunning cosmic observations.
Also a huge spike in stunning bullshit, like the nonsense about Comet 3I/Atlas. Yes, it is, by the most technical interpretation of the term, an alien invader. No, it's not "aliens." One of the reasons I'm sharing this article is that it mentions that comet without giving the "aliens" hypothesis the oxygen it truly doesn't deserve.
As an aside, I miss the days when they'd name comets after the people who discovered them. As I understand it, most early comet detection is automated now, hence the naming after robots instead of meat. Could we at least give them fun names? Even if we have to name them after celebrities or some shit. I'd suggest letting people bid on naming rights, but Muskmelon would win every time, and the comet names would be like X, X-1, 2X, X3, XXX, X35 (which is "sex" spelled backwards), or 3I/Atlas.
Anyway, the article is, as usual, a countdown list, and there are pictures, and I'm only going to comment on a few of them (no pix).
15 Astronomers recently discovered 128 new moons orbiting Saturn
To date, Saturn now has 274 known moons.
I'm pretty sure I've mentioned this, or something like this, before. You know how when they redefined what "planet" meant, and the definition ended up excluding Pluto, and a whole bunch of people who otherwise only gave a shit about celebrities and who they were doinking suddenly became space critics? Well, I think they need to do a definition for "moon." See, right now, as far as I know, there's no fixed lower limit on how big something has to be to be called a "moon;" it just has to orbit a planet or dwarf planet. So, Saturn also sports the second-biggest moon in the system: Titan, aptly named. That fat bastard's bigger than Mercury.
But I'm also guessing Saturn also controls one of the smallest moons in the system: a tiny grain of dust somewhere in its awesome set of rings.
Yes, every particle in those iconic rings can be thought of as a moon, because it orbits the planet.
Point is, Saturn doesn't have 274 moons; it has millions. Just because we can't resolve all of them individually doesn't make what I'm saying less true. We can't resolve individual stars in distant galaxies, either, but they're still stars.
It is, however, a categorization issue, not a science one.
13 Uranus' 29th moon
A convincing argument against free will is that you just now thought of a "your anus" joke.
Hidden inside the planet's dark inner rings, new observations from the James Webb Space Telescope found Uranus' 29th moon.
And that you're trying desperately to make a "dark inner rings" joke to complement that.
Anyway, see above for "what's a moon" quibbling.
11 Martian Dust Devils
Someone really needs to make that a band name. Or a sportsball team name.
8 Jupiter's neon light show
This is cool and all, but this made me do actual research to see if "neon" meant the pretty colors, or the actual noble gas called neon that, when properly stimulated, makes pretty colors. But neon isn't a significant component of ol' Jupe's upper atmosphere, so I'm gonna go with "pretty colors."
5 Comet C/2025 R2 SWAN
Seriously, folks. If you can change the definition of "planet," you can change the comet-naming convention. No one wants to call them by something more suitable for being an OnlyFans password.
4 The universe is getting colder and slower
Aren't we all? Welcome to the party, pal.
Like I said, there's more at the link, and pretty pictures too. You might have to put up with ads and popups; I don't know, because my computer wears two condoms when it probes the internet.
December 6, 2025 at 9:34am
Science: investigating the really important stuff since 1666.
The Number One reason to go Number Two?
You've downed a cup of strong coffee, and soon you have an urge to poop.
You may be wondering how those of us who don't drink coffee can manage to pinch a loaf every now and then. I mean, sometimes I'm backed up for days or weeks, watching my friends who do drink coffee answer the call of doody and return with a satisfied smile on their faces. I'm jealous.
...Of course I'm joking. Everybody poops.
After you've done your business, you feel a sense of relief. So why does that bowel movement feel so satisfying?
You know, science isn't really about answering "why" questions. They tend to multiply themselves. Better to ask "how."
There are many physical, behavioral and psychological factors that could contribute to this feeling.
I'd think there would be a sound evolutionary reason for it: if it hurts, you don't do it as much, and if you don't do it, you get sick and die, and if you get sick and die early enough, your genes don't get passed on.
As the bowels fill up, nerve endings communicate an uncomfortable stretching sensation to the brain.
Except, presumably, in people who have somehow wrecked 'em.
Typically, thanks to the external sphincter, we don't immediately poop.
Young-person-like typing detected.
Emptying out the bowels by releasing stool relieves this pressure, which feels good.
Can't recommend the smell, though.
"When you relieve the distension, areas like the anterior cingulate gyrus and the insula show a reward response," she said. These regions of the brain play a role in reacting to pain and relief of pain.
Okay. That's still not "why."
The gut communicates to the brain via the vagus nerve, one of the major cranial nerves. Evacuating the bowels stimulates the vagus nerve. This can lower a person's blood pressure and heart rate, creating a relaxing feeling, Person said.
This feels like a circular argument. Stimulating the vagus nerve makes you feel good. Dropping the kids off at the pool stimulates the vagus nerve. Therefore, laying cable feels good.
Don't get me wrong; it's good to investigate the mechanisms behind bodily functions. It might help doctors figure out how to fix you. But it's still a "how" thing, not a "why" thing.
Or maybe I'm just full of shit.
December 5, 2025 at 10:04am
Wading back in to what we writers work with, here's a listicle about words from Mental Floss:
You know, sometimes, the perfect word doesn't exist, so we have to create it. It worked for Shakespeare. It worked for Charles Dodgson. Fantasy and science fiction are especially prone to the creation of new words, because they deal with new (to us) worlds. Though, admittedly, sometimes they go overboard with it.
Anyway, the article.
Language is ever-evolving, with new words springing up from a variety of places. Some are borrowed from other languages ("karaoke"), others are two words blended together ("doomscrolling"), and some are simply shortened ("decaf").
And sometimes, we just make them up because we feel like it.
As I've said numerous times, all words are made-up. It's only a matter of how long ago.
Science fiction is a particularly bountiful genre for the introduction of new words, in large part because authors come up with unique and otherworldly terms to describe their sci-fi worlds.
Like, where would we be without "frack" from the original Battlestar Galactica?
As usual, I'm only going to comment on a few of them here.
Robot and Robotics
The word "robot" can be traced back to Czech writer Karel Čapek and his sci-fi play R.U.R. (1920).
I did a whole entry on that last month: "No Ifs, Androids, or Bots" 
Grok
Robert A. Heinlein's Stranger in a Strange Land (1961) follows Valentine Michael Smith, a human born and raised on Mars, as he experiences Earth for the first time.
No word is sufficient enough to express my white-hot anger at having this word appropriated for nefarious purposes. Stranger was a life-changing novel for me, and I've read all of Heinlein's published works. Yes, even the weird, self-indulgent, freaky ones. (I'm not saying I loved all of them.) If there's one writer I can credit for instilling in me a lifelong love of science fiction, and reading and writing in general, it's Heinlein. Well, also Niven. But mostly Heinlein.
So Muskmelon comes along and, first, ruins the good name of Nikola Tesla. That was bad enough. Then he goes and appropriates grok?
My anger burns with the fiery power of a million supernovas.
Metaverse
In 1992, Neal Stephenson's Snow Crash introduced the word metaverse to the world. Set in a dystopian future, characters use VR headsets to connect to a universally used virtual world called the "metaverse."
I don't have nearly the same level of seething rage over this appropriation.
For me, the most memorable thing about Snow Crash was the name of the main character: Hiro Protagonist. You'll never come up with a better name. Neither will I. It is, in practice, absolutely impossible to invent a better name for a novel's main character. Simply can't be done, like accelerating past lightspeed, counting to infinity, or finding an honest politician.
Newspeak
George Orwell's dystopian sci-fi novel Nineteen Eighty-Four (1949) introduced many new words and phrases to the world.
And make no mistake, 1984 was absolutely science fiction. Most lit-snobs refuse to acknowledge this (or many of Vonnegut's works, as well) because they've been programmed to believe that science fiction is all escapist pulp brainrot and can't possibly be Serious Literature Being All Serious.
There are, as I said, more at the link. I like SF and I like word origins, so how could I resist?
December 4, 2025 at 10:31am
Been a while since I had an astronomy thing to share. This one's from Smithsonian.
I could quibble about their use of "history" there (it's more of a history of humans' lore and science about the cluster than the history of the stars themselves), or the almost-clickbaity "amazing." But that's all I'm going to say about those things.
The Nebra sky disc, as their find is now known, is circular, about 12 inches across and decorated with intriguing gold symbols. Some we can recognize instantly, and others are more ambiguous. But they all appear to be celestial: A crescent shape represents either the moon or an eclipsed sun, and a bright circle depicts the sun or the full moon.
The article includes a sketch of the disc in question, and I can see why the sun and moon might be ambiguous. I also found a photo of the thing from Wikipedia. 
Thirty-two stars cover the disc, but seven particularly draw the eye. They stand out, because they form a tight cluster. Given the cost of materials and craft needed to create this rare object, this must have been a deliberate attempt to portray a cluster of seven stars in the night sky. As such, it can represent only one astronomical formation.
The cluster doesn't really look much like the actual Pleiades, but then again, neither does the current Subaru logo.
(As the article notes, "Subaru" is what the Pleiades are called in Japanese. And there's no truth to the rumor that I drive a Subaru because I'm a huge fan of astronomy. That's only, like, 75% of the reason. They're solid vehicles.)
Nowhere else can a tight group of about seven stars be found easily with the naked eye. And this visibility is why we can find references to the Pleiades from almost all civilizations that left records, under many different names.
I met someone once with an interest in astronomy but very little actual knowledge. I'm not ragging on him, mind you; we all start out with very little actual knowledge, and he was eager to learn more, which is always to be encouraged. So, he knew the Big Dipper and the Little Dipper and maybe Orion. But then he pointed off to Orion's right and asked, "So is that the Micro Dipper?"
I mean, why not, right? Look at the thing.
Why is it said to have "about seven stars"? While the cluster is easy to spot, the precise number of stars we see will depend on the conditions: A veil of high clouds, light pollution and light from the moon can all have an impact. So can the time of night and our eyesight.
I never was able to count them, even back when I had near-perfect vision, so I just went with the "seven" thing.
In 1961, the late astronomer Patrick Moore put this question to the audience of the popular BBC weekly television program he hosted, "The Sky at Night," and viewers wrote in with their answers. The numbers varied: Some saw fewer than 7, others saw 8 stars, 9 or even, in one case, 11. But the average number was seven.
I have to wonder how much of that was priming, like I experienced. You're told seven, so you see seven. I don't know.
But the number seven will persist. Itâs a sticky number, popular in ancient times and considered lucky by many to this day.
Oh, no, it's not just popular or lucky. Come on. Seven is indelibly associated with astrology, the precursor to astronomy. Early sky-watchers noticed that, against what for all practical pre-industrial Earthbound purposes is a fixed background of stars, a number of bodies persisted (unlike, say, comets) but moved around. And that number was seven: Sun, Moon, Mars, Mercury, Jupiter, Venus, Saturn; all visible to the naked eye. Or whatever different cultures called those wanderers. The concept of a seven-day week was built around that, though the English version is mostly named in the Germanic, not Roman, style (Saturday is an exception).
When Newton, who was a mystic as well as a scientist, studied the spectrum of sunlight, he identified seven colors, our familiar ROYGBIV rainbow. As the spectrum is, well, a spectrum, a continuum, the mostly-arbitrary number seven was undoubtedly influenced by this ancient astrological tradition. (Can you tell indigo from violet on a rainbow? I certainly can't.)
Point is, yes, it's a sticky number, but it's a sticky number for sound historical (if not scientific) reasons.
In autumn, the Pleiades can be seen climbing above the eastern horizon soon after dusk. In spring, they catch up with the sun, and we soon lose them again.
Those sentences are, obviously, written from a Northern Hemisphere perspective. As the article notes, native Australians and other upside-downers also had a thing for what we call the Pleiades.
The Pleiades rise and set close to northeast and northwest from temperate latitudes. Like all stars that rise over the eastern horizon, they climb until they reach their highest point when they are due south - a moment known to astronomers as "culmination" - before descending toward the western horizon.
Given astronomers' penchant for solitude and isolation, I'm surprised they don't call it "climax."
There is a belief in some circles that the Druids marked Samhain, the precursor of Halloween, using the culmination of the Pleiades at midnight, which is plausible.
Meh. Plausible, I'll grant. But we know little more than jack and shit about the actual Druids, and there's already too much speculation and reinterpretation going on about them.
The Pleiades are not a constellation in themselves, but they're part of a much larger pattern: the constellation of Taurus, the bull.
Here's where I really break with the article, but not in a "makes the whole article bogus" way. What we call constellations are based, primarily, on the lore of ancient Europe, with maybe some Mesopotamia thrown in. Their boundaries are arbitrary, and the Northern Hemisphere ones in particular borrow from that mythology. Other cultures put different interpretations on the shapes they saw in the stars, with different lore and different arrangements.
In other words, we can say "star A is in constellation X," but that's just a categorization thing, helpful in communicating to the general public and other astronomers, but having no real basis in science. It's kind of like saying "Paris is in France," but, apart from the coastline, the borders of France are a matter of custom, law, wars, culture, and human decision-making, not any fixed and objective measure. (Even when they follow natural boundaries like mountain ridges.) Those boundaries have changed with time, and, before a certain point in time, France didn't exist at all. Neither did Paris. Nor the Pleiades.
Still, yes, we don't consider the Pleiades a constellation in itself. Neither are either of the Dippers. That kind of recognizable shape that's not an official constellation is called an asterism. In an alternate universe where the sky is the same but human culture unfolded differently, our names and boundaries for the constellations would be different.
There's another bit of lore about the Pleiades that I especially like: the story of Bear Lodge, which the North American colonizers called Devil's Tower. "Bear Lodge" is a rough translation of the Lakota name of that prominent landmark. And the Lakota story about it is intimately tied with what we call the Pleiades.
There's more here, and in the references on that page, and elsewhere on the internet, but I'll paste the most relevant section of the Wiki page:
According to the traditional beliefs of Native American peoples, the Kiowa and Lakota, a group of girls went out to play and were spotted by several giant bears, who began to chase them. In an effort to escape the bears, the girls climbed atop a rock, fell to their knees, and prayed to the Great Spirit to save them. Hearing their prayers, the Great Spirit made the rock rise from the ground towards the heavens so that the bears could not reach the girls. The bears, in an effort to climb the rock, left deep claw marks in the sides, which had become too steep to climb. Those are the marks which appear today on the sides of Devils Tower. When the girls reached the sky, they were turned into the stars of the Pleiades.
That story bears (I'd apologize for that pun, but I don't do that) all of the hallmarks of classic mythology: an origin story, interpreted through the lens of the culture that spawned the myth. And I don't mean "myth" in the sense of "falsehood" here; I mean those powerful cultural stories that both arise from and shape the worldviews of the cultures that spawn them.
I couldn't tell you why that particular story resonates with me. As far as I know, I have no Native American ancestry. I guess I just love a good myth, while at the same time acknowledging that Bear Lodge was as much a natural formation as the Pleiades themselves.
The difference being that the Pleiades asterism belongs to everyone.
And no one. |
December 3, 2025 at 8:26am
| |
Now this... this is what science is for. A short but piquant article from PhysOrg:
This is one of those times when it pays to go to the article to see the picture. Because when I saw it, I thought they'd made the artificial tongue look like an actual, disembodied, floppy tongue.
The appearance of a hot sauce or pepper doesn't reveal whether it's mild or likely to scorch someone's taste buds, but researchers have now created an artificial tongue to quickly detect spiciness.
Or you just dare an insecure teenage boy to eat it, preferably when he's around people he's trying to impress.
Inspired by milk's casein proteins, which bind to capsaicin and relieve the burn of spicy foods, the researchers incorporated milk powder into a gel sensor.
Okay, so it's not a step on the way to unlimited free energy or anything, but it's at least useful.
"Our flexible artificial tongue holds tremendous potential in spicy sensation estimation for portable taste-monitoring devices, movable humanoid robots, or patients with sensory impairments like ageusia, for example," says Weijun Deng, the study's lead author.
Except, of course, for the bit about "movable humanoid robots." Don't give them a sense of taste. Have you not read science fiction? They will develop a taste for human flesh.
Still, the article goes into a bit more background, but, as I said, it's short.
As a proof-of-concept, the researchers tested eight pepper types and eight spicy foods (including several hot sauces) on the artificial tongue and measured how spicy they were by changes in electrical current. A panel of taste testers rated the spiciness of the same items.
I have to wonder if they had a diverse group on the panel; that is, some from spicy-food cultures and others from the American Midwest. Because while spice level is objective, reaction to it is subjective.
Now, I'm a fan of spicy food. I don't eat it to show off; I genuinely enjoy the heat... up to a point, but that point is far beyond that of my fellow Americans of Midwestern origin. But this would be useful to anyone, whether they're trying to find, or to avoid, the hotter stuff.
I'm just disappointed that it's not, ultimately, shaped like an actual, pink, floppy, disembodied, human tongue. |
December 2, 2025 at 9:31am
| |
Well, this isn't going to be my usual sort of thing. It's personal and might even border on offensive. A lived experience related by CBC:
What's this got to do with anything? Well, I'm about the same age. Of course, she's female and Canadian, so we couldn't possibly be more different. Still. The writer is only a few months older than I am, still the leading edge of Gen-X, if you must believe in marketing age categories.
I've just passed another milestone birthday, and yet the familiar dread of reluctantly skidding into a new decade seems to have softened somewhat.
I'm almost there, and I don't feel dread. Just a profound resignation.
The quiet realization that my yesterdays outnumber my tomorrows feels less like a threat and more like a gift.
Oh, lucky you. I've had that realization for twenty years now.
Aging, I've come to see, is a privilege.
I suppose that's a nice, healthy way to look at it. Naturally, I don't agree.
My dear friend Natalie died after a brief illness almost a year ago at the age of 57.
That sucks. Truly. I'm not trying to diminish anyone's grief here, or play who-had-it-worse. All I want to do is try to understand someone else's perspective, and share my own, which is neither better nor worse, just different.
See, my own experiences with loss lead me to a different conclusion.
First, I spent 20 years watching one parent, then the other, decline into profound dementia, then die frightened and bewildered. Losing one's parents is, I know, the natural order of things. But the dementia thing is spit in the face.
The second thing isn't a direct experience, but something I found out about later. It was about a girl I dated in high school, but later fell out of touch with—not too serious, not too casual, but somewhere in the middle. I asked a mutual friend about her, years later, after a chance encounter on the internet. Not to stalk or anything, but just out of curiosity about an old friend. Turned out that this woman had gotten married, went on her honeymoon, came back and was walking around excited about her new life when she dropped dead on the street. One moment alive; next moment, corpse.
So, reading the article in the link up there reminded me that, if I had the choice between a slow decline into brainless senility, or just getting switched off like a lightbulb, I know which one I'd pick.
Of course, we don't get to pick. No, I'm not suicidal. I'm just not afraid of being dead. Maybe I am, at least a little, of dying.
But I cannot and will not consider aging to be a privilege.
It's just something that happens to most of us, like it or not, until it stops. |
December 1, 2025 at 9:43am
| |
If you like the *shudder* outdoors, here's a secret from SFGate:
Except I guess it isn't a secret anymore, is it? Now that you've told the entire world with a webpage. Way to go, assholes. Way to ruin it.
The article does, of course, include pictures, and they're cool. Though some of them are suggestive enough that you might not want to view them at work or around kids.
The slot canyon's sandstone layers were so flawless they looked as though they'd been thrown on a pottery wheel.
Careful, there. Wouldn't want to give the creationists any ammunition.
This sculpted maze could easily have been mistaken for Arizona's wildly popular Upper or Lower Antelope Canyon.
You'll have to forgive me for never having heard of the Antelopes. I live on the other side of the country, and I'm an indoorsman.
Pretty sure I've seen pictures of them, but without attribution.
While smaller in size, this secluded fissure is just as extraordinary, with the same curving walls and an ever-changing orbit of gold and purple shades, occasionally transformed by light rays into vibrant reds and oranges, adorning its narrow passageways — but far fewer crowds.
This is the sort of description that makes travel writing work, incidentally.
And like its more famous counterpart, it's only accessible through a Navajo tour.
Well, then, maybe it'll just have to be on them to keep the crowds manageable.
Reaching the canyon entrance is an adventure in itself: It requires a 20-minute off-road ride in one of the company's modified, open-air (bring layers!) Ford F-350s.
Oh, did Ford pay you for the product placement?
For the benefit of anyone unfamiliar with Arizona, the "layers" thing is because, though you might have heard how scorchingly hot Phoenix can get, that state can also get finger-numbingly cold.
Upon arrival, McCabe began walking our group of 12 through millions of years of geological history. "It's volume and velocity that forms a slot canyon," he said, referring to the many flash floods that carved the formation through repeated erosion of its soft rock, which was then further shaped by wind.
SCIENCE!
McCabe pointed out sandstone formations that resembled an elephant, some woolly mammoths and even an Egyptian queen as we went.
Somehow, I don't think any of those are Navajo things.
Well, maybe the Egyptian queen. As Steve Martin related in his musical documentary, speaking of Tutankhamen, "Born in Arizona, moved to Babylonia."
McCabe explained that to the Navajo people, slot canyons are symbols of creation — linking the physical and spiritual worlds — and are often associated with guardian spirits.
So we come to the main reason I saved this article at all: the combination of science and spirituality. Apparently, not everyone sees a need to choose between the two.
While I haven't done canyon hikes (I haven't even been to the Grand one), I have spent time in Navajo country, and I can attest that it's pretty damn awesome. I try to swing through every time I go out west, sometimes staying in Page or Kayenta.
Maybe next time, if there is a next time, I'll go look at some rocks. |
November 30, 2025 at 8:24am
| |
This article from The Conversation starts out looking like a book ad, but is it really an ad if the book is 120 years old?
Not long ago, a relative of mine told me he had been working so hard in the yard that he'd "literally thrown up".
As opposed to metaphorically throwing up. Though I suppose "throwing up" is itself a metaphor, or at least a euphemism of sorts.
It was, oddly enough, a boast.
I once worked so hard shoveling snow that I "literally" had a heart attack. My response? Started paying people to do the snow-shoveling. Fortunately, that hasn't been much of a thing over the last 12 years. See? Some good things come from climate change.
Elon Musk once claimed "nobody ever changed the world on 40 hours a week", apparently unaware that people from Archimedes to Nobel laureate Sir Alexander Fleming managed just fine on a normal schedule.
Note how he didn't specify changing it for the better.
...today, overwork is one of the few politically neutral ways to show virtue.
If that's the only way to show virtue, I'll stick to my vices, thanks.
Why is work treated, strangely enough, as if it were next to godliness?
Even if I hadn't already seen the headline, I knew the answer: John fucking Calvin.
As an aside: Trust me, I have already thought of all the Calvin (the kid with the tiger) jokes, so there's no need to make any more.
One of the sharper answers came from German sociologist Max Weber. His book The Protestant Ethic and the Spirit of Capitalism (1905) has become a classic — though we need to be careful about what "classic" means here. Like the Bible or Stephen Hawking's A Brief History of Time, The Protestant Ethic is widely bought, regularly invoked, and rarely read.
Again, I seem to be an outlier. I've read (and own copies of) those other books, but not Weber.
The Protestant Ethic is a study of how religious ideas, especially Calvinism, helped shape the mindset upon which modern capitalism thrives.
What'd I tell you.
Anxious about their prospects for salvation, Protestants looked for signs of divine favour in worldly success. That anxious looking, Weber thought, helped to create — and then helped to reinforce — the disciplined, work-and-future-oriented modern subject that capitalism depends on.
I could protest (pun intended) by saying "not all Protestants," but that would miss the point.
It was one of Weber's key ideas, and not just in this book, that modernity had lost previous ages' sense of spiritual meaning...
On the other hand, I'm not a huge fan of "spiritual meaning," either.
What was new, Weber thought, was the moral stance: that working hard, living frugally and accumulating wealth weren't just practical skills for succeeding, but inherently virtuous forms of behaviour. Profit, for some, was more than a merely desirable personal outcome; it was a duty.
We still see that in today's world.
Calvinists believed in predestination. This is the idea that God has already decided who is saved and who isn't, long before any merely human act could modify this outcome.
While this makes some sense from a theological perspective—after all, if God can be surprised, then God is not omniscient, and it was important to a lot of people that God be omniscient—I want to point out that, while I've argued against the concept of free will here and in the old blog, "predestination" is not the only alternative.
The result was a kind of compensatory behaviour. Believers began looking for signs of God's favour. Success in one's calling — or "Beruf", a word that means both "job" and "vocation" — became such a sign. Working hard, avoiding luxury, reinvesting profits: these weren't just sound habits. They might be clues that one was among the elect.
And that is the piece I was missing from my outsider's view of Western religious thought. How do people reconcile the "camel through the eye of the needle" bit in the Bible with the pursuit of wealth? It's long been obvious to me that people do it, but I didn't know why, apart from the very human desire for more.
I know some people claim that the "eye of the needle" was the name of a gate in Jerusalem's walls, but to me that stretches interpretation to the breaking point.
Over time, these behaviours detached from their religious roots. You didn't need to believe in predestination to feel the drive to work endlessly, or to prove your value through success.
"Consider the lilies of the field," my ass.
The moral weight Weber saw in the Protestant calling has not vanished. It has been reborn: now it answers to dopamine hits and brand loyalty. We no longer justify our work in relation to God's glory, but we still work as if something eternal depends on it.
I suppose one could consider intergenerational wealth a kind of "eternal."
Weber certainly wasn't celebrating what he described. He was, instead, trying to document the moment when a spiritual or theological project hardened into something far more mechanical, compulsive and inescapable.
And look, just to be clear: I'm not advocating for or against that point of view.
Australian philosopher Michael Symonds has argued that this tragic logic, where the terror of predestination drives believers into a compulsive ethic of work, produces a world where meaning itself becomes hard to grasp.
I do, however, question the validity of the idea of "meaning."
This is one of Weber's most unsettling points: a system designed to prove spiritual worth ends up building a world whose very operating logic seems to deny that any such worth exists.
I'd be remiss if I didn't point out that this is hardly the "world." While it's certainly a viral meme that has spread to other cultures, there are societies that don't buy into the grind. The US is not one such society.
The language of "vocation" is everywhere, but it has been flattened into a lifestyle brand. Work isn't just work anymore; it is supposed to be passion, purpose, identity. You're not just employed, you're "doing what you love". This idea is tempting, but it quickly turns into a trap, because if work is meaning, then failure or exhaustion start to look like moral flaws.
Except that, as noted above, exhaustion, at least, has become a humblebrag.
But the anxiety has shifted. For early Protestants, work was a way of reassuring yourself that you might be saved. For many today, work is a way of proving you're not disposable.
Just to be clear, because I'm obviously not quoting the entire source text: the idea is not that hard work and success lead to salvation, but that hard work and success are signs of a salvation that has already been predetermined.
One reverberation from this is that, in that worldview, if you're poor, you deserved it. If you're rich, you deserved it. Therefore, helping the poor becomes a kind of sin, while helping to make the rich richer is a kind of virtue.
We feel the pull to be useful, to produce, to stay busy — even when the rewards are uncertain, or vanish altogether.
I've been reminded that there are various connotations of the first-person plural pronoun. The most obvious is the difference between the inclusive "we" and the exclusive "we." If I tell you, "We're going to a party," it may not be clear just from that sentence whether I mean "you and I and maybe some others" or "I and maybe some others." The one less obvious to me is the rhetorical "we," which is obviously the one being invoked here, because I'm certainly not in that "we" group.
Weber's point wasn't just that, once upon a time, religion fatefully shaped economics. It was that a certain kind of theology, and the specifically religious anxiety to which it gave rise, engendered a system that outlived its theology and hardened into something else entirely.
While I can't completely buy into the article—all the talk about meaning and transcendence is kind of irrelevant to me, too—I do think it provides some insight into how we (specifically the US and its close allies, to use the rhetorical "we" myself) got shaped. What it doesn't do, and what it says Weber doesn't do, is show us the way out.
I'm tempted to claim that there isn't one, but I'm trying to be less dire in my thoughts. They say knowledge is power. Well, here's some power, if "we" want to use it. |
November 29, 2025 at 9:01am
| |
Turns out you can't spell "ironic" without Inc.
You sure about that?
Few psychological rules have as high a public profile as the Dunning-Kruger effect.
"Few?" Way to weasel out there. What are the rules you're talking about? That only children are selfish? That short men are pugnacious? The one about the bystander effect, which was pretty much debunked?
David Dunning and Justin Kruger showed that the people who were least competent at a given task were also the most confident in their abilities. Meanwhile, the most skilled are the most unsure.
They were far from the first to notice this. Yeats wrote, "The best lack all conviction, while the worst are full of passionate intensity," which sounds about the same to me.
A theory that states the dumbest among us are often the loudest and most overconfident seems to explain so much about modern life.
Because we need such simple explanations.
As pleasant as it might be to write off those you disagree with as hopelessly dim and deluded, the Dunning-Kruger effect isn't actually about anyone's general intelligence, Dunning explained. It's about what happens when you gain just a little knowledge in a particular domain.
I mean, okay and all. We should all be aware that it can happen to us, and not feel superior to those of unfortunate mental capacity.
"It's not about general stupidity. It's about each and every one of us, sooner or later," he says. "We each have an array of expertise, and we each have an array of places we shouldn't be stepping into, thinking we know just as much as the experts."
Yes, but wouldn't stupid people have a bigger "array of places [they] shouldn't be stepping into?"
Dunking on others' oblivious idiocy, as tempting as it can be, isn't actually the takeaway message of the Dunning-Kruger effect according to Dunning. Instead, it's to be mindful of your own overconfidence, especially in areas where you don't have deep domain expertise.
If you say so. You're the expert.
The point isn't to help you spot others' stupidity. It's to alert you to the constant potential for your own. Or as Dunning puts it: "Our ignorance is an everyday companion that we will all carry for the rest of our lives."
You know what bugs me most about this article, though? It's that the author and, apparently, Dunning, based on his quotes therein, conflate ignorance with stupidity. They are (I say with great confidence) not the same thing. Ignorance is our default state. Were you born knowing how to ride a bike? Was the Pythagorean Theorem engraved on your little baby neurons? No. Hell, you even had to be taught how to walk and talk (and then, once you mastered those, how to sit down and shut up).
There is no shame in ignorance per se. No matter how aghast I might be at someone who doesn't understand a Star Trek reference joke I made, the simple truth is not everyone has seen Star Trek.
Nor is there real shame in having reduced mental capacity. People who are slow learners deserve help and empathy, not scorn and ridicule.
What deserves ridicule, the thing that I feel free to mock, is willful ignorance: the deliberate refusal to learn, to change one's mind, or consider other points of view. Flat-earthers, for example.
7 ways to avoid falling prey to the Dunning-Kruger effect
I'm just going to touch on a couple of these, here.
Imagine the worst-case scenario.
Oh, I do. That's my entire life philosophy: imagine the worst thing that can go wrong, assume it will, and you can only be pleasantly surprised.
The problem is, maybe I'm too ignorant to see that there are even worse possibilities than the one I imagined.
Think in probabilities. Citing the work of fellow psychologist Philip Tetlock, Dunning observes that people who think "in terms of probabilities tend to do much better in forecasting and anticipating what is going to happen in the world than people who think in certainties."
I agree and all, but, and I'm not saying this makes anyone stupid, people are generally utter shit at thinking in probabilities. We (and I do mean we) overestimate the risk of things we're not used to, while underestimating the risk of activities we do regularly. The example I usually quote is someone who is scared shitless of flying because of the risk of a fatal crash or whatever, but doesn't think twice about speeding to the airport if they're a little late. The latter is far, far more likely to be fatal, but we're used to driving, so it just doesn't register.
Something similar happens at casinos, too. Playing blackjack, someone sitting next to me will wonder whether to stand or hit. The odds might favor hitting, say, with a >50% chance. But then they look at me like I'm the stupidest fool in the universe when the next card is a 10 and they bust. Don't look at me. Odds were in your favor. Never said it was a guarantee. They call it gambling for a reason. So I quit giving advice.
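For anyone who wants to check that sort of table math instead of taking my word for it, here's a minimal back-of-the-envelope sketch (mine, not anything from the article, and definitely not a strategy chart): it assumes an effectively infinite deck and a hard hand with no soft ace, and just counts how many of the thirteen card ranks would push a given total past 21.

```python
# A minimal, assumption-heavy sketch (not from the article): odds of busting
# if you take exactly one more card, assuming an effectively infinite deck
# (every rank equally likely on each draw) and a hard hand with no ace to
# soften the total. Real odds depend on the shoe and the rules; this is
# back-of-the-envelope arithmetic, not strategy advice.

# Map each possible card value to how many of the 13 ranks produce it
# (10, J, Q, K all count as 10; the ace is counted as 1 here).
RANK_COUNTS_BY_VALUE = {1: 1, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1, 7: 1, 8: 1, 9: 1, 10: 4}
TOTAL_RANKS = sum(RANK_COUNTS_BY_VALUE.values())  # 13

def bust_probability(hand_total: int) -> float:
    """Chance that one more card pushes a hard hand past 21."""
    busting_ranks = sum(count for value, count in RANK_COUNTS_BY_VALUE.items()
                        if hand_total + value > 21)
    return busting_ranks / TOTAL_RANKS

if __name__ == "__main__":
    for total in range(12, 21):
        print(f"Hard {total}: ~{bust_probability(total):.0%} chance of busting on a hit")
```

Under those assumptions, a hard 16 busts about 8 times in 13; whether hitting is still the better play depends on the dealer's upcard, which is a separate calculation entirely.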
"Be 10 percent more skeptical of people you agree with—and 10 percent more charitable to people you disagree with."
Insofar as such things can be quantified, this is something I try to do.
Scientists are trained to look for evidence to disprove their hypotheses, which acts as a brake on the Dunning-Kruger effect. But you donât have to be a scientist to think like one.
This too. Admittedly, I don't always succeed.
Practice saying "I don't know."
This one time, I drove through Columbus, Ohio in a car full of friends. I remember seeing a billboard that urged something like: "Help wipe out ignorance and apathy!" I said, "I don't know; I just don't care anymore."
That got a laugh. Was it appropriate?
I don't know. |
November 28, 2025 at 9:40am
| |
Here's Mental Floss trying to answer the important questions, again.
One of the most recognizable cheeses is the Swiss variety.
Well, that's what we call it in the US. Which would be kind of like calling cheddar "English cheese" or brie "French cheese," ignoring all the other glorious cheeses produced by those countries. But it's too late; "Swiss cheese" is a metaphor-turned-cliché, and there's no going back now.
Even if you don't know the taste, you're likely familiar with the distinctive appearance characterized by an abundance of holes, also known as "eyes."
That's because it's also a trope. That is, when a cartoonist wants to draw "cheese," they will always draw Swiss because the well-known holes instantly say "cheese" like a portrait photographer's subject. Without these iconic features, you're just drawing a wedge or wheel or slice of some unidentifiable substance.
According to U.S. Dairy, a farmer-funded trade group, the eyes in Swiss cheese derive from a genus of bacteria often found in raw milk called Propionibacteria, or Props.
All cheese relies on microorganisms. Well, anything that actually deserves the name "cheese," anyway. This is not gross unless you're, I don't know, six years old.
In the case of Swiss cheese, Propionibacteria gobble up the lactic acid that's left behind, which creates carbon dioxide. This gas expands parts of the cheese and forms the characteristic bubbles.
Lots of microorganisms fart out carbon dioxide. It's one reason beer is carbonated. And sparkling wine. Also, it's why bread dough rises.
In the U.S., people often use Swiss cheese as a generic term, but those in Switzerland refer to the dairy item as Emmental.
Stop the presses. Also, the French refer to French bread as "bread" (only in French, so it's "le pain"). And Canadians refer to Canadian bacon as "bacon" or "backbacon," and Brits refer to English muffins as, I guess, "muffins." Despite an attempted marketing campaign, though, I'm pretty sure Australians don't call beer "Foster's."
While most Swiss cheeses have eyes, some don't, according to U.S. Dairy.
Couldn't be arsed to ask the Swiss what they have to say about it, could you?
Anyway, okay, MF is there to explain things to people who aren't as traveled, informed, or obsessed with cheese as I am. And to be fair, I hadn't known the name of the actual bacteria involved, so I learned something, too (though the full binomial of this particular strain, which I had to look up, is apparently Propionibacterium freudenreichii).
I've simply spent too much of my adult life, and a good bit of it before then, enjoying the products of this and other microorganisms to agree that it's "gross." Cheese was invented long before we knew about microbes, and the discovery thereof only improved the production of that delicious food. |
November 27, 2025 at 8:59am
| |
Ah... Thanksgiving in the US. A day to commemorate a bunch of people escaping religious persecution in their home country so they could start some of their own religious persecution in a new one.
Here's an article from that home country (via The Guardian). It's about my all-time favorite punctuation mark; don't worry, it's short, so my fellow Americans can get back to eating politics and talking turkey.
Good. Another convert. Let's keep on proselytizing.
Is there any punctuation mark more divisive than the humble semicolon?
Um, yes? The em dash has started arguments and caused friends to take an extra-long pause, and don't get me started on the interrobang.
The use of exclamation marks (particularly by women) makes some people very excitable.
I only start to get antsy when I see more than one, and I don't discriminate by gender.
The Oxford comma has sparked vigorous debate among friends, family and internet strangers.
Oh ho ho ho. I see what you did there. Or, rather, what you didn't do there. You absolute philistine.
Still, while competition might be stiff, if there was a Most Provocative Punctuation contest, I reckon the semicolon would win it.
I will admit that, superlative or not, it's controversial, in much the same way that the pronunciation of .gif is.
Thrust into the world by an Italian printer called Aldus Manutius in 1494, the semicolon has amassed a legion of passionate supporters and haters.
You know what's worse than using semicolons? Using commas to splice complete clauses together.
Meanwhile, Kurt Vonnegut (hater) called them "transvestite hermaphrodites representing absolutely nothing. All they do is show you've been to college."
Vonnegut also refused to admit he wrote science fiction, so anything he has to say is dubious at best.
For most of my life, I was agnostic about the semicolon. Then I had a dalliance with a woman with a bizarre fetish for the things; she even used them in text messages.
You know how I weed undesirables out of my life? I see how they react to me texting in complete sentences. If they don't like it, we're not compatible in any way.
Semicolon usage in British English books has fallen by nearly 50% in the past two decades, a study from language-learning company Babbel found in May.
I will admit that overusing them gets distracting. But telling me something's fallen by 50% is useless; were there two books a year with semicolons, and now there's one? Big deal.
Another found that 67% of British students — rebels without a clause — rarely use it.
Mostly I'm mad someone else used "Rebel Without a Clause."
Still, it's been going strong for more than 500 years, so I doubt we're going to see the end of the semicolon anytime soon.
At least not if I have anything to say about it.
One final note: it's one thing to use punctuation. It's another thing entirely to use it appropriately. I see writing all the time that demonstrates unease with even simpler marks; usually, these are commas, either put, in, where, they're, not, necessary (like a Shatner monologue) or left out entirely in which case the sentence just runs on and on without pause or breath causing the reader to become completely exhausted by the time they finally hit the full stop or as we call it in the US a period. |
November 26, 2025 at 9:55am
| |
Some things just amuse me too much not to share. Thing is, this is one of those times when you absolutely have to visit the link (which is from SoraNews24) in order to get the full effect. There are illustrations there that, were I to attempt to describe them, I would fail, and even if I didn't, you wouldn't believe me.
Kids, hell. I want one for myself.
Christmas is just around the corner, and many of us have children in our families that are really hard to shop for.
Another argument for remaining childfree.
It's difficult to keep up with the changing trends, and even if you do, how can you know which Skibidi figure the kid already has?
Well, if they're YOUR kids, why aren't you keeping track of these things?
It's a problem our reporter Masanuki Sunakoma was facing, so he decided to check out Amazon Japan for some ideas. However, after years of seeking out the lowest-rated items sold there, his algorithm is a little skewed towards the abysmal.
This amuses me almost as much as the toy itself: how if you look for low-rated items on purpose, the site's algorithm will feed you only crap. Well, worse crap than usual.
The body of the train is see-through, and having young ones view the rotating gears inside is thought to help them develop early mechanical concepts, imagination, cognitive ability, and motor skills.
I don't think their "cognitive" pun was intentional, so I'm making it intentional.
Criticisms mentioned that "It has the vibe of a seedy Shinjuku bar," "It spins so fast you can't see the gears moving," and "The music is like weird EDM, so it doesn't seem like it's good for education."
On the contrary, what could be better for education than weird EDM?
The locomotive began to emit brilliant colors in all directions.
Maybe I should get one for my cats.
And then it started spinning wildly…
"Oh hell no" -my cats
This, by the way, is where you really need to go to the site for full effect. There are gifs.
In addition to rotating at insane speeds, it played some sort of ear-splittingly loud Chinese techno music.
As I've mentioned before, I run a script blocker, and there are certain domains I refuse to let through. So I don't know if they reproduced the music. It's disappointing to me that I don't get to hear this thing.
It begged the question, what is a train? He wasn't familiar with any actual trains that spun around in one place and sounded like a rave in Kowloon. Sure, it was shaped like a train, but is that enough to call it a train? We don't call a train-shaped cookie a "train," so why should this be one?
Ah, yes, the eternal philosophical question. How do you say "ceci n'est pas un train" in Japanese?
In that way, maybe this was an educational toy, but not in the sense of mechanics. Rather, it made Masanuki consider existentialist principles...
The world needs philosophers as well as engineers.
And if that's not enough contemplation of the absurdities of existence for you, consider this other article I found on the same site:
Hello Kitty says hello to Godzilla in new kaiju/Sanrio crossover collaboration
Ah, yes. My most hated and most loved Japanese creations, together at last. Makes me re-examine my life as well as all of existence. |
© Copyright 2025 Waltz Invictus (UN: cathartes02 at Writing.Com). All rights reserved. Waltz Invictus has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.
|