

Okay, but then you’re not talking about reincarnation in the Hindu/Buddhist sense. You’re talking about the kind of Past Lives shit that western spiritualists are into.


everything is just getting worse
The planet is in its sixth major extinction event - one that’s been ongoing for roughly 12,000 years. And yet we’re finally achieving a kind of universal consciousness regarding the impact we’re collectively imposing on the world and the methodologies we can employ to respond to it.
I don’t think I’d call that “getting worse”. No other lifeforms have ever had the opportunity to know they’re going extinct before it happens, much less the faculties to do something about it.
This is a pivotal epoch in Earth’s multi-billion-year history. I would not say things are getting worse. I would say things have the potential to move in a direction no planet in the Solar System has had the opportunity to move since it was created. That potential creates a great deal of anxiety and fear, because it is much easier to be ignorant - a dinosaur unaware of the looming comet - than to be faced with the foreknowledge of catastrophe. But it is that shared anxiety that creates the social pressure for universal change.


Reincarnation into what?
“Hey, if you die in this life, you can come back as a bug that gets eaten by another bug shortly after hatching. And then you can do this for the next 10,000 years until you get lucky enough to come back as something remotely sentient.”
Sounds like that shit sucks, man. You have a real pivotal moment in this life to embrace dharma and appeal to heaven for a higher place in the great pattern. I’ll admit, I’m not much of a mystic, but I don’t think eating a bullet after a night spent guzzling whiskey is what gets you up the ladder.


Can I remind everyone that it is impossible to produce helium in a practical way?
The Sun has been doing it for billions of years, and it’s a big dumb ball of energy.
Is it practical? No. Is it producing any helium right now? No. Is it probably just a big investor scam? Sure. But still more practical than trying to conquer Iran.


We could be at war with Iran for a century, sending strike teams in to siphon helium out of the ground and smuggle it back to the US in stealth jets and submarines, and it would still be significantly cheaper than trying to mine the moon.


Okay, but do you really think we’re going to prioritize the enormous loss-leading CSAM engines over lifesaving medical diagnostics machines?

I just need a way to spend money so the banner goes away.
Buying $10-$1000 Proprietary Software
Donating $10 to FOSS Software
Subscribing to the FOSS Software Patron Tier for $10/mo


I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas
It’s a conversation you’re having on the internet with an agent that sounds like a human. People get invested for the same reason they get catfished.
It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her.
That’s the nut of it. And ChatGPT tends to mix the pastiche of a well-researched argument with the kind of feel-good self-affirmations that win over its audience. So you’re getting what looks - at first glance - like good advice. And then you’re getting glazed on top of it. And then, because it’s designed to tell you what you want to hear, you’re getting affirmation bias.
I hope she gets better soon, and I hope you do too. Being overworked and stressed really destroys you and the people around you in many ways.
I mean, that’s why human-to-human interactions are valuable. But it’s also why they’re difficult. Like any good medicine, it can taste bitter up front even if it’s what you need in the long run.


What kind of person do you have to be to become addicted like them?
Human cognition degrades with stress, exhaustion, and trauma. If you’re in a position where turning to an AI for relationship advice seems like a good idea, you’re probably already suffering from one or more of the above.
Also doesn’t help that AIs are sycophantic precisely because sycophancy is addictive. This isn’t a “type of person” so much as a “tool engineered towards chronic use”. It’s like asking “What kind of person regularly smokes crack?”
Do you need to be very empathetic towards objects? Like seeing faces in everything and getting emotionally attached?
I’ll give you a personal example. I have a friend who is currently pregnant and going through a bad breakup with her baby-daddy. She’s a trial lawyer by trade - very smart, very motivated, very well-to-do, but also horribly overworked, living by herself, and suffering from all the biochemical consequences of turning a single-celled organism into a human being.
As a result of some poorly conceived remarks, she’s alienated herself from a number of close friends, to the point where we doubt there’s going to be a baby shower. Part of the impulse to say these things came from her own drama. But part of it came from her discovering ChatGPT as a tool to analyze other people’s statements. This has created a vicious behavioral spiral, in which she says something regrettable and gets a regrettable response in turn. She plugs the conversation into ChatGPT, because she has nobody else to talk to. And ChatGPT feeds her some self-affirming bullshit that inflates her ego far enough to say another stupid thing.
To complicate matters, her baby daddy is also using ChatGPT to analyze her conversations. And he’s decided she’s cheated on him, the baby isn’t his, and she’s plotting to scam him.
So now you’ve got two people - already stressed and exhausted - getting fed a series of toxic delusions by a machine that constantly reaffirms them in a way none of their friends or family will. It compounds their misery, which drives anxiety and sends them back to the machine that offers temporary relief. But the advice from the machine yields more misery down the line, raising their anxiety and sending them back to the machine.
What’s producing this feedback loop? You could argue it is the individual, foolish enough to engage with the machine to begin with. But that’s far more circumstantial than personality-driven. If my friend didn’t have a cell phone, she wouldn’t be reaching for ChatGPT. If she wasn’t pregnant, she wouldn’t be so stressed and anxious. If she wasn’t in a fight with her boyfriend, she wouldn’t be feeding conversations into the prompt engine.


Buuuuulshit
I mean, what are the odds that the statement was composed by an AI?


Kanye No: Red Hat
Kanye Yes: Red Hat Linux


Ukrainian housewives are not producing 155mm artillery or jets. But when a drone solves the same problem as a 155mm shell, then perhaps they don’t need to. Similarly with jets.
I don’t think anyone can suggest they solve the same problems. I might argue that the conflict is asymmetric. The Americans are trying to invade Iran, not the other way around.
Western militaries (not all of them, but the Americans in Iran right now) seem to love solving problems with cool, expensive tools, even if they’re not that well suited to the goal.
The Western goal is to capture and control territory through terror bombing followed by an occupation.
To that end, they need the kind of surveillance, range, and precision that drones lack.
Iranians/Ukrainians aren’t trying to capture and control territory at a great distance. They’re trying to repel an invasion of territory they already own.
They’re fighting a fundamentally different war.


But this one isn’t on the military in my book.
Not historically. But eventually you purge everyone who tells you “No”, and the military you’re left with is all Yes-Men.
Incidentally,
Why The U.S. Army Made Four Tech Executives Lieutenant Colonels
Really tells you where the military in this country is headed.


I wonder if the headline was written by an AI


Hey now. The deranged right aren’t all bad.
They gave us AI and vaccine denial and climate change, which I’ve been raised to believe are all good things.


The Life and Times of Reading Toxic Men #14 | George Orwell: The Colonial Cop and the Cynic of Power
Incidentally, Orwell never got to see his books truly take off. He died of tuberculosis at age 46. However, his more explicitly anti-communist works were picked up by Allen Dulles at the CIA and turned into children’s cartoons.
In 1950, after Orwell’s death, the CIA arranged for an animated film to be made based on Animal Farm. However, it pushed for several changes. It darkened Orwell’s generally positive depiction of Snowball (Leon Trotsky), emphasized that the animals on some neighboring farms were perfectly content with their treatment by the human farmers, and changed the ending to show the other animals rising up to overthrow the pigs. The adaptation was unpopular in theaters, but it was widely shown in US and UK classrooms and was translated and distributed in other countries.
In an ironic twist, the books about state manufactured propaganda and national indoctrination became tools of state manufactured propaganda and national indoctrination.
Real “Don’t Build The Torment Nexus” moment.


deleted by creator
Based and v6 pilled.