• 27 Posts
  • 4.21K Comments
Joined 3 years ago
Cake day: June 14th, 2023



  • everything is just getting worse

    The planet is in its sixth major extinction event - one that’s been ongoing for roughly 12,000 years. And yet we’re finally achieving a kind of universal consciousness regarding the impact we’re collectively imposing on the world and the methodologies we can employ to respond to it.

    I don’t think I’d call that “getting worse”. No other lifeforms have ever had the opportunity to know they’re going extinct before it happens, much less the faculties to do something about it.

    This is a pivotal epoch in Earth’s multi-billion-year history. I would not say things are getting worse. I would say things have the potential to move in a direction nothing in the Solar System has had the opportunity to move since it was created. That potential creates a great deal of anxiety and fear, because it is much easier to be ignorant - a dinosaur unaware of the looming comet - than to be faced with the foreknowledge of catastrophe. But it is that shared anxiety that creates the social pressure for universal change.


  • Reincarnation into what?

    “Hey, if you die in this life, you can come back as a bug that gets eaten by another bug shortly after hatching. And then you can do this for the next 10,000 years until you get lucky enough to come back as something remotely sentient.”

    Sounds like that shit sucks, man. You have a real pivotal moment in this life to embrace dharma and appeal to heaven for a higher place in the great pattern. I’ll admit, I’m not much of a mystic, but I don’t think eating a bullet after a night spent guzzling whiskey is what gets you up the ladder.







  • I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas

    It’s a conversation you’re having on the internet with an agent that sounds like a human. People get invested for the same reason they get catfished.

    It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her.

    That’s the nut of it. And ChatGPT tends to mix the pastiche of a well-researched argument with the kind of feel-good self-affirmations that win over their audience. So you’re getting what looks - at first glance - to be good advice. And then you’re getting glazed on top of it. And then it’s designed to tell you what you want to hear, so you’re getting affirmation bias.

    I hope she gets better soon, and I hope you do too. Being overworked and stressed really destroys you and the people around you in many ways.

    I mean, that’s why human-to-human interactions are valuable. But it’s also why they’re difficult. Like any good medicine, it can taste bitter up front even if it’s what you need in the long run.


  • What kind of person do you have to be to become addicted like them?

    Human cognition degrades with stress, exhaustion, and trauma. If you’re in a position where turning to an AI for relationship advice seems like a good idea, you’re probably already suffering from one or more of the above.

    Also doesn’t help that AIs are sycophantic precisely because sycophancy is addictive. This isn’t a “type of person” so much as a “tool engineered towards chronic use”. It’s like asking “What kind of person regularly smokes crack?”

    Do you need to be very empathetic towards objects? Like seeing faces in everything and getting emotionally attached?

    I’ll give you a personal example. I have a friend who is currently pregnant and going through a bad breakup with her baby-daddy. She’s a trial lawyer by trade - very smart, very motivated, very well-to-do, but also horribly overworked, living by herself, and suffering from all the biochemical consequences of turning a single-celled organism into a human being.

    As a result of some poorly conceived remarks, she’s alienated herself from a number of close friends to the point where we doubt there’s going to be a baby shower. Part of the impulse to say these things came from her own drama. But part of it came from her discovering ChatGPT as a tool to analyze other people’s statements. This has created a vicious behavioral spiral, in which she says something regrettable and gets a regrettable response in turn. She plugs the conversation into ChatGPT, because she has nobody else to talk to. And ChatGPT feeds her some self-affirming bullshit that inflates her ego far enough to say another stupid thing.

    To complicate matters, her baby daddy is also using ChatGPT to analyze her conversations. And he’s decided she’s cheated on him, the baby isn’t his, and she’s plotting to scam him.

    So now you’ve got two people - already stressed and exhausted - getting fed a series of toxic delusions by a machine that constantly reaffirms them in a way none of their friends or family will. It compounds your misery, which drives anxiety and sends you back to the machine that offers temporary relief. But the advice from the machine yields more misery down the line, raising your anxiety and sending you back to the machine.

    What’s producing this feedback loop? You could argue it is the individual, foolish enough to engage with the machine to begin with. But that’s far more circumstantial than personality driven. If my friend didn’t have a cell phone, she wouldn’t be reaching for ChatGPT. If she wasn’t pregnant, she wouldn’t be so stressed and anxious. If she wasn’t in a fight with her boyfriend, she wouldn’t be feeding conversations into the prompt engine.





  • Ukrainian housewives are not producing 155mm artillery or jets. But when a drone solves the same problem as a 155mm shell, then perhaps they don’t need to. Similarly with jets.

    I don’t think anyone can seriously suggest they solve the same problems. I would argue instead that the conflict is asymmetric. The Americans are trying to invade Iran, not the other way around.

    Western militaries (not all of them, but Americans in Iran now) seem to love solving problems with cool expensive tools, even if not that well fit for the goal.

    The Western goal is to capture and control territory through terror bombing followed by an occupation.

    To that end, they need the kind of surveillance, range, and precision that drones lack.

    Iranians/Ukrainians aren’t trying to capture and control territory at a great distance. They’re trying to repel an invasion of territory they already own.

    They’re fighting a fundamentally different war.