• 21 Posts
  • 1.93K Comments
Joined 3 years ago
Cake day: June 14th, 2023

  • I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas

    It’s a conversation you’re having on the internet with an agent that sounds like a human. People get invested for the same reason they get catfished.

    It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her.

    That’s the nut of it. And ChatGPT tends to mix the pastiche of a well-researched argument with the kind of feel-good self-affirmations that win over their audience. So you’re getting what looks - at first glance - to be good advice. And then you’re getting glazed on top of it. And then it’s designed to tell you what you want to hear, so you’re getting affirmation bias.

    I hope she gets better soon, and I hope you do too. Being overworked and stressed really destroys you and the people around you in many ways.

    I mean, that’s why human-to-human interactions are valuable. But it’s also why they’re difficult. Like any good medicine, it can taste bitter up front even if it’s what you need in the long run.


  • What kind of person do you have to be to become addicted like them?

    Human cognition degrades with stress, exhaustion, and trauma. If you’re in a position where turning to an AI for relationship advice seems like a good idea, you’re probably already suffering from one or more of the above.

    Also doesn’t help that AIs are sycophantic precisely because sycophancy is addictive. This isn’t a “type of person” so much as a “tool engineered towards chronic use”. It’s like asking “What kind of person regularly smokes crack?”

    Do you need to be very empathetic towards objects? Like seeing faces in everything and getting emotionally attached?

    I’ll give you a personal example. I have a friend who is currently pregnant and going through a bad breakup with her baby-daddy. She’s a trial lawyer by trade - very smart, very motivated, very well-to-do, but also horribly overworked, living by herself, and suffering from all the biochemical consequences of turning a single celled organism into a human being.

    As a result of some poorly conceived remarks, she’s alienated herself from a number of close friends to the point where we doubt there’s going to be a baby shower. Part of the impulse to say these things came from her own drama. But part of it came from her discovering ChatGPT as a tool to analyze other people’s statements. This has created a vicious behavioral spiral, in which she says something regrettable and gets a regrettable response in turn. She plugs the conversation into ChatGPT, because she has nobody else to talk to. And ChatGPT feeds her some self-affirming bullshit that inflates her ego far enough to say another stupid thing.

    To complicate matters, her baby daddy is also using ChatGPT to analyze her conversations. And he’s decided she’s cheated on him, the baby isn’t his, and she’s plotting to scam him.

    So now you’ve got two people - already stressed and exhausted - getting fed a series of toxic delusions by a machine that constantly reaffirms them in a way none of their friends or family will. It’s compounding your misery, which drives anxiety and sends you back to the machine that offers temporary relief. But the advice from the machine yields more misery down the line, raising your anxiety, and sending you back to the machine.

    What’s producing this feedback loop? You could argue it is the individual, foolish enough to engage with the machine to begin with. But that’s far more circumstantial than personality driven. If my friend didn’t have a cell phone, she wouldn’t be reaching for ChatGPT. If she wasn’t pregnant, she wouldn’t be so stressed and anxious. If she wasn’t in a fight with her boyfriend, she wouldn’t be feeding conversations into the prompt engine.




  • Ukrainian housewives are not producing 155mm artillery or jets. But when a drone solves the same problem as a 155mm shell, then perhaps they don’t need to. Similarly with jets.

    I don’t think anyone can suggest they solve the same problems. I might argue that the conflict is asymmetric. The Americans are trying to invade Iran, not the other way around.

    Western militaries (not all of them, but certainly the Americans in Iran right now) seem to love solving problems with cool, expensive tools, even when those tools aren’t well suited to the goal.

    The Western goal is to capture and control territory through terror bombing followed by an occupation.

    To that end, they need the kind of surveillance, range, and precision that drones lack.

    Iranians/Ukrainians aren’t trying to capture and control territory at a great distance. They’re trying to repel an invasion of territory they already own.

    They’re fighting a fundamentally different war.













  • I heard the big banks were trying something similar shortly before the '08 crash. And the Enron/WorldCom crew right before 9/11.

    Certainly possible they’ve got an exit strategy lined up. But the problem is that they’re always just a little too greedy and too high on their own supply. During the '14 mini-recession, reinflating the bubble economy was a bipartisan goal. After the '20 COVID crash, there was broad consensus on cranking open the money hose and flooding the economy with cheap cash. '08, '14, and '20 set a big historical precedent for the “We’ll never let you fail” policies of the federal government. And so we’ve diluted a lot of the short-term pain of economic contraction into the longer-term pain of currency inflation.

    The enormous devastation to real physical capital all across these Mid-Eastern theocracies, combined with the socio-economic pressures of Climate Change induced heat waves, can and will push certain regions of the globe to a breaking point. At some point, you just don’t have anything to spend all those excess dollars on.


  • I don’t begrudge rich people going to rich people prison, because the point of prison is to remove dangerous people from society, not to torture them in a cage. I do begrudge poor people going to poor people prison, because it seems as though these prisons exist as a means of extracting cheap labor from poor and PoC populations. Or outright abusing them - mentally, physically, and sexually - because this kind of brutality generates political rewards.





  • It is entirely true that all models from all manufacturers are compromised by spy agencies.

    I think there’s a little bit of space between “spy agencies employ systems professionals who know the guts of a component’s security and tricks to bypass it” and “every device’s firmware has a double super secret protocol for sidestepping all of its security features”.

    However the worst offender by far is Cisco even though they’re “American”.

    Sure. I’m willing to believe that Cisco, specifically, has relationships with the Five Eyes network such that it makes monitoring their traffic easier. Even then, there are limits. It’s one thing to say techniques exist to bypass security. It’s another entirely to know what those techniques are and whether they’re practical to apply at universal scale.

    One of the more chronic problems that big spy agencies have is sifting through all the spam and bullshit and empty chatter. Decryption takes time. And you can’t monitor everything, everywhere, all at once. The bigger sins of Cisco are in how they expedite access on behalf of their agency partners, not that they fail to produce perfectly hack-proof hardware.