25+ yr Java/JS dev
Linux novice - running Ubuntu (no windows/mac)

  • 0 Posts
  • 692 Comments
Joined 1 year ago
Cake day: October 14th, 2024





  • Agreed but at some point I am forced to work “at gunpoint” because I have a wife and kids who need a house and food and cars. I’m jealous of anyone in a position to simply quit.

    I work for a company that works for another company in the hospitality industry. The software system is being updated (as part of a much broader system change) to no longer allow non-binary or unspecified gender. We aren’t writing that part, but we have to support it. I consider it a shortsighted and cruel change. But I’ve also spent 7 of the last 30 months looking for work. I’m over fifty, and I’m currently trying to build my retirement savings back up from zero after that.

    I’m not walking away just because of this change. Instead, I’m making sure our software is easy to change back when the world is ready for that once again. That’s the best I can do, and I’ve worked for companies engaged in much greater evil.

    When I got hired on a contract for Uline, I’d never heard of them. Then I found out that they are huge contributors to the Republican Party, and I was glad when they decided to replace me on that contract, but I couldn’t just walk away. That was the most morally conflicted I’ve ever been at a job. But it gave my family the means to thrive, and that is my first goal.



  • I don’t agree with your premise. However, I don’t have a good argument against it at hand.

    There are intangible benefits to having a person in the role, and I feel like the number of roles where AI could perform well enough, and where mistakes wouldn’t cause customer satisfaction issues, regulatory compliance issues, or civil liability, is vanishingly small. Could they be 10% of jobs? Even 5%? I don’t think so.

    I could see an argument that if you have a team of 10 people, AI could let you cut one and expect the other 9 to pick up the slack. But how many teams even have ten people on them? Because I don’t think a team of 5 can lose one person and still be capable of the same work. I guess it might depend on the industry — I do have IT blinders on here.


  • I say this as someone generally bullish about AI: bullshit. I use it all the time. It’s helpful when you already know what you’re doing. Anything you do with AI at scale is going to have a number of fuckups, even if it’s mostly reliable — and for most purposes I wouldn’t even go that far.

    I see it all the time. I ask Cline to have Claude do a bunch of things and create a markdown file… and it does everything, including generating the markdown, but forgets to put it in a file and then acts confused when you tell it to put it in a file. If that were some financial report or contract, it could tank a whole business.




  • Not OP, but…

    It’s not always perfect, but it’s good for getting a tldr to see if maybe something is worth reading further. As for translations, it’s something AI is rather decent at. And if I go from understanding 0% to 95%, really only missing some cultural context about why a certain phrase might mean something different from face value, that’s a win.

    You can do a lot with AI where the cost of it not being exactly right is essentially zero. Plus, it’s not like humans have a great track record for accuracy, come to think of it. It comes down to being skeptical about it like you would any other source.