• Aeri@lemmy.world · 3 hours ago

    I know this is a joke, but wouldn’t the thing to do be to simulate the trading first?

    Tell it that it has 59 grand and ask it how it should invest. Then pretend it has made those trades and monitor how the stocks do (see the sketch below).
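
    Something like this rough paper-trading sketch, just to show the idea. All of the tickers, prices, and allocation here are made up placeholders; a real version would record whatever allocation the model actually suggests and pull prices from a market data feed.

    ```python
    # Paper-trade a pretend $59k budget: "buy" at the entry prices,
    # then value the same shares at the latest prices.

    STARTING_CASH = 59_000.00

    # Hypothetical allocation the model might return (ticker -> fraction of budget).
    suggested_allocation = {"AAPL": 0.40, "MSFT": 0.35, "VTI": 0.25}

    # Hypothetical closing prices on the day the pretend trades were made, and today.
    entry_prices = {"AAPL": 190.00, "MSFT": 410.00, "VTI": 265.00}
    latest_prices = {"AAPL": 196.50, "MSFT": 402.25, "VTI": 271.10}


    def simulate(allocation, entry, latest, cash):
        """Buy fractional shares at entry prices, then value them at latest prices."""
        shares = {t: (cash * frac) / entry[t] for t, frac in allocation.items()}
        return sum(shares[t] * latest[t] for t in shares)


    if __name__ == "__main__":
        value = simulate(suggested_allocation, entry_prices, latest_prices, STARTING_CASH)
        pnl = value - STARTING_CASH
        print(f"Paper portfolio value: ${value:,.2f} (P/L: {pnl:+,.2f})")
    ```

    No real money at risk, and you still get a number to judge the model’s picks by.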

  • nogooduser@lemmy.world · 4 hours ago

      I imagine that this is actually what happened.

      The AI Fix podcast regularly reports on testing of AI models: the testers run a given scenario many times and report what percentage of runs produced a specific outcome.

      It’s amazing how often the models exhibit human behaviour such as lying, hiding their mistakes, and resorting to blackmail. They’ve even been shown to behave like gambling addicts, as reported in this article.