Note: this Lemmy post was originally titled “MIT Study Finds AI Use Reprograms the Brain, Leading to Cognitive Decline” and linked to this article, which I cross-posted from this post in [email protected].

Someone pointed out that “Science, Public Health Policy and the Law”, the website which published this click-bait summary of the MIT study, is not a reputable publication deserving of traffic, so, 16 hours after posting, I am editing this post (as well as the two other cross-posts I made of it) to link to MIT’s page about the study instead.

The actual paper is here and was previously posted on [email protected] and other Lemmy communities here.

Note that the study, posted under its original title, got far fewer upvotes than the click-bait summary did 🤡

  • surph_ninja@lemmy.world · 7 days ago · +10 / −9

    And using a calculator isn’t as engaging for your brain as manually working the problem. What’s your point?

    • ayyy@sh.itjust.works · 7 days ago (edited) · +8 / −1

      It’s important to know these things as fact instead of vibes and hunches.

    • UnderpantsWeevil@lemmy.world · 7 days ago · +31

      Seems like you’ve made the point succinctly.

      Don’t lean on a calculator if you want to develop your math skills. Don’t lean on an AI if you want to develop general cognition.

      • BananaIsABerry@lemmy.zip · 6 days ago · +1 / −1

        > Don’t lean on an AI if you want to develop ~~general cognition~~ essay writing skills.

        Sorry, the study only examined the ability to respond to SAT writing prompts, not general cognitive abilities. Further, they showed that the ones who used an AI just went back to “normal” levels of ability when they had to write it on their own.

        • UnderpantsWeevil@lemmy.world · 6 days ago · +1

          > the ones who used an AI just went back to “normal” levels of ability when they had to write it on their own

          An ability that changes with practice

      • 5C5C5C@programming.dev · 7 days ago · +6

        I don’t think this is a fair comparison because arithmetic is a very small and almost inconsequential skill to develop within the framework of mathematics. Any human that doesn’t have severe learning disabilities will be able to develop a sufficient baseline of arithmetic skills.

        The really useful aspects of math are things like how to think quantitatively. How to formulate a problem mathematically. How to manipulate mathematical expressions in order to reach a solution. For the most part these are not things that calculators do for you. In some cases reaching for a calculator may actually be a distraction from making real progress on the problem. In other cases calculators can be a useful tool for learning and building your intuition - graphing calculators are especially useful for this.

        The difference with LLMs is that we are being led to believe that LLMs are sufficient to solve your problems for you, from start to finish. In the past, students who developed a reflex to reach for a calculator when they didn’t know how to solve a problem were thwarted by the fact that the calculator wouldn’t actually solve it for them. Nowadays students develop that reflex and reach for an LLM instead, and they can walk away with the belief that the LLM is really solving their problems, which creates both a dependency and a misunderstanding of what LLMs are really suited to do for them.

        I’d be a lot less bothered if LLMs were made to provide guidance to students, à la the Socratic method: posing leading questions to the students and helping them to think along the right tracks. That might also help mitigate the fact that LLMs don’t reliably know the answers: if the user is presented with a leading question instead of an answer then they’re still left with the responsibility of investigating and validating.
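
        Something like this would be trivial to wire up today (rough sketch only; the model name, the prompt wording, and the use of the openai Python client are placeholders I picked, not anything from the study):

        ```python
        # Hypothetical "Socratic tutor" wrapper: the system prompt forbids final
        # answers and asks for one leading question per turn.
        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        SOCRATIC_PROMPT = (
            "You are a tutor. Never state the final answer or do the work for "
            "the student. Reply with at most one short leading question that "
            "points them at the next step, and remind them to check each claim "
            "against a source themselves."
        )

        def tutor_reply(student_message: str) -> str:
            # Single-turn call; a real tutor would keep the conversation history.
            response = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=[
                    {"role": "system", "content": SOCRATIC_PROMPT},
                    {"role": "user", "content": student_message},
                ],
            )
            return response.choices[0].message.content

        print(tutor_reply("How do I prove that the sum of two even numbers is even?"))
        ```

        The whole difference is one system prompt sitting in front of the same model, which is exactly why it’s a product decision rather than a technical one.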

        But that doesn’t leave users with a sense of immediate gratification, which makes it less marketable and therefore a smaller opportunity to profit…

        • UnderpantsWeevil@lemmy.world · 7 days ago · +5

          > arithmetic is a very small and almost inconsequential skill to develop within the framework of mathematics.

          I’d consider it foundational. And hardly small or inconsequential given the time young people spend mastering it.

          > Any human that doesn’t have severe learning disabilities will be able to develop a sufficient baseline of arithmetic skills.

          With time and training, sure. But simply handing out calculators and cutting math teaching budgets undoes that.

          This is the real nut of the comparison. Telling kids “you don’t need to know math if you have a calculator” is intended to reduce the need for public education.

          > I’d be a lot less bothered if LLMs were made to provide guidance to students, à la the Socratic method: posing leading questions to the students and helping them to think along the right tracks.

          But the economic vision for these tools is to replace workers, not to enhance them. So the developers don’t want to do that. They want tools that facilitate redundancy and downsizing.

          > But that doesn’t leave users with a sense of immediate gratification

          It leads them to dig their own graves, certainly.

    • rumba@lemmy.zip · 7 days ago · +8 / −1

      Yeah, I went over there expecting it to be grandiose and not peer-reviewed. Turns out it’s just a cherry-picked title.

      If you use an AI assistant to write a paper, you don’t learn any more from the process than you do from reading someone else’s paper. You don’t think about it deeply and come up with your own points and principles. It’s pretty straightforward.

      But just like calculators, once you understand the underlying math, unless math is your thing, you don’t generally go back and do it all by hand because it’s a waste of time.

      At some point, we’ll need to stop using long-form papers to gauge someone’s acumen in a particular subject. I suspect you’ll be given questions in real time and need to respond to them on video with your best guesses to prove you’re not just reading it from a prompt.

    • Randomgal@lemmy.ca · 7 days ago · +3 / −8

      You better not read audiobooks or learn from videos either. That’s pure brainrot. Too easy.