• pixxelkick@lemmy.world
    1 day ago

    My wife is a teacher; she's shown me vibed, handed-in assignments, and it's incredibly obvious.

    Right off the bat: if she gives an assignment to make, say, a slideshow on “Topic,” and they talked about examples A, B, and C in class, but the assignment goes off on tangents about topics F, G, and H instead, it’s an instant red flag.

    This happens cuz the student just copy-pastes the assignment blurb into GPT, but GPT has no context for what was discussed in class… so it goes off the rails instantly.

    It’s also easy to include poison pills in the middle of an assignment if they copy-paste it straight into GPT.

    Also, there’s all the usual markers: emoji, em dashes, and the assignment having way higher verbosity than you know damn well the kid has the vocabulary for. Suddenly they’re writing 7–8 grade levels higher than usual? Uh huh.

    From her and her teacher friends, I’ve been told it’s still extremely obvious to spot. And it’s pretty trivial to set up an assignment to poison-pill the AI.

    • cass80@programming.dev
      1 day ago

      What if the kid lies and says they didn’t use AI? How successful have they been in convincing admin and parents of the AI usage? While I agree it’s all damning, it’s still circumstantial evidence.

      • pixxelkick@lemmy.world
        20 hours ago

        She just failed the assignment, because it didn’t actually meet the requirements lol

        That’s the thing: if you simply design your assignment well, AI will just… fail at it, and you don’t have to prove they used AI — they simply didn’t pass.

      • Doomsider@lemmy.world
        1 day ago

        Back in the day, just one instance of plagiarism was very serious. If you got caught doing it more than once, you could get expelled.

        Now apparently everyone is using the plagiarism machine, including the professors. So much for academic integrity.

        • Tollana1234567@lemmy.today
          14 hours ago

          And then professors are using AI to sniff out your AI. Resumes are being written by AI, and employers are using AI to screen them too. Research papers are apparently done with AI as well, somehow sneaking into journals.

        • AllHailTheSheep@sh.itjust.works
          1 day ago

          I had a sustainability class where the professor used AI to write the course syllabus, assignments, and feedback. A fucking sustainability class.

          I contacted the office of the president at my university about it, but nothing ever came of it. Academia in general has gone off the rails with AI recently. I used to assume those with doctorates were bright enough to avoid AI, but evidently that’s not the case.

          • Tollana1234567@lemmy.today
            14 hours ago

            Professors are often too busy with their labs or research, so they relegate their writing/teaching to AI. Before, it was only done with PowerPoints that barely gave substance to a lecture. Those were the bad teachers, the ones who just read off the slides or gave little context — come midterm time, students were like, wtf was she teaching in those lectures?

      • kent_eh@lemmy.ca
        1 day ago

        What if the kid lies and says they didn’t use AI?

        Have them re-do the assignment in a classroom with the teacher (or any other proctor) present.

      • Jestzer@lemmy.world
        1 day ago

        I don’t teach kids, so I don’t know the answer to this, but I imagine what you’d do is add guidelines to the assignment so that students lose significant points, or fail, if they don’t specifically mention things discussed in class and in previous assignments.

        I’d also like to point out that, yes, we know when kids and adults lazily insert a prompt and lazily paste its response, but anybody with half a brain knows they only need to spend an extra 15 minutes re-prompting and editing to make it nearly unnoticeable.

        The answer is probably to test them in person with no computer of any kind in front of them.

    • Doomsider@lemmy.world
      1 day ago

      Amateurs. Everyone knows you record the classroom discussion, transcribe it to text, and then feed it into the LLM for context.

    • trougnouf@lemmy.world
      1 day ago

      I was unable to get Mistral’s AI to output an emoji recently. They forbid it in the system prompt, and it wouldn’t give one out even for a pretend life-or-death situation.