• marcos@lemmy.world · 22 hours ago

    That’s how the C++ code should have looked all along. And the number of people who are surprised and complain about this is just more evidence that nobody should write C++. Ever.

      • Ephera@lemmy.ml · 17 hours ago

        I guess if you come from garbage-collected languages, you might be used to not needing the ampersands, because everything is automatically a reference in those…

        • XPost3000@lemmy.ml · 17 hours ago

          Yeah, this is how it was for me when I first started C++: I was used to any object beyond a simple 3D vector always being passed by reference.

          And then I read a C++ book my uncle gave me during a flight and realized that there isn’t any syntax for passing a parameter by copy, so obviously that’d have to be the default behavior. I’ve been passing by reference ever since.
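
          A minimal sketch of the two forms (the function names here are made up for illustration, not from the thread):

              #include <string>
              #include <vector>

              // Pass by value: C++'s default. The caller's vector (and every
              // string inside it) is copied on each call.
              void by_value(std::vector<std::string> words) { (void)words; }

              // Pass by const reference: only an alias is passed, nothing is copied.
              void by_reference(const std::vector<std::string>& words) { (void)words; }

              int main() {
                  std::vector<std::string> words{"a", "b", "c"};
                  by_value(words);     // copies the whole container
                  by_reference(words); // no copy, just a reference
              }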

          • Ephera@lemmy.ml · 16 hours ago

            Oh wow, what the hell. I’m not actually familiar with C++ (just with Rust, which gets similar reactions to the ampersands), but it’s insane that it just copies shit by default. I guess it comes from a time when people mostly passed primitive data types around the place. But yeah, you won’t even notice that you’re copying everything if it just does it automatically.

            And by the way, Rust did come up with a third meaning for passing non-references: it transfers ownership of the object, meaning no copy is made and the object can no longer be used in the scope that passed it on.
            That’s true except for data types that implement the Copy trait/interface, which is mostly implemented for primitive data types; those do then get treated the way C++ apparently treats everything.
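
            A minimal sketch of those three cases in Rust (the function names are made up for illustration):

                // Takes ownership: the String is moved in, no copy is made.
                fn take_ownership(s: String) { println!("owned: {s}"); }

                // Borrows: the caller keeps ownership, only a reference is passed.
                fn borrow(s: &String) { println!("borrowed: {s}"); }

                // i32 implements Copy, so passing it just copies the value.
                fn take_copy(n: i32) { println!("copied: {n}"); }

                fn main() {
                    let s = String::from("hello");
                    borrow(&s);         // s is still usable afterwards
                    take_ownership(s);  // s is moved; using it after this won't compile
                    // println!("{s}"); // error[E0382]: borrow of moved value: `s`

                    let n = 42;
                    take_copy(n);
                    println!("{n}");    // fine: i32 is Copy, so n was only copied
                }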