That may well be. I’d say I understand the basic concepts, but others in this thread know the specifics, and how they play out in practice, in more detail than I do.
It does make me wonder why everyone hasn’t been doing it, if there are no drawbacks, though.
It is being used. Objective-C (used for macOS and iOS apps) has used reference counting since the language was created. Originally it was manual, but since 2011 it’s been automatic by default (ARC). And Swift, which has largely replaced Objective-C, only supports ARC; there’s no manual reference counting at all.

The downside is that reference counting doesn’t handle cycles, so the programmer has to be careful to avoid them. Also, the compiler has to insert the reference increment and decrement calls, and doing that efficiently is a significant engineering challenge for the compiler designers. Rust tracks ownership instead of references, which makes its compiler even more complicated. Rust’s system is a little like compile-time reference counting, though that’s not really accurate.

Python, Perl, and PHP also use reference counting, with a tracing GC (i.e. ‘normal’ GC) layered on top in Python and PHP to collect cycles. So the implicit assumption that reference counting isn’t widely used is false. Based on what I can find online, Python and JavaScript are by far the most used languages today and are roughly equal, so in that respect reference-counting GC is at least as popular as pure tracing GC.
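For the curious, here’s a minimal Swift sketch (my own illustration, not from anything above) of the cycle problem: two objects that strongly reference each other never drop to a count of zero, so ARC would leak them unless one side of the cycle is declared `weak`:

```swift
// Two nodes that point at each other. With two strong references,
// neither count ever reaches zero and both objects leak.
class Node {
    var next: Node?        // strong forward reference
    weak var prev: Node?   // weak back-reference breaks the cycle
    deinit { print("Node deallocated") }
}

var a: Node? = Node()
var b: Node? = Node()
a?.next = b   // a strongly holds b
b?.prev = a   // weak, so it doesn't keep a alive

a = nil       // both deinits run; if `prev` were strong,
b = nil       // the cycle would keep both nodes alive forever
```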
“Everyone” was the critical word there. OP is making it sound like there are literally no drawbacks. If that were so, I’m pretty sure tracing would have died out long ago. It has come up elsewhere in the thread that a lot of languages do use it.
Which is another reason I’m not so sure Roc is the answer we’ve all been waiting for. Then again, the first few Rust proponents would have sounded the same way.
Honestly I didn’t really follow OP’s meme or care enough to understand it, I’m just here to provide some context and nuance. I opened the comments to see if there was an explanation of the meme and saw something I felt like responding to.
Edit: Actually, I can’t see the meme. I was thinking of a different post. The image on this one doesn’t load for me.
“The answer we’ve all been waiting for” is a flawed premise. There will never be one language to rule them all. Even completely ignoring preferences, languages are targeted at different use cases. Data scientists and systems programmers have very different needs. And preferences are huge. Some people love the magic of Ruby and hate the simplicity of Go. I love the simplicity of Go and hate the magic of Ruby. Expecting the same language to satisfy both groups is unrealistic because we have fundamentally different views of what makes a good language.
By OP I meant the person I was arguing with. OOP’s image won’t load for me either now, but it was basically just a list of things that compile to LLVM.