The “correct” way to use AI for coding (and anything really) is to ask for explanations / tutorials when you can’t find one online, then learn from that.
Never let it do something for you. That’s how you lose. If you’re not actively learning, you’re actively rotting, and that goes for life in general too.
So using it as my emotional dumping machine is wrong?
I don’t think that’s a good idea. If you can’t find an explanation online, that means there’s not much info available, in which case the best thing would be to ask on a forum; that way, other people looking for that info will find it too.
Not really, Google results have been just that bad for the last 10 years. I can spend 10 minutes looking for a piece of documentation and not find it, or I can prompt an internet-connected AI and have it spit out links to the relevant docs. It’s gotten THAT bad.
Except the “explanation” will frequently be 100% “hallucinated” bullshit.
People say the best way to see this is to ask the AI about a subject you’re an expert in.
That’s not always possible, though; I’ve had people tell me “but I’m not an expert at anything.” Another way is to ask it about yourself. For example, if you have a Reddit account with some age to it: Google has a deal with Reddit, and everything that’s posted there gets fed to Gemini. The first response might even look good, but keep the conversation going (it gets more and more ridiculous), and don’t try to correct it — you can watch it making shit up.
Since they’re feeding it everything, Lemmy might also work.
That’s why I always ask it to cite sources. It’s basically Google at this point, since Google is turning to shit and all the other search engines still aren’t quite as good.
Then why not just ask for the sources and read them yourself?
It could very easily use a completely different or hallucinated source.
But a lot of LLM products are now providing source links right in the response. I’ve found them useful, and hopefully they aren’t produced just by feeding the text back in and asking for a link.