The GNU Compiler Collection (GCC) developers now need to decide on a policy for whether AI / Large Language Model (LLM) generated patches will be accepted into this open-source compiler stack.
GCC currently has no policy on permitting AI/LLM-generated patches, but a bug report today includes a patch posted by a user attempting to fix a GCC 16 compiler regression.


Well… I’d be curious to know how anyone would argue that AI-generated code is GPL-safe, especially considering these are often black-box, binary-blob models trained on mystery code from all over the internet, with zero attribution or license information, and with some of it almost certainly being proprietary.
Are we going to start ignoring the very licenses that we created?
Alternatively, I would argue that all LLM code output must be GPL, since it was trained partially on GPL code.
Either that, or LLM code output cannot be used for any purpose at all, by anyone.