Behohippy@lemmy.world to AI@lemmy.ml • The AI Feedback Loop: Researchers Warn Of "Model Collapse" As AI Trains on AI-Generated Content · 1 year ago

Any data sets produced before 2022 will be very valuable compared to anything after. Maybe the only way we avoid this is to stick to training LLMs on older data and prompt inject anything newer, rather than training for it.
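The "train on older data, prompt inject anything newer" idea described above is essentially retrieval-style context injection: keep the model's weights frozen on a pre-2022 corpus and supply post-cutoff material at inference time. A minimal sketch of that filtering-and-injection step, assuming a hypothetical document list and cutoff date (the model call itself is omitted):

```python
from datetime import date

# Assumed cutoff: only data published before this date went into training.
CUTOFF = date(2022, 1, 1)

def build_prompt(question, documents):
    """Inject post-cutoff documents into the prompt instead of training on them."""
    newer = [d for d in documents if d["published"] >= CUTOFF]
    context = "\n".join(f"- {d['text']}" for d in newer)
    return (
        "Use the following recent context (not in your training data):\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

# Hypothetical documents for illustration.
docs = [
    {"text": "Pre-cutoff fact.", "published": date(2021, 6, 1)},
    {"text": "Post-cutoff event.", "published": date(2023, 3, 15)},
]
prompt = build_prompt("What happened recently?", docs)
```

Only the post-cutoff document lands in the prompt; the older material is assumed to already be in the model's weights, so nothing AI-generated ever feeds back into training.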
Behohippy@lemmy.world to Selfhosted@lemmy.world · 1 year ago — My home server setup