The AI Feedback Loop: Researchers Warn Of "Model Collapse" As AI Trains on AI-Generated Content
As a generative model is trained on more and more AI-generated data, its performance degrades: errors and distortions compound with each successive generation, a failure mode researchers call "model collapse."
Data sets produced before 2022, when AI-generated text began flooding the web, may become far more valuable than anything collected afterward. Perhaps the only way to avoid collapse is to keep training LLMs on that older, human-written data and to supply newer information at inference time through the prompt (e.g., via retrieval), rather than training on it.
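The feedback loop can be illustrated with a toy simulation (a sketch, not any lab's actual experiment): fit a simple Gaussian "model" to some data, sample a new training set from the fitted model, refit, and repeat. Because each maximum-likelihood fit slightly underestimates the spread and each generation sees only the previous generation's samples, the distribution's tails erode and its variance drifts toward zero — a miniature model collapse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human-written" data — samples from a standard normal.
data = rng.normal(loc=0.0, scale=1.0, size=50)

for generation in range(1, 2001):
    # Each new "model" sees only the previous model's output: fit a
    # Gaussian by maximum likelihood, then draw the next training set
    # from the fitted model instead of from the real distribution.
    mu, sigma = data.mean(), data.std()  # np.std defaults to the biased MLE
    data = rng.normal(loc=mu, scale=sigma, size=50)
    if generation % 500 == 0:
        print(f"generation {generation:4d}: fitted std = {sigma:.4f}")
```

The printed standard deviation shrinks across generations even though generation 0 had std 1.0: the MLE's downward bias (a factor of (N−1)/N per generation) compounds, and sampling noise does the rest. Real LLM pipelines are vastly more complex, but the mechanism — rare, tail content disappearing first as models train on their predecessors' output — is the same one the collapse warnings describe.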