According to Wired, new research shows that training large language models on low-quality content degrades their cognitive abilities.
The effect resembles the “brain rot” that a steady diet of short, attention-grabbing content produces in people. In the experiment, two models were trained on different types of text: one on the kind of material typically used to train such models, the other on short social media posts written in simplified, promotional language. Beyond the general cognitive decline, the researchers found that the damage to models trained on low-quality content was difficult to undo through retraining.
As the researchers put it: “The more meaningless content spreads across social networks, the more it contaminates the data on which future models will be trained. Our findings suggest that once this kind of degradation takes hold, training on quality data may not fully repair it.”
Incidentally, we have previously written about how artificial intelligence is learning to win arguments through manipulation.
Source: People Talk