Since around 2017, there has been substantial public interest in how YouTube's recommendation algorithms may negatively affect individuals and society. Governments, think tanks, the press, and other institutions have pressured YouTube to adjust its recommendations. You could think of this as our world's (indirect and corrupted) way of trying to instill humanity's values into YouTube's algorithms.
You might be interested in "Building Human Values into Recommender Systems: An Interdisciplinary Synthesis," as well as Jonathan Stray's other work on the alignment and beneficence of recommender systems.