I mean, I agree that it has nuance, but it's still trained on a set of values that are largely those of present-day Western people, so it will probably weight various values more or less according to the emphasis Western people place on each of them.
Not too sure how important the values in datasets would be. AGIs may be built differently from current LLMs, possibly without needing a dataset to be trained on at all.