Thanks for the detailed response / sharing the resources! I’m familiar with them (I had been wondering if there was a version of (a) that didn’t involve the following modification, although it seems like we’re on a similar page)
To clarify, what I should have said is that while such an outcome could appear to be an error on the part of the AGI, it would really be a human error.
You’re welcome :)