Hello,
Yes, I believe we do see some people in the community talking about downstream ripple effects. My point here was that this is reflected neither in the values nor in the principles of CEA, which I believe would be important. I don't have specific examples (it seems to me that the RP portfolio is built to reflect your opinion of risk, not a factual risk). If I had to give some, I would say that many EA orgs aren't clear about their choices and the possible downsides of their work, e.g. GiveWell, The Life You Can Save, Giving Green. For example, on neither GiveWell's nor The Life You Can Save's website can you find the GHD vs AW dilemma, and on Giving Green's you can't find the possible downsides of advancing or decarbonising energy. I really appreciate the work they are doing.

More problematic for me are high-impact companies like OpenAI: they are making money developing a dangerous technology without acknowledging the risks they are taking. I realise my criticism isn't worth much, since I am too idealistic for a real world where funders probably don't want to hear about GHD vs AW. Nonetheless, I believe responsibility should be part of the EA values, and we should be more cautious in areas where the community has a great impact (because it took hold of neglected issues like AI).