Thanks Caleb. I’m heads down on a short-term project so I can’t give a long reply, but I have a few short things.
Raemon offered to do the heavy lifting on why the epistemic point is so important: https://www.lesswrong.com/posts/FNPXbwKGFvXWZxHGE/grant-applications-and-grand-narratives
What do you think of applicants giving percentile outcomes, terminal goals without justification (e.g., “improve EA productivity”), and citing other people to justify why their project is high impact?
Is that link correct?