OK, how interesting. I tend to stick to the academic EA literature and ‘visionary’ type EA books, and when I’m on this forum I generally avoid posts like these. So I suppose I might not encounter too much of this… appropriation… for lack of a better term. Are you implying that this self-serving quality is also infused into what it means for EAs to be rational? As if the economic definition of ‘rational self-interest’ is the definition of rational? Am I reading that correctly?
As NinaR said, ’round these parts the word “agentic” doesn’t imply self-interest. My own gloss of it would be “doesn’t assume someone else is going to take responsibility for a problem, and is therefore more likely to do something about it”. For example, if the kitchen at your workplace has no bin (‘trashcan’), an agentic person might ask the office manager to get one, or even just order in a cheap one themselves. Or if you see that the world is neglecting the problem of insect welfare, instead of passively hoping that ‘society will get its act together’, you might think about what kinds of actions individuals would need to take for society to get its act together, and consider doing some of those actions.
That’s the definition I would assume an EA would give to the question of what it means to have agency, but from my view it doesn’t mesh with how the OP describes agency, and therefore what is “agentic”, in either this post or the post this post is a self-response to. In that first post, the one this is referencing, the OP gives seven recommendations that are clearly self-interested directives (e.g., “Figure out what you need, figure out who can help you get it, ask them for it”). If karma is an indicator, that first post was well received. What I am getting here, if I try to merge your position with my view of the OP’s position, is that self-interested pursuits are a means to an end, a sort of eventual greater good? Like, I can pursue everything “agentic-ly” because my self-interest is virtuous (this sounds like Objectivism to me, btw)?
Now, I actually think this current post is about rolling back that original post, but only a little. Hence my questions about regulating ‘agency’ in others: I’m trying to get some parameters or lane markers for this concept here.
Neither ‘agency’ nor ‘agentic’ is a defined topic on the EA Forum, btw. This is partly why I am so interested in how people define them, since they seem to be culturally defined here (if in multiple ways) differently than they would be in non-critical philosophy and mainstream economics (in which EA is rooted).
When I read the original post that this one responds to, I am “reading in” some context or subtext based on the fact that I know the author/blogger is an EA; something like “when giving life advice, I’m doing it to help you with your altruistic goals”. As a result of that assumption, I take writing that looks like ‘tips on how to get more of what you want’ to be mainly justified by being about altruistic things you want.
I don’t fully understand your comment, but I think agency is meant to be a relatively goal-agnostic cognitive tool, similar to e.g. “being truth-seeking” or “good social skills.” Altruism is about which goals you load in, but at least in theory this is orthogonal to how well you can achieve those goals.
That definition aligns more with the traditional, non-critical philosophical and mainstream economic approach to agency, which means there are now basically three definitions of agency in this post and comment thread from EA peeps. I’m glad I asked, because I’ve always just assumed the standard non-critical philosophy and mainstream economics definition (since that’s the origin story of EA too) when encountering the term agency here, and that definition does not fit at all with the one intimated, or literally given, by the OP in this post or in the referenced original post (which was highly karma’d).
“Agency” refers to a range of proactive, ambitious, deliberate, goal-directed traits and habits.
I see that as a definition driven by self-interest, and the original essay confirms this. There’s probably room for debate there, but it’s definitely not a “relatively goal-agnostic cognitive tool” type of definition, for instance.
So, I think I’ve got to stop assuming a shared definition here for this one and, probably more importantly, stop assuming that EAs actually give it much consideration (there’d probably be a terms entry if its definition were valuable here, I suppose). I had no idea there were such post-y things in EA. How exciting. Ha!
If I were to guess what the ‘disagreement’ downvotes were picking up on, it would be this:
I see that as a definition driven by self-interest
Whereas to me, all of the adjectives ‘proactive, ambitious, deliberate, goal-directed’ are goal-agnostic, such that whether they end up being selfish or selfless depends entirely on what goal ‘cartridge’ you load into the slot (if you’ll forgive the overly florid metaphor).
I don’t think self-interest is relevant here if you believe that it is possible for an agent to have an altruistic goal.
Also, as with all words, “agentic” will have different meanings in different contexts, and my comment was based on its use when referring to people’s behaviour/psychology, which is not an exact science, so words are not being used in very precise, scientific ways :)
It was the OP that injected self-interest into the concept of agency, hence my original question. And I totally agree about the meaning of words varying. None of this is a science at all, in my opinion, just non-critical philosophy and economics and where the two meet and intertwine. I’m just trying to understand how EA and EAs define these things, that’s all.
Thanks so much for your input. I really appreciate it.
Yup, another commenter is correct that I’m assuming the goals are altruistic.