If you have short timelines, I feel like your employment prospects in a fairly specific hypothetical should be fairly low on your priority list compared to ensuring AI goes well? Further, if you have short timelines, knowing about AI seems likely to be highly employable and lab experience will be really useful (until AI can do it better than us...)
I can buy “you’ll be really unpopular by affiliation and this will be bad for your personal life” as a more plausible argument
Hi! Some thoughts:
“If you have short timelines, I feel like your employment prospects in a fairly specific hypothetical should be fairly low on your priority list compared to ensuring AI goes well?”
- Possibly! I’m mostly suggesting that people should be aware of this possibility. What they do with that info is obvs up to them.
“if you have short timelines, knowing about AI seems likely to be highly employable and lab experience will be really useful”
- The people I’m pointing to here (e.g. you) will be unemployable or unuseful. Maybe we’re talking past each other a bit?
Why do you believe this? Being an expert in AI is currently very employable, and seems all the more so if AI is automating substantial parts of the economy
Ugh I wrote the wrong thing here, my bad (will update comment).
Should have said:
“- The people I’m pointing to here (e.g. you) will be employable and useful. Maybe we’re talking past each other a bit?”
I’m very confused. Your top level post says “your employment prospects could be damaged.” I am explaining why I think that the skillset of lab employees will be way more employable. Is your argument that the PR will outweigh that, even though it’s a major factor?