I reached out to the events team and they sent me this link :)
Transformer Weekly is great: https://www.transformernews.ai/
Summarises the AI news of the week every Friday.
Compare Giving What We Can’s page, which has good branding and simple language. IMO the 80,000 Hours page has too much text and too much going on on the front page. Bring both websites up on your phone and judge for yourself.
My understanding is that 80k have done a bunch of A/B testing which suggested their current design outcompetes ~most others (presumably in terms of click-throughs / amount of time users spend on key pages).
You might not like it, but this is what peak performance looks like.
Since 2022, we’ve partnered with major news outlets, but our ambition is to scale Tarbell far beyond this. We hope to grow into an AI journalism organisation of comparable size and prestige to the Pulitzer Center by 2029.
We’re a small but growing organisation. To reach our impact potential, we need to hire world-class people. These first few hires are crucial: I expect they could be the most important the organisation ever makes, as they will determine our trajectory.
If you could take ~5 mins to help, I would be enormously appreciative:
- Share these job postings directly with suitable candidates (template messages here)
- Refer specific people via this form
- Apply here by July 21 (if you’re personally interested in being considered)
all aspects of effective autism
Can’t tell if the correction is a joke or a genuine misunderstanding, but I enjoyed this even more.
Out of interest:
- What are your other two priorities?
- How will you know if you’ve been successful in “improving your comms”? Curious to hear if you have a more specific OKR here.
- How much did this impact assessment cost to commission? Are you open to others reaching out to commission similar assessments?
(Feel free to DM your responses if you prefer, though I expect others might benefit from this info too)
What’s the forum etiquette on advertising jobs?
Context: Training for Good is hiring for two exciting roles. I expect a bunch of great applicants to be here, right now, on this forum.
BUT I suspect that top-level posts advertising jobs decrease the average user’s experience. Maybe that’s outweighed by the possibility of TFG reaching a top candidate (but I’m motivated to believe that, so I don’t really trust it). Plus, it feels like a tragedy-of-the-commons type scenario. So I’ve decided to post it as a quick take instead.
___________
Training for Good is hiring for two exciting roles. Come join us as a founding employee.

- Compensation: $40-60k, depending on location and experience.
- Location: Remote, with 3+ hours of overlap with UK working hours
- Closing date: 13th August 2023, 11:59 PM Anywhere on Earth (we’ll be reviewing applications on a rolling basis until then)
- Responsibilities: You’ll be responsible for designing and delivering AI training programmes for early-career journalists and EU policymakers. Within 3-6 months, we expect you to transition to leading one of our fellowship programmes.

- Compensation: $30-45k, depending on location and experience.
- Location: Remote, with 3+ hours of overlap with UK working hours
- Closing date: 13th August 2023, 11:59 PM Anywhere on Earth (we’ll be reviewing applications on a rolling basis until then)
- Responsibilities: You’ll be responsible for leading, implementing, and innovating on Training for Good’s operational and administrative processes. The work is varied and includes organising in-person training weeks, managing financial operations, optimising vetting processes, and overseeing HR and legal procedures.
Not sure why you unendorsed this.
I run Training for Good & also found this wild!
After a quick read, this was my first thought too (i.e. that promoting & advocating for the use of what3words might be an easier solution).
Curious why you chose the name “the GPI” (Global Prosperity Institute)?
Seems ripe for confusion with GPI (Global Priorities Institute)
Yep, we discontinued it. We suggest using the EA Opportunities Board instead: https://ea-internships.pory.app/
Great work—excited to see so much growth across the podcast, one-on-one service & job board! I’m curious about web engagement though.
Web engagement hours fell by 20% in 2021, then grew by 38% in 2022 after we increased investment in our marketing.
This implies that engagement hours rose by only ~10% in 2022 compared to 2020. That’s less than I would have expected, given the marketing budget rose from ~$120k in 2021 to $2.65m in 2022. I’m assuming spend was also ~$120k in 2020 (though this might not be true). Even if we exclude the free book giveaway (~$1m), there seems to have been a >10x increase in marketing spend that translated to only a 10-38% rise in engagement hours (depending on whether you count from 2020 or 2021).
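To spell out the compounding behind that ~10% figure (a quick sanity check, taking 2020 as the baseline and applying the two quoted year-on-year changes):

$$0.80 \times 1.38 = 1.104 \approx 1.10$$

i.e. a net rise of roughly 10% in engagement hours between 2020 and 2022.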
See quote from this recent post for context on marketing spend:
In 2022, the marketing programme spent $2.65m (compared to ~$120k spent on marketing in 2021). The bulk of this spending was on sponsored placements with selected content creators ($910k), giving away free books to people who signed up to our newsletter ($1.04m), and digital ads ($338k).
I can think of a bunch of reasons why this might be the case. For example:
- Maybe the cost of acquiring new users / engagement hours rises steeply as you scale (i.e. diminishing returns to marketing spend)
- It looks like marketing drove a large increase in newsletter subs. Maybe they’re engaging with the content directly in their inbox instead?
- Maybe you expect a lag between initial reach and time spent on the 80k website for some reason (e.g. because people become more receptive to the ideas on 80k’s website over time, especially if they’re receiving regular emails with some info about it)
- Maybe marketing mainly promoted the podcast / 1-1 service / job board (or people reached by marketing efforts mainly converted to users of these services)
See screenshot from the full report for extra context on engagement hours, unique visitors & subscribers.

Again, great work overall. I’d be really curious to hear any quick thoughts anyone from 80k has on this.
Sounds like something @Hamish Doodles could do (if they were interested / thought this was sufficiently worthwhile)
Nice! Can you provide an update when it’s ready on Google Podcasts?
I think this comment would be much more helpful if it linked to the relevant posts about Leverage rather than just calling Geoff a “known cult leader”.
(On phone right now but may come back and add said links later unless Guy / others do)
Fwiw this doesn’t line up with my experience at all as someone who previously participated.
(n = 1 but I’d be very surprised to hear that the sentiment you describe above was commonplace among people who previously participated)
Suggestion: consider including a brief summary of the report in this forum post (e.g. the “key takeaways” section).
I’ve copied it below for ease of reference:
KEY TAKEAWAYS
There is no escape hatch for humanity, nor for the rich. Shelters that can reliably protect even a small group of humans against catastrophes that would otherwise make humanity extinct are probably infeasible due to multiple technical, psychological, social, political, and economic issues. Constructing “escape hatches” for the few, particularly for the rich and the powerful, would probably increase the net catastrophic and existential risk, as any benefits gained would almost certainly be offset by incentive hazards and further erosion of the perception that we are all in this together.
Self-sufficient space colonies that could protect against existential risks require technologies and skills that, if they existed, could be used more cheaply and reliably to create self-sustaining shelters on Earth. This will likely remain the case in the foreseeable future.
Even if a small group manages to survive a planetary catastrophe, only in some scenarios is it at all plausible that their descendants could repair the damage caused by a catastrophic outcome that global society failed to prevent and reconstitute technological civilization.
Therefore, to save civilization, one needs to save society. The best lifeboat is the ship; the best shelter is a functional society. Increasing the resilience of societies and their capability for cooperative action would increase humanity’s resilience against events that could cascade into existential risks while having obvious benefits in less dire circumstances as well.
Even though popular discussion about shelters tends to revolve around bunkers and stockpiles, the importance of organizational efforts, e.g. maintenance, training, and preparedness, cannot be overstated. No amount of material preparations or technology will help in a crisis if they do not work due to lack of maintenance, or if humans do not know how to use them. On the other hand, organizations that train to respond to disruptions can improvise even if they lack materials.
The solutions to the shelter problem are not primarily technological. As far as I’m aware, no one has been able to identify any foreseeable technologies that would offer substantial improvements in societal resilience or otherwise provide a significant reduction in existential risk, although research into resilience-enhancing, “resilient by default” and “gracefully failing” technologies and practices should probably receive more funding than it currently does. However, even here the primary problem is not technical but economic: more resilient technologies and practices often exist already but they tend to be more expensive to buy or to use.
Longer-term research programs could nevertheless develop cost-effective ways to increase resilience against catastrophes and permit easier or faster recovery from a disaster. One obvious partner would be research into self-sustaining ecosystems for space colonization. A demonstration facility for the long-term feasibility of a closed-loop life support system would also double as a shelter, even if the small scale of such habitats and likely reliance on the “iceberg” of external technical support raises serious questions about the contribution they could provide for existential risk reduction.
Natural, accidental, or deliberate release of one or more dangerous pathogens is widely seen as the threat with the most potential to precipitate an existential risk, although one should remember that the ongoing COVID-19 pandemic may bias this conclusion. A particularly worrisome prospect is the simultaneous, deliberate release of two or more pathogens, which could greatly confound efforts to detect and contain the outbreak.
The SHELTER meeting participants seemed to broadly agree that, with some exceptions, any single causal factor is unlikely to cause the extinction of humanity and is probably not sufficient to cause a catastrophic event. Instead, most existential risks and many catastrophic risks would probably be the result of several interacting mechanisms that e.g. prevent timely response to a risk that in theory should be manageable. Breakdown of the societal capability to act is thus a major risk multiplier. Single-cause risks that threaten human extinction, such as a nearly omniscient AI god, are probably risks that shelters cannot realistically protect against.
Existing efforts in disaster management, particularly in countries with already robust civil defense/disaster response capability (Finland, Sweden, Switzerland etc.) could probably be augmented by relatively low-cost means to reduce the likelihood of major catastrophe(s) a) cascading to existential risks and/or b) leading to serious, irrecoverable loss of accumulated knowledge. Empirical validation of proposed means for improving resilience and the probability of recovery is necessary.
Two of the shelter strategies that seemed to gather the most support are a) hardening existing facilities identified as crucially important for reducing the likelihood of disasters cascading into catastrophes or existential risks, e.g. biomedical research and manufacturing facilities, and b) maintaining or even increasing the geographical and cultural diversity of humanity by supporting or even creating relatively isolated communities and helping them increase their resilience against biological threats in particular.
Maintaining human geographical and cultural diversity by supporting relatively isolated communities would be a no-regrets strategy that would increase resiliency and provide tangible benefits to typically underserved communities today.
Any strategy that is adopted must gain buy-in from the people who are involved. Gaining acceptance from the people is particularly important when supporting isolated communities, most of whom have very good reasons to be extremely wary of outsiders trying to “help” them. A humble bottom-up approach that is guided by what the people themselves want and need is practically mandatory.
See also the Tarbell Fellowship, a one-year programme for early-career journalists interested in covering important topics such as global poverty & existential risks.
(Applications for the 2023 intake close on October 9th)
If you’re interested in supporting Tarbell, you can do so here (rather than via the Manifund link included above): https://www.every.org/tarbell/f/fund-in-depth-reporting
Happy to chat with potential donors, so please do email me if you have questions: cillian [at] tarbellfellowship [dot] org