Atomically precise manufacturing

Last edit: 19 Aug 2022 21:22 UTC by Leo

Atomically precise manufacturing (APM) is a proposed technology for assembling a wide variety of macroscopic structures molecule-by-molecule with atomic precision.

Evaluation

80,000 Hours rates atomically precise manufacturing a “potential highest priority area”: an issue that, if more thoroughly examined, could rank as a top global challenge.[1]

Further reading

Beckstead, Nick (2014) A conversation with Chris Phoenix on August 20, 2014, Open Philanthropy, August 20.

Beckstead, Nick (2015) Risks from atomically precise manufacturing, Open Philanthropy, June.

Drexler, K. Eric (2013) The physical basis of high-throughput atomically precise manufacturing, Metamodern.

Hilton, Benjamin (2022) Risks from atomically precise manufacturing, 80,000 Hours, July 29.

Muehlhauser, Luke (2017) Some case studies in early field growth, Open Philanthropy, August.

Regis, Ed (1990) Great Mambo Chicken and the Transhuman Condition: Science Slightly Over the Edge, Reading, Massachusetts: Addison-Wesley.

Rosales, Janna (2010) Drexler-Smalley debates, in David Guston (ed.) Encyclopedia of Nanoscience and Society, Thousand Oaks, California: SAGE Publications, pp. 170–171.

  1. ^ 80,000 Hours (2022) Our current list of pressing world problems, 80,000 Hours.

Risks from atomically precise manufacturing—Problem profile

Benjamin Hilton · 9 Aug 2022 13:41 UTC
53 points
4 comments · 5 min read · EA link
(80000hours.org)

My thoughts on nanotechnology strategy research as an EA cause area

Ben Snodin · 2 May 2022 9:41 UTC
137 points
17 comments · 33 min read · EA link

Risks from Atomically Precise Manufacturing

MichaelA🔸 · 25 Aug 2020 9:53 UTC
29 points
4 comments · 2 min read · EA link
(www.openphilanthropy.org)

Drexler’s Nanosystems is now available online

MikhailSamin · 1 Jun 2024 14:41 UTC
32 points
4 comments · 1 min read · EA link
(nanosyste.ms)

A new database of nanotechnology strategy resources

Ben Snodin · 5 Nov 2022 5:20 UTC
39 points
0 comments · 1 min read · EA link

What’s the Use In Physics?

Tetraspace · 30 Dec 2018 3:10 UTC
54 points
13 comments · 4 min read · EA link

Brian Tse: Risks from Great Power Conflicts

EA Global · 11 Mar 2019 15:02 UTC
23 points
2 comments · 13 min read · EA link
(www.youtube.com)

Problem areas beyond 80,000 Hours’ current priorities

Arden Koehler · 22 Jun 2020 12:49 UTC
280 points
62 comments · 15 min read · EA link

Case studies of self-governance to reduce technology risk

jia · 6 Apr 2021 8:49 UTC
55 points
6 comments · 7 min read · EA link

Some AI Governance Research Ideas

MarkusAnderljung · 3 Jun 2021 10:51 UTC
101 points
5 comments · 2 min read · EA link

Differential technological development

james · 25 Jun 2020 10:54 UTC
37 points
7 comments · 5 min read · EA link

Technological developments that could increase risks from nuclear weapons: A shallow review

MichaelA🔸 · 9 Feb 2023 15:41 UTC
79 points
3 comments · 5 min read · EA link
(bit.ly)

Database of existential risk estimates

MichaelA🔸 · 15 Apr 2020 12:43 UTC
130 points
37 comments · 5 min read · EA link

An Informal Review of Space Exploration

kbog · 31 Jan 2020 13:16 UTC
51 points
5 comments · 35 min read · EA link

Some thoughts on Toby Ord’s existential risk estimates

MichaelA🔸 · 7 Apr 2020 2:19 UTC
67 points
33 comments · 9 min read · EA link

Crucial questions for longtermists

MichaelA🔸 · 29 Jul 2020 9:39 UTC
104 points
17 comments · 19 min read · EA link

[Question] Is nanotechnology (such as APM) important for EAs to work on?

pixel_brownie_software · 12 Mar 2020 15:36 UTC
6 points
9 comments · 1 min read · EA link

‘Crucial Considerations and Wise Philanthropy’, by Nick Bostrom

Pablo · 17 Mar 2017 6:48 UTC
30 points
4 comments · 24 min read · EA link
(www.stafforini.com)

EA relevant Foresight Institute Workshops in 2023: WBE & AI safety, Cryptography & AI safety, XHope, Space, and Atomically Precise Manufacturing

elteerkers · 16 Jan 2023 14:02 UTC
20 points
1 comment · 3 min read · EA link

Some global catastrophic risk estimates

Tamay · 10 Feb 2021 19:32 UTC
106 points
15 comments · 1 min read · EA link