A number of comments:
Total Costs
-
Alex Tabarrok’s initial idea was about buying cheap mines, i.e. mines which are currently uneconomic to produce but could become economic in the future, thereby creating the opportunity to shut in future capacity. When you price off the top 30 US coal mines you are pricing current cash-flow producers, so yes, you need to pay a 3-5x cash-flow multiple. If you price off uneconomic mines you can buy a billion tonnes in the ground for between 10 and 100 million dollars. Far cheaper.
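To make the gap concrete, here is the per-tonne arithmetic. The $10-100M per billion tonnes figure is from above; the operating-mine cash flow and reserve numbers are made-up assumptions purely for comparison.

```python
# Rough cost-per-tonne comparison between uneconomic reserves and an
# operating mine priced at a cash-flow multiple. Only the
# $10-100M-per-billion-tonnes range comes from the comment; the
# operating-mine numbers are illustrative assumptions.

def cost_per_tonne(purchase_price_usd, reserves_tonnes):
    """Purchase price per tonne of coal left in the ground."""
    return purchase_price_usd / reserves_tonnes

# Uneconomic mine: a billion tonnes for $10M-$100M.
low = cost_per_tonne(10e6, 1e9)    # $0.01 per tonne
high = cost_per_tonne(100e6, 1e9)  # $0.10 per tonne

# Operating mine at a 4x multiple of an assumed $200M annual cash
# flow, with an assumed 500M tonnes of reserves.
operating = cost_per_tonne(4 * 200e6, 500e6)  # $1.60 per tonne
```

On these (assumed) numbers the uneconomic reserves are one to two orders of magnitude cheaper per tonne kept in the ground.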
-
There is a difference between mine acreage and reclamation area. In a typical open-cut mine as little as 10% of the land area is mined; in a typical underground mine it is as little as 2%. So much less land needs to be reclaimed than your calculation, which was based on the entire acreage, implies. Also, again, if you purchase uneconomic mines, they are typically not developed, so you don’t have to reclaim anything. If you purchase an operating mine, many have some form of reclamation reserve fund that you will have access to, and many also perform continuing reclamation (no govt will allow a mine to skip reclamation entirely once years have passed since the original mining of an area).
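The disturbed-fraction point can be sketched in a few lines. The 10% (open cut) and 2% (underground) figures are from above; the 10,000-acre lease size and $10k/acre reclamation cost are assumptions for illustration only.

```python
# Reclaimed-area arithmetic: only the disturbed share of a lease needs
# reclamation, so costing the whole acreage overstates the liability.
# The percentages come from the comment; the lease size and per-acre
# cost are assumed.

RECLAIM_COST_PER_ACRE = 10_000  # USD, assumed

def reclamation_cost(total_acres, disturbed_percent):
    """Cost to reclaim only the actually-disturbed share of the lease."""
    return total_acres * disturbed_percent // 100 * RECLAIM_COST_PER_ACRE

TOTAL_ACRES = 10_000  # assumed lease size

whole_lease = reclamation_cost(TOTAL_ACRES, 100)  # costing every acre
open_cut = reclamation_cost(TOTAL_ACRES, 10)      # ~10% actually mined
underground = reclamation_cost(TOTAL_ACRES, 2)    # ~2% actually mined
```

On these assumptions the whole-acreage estimate is 10x too high for an open-cut mine and 50x too high for an underground one.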
-
Legal barriers are high: not sure what the purpose of this para was. Yes? And so? For uneconomic mines, no govt will force you to mine. You can buy the mine and file a plan every year saying you are continuing to do technical work while waiting for prices to go up. You do need a skeleton technical team to manage the paperwork and reporting; hey, it’s a regulated sector. For economic mines, you can slow-walk the process. There are millions of ways. As long as you don’t come out and explicitly say you’re going to sit there forever and never mine, you should be fine.
-
Political risk: and so? As long as you respect the commercial contracts and legal policies in place, hire the necessary technical team, and produce the paperwork required, I don’t see any political risk EXCEPT if you explicitly market it as a total and permanent shut-in.
X) You don’t seem to have any knowledge of the industry. Let me tell it to you straight: commodity traders will shut in coal capacity if they get paid to do it. There’s no romance in the industry. If the mine is more valuable to the EA community as sequestered carbon than it is to coal mine owners as a cash-producing asset, they will sell it, shut it in, and lobby politically to make sure you can keep it shut in.
Alex’s original idea focused on NON-ECONOMIC coal. As you point out, there are carrying costs. So someone sitting on a billion tonnes of coal, on 10,000 acres, probably pays lease fees of 500k per year. If the coal is mineable at 50 bucks a ton, and the market price today is 25 bucks, the guy is eating a negative carry of 500k a year as a real call option premium on higher coal prices. He’ll sit there waiting for prices to hit 75 bucks a ton, and when they do he’ll try to produce.
When you buy the mine from him, you’re eliminating his cost of carrying the premium, but you have to pay him for the value of the option.
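The option framing above can be written out directly. All the dollar figures come from the example in the comment; this is just the carry and payoff arithmetic, not a real options-pricing model.

```python
# The seller's position framed as a call option on coal prices, using
# the numbers from the comment. Lease fees are the annual "premium";
# the payoff is the mining margin if prices recover.

LEASE_FEES_PER_YEAR = 500_000  # USD negative carry = annual premium
COST_TO_MINE = 50              # USD/tonne breakeven
MARKET_PRICE = 25              # USD/tonne today
TARGET_PRICE = 75              # USD/tonne at which he would produce

def premium_paid(years_waiting):
    """Total carry the holder has eaten while waiting for prices."""
    return LEASE_FEES_PER_YEAR * years_waiting

def payoff_per_tonne(price):
    """Margin per tonne if he mines at a given coal price."""
    return max(price - COST_TO_MINE, 0)

out_of_money = payoff_per_tonne(MARKET_PRICE)  # 0: uneconomic today
at_target = payoff_per_tonne(TARGET_PRICE)     # 25 USD/tonne margin
decade_of_carry = premium_paid(10)             # 5M USD of premium paid
```

Buying the mine means paying him roughly this option value up front, in exchange for taking the carry off his hands.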
I think the fundamental assumption that aligned AGI would cause dramatic economic growth is simply wrong. Like super duper wrong.
It’s important to differentiate between AGI on the one hand and superintelligence and God on the other. Most EAs are thinking about an omnipotent and omniscient being, not AGI.
Is being a high-IQ person in Russia or India that useful? Why then do their smart people migrate to the US? Because the US can use that intelligence (Sundar Pichai), while at home they might become a mid-level bureaucrat (Sundar Pichai’s dad).
Say you create an AGI and it comes out and tells you, “I’ve solved fusion, let’s build plants.” You go, “Cool bro, need He3? It’s on the moon, and it costs too much to go there; figure out how to get there cheaper first.”
So there are physical, logistical, real limits to what can be achieved in the physical world with lots of intelligence. Also economic ones. Because there won’t just be one AGI; there are likely to be multiple. At the very least, latency means one on Earth and one on Mars, and I suspect latency will dictate multiple on Earth.
So we are likely to find that other bottlenecks besides intelligence alone limit economic growth, and that these will have to be figured out. And AGI, again not being omniscient, will take time to figure things out. Like, tell it to solve aging and it comes back asking for 100 years of compute time... and that’s just not feasible.
This is what weirds me out about Yud and other EAs... it’s clearly a religious belief that we are creating an omnipotent being, rather than a perfectly ordinary intelligent creature that is still limited by the availability of data, compute, data storage, network latency, etc.