Your Profession Is Replaceable, Your Intelligence Is Not
33,333 3 GHz Computers With ChatGPT vs You? Not Even Close...
You're way more efficient than 33,333 3 GHz computers loaded with ChatGPT, yet most people don't even think they're as smart as the computer they interact with every day... even though it takes 33,333 3 GHz computers to match the roughly 100 trillion instructions per second a human brain computes (https://www.scientificamerican.com/article/rise-of-the-robots/). You think I'm just flattering you? You're running at 100 trillion instructions per second. Maybe more. That's not me, that's Hans Moravec's figure.
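A quick sanity check on that equivalence, as a rough sketch: it takes Moravec's ~100 trillion instructions per second and crudely assumes a 3 GHz desktop retires one instruction per clock cycle.

```python
# Rough equivalence check: how many 3 GHz desktops match Moravec's
# ~100 trillion instructions per second estimate for a human brain?
# (Crudely assumes one instruction per clock cycle per machine.)
brain_ips = 100e12        # instructions per second (Moravec's figure)
cpu_hz = 3e9              # one 3 GHz desktop CPU
machines_needed = brain_ips / cpu_hz
print(f"{machines_needed:,.0f} machines")   # ~33,333
```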
One high-end PC consumes about 1 kilowatt-hour of electricity per hour of operation. That's about $0.30 if you have PG&E. 33,333 computers times $0.30 is roughly $10,000. An hour.
Even if you take 3,000 calories a day to run, that's only 3.489 kilowatt-hours. If you live 86 years, that means 3.489 kWh * 365 days * 86 years equals about 110 megawatt-hours to run you for your whole life: roughly $33k of electricity. Electricity is cheaper than food when it comes to calories: 2 pounds of rice is $2, times 365 days times 86 years equals about $63k, not including extras.
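Here's that energy-cost comparison as a rough sketch, assuming 1 kWh per PC-hour, $0.30/kWh, 3,000 kcal a day, an 86-year life, and $2 for 2 pounds of rice:

```python
KWH_PRICE = 0.30                      # rough PG&E rate, $/kWh
machines = 33_333

# Computer side: 1 kWh per machine per hour of operation
fleet_cost_per_hour = machines * 1 * KWH_PRICE          # ~$10,000/hr

# Human side: 3,000 kcal/day over an 86-year life
kcal_per_day = 3_000
kwh_per_day = kcal_per_day * 1.163 / 1000               # 1 kcal ~= 1.163 Wh -> ~3.489 kWh
lifetime_kwh = kwh_per_day * 365 * 86                   # ~110 MWh
lifetime_electric_cost = lifetime_kwh * KWH_PRICE       # ~$33k
lifetime_rice_cost = 2 * 365 * 86                       # $2 of rice per day -> ~$63k

print(round(fleet_cost_per_hour), round(lifetime_kwh),
      round(lifetime_electric_cost), lifetime_rice_cost)
```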
For the computer to work as many hours as you will, figure you work about 250 days a year, 8 hours a day, for 45 years. That's 90,000 hours. How many times will the computer break down and need new material inputs? How many times will software and security cause a system failure? As many times as you get sick, or goof off, or go on vacation? OK, then that evens out.
Where it doesn't even out is how often computing hardware fails and must be replaced. We need 90,000 hours of computing, or about 10 years of round-the-clock operation, to equal a human's working lifetime. If we replace the hardware every two years, we pay the initial capital outlay and then replace it 4 times.
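The working-lifetime and replacement arithmetic, assuming 250 workdays a year, 8-hour days, a 45-year career, machines running around the clock, and a 2-year hardware lifespan:

```python
labor_hours = 250 * 8 * 45                  # 90,000 hours over a 45-year career
hours_per_year_247 = 24 * 365               # 8,760 hours if the machines never stop
years_needed = labor_hours / hours_per_year_247    # ~10.3 years of continuous computing

lifespan = 2                                # assumed hardware replacement cycle, years
purchases = round(years_needed / lifespan)  # ~5: the initial outlay plus 4 replacements
print(labor_hours, round(years_needed, 1), purchases)
```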
Right now those 33,333 high-end desktop computers run about $2k each. The initial outlay is $66.6 million. Assuming Moore's law holds (twice as powerful every 2 years), by the fourth replacement generation, 10 years later, cumulative outlays would have reached about $129 million (we only need half as many computers every 2 years, though it's more complicated than that).
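And the cumulative hardware outlay, assuming Moore's law really does halve the number of machines needed with each 2-year generation:

```python
machines = 33_333
unit_price = 2_000                           # dollars per high-end desktop
initial_outlay = machines * unit_price       # ~$66.6 million

# Five purchases over ten years (years 0, 2, 4, 6, 8); each generation needs
# half as many machines as the one before it if Moore's law holds.
cumulative = sum(initial_outlay / 2**g for g in range(5))
print(f"${initial_outlay/1e6:.1f}M initial, ${cumulative/1e6:.0f}M cumulative")  # ~$129M total
```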
If it costs $10k/hr to run an equivalent number of computers and they need to match 90,000 hours of human labor, that's $900 million just to run a machine equivalent. Add the $129 million for the computers themselves and that gives us $1.029 billion. That doesn't even include human technical maintenance of the machines. We won't even get into the robotics required for machine intelligence to run machinery independently, or what that costs.
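Putting the running cost and the hardware cost together for one machine equivalent of a single career:

```python
electricity_per_hour = 10_000           # dollars, for the whole 33,333-machine fleet
labor_hours = 90_000
electricity_total = electricity_per_hour * labor_hours    # $900 million

hardware_total = 129e6                  # cumulative hardware outlay from above
machine_equivalent = electricity_total + hardware_total   # ~$1.029 billion
print(f"${machine_equivalent/1e9:.3f}B per machine equivalent")
```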
At an extremely optimistic best, 3.1% of people would be worth replacing, because that's roughly everyone from the third IQ standard deviation up. Even if we drop it to 1% of humans worth replacing, out of 8 billion people that's 80 million geniuses. Do the math.
80,000,000 geniuses * $1.029 billion per machine equivalent = $82.32 quadrillion.
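That multiplication, spelled out:

```python
geniuses = 0.01 * 8e9                   # 1% of 8 billion people = 80 million
machine_equivalent = 1.029e9            # dollars, from the calculation above
total = geniuses * machine_equivalent   # ~$82.32 quadrillion
print(f"${total/1e15:.2f} quadrillion")
```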
This doesn't include the capital outlay to build all that yet-to-be-built machinery. Even if we spread it over ten annual installments, that's still $8.232 quadrillion of production per year. Is it even possible to produce that much electricity? If the incremental capital-output ratio is about 3, that means we need almost $247 quadrillion of investment in computing and electricity alone. Not including logistics and other infrastructure.
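The installment and investment figures, using the roughly 3-to-1 capital-output ratio the totals above imply (treat that ratio as an assumption, not a measurement):

```python
total_machine_cost = 82.32e15            # dollars, from above

# Spread over ten annual installments
per_year = total_machine_cost / 10       # ~$8.232 quadrillion of production per year

# Scale by an assumed incremental capital-output ratio of ~3
icor = 3
required_investment = total_machine_cost * icor    # ~$247 quadrillion
print(f"${per_year/1e15:.3f}Q per year, ${required_investment/1e15:.0f}Q of investment")
```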
A country like the USA invests about 20% of its $23 trillion GDP per year. So at the current rate, it would take roughly 53,687 years to invest enough to automate what already exists on Earth. Even if the USA only automated to replace its own 1% of geniuses, that's still 3.3 million * $1.029 billion = $3.395 quadrillion. That would still take about 738 years.
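And the payback horizon at a US-scale investment rate (20% of a $23 trillion GDP; the 3.3 million figure is roughly 1% of the US population):

```python
us_gdp = 23e12
annual_investment = 0.20 * us_gdp                 # ~$4.6 trillion per year

required_investment = 82.32e15 * 3                # ~$247 quadrillion, from above
print(round(required_investment / annual_investment))    # ~53,687 years

us_geniuses = 3.3e6                               # ~1% of the US population
us_cost = us_geniuses * 1.029e9                   # ~$3.395 quadrillion
print(round(us_cost / annual_investment))                # ~738 years
```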
How much rare-earth metal, fossil fuel, and other finite resources would it take to get these damn machines running for just 10 years? It's not even feasible. Fusion, you say? That changes the capital outlay and probably vastly increases the productivity rate, but there are non-energy inputs that will fail first. And we haven't even gotten into how much it costs to add physical action to a machine's repertoire (i.e., robotics).
Even with never-before-seen scientific breakthroughs and exponential productivity gains, it still seems unlikely that computers could ever catch up to the human thinking class. Professions will change, but the skill required will increase. That may remove some humans from the workforce, but we have just bracketed one side of our solution: the smartest cannot be replaced.
PS: I had taken this down because I thought maybe I had done my calculations horribly wrong, but it turns out I underestimated the cost. How many chained GPUs per motherboard? Coolant? Facility? Downtime from hardware failures? ENIAC was often down because there was always some piece of hardware that wasn't working. Here are some hardware specs showing how huge the power draw is for these systems: https://www.tomshardware.com/reviews/best-gpus,4380.html https://wccftech.com/amd-rdna-3-gpu-radeon-rx-7000-enhanced-raytracing-capbilities-higher-clocks-av1-dp-2-0-support/ https://beebom.com/intel-core-i9-13900k-review/