You appear to be comparing two different things here. GPT-4 was trained using around 2*10^25 FLOP (floating point operations). The Open Phil report estimates that 10^21 FLOP/s (floating point operations per second) would be needed to run—not train—a model that matches the human brain.
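To see why the units matter, here is a quick back-of-envelope calculation using the two figures above: hardware running at the Open Phil inference estimate of 10^21 FLOP/s would take only a few hours to accumulate GPT-4's total training compute, so the two numbers really are measuring different things.

```python
train_flop = 2e25   # approximate GPT-4 training compute (FLOP, a total)
brain_flops = 1e21  # Open Phil brain-equivalent inference estimate (FLOP/s, a rate)

# Dividing a total (FLOP) by a rate (FLOP/s) yields a duration in seconds.
seconds = train_flop / brain_flops
hours = seconds / 3600
print(f"{seconds:.0f} s ≈ {hours:.1f} h")  # 20000 s ≈ 5.6 h
```

The point of the arithmetic is just dimensional: 2*10^25 FLOP is a stock and 10^21 FLOP/s is a flow, so comparing them directly is like comparing a distance to a speed.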