Don't CPUs just multiply the GHz?
So, like, is a quad core at 4 GHz theoretically a 16 GHz single core?
Correct me if I'm wrong, but most applications use only a single core. When each core runs at 4 GHz, that application runs at 4 GHz. Because a core can essentially only execute one command at a time, running a second application means the computer has to alternate between the two programs' instructions and run them one after another, rather than truly at the same time. The command stream would look something like this:
do something for program 1
do something for program 2
do something for program 1
do something for program 2
So effectively, both programs run at 2 GHz each. Add a third program and each drops to roughly 4/3 GHz, and so on.
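To put numbers on that, here's a tiny sketch. The even 1/N split is an idealization (real schedulers juggle priorities, I/O waits, and idle time), but it matches the arithmetic above:

```python
# Idealized model: one 4 GHz core split evenly between N programs
# gives each program an effective 4/N GHz. Not how a real OS
# scheduler works in detail, just the fair-share arithmetic.

CLOCK_GHZ = 4.0

for n_programs in (1, 2, 3, 4):
    effective = CLOCK_GHZ / n_programs
    print(f"{n_programs} program(s) sharing the core -> ~{effective:.2f} GHz each")

# Output:
# 1 program(s) sharing the core -> ~4.00 GHz each
# 2 program(s) sharing the core -> ~2.00 GHz each
# 3 program(s) sharing the core -> ~1.33 GHz each
# 4 program(s) sharing the core -> ~1.00 GHz each
```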
Multiple cores let programs truly run in parallel: with a quad core, you can actually execute four commands at once rather than processing them one after another. If four different programs were "assigned" to four different cores, all of them would run at 4 GHz, without being slowed down by another program sharing their core.
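You can see this yourself with a quick experiment. This is a minimal sketch using Python's multiprocessing module; the countdown workload is made up, the point is that the OS is free to place each process on its own core:

```python
# Four CPU-bound processes on a quad core run simultaneously, so the
# total wall time is close to ONE task's time, not four times it.

import time
from multiprocessing import Process

def busy_work(n=50_000_000):
    while n:        # CPU-bound loop: keeps one core fully busy
        n -= 1

if __name__ == "__main__":
    start = time.perf_counter()
    workers = [Process(target=busy_work) for _ in range(4)]
    for w in workers:
        w.start()   # each process can land on its own core
    for w in workers:
        w.join()
    print(f"4 parallel tasks took {time.perf_counter() - start:.1f}s")
```

Run the same four tasks back to back on one core and you'd wait roughly four times as long.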
Because of this, if you had one single-threaded program running on a quad core at 4 GHz versus (for comparison's sake) a 10 GHz single core, the 10 GHz chip would win, because that one program can't actually use the extra cores to go faster.
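As a rough worked example (the workload size is invented, and this ignores real-world factors like memory stalls and instructions-per-cycle differences):

```python
# A single-threaded program can only ever occupy one core, so only
# one core's clock speed matters for it.

work = 40e9  # hypothetical: 40 billion cycles of single-threaded work

quad_core_4ghz = work / 4e9    # only ONE of the four 4 GHz cores is usable
single_10ghz   = work / 10e9   # the whole chip's speed is usable

print(f"quad core 4 GHz:    {quad_core_4ghz:.0f} s")   # 10 s
print(f"single core 10 GHz: {single_10ghz:.0f} s")     # 4 s
```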