Hz is cycles per second, i.e. things that happen per second. For example, monitor refresh rates are expressed in hertz because they're the number of screen refreshes per second. In that sense it really is the same kind of measurement used for WiFi and network speeds (also things per second, where the things are bits of data).
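Since Hz is just "events per second", the same arithmetic covers refresh rates and data rates alike. A quick sketch (the 144 Hz and 100 Mbit/s figures are just illustrative values, not anything specific):

```python
# Hz = events per second, so the reciprocal is the time per event.
refresh_hz = 144                       # hypothetical monitor: redraws per second
frame_time_ms = 1000 / refresh_hz      # milliseconds between redraws
print(f"{frame_time_ms:.2f} ms per frame")   # ~6.94 ms

# Network speed works the same way, with bits as the "things".
link_bits_per_s = 100_000_000          # hypothetical 100 Mbit/s link
megabytes_per_s = link_bits_per_s / 8 / 1_000_000
print(f"{megabytes_per_s} MB/s")       # 12.5 MB/s
```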
Data is dumb and simple, so to speak. Saying how much goes through the connection pretty much covers everything that matters.
For more complicated devices such as processors, it’s much more difficult to summarise what they’re really doing. So we call it ‘cycles’ and give it a number (4GHz!) and pretend that says it all. It doesn’t. To really understand what’s happening, you’d have to study microarchitecture, which also differs between manufacturers. Two processors at equal clock speed are not necessarily equally good. For comparisons, reviewers often use benchmark software or test performance in games, with varying results.
Tradition? Marketing? It’s an easy metric?
In truth it’s not a universal measure of how much processing power a processor has. It’s useful for comparing CPUs within a family, but not necessarily across chipsets. There are other metrics. Folks used to care a lot about FLOPS (floating-point operations per second; these days it’s giga-FLOPS or tera-FLOPS), which is more about throughput. Don’t they track matrix operations on GPUs?
I mostly stopped paying attention once everything got fast enough that it didn’t really matter much any more. I’m almost never CPU bound any more anyway.
Sometimes a more useful metric of performance for a CPU is MIPS (millions of instructions per second) or FLOPS (floating-point operations per second). These are handy because they take into account pipelining and other factors that can speed up execution.
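You can get a feel for what a FLOPS number means with a crude timing loop. This is a rough illustration only, not a real benchmark: Python's interpreter overhead dominates, so it badly understates what the hardware can do (proper benchmarks like LINPACK are far more careful):

```python
import time

def estimate_flops(n=1_000_000):
    """Very rough FLOPS estimate: time n floating-point multiplies."""
    x = 1.0000001
    acc = 1.0
    start = time.perf_counter()
    for _ in range(n):
        acc = acc * x              # one floating-point multiply per iteration
    elapsed = time.perf_counter() - start
    return n / elapsed             # operations per second

print(f"~{estimate_flops():,.0f} FLOPS (interpreter-limited)")
```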
But why stop there? There are published benchmarks designed for all kinds of computer use, from gaming to AI to workbench use and beyond, that carefully simulate real world experiences.
Hertz represents operations per second. An operation is not a discrete number of bits, since you’re not *moving* data so much as *changing* it. Ex: an AND operation inputs two bits and outputs one. An addition operation inputs two or more bits and outputs at least two. The result is that an AND operation involves three bits, while an addition involves four or more, yet both operations may take the same amount of time.
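The bit bookkeeping above can be sketched in code (my own illustration, using a full adder, the common hardware building block that also accepts a carry-in):

```python
def and_gate(a, b):
    # 2 bits in, 1 bit out: 3 bits involved
    return a & b

def full_adder(a, b, cin):
    # 3 bits in (a, b, carry-in), 2 bits out (sum, carry-out): 5 bits involved
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

print(and_gate(1, 1))        # 1
print(full_adder(1, 1, 0))   # (0, 1): 1 + 1 = binary 10
```

Both functions take roughly the same time to evaluate, even though the adder touches more bits, which is the point made above.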