In this slide, things look a little off to me. Clock cycle time, or clock period, is already the time required per clock cycle. My question is: does the term "Clock Rate" make sense here?
It also says, "Hardware designer must often trade off clock rate against cycle count." But these seem inversely related: if one increases the clock speed, the clock period (the time per clock cycle) shrinks automatically. Why would there be a choice?
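To spell out my understanding, clock rate and clock period are simply reciprocals, so (for example) a 2 GHz clock means a 0.5 ns period:

$$ f = \frac{1}{T} \quad\Longrightarrow\quad T = \frac{1}{2 \times 10^{9}\,\text{Hz}} = 0.5\,\text{ns} $$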
Or am I missing something?
First things first: slides aren't always the best way to discuss technical issues. Don't take any slide as gospel; there's a huge amount of handwaving going on to support gigantic claims with very little evidence.
That said, there are tradeoffs:
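The central one is that pushing the clock rate up often pushes the cycle count up too. Here's a minimal sketch using the textbook performance equation, CPU time = instruction count × CPI / clock rate; the two designs and every number below are hypothetical, purely to show the effect:

```python
# CPU time = instruction count * CPI * clock period
#          = instruction count * CPI / clock rate
def cpu_time(instruction_count: int, cpi: float, clock_rate_hz: float) -> float:
    """Execution time in seconds for one program on one design."""
    return instruction_count * cpi / clock_rate_hz

INSTRUCTIONS = 1_000_000_000  # same program, same instruction count on both

# Hypothetical design A: slower clock, but fewer cycles per instruction.
time_a = cpu_time(INSTRUCTIONS, cpi=1.0, clock_rate_hz=2.0e9)

# Hypothetical design B: 25% faster clock, but the deeper pipeline (or
# memory that is now slow relative to the clock) costs 40% more cycles
# per instruction.
time_b = cpu_time(INSTRUCTIONS, cpi=1.4, clock_rate_hz=2.5e9)

print(f"Design A: {time_a:.3f} s")  # 0.500 s
print(f"Design B: {time_b:.3f} s")  # 0.560 s -- faster clock, slower program
```

That's the sense in which the designer has a choice: raising the clock rate shrinks the period of each cycle, but the *number* of cycles the program needs can grow faster.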