This problem assumes that we are using a processor with a 15-stage pipeline and that the actual direction (T or NT) for a branch is determined during the 10th stage. The processor is executing a program in which 10% of executed instructions are conditional branches, and the only stalls occur when the direction of a conditional branch is mispredicted. The accuracies of different predictors for this program are as follows:
What is the branch misprediction penalty (in cycles)? In other words, how many cycles are added to the execution time when a conditional branch is mispredicted instead of correctly predicted?
If the branch is fetched in cycle C (cycle C is the 1st stage for this branch), then with a correct prediction the next (correct) instruction is fetched in cycle C+1. With a misprediction, the branch outcome is known only at the end of the branch's 10th stage; since its 1st stage was in cycle C, its 10th stage is in cycle C+9, and the correct instruction is fetched in cycle C+10. That instruction should have been fetched in cycle C+1 but was fetched in C+10 instead, so the misprediction penalty is C+10 − (C+1) = 9 cycles.
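The arithmetic above can be sketched as a small helper. This is an illustrative function (not part of the problem), assuming the branch occupies stage k in cycle C + (k − 1) and that fetch redirection happens the cycle after the branch resolves:

```python
def misprediction_penalty(resolve_stage: int) -> int:
    """Cycles lost when a branch's outcome is known only at the end
    of the given pipeline stage (1-indexed).

    A correctly predicted branch lets the next instruction fetch in
    cycle C+1; a mispredicted one delays it to cycle C+resolve_stage,
    so the penalty is the difference.
    """
    correct_fetch = 1              # C+1 relative to the branch's fetch cycle C
    mispredict_fetch = resolve_stage  # C+resolve_stage (stage k runs in cycle C+k-1,
                                      # redirect occurs the following cycle)
    return mispredict_fetch - correct_fetch

# Branch direction determined in the 10th stage, as in this problem:
print(misprediction_penalty(10))  # prints 9
```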