This problem assumes a processor with a 15-stage pipeline in which the actual direction (T or NT) of a branch is determined in the 10th stage. The processor executes a program in which 10% of executed instructions are conditional branches, and the only stalls occur when the direction of a conditional branch is mispredicted. The accuracies of the different predictors for this program are: always-taken 45%, always-not-taken 55%, 2-bit 90%, global 95%, tournament 98%.
For each of these branch predictors, what is the CPI achieved by the processor?
The branch direction is resolved in the 10th stage, so each misprediction flushes the 9 instructions fetched after the branch, wasting 9 cycles per mispredicted branch. Therefore:
CPI with always-taken: 1 + 0.1 * (1 - 0.45) * 9 = 1.495
CPI with always-not-taken: 1 + 0.1 * (1 - 0.55) * 9 = 1.405
CPI with 2-bit: 1 + 0.1 * (1 - 0.9) * 9 = 1.090
CPI with global: 1 + 0.1 * (1 - 0.95) * 9 = 1.045
CPI with tournament: 1 + 0.1 * (1 - 0.98) * 9 = 1.018
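The arithmetic above can be checked with a short script; this is a minimal sketch, assuming the branch fraction (10%), misprediction penalty (9 cycles), and predictor accuracies given in the worked example (the constant and function names are just illustrative choices):

```python
BRANCH_FRACTION = 0.10    # fraction of executed instructions that are conditional branches
MISPREDICT_PENALTY = 9    # cycles flushed when the branch resolves in stage 10

def cpi(accuracy):
    """Base CPI of 1 plus average stall cycles from mispredicted branches."""
    return 1 + BRANCH_FRACTION * (1 - accuracy) * MISPREDICT_PENALTY

# Predictor accuracies from the problem statement.
predictors = {
    "always-taken": 0.45,
    "always-not-taken": 0.55,
    "2-bit": 0.90,
    "global": 0.95,
    "tournament": 0.98,
}

for name, accuracy in predictors.items():
    print(f"CPI with {name}: {cpi(accuracy):.3f}")
```

Running it reproduces the five CPI values computed above.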