I am coding a side-channel attack demo. I expected a simple character-by-character check with an early return to take time proportional to the position of the first incorrect character. For example, with the password "password", I expected an input of "qassword" (first letter wrong) to take less time to compare than "passwore" (last letter wrong), because the check exits early, on the first wrong letter.
The loop in question is:
    for (int i = 0; i < 26; i++) {
        if (a[i] != in[i]) {
            return 1; // mismatch: exit early
        }
    }
    return 1; // everything matched (the full code returns 1 in both cases)
In the full program below, this comparison is repeated 100,000 times so the timing difference is measurable, and I have disabled optimizations with -O0.
The output of the full code is:
    Diff(9326)
    Diff(2)
    Diff(1)
    Diff(9292)
This is surprising to me: the first case checks all 26 letters (OK, that should take the longest), the second case matches everything up to the last letter before failing (so it should take about the same amount of time as the first case?), and the third case fails on the very first letter (OK, that should take the least amount of time).
My question is: why does the completely correct case take so long, while every incorrect case is quick, even when the wrong letter is in the middle?
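For scale, those numbers are raw clock() ticks. Here is a minimal helper I use to also see them in seconds, assuming the usual CLOCKS_PER_SEC of 1,000,000 (the name print_diff is just my own):

    #include <stdio.h>
    #include <time.h>

    /* Prints a clock() difference as raw ticks and as seconds.
       Assuming CLOCKS_PER_SEC is 1,000,000 (as on the POSIX systems I tried),
       Diff(9326) comes out to roughly 0.0093 seconds. */
    static void print_diff(clock_t diff) {
        printf("Diff(%ld) = %.6f s\n", (long)diff, (double)diff / CLOCKS_PER_SEC);
    }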
The full code is (Try It Online):
    #include <stdio.h>
    #include <time.h>

    int check(char* in) {
        const char* a = "abcdefghijklmnopqrstuvwxyz";
        // Repeat the comparison 100,000 times so the timing difference is measurable.
        for (int j = 0; j < 100000; j++) {
            for (int i = 0; i < 26; i++) {
                if (a[i] != in[i]) {
                    return 1; // mismatch: exit early
                }
            }
        }
        return 1;
    }
    int main() {
        clock_t start;
        clock_t diff;

        // All correct
        start = clock();
        check("abcdefghijklmnopqrstuvwxyz");
        diff = clock() - start;
        printf("Diff(%ld)\n", diff);

        // Last letter wrong
        start = clock();
        check("abcdefghijklmnopqrstuvwxya");
        diff = clock() - start;
        printf("Diff(%ld)\n", diff);

        // First letter wrong
        start = clock();
        check("zbcdefghijklmnopqrstuvwxyz");
        diff = clock() - start;
        printf("Diff(%ld)\n", diff);

        // Check if cache issue (same as case 1)
        start = clock();
        check("abcdefghijklmnopqrstuvwxyz");
        diff = clock() - start;
        printf("Diff(%ld)\n", diff);
    }