    printf("Actually took %11ld microseconds or %f seconds\n", tdif,
           tdif / D_MILLION);
    printf("Number of seconds per nanosleep was %f\n",
           (tdif / (double)COUNT) / MILLION);
    printf("Number of seconds per nanosleep should be %f\n",
           NANOSECONDS / D_BILLION);
    return 0;
}
The output on my machine was:
100 nanosleeps of 1000 nanoseconds
Should take 100 microseconds or 0.000100 seconds
Actually took 2007089 microseconds or 2.007089 seconds
Number of seconds per nanosleep was 0.020071
Number of seconds per nanosleep should be 0.000001
100 nanosleeps of 1000 nanoseconds
Should take 100 microseconds or 0.000100 seconds
Actually took 1998414 microseconds or 1.998414 seconds
Number of seconds per nanosleep was 0.019984
Number of seconds per nanosleep should be 0.000001
Debian Linux (kernel 2.4.27-2) on VMware 5.0
CPU: P4 2.8 GHz
Memory: 256 MB