Android 4.2 and clock_gettime

Date: 2014-03-08 05:46:57

Tags: java android c android-ndk java-native-interface

I am using these functions to measure how many milliseconds it takes to execute a function:

#include <time.h>

/* Subtract two timespecs (end - start), borrowing from tv_sec
   when the nanosecond field would go negative. */
struct timespec diff_timespec(struct timespec start, struct timespec end)
{
    struct timespec result;

    if (end.tv_nsec < start.tv_nsec)
    {
        result.tv_nsec = 1000000000 + end.tv_nsec - start.tv_nsec;
        result.tv_sec = end.tv_sec - 1 - start.tv_sec;
    }
    else
    {
        result.tv_nsec = end.tv_nsec - start.tv_nsec;
        result.tv_sec = end.tv_sec - start.tv_sec;
    }

    return result;
}

/* Collapse a timespec difference into a single nanosecond count. */
long long nanosec_elapsed(struct timespec diff)
{
    return ((long long)diff.tv_sec * 1000000000) + diff.tv_nsec;
}
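As a quick sanity check of the two helpers, here is a minimal standalone example (the timestamp values are made up for illustration) exercising the nanosecond-borrow branch; it assumes the two functions above are in scope:

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* end.tv_nsec < start.tv_nsec, so diff_timespec() must borrow a second */
    struct timespec start = { 5, 900000000 };  /* t = 5.9 s */
    struct timespec end   = { 6, 100000000 };  /* t = 6.1 s */

    struct timespec diff = diff_timespec(start, end);
    long long dt = nanosec_elapsed(diff);

    /* Prints: 200000000 ns = 200.000000 ms */
    printf("%lld ns = %f ms\n", dt, (double) dt / 1000000.0);
    return 0;
}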

I have a function A that calls another function B. Function A is exposed to Java through JNI, so it can be called from Java code. The OS is Android 4.2 and the language, obviously, is C.

void functionB(void);  /* defined below */

void functionA(void)
{
    struct timespec t1;
    struct timespec t2;
    struct timespec diff;
    long long dt;
    double delta;

    clock_gettime(CLOCK_MONOTONIC, &t1);
    functionB();
    clock_gettime(CLOCK_MONOTONIC, &t2);
    diff = diff_timespec(t1, t2);
    dt = nanosec_elapsed(diff);
    delta = ((double) dt) / 1000000.0;  /* total time for functionB, in ms */
}
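Function A is the one exposed through JNI. The bridge itself is not shown in the post; a minimal sketch of what it presumably looks like follows, where the Java-side class name (com.example.Timing) is an assumption:

#include <jni.h>

/* Hypothetical JNI bridge; the package and class name are assumptions. */
JNIEXPORT void JNICALL
Java_com_example_Timing_functionA(JNIEnv *env, jobject thiz)
{
    (void) env;   /* unused */
    (void) thiz;  /* unused */
    functionA();
}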


void functionB(void)
{
    struct timespec t1;
    struct timespec t2;
    struct timespec diff;
    long long dt;
    double delta1;
    double delta2;
    double delta3;

    clock_gettime(CLOCK_MONOTONIC, &t1);
    /* 1. Here I do some stuff */
    clock_gettime(CLOCK_MONOTONIC, &t2);
    diff = diff_timespec(t1, t2);
    dt = nanosec_elapsed(diff);
    delta1 = ((double) dt) / 1000000.0;  /* step 1, in ms */

    clock_gettime(CLOCK_MONOTONIC, &t1);
    /* 2. Here I do some more stuff */
    clock_gettime(CLOCK_MONOTONIC, &t2);
    diff = diff_timespec(t1, t2);
    dt = nanosec_elapsed(diff);
    delta2 = ((double) dt) / 1000000.0;  /* step 2, in ms */

    clock_gettime(CLOCK_MONOTONIC, &t1);
    /* 3. Here I do the last stuff */
    clock_gettime(CLOCK_MONOTONIC, &t2);
    diff = diff_timespec(t1, t2);
    dt = nanosec_elapsed(diff);
    delta3 = ((double) dt) / 1000000.0;  /* step 3, in ms */
}

The problem I am running into is that, sometimes, there is a large difference between (delta1 + delta2 + delta3) and delta. In other words, function A reports that executing B took T1 ms, while function B itself reports that it took T2 ms... but T1 and T2 are sometimes so far apart that I am starting to question whether my measurements are valid at all.
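One way to check whether the missing time is spent with the thread scheduled out (rather than inside the three steps) is to sample a per-thread CPU clock alongside the monotonic clock. A minimal sketch, assuming CLOCK_THREAD_CPUTIME_ID is available on the device and the helpers from above are in scope (timed_region and work are hypothetical names):

#include <time.h>

/* Sketch: time the same region with a wall clock and a per-thread CPU
   clock. If wall_ms is much larger than cpu_ms, the thread was scheduled
   out (or blocked) inside the region, which would explain delta being
   much larger than delta1 + delta2 + delta3. */
void timed_region(void (*work)(void))
{
    struct timespec w1, w2, c1, c2;

    clock_gettime(CLOCK_MONOTONIC, &w1);
    clock_gettime(CLOCK_THREAD_CPUTIME_ID, &c1);
    work();
    clock_gettime(CLOCK_THREAD_CPUTIME_ID, &c2);
    clock_gettime(CLOCK_MONOTONIC, &w2);

    double wall_ms = (double) nanosec_elapsed(diff_timespec(w1, w2)) / 1000000.0;
    double cpu_ms  = (double) nanosec_elapsed(diff_timespec(c1, c2)) / 1000000.0;
    /* compare wall_ms against cpu_ms, e.g. via __android_log_print() */
    (void) wall_ms;
    (void) cpu_ms;
}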

Also, one of the tasks I am measuring is sending UDP data.

Based on measurements taken in the way shown above, sending a small chunk of UDP data sometimes takes more than 5 ms. An excerpt from my log:

It took 9.552002 ms to send 100 bytes to server
It took 23.071289 ms to send 124 bytes to server
It took 4.791260 ms to send 128 bytes to server
It took 6.134032 ms to send 86 bytes to server
It took 10.589599 ms to send 100 bytes to server
It took 6.591797 ms to send 117 bytes to server
It took 3.173829 ms to send 117 bytes to server
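For reference, a minimal sketch of how such a timed send might look (the original post does not show this code; the socket, destination address, and payload are hypothetical, and the helpers from above are assumed to be in scope):

#include <stdio.h>
#include <time.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netinet/in.h>

/* Hypothetical reconstruction of the timed UDP send. */
void send_timed(int fd, const struct sockaddr_in *addr,
                const char *buf, size_t len)
{
    struct timespec t1, t2;

    clock_gettime(CLOCK_MONOTONIC, &t1);
    ssize_t sent = sendto(fd, buf, len, 0,
                          (const struct sockaddr *) addr, sizeof(*addr));
    clock_gettime(CLOCK_MONOTONIC, &t2);

    double ms = (double) nanosec_elapsed(diff_timespec(t1, t2)) / 1000000.0;
    if (sent >= 0)
        printf("It took %f ms to send %zd bytes to server\n", ms, sent);
}

Note that sendto() on a UDP socket can itself block (e.g. on a full send buffer), so spikes in the log could come either from the kernel send path or from the thread being preempted between the two clock_gettime() calls.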

Can someone help me make sense of this?
