So I was trying to use std::chrono::high_resolution_clock to time how long something takes to execute. I figured that you can just find the difference between the start time and end time...
To check that my approach works, I made the following program:
#include <iostream>
#include <chrono>
#include <vector>

void long_function();

int main()
{
    std::chrono::high_resolution_clock timer;
    auto start_time = timer.now();
    long_function();
    auto end_time = timer.now();
    auto diff_millis = std::chrono::duration_cast<std::chrono::duration<int, std::milli>>(end_time - start_time);
    std::cout << "It took " << diff_millis.count() << "ms" << std::endl;
    return 0;
}

void long_function()
{
    // Should take a while to execute.
    // This is calculating the first 100 million
    // fib numbers and storing them in a vector.
    // Well, it doesn't actually, because it
    // overflows very quickly, but the point is it
    // should take a few seconds to execute.
    std::vector<unsigned long> numbers;
    numbers.push_back(1);
    numbers.push_back(1);
    for(int i = 2; i < 100000000; i++)
    {
        numbers.push_back(numbers[i-2] + numbers[i-1]);
    }
}
The problem is, it outputs exactly 3000ms, when the actual runtime clearly wasn't that.
On shorter problems, it just outputs 0ms... What am I doing wrong?
Edit: If it's of any use, I'm using the GNU GCC compiler with the -std=c++0x flag on
Answer:
The resolution of the high_resolution_clock depends on the platform.
Printing the following will give you an idea of the resolution of the implementation you use:
std::cout << "It took " << std::chrono::nanoseconds(end_time - start_time).count() << std::endl;