High CPU usage with SDL + OpenGL
I have a modern CPU (AMD FX 4170) and a modern GPU (NVidia GTX 660). Yet this simple program manages to fully use one of my CPU's cores. This means it uses one 4.2 GHz core to draw nothing at 60 FPS. What is wrong with this program?
#include <SDL/SDL.h>

int main(int argc, char** argv) {
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO);
    SDL_SetVideoMode(800, 600, 0, SDL_OPENGL | SDL_RESIZABLE);
    while(true) {
        Uint32 now = SDL_GetTicks();
        SDL_GL_SwapBuffers();
        int delay = 1000 / 60 - (SDL_GetTicks() - now);
        if(delay > 0)
            SDL_Delay(delay);
    }
    return 0;
}
Accepted answer
It turns out that NVidia's driver implements waiting for vsync with a busy loop, which causes SDL_GL_SwapBuffers() to use 100% of a CPU core. Turning off vsync in the NVidia Control Panel removes the problem.