I'm working with Qt and OpenCV. I use QtOpenGL widgets and map OpenCV frames as textures onto a GL plane, which is really fast and works great.
Now I wonder if I could improve performance even more by using QThreads. Qt has mapping and reducing features implemented in QtConcurrent, so it should be possible to split the OpenCV frames and let multiple threads process them in parallel.
As an example: if my frame is 640x480 and I have 4 threads available, I would split the frame into four 640x120 strips and pass one to each thread.
The threads don't have any shared data, because every thread gets its own strip; at the end I just need to append the four strips, or copy them into a new frame.
My question is: do you think this will work and give me a boost in processing, or is there a bottleneck elsewhere?
First of all, my instinct is to say: "If it is really fast and works great, don't change it!"
If you just want to play around with threads, I recommend taking a look at QtConcurrent. I don't know anything about the performance differences between an implementation with QThread vs. QtConcurrent, but I prefer using QtConcurrent::run over raw threads (I don't have performance-critical rendering code).
Whichever you choose, make sure you don't create new threads for every frame: constructing a thread is expensive. A lightweight solution is to use QtConcurrent::run() to create N QFutures, call QFuture::waitForFinished() on each of them, and collect the results. However, I'm not convinced that QFutures are the right solution for real-time processing. An alternative approach is to use QThreadPool directly; again, make sure you don't create the pool anew every frame.