I have a camera and I'm reading the images in real time into an array. I'm applying some algorithm to each image and displaying it. Then I get the next image and display it as well, so I'm streaming images from the camera to the display. However, I also want to save the images to hard disk once I've displayed them. I tried using the main thread but everything slowed down too much. I then tried using ThreadPool (see code below). This doesn't slow the display down, but I've found the images aren't being saved properly. It looks like they are not in the expected order, and after about 50 images have been saved the subsequent image data looks garbled. I'm guessing too many threads are being started.
Is there a better way to do this? I think I only need one thread to save the images. Maybe some kind of queue that saves each image sequentially, just as long as it's done in the background and doesn't slow down the display. If someone could post a code snippet that would be fantastic.
short[] image1 = new short[20000];
while (streaming)
{
    ReadImageFromCamera(ref image1);
    ImageData data;
    data.fileName = imageNumber;
    data.image = image1;
    ThreadPool.QueueUserWorkItem(WriteImageToFile, data); // Send the writes to the queue
}

private void WriteImageToFile(object imageData) {
    try {
        ImageData data = (ImageData)imageData;
        System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
        string fName = myDirectory + @"/" + Convert.ToString(data.fileName) + @".spe";
        using (Stream myStream = new FileStream(fName, FileMode.Create)) {
            bf.Serialize(myStream, data.image);
        }
    }
    catch (Exception) { }
}
I think you should avoid starting a new thread for each image. Since you have just a single hard drive and store all files in a single directory, you should use just one disk-writer thread. Then I'd recommend using some concurrent queue to transfer jobs from the camera thread to the disk-writer thread. I won't show a "code snippet" because this is not something you can write well in just a few lines of code.

Also, you definitely must allocate a new short[20000] for each image somewhere; otherwise it is overwritten by the next image before you save it to disk.

Also, I would expect that writing files in the main thread may be sufficient, because Windows uses concurrent techniques (mainly the disk cache) automatically when you write data to disk. Are you sure your hardware is fast enough to write all that data in real time?
When dealing with threads, ordering is no longer in your control. The thread pool can choose to schedule the threads in any order it likes. If you need things to happen sequentially in a specific order, threading does not make much sense anyway.
Regarding the corrupted images, it looks like the short[] image1 instance is being passed around. It is unclear what happens inside ReadImageFromCamera, but since you pass a pre-initialized array into it, chances are that the method uses that array and simply copies data into it (even though the ref keyword indicates that it might create a brand-new array instance and assign that instead). Then you pass that array instance to WriteImageToFile on a separate thread.

Meanwhile, in parallel, you get the next image. Now you have a scenario where ReadImageFromCamera might write data into the array at the same time as WriteImageToFile is storing the data on disk. There you have your corrupted image. This can be avoided by passing a new array instance to WriteImageToFile:
ReadImageFromCamera(ref image1);
ImageData data;
data.fileName = imageNumber;
data.image = (short[])image1.Clone(); // create a new array instance, so that
                                      // the next call to ReadImageFromCamera
                                      // will not corrupt the data
ThreadPool.QueueUserWorkItem(WriteImageToFile, data);
Still, as has been mentioned by Al Kepp, since you have only one hard drive, launching many threads might not be your best option here. You could look into having one long-running separate thread for storing data on disk, and putting the images into some sort of queue that the storage thread picks up data from and writes to disk. This comes with its own set of problems dealing with concurrency, limiting the size of the queue and what not.
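For illustration only, a minimal sketch of that single-writer layout, assuming .NET 4's BlockingCollection is available (saveQueue and the capacity of 100 are made up for the example, WriteImageToFile stands in for your existing save code, and shutdown/error handling is left out):

using System.Collections.Concurrent;
using System.Threading;

// One bounded queue shared by the camera loop (producer) and the disk writer (consumer).
BlockingCollection<ImageData> saveQueue = new BlockingCollection<ImageData>(100);

// A single long-running thread drains the queue and writes the files sequentially.
Thread writerThread = new Thread(() =>
{
    foreach (ImageData item in saveQueue.GetConsumingEnumerable())
    {
        WriteImageToFile(item);                // the existing serialization code
    }
});
writerThread.IsBackground = true;
writerThread.Start();

// Inside the camera loop: enqueue a copy of the buffer, never the buffer itself.
ReadImageFromCamera(ref image1);
ImageData data = new ImageData();
data.fileName = imageNumber;
data.image = (short[])image1.Clone();
saveQueue.Add(data);                           // blocks if the writer falls 100 images behind

// When streaming stops, let the writer finish whatever is still queued.
saveQueue.CompleteAdding();
writerThread.Join();

The bounded capacity means Add blocks the camera loop instead of queueing images without limit if the disk cannot keep up, which also caps memory use.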
You need to create a distinct buffer for the thread to read data from, otherwise the main thread will overwrite it when you dump it to a file. The way you are doing it copies only references (image1 in particular).

So here:

ThreadPool.QueueUserWorkItem(WriteImageToFile, data);

instead of data, you'll send in a deep copy of data. Since it seems you are already making that copy - but in the worker thread - you just need to move the copy so it happens before sending.
HTH
Before thinking about threads, you have to check whether the speed of a normal disk is sufficient for your task, as you may be creating images faster than they can be written to disk. If image creation is faster than writing, I would look at using a memory disk, but then you need to work out whether its size is sufficient to hold everything until you stop the camera, so that you can write the data out to the normal disk overnight.

If you use .NET 4.0, I would suggest using a ConcurrentQueue together with a normal thread (as the thread will run until the program finishes).
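As a rough sketch of that idea (the saveQueue, stopWriter and SaveLoop names are invented for the example; WriteImageToFile is the existing save method):

using System.Collections.Concurrent;
using System.Threading;

// Shared between the camera thread and one long-running writer thread.
private ConcurrentQueue<ImageData> saveQueue = new ConcurrentQueue<ImageData>();
private volatile bool stopWriter = false;

// Body of the writer thread: dequeue and save until told to stop and the queue is drained.
private void SaveLoop()
{
    ImageData data;
    while (!stopWriter || !saveQueue.IsEmpty)
    {
        if (saveQueue.TryDequeue(out data))
            WriteImageToFile(data);   // the existing save method
        else
            Thread.Sleep(5);          // nothing queued yet; avoid busy-waiting
    }
}

// In the camera loop, enqueue a copy of the image buffer:
//     saveQueue.Enqueue(new ImageData { fileName = imageNumber, image = (short[])image1.Clone() });
// On shutdown, set stopWriter = true and Join() the writer thread so pending images get written.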
A quick and dirty way is to start a single new thread and work with global class members - the new thread can access them while the main thread updates them.
First of all, have these lines outside of any function:
private List<ImageData> arrGlobalData = new List<ImageData>();
private readonly object dataLock = new object();   // guards arrGlobalData, since List<T> is not thread-safe
private volatile bool keepWritingImages = true;    // volatile so the writer thread sees the update
Now change the code in the "main" thread to this:
short[] image1 = new short[20000];
ThreadPool.QueueUserWorkItem(WriteImageToFile, null);
while (streaming)
{
    ReadImageFromCamera(ref image1);
    ImageData data = new ImageData();
    data.fileName = imageNumber;
    data.image = (short[])image1.Clone();   // copy the buffer so the next camera read cannot overwrite it
    lock (dataLock)
    {
        arrGlobalData.Add(data);
    }
}
keepWritingImages = false;
And finally, have a function like this for the new thread:
private void WriteImageToFile(object imageData)
{
    while (keepWritingImages)
    {
        if (arrGlobalData.Count > 0)
        {
            ImageData data;
            lock (dataLock)                 // List<T> is not thread-safe, so guard every access
            {
                data = arrGlobalData[0];
            }
            try
            {
                System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
                string fName = myDirectory + @"/" + Convert.ToString(data.fileName) + @".spe";
                using (Stream myStream = new FileStream(fName, FileMode.Create))
                {
                    bf.Serialize(myStream, data.image);
                }
            }
            catch
            {
            }
            finally
            {
                lock (dataLock)
                {
                    arrGlobalData.Remove(data);
                }
            }
        }
        Thread.Sleep(10);
    }
}
You can do the following.
public class AsyncFileWriter
{
    private readonly FileStream fs;
    private readonly AsyncCallback callback;
    public Action FinishedCallback;
    private IAsyncResult result;

    private class AsyncState
    {
        public FileStream Fs;
    }

    // Completion callback: finish the asynchronous write and notify the caller.
    private void WriteCore(IAsyncResult ar)
    {
        if (ar != null)
        {
            FileStream stream = ((AsyncState)ar.AsyncState).Fs;
            stream.EndWrite(ar);    // complete the write that was started by BeginWrite
            if (this.FinishedCallback != null)
            {
                FinishedCallback();
            }
        }
    }

    public AsyncFileWriter(FileStream fs, Action finishNotification)
    {
        this.fs = fs;
        callback = new AsyncCallback(WriteCore);
        this.FinishedCallback = finishNotification;
    }

    public AsyncFileWriter(FileStream fs)
        : this(fs, null)
    {
    }

    // Begin an asynchronous write; WriteCore runs on a pool thread when it completes.
    public void Write(Byte[] data)
    {
        result = fs.BeginWrite(data, 0, data.Length, callback, new AsyncState() { Fs = fs });
    }
}
Later you can consume it like this:
static void Main(string[] args)
{
    FileStream fs = File.Create("D:\\ror.txt");
    ManualResetEvent evt = new ManualResetEvent(false);
    AsyncFileWriter writer = new AsyncFileWriter(fs, () =>
    {
        Console.Write("Write Finished");
        evt.Set();
    });
    byte[] bytes = File.ReadAllBytes("D:\\test.xml"); // Getting some random bytes
    writer.Write(bytes);
    evt.WaitOne();
    Console.Write("Write Done");
}