Optimize large CSV file writer - large table DataReader using TPL / PLINQ

devze.com https://www.devze.com 2023-03-09 23:00 Source: web
Any tips on how I can optimize below further using TPL and/or PLINQ.

Below code runs on a background worker

Read a large table using a SqlDataReader, open a StreamWriter to write a large CSV file:

    using (var reader = command.ExecuteReader())
    using (var writer = new StreamWriter(csvPath))
    {
        while (reader.Read())
        {
            // massage the data, parse values from columns, etc.
            string line = BuildCsvLine(reader);
            // write the CSV line to the file
            writer.WriteLine(line);
        }
    } // the using blocks close the reader and the file

Thank you.


You might find better performance by accumulating the CSV line data in a StringBuilder (i.e., in memory) and then writing the contents out to your CSV file in larger chunks. I would suggest trying both methods and comparing them with a memory profiler such as ANTS or the JetBrains product (dotMemory).
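A minimal sketch of that idea, assuming a hypothetical `BuildCsvLine` helper that formats the current row, and `command`/`csvPath` set up as in the question:

    using System.Data.SqlClient;
    using System.IO;
    using System.Text;

    // Batch CSV lines in a StringBuilder and flush to disk in chunks,
    // so the writer makes fewer, larger I/O calls.
    var sb = new StringBuilder();
    using (var reader = command.ExecuteReader())
    using (var writer = new StreamWriter(csvPath))
    {
        while (reader.Read())
        {
            sb.AppendLine(BuildCsvLine(reader)); // hypothetical row-formatting helper
            if (sb.Length > 1 << 20)             // flush roughly every 1 MB of text
            {
                writer.Write(sb.ToString());
                sb.Clear();
            }
        }
        writer.Write(sb.ToString()); // write whatever remains
    }

Note that StreamWriter already buffers internally, so measure before assuming this helps; passing a larger buffer size to the StreamWriter constructor may achieve a similar effect with less code.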


define "optimize further"... Do you want more speed or less memory use?

Assuming the above pseudo code is implemented correctly, memory use should already be pretty minimal.

Speed? Given that you are working with a large data set, the data reader is likely your biggest source of slowness. So if you really wanted to use parallel processing, you'd have to partition your data set (presumably by opening multiple readers?).
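A hedged sketch of that partitioning idea using the TPL, assuming the table has a numeric Id column suitable for range partitioning; the table name, ranges, and `BuildCsvLine` helper are all illustrative:

    using System.Data.SqlClient;
    using System.IO;
    using System.Threading.Tasks;

    // Each task opens its own connection and reader over one Id range and
    // writes its own part file; the part files can be concatenated afterwards.
    var ranges = new[] { (0, 1_000_000), (1_000_000, 2_000_000) }; // illustrative
    Parallel.ForEach(ranges, range =>
    {
        using var conn = new SqlConnection(connectionString);
        conn.Open();
        using var cmd = new SqlCommand(
            "SELECT * FROM BigTable WHERE Id >= @lo AND Id < @hi", conn);
        cmd.Parameters.AddWithValue("@lo", range.Item1);
        cmd.Parameters.AddWithValue("@hi", range.Item2);
        using var reader = cmd.ExecuteReader();
        using var writer = new StreamWriter($"part_{range.Item1}.csv");
        while (reader.Read())
            writer.WriteLine(BuildCsvLine(reader)); // hypothetical helper
    });

Whether this actually wins depends on the database server and the disk; a single sequential scan with one writer is often already I/O-bound, so profile before committing to the extra complexity.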

But then again, you are already running it in a background worker, so does it really matter?


Comments

No comments yet...