Certain Files getting corrupted by SQL Server FileStream

I am saving files to a SQL Server 2008 (Express) database using FILESTREAM. The trouble I'm having is that certain files seem to be getting corrupted in the process.

For example, if I save a Word or Excel document in one of the newer formats (docx or xlsx), then when I try to open the file I get an error message saying that the data is corrupted and asking whether I would like Word/Excel to try to recover it. If I click yes, Office is able to 'recover' the data and opens the file in compatibility mode.

However, if I zip the file first, then after extracting the contents I'm able to open the file without a problem. Strangely, if I save an mp3 file to the database I have the reverse issue: I can open the file no problem, but if I save a zipped version of the mp3 I can't even extract the contents of that zip. When I tried to save a pdf or PowerPoint file I ran into similar problems (the pdf I could only read if I zipped it first, and the ppt I couldn't read at all).

Update: here's the code I'm using to write to the database and to read it back.

To write to the database:

SQL = "SELECT Attachment.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() FROM Activity " +
               "WHERE RowID = CAST(@RowID as uniqueidentifier)";
           transaction = connection.BeginTransaction();

           command.Transaction = transaction;
           command.CommandText = SQL;
           command.Parameters.Clear();
           command.Parameters.Add(rowIDParam);

           SqlDataReader readerFS = null;
           readerFS= command.ExecuteReader();

   string path = (string)readerFS[0].ToString();
   byte[] context = (byte[])readerFS[1];
   int length = context.Length;

   SqlFileStream targetStream = new SqlFileStream(path, context, FileAccess.Write);

         int blockSize = 1024 * 512; //half a megabyte
            byte[] buffer = new byte[blockSize];
            int bytesRead = sourceStream.Read(buffer, 0, buffer.Length);
            while (byt开发者_运维问答esRead > 0)
            {
                targetStream.Write(buffer, 0, bytesRead);
                bytesRead = sourceStream.Read(buffer, 0, buffer.Length);
            }

            targetStream.Close();
            sourceStream.Close();
            readerFS.Close();
            transaction.Commit();

And to read:

    SqlConnection connection = null;
    SqlTransaction transaction = null;

    try
    {
        connection = getConnection();
        connection.Open();
        transaction = connection.BeginTransaction();

        SQL = "SELECT Attachment.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() FROM Activity"
            + " WHERE ActivityID = @ActivityID";

        SqlCommand command = new SqlCommand(SQL, connection);
        command.Transaction = transaction;

        command.Parameters.Add(new SqlParameter("ActivityID", activity.ActivityID));

        SqlDataReader reader = command.ExecuteReader();

        string path = (string)reader[0];
        byte[] context = (byte[])reader[1];
        int length = context.Length;
        reader.Close();

        SqlFileStream sourceStream = new SqlFileStream(path, context, FileAccess.Read);

        int blockSize = 1024 * 512; //half a megabyte
        byte[] buffer = new byte[blockSize];
        List<byte> attachmentBytes = new List<byte>();

        int bytesRead = sourceStream.Read(buffer, 0, buffer.Length);

        while (bytesRead > 0)
        {
            bytesRead = sourceStream.Read(buffer, 0, buffer.Length);
            foreach (byte b in buffer)
            {
                attachmentBytes.Add(b);
            }
        }

        FileStream outputStream = File.Create(outputPath);

        foreach (byte b in attachmentBytes)
        {
            byte[] barr = new byte[1];
            barr[0] = b;
            outputStream.Write(barr, 0, 1);
        }

        outputStream.Close();
        sourceStream.Close();
        command.Transaction.Commit();


Your read code is incorrect:

    while (bytesRead > 0)
    {
        bytesRead = sourceStream.Read(buffer, 0, buffer.Length);
        foreach (byte b in buffer)
        {
            attachmentBytes.Add(b);
        }
    }

If bytesRead is less than buffer.Length, you still add the entire buffer to attachmentBytes. Thus you always corrupt the document you return by appending whatever garbage happens to sit at the end of the last buffer, past bytesRead.
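A minimal correction of that loop, keeping the List&lt;byte&gt; for the moment and reusing the question's variable names, is to copy only the first bytesRead bytes of each chunk, and to copy them before the next Read overwrites the buffer:

    int bytesRead = sourceStream.Read(buffer, 0, buffer.Length);
    while (bytesRead > 0)
    {
        // Only the first bytesRead bytes of the buffer are valid for this chunk.
        for (int i = 0; i < bytesRead; i++)
        {
            attachmentBytes.Add(buffer[i]);
        }
        bytesRead = sourceStream.Read(buffer, 0, buffer.Length);
    }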

Other than that, allow me a really WTF moment. Reading a stream into a List&lt;byte&gt;?? C'mon! First, I don't see why you need an intermediate in-memory copy at all: you can simply read buffer by buffer and write each buffer straight into the outputStream. Second, if you must buffer in memory, use a MemoryStream, not a List&lt;byte&gt;.
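A sketch of that simpler read path, reusing the path, context and outputPath variables from the question's code (and assuming the same transaction handling around it):

    // Copy straight from the SqlFileStream to the output file, no List<byte>.
    using (SqlFileStream sourceStream = new SqlFileStream(path, context, FileAccess.Read))
    using (FileStream outputStream = File.Create(outputPath))
    {
        byte[] buffer = new byte[1024 * 512]; // half a megabyte
        int bytesRead;
        while ((bytesRead = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Write only the bytes actually read in this pass.
            outputStream.Write(buffer, 0, bytesRead);
        }
    }

On .NET 4 and later, sourceStream.CopyTo(outputStream) does the same job as the manual loop.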


I had exactly this problem a few months back and figured out that I was adding an extra byte at the end of the file when reading it from FILESTREAM.
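One quick way to catch that kind of off-by-one (a rough sketch, reusing the variable names from the read code above) is to compare the length the FILESTREAM handle reports with the size of the file you actually wrote:

    long expectedLength = sourceStream.Length;             // size SQL Server reports for the BLOB
    // ... copy loop and Close() calls as above ...
    long writtenLength = new FileInfo(outputPath).Length;  // size of the file on disk
    if (writtenLength != expectedLength)
    {
        Console.WriteLine("Size mismatch: wrote " + writtenLength +
                          " bytes, FILESTREAM reports " + expectedLength + ".");
    }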
