Uploading a huge file to a database sequentially using a pointer

devze.com https://www.devze.com 2023-01-13 15:32 (source: web)
It's a requirement in my app that users can upload files to the database. At this point, the user uploads the file and the webpage saves it in a temp directory; then the logic loads the file into a Byte[] and passes this array as a parameter to the "insert" SQL statement.

The problem with that approach is that if a user tries to upload a 1 GB file, the server has to take 1 GB of memory to store that file as a Byte[], and that memory has to be garbage-collected later on. If several users do that at the same time, the server could collapse.

One way to avoid this is to limit the size of the file... but the customer doesn't want to do that. So the best approach seems to be to upload the file to the database sequentially, using a pointer. I've found an example of this for SQL Server using a special function named UPDATETEXT, but I'd like to know an approach valid for all kinds of databases, i.e. whether it is possible, and how, to upload a file to a database in chunks.

Cheers.


I don't believe such a thing is possible.

A better option may simply be to store the file in a path on the database server and store only the path to that file in the database. This will eliminate the need for loading the file into memory at all. You can either use FTP or simply copy the file to the database server, depending on the network configuration.
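A minimal sketch of that idea in C#. The shared-folder path and the Files(Name, Path) table are assumptions for illustration, not something from the question:

```csharp
// Sketch: stream the upload to a shared folder and store only the path.
// UploadsRoot and the Files(Name, Path) table are hypothetical names.
using System;
using System.Data.SqlClient;
using System.IO;

static class FileStore
{
    const string UploadsRoot = @"\\fileserver\uploads"; // assumed shared folder

    public static void Save(Stream upload, string fileName, SqlConnection conn)
    {
        string path = Path.Combine(UploadsRoot, Guid.NewGuid() + "_" + fileName);

        // Stream.CopyTo copies through a small internal buffer, so a
        // 1 GB upload never exists in memory as a single Byte[].
        using (FileStream target = File.Create(path))
        {
            upload.CopyTo(target);
        }

        // Only the path goes into the database.
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO Files (Name, Path) VALUES (@name, @path)", conn))
        {
            cmd.Parameters.AddWithValue("@name", fileName);
            cmd.Parameters.AddWithValue("@path", path);
            cmd.ExecuteNonQuery();
        }
    }
}
```

One thing to plan for with this design: deleting a row should also delete the file on disk, or orphaned files will accumulate in the shared folder.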


As others have pointed out, I also believe that storing the file on the file system is the better option. Note that you can use a configurable shared folder (or file server), which makes it easy to move the application.

Regardless, if you must write large binary data to the database in chunks, then the mechanism will differ across databases, and you need to check for such support in the ADO.NET data provider for that database. Here's an article that explains how to read and write large data in chunks for SQL Server and Oracle. Since the mechanism changes per database or managed provider, you should abstract your API behind an interface and code against that interface; then create a database-specific implementation for each target and use a factory, IoC container, etc. to choose the implementation.
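A sketch of that abstraction, with a SQL Server implementation based on the varbinary(max) .WRITE clause (the successor to UPDATETEXT; with a NULL offset it appends the chunk to the existing value). The IBlobWriter interface and the Files(Id, Data) table are assumed names; another provider, e.g. Oracle, would get its own implementation of the same interface:

```csharp
// Sketch: chunked BLOB writes behind an interface, with a SQL Server
// implementation. Interface, table, and column names are assumptions.
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

interface IBlobWriter
{
    void Write(Stream source, int fileId);
}

class SqlServerBlobWriter : IBlobWriter
{
    readonly string _connectionString;

    public SqlServerBlobWriter(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void Write(Stream source, int fileId)
    {
        byte[] buffer = new byte[64 * 1024]; // only 64 KB in memory at a time

        using (SqlConnection conn = new SqlConnection(_connectionString))
        {
            conn.Open();

            // Start from an empty varbinary(max) value.
            using (SqlCommand init = new SqlCommand(
                "UPDATE Files SET Data = 0x WHERE Id = @id", conn))
            {
                init.Parameters.AddWithValue("@id", fileId);
                init.ExecuteNonQuery();
            }

            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                // .WRITE with a NULL offset appends the chunk to the column.
                using (SqlCommand append = new SqlCommand(
                    "UPDATE Files SET Data.WRITE(@chunk, NULL, 0) WHERE Id = @id",
                    conn))
                {
                    append.Parameters.AddWithValue("@id", fileId);
                    SqlParameter chunk = append.Parameters.Add(
                        "@chunk", SqlDbType.VarBinary, read);
                    if (read == buffer.Length)
                    {
                        chunk.Value = buffer;
                    }
                    else
                    {
                        byte[] last = new byte[read];
                        Array.Copy(buffer, last, read);
                        chunk.Value = last;
                    }
                    append.ExecuteNonQuery();
                }
            }
        }
    }
}
```

The caller only ever sees IBlobWriter, so a factory (or IoC registration) can hand back the right implementation for the configured database without the upload code changing.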

