Writing direct to disk with php

I would like to create an upload script that doesn't fall under the PHP upload limit. There might be an occasion where I need to upload a 2GB or larger file, and I don't want to have to raise the whole server's limit above 32MB.

Is there a way to write direct to disk from php?

What method would you propose to accomplish this? I have read around Stack Overflow but haven't quite found what I'm looking for.


The simple answer is that you can't, due to the way Apache handles POST data.

If you're adamant about having larger file uploads and still want to use PHP for the backend, you could write a simple file-upload receiver using the PHP sockets API and run it as a standalone service. Some good details can be found at http://devzone.zend.com/article/1086#Heading8
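
For illustration, here is a minimal sketch of such a standalone receiver using PHP's stream sockets; the port, chunk size, and target path are arbitrary choices for the example, not anything prescribed by the linked article:

    <?php
    // upload_daemon.php - run from the CLI as a standalone service:
    //   php upload_daemon.php
    set_time_limit(0); // a daemon should never hit the script timeout

    $server = stream_socket_server('tcp://0.0.0.0:9000', $errno, $errstr);
    if ($server === false) {
        fwrite(STDERR, "failed to listen: $errstr ($errno)\n");
        exit(1);
    }

    while ($conn = stream_socket_accept($server, -1)) {
        // Stream the incoming bytes straight to disk in small chunks, so
        // memory use stays constant regardless of the upload size.
        $target = fopen('/tmp/upload_' . uniqid() . '.bin', 'wb');
        while (!feof($conn)) {
            $chunk = fread($conn, 8192);
            if ($chunk === false || $chunk === '') {
                break;
            }
            fwrite($target, $chunk);
        }
        fclose($target);
        fclose($conn);
    }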


Though this is an old post, it turns up easily via Google when looking for a way to handle big file uploads with PHP. I'm still not sure whether uploads that exceed the memory limit are possible, but I think there is a good chance they are. While looking for a solution to this problem, I found contradictory sources. The PHP manual states:

post_max_size: Sets max size of post data allowed. This setting also affects file upload. To upload large files, this value must be larger than upload_max_filesize. If memory limit is enabled by your configure script, memory_limit also affects file uploading. Generally speaking, memory_limit should be larger than post_max_size. (http://php.net/manual/en/ini.core.php)

...which implies that your memory limit should be larger than the file you want to upload. However, another user (ragtime at alice-dsl dot com) at php.net states:

I don't believe the myth that 'memory_size' should be the size of the uploaded file. The files are definitely not kept in memory... instead uploaded chunks of 1MB each are stored under /var/tmp and later on rebuild under /tmp before moving to the web/user space.

I'm running a linux-box with only 64MB RAM, setting the memory_limit to 16MB and uploading files of sizes about 100MB is no problem at all! (http://php.net/manual/en/features.file-upload.php)

He reports some other related problems with the garbage collector but also explains how they can be solved. If that is true, the uploaded file may well exceed the memory limit. (Note, however, that processing the uploaded file afterwards is another matter - for that you might have to load it into memory.)
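
For reference, these are the php.ini directives the two quotes disagree about; the values below are only an illustration of how they relate to each other, not a recommendation:

    ; php.ini (example values only)
    upload_max_filesize = 2048M   ; per-file upload limit
    post_max_size       = 2200M   ; must be larger than upload_max_filesize
    memory_limit        = 128M    ; per the user comment, this may stay small,
                                  ; since uploads are buffered to temp files on disk
    max_input_time      = 3600    ; time allowed for receiving/parsing the request
    upload_tmp_dir      = /var/tmp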

I'm writing this before having tried handling large file uploads with PHP myself, since I'm still evaluating whether to use PHP or Python for this task.


You can do some interesting things based around PHP's sockets. Have you considered writing an applet in Java to upload the file to a listening PHP daemon? This probably won't work on most professional hosting providers, but if you're running your own server, you could make it work. Consider the following sequence:

  1. Applet starts up, sends a request to PHP to open a listening socket
    1. (You'll probably have to write a basic web browser in Java to make this work)
  2. Java Applet reads the file from the file system and uploads it to PHP through the socket that was created in step 1.

Not the cleanest way to do it, but if you disable the PHP script timeout in your php.ini file, then you could make something work.
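
A rough sketch of what the PHP side of step 1 might look like follows; the script name, the ephemeral-port trick, and the plain-text response are assumptions for illustration, since the answer doesn't spell out a protocol:

    <?php
    // open_upload_socket.php - the applet requests this script, reads the port
    // number from the response, then connects to that port to send the file.
    set_time_limit(0);       // disable the script timeout, as suggested above
    ignore_user_abort(true); // keep listening even if the page is closed

    // Bind to port 0 so the OS picks a free port.
    $server = stream_socket_server('tcp://0.0.0.0:0', $errno, $errstr);
    if ($server === false) {
        http_response_code(500);
        exit("could not listen: $errstr");
    }

    // Tell the applet which port was chosen; flush so the number arrives
    // before the script blocks waiting for the connection.
    $local = stream_socket_get_name($server, false); // e.g. "0.0.0.0:49152"
    echo substr(strrchr($local, ':'), 1);
    while (ob_get_level() > 0) { ob_end_flush(); }
    flush();

    // Step 2: accept the applet's connection and stream the file to disk,
    // just like the standalone receiver sketch further up.
    if ($conn = stream_socket_accept($server, -1)) {
        $target = fopen('/tmp/applet_upload_' . uniqid() . '.bin', 'wb');
        stream_copy_to_stream($conn, $target);
        fclose($target);
        fclose($conn);
    }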


It isn't possible to upload a file larger than PHP's allowed limits with PHP itself; it's that simple.

Possible workarounds include using a client-side technology - Java, for instance; I'm not sure whether Flash or JavaScript can do this - to "split" the original file into smaller chunks.
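
As an illustration of the server side of such a scheme, the sketch below appends each small chunk to a file on disk, so no single request ever exceeds the POST limits; the field names and target directory are assumptions, since no particular protocol is specified:

    <?php
    // receive_chunk.php - the client posts the file piece by piece; each piece
    // is a normal small upload that stays well under post_max_size.
    if (!isset($_FILES['chunk']) || $_FILES['chunk']['error'] !== UPLOAD_ERR_OK) {
        http_response_code(400);
        exit('missing or failed chunk');
    }

    $name  = basename($_POST['name'] ?? 'upload.bin'); // crude name sanitising
    $index = (int) ($_POST['index'] ?? 0);
    $dest  = '/var/uploads/' . $name;

    // Append this chunk to the growing file (truncate on the first chunk).
    $out = fopen($dest, $index === 0 ? 'wb' : 'ab');
    $in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    echo 'ok';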
