Let me first describe what I've built:
I have to import a large amount of data from different XML files into my database, and because it takes a long time I added a progress bar. I did it like this: I split the whole import into many tiny AJAX requests and import a little data at a time (when an AJAX request completes, the progress bar advances a bit). The idea works well, but the data just keeps getting bigger and I can't optimize the code any further (it's as optimized as it gets).
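To give an idea of the current setup, each AJAX request hits an endpoint roughly like this sketch (the file name and import_xml_chunk() are placeholders, not my real code):

```php
<?php
// Sketch of one tiny import step; the client calls this repeatedly with an
// increasing offset and advances the progress bar after each response.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$limit  = isset($_GET['limit'])  ? (int) $_GET['limit']  : 50;

$imported = import_xml_chunk($offset, $limit);   // placeholder for the real import logic

header('Content-Type: application/json');
echo json_encode([
    'imported' => $imported,
    'done'     => $imported < $limit,            // last chunk returned fewer rows
]);
```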
The problem is that every time I make an AJAX call I lose a lot of time on things specific to the framework (model initialization and so on) and on the browser handling the request. So I was wondering whether I could use PHP's flush function instead.
But I've been reading that the flush function doesn't work well on all browsers (which is strange, since it's a server-side function). If I used flush, I would just write <script>increase_progressbar()</script> (or whatever I need) as the import progresses.
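For illustration, here is a rough sketch of what I mean, assuming the import runs in a single long request (loaded e.g. in a hidden iframe) and increase_progressbar() already exists on the parent page; import_next_chunk() is a placeholder for my actual import code:

```php
<?php
// Rough sketch of the flush() idea; import_next_chunk() and
// increase_progressbar() are placeholders, not real functions here.
// Output buffering and gzip compression can swallow flush(), so disable them.
@ini_set('zlib.output_compression', 'off');
while (ob_get_level() > 0) {
    ob_end_flush();
}

$steps = 100; // however many small pieces the import is split into
for ($i = 1; $i <= $steps; $i++) {
    import_next_chunk($i);                      // do one small piece of work
    $percent = (int) round($i / $steps * 100);
    echo "<script>parent.increase_progressbar($percent);</script>\n";
    echo str_repeat(' ', 1024);                 // padding: some browsers buffer small chunks
    flush();                                    // push the output to the browser now
}
```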
So, any opinions on the flush function? I've been testing it on little scripts, but I want to know if someone has really used it with big scripts. I'm also open to any other suggestions for doing what I want to do :)
I won't give you direct advice, but I will tell you how I did it in one of my projects. In my case I needed to upload Excel files and then parse them. The data exceeded 3000 rows, and I had to check every column of each row for certain data. When I parsed the file directly after the upload, the parser often crashed somewhere, and it really wasn't reliable.
So how did I do it? I split the upload process into two parts:
Upload the file physically (a regular upload field and submit). When the button is clicked, some CSS and JS "magic" hides the form and a nice loading bar appears on the screen. When the upload is done, the page refreshes and the form appears again for the next file.
Start parsing the data in the background using php-cli, as @Dragon suggested, with exec() (a rough sketch follows).
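Roughly, the upload handler ended with something like this sketch (the paths, parse.php, and the helper function are simplified placeholders, not my actual code):

```php
<?php
// After the upload has been saved, start the CLI parser detached from the
// web request; redirecting output and appending & keeps exec() from blocking.
$fileId = save_uploaded_file_record($_FILES['file']); // placeholder helper

$cmd = sprintf('php /path/to/parse.php %d > /dev/null 2>&1 &', $fileId);
exec($cmd);

// The HTTP response returns immediately; parsing continues in the background.
```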
In the database I had a table that stored information about the files, including a boolean field called "parsed". When the parser finished its job, its last task was to update that field to true.
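The end of the CLI script looked roughly like this sketch (PDO usage and the table/column names here are just an example, not the exact code):

```php
<?php
// parse.php (run via php-cli): parse the file, then mark it as parsed.
$fileId = (int) $argv[1];

parse_excel_file($fileId); // the long-running parsing work (placeholder)

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('UPDATE files SET parsed = 1 WHERE id = :id');
$stmt->execute([':id' => $fileId]);
```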
So here is the whole process from the user's point of view:
- Select a file and upload it.
- Wait until the file has been uploaded to the server. Until then, a message and a loading bar indicate that something is working; the upload form is hidden with CSS and JS, preventing the user from uploading another file.
- When the upload is over, the page refreshes (because I did a normal POST submit) and the form appears on the screen again, together with a list of recently uploaded files (I stored this list in the session).
- Each entry in that list had an indicator (an icon). At first it's a spinner (an AJAX spinning wheel).
- On a regular basis (every 30 seconds or 1 minute) I checked the file table through an AJAX call and read the parsed field. If the background process had finished, the field was set to true and with some JS and CSS I changed the icon to "Done"; otherwise the spinner remained (a sketch of such a polling endpoint follows this list).
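The endpoint behind that periodic AJAX call can be as simple as this sketch (again, the table and column names are simplified placeholders):

```php
<?php
// Returns whether the background parser has finished with the given file;
// the front end swaps the spinner for a "Done" icon when parsed is true.
$fileId = isset($_GET['file_id']) ? (int) $_GET['file_id'] : 0;

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT parsed FROM files WHERE id = :id');
$stmt->execute([':id' => $fileId]);

header('Content-Type: application/json');
echo json_encode(['parsed' => (bool) $stmt->fetchColumn()]);
```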
In my project I didn't have a requirement to show extra details about the imports, but you can always go further with other data.
Hope this helps you with your project.