How can I avoid reloading an XML document multiple times?

tl;dr: I want to load an XML file once and reuse it over and over again.

I have a bit of JavaScript that makes an ajax request to a PHP page that collects and parses some XML and returns it for display (say there are 4,000 nodes and the PHP paginates the results into chunks of 100; you would have 40 "pages" of data). If someone clicks on one of those other pages (besides the one that initially loads), another request is made, the PHP loads that big XML file, grabs that subset of indexes (like records 200-299), and returns them for display. My question is: is there a way to load that XML file only once and just reuse it over and over?

The process on each ajax request is (sketched in code just after this list):

- load the xml file (simplexml_load_file())

- parse out the bits needed (with xpath)

- use LimitIterator to grab the specific set of indexes I need

- return that set
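Roughly, in code, the current flow looks like this (a sketch only; the file path, the '//record' node name, and the page parameter are placeholders, not the real code):

$page     = max(0, (int) $_GET['page']); // validate user input
$per_page = 100;

// 1. load the xml file (parsed from scratch on every request)
$xml = simplexml_load_file('/path/to/source.xml');

// 2. parse out the bits needed with xpath ('//record' is a placeholder)
$nodes = $xml->xpath('//record');

// 3. use LimitIterator to grab the specific set of indexes
$chunk = new LimitIterator(
    new ArrayIterator($nodes),
    $page * $per_page,  // offset, e.g. 200 for page 2
    $per_page           // number of records to return
);

// 4. return that set
foreach ($chunk as $node) {
    echo $node->asXML();
}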

What I'd like it to be, when someone requests a different page of results, is:

- use LimitIterator on the data I loaded in the previous request (reparse if needed)

- return that set

It seems (it is, right?) that hitting the XML file every time is a huge waste. How would I go about grabbing it and persisting it so that different pagination requests don't have to reload the file every time?


Just have your server do the reading and parsing of the file, and paginate based on the user's input. The data can be cached on the server much faster than the client could download and cache the entire XML document. Use PHP, Perl, ASP, or what have you to paginate the data prior to displaying it to the user.


I believe the closest thing you are going to get is Memcached.

Although I wouldn't worry about it, especially if it is a local file; include-like operations are fairly cheap.
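For example, a minimal sketch with the pecl Memcached extension (the server address, key prefix, and 300-second TTL are assumptions, and get_the_results_chunk() is a hypothetical helper that parses the XML and renders one page, as in the answer below):

// minimal sketch, assuming ext/memcached and a memcached server on localhost
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$key   = 'xml_chunk_' . (int) $start; // $start = first record index of the page
$chunk = $memcached->get($key);

if ($chunk === false) {
    // cache miss: parse the XML once and store the rendered chunk
    $chunk = get_the_results_chunk($start);
    $memcached->set($key, $chunk, 300); // expire after 5 minutes
}

echo $chunk;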


To the question "is hitting the XML file every time a huge waste?" the answer is yes, if you have to parse that big XML file every time. As I understand it, you want to save the chunk the user is interested in so that you don't have to do that every time. How about a very simple file cache? No extension required; it's fast and simple to use and maintain. Something like this:

function echo_results($start)
{
    // IMPORTANT: make sure that $start is a valid number
    $cache_file = '/path/to/cache/' . $start . '.xml';
    $source     = '/path/to/source.xml';
    // serve the cached chunk only if it exists and is newer than the source
    if (file_exists($cache_file)
     && filemtime($cache_file) > filemtime($source))
    {
        readfile($cache_file);
        return;
    }

    // cache miss or stale cache: regenerate the chunk and store it
    $xml = get_the_results_chunk($start);
    file_put_contents($cache_file, $xml);

    echo $xml;
}

As an added bonus, you use the source file's last modification time so that you automatically ignore cached chunks that are older than their source.

You can even save it compressed and serve it as-is if the client supports gzip compression (IOW, 99% of browsers out there) or decompress it on-the-fly otherwise.
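Something like this, assuming ext/zlib (gzdecode() needs PHP 5.4+) and a hypothetical $cache_file . '.gz' naming convention:

// when writing the cache, compress once:
// file_put_contents($gz_file, gzencode($xml));

$gz_file = $cache_file . '.gz';

$accepts_gzip = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false;

if ($accepts_gzip) {
    // client can handle gzip: send the compressed bytes as-is
    header('Content-Encoding: gzip');
    readfile($gz_file);
} else {
    // otherwise decompress on the fly
    echo gzdecode(file_get_contents($gz_file));
}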


Could you load it into $_SESSION data? Or would that blow out memory due to the size of the chunk?
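If it does fit, a sketch might look like this (note that SimpleXMLElement objects cannot be serialized into the session, so you would store plain strings or arrays; '//record', the file path, and the page size are placeholders):

session_start();

if (!isset($_SESSION['records'])) {
    // first request for this user: parse the XML once
    $xml = simplexml_load_file('/path/to/source.xml');
    $records = array();
    foreach ($xml->xpath('//record') as $node) {
        $records[] = $node->asXML(); // store strings, not SimpleXMLElement objects
    }
    $_SESSION['records'] = $records;
}

// later requests just slice the cached array
$page  = max(0, (int) $_GET['page']);
$chunk = array_slice($_SESSION['records'], $page * 100, 100);

Bear in mind the cache here is per user, so every visitor still parses the file once, and PHP serializes the whole array on every request.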

