Most efficient way to maintain a large frequently updated XML feed

开发者 https://www.devze.com 2023-03-01 00:31 Source: web
I have to generate some fairly large XML feeds that get updated very frequently (hundreds of thousands of elements, hundreds of megabytes per feed, tens of feeds); right now I can only re-generate these on a nightly basis, but I'd like to get them closer to real-time.

Right now I'm thinking that I could generate each XML element as a separate text file, so when any element is updated, I can go update just that file, and then concatenate all the files together for the final deliverable XML feed.

So two questions... (1) Does this seem like a good approach? (2) What's the most efficient way to concatenate thousands of text files?
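A minimal sketch of the per-element-file idea described above, in Python. The fragment directory, file naming scheme, and `<feed>`/`<item>` element structure are all hypothetical stand-ins for whatever the real feed uses; the two points the sketch illustrates are (a) an update touches only the one changed fragment, and (b) the final feed is produced by streaming the fragments between a header and footer rather than building the whole document in memory.

```python
import os
from pathlib import Path

# Hypothetical layout: one XML fragment file per feed element.
FRAGMENT_DIR = Path("feed_fragments")
FRAGMENT_DIR.mkdir(exist_ok=True)

def write_fragment(element_id: str, xml: str) -> None:
    """Update only the element that changed. Write to a temp file and
    os.replace() it into place so a concurrent build never sees a torn write."""
    tmp = FRAGMENT_DIR / f"{element_id}.xml.tmp"
    tmp.write_text(xml, encoding="utf-8")
    os.replace(tmp, FRAGMENT_DIR / f"{element_id}.xml")

def build_feed(out_path: str) -> None:
    """Stream-concatenate every fragment between a feed header and footer,
    so memory use stays flat regardless of feed size."""
    with open(out_path, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n<feed>\n')
        for frag in sorted(FRAGMENT_DIR.glob("*.xml")):
            out.write(frag.read_text(encoding="utf-8"))
        out.write("</feed>\n")

# Usage: update two elements independently, then assemble the deliverable.
write_fragment("item-1", "<item><id>1</id><name>alpha</name></item>\n")
write_fragment("item-2", "<item><id>2</id><name>beta</name></item>\n")
build_feed("full_feed.xml")
```

With hundreds of thousands of fragments, directory sharding (e.g. a subdirectory per ID prefix) would likely be needed, since most filesystems degrade with that many entries in one directory.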


It seems like a decent approach to me given the requirements. Are you able to append new items to the top or bottom of the feed as they arrive? If so, maybe you could keep the most recent feed around and then simply append any new items to it as they arrive.
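If append-only updates are possible, one way to sketch that suggestion is to keep the generated feed on disk and splice each new item in just before the closing tag, instead of regenerating anything. The function below is an illustrative sketch (the `</feed>` closing tag and `append_item` name are assumptions, not part of the original answer): it verifies the file ends with the expected closing tag, truncates it, and rewrites it after the new item.

```python
def append_item(feed_path: str, item_xml: str, closing_tag: str = "</feed>\n") -> None:
    """Append a new item before the feed's closing tag by truncating the
    tag off the end of the file and rewriting it after the new item."""
    with open(feed_path, "r+b") as f:
        f.seek(0, 2)                 # seek to end of file
        end = f.tell()
        tail = closing_tag.encode("utf-8")
        f.seek(end - len(tail))
        if f.read(len(tail)) != tail:
            raise ValueError("feed does not end with the expected closing tag")
        f.seek(end - len(tail))
        f.truncate()                 # drop the closing tag
        f.write(item_xml.encode("utf-8") + tail)

# Usage: start from an existing feed and append one new item in place.
with open("live_feed.xml", "w", encoding="utf-8") as f:
    f.write("<feed>\n</feed>\n")
append_item("live_feed.xml", "<item><id>3</id></item>\n")
```

This only rewrites a handful of bytes per update, but it handles inserts only, not edits or deletes of existing items, which is why the fragment-per-element approach may still be needed alongside it.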

How you concatenate the files depends on what tools you are planning to use. Is this being built with batch files or in a programming language?
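For the concatenation itself, whatever the tooling, the efficient pattern is the same: copy each file into the output in large binary chunks rather than reading whole files (or the whole feed) into memory. A sketch in Python using the standard library's `shutil.copyfileobj`, which does exactly that buffered copy:

```python
import shutil
from pathlib import Path

def concat_files(paths, out_path, buffer_size=1024 * 1024):
    """Concatenate many files into one using 1 MiB buffered binary copies,
    so memory use stays constant even for hundreds of megabytes of input."""
    with open(out_path, "wb") as out:
        for p in paths:
            with open(p, "rb") as src:
                shutil.copyfileobj(src, out, buffer_size)

# Usage with two small hypothetical part files.
parts = [Path("part_a.txt"), Path("part_b.txt")]
for path, text in zip(parts, ["hello ", "world"]):
    path.write_text(text, encoding="utf-8")
concat_files(parts, "combined.txt")
```

With thousands of input files, the per-file open/close overhead tends to dominate long before the byte copying does, so keeping fragments on a fast local filesystem matters more than tuning the buffer size.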
