I open a 10MB+ XML file several times in my script in different functions:
$dom = DOMDocument::load( $file ) or die('couldnt open');
1) Is the above the old style of loading a document?
I am using PHP 5. Opening it statically?
2) Do I need to close the loading of the XML file, if possible?
I suspect it's causing memory problems because I loop through the nodes of the XML file several thousand times and sometimes my script just ends abruptly.
Thanks all for any help
With a DOM parser, the whole XML document is loaded into memory -- which can lead to problems when working with a big document (I know, you probably don't have much of a choice).
First of all, I would try not to open the same document more than once (see the sketch after this list):
- it means more work for PHP: it has to parse a big document several times and, each time, build the DOM tree in memory;
- it might require more memory -- in theory, when leaving the function in which you instantiated the DOMDocument object, its destructor should be called and the memory released, but, who knows...
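For example, here is a minimal sketch of loading the file once and passing the DOMDocument instance to the functions that need it; the function names and tag names are only placeholders:

$dom = new DOMDocument();
if (!$dom->load($file)) {
    die("couldn't open");
}

// pass the already-parsed document around instead of re-loading it
countItems($dom);
printTitles($dom);

function countItems(DOMDocument $dom) {
    // 'item' is a hypothetical tag name, just for illustration
    echo $dom->getElementsByTagName('item')->length, "\n";
}

function printTitles(DOMDocument $dom) {
    foreach ($dom->getElementsByTagName('title') as $node) {
        echo $node->nodeValue, "\n";
    }
}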
About the "Is the above the old style of loading a document": well, looking at the documentation for DOMDocument::load, it seems it can be called both dynamically (see the example) and statically (see the return value section); so, I suppose both solutions are OK, and there is neither an "old way" nor a "new way".
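To illustrate the two calling styles (a sketch only; the file name is a placeholder, and, if I remember correctly, PHP 5 may raise an E_STRICT notice for the static call):

// dynamic call, as in the documentation's example
$dom = new DOMDocument();
$dom->load('big-file.xml');

// static call, as in the question -- returns a DOMDocument or false
$dom2 = DOMDocument::load('big-file.xml');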
What do you mean by "my script just ends abruptly"? Do you have a Fatal Error about memory_limit?
If yes, and if you can change that kind of configuration setting, it might help to set memory_limit to a higher value.
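For instance, if your host allows it, something like this at the top of the script should work (the value is only an example):

// raise the limit for this script only -- pick a value that fits your server
ini_set('memory_limit', '256M');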
A lot has been written about memory leaks in DOMDocument. When I was parsing huge XML files (90MB), I read the file line by line and parsed it with a regular expression (see the rough sketch below). It's ugly, I know, but it worked with quite low memory usage.
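Something along these lines, assuming the elements you care about each sit on a single line (the file name, pattern, and tag name are placeholders; a real parser is safer when you can afford the memory):

$handle = fopen('huge-file.xml', 'r');
if ($handle === false) {
    die("couldn't open");
}
while (($line = fgets($handle)) !== false) {
    // grab the text of any <title> element found on this line
    if (preg_match('~<title>(.*?)</title>~', $line, $matches)) {
        echo $matches[1], "\n";
    }
}
fclose($handle);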