So, for performance reasons, I need my app to store big arrays of data in a way that's fast to parse. I know JSON is readable, but it's slow to decode, so it seems I should either convert my array into pure PHP code or serialize it and deserialize it later. So, which is faster? Are there any better solutions? I could run a benchmark myself, but it's always better to consider other people's experiences first :)
More info: By big array I mean something with about 2MB worth of data returned from calling print_r() on it!
and by converting it into pure php code I mean this:
suppose this is my array: {"index1":"value1","index2":"val'ue2"}
and this is what the hypothetical function convert_array_to_php() would return:
$array = array('index1' => 'value1', 'index2' => 'val\'ue2');
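For what it's worth, the hypothetical convert_array_to_php() above is essentially what PHP's built-in var_export() already does: it returns valid PHP source code for a value. A minimal sketch (the cache file path is just an example):

```php
<?php
$array = array('index1' => 'value1', 'index2' => "val'ue2");

// var_export($value, true) returns PHP source code for the value
// instead of printing it; wrap it in "return ...;" to make it includable.
$code = '<?php return ' . var_export($array, true) . ';';
file_put_contents('/tmp/data_cache.php', $code);

// Later, load it back with a plain include.
$restored = include '/tmp/data_cache.php';
var_dump($restored === $array); // bool(true)
```

An added bonus of this approach is that an opcode cache (e.g. OPcache) can cache the compiled file, so subsequent loads may skip parsing entirely.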
Depends on the data and usage patterns.
Generally unserialize() is faster than json_decode(), which in turn is faster than include(). With large amounts of data, however, the bottleneck is usually the disk, so unserialize(gzdecode(file_get_contents(...)))
is often the fastest: the difference in decoding speed can be negligible compared to the cost of reading the data from disk.
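A minimal sketch of that serialize-then-compress pattern, assuming a throwaway file path and some dummy data:

```php
<?php
// Dummy payload standing in for the "big array" (assumption for the demo).
$data = array_fill(0, 1000, str_repeat('x', 100));

// Write: serialize in memory, gzip-compress, dump to disk in one call.
file_put_contents('/tmp/cache.ser.gz', gzencode(serialize($data)));

// Read: one (small) disk read, then decompress and unserialize in memory.
$restored = unserialize(gzdecode(file_get_contents('/tmp/cache.ser.gz')));
var_dump($restored === $data); // bool(true)
```

Compression trades a bit of CPU for far fewer bytes read from disk, which is exactly the trade-off that pays off when I/O is the bottleneck.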
If you don't really need to read out the complete data set for printing or calculation, then the fastest storage might be SQLite however. It often keeps indexes in memory.
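To illustrate the SQLite point: the win is that you can query just the rows you need instead of decoding the whole array. A hedged sketch using PDO (requires the pdo_sqlite extension; the table and column names are made up for the example):

```php
<?php
// Open (or create) an SQLite database file; path is an example.
$db = new PDO('sqlite:/tmp/bigdata.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$db->exec('CREATE TABLE IF NOT EXISTS items (k TEXT PRIMARY KEY, v TEXT)');

// Insert the array as key/value rows, inside one transaction for speed.
$insert = $db->prepare('INSERT OR REPLACE INTO items (k, v) VALUES (?, ?)');
$db->beginTransaction();
foreach (array('index1' => 'value1', 'index2' => "val'ue2") as $k => $v) {
    $insert->execute(array($k, $v));
}
$db->commit();

// Fetch a single value: only that row is read, via the primary-key index.
$stmt = $db->prepare('SELECT v FROM items WHERE k = ?');
$stmt->execute(array('index2'));
var_dump($stmt->fetchColumn()); // string(7) "val'ue2"
```
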
Well, I did a little benchmark: I put about 7MB of pure PHP-coded array into a PHP file, and put its JSON version and a serialized version into two other files, then benchmarked all three. As expected, the JSON format was the slowest to decode, taking about 3 times longer to parse than the pure PHP code. Interestingly, unserialize() was the fastest, performing around 4 times faster than the native PHP code.
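For anyone who wants to reproduce this, here is a rough sketch of such a benchmark on a smaller, synthetic array (absolute timings will vary by machine and PHP version, so no numbers are claimed):

```php
<?php
// Build a synthetic associative array (stand-in for the real data).
$data = array();
for ($i = 0; $i < 10000; $i++) {
    $data['key' . $i] = 'value' . $i;
}

$json = json_encode($data);
$ser  = serialize($data);

// Time repeated json_decode() calls.
$t = microtime(true);
for ($i = 0; $i < 50; $i++) { json_decode($json, true); }
$jsonTime = microtime(true) - $t;

// Time repeated unserialize() calls on the same data.
$t = microtime(true);
for ($i = 0; $i < 50; $i++) { unserialize($ser); }
$serTime = microtime(true) - $t;

printf("json_decode: %.4fs  unserialize: %.4fs\n", $jsonTime, $serTime);
```

Note this only measures in-memory decoding; a fair comparison with include() would need the data written to files, with the opcode cache and OS file cache taken into account.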
Pure PHP code will probably be the fastest to load. However, it's unlikely to be the best option, because it is harder to maintain. It depends on the nature of the data, though.
Isn't there a better option available than relying solely on PHP for this?
I am guessing that handling a few arrays of this size is going to hit your server quite hard.
Is it possible for you to maybe utilize a database with some temporary tables to do what you need to do with the data in the arrays?