So I have implemented a Google Map, using JavaScript to plot a polygon. I use getBounds to get the lat/longs, then I query a database (postcodes), which contains the lat/long for each postcode in the UK.
This query looks like:
SELECT * FROM `postcode`.`postcodeGoogle` WHERE (`lat` BETWEEN '$lat1' AND '$lat2')
AND (`long` BETWEEN '$long1' AND '$long2')
Which is fine, returns the right data.
I then have a second database, address_list, which contains every address in the UK (at the moment it only covers the Kettering area, so 12,000 rows). I do a while loop over the first query's results and on each iteration run this query:
SELECT * FROM `digital_hub`.`address_list` WHERE `postcode`='".$row['postcode']."'
Then I add the latlong of the postcode to a variable:
$resultE = $resultE."(".$row['lat'].", ".$row['long'].") || ";
This then gets echoed out at the end of the loops.
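(Those per-row lookups are an N+1 pattern: one address_list query per postcode returned by the bounds query. They can usually be collapsed into a single round trip. A sketch only, reusing the table and column names from the question; adjust the joined columns to your actual schema:)

```sql
-- One query instead of N+1: filter postcodes by bounds and
-- join straight to the addresses that share that postcode.
SELECT p.`lat`, p.`long`, a.*
FROM `postcode`.`postcodeGoogle` AS p
JOIN `digital_hub`.`address_list` AS a
  ON a.`postcode` = p.`postcode`
WHERE p.`lat`  BETWEEN '$lat1'  AND '$lat2'
  AND p.`long` BETWEEN '$long1' AND '$long2';
```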
This page is called via jQuery:
$.get("php/properties.php?bounds="+bounds, function(data){
var theLatLongs = data.split(" || ");
totalcount = 0;
for(x in theLatLongs){
var theLatLong = "";
var latLong = theLatLongs[x].substr(1);
var latLong = latLong.slice(0, -1);
var theLatLong = latLong.split(", ");
var thelat = theLatLong[0];
var thelong = theLatLong[1];
totalcount = totalcount+1;
}
$('#totalcount').html('<h6><span>'+totalcount+'</span> Households found</h6>Filtered by location');
});
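(For reference, the pipe-delimited format being split above can be exercised in isolation. This sketch uses made-up coordinates and a hypothetical helper name; it just replays the same substr/slice/split steps as the callback:)

```javascript
// Parse a "(lat, long) || (lat, long) || " string into objects.
function parseLatLongs(data) {
  // The trailing " || " leaves an empty final element; drop it.
  var pairs = data.split(" || ").filter(function (s) { return s.length > 0; });
  return pairs.map(function (pair) {
    // Strip the surrounding parentheses, then split on ", ".
    var parts = pair.substr(1).slice(0, -1).split(", ");
    return { lat: parseFloat(parts[0]), long: parseFloat(parts[1]) };
  });
}

// Example input (made-up coordinates):
var sample = "(52.39, -0.72) || (52.40, -0.73) || ";
var parsed = parseLatLongs(sample);
console.log(parsed.length); // 2
```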
It all works fine (I think). It's just incredibly slow. I know Facebook have much better resources, but their ad-creation filter is amazingly quick. I will also be implementing further filters. I tried a join, but the time difference didn't seem to be massive. Sometimes it doesn't even return results at all and crashes my MySQL service...
There are 3 sources that can slow the code: MySQL, PHP, JS.
The first thing I would do is run the SQL (the join version) in a tool like Toad, or write a PHP file that outputs the raw result. You can add console.time/console.timeEnd to the JS, and microtime() in the PHP.
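(A minimal sketch of the PHP side of that timing, assuming nothing beyond the standard microtime() function:)

```php
<?php
// Bracket the suspect section with high-resolution timestamps.
$t0 = microtime(true);

// ... run the query / loop you want to measure here ...

$t1 = microtime(true);
// Log the elapsed seconds without polluting the page output.
error_log(sprintf("measured block took %.3f s", $t1 - $t0));
```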
Another "fast and dirty" check is to paste "your_server"/php/properties.php?bounds=YourBounds into the browser and inspect the result. It will give you some indication.
If you are certain it's SQL, try indexing digital_hub.address_list.postcode, postcode.postcodeGoogle.lat, and postcode.postcodeGoogle.long.
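(A minimal sketch of those indexes, assuming MySQL and the table names from the question; the index names are illustrative:)

```sql
-- Speeds up the per-postcode lookup in the inner query.
ALTER TABLE `digital_hub`.`address_list`
  ADD INDEX `idx_postcode` (`postcode`);

-- A composite index; the bounds query can at least use the
-- leading `lat` column for its BETWEEN range scan.
ALTER TABLE `postcode`.`postcodeGoogle`
  ADD INDEX `idx_lat_long` (`lat`, `long`);
```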
Then, in your "raw" PHP script (or your SQL tool), try the query with fewer columns, or even with "SELECT COUNT(*)". If the "SELECT COUNT(*)" version is way faster, that means the returned data itself is slowing the system. This is typical of empty (non-null) date fields, for example.
But the key is simple: time the different parts of the process to isolate the bottleneck.
console.time('load');
$.get("php/properties.php?bounds="+bounds, function(data){
console.timeEnd('load');
console.time('parse');
var theLatLongs = data.split(" || ");
totalcount = 0;
for(x in theLatLongs){
var theLatLong = "";
var latLong = theLatLongs[x].substr(1);
var latLong = latLong.slice(0, -1);
var theLatLong = latLong.split(", ");
var thelat = theLatLong[0];
var thelong = theLatLong[1];
totalcount = totalcount+1;
}
console.timeEnd('parse');
console.time('display');
$('#totalcount').html('<h6><span>'+totalcount+'</span> Households found</h6>Filtered by location');
console.timeEnd('display');
});
On a side note, you might consider JSON data (and review your use of the theLatLong JS var, but I guess you are debugging). In PHP:
$resultArray = array();
while ($row = mysqli_fetch_assoc($result)) {
    $resultArray[] = $row;
}
$resultJson = json_encode($resultArray);
echo $resultJson;
in js:
console.time('load');
$.getJSON("php/properties.php", {bounds: bounds}, function(data){
console.timeEnd('load');
console.time('parse');
totalcount = 0;
for(var x in data){
var thelat = data[x].lat;
var thelong = data[x].long;
totalcount = totalcount+1;
}
console.timeEnd('parse');
console.time('display');
$('#totalcount').html('<h6><span>'+totalcount+'</span> Households found</h6>Filtered by location');
console.timeEnd('display');
});