What is going wrong here? My query is:
Doctrine_Query::create()
->select('rec.*')
->from('Records rec')
->execute();
The Records table contains more than 20k rows. When the query executes on the page, I get this error:
Fatal error: Maximum execution time of 60 seconds exceeded in \doctrine\lib\Doctrine\Collection.php on line 462
Edit: retrieving all the rows from a table with more than 20k records takes over 60 seconds. Why? I use the same table to retrieve other records with a WHERE clause, and that runs perfectly.
Table Structure:
CREATE TABLE IF NOT EXISTS `records` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`state` varchar(255) COLLATE utf8_unicode_ci DEFAULT NULL,
`city` varchar(255) COLLATE utf8_unicode_ci DEFAULT NULL,
`school` varchar(255) COLLATE utf8_unicode_ci DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=16334 ;
The database is local.
I also tried selecting a single column, such as city, but I still get the same error.
There are exactly 16,333 rows and 4 columns, as shown in the table structure above.
Can anybody tell me how to solve this, or at least why it isn't working? A plain MySQL SELECT on the same table runs perfectly.
EDIT----
Can whoever voted this question down please explain why?
Doctrine_Query::create()
->select('rec.*')
->from('Records rec')
->limit(5000)
->execute();
Try this. It will work fine, but if you increase the limit you will run into the same problem. Retrieving that many records at once will cause problems, so try to change the logic you are using. Indexing etc. may also help, but I don't know much about that; maybe someone here can expand on it.
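As a sketch of that "change the logic" idea (assuming Doctrine 1.x; `processRecord()` is a hypothetical placeholder for whatever you do with each row), you could page through the table in batches instead of loading everything into one huge collection:

```php
<?php
// Sketch only: fetch the table in chunks of $batchSize rows.
// Assumes Doctrine 1.x is bootstrapped; processRecord() is hypothetical.
$batchSize = 1000;
$offset    = 0;

do {
    $records = Doctrine_Query::create()
        ->select('rec.*')
        ->from('Records rec')
        ->limit($batchSize)
        ->offset($offset)
        // Array hydration skips building Doctrine_Collection/Record objects,
        // which is where the time in Collection.php is being spent.
        ->execute(array(), Doctrine_Core::HYDRATE_ARRAY);

    foreach ($records as $record) {
        processRecord($record); // hypothetical per-row handling
    }

    $offset += $batchSize;
} while (count($records) === $batchSize);
```

Even without batching, switching to `HYDRATE_ARRAY` alone often makes a large fetch dramatically faster, because object hydration is the expensive part of Doctrine 1's `execute()`.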
When you add a WHERE clause to your query, it potentially lets the database use indexes, making the work much easier for the engine. It is generally a bad idea to grab an entire table in one query, though this may be acceptable for small tables. If you're grabbing the whole table at once, it hints that you're misusing the DB, perhaps treating it as a glorified file system. In general, the tendency should be to grab only what you need, and where possible do global operations and updates on the DB end.
All that being said, 60 seconds for 20,000 records is really slow. You may want to check that your DB is configured properly (e.g. with enough RAM for its buffers). You might find the mysqltuner Perl script at http://mysqltuner.pl helpful.
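On the indexing point: if your WHERE clauses filter on columns like state or city (names taken from the schema above), an index on those columns is what lets MySQL avoid a full table scan. A minimal sketch, assuming the `records` table shown earlier:

```sql
-- Add a composite index covering the columns used in WHERE clauses.
-- The index name idx_state_city is an arbitrary choice.
ALTER TABLE records ADD INDEX idx_state_city (state, city);

-- Verify the index is actually used for a filtered query:
EXPLAIN SELECT * FROM records WHERE state = 'CA' AND city = 'San Jose';
```

Note that an index only helps filtered queries; it does nothing for `SELECT *` over the whole table, which is the case in the question.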
In the php.ini file of your server installation, you will find the configuration option max_execution_time = 60.
Try setting max_execution_time = 240 or even higher.
Don't forget to restart your web server after editing php.ini so the new value is picked up.
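The relevant fragment of php.ini looks like this (240 is just an example value; 0 disables the limit entirely, which is risky on a web server):

```ini
; php.ini -- raise the per-request script timeout (default is 60 seconds)
max_execution_time = 240
```

If you can't edit php.ini (e.g. on shared hosting), the same limit can usually be raised for a single script with PHP's built-in `set_time_limit(240);` or `ini_set('max_execution_time', 240);` at the top of the page. Keep in mind this only hides the symptom; the query itself will still be slow.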