I'm implementing a Google-like search box in my ASP.NET application. Each keypress in the box sends an Ajax request to an IIS webmethod that queries the database and returns matches - works pretty well. However, under load (e.g. 300 users), I'm getting errors that all 100 pooled connections are used up. Now I'm rethinking whether opening/closing a DB connection on each keystroke may be too much. How would one architect this differently, or ensure that the connections are reclaimed really fast? I have the 'using' construct around my connections to ensure each one is closed. My concern is that the GC may not be reclaiming them fast enough.
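For reference, this is roughly the pattern described (`connString` is an assumed configuration value). Note that with `using`, disposal is deterministic, not GC-driven:

```csharp
// 'using' calls Dispose/Close at the end of the block, even on exceptions,
// so the connection goes back to the pool immediately -- garbage collection
// is not what returns pooled connections.
using (var conn = new SqlConnection(connString))  // connString: assumed config value
{
    conn.Open();
    // ... run the autocomplete query ...
}   // <-- connection returned to the pool here, not at the next GC
```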
How would Google handle such a high volume of open/close cycles?
Memcache those keystroke requests/DB responses and avoid the trip to the DB on every request after the first.
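A minimal in-process version of this idea, sketched with `System.Runtime.Caching.MemoryCache` (ships with .NET 4; `QueryDatabase` and the 5-minute expiry are assumptions, not from your code):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

static List<string> CachedSuggest(string prefix)
{
    var cache = MemoryCache.Default;
    var hit = cache.Get(prefix) as List<string>;
    if (hit != null)
        return hit;                                   // served from memory, no DB connection

    List<string> fresh = QueryDatabase(prefix);       // assumed helper: the existing DB lookup
    cache.Set(prefix, fresh, DateTimeOffset.UtcNow.AddMinutes(5));
    return fresh;
}
```

With 300 users typing similar prefixes, most keystrokes become cache hits and never touch the pool.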
Or generate a precompiled list of the possible autocomplete phrases, cache it in memory, and query that instead of the DB. Why do you need to query the DB for a search box at all? Generate an acceptable list and use that instead of making a cross-tier connection!
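A sketch of that in-memory lookup over a sorted phrase list (`LoadPhrases` is an assumed one-time loader). Because the list is sorted, all matches for a prefix are contiguous, so a binary search plus a short scan replaces the DB round trip:

```csharp
using System;
using System.Collections.Generic;

static readonly List<string> Phrases = LoadPhrases();  // assumed: loaded & sorted once at startup

static IEnumerable<string> Suggest(string prefix, int max = 10)
{
    int i = Phrases.BinarySearch(prefix, StringComparer.OrdinalIgnoreCase);
    if (i < 0) i = ~i;   // negative result's bitwise complement = first element >= prefix
    for (; i < Phrases.Count && max > 0; i++, max--)
    {
        if (!Phrases[i].StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
            yield break;                               // past the matching range
        yield return Phrases[i];
    }
}
```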
Or make sure the tables your query hits are properly indexed.
You might be doing this already, but it is also in your best interest to require a minimum number of characters before the autocomplete kicks in, as well as to always retrieve only the top (x) items.
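Combined, that guard looks something like this (the 3-character threshold, table/column names, and `QueryDatabase` helper are all illustrative):

```csharp
const int MinPrefixLength = 3;   // illustrative threshold

public static List<string> Suggest(string prefix)
{
    if (prefix == null || prefix.Trim().Length < MinPrefixLength)
        return new List<string>();                    // too short: skip the DB entirely

    // TOP (10) bounds the rows read and returned on every keystroke.
    return QueryDatabase(                             // assumed helper wrapping the SqlCommand
        "SELECT TOP (10) Phrase FROM dbo.Phrases WHERE Phrase LIKE @p + '%'",
        prefix.Trim());
}
```

Short prefixes match huge result sets, so refusing them cuts both connection churn and query cost.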