
AJAX autocomplete performance issues: hit the DB or filter pre-cached results?


In my existing ASP.NET application, I have an AJAX autocomplete control that uses a web method to get the autocomplete results, which in turn hits a stored procedure. But since hitting the DB on every keystroke is costly and a performance issue, we wanted to cache all of the lookup results (around 45,000 rows) and filter the cached data to get the matches.

We are using a LINQ query that checks the prefix with the Contains method. But when I filter the cached data this way, it takes much longer than the original implementation that hits the DB every time.

Is there any approach you can suggest that would give a much quicker result set as the user types into the UI?

I know that maintaining 45k rows in memory and filtering them would be a real pain, and that hitting the DB might well be the better approach.

Since we are facing some performance issues, please let me know of any better approach.

Would replacing the AJAX AutoCompleteExtender with the jQuery autocomplete plugin make any difference?

Code (just like any other AJAX autocomplete setup):

<ajaxToolkit:AutoCompleteExtender 
    runat="server" 
    ID="autoComplete1" 
    TargetControlID="myTextBox"
    ServiceMethod="GetUserList"></ajaxToolkit:AutoCompleteExtender>
[WebMethod]
public string[] GetUserList(string prefix)
{
    return UserManager.GetUserNamesBySearch(prefix);
}

public string[] GetUserNamesBySearch(string prefix)
{
    // Hits the service (and the stored procedure behind it) on every keystroke.
    List<User> userCollection = UserServiceMgr.GetUserList(prefix);

    var filteredUsers = from user in userCollection
                        where user.FirstName.Contains(prefix)
                        select user.FirstName;

    return filteredUsers.ToArray();
}
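
For clarity, the cached-filtering variant described above might look roughly like the sketch below. The static field and the GetAllUsers() loader are illustrative assumptions, not part of the original code, and it assumes the usual System.Linq and System.Collections.Generic usings:

// Hypothetical sketch of the cached-filtering variant described in the question.
// UserServiceMgr.GetAllUsers() is assumed to return the full ~45,000-row lookup list.
private static List<User> cachedUsers;

public string[] GetUserNamesFromCache(string prefix)
{
    if (cachedUsers == null)
    {
        cachedUsers = UserServiceMgr.GetAllUsers();   // loaded once, reused on later keystrokes
    }

    // Contains() scans every row and every character position on each keystroke,
    // which is why this can end up slower than the original stored-procedure call.
    return cachedUsers
        .Where(u => u.FirstName.Contains(prefix))
        .Select(u => u.FirstName)
        .ToArray();
}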

Thanks in advance

Suri


IMO, 45k+ records is too many to try to build a caching strategy around.

Given that the "cache key" is a partial string, you will end up with duplicate results in the cache that you then have to deal with.

We also use autocomplete (jQuery) in our ASP.NET MVC application, and we have millions of records in the DB.

We use a stored procedure with Full Text Search over indexed views in the backend.

And given the amount of data, it performs reasonably well.

So I would suggest using an FTS provider such as SQL Server or Lucene to optimize your backend, rather than relying on the ASP.NET cache.

FTS is a natural fit for autocomplete, since it has built-in smarts for things like noise words and thesauruses.
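
To make that concrete, here is a rough sketch of what such a full-text prefix query might look like from C#. The Users table, the FirstName column, and the connection string are assumptions for illustration, not something from the original post:

// Hypothetical sketch: query a full-text-indexed Users table for prefix matches.
// Table name, column name, and connection string are assumptions.
using System.Collections.Generic;
using System.Data.SqlClient;

public static class UserSearch
{
    public static string[] GetUserNamesByFullTextSearch(string prefix, string connectionString)
    {
        var results = new List<string>();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            // CONTAINS with a trailing * performs a prefix search against the full-text index.
            "SELECT TOP 10 FirstName FROM Users WHERE CONTAINS(FirstName, @term)",
            connection))
        {
            // Quote the term and append * so that e.g. "sur" becomes "\"sur*\"".
            command.Parameters.AddWithValue("@term", "\"" + prefix + "*\"");

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(reader.GetString(0));
                }
            }
        }

        return results.ToArray();
    }
}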


If your underlying data source isn't going to change very often, you can consider storing the results in the application's Cache and establishing a timeout to refresh the data. If the data changes every so often, you can set a dependency on the source table so that when the table's data is modified, the cached object is invalidated and then rebuilt. If the data changes often, then I would recommend doing exactly what you're doing now.

When I say "rebuild" the cached object, I mean it is your application's responsibility to test for the presence of the cached object and, if it is not present, to rebuild it and re-establish it in the cache.

Keep in mind that if you use a SqlDependency, you must set the database up for it; aspnet_regsql.exe can do that setup. Also, if your application relies on the data being guaranteed accurate, then I would either keep doing what you're doing now or use SqlDependency.
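
As a rough illustration of that approach, one way to wire it up is with the SqlCacheDependency class (the cache-specific wrapper around table-change notifications). The database entry name, table name, one-hour timeout, and the GetAllUserNames() loader below are assumptions for the sketch, not from the original code:

// Hypothetical sketch: cache the lookup list and invalidate it when the source table changes.
// The "UsersDb" entry and "Users" table must already be enabled with aspnet_regsql.exe and
// registered under <sqlCacheDependency> in web.config; all names here are illustrative.
using System;
using System.Web;
using System.Web.Caching;

public static class UserNameCache
{
    private const string CacheKey = "AllUserNames";

    public static string[] GetCachedUserNames()
    {
        var names = HttpRuntime.Cache[CacheKey] as string[];
        if (names == null)
        {
            // Rebuild the cached object when it is missing (first hit, expiry, or invalidation).
            names = UserManager.GetAllUserNames();          // assumed loader for the ~45k rows

            HttpRuntime.Cache.Insert(
                CacheKey,
                names,
                new SqlCacheDependency("UsersDb", "Users"), // invalidated when the Users table changes
                DateTime.UtcNow.AddHours(1),                // fallback absolute timeout
                Cache.NoSlidingExpiration);
        }

        return names;
    }
}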

Caching Example

SqlDependency

aspnet_regsql.exe
