Prevent ADO.NET from using sp_executesql

In our SQL Server 2005 database (tested using Management Studio with DBCC FREEPROCCACHE and DBCC DROPCLEANBUFFERS), the following statement is fast (~0.2s compile time, ~0.1s execution time):

SELECT ... FROM ... WHERE a = 1 AND b = '' ...

The following statement, however, is slow (~0.2s compile time, 7-11s execution time):

exec sp_executesql N'SELECT ... FROM ... WHERE a = @a AND b = @b ...', N'@a int, @b nvarchar(4000), ...', @a=1, @b=N'', ...

SQL Server chooses a different execution plan, although the two queries are logically equivalent. This makes sense: in the first case, SQL Server has the actual values of a, b and all the other parameters available and can use the statistics to create a better plan. Apparently, the query plan for the concrete parameter values is much better than the generic one and definitely outweighs any "query plan caching" performance benefit.

Now my question: ADO.NET always seems to use the second option (sp_executesql) when executing parameterized queries, which usually makes sense (query plan caching, etc.). In our case, however, this kills performance. So, is there some way to either

  • force ADO.NET to use something different than sp_executesql (i.e., something where the SQL Server query analyzer takes the actual parameter values into account) OR
  • force SQL Server to recalculate the query plan of the SQL passed to sp_executesql, taking the parameter values into account?

And please don't tell me I have to go back to ugly, old, dangerous sql = "WHERE b = " + quoteAndEscape(parameterB)...

Putting the SQL into a stored procedure makes no difference (still slow, with and without WITH RECOMPILE). I did not post the actual SQL statement since it is quite complex (joins over multiple tables, including sub-SELECTs and aggregation).
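
For completeness, the ADO.NET side is nothing unusual - a plain parameterized SqlCommand along these lines is what ends up on the wire as the exec sp_executesql batch shown above (a simplified sketch; "Orders" and its columns are placeholders for the real query, and connectionString stands for our connection string):

// Requires: using System.Data.SqlClient;
// SqlClient wraps the parameterized command in exec sp_executesql, and .NET
// strings are sent as Unicode (nvarchar) parameters by default.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT o.Id FROM Orders o WHERE o.a = @a AND o.b = @b", conn))
{
    cmd.Parameters.AddWithValue("@a", 1);
    cmd.Parameters.AddWithValue("@b", "");
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // ... consume results ...
    }
}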


Old thread I know, but I just found it by googling pretty much the exact same phrase! I had exactly the same issue (the query ran very fast in Management Studio using parameters, but really slowly via ADO.NET) and replicated it by running the query in Management Studio via exec sp_executesql. The two execution plans were very different, even with the OPTIMIZE FOR query hint. What I did instead was an initial SELECT of some of the data into a temporary table, and that seemed to make the difference. Given you say your query is a complex one, it might very well make the difference in your case too - I'm not quite sure how it works, but it seemed to bring the execution plan into line even when going through sp_executesql.
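
A rough sketch of that temp-table approach from ADO.NET, assuming hypothetical Orders/OrderDetails tables in place of the real (complex) query - both statements run in the same batch, so the temp table is visible to the second SELECT:

// Requires: using System.Data; using System.Data.SqlClient;
// "Orders", "OrderDetails" and the columns are placeholders for the real query.
using (var conn = new SqlConnection(connectionString))
using (var cmd = conn.CreateCommand())
{
    cmd.CommandText = @"
        -- 1) Materialize a narrowed-down row set first...
        SELECT o.Id, o.a, o.b
        INTO #filtered
        FROM Orders o
        WHERE o.a = @a AND o.b = @b;

        -- 2) ...then run the complex joins/aggregations against the temp table.
        SELECT f.Id, SUM(d.Amount) AS Total
        FROM #filtered f
        JOIN OrderDetails d ON d.OrderId = f.Id
        GROUP BY f.Id;";
    cmd.Parameters.Add("@a", SqlDbType.Int).Value = 1;
    cmd.Parameters.Add("@b", SqlDbType.NVarChar, 4000).Value = "";
    conn.Open();
    using (var reader = cmd.ExecuteReader()) { /* ... consume results ... */ }
}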


You could try the OPTIMIZE FOR query hint which (quote):

Instructs the query optimizer to use a particular value for a local variable when the query is compiled and optimized. The value is used only during query optimization, and not during query execution. OPTIMIZE FOR can counteract the parameter detection behavior of the optimizer or can be used when you create plan guides.
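
For the query in the question, that would mean appending the hint to the parameterized statement sent from ADO.NET, roughly like this (placeholder table/column names again; the hinted values mirror the literals from the fast ad-hoc version):

// Requires: using System.Data; using System.Data.SqlClient;
// OPTIMIZE FOR makes the optimizer compile the plan as if @a = 1 and @b = N''.
using (var conn = new SqlConnection(connectionString))
using (var cmd = conn.CreateCommand())
{
    cmd.CommandText = @"
        SELECT o.Id
        FROM Orders o
        WHERE o.a = @a AND o.b = @b
        OPTION (OPTIMIZE FOR (@a = 1, @b = N''))";
    cmd.Parameters.Add("@a", SqlDbType.Int).Value = 1;
    cmd.Parameters.Add("@b", SqlDbType.NVarChar, 4000).Value = "";
    conn.Open();
    using (var reader = cmd.ExecuteReader()) { /* ... consume results ... */ }
}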


I believe the issue has to do with using the VARCHAR data type in the database. SQL Server doesn't appear to use the index if the parameter in the WHERE clause is declared as NVARCHAR.

You could, however, change your database column to NVARCHAR (this would increase its size, of course), and then index performance would likely improve.
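
An alternative to widening the column is to declare the parameter's type explicitly on the ADO.NET side, so it matches the VARCHAR column and no implicit conversion is needed (a sketch against the existing SqlCommand; the parameter name and length are placeholders):

// Requires: using System.Data; using System.Data.SqlClient;
// Send @b as varchar instead of the default nvarchar so the index on the
// VARCHAR column can still be used. "cmd" is the existing SqlCommand.
var pB = cmd.Parameters.Add("@b", SqlDbType.VarChar, 100); // length is a placeholder
pB.Value = "";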

I am currently having this issue with LINQ, and may actually need to revert to using stored procedures to get around it.

The issue is explained in detail in this Microsoft Connect discussion.


I would move the query into a stored procedure and then specify command.CommandType = CommandType.StoredProcedure on the command.

This avoids the sp_executesql wrapper and should improve performance.
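
A minimal sketch of that, assuming a hypothetical procedure dbo.GetOrders that wraps the query:

// Requires: using System.Data; using System.Data.SqlClient;
// With CommandType.StoredProcedure the procedure is invoked directly (as an RPC call)
// instead of being wrapped in an ad-hoc sp_executesql batch.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.GetOrders", conn)) // placeholder procedure name
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@a", SqlDbType.Int).Value = 1;
    cmd.Parameters.Add("@b", SqlDbType.NVarChar, 4000).Value = "";
    conn.Open();
    using (var reader = cmd.ExecuteReader()) { /* ... consume results ... */ }
}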
