I have a big list of int parameters for a SQL query:
update mytable set col='xyz'
where id in (/* thousands of ints */)
My problem is that SQL Server 2000 has a limit on the number of parameters. I may also need to run this query on SQL Server 2008.
What is the best way to do this?
Edit:
The list of IDs comes from a C# program, not from another table.
You can insert the integers into a temp table, and then query like this:
update m set col='xyz'
from mytable m
where exists (select * from #MyTempTable t where t.id = m.id)
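A minimal sketch of that approach, assuming the table and column names from the question (from C#, SqlBulkCopy is the fast way to load thousands of rows into the temp table):
create table #MyTempTable (id int primary key)

-- the application inserts the ids, e.g.:
insert into #MyTempTable (id) values (111)
insert into #MyTempTable (id) values (222)
-- ... (use SqlBulkCopy for thousands of rows)

update m set col='xyz'
from mytable m
where exists (select * from #MyTempTable t where t.id = m.id)

drop table #MyTempTable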
An alternate approach that works with SQL 2000 is to use XML.
Have the program/application format the ints like so:
'<root><TMP J="111"/><TMP J="222"/><TMP J="333"/></root>'
Then create the following stored procedure:
CREATE PROCEDURE UpDateIntsFromXML (
    @sXML TEXT
)
AS
DECLARE @iDoc INT
-- parse the XML text into an internal document handle
EXEC sp_xml_preparedocument @iDoc OUTPUT, @sXML
-- treat each /root/TMP element as a row with an int attribute J
UPDATE YourTable
SET YourColumn = 'fixed value'
FROM OPENXML (@iDoc, '/root/TMP', 1) WITH (J INT) AS X
WHERE X.J = YourTable.IntColumn
-- free the memory held by the parsed document
EXEC sp_xml_removedocument @iDoc
RETURN
Then your application can call that SP, passing a potentially huge block of text/XML.
Observe that root, TMP, and J are all case-sensitive.
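Your application can then call the procedure like any other, e.g. with the sample XML from above:
EXEC UpDateIntsFromXML '<root><TMP J="111"/><TMP J="222"/><TMP J="333"/></root>'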
The best working solution for me was SQL Server 2008's table-valued parameters.
100,000 IDs take 14-20 s; 1,000 IDs take ~140 ms.
sql = @"
update MyTable
set Col1 = 1
where ID in (select * from @ids)
";
sqlCmd = new SqlCommand {Connection = _sqlConn, CommandText = sql};
//Create a DataTable with one Column("id") and all ids as DataRows
DataTable listOfLeadIDs = new DataTable();
listOfIDs.Columns.Add("id", typeof(int));
Ids.ToList<string>().ForEach(x => listOfIDs.Rows.Add(new object[] { int.Parse(x) }));
//Bind this DataTable to the Command-object
// Node: "IntTable" is an User-Defined-Table-Typ (new feature with SQL-2008)
sqlCmd.Parameters.Add(
new System.Data.SqlClient.SqlParameter("@ids", listOfIDs) {
TypeName = "IntTable"
});
//Execute the Query
sqlCmd.ExecuteNonQuery();
The user-defined table type:
CREATE TYPE [dbo].[IntTable] AS TABLE(
[id] [int] NULL
)
GO
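For reference, the type can also be exercised in plain T-SQL, which is handy for testing the update outside the application (a sketch using the MyTable/Col1/ID names from the C# snippet above):
DECLARE @ids dbo.IntTable
INSERT INTO @ids (id) VALUES (1), (2), (3)

UPDATE MyTable
SET Col1 = 1
WHERE ID IN (SELECT id FROM @ids)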
At all costs, avoid IN, especially if you are post-2000. Instead, use EXISTS:
UPDATE myTable
SET col = 'newValue'
FROM myTable
WHERE EXISTS (
    SELECT *
    FROM @myTempTable temp
    WHERE myTable.ID = temp.ID)
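For completeness, @myTempTable here would be a table variable filled beforehand; a minimal sketch (the names are assumptions):
DECLARE @myTempTable TABLE (ID INT PRIMARY KEY)
INSERT INTO @myTempTable (ID) VALUES (111)
INSERT INTO @myTempTable (ID) VALUES (222)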
Divide the data into smaller groups and execute multiple update queries.
There is no reason to use a temp table: since you retrieve the data from outside the db, there is no way to avoid transferring it to the db.
If the ints are in any way sequential (runs of more than two consecutive values), you could collapse them into BETWEEN pairs.
Otherwise, just build a string of these ints and pass it as a single varchar(max) parameter, splitting it server-side, as sketched below.
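A minimal sketch of such a split, assuming comma-separated input (the function name dbo.fnSplitInts and the table/column names are illustrative; the built-in STRING_SPLIT only arrived in SQL Server 2016):
CREATE FUNCTION dbo.fnSplitInts (@list VARCHAR(MAX))
RETURNS @result TABLE (id INT)
AS
BEGIN
    DECLARE @pos INT, @next INT
    SET @pos = 1
    WHILE @pos <= LEN(@list)
    BEGIN
        -- find the next comma; treat end-of-string as a final delimiter
        SET @next = CHARINDEX(',', @list, @pos)
        IF @next = 0 SET @next = LEN(@list) + 1
        INSERT INTO @result (id)
        VALUES (CAST(SUBSTRING(@list, @pos, @next - @pos) AS INT))
        SET @pos = @next + 1
    END
    RETURN
END

The update then joins against the function's result:
UPDATE m SET col = 'xyz'
FROM mytable m
WHERE EXISTS (SELECT * FROM dbo.fnSplitInts(@ids) s WHERE s.id = m.id)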
I think you probably want to create a memory-based temporary table with an index. Assuming the table you are querying against is large, you do not want a table scan that compares each row against each of your 5000 matches; you want a join with the help of two indexes.
CREATE TEMPORARY TABLE IF NOT EXISTS inputlist
(i INT PRIMARY KEY) ENGINE = MEMORY;
INSERT INTO inputlist (i) VALUES (1),(2),(3),(1000),(2000),(5000);
SELECT * FROM your_table JOIN inputlist ON your_table.intvalues = inputlist.i;
DROP TEMPORARY TABLE inputlist;
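Since the question is about an UPDATE rather than a SELECT, the equivalent join-based update in MySQL (the column name col is an assumption) would be:
UPDATE your_table
JOIN inputlist ON your_table.intvalues = inputlist.i
SET your_table.col = 'xyz';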
The SQL above is MySQL; see:
http://dev.mysql.com/doc/refman/5.1/en/memory-storage-engine.html
http://dev.mysql.com/doc/refman/5.1/en/insert.html
http://dev.mysql.com/doc/refman/5.1/en/create-table.html