I have a single table in a database called Users
Users
------
ID (PK, Identity)
Username (Unique Index)
I have set up a unique index on the Username column to prevent duplicates. I am then enumerating through a collection and creating a new user in the database for each item.
What I want to do is just insert a new user and ignore the exception if the unique key constraint is violated (as it's clearly a duplicate record in that case). This is to avoid having to craft where not exists kind of queries.
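For reference, the kind of statement I'm trying to avoid hand-crafting for every insert looks roughly like this (just a sketch, issued through a LINQ to SQL DataContext's ExecuteCommand; db here stands for that context):

// Sketch: an idempotent insert using WHERE NOT EXISTS, parameterised by
// ExecuteCommand. Returns 0 if the username already existed, 1 if it was inserted.
int rows = db.ExecuteCommand(
    "INSERT INTO Users (Username) SELECT {0} " +
    "WHERE NOT EXISTS (SELECT 1 FROM Users WHERE Username = {1})",
    name, name);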
First off, is this going to be any more efficient or should my insert code be checking for duplicates instead? I'm drawn more to the database having that logic as this prevents any other type of client from inserting duplicate data.
My other issue is related to LINQ To SQL. I have the following code:
public class TestRepo
{
    DatabaseDataContext database = new DatabaseDataContext();

    public void Add(string username)
    {
        database.Users.InsertOnSubmit(new User() { Username = username });
    }

    public void Save()
    {
        database.SubmitChanges();
    }
}
And then I iterate over a collection and insert new users, ignoring any exceptions:
TestRepo repo = new TestRepo();

foreach (var name in new string[] { "Tim", "Bob", "John" })
{
    try
    {
        repo.Add(name);
        repo.Save();
    }
    catch { }
}
The first time this runs, great: I have three users in the table. If I then delete the second user and run the code again, nothing is inserted. I expected the first insert to fail with the exception, the second to succeed (as I just removed that row from the DB) and the third to then fail.
What seems to be happening is that once the SqlException is thrown (even though the loop continues to iterate) all of the next inserts fail - even when there isn't a row in the table that would cause a unique violation.
Can anyone explain this?
P.S. The only workaround I could find was to instantiate the repo each time before the insert; then it worked exactly as expected - indicating that it's something to do with the LINQ to SQL DataContext.
Thanks.
On the second iteration the DataContext will try again to insert the first object, since it is still sitting in the identity map of that DataContext (your request to insert 'Tim' is still pending after the first catch).
So in the first iteration you are submitting 'Tim'; in the second, 'Tim' and 'Bob'; in the third, 'Tim', 'Bob' and 'John'. You never submit only 'Bob' on its own, which is why the second insert fails even after you removed that row.
You could try to remove the failing entity from the DataContext in the catch block, but in any case multiple submits on one DataContext are a very bad idea (I got burned by this).
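If you do want to keep a single DataContext, something along these lines might work (a sketch only; it assumes that calling DeleteOnSubmit on an entity that is still pending insertion removes it from the change set rather than scheduling a delete):

var user = new User { Username = name };
database.Users.InsertOnSubmit(user);
try
{
    database.SubmitChanges();
}
catch (System.Data.SqlClient.SqlException)
{
    // Drop the failed entity from the pending inserts so the next
    // SubmitChanges does not try to re-insert it.
    database.Users.DeleteOnSubmit(user);
}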
Try calling DataContext.Refresh on your DataContext in between test runs.
I think you are getting hung up on the internal caching in LINQ to SQL; that is why re-creating the DataContext acts as a workaround. When you new up a DataContext, the new object's cache is empty.
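That is also why instantiating the repository (and hence the context) per insert behaved the way you expected: each attempt starts with an empty change set. Roughly like this (a sketch, reusing the User entity and generated DatabaseDataContext from your question):

foreach (var name in new string[] { "Tim", "Bob", "John" })
{
    // A fresh context per attempt, so a failed insert cannot linger in the
    // change set and poison the next SubmitChanges.
    using (var db = new DatabaseDataContext())
    {
        try
        {
            db.Users.InsertOnSubmit(new User { Username = name });
            db.SubmitChanges();
        }
        catch (System.Data.SqlClient.SqlException)
        {
            // Unique index violation - the username already exists, skip it.
        }
    }
}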
You should check for existence instead; relying on the try/catch trick is bad.
foreach (var name in new string[] { "Tim", "Bob", "John" })
{
    var u = from users in db.Users
            where users.Username == name
            select users;

    if (u.Count() == 0)
    {
        User user = new User();
        user.Username = name;
        db.Users.InsertOnSubmit(user);
        db.SubmitChanges(System.Data.Linq.ConflictMode.ContinueOnConflict);
    }
}
It is hard to say if your method is "more efficient." It would depend on how often you get duplicate accounts. If you have a very high rate, then you lose speed as it will likely take more time to deal with the exception than to check for the existence of the record.
If you want to maximize your speed, then create a stored procedure to check and insert if it isn't in the database.
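Something like this would let the database do the check and the insert in a single round trip (a sketch; dbo.EnsureUser is a made-up procedure name that would wrap an IF NOT EXISTS ... INSERT, and you could equally map it to a method on the DataContext in the O/R designer):

using (var db = new DatabaseDataContext())
{
    foreach (var name in new string[] { "Tim", "Bob", "John" })
    {
        // One call per name; the duplicate check happens inside the procedure.
        db.ExecuteCommand("EXEC dbo.EnsureUser @Username = {0}", name);
    }
}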