I have been trying to get a Berkeley DB up and running for a C# project I have. I downloaded and built the .NET version from Oracle. For the most part, it is working the way I would expect and performing like a champ. But the one issue I'm having trouble with has to do with duplicate keys.
First, every time I try to use the PutNoDuplicate method, it throws an exception even though the key is not a duplicate:
db.PutNoDuplicate(k, v);
The output on the console says "Illegal flag specified to DB->put", and an exception is raised with error code 22. The exception itself carries no further information.
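For context, here is roughly how I'm opening the database and making the call. This is a simplified reconstruction, not my exact code, and the key/value contents are just placeholders:

```csharp
using System.Text;
using BerkeleyDB;

var cfg = new BTreeDatabaseConfig();
cfg.Creation = CreatePolicy.IF_NEEDED;
// Guess: the docs for the underlying C flag (DB_NODUPDATA) suggest it is only
// legal on databases configured for sorted duplicates, so this setting may be
// the missing piece -- I haven't confirmed it:
// cfg.Duplicates = DuplicatesPolicy.SORTED;

var db = BTreeDatabase.Open("test.db", cfg);

var k = new DatabaseEntry(Encoding.UTF8.GetBytes("some-key"));
var v = new DatabaseEntry(Encoding.UTF8.GetBytes("some-value"));

// This is the call that fails with "Illegal flag specified to DB->put" (errno 22).
db.PutNoDuplicate(k, v);
```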
So I figured, OK, maybe this method is buggy and I'll just do the check myself. I changed the code to:
if (db.Exists(k))
throw new System.Data.DataException("Duplicate key");
else
db.Put(k, v);
This works, but it is extremely slow. For reference, calling db.Put alone adds roughly 10,000 records almost instantaneously, but with the db.Exists check above it slows to roughly 40 records per second. That just can't be right.
Can anyone offer any insight on what is going on? Thanks!
EDIT: So I've figured out part of it. The slowness of the db.Exists() check was happening because the Berkeley DB code routinely throws, and internally handles, an exception when the function returns false. When I run the release build of the code, the speed is back up where I would expect it. But I would still love to know why the PutNoDuplicate call is failing in the first place.
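For anyone who wants to reproduce the debug-vs-release effect without Berkeley DB at all, here is a minimal sketch of what I believe the mechanism is (my assumption about why Exists was slow, not Oracle's actual code): throwing and swallowing an exception per miss is cheap in a plain release run but dramatically slower with a debugger attached, because every throw is a first-chance exception the debugger has to process.

```csharp
using System;
using System.Diagnostics;

class ExceptionCostDemo
{
    static void Main()
    {
        const int n = 10000;
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < n; i++)
        {
            try
            {
                // Simulates the internal "key not found" path I suspect
                // db.Exists() hits on every miss.
                throw new InvalidOperationException("simulated key-not-found");
            }
            catch (InvalidOperationException)
            {
                // Swallowed internally, just as the library appears to do.
            }
        }
        sw.Stop();
        Console.WriteLine($"{n} thrown/caught exceptions in {sw.ElapsedMilliseconds} ms");
    }
}
```

Run it once under the debugger and once as a standalone release build; the gap in elapsed time is the same kind of gap I saw between my debug and release runs.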