I'm trying to figure out how to speed up this operation. Before I import a record from the text file, I first need to see if one already exists in the database. If it does exist, I perform an update on it; if it does not exist, I create a new record.
Running the code you see below, this operation takes somewhere in the neighborhood of 3 hours.
I've tried using ADO's Find method, and it actually appears to be slower than the Filter method.
The database is a Visual Foxpro 6 database. The table does have an index on the item_cd field but the table does not have any primary key established. This is out of my control since I didn't write the software and I'm trying to stay away from making any structural changes to the database.
There are 46,652 rows in the text file and about 650,000 records/rows in the ADO recordset. I think slimming down the recordset would be the biggest step in fixing this, but I haven't come up with any way of doing that. Since there is no primary key, I'm trying to prevent creating duplicate records myself, so I really need to have the entire table in my recordset.
Because I'm running this on my local machine, the operation appears to be limited by the power of the CPU. In actual use this might be run across the network, especially if I can get it to go faster.
Dim sFileToImport As String
sFileToImport = Me.lstFiles.Text
If sFileToImport = "" Then
    MsgBox "You must select a file from the listbox to import."
    Exit Sub
End If

If fConnectToDatabase = False Then Exit Sub

With gXRst
    .CursorLocation = adUseClient
    .CursorType = adOpenKeyset
    .LockType = adLockReadOnly
    .Open "SELECT item_cd FROM xmsalinv ORDER BY item_cd ASC", gXCon
End With

Call fStartProgress("Running speed test.")

Dim rstTxtFile As ADODB.Recordset
Set rstTxtFile = New ADODB.Recordset
Dim con As ADODB.Connection
Set con = New ADODB.Connection
Dim sConString As String, sSQL As String
Dim lRecCount As Long, l As Long
Dim s As String

sConString = "DRIVER={Microsoft Text Driver (*.txt; *.csv)};Dbq=" & gsImportFolderPath & ";Extensions=asc,csv,tab,txt;Persist Security Info=False;"
con.Open sConString

sSQL = "SELECT * FROM [" & sFileToImport & "]"
rstTxtFile.Open sSQL, con, adOpenKeyset, adLockPessimistic

If Not (rstTxtFile.EOF And rstTxtFile.BOF) = True Then
    rstTxtFile.MoveFirst
    lRecCount = rstTxtFile.RecordCount
    Do Until rstTxtFile.EOF = True
        'This code appears to actually be slower than the filter method I'm now using
        'gXRst.MoveFirst
        'gXRst.Find "item_cd = '" & fPQ(Trim(rstTxtFile(0))) & "'"
        gXRst.Filter = "item_cd = '" & fPQ(Trim(rstTxtFile(0))) & "'"
        If Not (gXRst.EOF And gXRst.BOF) = True Then
            s = "Item Found - " & Trim(rstTxtFile(0))       'item found
        Else
            s = "Item Not Found - " & Trim(rstTxtFile(0))   'item not found
        End If
        l = l + 1
        Call subProgress(l, lRecCount, s)
        rstTxtFile.MoveNext
    Loop
End If

Call fEndProgress("Finished running speed test.")

Cleanup:
rstTxtFile.Close
Set rstTxtFile = Nothing
gXRst.Close
con.Close           'also release the text-driver connection
Set con = Nothing
A simple way to speed up the Recordset.Find response is to use the Recordset.Move method first, if that is possible for you. What I have done is use MyRs.Move prior to MyRs.Find to get into the vicinity of my actual record. It really worked for me, since the response of the Move method is quite brisk.
I was using it to locate a patient record, so moving the pointer to a record near the actual one made the MyRs.Find statement work at the speed of light.
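A minimal sketch of the idea (lGuess and sItem are hypothetical placeholders; it assumes the recordset is sorted on item_cd and that lGuess is an estimated row offset landing at or before the target, since Find only searches forward from the current position by default):

gXRst.MoveFirst
gXRst.Move lGuess                                 'jump into the vicinity of the target
gXRst.Find "item_cd = '" & fPQ(sItem) & "'"       'scan only the short remaining distance
If Not gXRst.EOF Then
    'record located
End If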
regards,
MAS.
This doesn't answer your question, and this is a pretty old thread, but why don't you import your text file into a temporary table in your DB and then do a join? Something like:

SELECT tt.*
FROM texttemp tt
LEFT OUTER JOIN xmsalinv xal ON tt.field1 = xal.item_cd
WHERE xal.item_cd IS NULL

This should return the contents of your imported text file that don't have any item_cd matches in the database. Dealing with the text file directly is what complicates the query, which is why I'm wondering why you're not importing its contents into a temporary table.
Now, assuming you know the mapping of the fields, you can probably also use this to insert: if your DB accepts INSERT ... SELECT notation, it'd be insert into xmsalinv (fields) select (matching fields) from (as above...). This moves your choke point to the import process, which I'm hoping is quick.
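Spelled out, using the hypothetical names above (a temp table texttemp whose field1 column holds the item code, extended with whatever other columns your field mapping needs), the insert might look like:

INSERT INTO xmsalinv (item_cd)
SELECT tt.field1
FROM texttemp tt
LEFT OUTER JOIN xmsalinv xal ON tt.field1 = xal.item_cd
WHERE xal.item_cd IS NULL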
The ADO collections seem to be pretty stupid: they don't benefit from any sort of knowledge about the data, so they're kinda slow.
Ah, and the next hit on a Google search for "vb6 filter": http://www.techrepublic.com/article/why-ados-find-method-is-the-devil/1045830
(This response is based on basic SQL knowledge and is not tailored to FoxPro.)
Use a firehose cursor for the VFP query's results if you aren't already, and see your other post here for suggestions regarding the text file Recordset.
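(For reference, a "firehose" cursor is just ADO's default cursor: server-side, forward-only, read-only. A minimal sketch, reusing the question's gXRst and gXCon:)

gXRst.CursorLocation = adUseServer
gXRst.Open "SELECT item_cd FROM xmsalinv ORDER BY item_cd ASC", gXCon, _
           adOpenForwardOnly, adLockReadOnly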
Perhaps better yet, though, you might try getting rid of your slow "loop and search" approach.
I would probably create a temporary Jet 4.0 MDB from scratch for each text file you want to look up: import the text data, index your key field, use ADOX to define a linked table over in the VFP database, then use a query to do your matching.
Close and dispose of the MDB afterward.
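A rough sketch of the first two steps, creating the scratch MDB and importing/indexing the text data (the path, the txtdata table name, and the default F1 column name for a headerless file are all assumptions; the ADOX linked-table step is omitted because its link properties depend on the VFP setup):

Dim cat As ADOX.Catalog     'requires a project reference to Microsoft ADO Ext. (ADOX)
Set cat = New ADOX.Catalog
cat.Create "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Temp\lookup.mdb"

Dim cn As ADODB.Connection
Set cn = cat.ActiveConnection

'Pull the text file straight into a local Jet table. Jet's external-data
'syntax wants the file extension's dot replaced with #.
cn.Execute "SELECT * INTO txtdata FROM [Text;Database=" & gsImportFolderPath & _
           "].[" & Replace(sFileToImport, ".", "#") & "]"

'Index the key field so the matching query can use it.
cn.Execute "CREATE INDEX idxItem ON txtdata (F1)"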
In response to Bob Riemersma's post: the text file is not causing the speed issues. I've changed my code to open a recordset with a query looking for a single item. This code now runs in 1 minute and 2 seconds, as opposed to the three to four hours I was looking at the other way.
Dim sFileToImport As String
sFileToImport = Me.lstFiles.Text
If sFileToImport = "" Then
    MsgBox "You must select a file from the listbox to import."
    Exit Sub
End If

If fConnectToDatabase = False Then Exit Sub

Call fStartProgress("Running speed test.")

Dim rstTxtFile As ADODB.Recordset
Set rstTxtFile = New ADODB.Recordset
Dim con As ADODB.Connection
Set con = New ADODB.Connection
Dim sConString As String, sSQL As String
Dim lRecCount As Long, l As Long
Dim sngQty As Single, sItemCat As String

sConString = "DRIVER={Microsoft Text Driver (*.txt; *.csv)};Dbq=" & gsImportFolderPath & ";Extensions=asc,csv,tab,txt;Persist Security Info=False;"
con.Open sConString

sSQL = "SELECT * FROM [" & sFileToImport & "]"
rstTxtFile.Open sSQL, con, adOpenKeyset, adLockPessimistic

If Not (rstTxtFile.EOF And rstTxtFile.BOF) = True Then
    rstTxtFile.MoveFirst
    lRecCount = rstTxtFile.RecordCount
    Do Until rstTxtFile.EOF = True
        l = l + 1
        sItemCat = fItemCat(Trim(rstTxtFile(0)))
        If sItemCat <> "[item not found]" Then
            sngQty = fItemQty(Trim(rstTxtFile(0)))
        End If
        Call subProgress(l, lRecCount, sngQty & " - " & sItemCat & " - " & rstTxtFile(0))
        sngQty = 0
        rstTxtFile.MoveNext
    Loop
End If

Call fEndProgress("Finished running speed test.")

Cleanup:
rstTxtFile.Close
Set rstTxtFile = Nothing
con.Close           'also release the text-driver connection
Set con = Nothing
My Functions:
Private Function fItemCat(sItem_cd As String) As String
    'Returns "" if passed an empty item code, "[item not found]" if there is no match
    If sItem_cd <> "" Then
        With gXRstFind
            .CursorLocation = adUseClient
            .CursorType = adOpenKeyset
            .LockType = adLockReadOnly
            .Open "SELECT item_cd, ccategory FROM xmsalinv WHERE item_cd = '" & fPQ(sItem_cd) & "'", gXCon
        End With
        If Not (gXRstFind.EOF And gXRstFind.BOF) = True Then
            'An item can technically have a blank category, although it never should
            If gXRstFind!ccategory = "" Then
                fItemCat = "[blank]"
            Else
                fItemCat = gXRstFind!ccategory
            End If
        Else
            fItemCat = "[item not found]"
        End If
        gXRstFind.Close
    End If
End Function
Private Function fIsStockItem(sItem_cd As String, Optional bConsiderItemsInStockAsStockItems As Boolean = False) As Boolean
    If sItem_cd <> "" Then
        With gXRstFind
            .CursorLocation = adUseClient
            .CursorType = adOpenKeyset
            .LockType = adLockReadOnly
            .Open "SELECT item_cd, bal_qty, sug_qty FROM xmsalinv WHERE item_cd = '" & fPQ(sItem_cd) & "'", gXCon
        End With
        If Not (gXRstFind.EOF And gXRstFind.BOF) = True Then
            If gXRstFind!sug_qty > 0 Then
                fIsStockItem = True
            Else
                If bConsiderItemsInStockAsStockItems = True Then
                    If gXRstFind!bal_qty > 0 Then
                        fIsStockItem = True
                    End If
                End If
            End If
        End If
        gXRstFind.Close
    End If
End Function
Private Function fItemQty(sItem_cd As String) As Single
    'Returns 0 if nothing found
    If sItem_cd <> "" Then
        With gXRstFind
            .CursorLocation = adUseClient
            .CursorType = adOpenKeyset
            .LockType = adLockReadOnly
            .Open "SELECT item_cd, bal_qty FROM xmsalinv WHERE item_cd = '" & fPQ(sItem_cd) & "'", gXCon
        End With
        If Not (gXRstFind.EOF And gXRstFind.BOF) = True Then
            fItemQty = CSng(gXRstFind!bal_qty)
        End If
        gXRstFind.Close
    End If
End Function
First, you can try creating an in-memory index on item_cd with

gXRst!item_cd.Properties("OPTIMIZE").Value = True

which will speed up both Find and Filter.
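A minimal sketch of where this goes, assuming the question's client-side recordset (the OPTIMIZE dynamic property only works with client-side cursors, i.e. adUseClient):

gXRst.CursorLocation = adUseClient
gXRst.Open "SELECT item_cd FROM xmsalinv ORDER BY item_cd ASC", gXCon
gXRst!item_cd.Properties("OPTIMIZE").Value = True   'build the in-memory index once, before the lookup loop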
For ultimate speed in searching, initialize a helper index Collection like this:

Dim cIndex As Collection
Set cIndex = New Collection
On Error Resume Next    'Collection.Add raises an error on duplicate keys; skip them
Do While Not gXRst.EOF
    cIndex.Add gXRst.Bookmark, "#" & gXRst!item_cd.Value
    gXRst.MoveNext
Loop
On Error GoTo ErrorHandler
And instead of Find, use a function like this:

Public Function SearchCollection(Col As Object, Index As Variant) As Boolean
    'Touch the keyed element; Err.Number stays 0 only if the key exists
    On Error Resume Next
    IsObject Col(Index)
    SearchCollection = (Err.Number = 0)
    On Error GoTo 0
End Function
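A hypothetical usage inside the import loop, matching the "#" key prefix used when the Collection was filled:

If SearchCollection(cIndex, "#" & Trim(rstTxtFile(0))) Then
    s = "Item Found - " & Trim(rstTxtFile(0))
Else
    s = "Item Not Found - " & Trim(rstTxtFile(0))
End If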
3 hours for just a few hundred thousand records!!! You are doing it the wrong way. Simply:
- append the text file to a VFP table,
- then insert the rows that do not exist in the existing table with a single INSERT SQL,
- and update the ones that do exist with another UPDATE SQL.
That is all, and it should take less than a minute (even a minute is very slow). You can do all of this using the VFPOLEDB provider, and it doesn't matter that you have a VFP6 database: VFPOLEDB has the VFP9 engine built in.
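A rough sketch of the insert/update pair through VFPOLEDB (the scratch table texttemp, its columns, and the data-source path are assumptions; loading the text file into texttemp, e.g. with APPEND FROM inside VFP, is left out, and the correlated-subquery UPDATE assumes VFP9 SQL):

Dim cn As ADODB.Connection
Set cn = New ADODB.Connection
cn.Open "Provider=VFPOLEDB.1;Data Source=C:\YourApp\Data\"   'folder holding the free tables

'Insert the item codes that are not in xmsalinv yet...
cn.Execute "INSERT INTO xmsalinv (item_cd, bal_qty) " & _
           "SELECT t.item_cd, t.bal_qty FROM texttemp t " & _
           "WHERE t.item_cd NOT IN (SELECT item_cd FROM xmsalinv)"

'...then update the ones that already exist.
cn.Execute "UPDATE xmsalinv SET bal_qty = " & _
           "(SELECT t.bal_qty FROM texttemp t WHERE t.item_cd = xmsalinv.item_cd) " & _
           "WHERE item_cd IN (SELECT item_cd FROM texttemp)"

cn.Close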