I am rewriting an Access application that uses a lot of temp tables. Meaning the data gets added to a table, massaged, used for updates in other tables, and then deleted. When I move this to SQL Server, I am trying to figure out whether it is better practice to use a similar staging/temp table process or a table that is held in memory.
Is there a preferred method for this?
EDIT:
Per a request for additional info: the current process basically runs each morning.
Table1 data moves to Table2 (temp)
Table1 data is deleted
Table1 gets new data for the day
Table2 gets a few updates
Table2 is then used to update Table1
Table2 data is deleted.
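Translated into SQL Server terms, that morning run might look like the sketch below. The table and column names (`Table1`, `Table2`, `DailyImport`, `Id`, `Amount`) and the "massage" step are placeholders, not taken from the actual application:

```sql
BEGIN TRANSACTION;

-- Move yesterday's working set into the staging table
INSERT INTO Table2 (Id, Amount)
SELECT Id, Amount FROM Table1;

-- Clear Table1 and load the new day's data
DELETE FROM Table1;
INSERT INTO Table1 (Id, Amount)
SELECT Id, Amount FROM DailyImport;

-- Massage the staged data (placeholder transformation)
UPDATE Table2 SET Amount = Amount * 1.05;

-- Use the staged data to update Table1
UPDATE t1
SET t1.Amount = t2.Amount
FROM Table1 AS t1
JOIN Table2 AS t2 ON t2.Id = t1.Id;

-- Clear the staging table for the next run
DELETE FROM Table2;

COMMIT TRANSACTION;
```

Wrapping the steps in one transaction keeps a failed run from leaving Table1 half-updated, which is harder to guarantee in the Access version.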
As I said, this process currently runs in a horribly designed Access DB, so we are redesigning it for SQL Server.
This is one of those "it depends" situations. By and large, if there will be multiple users (or automated sources) concurrently running a process that uses a "temp table methodology", you are better off using temp tables, as then each instance will have its own unique set of (one or more) temp tables. However, if there will only ever be one instance of such a process, it can make sense to have a "fixed" set of permanent staging tables within which to do the work.
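In SQL Server the two approaches look like this (schema and names are illustrative, not from the question):

```sql
-- Option A: a session-scoped temp table. Each connection gets its own
-- private copy, so concurrent instances of the process cannot collide.
CREATE TABLE #Staging (Id INT PRIMARY KEY, Amount DECIMAL(10, 2));
-- ... load, massage, update the permanent tables ...
DROP TABLE #Staging;  -- also dropped automatically when the session ends

-- Option B: a permanent staging table. Fine for a single scheduled
-- morning job, and easier to inspect after a failed run, but it is
-- shared by every caller.
CREATE TABLE dbo.Staging (Id INT PRIMARY KEY, Amount DECIMAL(10, 2));
TRUNCATE TABLE dbo.Staging;  -- clear any leftovers before each run
```

A table variable (`DECLARE @Staging TABLE (...)`) is a third option for small working sets, but it is scoped to a single batch and the optimizer has very limited statistics on it, so it tends to fit poorly with a multi-step massage-and-update process like this one.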