Some time back I encountered a scenario in which a chain of stored procedures was called recursively, and this was eating into our performance (90% CPU time). All of a sudden, an obscure micro issue about whether to use table variables or temp tables became important.
After ferreting out some good sources of information comparing the two, we managed to reduce the CPU consumption by
- Making all our stored procedures use temp tables rather than table variables. This made a difference for us because our procedures were being invoked frequently, and in our case the stored procedures were recompiled on every invocation when table variables were used.
- The extra db writes caused by temp tables had minimal impact, because very little data was actually involved.
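A minimal sketch of the kind of change described above, with hypothetical table and procedure names (the real procedures were different):

```sql
-- Before: a table variable inside a frequently-invoked procedure.
-- In our scenario this pattern coincided with a recompile on every call.
CREATE PROCEDURE dbo.GetActiveOrders_TableVar
AS
BEGIN
    DECLARE @Orders TABLE (OrderId INT PRIMARY KEY, Total MONEY);
    INSERT INTO @Orders (OrderId, Total)
    SELECT OrderId, Total FROM dbo.Orders WHERE Status = 'Active';
    SELECT OrderId, Total FROM @Orders;
END
GO

-- After: the same logic rewritten to use a temp table.
CREATE PROCEDURE dbo.GetActiveOrders_TempTable
AS
BEGIN
    CREATE TABLE #Orders (OrderId INT PRIMARY KEY, Total MONEY);
    INSERT INTO #Orders (OrderId, Total)
    SELECT OrderId, Total FROM dbo.Orders WHERE Status = 'Active';
    SELECT OrderId, Total FROM #Orders;
    DROP TABLE #Orders;  -- dropped automatically at proc end anyway
END
GO
```

Whether this swap helps depends heavily on invocation frequency and data volume, so it is worth measuring before and after rather than assuming.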
Ultimately, the biggest improvement came from changes to the algorithm, ensuring that the procedures were not needlessly invoked, or at least that parts of them did not run (the recursive algorithm was flattened out).
Learning – If micro-optimizations start to look important, it is time to look for improvements in the algorithms being used.
NOTES – Information gleaned from the exercise
- For most scenarios, table variables are what you will want
- Table variables cause FEWER recompilations than temporary tables. (source MSDN)
- Table variables seem to be scoped at the stored procedure level. So if you declare a table variable inside a while loop, it retains data from earlier iterations (painfully discovered via a bug)
- Table variables cannot have indexes created on them after the fact; only constraints such as the primary key declared in the table definition are allowed
- The CREATE INDEX command can cause query recompilations
- Operations on table variables do not participate in transactions – therefore rollbacks do not undo changes to them. On the upside, no locks are acquired by using table variables.
- Table variables cannot have statistics created on them, so for large data sets they COULD be inefficient
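The scoping and primary-key notes above can be illustrated with a small hypothetical batch; the DECLARE does not reset the table on each pass through the loop:

```sql
DECLARE @i INT = 1;
WHILE @i <= 3
BEGIN
    -- The table variable is scoped to the batch/procedure, not the loop
    -- body, so rows inserted in earlier iterations are still present here.
    -- The PRIMARY KEY must be declared inline; CREATE INDEX is not allowed.
    DECLARE @Work TABLE (Iteration INT PRIMARY KEY);
    INSERT INTO @Work (Iteration) VALUES (@i);
    SELECT COUNT(*) AS RowsSoFar FROM @Work;  -- grows each iteration

    -- Fix for the loop bug: clear the table explicitly each iteration.
    -- DELETE FROM @Work;
    SET @i += 1;
END
```

If fresh contents are needed per iteration, an explicit DELETE (as commented above) is the straightforward fix.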