BarryC
I have a program that imports data into temp tables, processes it, and places the results into final tables. My problem is that it is really slow. Each temp processing table has almost 2 million records in it, and I have identified the deletion of these temporary records as a major bottleneck.

My first thought is to drop the tables and recreate them instead of deleting the records. Any thoughts on this? Better ideas? I've seen comments that this isn't good for DB stability.
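
To be concrete, here is a rough sketch of the options I'm weighing (assuming SQL Server; the table name ImportStage and its columns are just placeholders, not my real schema):

-- Option 1: what I do now. DELETE is fully logged row by row,
-- which is slow on ~2 million rows.
DELETE FROM ImportStage;

-- Option 2: TRUNCATE TABLE. Minimally logged, deallocates whole
-- pages, and keeps the table definition and permissions intact.
TRUNCATE TABLE ImportStage;

-- Option 3: drop and recreate (what I was considering).
DROP TABLE ImportStage;
CREATE TABLE ImportStage (
    Id INT NOT NULL,
    Payload VARCHAR(255) NULL
    -- ... real column list goes here
);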