BlockNinja
Problem:
When I run an append query in Access, it generates a large "cache" for the
"select" part of that query. I have several linked tables, plus another table
that holds the names of those tables (each linked table is detailed data
stored in a text file that is about 85MB in size and has, on average, about
300k-400k records). What I want to do is loop through each table, compute
summary data via a dynamically built crosstab query (since crosstabs don't
seem to allow parameterized queries in Access 2000, which I use at work), and
then append those results to a "history" table via an append query (i.e.
db.Execute "INSERT INTO tbl_Historical_Data SELECT * FROM
qry_NewCrosstabQuery").

My process works fine, except that each time I run that crosstab query, it
generates a huge cache that doesn't get deleted after it runs. After about 12
of these text files have been queried, my database grows dangerously close to
the 2GB limit. Once I compact the database after everything is done, it goes
back down to about 150MB, so you can see what's going to happen if I had,
say, 24 text files (each one, in my case, is a month of detailed data). As it
stands, I would have to split the work into several runs, compacting the
database each time, even though there's no need for that cache to exist once
the append query has run.
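For reference, the loop described above might be sketched roughly like this in DAO. This is only an illustration of the process in the post, not its exact code: tbl_TableNames, its TableName field, and the BuildCrosstabSQL helper are hypothetical names, and the saved crosstab query's SQL is assumed to be rewritten each pass.

```vba
' Sketch of the per-table loop described in the post (DAO, Access 2000).
' tbl_TableNames / TableName and BuildCrosstabSQL are hypothetical names.
Sub AppendAllSummaries()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset

    Set db = CurrentDb()
    Set rs = db.OpenRecordset("SELECT TableName FROM tbl_TableNames", _
                              dbOpenSnapshot)

    Do While Not rs.EOF
        ' Rebuild the saved crosstab query's SQL for this month's linked table.
        db.QueryDefs("qry_NewCrosstabQuery").SQL = _
            BuildCrosstabSQL(rs!TableName)

        ' Append the crosstab results to the history table.
        ' Note: INSERT INTO (append), not SELECT ... INTO (make-table).
        db.Execute "INSERT INTO tbl_Historical_Data " & _
                   "SELECT * FROM qry_NewCrosstabQuery", dbFailOnError

        rs.MoveNext
    Loop

    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Sub
```

Each pass through the loop is what leaves behind the temporary working space that Jet only reclaims on compact, which is what the question below is about.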
Question:
Is it possible to find that huge cache that was created when the append
query ran (through MSysObjects/MSysQueries, etc.?) and nuke it on the fly
once the append query has finished and I no longer need the cache to exist?
At that point in each loop, it would be perfectly safe, for my purposes, to
discard it.