How to delete query cache in Access

BlockNinja

Problem:
When I run an append query in Access, it generates a large "cache" for the
SELECT part of that query. I have several linked tables, plus another table
that holds the names of those tables (each linked table is detailed data
stored in a text file of about 85MB, averaging 300k-400k records). What I
want to do is loop through each table, compute summary data via a dynamically
built crosstab query (since crosstabs don't seem to allow parameterized
queries in Access 2000, which I use at work), and then append those results
to a "history" table via an append query, i.e.:

db.Execute "INSERT INTO tbl_Historical_Data SELECT * FROM qry_NewCrosstabQuery"

My process works fine, except that each time I run that crosstab query, it
generates a huge cache which doesn't get deleted after it runs, so after
about 12 of these text files have been queried, my database grows dangerously
close to the 2GB limit. Once I compact the database after everything is done,
it goes back down to about 150MB. You can see what would happen if I had,
say, 24 text files (each one in my case is a month of detailed data): as it
stands, I would have to split the job into several processes and compact the
database each time, even though there's no need for that cache to exist after
the append query runs.
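
For reference, a rough sketch of the loop described above in DAO VBA (the
table, query, and helper names here are assumptions for illustration, not the
actual ones):

```vba
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim strSQL As String

Set db = CurrentDb()
' tbl_FileList is an assumed name for the table holding the linked-table names
Set rs = db.OpenRecordset("SELECT TableName FROM tbl_FileList")

Do Until rs.EOF
    ' Point the saved crosstab at the current linked table by rebuilding its
    ' SQL from a template string (BuildCrosstabSQL is a hypothetical helper)
    strSQL = BuildCrosstabSQL(rs!TableName)
    db.QueryDefs("qry_NewCrosstabQuery").SQL = strSQL

    ' Append the crosstab results to the history table
    db.Execute "INSERT INTO tbl_Historical_Data " & _
               "SELECT * FROM qry_NewCrosstabQuery", dbFailOnError

    rs.MoveNext
Loop
rs.Close
```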

Question:
Is it possible to find that huge cache that was created when the append
query was run (through MSysObjects/MSysQueries/etc.?), and nuke it on the fly
once the append query has run and I no longer need the cache to exist? At
that point in each loop it would be safe, for my purposes, to discard it.
 
John Spencer

I don't have a complete answer, but crosstabs do allow parameters.
With a crosstab query you MUST declare your parameters, and if any
other queries are used in the crosstab, their parameters must be
declared as well.

Open the query in design mode
Select Query > Parameters from the menu
Fill in the EXACT name of the parameter in column 1
Select the data type of the parameter in column 2
 
BlockNinja

Very cool, yes, that seems to do the trick for most of what I want to do, at
least for the dynamic querying part. Here's a sample of my pain... :)

PARAMETERS [Enter Net Investment Low Value] Currency,
    [Enter Net Investment High Value] Currency,
    [Column Name Start] Text ( 255 ),
    [Column Name End] Text ( 255 );
TRANSFORM NZ(Count(CURRENT_FLASH.NET_INVESTMENT),0) AS CountOfNET_INVESTMENT
SELECT CURRENT_FLASH.FILE_DATE, CURRENT_FLASH.SITE, CURRENT_FLASH.INDY,
    CURRENT_FLASH.STATUS_CD,
    IIf([CURRENT_FLASH].[CONT_DATE]<#1/1/2001#,"OLD_BANK","NEW_BANK") AS bank_flag
FROM CURRENT_FLASH
WHERE CURRENT_FLASH.NET_INVESTMENT Between [Enter Net Investment Low Value]
    And [Enter Net Investment High Value]
GROUP BY CURRENT_FLASH.FILE_DATE, CURRENT_FLASH.SITE, CURRENT_FLASH.INDY,
    CURRENT_FLASH.STATUS_CD,
    IIf([CURRENT_FLASH].[CONT_DATE]<#1/1/2001#,"OLD_BANK","NEW_BANK")
ORDER BY CURRENT_FLASH.FILE_DATE, CURRENT_FLASH.SITE, CURRENT_FLASH.INDY,
    CURRENT_FLASH.STATUS_CD,
    IIf([CURRENT_FLASH].[CONT_DATE]<#1/1/2001#,"OLD_BANK","NEW_BANK"),
    [Column Name Start] & Format([DLQ_CODE],"000") & [Column Name End]
PIVOT [Column Name Start] & Format([DLQ_CODE],"000") & [Column Name End];

Is there any way to change "NZ(Count(CURRENT_FLASH.NET_INVESTMENT),0) AS
CountOfNET_INVESTMENT" to a different expression without programming? (e.g.
NZ(Sum(CURRENT_FLASH.NET_INVESTMENT),0) AS SumOfNET_INVESTMENT)

If I could get BOTH the count and the sum in the same table that would be
awesome, but from what I understand of crosstabs they can't do more than one
value expression. For my purposes, I need to break that Net Investment down
by DLQ_CODE (which is 0, 31, 61, 91, ..., 361), and I need both count totals
and sum totals.
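
One common workaround (a sketch only, not tested against this schema) is to
save two crosstabs, one using Count and one using Sum, and join them on the
row-heading fields. The query names qry_CrosstabCount and qry_CrosstabSum are
assumptions, and [DLQ000] stands in for whichever column heading the dynamic
[Column Name Start] & Format([DLQ_CODE],"000") & [Column Name End] expression
actually produces:

```sql
SELECT c.FILE_DATE, c.SITE, c.INDY, c.STATUS_CD, c.bank_flag,
       c.[DLQ000] AS CountOf000,
       s.[DLQ000] AS SumOf000
FROM qry_CrosstabCount AS c
INNER JOIN qry_CrosstabSum AS s
    ON c.FILE_DATE = s.FILE_DATE
   AND c.SITE = s.SITE
   AND c.INDY = s.INDY
   AND c.STATUS_CD = s.STATUS_CD
   AND c.bank_flag = s.bank_flag;
```

Each dynamic pivot column would need its own pair of aliased fields in the
SELECT list, so this works best when the set of DLQ_CODE values is fixed.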
 
BlockNinja

I found a temporary fix for the size issue, though it's not the best in the
world. Essentially, when importing the linked table into a single table in my
database, instead of doing a straight append query I had to split it up: open
a select-query recordset that reads the values, then insert into the new
table one row at a time. The process is much slower and requires more coding
effort, but it keeps the database from reaching that 2GB limit so fast.

For anyone who might suggest DoCmd.RunSQL instead of db.Execute... tried
it... same results as the append query.
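
The row-at-a-time import described above might look something like this (a
sketch; the table names are assumptions, and it assumes the source and target
fields line up by position):

```vba
Dim db As DAO.Database
Dim rsIn As DAO.Recordset
Dim rsOut As DAO.Recordset
Dim i As Integer

Set db = CurrentDb()
Set rsIn = db.OpenRecordset("SELECT * FROM tbl_LinkedText")  ' assumed source
Set rsOut = db.OpenRecordset("tbl_Import", dbOpenTable)      ' assumed target

Do Until rsIn.EOF
    rsOut.AddNew
    ' Copy each field by ordinal position
    For i = 0 To rsIn.Fields.Count - 1
        rsOut.Fields(i).Value = rsIn.Fields(i).Value
    Next i
    rsOut.Update
    rsIn.MoveNext
Loop

rsOut.Close
rsIn.Close
```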
 
BlockNinja

That's not going to work in this case, because the limit could be hit WHILE
the process is running. A database has to be closed in order to compact it,
so if you closed and compacted the database mid-run, you would have to start
the process over from the beginning, and you'd be back to square one.
 
