Philip
Hi,
We're using the ODBC Text driver and the ODBC Excel driver to retrieve data
in several VBA procedures.
In each case the data is retrieved into an ADO Recordset, then copied to
different worksheets using Excel's CopyFromRecordset method; the Recordset
(and its connection) is then explicitly closed and destroyed within the same
procedure.
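For context, each of these procedures follows roughly this pattern (a
simplified sketch: the connection string, paths, sheet and field names below
are made up for illustration, not our real ones):

Sub ImportMatchingRecords()
    ' Sketch only: names and connection details are placeholders.
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "Driver={Microsoft Text Driver (*.txt; *.csv)};" & _
            "Dbq=\\server\share\imports\;"

    Set rs = New ADODB.Recordset
    rs.Open "SELECT * FROM [data.txt] WHERE TransDate = {d '2003-06-30'}", _
            cn, adOpenForwardOnly, adLockReadOnly

    ' Dump the matching records onto the target sheet in one go
    Worksheets("Matched").Range("A2").CopyFromRecordset rs

    ' Explicit clean-up at the end of the procedure
    rs.Close
    Set rs = Nothing
    cn.Close
    Set cn = Nothing
End Sub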
Every data import leads to a memory leak.
For example, in step 1 a source text file on the LAN is opened as an ODBC
table into an ADO Recordset, with a WHERE clause restricting records to the
acceptable date. The matching data is put in one worksheet, that recordset is
closed, and another procedure runs to import the exceptions (records where
the date field does not match). Finally the workbook is saved, and the ODBC
Excel driver is then used to read aggregate data from the first worksheet
into a third sheet.
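The read-back at the end of that step is along these lines (again only a
sketch; the connection string, sheet and column names are placeholders):

Sub ReadAggregatesFromSheet()
    ' Sketch only: the workbook must already be saved, and the sheet and
    ' column names here are placeholders.
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "Driver={Microsoft Excel Driver (*.xls)};" & _
            "Dbq=" & ThisWorkbook.FullName & ";"

    Set rs = New ADODB.Recordset
    rs.Open "SELECT Region, SUM(Amount) AS Total " & _
            "FROM [Matched$] GROUP BY Region", _
            cn, adOpenForwardOnly, adLockReadOnly

    Worksheets("Summary").Range("A2").CopyFromRecordset rs

    rs.Close: Set rs = Nothing
    cn.Close: Set cn = Nothing
End Sub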
This step (really three procedures) takes the memory used by Excel (as shown
in Task Manager) from 33 MB to 48 MB, and yet only a few records are actually
imported (around 10!).
There are several other steps like this, all of which do essentially the
same tasks: either importing matching data from text files and manipulating
it record by record, or reading from worksheets using UNION SQL queries
(which include formulae!) to perform complex data manipulations...
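The record-by-record manipulation is just an ordinary loop over the
recordset, something like this (field and sheet names invented for the
example):

Sub WriteExceptions(rs As ADODB.Recordset, ws As Worksheet)
    ' Sketch only: copies each exception record to the target sheet,
    ' one row at a time. Field names are placeholders.
    Dim r As Long
    r = 2
    Do While Not rs.EOF
        ws.Cells(r, 1).Value = rs.Fields("AcctNo").Value
        ws.Cells(r, 2).Value = rs.Fields("TransDate").Value
        r = r + 1
        rs.MoveNext
    Loop
End Sub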
Of course, I know that something this complex should be a compiled
application, but we had no time to build an app, only a macro (usual story!).
And we had to use SQL queries because of the complex nature of the data
manipulations... for example, we had to import data, then return two rows
into another sheet from one record, which means a UNION SQL query.
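To show what I mean, that kind of query looks broadly like this (column and
sheet names invented; UNION ALL produces two output rows for every input
record):

' Sketch only: splits each source record into a debit row and a credit row.
Dim sql As String
sql = "SELECT AcctNo, DebitAmount AS Amount, 'DR' AS Side FROM [Input$] " & _
      "UNION ALL " & _
      "SELECT AcctNo, CreditAmount AS Amount, 'CR' AS Side FROM [Input$]"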
Can anyone suggest a way of nailing down the memory leaks, please, or how I
could ensure that the memory is released? Is there an API method I could use
to force Excel to release the memory?
Thanks for any help or assistance! I know it's a big one.
Philip