Excel ODBC query tables memory overflow

G

Gregory

I discovered that when I recalculate more than 200 QueryTables on one sheet with this
simple code



Set qt_data = result_sheet.QueryTables.Add( _
    "ODBC;DSN=Excel Files;DBQ=" & ActiveWorkbook.Name & _
    ";DefaultDir=" & ActiveWorkbook.Path & _
    ";DriverId=790;MaxBufferSize=2048;PageTimeout=5;", _
    result_sheet.Range("AA1"), sql_str)

With qt_data
    .PreserveFormatting = True
    .FieldNames = False
    .BackgroundQuery = False
    .AdjustColumnWidth = False
    .RefreshStyle = xlOverwriteCells
    .Refresh
End With
qt_data.Delete
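One thing worth trying (a sketch, not a guaranteed fix - whether these leftovers are what holds on to the memory is an assumption): after `qt_data.Delete`, also release the VBA object reference and remove the `ExternalData_n` defined names that Excel creates for each query result range.

```vba
' Sketch: extra cleanup after each query, on the assumption that
' leftover object references and auto-created defined names are what
' keeps memory from being released.
Dim nm As Name

Set qt_data = Nothing        ' drop the VBA object reference
' Excel creates defined names like "ExternalData_1" for each query
' range; delete any that survive qt_data.Delete.
For Each nm In result_sheet.Names
    If nm.Name Like "*ExternalData*" Then nm.Delete
Next nm
```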



my EXCEL.exe process (Excel 2003) becomes very slow and "heavy" (~300-400 MB) and
"unstoppable" (the window closes, but the process never unloads)

Can you give me any advice?
 
G

Gregory

Is this advice? )))

I would much rather keep my algorithm (very flexible financial planning sheets
with my own special formulas), which is very portable.

Of course, I could transfer the data to a server (with RPC or some other technology),
recalculate there and transfer it back, but why should I have to do that?
 
A

Arvi Laanemets

Hi

Am I right that you are running >200 separate queries, with the source table(s)
in the same workbook on the same sheet? What do you want to achieve with this, I
wonder! Am I right in my suspicion that you calculate some summary values
from tables on other worksheets this way?

Anyway, I'm sure less resource-consuming solutions are available -
maybe you could describe what kind of original data you have, and what you
want to get out of it.
 
G

Gregory

Yes, you are right. I have some arrays of input data (foundries and
registries) received from other program sources for over 16 orgs in
our holding. The finance director then prepares a finance plan or other reports,
dynamically changing the input data and the report forms (for example, he describes
[2,3,7,8][7-9][2^10-90|2^1000-1143,2^1148-3120,2^3135-5999] as the profit of
orgs [2,3,7,8] in the 3rd quarter, and so on). Of course, sometimes he
works without internet access, which means a local implementation of the algorithm. If
I can't resolve the time/memory problem with Excel ODBC, I should try another
ODBC-compliant local data source, maybe dbf or mysqld. Most of all I
prefer the current implementation, because it works fine (once :))
 
A

Arvi Laanemets

Hi

I think an ODBC query is the wrong tool here. I myself use queries in Excel
applications frequently, but only to retrieve data from an external data source
when the user opens the workbook, to consolidate or split data returned by
external queries, or to consolidate some data from a table in the same
workbook (e.g. a mirror table with links to several workbooks, with gaps
between data blocks). As a rule I never use several queries on the same sheet. I
try to keep the number of queries low, and allow refreshing only on
open and manually.

In your case, better consider 2 options:
a) use formulas (SUM(), COUNT(), SUMIF(), COUNTIF(), SUMPRODUCT() etc.) to
calculate the summary values (all calculations are made automatically, but as
the number of formulas increases, the workbook gets slow);
b) create a procedure which calculates the summary sheet and is started
e.g. from a command button on the sheet, and maybe on the Open event too (the workbook
is much faster, but the user has to remember to refresh the summary sheet).
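Option (b) might be sketched roughly like this - the sheet names, ranges and the SUMIF logic are illustrative assumptions standing in for whatever the real procedure computes:

```vba
' Sketch of option (b): a single procedure recalculates the summary
' sheet; wire it to a command button, and optionally to Workbook_Open.
' Sheet names and the SUMIF example are illustrative assumptions.
Sub UpdateSummary()
    Dim wsSum As Worksheet, wsData As Worksheet
    Set wsSum = ThisWorkbook.Worksheets("Summary")
    Set wsData = ThisWorkbook.Worksheets("Data")

    Application.ScreenUpdating = False
    ' Example: total for the org listed in C2, written next to it.
    wsSum.Range("D2").Value = Application.WorksheetFunction.SumIf( _
        wsData.Range("A:A"), wsSum.Range("C2").Value, wsData.Range("B:B"))
    Application.ScreenUpdating = True
End Sub

' In the ThisWorkbook module, to refresh on open as well:
' Private Sub Workbook_Open()
'     UpdateSummary
' End Sub
```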


--
Arvi Laanemets
( My real mail address: arvil<at>tarkon.ee )



 
G

Gregory

Thanks to ALL!

Of course, I also see the downsides of that decision; maybe sometimes I can
use SPREADSHEET FUNCTIONS (like SUMPRODUCT), but when I want to allocate a
group of costs in a specified range over all orgs by criteria... that is
work only for SQL or a VBA macro. The main problem is: WHY ISN'T MEMORY CLEARED
AFTER DESTROYING THE QUERYTABLE? I think that IF THERE IS NO ESCAPE FROM THAT
PROBLEM, then maybe IT'S ONE OF THE EXCEL DEVELOPERS' MISTAKES.




"Tim Williams" <saxifrax@pacbell*dot*net> ÓÏÏÂÝÉÌ/ÓÏÏÂÝÉÌÁ × ÎÏ×ÏÓÔÑÈ
ÓÌÅÄÕÀÝÅÅ: Don't do it?

Tim

 
A

Arvi Laanemets

Hi



A tip for using spreadsheet functions for such tasks. Suppose you have a list of
orgs on the summary sheet. Select an org from the list (activate its cell). You can
define a dynamic named range/formula which calculates the wanted summary value
for this org - with a cell reference to the same cell in the form $C#. Mostly it's
possible to define such a formula so that when you select another row, the
value for that row is calculated, etc. Now into the adjacent column you can
enter something like
=YourNamedFormula
and you get the wanted value calculated for all orgs.
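One way to set such a name up from VBA (a sketch; the sheet layout, the name OrgTotal and the SUMIF formula are assumptions - the key point is the row-relative reference $C2, created while a cell in row 2 is active, which makes the name follow whichever row it is entered on):

```vba
' Sketch: define the dynamic name with a row-relative reference ($C2).
' Because the name is defined while D2 is the active cell, entering
' =OrgTotal in D10 evaluates it against $C10, and so on down the column.
Sub DefineOrgTotal()
    ThisWorkbook.Worksheets("Summary").Activate
    Range("D2").Select
    ThisWorkbook.Names.Add Name:="OrgTotal", _
        RefersTo:="=SUMIF(Data!$A:$A,Summary!$C2,Data!$B:$B)"
End Sub
```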

And have you tried using a Pivot Table? From what you said, it looks like a
solution for you.



I don't know how MS Query handles memory after a querytable is destroyed, but
I wouldn't advise having any illusions there. Excel itself manages computer
resources extremely badly - it simply grabs everything available, whether it can
use it or not, especially when running ODBC.
And another example: I started with database design in dBase, and continued
with FoxBase and then FoxPro. When I started to work with MS Office some years
ago, I was surprised to find that queries like
SELECT DISTINCT * FROM table1 WHERE field [NOT] IN (SELECT field FROM table2
WHERE condition=True)
which in FoxPro would be ready in a couple of seconds, needed minutes in
Access. It looks like SQL in MS Office is not optimized properly. Now I
avoid complex queries whenever possible; e.g. instead of the query
above I use a construct like
SELECT DISTINCT * FROM table1 WHERE field [NOT] IN queryX
where
queryX = SELECT field FROM table2 WHERE condition=True
 
G

Gregory

THANKS ALL.

Let's stop.

I see that I can't create >200 query tables on one sheet. That's all. I
have created SO MANY user input forms based on Excel (because it is "usable"
for users), data from which is transferred by XML-RPC to a web server, where it is
stored and prepared for reports or other analytics. We can also transfer all the
patterns created by my director and receive the data back in a single query table.
It will work well, and much faster than Excel.

THANKS ALL.


 
G

Gregory

Hi ALL!

My problem has a STUPIDLY SIMPLE solution - if I close the data source sheets (the
ones the SELECTs are built from), ALL IS OK - the memory state doesn't change.
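That workaround might look like this - a sketch assuming the source data lives in a separate workbook file; the "Excel Files" ODBC driver reads the .xls file from disk, so the workbook does not need to be open while the queries run. The path and sheet name are illustrative assumptions.

```vba
' Sketch of the workaround: keep the data source workbook closed while
' the queries run; the ODBC Excel driver reads the file from disk.
' SOURCE_PATH and the "Data"/"Summary" sheet names are assumptions.
Sub RecountWithClosedSource()
    Const SOURCE_PATH As String = "C:\plans\source.xls"
    Dim qt As QueryTable

    ' If the source workbook happens to be open, close it first.
    On Error Resume Next
    Workbooks("source.xls").Close SaveChanges:=False
    On Error GoTo 0

    Set qt = ThisWorkbook.Worksheets("Summary").QueryTables.Add( _
        "ODBC;DSN=Excel Files;DBQ=" & SOURCE_PATH & _
        ";DriverId=790;MaxBufferSize=2048;PageTimeout=5;", _
        ThisWorkbook.Worksheets("Summary").Range("AA1"), _
        "SELECT * FROM [Data$]")
    qt.BackgroundQuery = False
    qt.Refresh
    qt.Delete
    Set qt = Nothing
End Sub
```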
 
