MaxP
Hello,
Our database (an Access 2003 DB split into a front end and a back end) produces
error 3048 if I open too many (about 10) different forms at once. The application
is complex and uses lots of nested queries, and the forms are also quite complex.
Every reference in the code is set to Nothing before going out of scope, we
use a global handle to CurrentDb, and all recordsets are closed when no
longer needed.
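For what it's worth, this is the cleanup pattern I mean (a minimal sketch; the names db and rsOrders are invented for the example):

```vba
' Hypothetical illustration of the cleanup discipline described above.
Dim db As DAO.Database
Dim rsOrders As DAO.Recordset

Set db = CurrentDb()   ' one shared handle instead of calling CurrentDb repeatedly
Set rsOrders = db.OpenRecordset("SELECT * FROM Orders", dbOpenSnapshot)

' ... work with rsOrders ...

rsOrders.Close         ' release the table handles the recordset holds
Set rsOrders = Nothing
Set db = Nothing       ' release the reference; do NOT call db.Close on CurrentDb
```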
I searched the web for information on this topic, and all I found was
that there is a limit to the number of table handles Access can manage.
From all I read, there is no way to raise this limit, and the
common workarounds (using static tables for view results, reducing the
complexity of the forms, ...) cannot be applied, because we really need to see
all of this information at once. The information shown in the forms belongs
together; it makes no sense to close one form, open another, and then
go back to get the full picture.
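For readers who can live with it, the "static table" workaround mentioned above would look roughly like this (a sketch only; QueryNestedStats, tmpStats and frmStats are made-up names):

```vba
' Sketch of the "static tables for view results" workaround.
Dim db As DAO.Database
Set db = CurrentDb()

' Materialize the nested query once into a flat local table.
' (SELECT INTO fails if tmpStats already exists, so drop it first in real code.)
db.Execute "SELECT * INTO tmpStats FROM QueryNestedStats", dbFailOnError

' A form bound to the flat table costs far fewer handles than one bound
' to a deeply nested query, where every referenced table keeps a handle open.
DoCmd.OpenForm "frmStats"   ' frmStats.RecordSource = "tmpStats"

Set db = Nothing
```

In our case this doesn't help, because the data must stay live across all open forms.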
As I see no practicable solution other than migrating to SQL Server
with a newly developed client, one question remains that would be
very interesting to me:
WHY is there a limit on table handles? CPU and memory usage are always far
from their limits when this error occurs. Is this just a hardcoded number?
That doesn't make much sense from a developer's point of view!
Btw: I also tried putting the tables in a SQL Server Express DB, which worked fine
until I tried to read the data from my Access front end. As SQL Server Express couldn't
cope with the parallel usage, I got endless ODBC timeouts.
Using exactly the same front end and the same tables on a SQL Server 2000
Standard Edition, everything worked well, except that error 3048 was still there.
Please tell me that this is not just hardcoded; that would be like crippling
a healthy application for no reason!