nova
I have been reading a few interesting threads about DAO versus ADO with regard
to speed, because I wanted to find out why my uploads were going so slowly.
Here's what I want to do: I have 3 big tables that I want to fill with data
for a number of users to use in their own queries. I'm talking about tables
with up to 10 million records (database up to 8 GB). I'm using VBA code to
read text files and load them into the tables. This is supposed to be a
one-time operation. After that, I will do daily updates of around 10 to 20
thousand records per day.
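To give you an idea, my load loop currently looks roughly like this (server, database, table and field names are made up, and the real code handles more columns):

```vba
' Rough sketch of my current ADO load loop (one AddNew/Update per record).
Dim cn As ADODB.Connection
Dim rs As ADODB.Recordset
Dim f As Integer
Dim txt As String
Dim parts() As String

Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
        "Initial Catalog=MyDb;Integrated Security=SSPI;"

Set rs = New ADODB.Recordset
rs.Open "tblImport", cn, adOpenKeyset, adLockOptimistic, adCmdTableDirect

f = FreeFile
Open "C:\data\file001.txt" For Input As #f
Do While Not EOF(f)
    Line Input #f, txt
    parts = Split(txt, ";")
    rs.AddNew
    rs!Field1 = parts(0)
    rs!Field2 = parts(1)
    rs.Update   ' one round trip to the server per record
Loop
Close #f

rs.Close
cn.Close
```

I suspect the per-record round trip is part of the slowness, but I don't know enough about SQL Server to be sure.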
Now, because of the large amount of data (60 files, up to 200 MB each
month), I cannot have this database on our network server. Since our company
has an SQL Server, I wanted to experiment with that (I'm not familiar with
SQL Server). Because I'm only used to working in Access and DAO, I had to
convert my code to ADO, which went fine after checking a few newsgroups.
I currently have the SQL tables linked in an Access database using ODBC. But
it takes a very long time to update these tables (approx. 25 min. per file).
Therefore I'm now trying to load the data into local tables in order to move
these to the SQL Server later. So far everything works fine and the upload
time is acceptable (less than 1 min. per file).
What I want to know now is whether the slow connection with the SQL Server
is going to be a problem for the end users when they link these tables in
their own databases to execute their queries. Is there an alternative to
ODBC? Or isn't ODBC the cause of the slow speed? What else can I do to
easily and (especially) quickly query these SQL tables? You might need to
know that the users must be able to filter the tables (actually, I still
have to create the SQL 'views') on all possible fields. Should I put an
index on each field then?
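For the views and indexes, this is the kind of thing I had in mind, executed from VBA over ADO (again, all object names here are just examples):

```vba
' Sketch of creating a view for the users plus an index on one of the
' fields they will filter on. Names are placeholders, not my real schema.
Dim cn As ADODB.Connection
Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
        "Initial Catalog=MyDb;Integrated Security=SSPI;"

' A view the users would link instead of the raw table.
cn.Execute "CREATE VIEW vwSales AS " & _
           "SELECT Field1, Field2, SaleDate FROM tblSales"

' An index on one of the filterable fields.
cn.Execute "CREATE INDEX IX_tblSales_SaleDate ON tblSales (SaleDate)"

cn.Close
```

My worry is that if I do this for every field on a 10-million-record table, the indexes themselves will slow down my daily inserts. Is that a real concern?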