Break large table into smaller tables


Andrew

Hi,

I have a large table (Tbl) that contains 450,000 records and 200 fields.
There are 5 departments in it (field Dept) and I want to create 5 tables from
it.

I have been able to do this using the following code:

Private Sub Btn_Produce_Commitment_Tables_Part_1()
    On Error Resume Next
    Dim dbs As DAO.Database
    Dim rst As DAO.Recordset
    Dim sSQL As String

    DoCmd.SetWarnings False
    Set dbs = CurrentDb
    Set rst = dbs.OpenRecordset("SELECT * FROM Qry_Summary_Of_Departments")
    With rst
        Do Until .EOF
            ' Make-table query: copy this department's rows into its own table
            sSQL = "SELECT Tbl_Range_By_Season_Data.* INTO Tbl_Commitments_Data_" & !Dept & " "
            sSQL = sSQL & "FROM Tbl_Range_By_Season_Data "
            sSQL = sSQL & "WHERE (((Tbl_Range_By_Season_Data.Dept)=" & !Dept & "));"
            DoCmd.RunSQL sSQL
            DoCmd.OpenQuery "Qry_Make_Table_Tbl_Summary_Of_Business_Groups_And_Depts"
            ' Call Btn_Produce_Commitment_Tables_Part_2
            .MoveNext
        Loop
        .Close
    End With
    ' sSQL = "DELETE Tbl_Range_By_Season_Data_Temp.* FROM Tbl_Range_By_Season_Data_Temp;"
    ' sSQL = "DELETE Tbl_Summary_Of_Business_Groups_And_Depts.* FROM Tbl_Summary_Of_Business_Groups_And_Depts;"
    DoCmd.SetWarnings True   ' restore warnings turned off above
    Set rst = Nothing
    Set dbs = Nothing
End Sub

This process takes quite some time to run, as I THINK it needs to open the
rst each time and then check and write each record individually. It takes 15
minutes to do each department.

Is there any way to speed it up? If it was Excel I'd sort it by department,
find the first and last occurrence of each department, then highlight the
range and copy and paste it into a new sheet. Is there an equivalent method
available?
 

John W. Vinson

Is there any way to speed it up? If it was Excel I'd sort it by department,
find the first and last occurrence of each department, then highlight the
range and copy and paste it into a new sheet. Is there an equivalent method
available?

No copy and paste is needed - a simple Append query will work.
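
A minimal sketch of that approach, assuming a destination table already
exists with the right structure (Tbl_Commitments_Data_10 and department 10
here are made-up values, not from Andrew's code). Using CurrentDb.Execute
with dbFailOnError instead of DoCmd.RunSQL also avoids the SetWarnings
toggling and raises a trappable error if the append fails:

Dim sSQL As String
' Append one department's rows into its pre-built table
sSQL = "INSERT INTO Tbl_Commitments_Data_10 " & _
       "SELECT Tbl_Range_By_Season_Data.* " & _
       "FROM Tbl_Range_By_Season_Data " & _
       "WHERE Tbl_Range_By_Season_Data.Dept = 10;"
CurrentDb.Execute sSQL, dbFailOnError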

BUT... you're breaking it the wrong way!!!!

Breaking an enormously wide 200 field table into five identically structured
enormously wide 200 field tables is simply *wrong*. It's making a bad
situation worse!

You're almost surely "committing spreadsheet" - a misdemeanor punishable by
being sentenced to read about normalization:

Jeff Conrad's resources page:
http://www.accessmvp.com/JConrad/accessjunkie/resources.html

The Access Web resources page:
http://www.mvps.org/access/resources/index.html

A free tutorial written by Crystal (MS Access MVP):
http://allenbrowne.com/casu-22.html

MVP Allen Browne's tutorials:
http://allenbrowne.com/links.html#Tutorials

It's all but certain that you have several sets of repeating fields in the
table. Could you perhaps post some representative fieldnames? Someone should
be able to suggest a better normalized design.
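
Purely as a hypothetical illustration (the real field names are unknown): if
the wide table has repeating groups such as Qty_Season1, Qty_Season2, and so
on, a normalized design stores one row per department/season in a tall table
instead, e.g.:

' Hypothetical tall table replacing repeating Qty_SeasonN columns
Dim sSQL As String
sSQL = "CREATE TABLE Tbl_Commitments (" & _
       "CommitmentID AUTOINCREMENT CONSTRAINT pkCommitment PRIMARY KEY, " & _
       "Dept LONG, Season LONG, Qty CURRENCY);"
CurrentDb.Execute sSQL, dbFailOnError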

John W. Vinson [MVP]
 

Albert D. Kallal

Sorting and cut-and-paste should not be faster than your process....

About the only thing I could suggest is that you make sure there is an index
on the Dept field in the table.
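
For example, a one-off DDL statement does it (table and field names as posted
in the thread; the index name is my own):

' Index the Dept field so the make-table WHERE clause can use it
CurrentDb.Execute "CREATE INDEX idx_Dept ON Tbl_Range_By_Season_Data (Dept);", dbFailOnError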

The recordset problem is not an issue, since you're only working with 5
records in that recordset... right?
 

Andrew

Hi John,

Guilty as charged.


I've had a quick look through the sites you've recommended - plenty to take
in - THANKS.

To provide you with a little more background: I have a big table as described
earlier. I have it indexed, but every time I query it it takes 15+ minutes.
End users will be accessing the data via Excel. I have written a macro in
Excel to prompt the users for the department number, which then extracts ALL
the records matching the criteria. They aren't prepared to wait 15+ minutes,
so I was going to create smaller departmental tables so they'd be reading and
accessing their data only. I'd do this early in the morning so the data was
available for them when they arrived later that morning.
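
For context, a rough sketch of the kind of Excel macro described - the
database path, table name, and sheet range are placeholders, not Andrew's
actual code:

Sub Pull_Department_Data()
    ' Late-bound ADO so no reference needs to be set
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\SomeFolder\SomeDatabase.mdb"   ' placeholder path
    Set rs = cn.Execute("SELECT * FROM Tbl_Commitments_Data_10")   ' placeholder table
    Worksheets(1).Range("A2").CopyFromRecordset rs   ' dump the records starting at A2
    rs.Close
    cn.Close
End Sub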

A question for you - does the number of fields have a substantial impact on
the time taken to perform such a query? I thought solely the number of
records would determine it, but obviously it is a combination of both.

Can you please answer this question? In the meantime I'm going to trim the
number of fields down significantly, to around 20.

Andrew Buchter
 

John W. Vinson

A question for you - does the number of fields have a substantial impact on
the time taken to perform such a query? I thought solely the number of
records would determine it, but obviously it is a combination of both.

Absolutely, especially if it's over a network. Retrieving 200 fields just
plain takes longer than retrieving 20.
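
For instance, instead of SELECT *, pull only the handful of columns the users
actually need (the field names and department value here are hypothetical):

Dim rst As DAO.Recordset
' Fetch three columns rather than all 200
Set rst = CurrentDb.OpenRecordset( _
    "SELECT Dept, Season, Qty " & _
    "FROM Tbl_Range_By_Season_Data " & _
    "WHERE Dept = 10;")   ' 10 is a placeholder department number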

I'm not sure what your query is doing, but it's possible that further indexing
(on the department field in particular, but probably other fields as well -
any used for searching or sorting) will help.

John W. Vinson [MVP]
 
