Automating Macro Responses


Jason Lopez

I am pretty sure the subject line is making you scratch your head. So here
is my situation in more detail.

I have a form that, before loading, runs a macro that deletes a table and
then runs a make-table query that re-creates a table with the same name as
the one just deleted. The issue is that the macro asks me a series of
"Are you sure you want to do this?" questions. Is there any way to convert
the macro into code (which I know can be done) and blend in code that
automatically gives those responses when the prompts come up? The sequence
is always the same and so are the responses, since the fastest way to run
the queries for the reports and forms that rely on the table is to re-create
and refresh it each time the database loads or the form is opened.
Essentially, the sequence is like this:

1. Delete tbl_Query (prompt appears confirming that I want to do this)
2. Run qry_MakeNewTable (prompts appear to confirm the running of the query
and that it will write x rows to the new table)
3. Open Form

Any help in automating the responses to the prompts would be greatly
appreciated.
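
For what it's worth, here is roughly what I imagine the converted code would
look like (just a sketch using the object names above, with frm_MyForm
standing in for the real form name). I have seen DoCmd.SetWarnings mentioned
as a way to silence these confirmations, but I am not sure whether it is the
right approach or whether it has side effects:

Public Sub RefreshTableAndOpenForm()
    On Error GoTo ErrHandler

    DoCmd.SetWarnings False                    ' suppress the "Are you sure...?" prompts

    DoCmd.DeleteObject acTable, "tbl_Query"    ' step 1: drop the old table
    DoCmd.OpenQuery "qry_MakeNewTable"         ' step 2: rebuild it with the make-table query
    DoCmd.OpenForm "frm_MyForm"                ' step 3: open the form (frm_MyForm is a placeholder)

ExitHere:
    DoCmd.SetWarnings True                     ' always turn warnings back on
    Exit Sub

ErrHandler:
    MsgBox Err.Description, vbExclamation
    Resume ExitHere
End Sub

(I have also seen CurrentDb.Execute "qry_MakeNewTable", dbFailOnError
suggested as a way to run an action query with no prompts at all, but I have
not tried it.)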

Jason
 

Jeff Boyce

Jason

Why? As in "why do you need to delete and re-Make the table?"

Or in other words, what does deleting and re-Making allow you to do?

(I ask because there may be simpler alternate approaches...)

Regards

Jeff Boyce
Microsoft Office/Access MVP
 

Jason Lopez

I have a previous post where I was provided with a two-stage query for
calculating the number of days between the signature dates of those involved
in a particular project (see the post "Unique Query Dilema"). That query
calculates the days for all the completed jobs and (literally) takes forever.
There are over 10,000 completed projects that are being filtered and
evaluated against criteria chosen by whoever runs the report/form. Returning
just 200 filtered results takes about 15 minutes. My goal is to have the
table already created from the start, or created ahead of time by the user,
since the current instructions for that two-stage query are to run it and
then walk away from the computer. If they have to leave it running all night,
then so be it. But even then, as you run down the list, it stalls for a
moment while the results are displayed.

If there is a faster and easier alternative, I am all ears. I would like to
make this process quicker and simpler, as the users are not quite as
computer-savvy as even my limited capabilities. Please share, as I am willing
to try just about anything to make this work in the simplest manner.

Jason
 

Jeff Boyce

Jason

It all starts with the data ... and I don't have a very clear picture of
your table structure.

Queries can run faster or slower depending on a number of factors...

Is the data "local" or "linked"?

Are the fields used for selection criteria, joins and sorting all indexed?

How many fields are in the table(s)?

How many tables are in the query?

Does the query use any Access functions on data coming from linked sources
(e.g., SQL-Server)?

How many fields is the query trying to return?

How many records (10,000? -- that's nothing for Access to handle, assuming
the rest is in order)?

More info, please!

Regards

Jeff Boyce
Microsoft Office/Access MVP
 

Jason Lopez

The total number of records is in the 10,000s for just the project numbers.
But there are dates for tracking purposes associated with each project;
there could be anywhere from 1 to 20 dates per project. So the records for
the dates (which is the query-calculation part) are up around the 100,000s
by now. I don't know the exact count, except that it is a lot; if I try to
export it to just a CSV file, it easily reaches 30 MB. That is why I am
trying to automate the responses for deleting and creating tables and
writing records to a new table. I have no problem hitting numerous buttons
to get the data into a local table for faster calculations, but I am also
trying to keep end users from screwing up the data and/or not getting the
metric results they need to pull.

Here is what I received from another user who helped me with the date
calculation:

------
For the above to work you need two queries.

query1

SELECT signature.project, project.projectstartdate, signature.Signature,
       signature.Dept, signature.SigDate,
       (SELECT Count(project) + 1
        FROM signature AS sig
        WHERE sig.sigdate < signature.sigdate
          AND sig.project = signature.project) AS recordnum
FROM project INNER JOIN signature ON project.project = signature.project
ORDER BY signature.project, signature.SigDate;

and save this as signatures

Now, this query selects the information from the signature table and the
project table to get the project start date, and adds a sequence number
(recordnum) so you can compare the values to get the count of days between
signatures.

then the magic query

SELECT signatures.project, signatures.Signature, signatures.Dept, signatures.SigDate,
       IIf([sigdate] In (SELECT TOP 1 sig.sigdate
                         FROM signatures AS sig
                         WHERE sig.project = signatures.project),
           [sigdate] - [projectstartdate],
           (SELECT TOP 1 signatures.sigdate - sig.sigdate
            FROM signatures AS sig
            WHERE sig.project = signatures.project
              AND sig.recordnum = signatures.recordnum - 1)) AS TimeSign
FROM signatures;

save this as whatever

It takes the sequence numbers from the signatures query and compares them to
get you the result.

my raw test data

signature table
sigid  project  Signature   Dept     SigDate
1      1        John Do     Leg      1/01/2008
2      1        Jane Smith  control  3/01/2008
3      2        Jack Frost  control  3/01/2008
4      1        frank       leg      4/01/2008

Project table
project  projectstartdate
1        29/12/2007
2        30/12/2007

and the result

project  Signature   Dept     SigDate    TimeSign
1        John Do     Leg      1/01/2008  3
1        Jane Smith  control  3/01/2008  2
1        frank       leg      4/01/2008  1
2        Jack Frost  control  3/01/2008  4

As I said, if the above structure isn't yours, please tell me and I'll make
the modifications to compensate for it. All I need is the field holding the
project start date to be confirmed, and I can finalise the queries for you.

hope this helps
------

That's what my calculation queries are. They work, and work well. But they
also take a while to run, as they are hitting the remote source. When I ran
the same thing on a local copy, it ran much faster for the same filtered
date range. If I were to guesstimate the difference in time, the remote
source took about 10-15 minutes for a set of 200 records, while locally it
only took 3-5 minutes. I don't know if it was just the location of the
source data being accessed by the query, but the time difference was
noticeable.
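
If it helps, pulling the linked tables into local copies before running the
calculation queries could be done with something along these lines (a rough
sketch; the local_project and local_signature names are just placeholders):

Public Sub SnapshotRemoteTables()
    On Error Resume Next                         ' ignore "not found" if the local copies do not exist yet
    DoCmd.DeleteObject acTable, "local_project"
    DoCmd.DeleteObject acTable, "local_signature"
    On Error GoTo 0

    ' Make-table queries against the linked tables; dbFailOnError makes any
    ' failure raise a trappable error rather than silently discarding rows
    CurrentDb.Execute "SELECT * INTO local_project FROM project", dbFailOnError
    CurrentDb.Execute "SELECT * INTO local_signature FROM signature", dbFailOnError
End Sub

The calculation queries would then point at local_project and
local_signature instead of the linked tables.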

I hope this helps to clarify a little bit more of what I am looking for.

Jason
 

Jeff Boyce

Jason

Indexing?

Network speed?

What back-end is the data stored in?
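
On the indexing point: if the project and SigDate fields in the signature
table from your quoted queries are not indexed, that alone can make a big
difference for those correlated subqueries. For local Access tables an index
is a one-line DDL statement each (just a sketch; if the data lives in a
linked SQL Server back-end, the indexes would need to be created on the
server instead):

' Sketch only; run once, using the table/field names from the quoted queries
CurrentDb.Execute "CREATE INDEX idxSigProject ON signature (project)"
CurrentDb.Execute "CREATE INDEX idxSigDate ON signature (SigDate)"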

More info, please...

Regards

Jeff Boyce
Microsoft Office/Access MVP
