Does Access 2000 support a dual-processor environment?

Douglas J Steele

Depends on what you mean by "support".

It'll work in a dual-processor environment, but I don't believe it'll take any
advantage of the fact that there are two processors.
 
Albert D.Kallal

computer said:
Does Access 2000 support a dual-processor environment?

Well, the operating system does. So, if you have two programs running, and
one of them is Access 2000, then each program can run on its own processor.
At that point the two programs together would get roughly double the
processing power.

However, if you are asking whether ms-access itself takes advantage of a dual
processor...the answer is no.

On the other hand...why would upping the processing power by a factor of 10
make ANY difference in ms-access? ms-access has not been processor bound, in
terms of hardware, for at least 5 years. In other words, taking a machine and
multiplying its processing power by a factor of 2, or 5, or even 10 will NOT
make ANY difference in the speed of the application.

The bottleneck for ms-access is of course the speed of the disk drive, or of
the network (if you are sharing the application). Since increasing the
processing power does little or nothing for the disk i/o, then as a general
rule more processing power does NOT change the performance of ms-access at
all on good modern hardware.

So, even if ms-access could use both processors...it would make no
difference here. ms-access can easily execute MILLIONS of instructions in
one second....and I can't recall that ever being much of a problem.

Really, since ms-access is not processor bound...a second processor will not
gain you much.

Hmm...why do you ask?
 
computer

Douglas,

In a scenario with a very large Access db that is very slow to process on a
P4 2.8 GHz, would a dual-processor 2 GHz Xeon be faster?
 
computer

Albert,

The reason I am asking is that I am looking for a way to make the Access db
process quicker.
 
Albert D.Kallal

computer said:
Douglas,

In a scenario with a very large Access db that is very slow to process on a
P4 2.8 GHz, would a dual-processor 2 GHz Xeon be faster?

No...see my other post. Ms-access has not been processor bound for years,
and can execute millions of instructions in a second...
 
Albert D.Kallal

computer said:
Albert,

The reason I am asking is that I am looking for a way to make the Access db
process quicker.

Then you have to look at your design, and reduce the amount of i/o.

For example, opening up a recordset can take a VERY long time. Think of
opening up a recordset much like using a helicopter to cross the street. The
amount of time it takes for the blades to "spin up" to speed and for the
helicopter to get airborne is MUCH more than the time it takes for a person
to simply "walk" across the same street.

In fact, ms-access can pull in about 100,000 records in the time it TAKES to
open a recordset. What this means is that while ms-access has all the
processing power it needs, when it goes out to the operating system and has
to WAIT for the system to open a table, that is a huge delay.

The logical conclusion from the above is that if you have a processing
loop, and INSIDE this loop you have to open and create a recordset for each
iteration of the loop, then you have killed performance (see the sketch
below).

So, to get your performance up to speed, you need to address the i/o issue,
and reduce it at every possible point in your system.
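
To make that concrete, here is a minimal VBA (DAO) sketch of the slow and
the fast approach. The table and field names (Products, OrderDetails,
ProductID, Qty) are made up for illustration:

    ' SLOW: opening a recordset INSIDE the loop pays the "helicopter
    ' spin up" cost once per product.
    Dim db As DAO.Database
    Dim rsProd As DAO.Recordset
    Dim rsTot As DAO.Recordset

    Set db = CurrentDb()
    Set rsProd = db.OpenRecordset("SELECT ProductID FROM Products")

    Do While Not rsProd.EOF
        ' A brand-new recordset is opened for every single product.
        Set rsTot = db.OpenRecordset( _
            "SELECT Sum(Qty) AS TotQty FROM OrderDetails " & _
            "WHERE ProductID = " & rsProd!ProductID)
        Debug.Print rsProd!ProductID, Nz(rsTot!TotQty, 0)
        rsTot.Close
        rsProd.MoveNext
    Loop
    rsProd.Close

    ' FAST: ONE recordset, one trip out to the system. The grouping
    ' and totalling all happen in a single sql statement.
    Set rsTot = db.OpenRecordset( _
        "SELECT ProductID, Sum(Qty) AS TotQty " & _
        "FROM OrderDetails GROUP BY ProductID")
    Do While Not rsTot.EOF
        Debug.Print rsTot!ProductID, rsTot!TotQty
        rsTot.MoveNext
    Loop
    rsTot.Close

With 500 products, the slow version pays the recordset-open cost 500 extra
times; the fast version opens a single recordset and lets the database
engine do the work.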

You mention that you have a large file...it might help to mention the number
of records.

Let's assume a typical products database where we want to keep
inventory...

Let's assume 500 products in the products table = 500 records.

Let's assume that we had 5 stock-taking days this year, where we added stock
to EACH product (so each stock item was re-stocked 5 times during the year;
note that the given sample design allows for price changes as new stock
arrives).

That now means that our Inventory table has 5 * 500 = 2500 records.

Let's also assume that each Inventory item has, on average, 50 orders in the
invoice (order details) table.


That now means our Order Details table has 50 * 2500 = 125,000 records.


So, what we want to do is calculate quantity on hand.

So, we have 125,000 detail records and 2500 inventory items (and 500
products).

Remember, in a modern system we do NOT store the quantity on hand, but must
calculate it on the fly.

The beauty of this approach is that I can then simply delete, add, or
modify records, and the inventory totals are always 100% correct.


Further, I can just paint away with the forms designer and build sub-forms
etc. for invoice details where users can enter the quantity ordered. Again,
no special code is needed to update the inventory stocks, since I
*CALCULATE* them on the fly when needed.

That means the sql has to join up all the records that belong to each
product, sum them, and then subtract the quantities in the
invoice details.
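
As a rough sketch of what such a query can look like (the field names QtyIn
and Qty are assumptions, and for simplicity this joins both tables straight
to the product rather than through the individual stock records; Nz() treats
products with no rows as zero):

    SELECT p.ProductID,
           Nz(i.TotIn, 0) - Nz(d.TotOut, 0) AS QtyOnHand
    FROM (Products AS p
      LEFT JOIN (SELECT ProductID, Sum(QtyIn) AS TotIn
                 FROM Inventory GROUP BY ProductID) AS i
      ON p.ProductID = i.ProductID)
      LEFT JOIN (SELECT ProductID, Sum(Qty) AS TotOut
                 FROM OrderDetails GROUP BY ProductID) AS d
      ON p.ProductID = d.ProductID;

The database engine does all of the joining, summing, and subtracting in one
pass, with no per-record back and forth.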

So, how long does it take to total up all of the in-stock records (that is,
to run the sql statements that total all Inventory additions less all those
125,000 order details to come up with a total for EACH product)?


On an average pc today, ms-access will total up and generate the quantity on
hand across those 125,000 details in LESS than 1 second (this time includes
the matching, adding, and subtracting of all orders in the system).

So, processing 125,000 records is VERY fast....


How large are the datasets you are working with?
 
computer

Thanks Albert, I think I have enough info now.

 
