Clarifications....
That is 0.040 seconds (really 0.037 sec) to recalculate 3000 formulas
already entered into cells. It takes about 0.090 sec to copy-and-paste.
I'm not sure whether you are measuring copy-and-paste time or simply
recalculation time. I will use recalculation times below.
Both numbers are with ScreenUpdating=True (normal mode). The times are about
0.026 sec and 0.070 sec respectively with ScreenUpdating=False. That is useful
only if you want to put the operation into a VBA macro. It might make only
a small (imperceptible?) difference in your case. I will leave
ScreenUpdating=True for my times below.
Also, I should note that all times are for a worst-case (longest) lookup.
But I'm not sure that makes a significant difference: because
VLOOKUP(...,TRUE) performs a binary search, there are at most 20 (or 21?)
comparisons even when the lookup column comprises all 1,048,576 rows
(2^20 = 1,048,576).
In any case, since you are using Excel 2003 and you indicate that the lookup
data comprises only 450 rows, I will use that scenario, since file size can
significantly affect the time of external (closed-file) references.
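For concreteness, I assume your 3000 formulas have roughly the following
shape (purely illustrative: the column indexes and the concatenation are my
guesses, since each of your 3000 formulas apparently contains two VLOOKUPs):

B1:  =VLOOKUP($A1,Stations,2,TRUE) & VLOOKUP($A1,Stations,3,TRUE)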
Finally, I reiterate that YMMV depending on the complexity of the workbook
besides the formulas and data you tell us about, as well as on your
computer's characteristics.
Walter Briscoe said:
stations is a reference to a worksheet in a closed workbook.
Stations refers to =[stations.xls]Sheet1!$A$1:$CB$427.
I can't remember how [stations.xls] references
C:\Users\IBM\AppData\Roaming\Microsoft\Excel\stations.xls
That is quite a significant "little detail" to omit! That makes a __huge__
difference in the VLOOKUP time, especially if stations.xls is closed. (Not
recommended.)
If stations.xls is closed in the VLOOKUP Excel instance, 3000 statements
(6000 VLOOKUPs) take 31 to 33 sec. When stations.xls is open in the
VLOOKUP Excel instance, it takes about 0.033 sec.
So the first major performance improvement would be to ensure that
stations.xls is open in the same Excel instance.
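If you want to automate that, a minimal sketch might look like the following
(it uses the path you posted; adjust as needed):

Sub openStations()
    ' Open stations.xls read-only in this Excel instance, if it is
    ' not already open, so the VLOOKUPs reference an open workbook
    ' instead of a closed file
    Dim wb As Workbook
    On Error Resume Next
    Set wb = Workbooks("stations.xls")
    On Error GoTo 0
    If wb Is Nothing Then
        Workbooks.Open _
            Filename:="C:\Users\IBM\AppData\Roaming\Microsoft\Excel\stations.xls", _
            ReadOnly:=True
    End If
End Sub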
Note: My "data.xls" file contains only A1:CB450 data (constants). If your
stations.xls has a lot of other data, even constants, that might
significantly inflate the file size, which might increase the closed-file
VLOOKUP reference time. That might explain why my "long" time is about
33 sec, whereas yours is about 13 min.
Walter Briscoe said:
If I open that workbook, my 13:25 becomes ~0:10
Ding!
Much ado about nothing. You had already come to the right conclusion
yourself.
FYI, that explains why the reference is =[stations.xls]Sheet1!$A$1:$CB$427
instead of
=[C:\Users\IBM\AppData\Roaming\Microsoft\Excel\stations.xls]Sheet1!$A$1:$CB$427.
Excel automagically adjusts the reference when you open the file in the same
Excel instance.
Walter Briscoe said:
How do you measure 0.04 seconds? I am using my wrist watch.
For long-ish times, i.e. a lot more than 15.625 milliseconds, you could do
something like the following:
Sub doit()
    Dim t As Double
    Dim st As Single, et As Single    ' type of Timer
    st = Timer
    Range("b1:b3000").Dirty           ' range with VLOOKUP formulas
    et = Timer
    t = et - st                       ' seconds
    MsgBox Format(t, "0.000000")
End Sub
Caveats....
The elapsed time might be off by as much as almost 15.625 msec (0.015625
sec), because Timer is updated only about 64 times per second on Windows.
Performance measurement and interpretation is part art as well as science.
For example, the time for Range("b1").Dirty is misleading because it
includes significant overhead due to "communication" between Excel and VBA.
The time for Range("b1:b3000").Dirty overwhelms that overhead, so the
overhead becomes insignificant (we hope). However, it opens the door for a
lot of extraneous computer activity (other events). On the other hand,
arguably that is realistic. It depends on what exactly we want to measure.
Another issue is whether or not to disable ScreenUpdating and Automatic
recalculation. Again, it depends on what exactly we want to measure. In
this case, I purposely keep those features enabled in order to mimic the
"real world" situation.
But if we do disable Automatic recalculation, we must use Range.Calculate
instead of Range.Dirty, because in Manual mode, dirtied cells are not
recalculated until the next recalculation is requested.
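For example, a manual-calculation variant of the macro above (same assumed
B1:B3000 range):

Sub doitManual()
    Dim st As Single, et As Single
    Application.Calculation = xlCalculationManual
    st = Timer
    Range("b1:b3000").Calculate    ' range with VLOOKUP formulas
    et = Timer
    Application.Calculation = xlCalculationAutomatic    ' restore; assumes it was Automatic
    MsgBox Format(et - st, "0.000000")
End Sub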
Either way can result in significant variability of time measurements.
Moreover, I have found that there is significant VBA overhead the first time
a code path is executed after any change. And some overhead arises each
time a macro is executed. So I routinely ignore the first execution of a
code path. For "serious" measurements, I collect a "sufficient" sampling
for statistical analysis.
Finally, usually I use a timer with greater resolution. Besides avoiding
the off-by-16-msec anomaly, it is useful for measuring small intervals (if
we can control overhead). Here are the macros that I use.
Option Explicit

' Win32 high-resolution timer APIs (64-bit Excel requires Declare PtrSafe)
Public Declare Sub Sleep Lib "kernel32" (ByVal msec As Long)
Public Declare Function QueryPerformanceFrequency Lib "kernel32" _
    (ByRef freq As Currency) As Long
Public Declare Function QueryPerformanceCounter Lib "kernel32" _
    (ByRef cnt As Currency) As Long

Private freq As Currency, df As Double

Function myTimer() As Currency
    ' snapshot of the high-resolution performance counter
    QueryPerformanceCounter myTimer
End Function

Function myElapsedTime(ByVal dt As Currency) As Double
    ' convert a counter difference to seconds
    If freq = 0 Then QueryPerformanceFrequency freq: df = freq
    myElapsedTime = dt / df
End Function
Usage:
Dim t As Double
Dim st As Currency, et As Currency
st = myTimer
' ... activity to measure ...
et = myTimer
t = myElapsedTime(et - st)    ' seconds
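Putting it together with the earlier example (same assumed B1:B3000 range):

Sub doitHiRes()
    Dim st As Currency, et As Currency
    st = myTimer
    Range("b1:b3000").Dirty    ' range with VLOOKUP formulas
    et = myTimer
    MsgBox Format(myElapsedTime(et - st), "0.000000")
End Sub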