KGOldWolf
I am running comparative performance tests and have an odd result. I can
post the code but the outcome is a bit surprising.
I read 10k rows, splitting records to produce 22k output rows. In the process
I do a total of 100k VLOOKUPs (10k rows * 10 cells). When I built the 22k-row
output directly on the worksheet, it took 12 seconds. I then built the output
in an array instead and moved the entire array to the worksheet at the end of
the procedure; that reduced processing time to 10 seconds (roughly a 17%
reduction in elapsed time).
I then moved the "lookup table" to an array and the processing time jumped
to 60 seconds!
The questions are: 1) does that make sense to you? and 2) do some functions
(like VLOOKUP) become inefficient when used against arrays?
My guess is that I am doing something wrong, like not setting the array up as
a table (but I can't seem to figure out how to do that).
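To show the shape of the loop without posting the whole thing, here is a rough sketch in Python (all names and data are made up for illustration; the real code is VBA working against an Excel worksheet, and the dict below stands in for the lookup table):

```python
# Rough sketch of the process described above (hypothetical data;
# the real code is VBA running against an Excel worksheet).

# "Lookup table": maps a key to the value VLOOKUP would return.
lookup = {f"key{i}": i * 2 for i in range(1000)}

def split_record(row):
    """Each input row may split into one or more output rows."""
    key, count = row
    return [(key, n) for n in range(count)]

# 10k input rows; splitting yields more output rows than input rows.
input_rows = [(f"key{i % 1000}", 1 + i % 3) for i in range(10_000)]

output = []  # build the output in memory, not cell by cell
for row in input_rows:
    for key, n in split_record(row):
        # a dict lookup is O(1); scanning an unindexed array for every
        # lookup would be O(n), which is where time can balloon
        value = lookup[key]
        output.append((key, n, value))

# In Excel this corresponds to the single "move the whole array to
# the worksheet" step at the end of the procedure.
print(len(output))
```

The comment about O(1) vs O(n) is my guess at where the 60-second version is losing time, not something I've confirmed.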
I can post the code but I don't want to tie up a lot of board space.... any
general ideas on this?
Thanks,
Ken