ExcelMonkey
I have a spreadsheet which is 24MB! Yikes! Anyway, I have been given the
task of reducing its size and increasing its speed. Now I know one of the
first things I should do is go to each worksheet and press Ctrl+End to find
out what the used range is. From there I can delete unused cells and
unneeded formatting.
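As a first pass, a rough VBA sketch of that used-range trim might look like
the following (it finds the last cell that actually contains something,
deletes everything beyond it, and saves so Excel rebuilds the used range;
test on a copy first):

Sub TrimUsedRanges()
    ' Delete all rows/columns past the last cell with real content,
    ' then save so Excel rebuilds each sheet's used range.
    Dim ws As Worksheet, f As Range
    Dim lastRow As Long, lastCol As Long
    For Each ws In ThisWorkbook.Worksheets
        lastRow = 1: lastCol = 1
        Set f = ws.Cells.Find("*", LookIn:=xlFormulas, _
                              SearchOrder:=xlByRows, SearchDirection:=xlPrevious)
        If Not f Is Nothing Then lastRow = f.Row
        Set f = ws.Cells.Find("*", LookIn:=xlFormulas, _
                              SearchOrder:=xlByColumns, SearchDirection:=xlPrevious)
        If Not f Is Nothing Then lastCol = f.Column
        If lastRow < ws.Rows.Count Then _
            ws.Rows(lastRow + 1 & ":" & ws.Rows.Count).Delete
        If lastCol < ws.Columns.Count Then _
            ws.Range(ws.Cells(1, lastCol + 1), _
                     ws.Cells(1, ws.Columns.Count)).EntireColumn.Delete
    Next ws
    ThisWorkbook.Save
End Sub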
I can also export VBA code and attempt to clean it.
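Something like this should dump every module to text for cleaning outside
Excel (assumes "Trust access to the VBA project object model" is ticked in
the Trust Center; C:\Temp is just a placeholder path):

Sub ExportModules()
    ' Export each VBA component to a text file so the code can be
    ' reviewed and diffed outside the editor.
    Dim comp As Object   ' VBIDE.VBComponent, late bound to avoid a reference
    For Each comp In ThisWorkbook.VBProject.VBComponents
        comp.Export "C:\Temp\" & comp.Name & ".txt"
    Next comp
End Sub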
There are worksheet event macros which are always running. I know these
may cause speed issues. For example, when I select certain cells, text
appears in them; when I select them again, the text disappears. These are
used as anchors or identifiers when looping through rows with certain
optimisation macros.
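For anyone unfamiliar, the pattern looks roughly like this (the cell
address and marker text here are invented; the EnableEvents guard keeps
the write from firing Worksheet_Change in turn):

Private Sub Worksheet_SelectionChange(ByVal Target As Range)
    ' Toggle an anchor marker each time the watched cell is selected.
    If Not Intersect(Target, Me.Range("B5")) Is Nothing Then
        Application.EnableEvents = False   ' stop the write firing Worksheet_Change
        If Me.Range("B5").Value = "ANCHOR" Then
            Me.Range("B5").ClearContents
        Else
            Me.Range("B5").Value = "ANCHOR"
        End If
        Application.EnableEvents = True
    End If
End Sub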
There are not any User Defined Functions.
I can get rid of unneeded comments.
There are a ton of Data Validation drop-down boxes (not sure if this matters).
There are minimal charts in the model (6-7).
Then my last task will be to start looking at formulas. I can look for
volatile functions (NOW(), RAND()) and see if there are too many of them.
This model is very large and has some very sophisticated formulas for dates
and dynamic ranges. A typical formula might be:
=SUM($G157:OFFSET(P157,0,-ROUNDUP(INDEX(P$245:P$254,$C291)*$D291,0)))-SUM($F291:O291)+OFFSET(P157,0,1-ROUNDUP(INDEX(P$245:P$254,$C291)*$D291,0))*(INDEX(P$245:P$254,$C291)-ROUNDDOWN(INDEX(P$245:P$254,$C291),0))
I can't remember if OFFSET is a volatile function. I am also assuming that
most other types of formulas that take ranges (INDEX, lookups, MATCH) may
cause speed issues as well.
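For what it's worth, my understanding is that OFFSET (like INDIRECT) is
volatile, whereas INDEX, MATCH and the lookup functions are not. If that's
right, the first OFFSET piece above could be rewritten non-volatilely along
these lines (a sketch only, not tested against the model):

=SUM($G157:INDEX(157:157,COLUMN(P157)-ROUNDUP(INDEX(P$245:P$254,$C291)*$D291,0)))

INDEX used as the end of a range reference still returns a cell, so the SUM
works the same, but it only recalculates when its inputs change.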
There is an active scenario manager built into the model where the active
run is based on a drop-down box which pulls data from one of 6 columns.
These columns represent the inputs for each run. I could always reduce
this to one scenario.
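For context, each input row presumably pulls the live value with something
along these lines (the addresses and the drop-down cell are invented for
illustration):

=INDEX($H5:$M5,$B$1)

where $H5:$M5 holds the six scenario columns for that row and $B$1 holds
the run number chosen in the drop-down box.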
I can always export the sheets and save them as individual files to see how
much of the file size stems from each sheet. Note this will not necessarily
speak to speed, and it will not capture macro memory.
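A rough sketch of that size test (cross-sheet formulas will break in the
copies, which is fine for a size check; sheet names containing characters
that are illegal in file names would need handling):

Sub SizePerSheet()
    ' Copy each sheet to its own workbook, save it, and report the
    ' file size as a rough proxy for where the 24MB lives.
    Dim ws As Worksheet, wb As Workbook, path As String
    For Each ws In ThisWorkbook.Worksheets
        ws.Copy                      ' with no arguments, copies to a new workbook
        Set wb = ActiveWorkbook
        path = Environ("TEMP") & "\" & ws.Name & ".xlsx"
        Application.DisplayAlerts = False
        wb.SaveAs path, FileFormat:=xlOpenXMLWorkbook
        wb.Close SaveChanges:=False
        Application.DisplayAlerts = True
        Debug.Print ws.Name, FileLen(path) & " bytes"
    Next ws
End Sub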
Can anyone provide me with some other tips which I can use to reduce the
size and increase the speed of this spreadsheet? Or tips that I can use to
identify which portions of the spreadsheet contribute to the size and the
speed loss.
Thanks.