Slow recalculation

R. Choate

I have a simple but long worksheet (40K rows) with 6 columns. There is no VBA code. The formulas are relatively simple and there are no array formulas. Calculation is set to manual. The worksheet has no unused cells beyond the occupied range. I have tested this on multiple fast computers, each with plenty of RAM.
When I hit F9 it can take up to 30 minutes to recalculate. Please help.

Thanks in advance!
 
John F. Collins

Are the 40,000 formulas in each row different, or are there about 6 formulas
that were copied down 40,000 rows?

Is anything circular, with iteration turned on?

Have you tried recreating the workbook from scratch in a new instance of
Excel?
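
If it helps, here is a rough diagnostic sketch (VBA run from a scratch module, so the workbook itself stays code-free) that reports the calculation and iteration settings and flags any circular reference on the active sheet:

Sub ReportCalcSettings()
    ' Rough diagnostic sketch: report calculation/iteration settings and
    ' any circular reference on the active sheet.
    Dim circ As Range
    Debug.Print "Calculation mode: " & Application.Calculation   ' -4135 = xlCalculationManual
    Debug.Print "Iteration on:     " & Application.Iteration
    Debug.Print "Max iterations:   " & Application.MaxIterations
    Debug.Print "Max change:       " & Application.MaxChange
    Set circ = ActiveSheet.CircularReference
    If circ Is Nothing Then
        Debug.Print "No circular reference on the active sheet"
    Else
        Debug.Print "Circular reference at " & circ.Address
    End If
End Sub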


 
R. Choate

There are about 6 formulas copied down almost 40K rows, plus a few rows (12) used by a little lookup table. There is nothing circular and iteration is not turned on; I did try setting it to 5 iterations at .01 change, but none of that made any difference.

It does suck up a LOT of resources. My PC has a gig of RAM and a 2.4 GHz processor. Other machines we've tried this on have had even more of both, and some a little less. I did not personally let it go for 30 minutes; it won't let me do anything else on my computer while it is recalculating. Every other app, even the most memory-intensive ones (and there are a lot of them), processes much faster and allows me to multi-task. My PC is optimized for performance and is regularly defragged and maintained.

This is a problem with one file, which has been rebuilt and also copied over to a new workbook. In short, everything reasonable has been checked and tried. I just wondered if there was anything known about using Excel files with so many rows. I am aware that Excel bogs down badly when loaded down with even the simplest of things. Excel can't handle very many charts in a single workbook, either, no matter HOW much RAM and other goodies you might have. I know that I am not going to try to "fix" my PC to make this file go faster when I know my PC works great, as described above. I don't have to "fix" the problem; I just want to do my due diligence and see if anybody else has comments about large worksheets behaving badly. This file is over 8 MB, has only one sheet, and has no VBA code (or modules) at all. I think it is just that Excel cannot deal with large files.

I am aware that there is a known issue with using the SUMIF function when the two ranges have different numbers of cells (KB article #902324,
http://support.microsoft.com/default.aspx?scid=kb;en-us;902324). This file does use the SUMIF function, but the two ranges have equal numbers of cells, so that should not be the problem. If anybody has any further sage advice, I'm all ears (that includes you, Tom, Norman, and the rest of you guys ;-) )
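
In case anyone wants to double-check that, here is a rough sketch (run from a scratch module in a separate workbook, since this file has no VBA) that prints a sample of the SUMIF formulas on the active sheet so the two range arguments can be compared by eye:

Sub ListSumifFormulas()
    ' Rough diagnostic sketch: print a sample of the formulas on the
    ' active sheet that use SUMIF, so the two range arguments can be
    ' checked for equal size (the KB 902324 issue mentioned above).
    Dim c As Range
    Dim hits As Long
    On Error Resume Next   ' SpecialCells raises an error if there are no formulas
    For Each c In ActiveSheet.UsedRange.SpecialCells(xlCellTypeFormulas)
        If InStr(c.Formula, "SUMIF") > 0 Then
            Debug.Print c.Address & "  " & c.Formula
            hits = hits + 1
            If hits >= 20 Then Exit For   ' the same formula is copied down, so a sample is enough
        End If
    Next c
End Sub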

Thanks,

Richard
--
RMC,CPA


 
Tom Ogilvy

The best I can offer is to look at it, if you want to zip it and send it to
(e-mail address removed)

--
Regards,
Tom Ogilvy

 
John F. Collins

I use files that measure in the tens of thousands of rows, and have plots
that hit the 32,000 limit on number of points. They do not use SUMIF, but
they use array formulas of the type {=SUM(IF(range1<range2, range3, 0))}. As
file sizes grow, they work reasonably well until hitting a wall around 65
MB, after which they are unusable. I think my version of Excel cannot use
more than 64 MB of RAM no matter what I allocate. Your 8 MB file should not
be a problem unless the initial memory allocation upon starting Excel is set
to around 8 MB.

To aid in your diagnosis, I have noticed two things that go wrong.

1) Normally the workbook keeps track of what has already been calculated and
what needs to be recalculated when a change to a cell is made. Beyond a
certain size or complexity, Excel loses track of what needs to be
recalculated and what does not, and a change that affects only one cell will
cause the entire workbook to be recalculated. This effect slows things down
considerably, but I can still use the book by setting recalculation to
manual and suffering through the recalc when needed. The recalcs only take
a minute or two or three; I have dual Pentium III 550 MHz processors or
thereabouts. If this effect occurs and you have dependencies among cells,
you can see how calculating them in the wrong order could require 40,000 +
39,999 + 39,998 + ... calculations instead of 40,000.
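
A rough way to test whether that is what is happening (just a sketch, run from a scratch module) is to change a single input cell and then time a normal recalculation against a forced full recalculation; if the two times are about the same, the dependency tracking has effectively given up:

Sub CompareRecalcTimes()
    ' Rough sketch: compare a normal (dependency-driven) recalc with a
    ' forced full recalc. Change one input cell by hand, then run this.
    Dim t As Double

    t = Timer
    Application.Calculate          ' recalculates only what Excel thinks is dirty
    Debug.Print "Normal recalc: " & Format(Timer - t, "0.0") & " sec"

    t = Timer
    Application.CalculateFull      ' forces every formula to recalculate
    Debug.Print "Full recalc:   " & Format(Timer - t, "0.0") & " sec"
End Sub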

2) When the file size gets too large, Excel appears to get stuck in a
mode where it is constantly swapping enormous quantities of information to
disk. It becomes unusable. It does not produce the "Not enough memory"
error dialog; instead, it may look dead, but it will finish if you leave it
overnight.

John

P.S. It is still possible that the file is corrupted. What happens when you
regenerate it in a new instance of Excel?

 
R. Choate

Hi John,

The file is not mine, but I have a copy of it. The guy who made it is very sharp and keen on detail. He swears they have tried every variation on every machine they've got (about 40). That includes, he says, copying to a text file and back. I don't know how that would help, since you would lose your formulas by doing so; I guess he turned on formula view before exporting. He has also copied it to a clean workbook, with the same result. I have also tried that with no better results. I don't think it is corrupted. I'm trying to talk him into using Access for stuff like this.

By the way, have you read that Office 12 is going to give us 16,384 columns (finally) and over a million rows? They also say they are not going to put a governor on memory anymore and will allow whatever Windows has available. There are many other changes, but those are the big highlights for me. I wonder what that will do for spreadsheets like this. Oh, they are also increasing the number of characters in a cell to 32K, and the number of nested functions is going up to 64.

As to my original issue, I say let's blow it off. This is not going to be an ongoing-use worksheet. Thanks anyway for the help.

Richard
--
RMC,CPA


 
John F. Collins

See this link provided by Ron de Bruin:

http://www.decisionmodels.com/calcsecrets.htm

John


 
