Keith R
I have a series of... oh, about 96-100 web pages that I need to download
into separate Excel worksheets. The source files are raw text. The web
address is just the name of the file (no htm or asp or other extension,
just "www..../cmhoct02" and when I copy and paste the entire page into
notepad, it looks exactly the same as onscreen. Characters are fixed width,
and each new column starts after a specific number of chars. I verified by
also looking at the source of the page, and there are no html tags, just
the raw text exactly like it looks in the browser.
I've done some other stuff with VBA, but never anything to do with
accessing or downloading web pages, so I need some pointers on where to
start (code snippets, general info, etc.). I can write the code to generate
the string of the full web address including the filename (fortunately they
are sequential), but how do I use that to have Excel access the page and
grab the contents?
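In case it helps to see what I'm picturing, here's a rough sketch of the
download step I have in mind -- I'm only guessing that the URLDownloadToFile
API call is the right tool (and that it's usable from XL97), so any correction
or better approach is welcome:

'--- guess at a download routine; Declare goes at the top of a standard module ---
Declare Function URLDownloadToFile Lib "urlmon" _
    Alias "URLDownloadToFileA" _
    (ByVal pCaller As Long, ByVal szURL As String, _
     ByVal szFileName As String, ByVal dwReserved As Long, _
     ByVal lpfnCB As Long) As Long

Sub GrabOnePage()
    Dim sURL As String
    Dim sLocalFile As String

    sURL = "http://www..../cmhoct02"    'built by my existing code
    sLocalFile = "C:\Temp\cmhoct02.txt" 'temp copy to import from

    'the API returns 0 on success
    If URLDownloadToFile(0, sURL, sLocalFile, 0, 0) <> 0 Then
        MsgBox "Download failed for " & sURL
    End If
End Sub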
Also, when I copy/paste the web page into Excel as-is, it puts the whole
darn thing into one cell (one row anyway, and it seems to automatically
merge multiple columns).
When importing each page, I want to pull the "visual" data columns into
separate columns in each XL worksheet. I know if I were importing a flat
file, the import wizard would allow me to do this; can I access the same
functionality directly, or do I need to parse each line as it comes in and
force it into separate columns myself?
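If it's the latter, fine, but I'm hoping Workbooks.OpenText is the
programmatic version of that wizard. Is something like this roughly right
(the column start positions are made up -- I'd plug in my real widths),
assuming XL97's OpenText accepts fixed-width FieldInfo?

Sub ImportFixedWidth(ByVal sLocalFile As String)
    Dim wbTemp As Workbook

    'split the downloaded text file at hard-coded character positions;
    'each FieldInfo pair is (0-based start column, format code 1 = General)
    Workbooks.OpenText Filename:=sLocalFile, _
        DataType:=xlFixedWidth, _
        FieldInfo:=Array(Array(0, 1), Array(12, 1), Array(25, 1), Array(40, 1))
    Set wbTemp = ActiveWorkbook

    'copy the imported sheet to the end of my master workbook,
    'then close the temporary workbook without saving
    wbTemp.Sheets(1).Copy _
        After:=ThisWorkbook.Sheets(ThisWorkbook.Sheets.Count)
    wbTemp.Close SaveChanges:=False
End Sub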
Using XL97
Many TIA,
Keith R