Table field limit

Shelby

I have a comma-delimited file that has 900 fields. I
understand that Access has a 250 field limit. Is there a
solution to this problem, or do I have to look for
different database software? Thanks!
 
Craig Alexander Morrison

You may need Microsoft SQL Server (SQLS), as it can handle 1024 columns
(fields) in a table.

No doubt you will want to normalise such data once it is in the database. As
an alternative, you may be able to obtain the data in a more normalised form
and import it as a series of tables into Jet (Access), which has a limit of
255 fields per table.

Please bear in mind that the maximum record/row size is 2,000 characters in
Jet and around 8,000 bytes in SQLS.
 
John Nurick

Hi Shelby,

Every text file I've seen with more than a few dozen fields could be
normalised into one or more much narrower tables. Often many of the
fields aren't needed at all. The difference if there are more than 255
fields is that you have to start the normalisation process at the text file
stage rather than after importing into Access (either that or use a
different database engine - though as Craig says your textfile may be
too wide even for SQL Server to swallow).

If you don't mind using command-line utilities there are textfile tools
at http://unxutils.sourceforge.net/ which can split selected fields out
of text files. Also, I have a Perl script that can convert a text file
with many hundreds of fields into a tall narrow file that Access can
import, after which it can be re-assembled into a sensibly normalised
structure. If you have (or can install) Perl, email me (removing the
reversed "no spam" from my address) and I'll be happy to send you a
copy.
 
Tim Ferguson

> I have a comma-delimited file that has 900 fields.

Just as another penny's-worth...

We are all agreed that a file with 900 columns needs normalising into a
bunch of related tables.

I would think about parsing this file using Line Input # and a chunk of
logic to send the different fields to their proper tables. Not trivial, but
how much work it is probably depends on how complex the data actually are.
And then again, the neat thing about writing code is that you only have to
do it once!
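
To make that concrete, here is a minimal sketch of the Line Input #
approach. Everything specific in it (the file path, the field positions,
and the table and column names tblCustomer and tblOrder) is invented for
illustration; a real import would map the 900 fields according to the
file's actual layout.

    ' Minimal sketch: read the wide CSV one line at a time and route
    ' selected fields to two hypothetical normalised tables.
    Sub ImportWideFile()
        Dim fnum As Integer
        Dim strLine As String
        Dim varFields As Variant
        Dim db As DAO.Database

        Set db = CurrentDb
        fnum = FreeFile
        Open "C:\Data\wide900.csv" For Input As #fnum

        Do While Not EOF(fnum)
            Line Input #fnum, strLine
            ' Naive split; fields containing quoted commas need more care
            varFields = Split(strLine, ",")

            ' Route a few fields to each table (positions invented)
            db.Execute "INSERT INTO tblCustomer (CustID, FirstName, LastName) " _
                & "VALUES (" & varFields(0) & ", '" & varFields(1) & "', '" _
                & varFields(2) & "')", dbFailOnError
            db.Execute "INSERT INTO tblOrder (CustID, OrderDate, Amount) " _
                & "VALUES (" & varFields(0) & ", #" & varFields(3) & "#, " _
                & varFields(4) & ")", dbFailOnError
        Loop

        Close #fnum
    End Sub

With 900 fields you would obviously have many more INSERTs (or
parameterised append queries), but the shape of the loop stays the same.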

HTH


Tim F
 
Guest

Thank you so much to John, Craig, and Tim for your
suggestions. I'm not familiar with the normalisation
process, but I'll do some research. Thanks again!
 
John Nurick

Hi Tim,

> the neat thing about writing code is that you only have to do it once!

That's only neat if you can use the code more than once<g>. I first
wrote my Perl script when faced with a 1700-field textfile: the way it
was structured made it certain that any others from the same source
would be similarly wide but with different fields.

So rather than write code to parse that particular file into a
normalised structure I spent rather less time writing a generalised
procedure that can convert any wide textfile into one with just three
fields
RecordID, FieldName, FieldValue
which can be queried directly or used to build "proper" normalised
tables.
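
For what it's worth, here is the same wide-to-tall idea sketched in VBA
rather than Perl (an illustration of the technique, not John's actual
script). It assumes the first line of the file holds the field names; the
file paths are invented.

    ' Sketch: unpivot a wide CSV into RecordID, FieldName, FieldValue rows.
    Sub UnpivotWideFile()
        Dim fIn As Integer, fOut As Integer
        Dim strLine As String
        Dim varNames As Variant, varValues As Variant
        Dim lngRec As Long, i As Long

        fIn = FreeFile
        Open "C:\Data\wide900.csv" For Input As #fIn
        fOut = FreeFile
        Open "C:\Data\tall.csv" For Output As #fOut

        ' Header row supplies the FieldName column
        Line Input #fIn, strLine
        varNames = Split(strLine, ",")

        Print #fOut, "RecordID,FieldName,FieldValue"
        Do While Not EOF(fIn)
            Line Input #fIn, strLine
            lngRec = lngRec + 1
            varValues = Split(strLine, ",")  ' naive split, as before
            ' One output row per non-empty field: tall and narrow
            For i = LBound(varValues) To UBound(varValues)
                If i <= UBound(varNames) And Len(varValues(i)) > 0 Then
                    Print #fOut, lngRec & "," & varNames(i) & "," & varValues(i)
                End If
            Next i
        Loop

        Close #fIn
        Close #fOut
    End Sub

The tall file imports comfortably into a three-field Access table, and
append queries (or a crosstab) can then distribute the values into
properly normalised tables.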
 
