trying to create a sample table with 31,000,000 rows

Ian

I'm trying to create a table with 31 million rows so I created an append
query.

INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] )
SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, 12 AS Expr4
FROM rawdata;

The uniqueID is called ID. I want the append query to run repeatedly until
the count of ID is greater than 31,000,000.

Any ideas?
 
PieterLinden via AccessMonster.com

Ian said:
[quoted text snipped]

Dim lngCount As Long
For lngCount = 1 To 31000000
    DBEngine(0)(0).Execute "INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] ) " & _
        "SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, 12 AS Expr4 FROM rawdata;", dbFailOnError
Next lngCount

... might be faster with a recordset, though... still it will take some time.
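One wrinkle worth noting with the loop above: every pass of `INSERT INTO rawdata ... SELECT ... FROM rawdata` appends a copy of each existing row, so the table doubles on every pass rather than growing by one. Assuming the table starts from a single seed row (an assumption, not stated in the post), a quick Python sketch of the row count shows the loop bound of 31,000,000 is vastly more than needed:

```python
# Each pass of "INSERT INTO rawdata SELECT ... FROM rawdata" appends one
# copy of every existing row, i.e. the row count doubles per pass.
rows = 1          # assume the table starts with a single seed row
passes = 0
while rows <= 31_000_000:
    rows *= 2     # one execution of the append query
    passes += 1

print(passes, rows)   # 25 passes already exceed 31 million rows
```

So on the order of 25 passes, not 31 million, would overshoot the target; the loop really needs an exit test on the current row count.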
 
PieterLinden via AccessMonster.com

hmm... 31 million rows may cause the DB to explode... hard limit of 2GB...

Other ways of doing it... create a helper table of, say, 10K records (just autonumbers). Create a Cartesian product between that table and the one holding your single record (SELECT TOP 1 so you get only one). Then turn the Cartesian product into an append query and run it from a For loop until the table reaches the target count.
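A quick arithmetic check on that plan (a sketch only; the 10K figure is Pieter's "say" number): each append of a 10,000-row helper table crossed with the single record adds 10,000 rows, so reaching 31 million takes 3,100 runs; a 310,000-row helper table would get there in 100 runs:

```python
import math

target = 31_000_000

# Rows added per run = size of the autonumber helper table
# (each helper row pairs with the one data row in the Cartesian product).
runs_with_10k = math.ceil(target / 10_000)
print(runs_with_10k)        # 3100 runs needed with a 10,000-row helper table

helper_for_100_runs = math.ceil(target / 100)
print(helper_for_100_runs)  # a 310,000-row helper table finishes in 100 runs
```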
 
Ian

I'll try this one first -- I'm hoping that the small number of fields will
keep it under 2GB or I can split it into 3 years or put it on a SQL box at
work.

Does the code go into a module?

PieterLinden via AccessMonster.com said:
[quoted text snipped]
 
Ian

Pieter -- I've only used VBA in forms and am a bit of a dullard with this stuff
-- I put the code into Module1 and hit Run, but I get a compile error on
the lngCount =... line. Where have I gone wrong?

Option Compare Database

Dim lngCount As Long
For lngCount = 1 To 31000000
DBEngine(0)(0).Execute "INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] ) " & _
    "SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, 12 AS Expr4 FROM rawdata;", dbFailOnError
Next lngCount

PieterLinden via AccessMonster.com said:
[quoted text snipped]
 
Ian

In the end I ran the append query myself repeatedly (2^n), so the table grew
exponentially and it was done in a matter of minutes. 33.5 million records
with 4 columns came to 1.2GB!!!
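The arithmetic lines up: 25 doublings of a single seed row give 2^25 = 33,554,432, the "33.5 million" figure, and 1.2GB spread over that many rows works out to roughly 38 bytes apiece, data plus Jet's per-row overhead (a rough sanity check, not an exact figure):

```python
rows = 2 ** 25                 # 25 doublings from one seed row
print(rows)                    # 33,554,432 -- the "33.5 million" reported

size_bytes = 1.2 * 2 ** 30     # the reported 1.2GB file size
print(size_bytes / rows)       # ~38 bytes per row, data plus row overhead
```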

The bigger issue was connecting it to a pivot table in Excel. I ran out of
physical memory trying to create the pivot table -- it held 1.5GB of physical
memory while working on it. I shut some other stuff down and then it was no
problem, but wow. "Nobody will ever need more than 640K of memory."



PieterLinden via AccessMonster.com said:
[quoted text snipped]
 
John W. Vinson

Ian said:
[quoted text snipped]

You can do this with no code at all, with a little prep work!

Create a table named Num with one Long Integer field, N. Open Excel and select
column A, rows 1 to 1000; use "Insert... Fill Series" to fill it with numbers
1 to 1000. Import this spreadsheet into Num.

You can then use a Cartesian join query:

INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] )
SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, 12 AS Expr4
FROM Num AS N1, Num AS N2, Num AS N3
WHERE N1.N <= 31;

This query will generate 31 * 1000 * 1000 rows. At 12 bytes per row this
should fit within the 2GByte limit.
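A quick check of those figures (counting raw data bytes only; Jet adds per-row overhead, so the actual file will be somewhat larger):

```python
# N1.N <= 31 restricts the first copy of Num to 31 rows; the other two
# copies contribute 1000 rows each to the Cartesian product.
rows = 31 * 1000 * 1000
print(rows)                    # 31,000,000

data_bytes = rows * 12         # John's 12-bytes-per-row estimate
print(data_bytes / 2 ** 20)    # ~355 MB of raw data, well under the 2GB limit
```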

Not sure what you get from 31 million identical rows but...!
 
