A Microsoft Office (Excel, Word) forum. OfficeFrustration


OfficeFrustration forum » Microsoft Access » Running & Setting Up Queries

trying to create a sample table with 31,000,000 rows



 
 
  #1  
Old December 27th, 2009, 09:41 PM posted to microsoft.public.access.queries
Ian
external usenet poster
 
Posts: 485
Default trying to create a sample table with 31,000,000 rows

I'm trying to create a table with 31 million rows so I created an append
query.

INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] )
SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, 12 AS Expr4
FROM rawdata;

The unique ID field is called ID. I want the append query to run repeatedly until
the count of ID is greater than 31,000,000.

Any ideas?
  #2  
Old December 27th, 2009, 10:29 PM posted to microsoft.public.access.queries
PieterLinden via AccessMonster.com
external usenet poster
 
Posts: 307
Default trying to create a sample table with 31,000,000 rows

Ian wrote:
I'm trying to create a table with 31 million rows ... I want the append query
to run repeatedly until the count of ID is greater than 31,000,000. Any ideas?


Dim lngCount As Long
For lngCount = 1 To 31000000
    DBEngine(0)(0).Execute _
        "INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] ) " & _
        "SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, " & _
        "12 AS Expr4 FROM rawdata;", dbFailOnError
Next lngCount

... might be faster with a recordset, though... still it will take some time.

--
Message posted via AccessMonster.com
http://www.accessmonster.com/Uwe/For...eries/200912/1

  #3  
Old December 27th, 2009, 10:31 PM posted to microsoft.public.access.queries
PieterLinden via AccessMonster.com
external usenet poster
 
Posts: 307
Default trying to create a sample table with 31,000,000 rows

hmm... 31 million rows may cause the DB to explode... hard limit of 2GB...

other ways of doing it... create a table of, say, 10K records (just autonumbers).
Create a Cartesian product between that table and the one with your single
record (select so you get only one row). Then turn the Cartesian product into
an append query. Each pass appends 10,000 rows, so run it roughly 3,100 times
in a For loop to reach 31 million.
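One way to read that recipe, as a sketch (table names here are hypothetical: a 10,000-row autonumber table tblNums and a one-row seed table tblSeed holding the sample values):

```sql
-- Cartesian (cross) join: no join condition links the tables, so the
-- single seed row is paired with every row of tblNums, yielding
-- 10,000 appended rows per pass.
INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] )
SELECT tblSeed.[User], tblSeed.Data, tblSeed.UsageDate, tblSeed.[Hour]
FROM tblSeed, tblNums;
```

Saved as a query, this can then be executed from a For loop until the table reaches the target size.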


  #4  
Old December 27th, 2009, 11:19 PM posted to microsoft.public.access.queries
Ian
external usenet poster
 
Posts: 485
Default trying to create a sample table with 31,000,000 rows

I'll try this one first -- I'm hoping that the small number of fields will
keep it under 2GB or I can split it into 3 years or put it on a SQL box at
work.

Does the code go into a module?

  #5  
Old December 27th, 2009, 11:25 PM posted to microsoft.public.access.queries
Ian
external usenet poster
 
Posts: 485
Default trying to create a sample table with 31,000,000 rows

Pieter -- I've only used VBA in forms and am a bit of a dullard with this stuff
-- I put the code into Module1 and hit Run, but I get a compile error on
the lngCount =... line. Where have I gone wrong?

Option Compare Database

Dim lngCount As Long
For lngCount = 1 To 31000000
    DBEngine(0)(0).Execute _
        "INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] ) " & _
        "SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, " & _
        "12 AS Expr4 FROM rawdata;", dbFailOnError
Next lngCount
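The likely cause of the compile error here: the statements sit at module level, and in VBA executable code must live inside a Sub or Function. A minimal sketch of the same loop wrapped in a procedure (the procedure name is hypothetical):

```vba
Option Compare Database

Public Sub FillRawData()
    ' Executable statements must sit inside a procedure;
    ' run this from the VBA editor with F5.
    Dim lngCount As Long
    For lngCount = 1 To 31000000
        DBEngine(0)(0).Execute _
            "INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] ) " & _
            "SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, " & _
            "12 AS Expr4 FROM rawdata;", dbFailOnError
    Next lngCount
End Sub
```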


  #6  
Old December 28th, 2009, 01:27 AM posted to microsoft.public.access.queries
Ian
external usenet poster
 
Posts: 485
Default trying to create a sample table with 31,000,000 rows

In the end I ran the append query myself repeatedly (2^n), so the table grew
exponentially and it was done in a matter of minutes. 33.5 million records
with 4 columns came to 1.2GB!!!
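That exponential growth falls out of the query itself: the self-append copies every existing row, doubling the table each pass, and starting from one seed row 25 passes give 2^25 = 33,554,432 rows, matching the 33.5 million figure. A sketch of that loop (procedure name hypothetical):

```vba
Public Sub DoubleRawData()
    ' Assumes rawdata already holds one seed row.
    ' Each pass appends a copy of every existing row, doubling the
    ' table: after 25 passes, 2^25 = 33,554,432 rows (~33.5 million).
    Dim lngPass As Long
    For lngPass = 1 To 25
        DBEngine(0)(0).Execute _
            "INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] ) " & _
            "SELECT [User], Data, UsageDate, [Hour] FROM rawdata;", _
            dbFailOnError
    Next lngPass
End Sub
```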

The bigger issue was connecting it to a PivotTable in Excel. I ran out of
physical memory trying to create the pivot table: it held 1.5GB of physical
memory while working on it. I shut some other stuff down and then it was no
problem, but wow. "Nobody will ever need more than 640K of memory."




  #7  
Old December 28th, 2009, 01:34 AM posted to microsoft.public.access.queries
John W. Vinson
external usenet poster
 
Posts: 18,261
Default trying to create a sample table with 31,000,000 rows

On Sun, 27 Dec 2009 13:41:01 -0800, Ian wrote:

I'm trying to create a table with 31 million rows ... I want the append query
to run repeatedly until the count of ID is greater than 31,000,000. Any ideas?


You can do this with no code at all, with a little prep work!

Create a table named Num with one Long Integer field, N. Open Excel and select
column A, rows 1 to 1000; use "Insert... Fill Series" to fill it with numbers
1 to 1000. Import this spreadsheet into Num.

You can then use a Cartesian join query:

INSERT INTO rawdata ( [User], Data, UsageDate, [Hour] )
SELECT 1284 AS Expr1, 10.586 AS Expr2, #1/1/2009# AS Expr3, 12 AS Expr4
FROM Num AS N1, Num AS N2, Num AS N3
WHERE N1.N <= 31;

This query will generate 31 * 1000 * 1000 rows. At 12 bytes per row this
should fit within the 2GByte limit.
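Incidentally, the Excel round-trip can be skipped: the Num table can also be filled from VBA with a DAO recordset. A sketch (assumes Num already exists with a Long field N; procedure name hypothetical):

```vba
Public Sub FillNum()
    ' Fill Num.N with the values 1..1000 via a DAO recordset.
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim i As Long
    Set db = CurrentDb
    Set rs = db.OpenRecordset("Num", dbOpenDynaset)
    For i = 1 To 1000
        rs.AddNew
        rs!N = i
        rs.Update
    Next i
    rs.Close
End Sub
```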

Not sure what you get from 31 million identical rows but...!
--

John W. Vinson [MVP]
 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 OfficeFrustration.
The comments are property of their posters.