Subject: Re: Can someone answer this?
Date: Tue, 17 May 2016 18:02:26 +1000
From: Tom Krieg <firstname.lastname@example.org>
Also, and this is a HUGE time-saver: don't create your 5 foreign keys and 6
indexes until after you've done the bulk import. Script the FK and index
creation and run it while you're watching the latest Game of Thrones.
That can reduce a bulk import from 6 hours to a few minutes.
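The load-first, index-later idea can be sketched with Python's stdlib `sqlite3` standing in for the real server; the table and index names here are made up for illustration, not taken from the thread:

```python
import sqlite3

# Hypothetical table for illustration: load the data with only the
# primary key in place, and no secondary indexes to maintain per-row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, val TEXT)")

# Bulk-load first...
conn.executemany(
    "INSERT INTO items (id, val) VALUES (?, ?)",
    ((i, f"row{i}") for i in range(100_000)),
)
conn.commit()

# ...then build the index in a single pass over the finished table,
# instead of updating it incrementally on every insert.
conn.execute("CREATE INDEX idx_items_val ON items (val)")
conn.commit()
```

The win comes from the single sort-and-build pass at the end being far cheaper than maintaining the index (and checking foreign keys) once per inserted row.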
On 14/05/2016 1:24 PM, Tom Krieg wrote:
> I have a table with 1.4 million records and a primary key. I scan the
> table with a cursor and update each record. This takes up to 4 hours.
> I have a table with 1.4 million records (1:1 to the above table). I
> don't have time to hang around for 4 hours, so I limit the records to be
> retrieved and updated to 100,000, e.g. cursor = SELECT TOP 100000
> <primary key> FROM <table> WHERE <primary key> IS NULL etc. This takes 13 minutes.
> Then I execute the statement again. This takes 4 minutes. The 3rd time
> takes 3 minutes and then it's 1:59 minutes. The whole table takes about
> 30 minutes, repeating the statement 14 times.
> When I do stuff like this, should I put the whole 100,000-record thing
> in a loop and repeat until IF EXISTS (SELECT TOP 100000 ...) is false? ... I
> have another table, I'll try it.
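The loop being asked about can be sketched like this, again using stdlib `sqlite3` as a stand-in (SQLite uses LIMIT where SQL Server uses TOP); the table, column, and batch size are illustrative assumptions, and the exit test checks the affected-row count rather than a separate IF EXISTS probe:

```python
import sqlite3

BATCH = 100_000  # batch size, analogous to TOP 100000 in the post

# Hypothetical table for illustration: flag IS NULL marks unprocessed rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, flag INTEGER)")
conn.executemany(
    "INSERT INTO t (id, flag) VALUES (?, NULL)",
    ((i,) for i in range(250_000)),
)
conn.commit()

while True:
    # Update one batch of not-yet-processed rows, keyed by primary key.
    cur = conn.execute(
        "UPDATE t SET flag = 1 WHERE id IN "
        "(SELECT id FROM t WHERE flag IS NULL LIMIT ?)",
        (BATCH,),
    )
    conn.commit()          # commit per batch: short transactions, small log
    if cur.rowcount == 0:  # nothing left to update, so stop
        break
```

Checking the row count of the UPDATE itself saves a scan compared with running a separate existence query each time; in T-SQL the equivalent would be looping while @@ROWCOUNT is nonzero after each batched UPDATE.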