Thursday, September 29, 2011

Large Data Transfer in a SQL Server Database

Transferring a large data set from server to server, from one database to another on the same server, or even from table to table within the same database in SQL Server is a big problem in terms of time, efficiency, memory, and disk space.

I faced this problem while transferring a large data set: 1.5 million records per table, where each target table is populated from a query over the source tables. So the task is transferring query results into tables. It sounds simple, but it is not!

When doing it with the SQL Server Import Wizard, it handles only one query at a time!!!! How could they think that is all I would need?
When I tried SSIS, that is where I went through hell... SSIS has some benefits, at least in outlining the task flow of the work, but transferring this amount of data is a much bigger job than it handles well.
My server specs are:
Intel Core 2 Duo processor
6 GB memory
Windows Server 2008 64-bit
60 GB disk space

That sounds OK for transferring 1.5 million records from a query, but it took nearly 1 hour to do so.
Sure, I added indexes and did all the tuning it should take, but with no big improvement.

The best solution I found is: "BULK INSERT"...
But how could I dump 1.5 million records to a text file quickly, when SSIS also fails at that? The solution to this problem was the bcp ("bulk copy") utility. How could Microsoft not integrate it into SSIS or the Import Wizard? It is like hiding a treasure.
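As a sketch of the dump step, bcp's queryout mode writes the results of an arbitrary query straight to a flat file. The server, database, table, column, and path names below are placeholders, not the ones from my setup:

```
bcp "SELECT Id, Name, Amount FROM SourceDB.dbo.SourceTable" queryout "D:\dump\result.txt" -c -t"|" -T -S MYSERVER
```

Here `-c` uses character (text) mode, `-t"|"` sets the field terminator, `-T` uses a trusted (Windows) connection, and `-S` names the server instance. Remember the terminator you choose, because the import side must use the same one.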

bcp is brilliant at this, cutting the time from hours down to seconds, both for dumping to files and for importing into tables. The only concern is having enough HDD space to dump the files to, and then deleting them afterwards.
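For the import step, the dumped file can be loaded with a T-SQL BULK INSERT. This is a minimal sketch with placeholder names, assuming the file was dumped with bcp in character mode with `|` as the field terminator:

```sql
-- Import the dumped file into the target table (all names are placeholders).
-- FIELDTERMINATOR must match the -t value used when dumping with bcp.
BULK INSERT TargetDB.dbo.TargetTable
FROM 'D:\dump\result.txt'
WITH (
    DATAFILETYPE = 'char',   -- character mode, matching bcp -c
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    TABLOCK                  -- table lock, which allows a minimally logged load
);
```

The TABLOCK hint matters for speed: under the simple or bulk-logged recovery model it lets SQL Server minimally log the load instead of writing every row to the transaction log.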

It solved my problem. Hooray!
