SSIS : Huge Data Transfer from Source (SQL Server) to Destination (SQL Server)


Requirement :

  1. Transfer millions of records from source (SQL Server) to destination (SQL Server).
  2. Structure of source tables is different from destination tables.
  3. Refresh data once per week in destination server.
  4. Minimum amount of time for the processing.

I am looking for optimized approach using SSIS.

Was thinking these options :

  1. Create a SQL dump from the source server and import that dump into the destination server.
  2. Directly copy the tables from the source server to the destination server.

One component to be aware of is the SQL Server destination. It is used to load data into a local SQL Server database and bulk loads the data into tables or views. This component cannot be used for a SQL Server instance on a remote server, and it reads its connection configuration from an OLE DB connection manager. To add and configure a SQL Server destination, the package must already include at least one Data Flow task and a data source.

Lots of issues to consider here. Such as are the servers in the same domain, on same network, etc.

Most of the time you will not want to move millions of records as one large chunk, but in smaller batches. An SSIS package can handle that batching logic for you, and you can always restructure it later, which makes iterating on changes easier. This is sometimes a reason to push changes more often rather than waiting an entire week: smaller syncs are easier to manage and cause less downtime.

Another consideration is to be sure you understand your deltas and that you capture ALL of the changes. For this reason I would generally suggest using a staging table on the destination server. By moving changes to staging and then loading into the final table, you can more easily ensure that changes are applied correctly. Think of the scenarios of an increment being out of order (identity insert), datetimes arriving out of order, or one chunk failing. With a staging table you don't have to rely solely on the id/date; you can actually join on primary keys to look for changes.
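As a sketch of that staging pattern (all table and column names here are illustrative assumptions, not from the question), the primary-key join can be done with a `MERGE` so you are not trusting the date filter alone:

```sql
-- Illustrative only: stg.Orders / dbo.Orders and their columns are assumed names.
DECLARE @LastSyncDate datetime2 = '2024-01-01';  -- persisted between runs

-- 1) Land this run's candidate changes into the staging table.
TRUNCATE TABLE stg.Orders;

INSERT INTO stg.Orders (OrderId, CustomerId, Amount, ModifiedDate)
SELECT OrderId, CustomerId, Amount, ModifiedDate
FROM   src.Orders
WHERE  ModifiedDate >= @LastSyncDate;   -- delta filter; verified by the PK join below

-- 2) Apply staged rows to the final table by joining on the primary key.
MERGE dbo.Orders AS tgt
USING stg.Orders AS s
    ON tgt.OrderId = s.OrderId
WHEN MATCHED AND tgt.ModifiedDate < s.ModifiedDate THEN
    UPDATE SET CustomerId   = s.CustomerId,
               Amount       = s.Amount,
               ModifiedDate = s.ModifiedDate
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderId, CustomerId, Amount, ModifiedDate)
    VALUES (s.OrderId, s.CustomerId, s.Amount, s.ModifiedDate);
```

Because the apply step matches on the key rather than the timestamp, a row whose identity or datetime arrived out of order still lands correctly.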

Linked servers, proposed by Alex K., can be a great fit, but you will need to pay close attention to a couple of things. Always run the transfer from the destination server so that it is a PULL, not a push. Linked servers are fast at querying data but horrible at bulk updates/inserts. XML columns cannot be queried across a linked server at all. You may also need to set some specific properties for distributed transactions.
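A minimal sketch of the pull pattern, assuming a linked server named `[SRCSERVER]` and illustrative table names:

```sql
-- Illustrative only: [SRCSERVER], SourceDb, and the tables are assumed names.
-- Run this on the DESTINATION server so it is a PULL:
INSERT INTO stg.Orders (OrderId, CustomerId, Amount, ModifiedDate)
SELECT OrderId, CustomerId, Amount, ModifiedDate
FROM   [SRCSERVER].SourceDb.dbo.Orders
WHERE  ModifiedDate >= '2024-01-01';

-- OPENQUERY evaluates the whole query on the remote server, which often
-- avoids pulling the full table across before filtering:
INSERT INTO stg.Orders (OrderId, CustomerId, Amount, ModifiedDate)
SELECT OrderId, CustomerId, Amount, ModifiedDate
FROM   OPENQUERY([SRCSERVER],
       'SELECT OrderId, CustomerId, Amount, ModifiedDate
        FROM SourceDb.dbo.Orders
        WHERE ModifiedDate >= ''2024-01-01''');
```

Either way, insert into a staging table on the destination, not directly into the final table.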

I have done this task both ways, and I would say SSIS has a bit of an advantage over linked servers because of its robust error handling, threading logic, and ability to use different adapters (OLE DB, ODBC, etc.; they have different performance characteristics, and a quick search will turn up comparisons). But the key to your #4 is to move data in smaller chunks, through a staging table, and, if you can, more often; more frequent runs mean less impact per run. E.g. a daily sync would already be roughly 1/7th the size of a weekly one, assuming an even daily distribution of changes.

Take 10,000,000 records changed per week:
Once weekly = 10 million
Once daily ≈ 1.4 million
Once hourly ≈ 59K records
Once every 5 minutes ≈ 5K records

And if it has to be once a week, still think about doing it in small chunks, so that each insert has a minimal effect on your transaction logs, actual lock time on the production table, etc. Be sure that you never allow loading of partially staged/transferred data, otherwise identifying deltas could get messed up and you could end up missing changes.
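The chunked weekly load can be sketched like this (batch size and names are illustrative assumptions); each iteration is its own small transaction, so log growth and lock duration on the final table stay bounded:

```sql
-- Illustrative only: moves staged rows into the final table in 50,000-row
-- batches. Each INSERT commits separately, keeping log and lock impact small.
DECLARE @BatchSize int = 50000;

WHILE 1 = 1
BEGIN
    INSERT INTO dbo.Orders (OrderId, CustomerId, Amount, ModifiedDate)
    SELECT TOP (@BatchSize)
           s.OrderId, s.CustomerId, s.Amount, s.ModifiedDate
    FROM   stg.Orders AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Orders AS t
                       WHERE  t.OrderId = s.OrderId)
    ORDER BY s.OrderId;          -- keyset order makes each pass deterministic

    IF @@ROWCOUNT = 0 BREAK;     -- nothing left to move
END;
```

Because the loop only ever moves rows not yet in the destination, a failed run can simply be restarted without double-loading anything.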

One other thought, if this is something like a reporting instance and you have enough server resources: you could bring the entire table over from production into a staging table (or update a copy of the table at the destination), then simply drop the current table and rename the staging table into its place. This is an extreme approach and not one I generally like, but it is possible, and the actual impact on users would be very small.
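The drop-and-rename swap itself can be kept down to a moment (table names here are illustrative assumptions):

```sql
-- Illustrative only: dbo.Orders_Staging has been fully loaded out of band.
-- The swap is two metadata renames inside one short transaction, so readers
-- only wait for an instant rather than for the whole load.
BEGIN TRAN;
    EXEC sp_rename 'dbo.Orders',         'Orders_Old';
    EXEC sp_rename 'dbo.Orders_Staging', 'Orders';
COMMIT;

DROP TABLE dbo.Orders_Old;   -- keep it around briefly if you want a rollback path
```

Note that indexes, constraints, and permissions travel with the renamed table, so the staging copy must be built with everything the reporting queries need.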


I think SSIS is good at transferring data; my approach here:

1. Create a package with one Data Flow Task to transfer the data. If the structure of the two tables is different, that's okay; just map the columns.
2. Create a SQL Server Agent job to run your package every weekend.
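Step 2 can be scripted rather than clicked through; this is a rough sketch (job name, package path, and server are all assumptions, and the `/ISSERVER` command line in particular should be checked against your deployed project):

```sql
-- Illustrative only: schedules an assumed SSIS catalog package to run
-- every Sunday at 02:00 via SQL Server Agent.
EXEC msdb.dbo.sp_add_job
     @job_name = N'Weekly warehouse refresh';

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Weekly warehouse refresh',
     @step_name = N'Run SSIS package',
     @subsystem = N'SSIS',
     @command   = N'/ISSERVER "\SSISDB\Refresh\LoadWarehouse\LoadWarehouse.dtsx" /SERVER "."';

EXEC msdb.dbo.sp_add_jobschedule
     @job_name               = N'Weekly warehouse refresh',
     @name                   = N'Sunday 2 AM',
     @freq_type              = 8,       -- weekly
     @freq_interval          = 1,       -- Sunday
     @freq_recurrence_factor = 1,
     @active_start_time      = 020000;

EXEC msdb.dbo.sp_add_jobserver
     @job_name = N'Weekly warehouse refresh';
```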

Also, the Track Data Changes (SQL Server) feature is worth a look. You can configure when you want to sync data, and it performs well too.
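For the lightweight Change Tracking flavor of that feature, the setup and delta query look roughly like this (database, table, and retention values are illustrative assumptions):

```sql
-- Illustrative only: enables Change Tracking on an assumed source database
-- and table, then reads only the rows changed since the last sync.
ALTER DATABASE SourceDb
SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 14 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Orders
ENABLE CHANGE_TRACKING
    WITH (TRACK_COLUMNS_UPDATED = OFF);

-- On each sync, ask only for rows changed since the last synced version:
DECLARE @LastVersion bigint = 0;   -- persist this value between runs

SELECT ct.OrderId,
       ct.SYS_CHANGE_OPERATION,    -- I(nsert), U(pdate), or D(elete)
       o.CustomerId, o.Amount
FROM   CHANGETABLE(CHANGES dbo.Orders, @LastVersion) AS ct
LEFT JOIN dbo.Orders AS o
       ON o.OrderId = ct.OrderId;  -- LEFT JOIN so deletes still appear

SELECT CHANGE_TRACKING_CURRENT_VERSION();  -- store for the next run
```

This avoids hand-rolled timestamp deltas entirely: the engine tells you which keys changed, including deletes, which date columns cannot capture.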


  • so you're moving across servers? or just between databases or tables on the same server?
  • Linked Servers can query each other directly, that's probably the simplest technique to use. If the data is only refreshed once a week Minimum amount of time does not seem to be that critical.
  • Thanks for the response. This helps me move further.
  • You're welcome. "Thank you"s are nice, and I do appreciate them, but up votes are better. (And I think Matt's answer deserves one, too) They tell other users that the answer is worth reading, which is useful when they're this long. Good luck on your project.
  • Eric, if the destination schema is different from the source, what is the best option to handle this using SSIS?
  • @Vasu I agree with Eric when you get enough reputation upvotes are great! I follow very similar patterns with my SSIS packages and agree with the statement that stored procedures are way better than SSIS transformations. I like SSIS for the "plumbing" moving data from A to B but most transformation is done at staging table. I will many times use my tables both as staging and as transformation tables. E.g. include extra columns specific to reporting/data warehouse then update that staging table and once transformed bring into final destination table.
  • Thanks for the reminder about upvotes and rep, @Matt. And Vasu, I agree with Matt on the staging tables. If your schemas are different, then I'd use stored procedures and staging tables in the destination database to manipulate the data as needed. I'd probably use a procedure to put the final data into the dbo layer. I've used data flows in SSIS, too, but usually work inside one database goes faster with code right there on the server.
  • Thank you Matt. The structure of the source tables is different from the destination tables; the destination tables are used for reporting. What precautions do I need to consider if we use an SSIS package?
  • Solid information, Matt. +1 on the effort toward a pretty broad question.