Dynamic Column Mapping in SSIS: Part 2 (Using a DTS Variable)

Previously, I discussed dynamic column mapping using a script component. Now I am going to cover one more way of doing dynamic column mapping in SSIS. Let’s say our source table is Table_1:

ID    JAN    FEB    MAR
1     100    200    300
2     400    500    600
3     700    800    900

We want to map its columns dynamically in a data flow task.

First, you need to create a DTS package variable called “COMMAND” in your SSIS package, with type String. Set its value to “Select 1 as Col1,2 as Col2,3 as Col3,4 as Col4”.

Now, drag a Data Flow Task onto the designer and double-click it to open the editor. In the editor, drag in an OLE DB Source and double-click to edit it. Select the connection, set the Data Access Mode to “SQL command from variable”, and choose your variable “COMMAND” as the variable name. Then click on the column mapping and click OK.

Now, drag an OLE DB Destination onto the data flow editor, connect it to the source, and double-click to edit it. Configure the destination in the usual way. Below is a screenshot:

Now, click on Mappings; it should look like below:

As you can see, the columns from the variable “COMMAND” act as the source and are mapped to the destination table’s columns. After clicking OK, go back to the control flow designer, drag a Script Task from the toolbox, place it before the data flow task, and connect it to the data flow task.

Double-click the script task and set ReadWriteVariables to the user variable “COMMAND”. Click the Edit Script button. Now write some C# code to

1. read the source table columns dynamically/programmatically

2. build an SQL command string using the column names of the source table

3. assign the SQL command string to the variable “COMMAND”.

After all this, you are ready to go: execute the package and the data will be loaded into the destination. Below is the code snippet for the three tasks above:

public void Main()
{
    // Requires "using System.Data.SqlClient;" at the top of the script.

    // Connect to the source and read one row just to obtain the column metadata.
    SqlConnection Conn = new SqlConnection("Data Source=DACW0077;Initial Catalog=Demo;Integrated Security=SSPI;");
    SqlCommand Cmd = new SqlCommand("Select top 1 * from Table_1", Conn);

    Conn.Open();
    SqlDataReader Reader = Cmd.ExecuteReader();

    // GET THE COLUMN NAMES
    string Col1_Name = Reader.GetName(0);
    string Col2_Name = Reader.GetName(1);
    string Col3_Name = Reader.GetName(2);
    string Col4_Name = Reader.GetName(3);

    Reader.Close();
    Conn.Close();

    // BUILD THE SQL COMMAND AND ASSIGN IT TO THE VARIABLE
    Dts.Variables["COMMAND"].Value = "Select " + Col1_Name + " AS Col1," + Col2_Name + " AS Col2," +
        Col3_Name + " AS Col3," + Col4_Name + " AS Col4 from Table_1";

    //MessageBox.Show(Dts.Variables["COMMAND"].Value.ToString());

    Dts.TaskResult = (int)ScriptResults.Success;
}


30 thoughts on “Dynamic Column Mapping in SSIS : Part 2 (using dts variable)”

  1. Hi Zahid,

    How do you handle tables with variable numbers of columns? I have about 30 tables, each with a different number of columns. Any help, please?
    Thanks.
    Jay

  2. Jay
    The select statement built with my script assumes that the source table has four columns. For a variable number of columns, you can read the Reader.FieldCount property to get the number of columns in the source table, then run a loop over that count to read the column names dynamically and produce the Select command string:
    int i = 0;
    string str = "";
    while (i < Reader.FieldCount)
    {
        str = str + Reader.GetName(i) + ",";
        i++;
    }
    But the main problem will be the destination table, because it is dynamic too. You can build a complete SQL statement in the script, for example "Insert into table_name " + the Select statement built previously with the loop, then execute that SQL statement. This is more of a C# programming solution than an SSIS solution.

    Or, you can develop the data flow task at run time using the C# library for DTS. You can search for it on the internet.
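    The loop idea above can be fleshed out into a small helper that aliases each source column as Col1, Col2, and so on, so the data flow metadata stays fixed. This is a minimal illustrative sketch, not part of the original post; the class and method names are hypothetical, and it assumes the column names have already been read off the SqlDataReader into an array:

```csharp
// Hypothetical helper: builds the SELECT statement for the COMMAND
// variable from the source table's column names.
public static class SelectBuilder
{
    public static string BuildSelect(string[] columnNames, string tableName)
    {
        var parts = new System.Collections.Generic.List<string>();
        for (int i = 0; i < columnNames.Length; i++)
        {
            // Alias each real column as Col1, Col2, ... so the mapping
            // in the data flow never has to change.
            parts.Add(columnNames[i] + " AS Col" + (i + 1));
        }
        return "Select " + string.Join(",", parts) + " from " + tableName;
    }
}
```

    In the script task, columnNames would be filled from Reader.GetName(i) for i from 0 to Reader.FieldCount - 1, and the result assigned to Dts.Variables["COMMAND"].Value.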

  3. Hi Zahid,
    I tried the same package as following your steps.

    but I am still getting this error:
    [SSIS.Pipeline] Error: “component “OLE DB Source” (1)” failed validation and returned validation status “VS_NEEDSNEWMETADATA”.
    Please help me out.

    An urgent issue depends on this logic.

    thank you

    1. Hi Ramana

      Just check whether the number of columns mentioned in the variable matches exactly the number of columns taken in the OLE DB Source. Let’s say your variable is “Select 1 as Col1,2 as Col2, 3 as Col3, 4 as Col4” and you design the source and destination as mentioned in my post, but at runtime the value of the variable becomes “Select 1 as Col1, 2 as Col2”. In that case you will receive the exact error you have just mentioned. Let me know if it solves your problem.

      -Zahid

      1. Hi Zahid,
        First of all thank you very much for responding to me.

        Everything is perfect; I have done it following your steps.
        The error was solved for me when I changed the data source property (ValidateExternalMetadata) to False.

        Now my new requirement is:
        my source table and destination table will change when I keep this logic in a For Each Loop container.
        So now I am converting source table columns to Col1, Col2, Col3, etc. as in our logic.
        Can I change my destination columns to Col1, Col2, Col3 as well, so that they map to the correct input columns automatically? That would make my requirement possible: loading data from different source tables to different destination tables inside a For Each Loop container.

        Thank you,
        Ramana

  4. Hi Ramana

    You can set the Data Access Mode to “Table Name or View Name Variable” in the OLE DB Destination. The variable will provide the table name; you can set different table names in your variable in a script task or in any other way.
    Note that if the number or types of columns differ between loads, this technique will not work, because the number of columns and the data types are fixed in the mapping.
    Also, if you use a script task to build your source SQL command as well as the destination table name, you can actually build the entire SQL INSERT statement in the script task itself and execute the INSERT command programmatically. That is a programming solution rather than an SSIS solution.

    Zahid
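    The per-iteration values for the source query variable and the table-name variable could be derived in a small helper before assigning them via Dts.Variables. A minimal illustrative sketch; the class name, method names, and the “_Dest” naming convention are all hypothetical, not from the original post:

```csharp
// Hypothetical helper for a For Each Loop: derive the per-iteration
// source query and destination table name from the current table name.
public static class LoadVariables
{
    // Value for the source's "SQL command from variable" mode.
    public static string SourceQuery(string tableName)
    {
        return "Select * from " + tableName;
    }

    // Value for the destination's "Table Name or View Name Variable" mode.
    public static string DestinationTable(string tableName)
    {
        return tableName + "_Dest";
    }
}
```

    Inside the script task’s Main(), something like Dts.Variables["COMMAND"].Value = LoadVariables.SourceQuery(current); would then feed the data flow for that iteration.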

    1. Yes Zahid,

      But
      I have taken care of the number/type of columns for the different loads.
      My exact problem is: although the destination table for each source table is handled properly and is supplied via “Table Name or View Name Variable”, only the first iteration (first source to first destination) succeeds. In the second iteration, the source table obviously changes and the data flow brings down the new source columns (which is OK), but those incoming columns try to map to the previous iteration’s destination columns, even though my destination table has changed via the variable.
      So, isn’t it possible for the data flow task to read the columns of the destination table and map them to the incoming columns accordingly (given that I have taken care of the number/type of columns)?

      I am sorry if I am irritating you, Zahid.

      Thank you,
      Ramana

  5. Hi Ramana

    You can use any of these three:

    1. Keep the column names same for all your destination tables.
    2. Use a script component as the destination. See my first post on dynamic column mapping, where I used a script component as a source; you can get a good idea from that post and apply it to a destination.
    3. Use only a script task (no data flow task; the programming solution I mentioned before).

    Zahid

    1. Hi Zahid,

      When I am using the Send Mail Task, I am getting the following error:
      [Send Mail Task] Error:
      An error occurred with the following error message:
      “Failure sending mail.
      System.Net.WebException:
      Unable to connect to the remote server
      System.Net.Sockets.SocketException:
      No connection could be made because the target machine actively
      refused it 192.168.1.249:25”.

      please suggest me a solution for this

  6. Hi Zahid,

    I am getting follow, please help

    Error: Failed to lock variable “COMMAND” for read/write access with error 0xC0010001 “The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.”.

    1. Check whether you forgot to add the variable in the script task editor. A variable cannot be accessed from inside your script if you do not first include it as a read/write variable.
      In some cases, I have seen this error even after the variable was added properly. If that is your case, try creating a new script task, or open your script and try to debug it in the VS editor. Hopefully, the variable will be recognized from your code afterwards.

      Let me know if it was helpful.

  7. Raman,
    How did you take care of the varying data types and number of columns for every iteration/load?

    1. If it varies that much, you have to use a script task to read the metadata of the source table, build the entire destination table at run time, and dump the data there. You will no longer be able to map columns at design time; everything will be done at run time.
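      Building the destination at run time can be sketched as a helper that turns the source metadata into a CREATE TABLE statement. Everything here is illustrative, not from the original post: the class and method names are hypothetical, and the type names would come from the source metadata (e.g. SqlDataReader.GetName / GetDataTypeName):

```csharp
// Hypothetical helper: builds a CREATE TABLE statement for the destination
// from column names and SQL type names read off the source metadata.
public static class DdlBuilder
{
    public static string BuildCreateTable(string table, string[] names, string[] sqlTypes)
    {
        var cols = new System.Collections.Generic.List<string>();
        for (int i = 0; i < names.Length; i++)
        {
            // Bracket each name to survive spaces or reserved words.
            cols.Add("[" + names[i] + "] " + sqlTypes[i]);
        }
        return "Create table [" + table + "] (" + string.Join(",", cols) + ")";
    }
}
```

      The script task would execute the resulting statement with a SqlCommand, then issue an INSERT ... SELECT to dump the data.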

  8. Hey Zahid,

    Thanks a lot for this excellent post!! My requirement is somewhat along similar lines: I also need to dynamically map columns. However, based on the business process (finance vs. sales), my data sources (stored procedures querying various different tables) will change too. I am a complete novice in SSIS and was wondering if you could please help me out with my requirement.

    1. Hi Chai

      Is it possible to make two separate data flow tasks for the two different business processes? That might be easier to deal with. In my post, it was assumed that the number of columns is fixed but the column names can vary. What about your case? I guess the number of columns will also change. So, if you want to follow this approach, you have to write a stored procedure which returns the appropriate select statement based on the business process, then assign that text to the variable. Now, it might happen that for Finance the select statement is ‘Select A,B,C from TableFinance’ while for Sales it is ‘Select A,B,C,D from TableSales’. To make them look alike, the Finance statement should be ‘Select A,B,C, NULL AS D from TableFinance’. And yes, at the beginning you have to prepare the data flow task for four columns, as 4 is the maximum.
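      The NULL-padding trick above can be expressed as a small helper that pads any column list out to the fixed width the data flow was designed for. This is a hypothetical sketch (class name, method name, and the “ColN” aliases for the padding columns are all illustrative):

```csharp
// Hypothetical helper: pads a SELECT list with "NULL AS ColN" so every
// business process returns the same fixed number of columns.
public static class QueryPadder
{
    public static string BuildSelect(string[] columns, int fixedWidth, string table)
    {
        var parts = new System.Collections.Generic.List<string>();
        for (int i = 0; i < fixedWidth; i++)
        {
            // Real columns keep their names; missing positions become NULLs.
            parts.Add(i < columns.Length ? columns[i] : "NULL AS Col" + (i + 1));
        }
        return "Select " + string.Join(",", parts) + " from " + table;
    }
}
```

      For the Finance case this produces the padded statement, so both processes feed the same four-column data flow.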

  9. I am new to SSIS and have a query. I have 10 files in a source folder, each containing a 1-row header and a 1-row footer, and the 10 files have different columns. I need to delete the header and footer, save the footer count in a variable, and then move each file to a process folder. Please let me know; it is urgent for my project.

    1. Your requirement is not 100% clear to me. You would have to use a script task and pure C# coding to load the data into the table. Maybe you can use a configuration table/file somewhere which stores each file name and its respective column information. Then you can use that information to move files from one location to the other. You can also build the destination table at run time using C# code and dump the data into that table. Let me know if it helps.

    1. Hi Mohan

      DTS means Data Transformation Services. In the past (in SQL Server 2000), DTS did the job of SSIS; SSIS is in fact an updated version of DTS. Even now, some components and terminologies use the word ‘DTS’.

  10. Hi Zahid,

    How is the actual source table connected to the OLE DB Source to read from?

    I am getting only the first row of data, as 1, 2, 3, 4…

    Thanks

  11. Not sure if I understood you correctly. Did you see the C# code snippet towards the end of the original post? A script task must be used to connect to the source and build the COMMAND variable correctly.

  12. Hi @Zahid,

    We have around 1700 tables in Oracle, and the same 1700 tables exist in SQL Server. Each table’s columns vary frequently, but at any point in time the source and destination columns are the same.
    I need to use dynamic column mapping for both source and destination.

    I have followed the approach below:
    1) I created a staging table holding the table name, source query (Oracle), and target query (SQL Server), used that table in an Execute SQL Task, and stored the result set as a full result set.

    2) I created a For Each Loop container over that Execute SQL Task result set, with the object and 3 variables: table name, source query, and destination query.

    3 a) In the data flow task source, I chose an OLE DB source with the Oracle connection and set the data access mode to “SQL command from variable” (passing the source query from the loop’s mapped variable).

    3 b) In the data flow task destination, I chose an OLE DB destination with the SQL Server connection and set the data access mode to “SQL command from variable” (passing the target query from the loop’s mapped variable).

    I am looping over all 1700 tables, but it is not working. Can you please guide us on how to set this up dynamically for 1700 tables from Oracle to SQL Server using SSIS?

    1. Hi Vikas

      “passed Target query from loop mapping variable” – did you mean “Target Table”?
      What is the exact error you are getting?

  13. Hi Zahid,
    I have an Excel file source and the destination is OLE DB. In my source Excel file there are columns like Oct_2019, Nov_2019, Dec_2019, and in the destination we map them as previous_month2, previous_month1, current_month. My issue is that when the next month arrives (e.g. Jan_2020 in my Excel source), it should map to current_month automatically, but it throws an error. Can you provide a solution for this?

    1. No, we are remapping manually on the first day of every month, but I need it to map dynamically, because the package is scheduled and fails on the first day of every month.
