I am somewhat new to SSIS and have a question on branching/control flow. We have 3 manufacturing facilities on AS400. The requirement was to create independent SSIS packages so that if a certain facility was down, it would not interrupt or fail the other facilities. To be honest, I didn't want to maintain 3 separate per-plant packages, so I decided to create one package that takes a command-line parameter, and to generate separate batch files (a requirement of our archaic scheduling system) that pass in the plant to be processed.
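(For illustration only: each generated batch file could simply call dtexec against the one shared package and set a package variable to the plant being processed. The package path and the User::PlantNumber variable name below are placeholders, not details from the original post.)

    @echo off
    rem Generated per plant; the scheduler calls this file.
    rem Package path and variable name are assumptions for this sketch.
    dtexec /FILE "C:\SSIS\PlantLoad.dtsx" ^
           /SET "\Package.Variables[User::PlantNumber].Properties[Value];100"
    rem Propagate a non-zero dtexec exit code back to the scheduler.
    if errorlevel 1 exit /b 1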
Everything works fine: the package analyzes the passed parameter with no problem and evaluates it when calling stored procedures in a SQL Task. I am required (I think - let me know if I'm wrong here) to have 3 separate Data Flow Tasks depending on which plant is being processed, because they use different connections (same server, different DBs - AS400). Which Data Flow Task runs is based on a precedence constraint that evaluates the passed-in plant number.
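(Sketch of that branching: each precedence constraint leading from the upstream task to a plant's Data Flow Task can be set to "Expression and Constraint" with an expression on the plant variable. The variable name and plant codes are placeholders, and the quotes assume a string variable; drop them for a numeric one.)

    To "Load Plant 100" Data Flow Task:  @[User::PlantNumber] == "100"
    To "Load Plant 200" Data Flow Task:  @[User::PlantNumber] == "200"
    To "Load Plant 300" Data Flow Task:  @[User::PlantNumber] == "300"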
After the Data Flow Tasks are complete, I'd like to merge back into one path again (even though only 1 out of N will run at a time), because the remaining steps are all simply stored proc calls with different parameters. It seemed silly to me to have the same tasks repeated 3 times. I looked into starting tasks off of events, but I don't know a lot about that approach, and it seems as though I would still be maintaining 3 different paths anyway.
Can anyone point me in the right direction?
P.S. (Even if the solution is to have 1 Data Flow Task, I'm still interested in knowing how to converge paths back together.)
You can take an upstream control flow item and hook it to three data flows. Then, coming out of each data flow, you can hook them up to one downstream control flow item. You'll just have to work with your precedence constraints to ensure that the whole package doesn't fail if one of them doesn't run.|||If I'm understanding you correctly, that's exactly what I tried; I should have mentioned that. The problem is that the package stops (with success) after the data flow is complete. I assumed it was because all but one of the paths coming into the post-data-flow task never executed (which is what I want), so they never passed a completion or success flag to the first post-data-flow task. So in short, everything runs great; it just stops as soon as the data flow task is complete.
Is there some other condition that I can put on the post-data-flow constraints? Or another toolbox item that I can use to connect them?
BTW. I appreciate the quick response!
|||You can change the precedence of those connectors. Double-click on the connector line and you'll see the options available to you. Look at using "OR" for the multiple constraints option.|||Yahtzee!!
Totally my bad. I had THOUGHT I set them all to "OR", but your suggestion made me check again. One was set to "AND".
Much appreciated!
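(For reference, a rough sketch of how the converging constraints end up once the fix above is applied: each of the three constraints coming out of the Data Flow Tasks into the shared post-load task is set, in the precedence constraint editor, to:

    Evaluation operation : Constraint
    Value                : Success
    Multiple constraints : Logical OR

With Logical OR, the downstream task starts as soon as any one incoming constraint is satisfied, so the two branches that never executed no longer hold it back; the designer draws OR constraints as dashed lines.)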