SQL11UPD05-TSCRIPT-01
This wiki is a transcript of a previously recorded video.
Welcome to this presentation introducing SQL Server 2012 Integration Services Improvements. My name is Peter Myers; I’m a SQL Server MVP and a mentor with SolidQ.
In this presentation we’ll introduce the new improvements and features in SQL Server Integration Services 2012. We’ll also discuss the migration options for existing Integration Services investments, and the opportunities this release presents for developers.
This is the fourth release of Integration Services, the first having shipped with SQL Server 2005, and with this release come many new improvements and features: better usability for new developers, improved productivity for developers, improved deployment, configuration and management, and other miscellaneous improvements.
When we talk about SQL Server and its Enterprise Information Management (EIM), collectively we talk about three services that deliver this capability. In this module we are focusing on Integration Services, just one of the three Enterprise Information Management services. Integration Services is primarily designed to implement ETL processes – it provides a robust, flexible, fast, scalable and extensible architecture. With reference to Enterprise Information Management, we are referring to complete, current, consistent and clean data, and it’s Integration Services that predominantly provides the complete and current capabilities.
In later presentations we’ll explore Master Data Services and the new one, Data Quality Services.
New improvements and features for new developers. What you will find is that when you create a project in SQL Server Data Tools (SSDT), there’s a Getting Started window that provides links to samples and videos to help you become productive more quickly. The new SSIS Toolbox exposes descriptions of, and links for, selected items that you see in the toolbox, and allows categorization, including Favorites. There are also Data Flow connection assistants that facilitate the configuration of sources and destinations in the data flow.
This is a great one. Developers have been asking for this for some time, and that’s the ability to undo and redo, very similar to what you have in Office with Ctrl+Z and Ctrl+Y; that capability is now fully supported in the package designer. The data flow Script component now supports debugging inside SQL Server Data Tools. The designer has been updated in its look and feel, so it is much more slick, with a nice new set of icons that describe all of the components. Different colors, too, are used to describe the paths in the data flow. There is also the ability to zoom, and with the zoom slider you can see the design of your package at different levels. So that’s the updated look and feel.
The new VSTA scripting environment now has support for .NET 4.0; there’s auto-save and recovery; there’s improved performance when opening packages, which will speed up your productivity when opening a project or a package; and there’s the ability to define connection managers at the project level. We’ll talk about the new project deployment model – in this model we can define project connection managers, and these are then available automatically to all packages in the project. Icon markers are used to indicate when expressions are used in tasks, so you’ll see, in the top left corner of the task, the fx symbol that denotes that there are dynamic assignments to some of the properties.
There are improvements also to the Variables window: when variables are created, they will automatically be created at the package scope, not the scope of the item selected in the designer, and there is a button that allows changing a variable’s scope. There’s a new data flow column mapper that facilitates the mapping of columns, particularly when you are editing a data flow and you disrupt its design by changing something within it. It’s a more convenient way to fix any column mappings that are broken.
Automatic column mapping where the name and data type match is also supported. There are simplified data viewers – these support viewing the data when executing a package in SQL Server Data Tools, simplified in the sense that there is now only the grid viewer; so you right-click a path, add a data viewer, and that data viewer is then configured.
There is also a simplified package format, and this makes it easier to diff changes. There are enhanced data flow components, and the Pivot transform and the Row Count transform now actually have a user interface – it’s no longer the Advanced Editor. Among the new control flow tasks, the CDC Control task facilitates working with Change Data Capture (CDC), as do the new CDC Source and CDC Splitter data flow components. The Expression task allows for assigning an expression to a variable in the control flow, and among the new data flow components we also have the ODBC source and destination and the DQS Cleansing transform to integrate cleansing from your Data Quality Services (DQS) Knowledge Base. Much more on that in a later presentation.
For improved deployment, configuration and management, there is the Integration Services catalog. Essentially, this is a user database named SSISDB, hosted on a SQL Server relational instance, that contains a collection of views and stored procedures providing a T-SQL API. A new model for bundling project resources together is available in what is known as the project deployment model. When you create an Integration Services project that uses this model, the build produces an .ispac file. You can deploy this directly from SQL Server Data Tools to the SSIS catalog, or you can hand the .ispac file off to a DBA who can then go ahead and install your project. In addition, a new parameter model is provided to simplify configuration management. These are similar to parameters in programming functions – essentially they are read-only variables. They can be defined at project scope, in which case all packages in the project have those parameters, or they can be defined at package scope. In addition, we can configure parameters to be sensitive, which ensures their values are encrypted, or to be required, which means explicit values must be provided at package execution time.
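As a sketch of what this T-SQL API looks like, the following script creates and starts an execution for a deployed package, overriding a project parameter along the way. The folder, project, package, and parameter names here are invented for illustration; the `catalog.create_execution`, `catalog.set_execution_parameter_value`, and `catalog.start_execution` stored procedures are part of the SSISDB catalog.

```sql
-- Hypothetical names: folder 'ETL', project 'SalesETL', package 'LoadSales.dtsx',
-- and a project parameter ServerName -- substitute your own deployment's values.
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
    @folder_name  = N'ETL',
    @project_name = N'SalesETL',
    @package_name = N'LoadSales.dtsx',
    @execution_id = @execution_id OUTPUT;

-- Override a parameter for this execution only
-- (object_type 20 = project parameter, 30 = package parameter).
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 20,
    @parameter_name  = N'ServerName',
    @parameter_value = N'PRODSQL01';

EXEC SSISDB.catalog.start_execution @execution_id;
```

This is exactly the kind of script a DBA could wrap in a SQL Agent job step after receiving the .ispac file from the development team.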
Now, connection manager properties are automatically exposed in the SSIS catalog. We can also create and manage environments, which are collections of environment variables – not to be confused with the environment variables defined at the operating system level. As an example, you could have production and test environments, each with the same variables by name but different values assigned: potentially different connection strings, configurations and so on. At execution time, a package in the same folder can then apply an environment and bind the environment variables to parameters of the package. There is automatic capture of package execution logs. This is on by default, so any package that is executed in the SSIS catalog will log basic logging information. This is governed by a server-wide logging level property that is set to Basic by default, and you can override this for an individual package execution – you can change it to None or to another logging level. Data tap functionality, which is also new, is used to dynamically capture data rows as they flow through the data flow. Without any need to modify the package, you can execute the package and have the output of a path in the data flow sent to a file. Finally, the server can be managed using PowerShell.
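To make the data tap and logging level points concrete, here is a sketch using the `catalog.add_data_tap` stored procedure. The package, task path, and data flow path identification string are hypothetical – you would copy the real values from your package – and the tap must be added after the execution is created but before it is started; the captured rows are written to the DataDumps folder of the server's DTS installation.

```sql
-- All object names below are illustrative placeholders.
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
    @folder_name  = N'ETL',
    @project_name = N'SalesETL',
    @package_name = N'LoadSales.dtsx',
    @execution_id = @execution_id OUTPUT;

-- Tap the rows flowing down one path of the data flow, no package change needed.
EXEC SSISDB.catalog.add_data_tap
    @execution_id            = @execution_id,
    @task_package_path       = N'\Package\Load Sales',   -- PackagePath of the Data Flow task
    @dataflow_path_id_string = N'Paths[Source Output]',  -- IdentificationString of the path
    @data_filename           = N'SalesTap.csv';

-- Override the server-wide logging level for this execution only
-- (object_type 50 = server property; 0 = None, 1 = Basic, 2 = Performance, 3 = Verbose).
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 50,
    @parameter_name  = N'LOGGING_LEVEL',
    @parameter_value = 0;

EXEC SSISDB.catalog.start_execution @execution_id;
```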
Built-in reports are another new capability, allowing you to gain visibility over your environment within the SSIS catalog. There is the Integration Services Dashboard, which allows you to view all packages that have run on the server in the past 24 hours; there are Performance reports, which allow you to view a package’s performance over time; and there are Error Message reports. There is also the ability to implement custom reports based on the views inside the SSIS catalog. The great thing about these reports is that they include many drill-through experiences, so you can just click and navigate through the reports to discover the information of interest to you.
Other miscellaneous improvements. There are annotations that auto-grow, wrap when you press Enter, and are persisted as clear text. There are expression improvements: the 4,000-character limit has been removed, and new functions have been added to the expression language, including REPLACENULL, LEFT, TOKEN, and TOKENCOUNT. It is now also possible for flat files to support embedded qualifiers and a variable number of columns per row. What this means is that a particular value in a column could actually contain a comma or some other such delimiter, and TOKEN allows you to extract a particular item from such a delimited string. Raw files that are produced in the data flow using a Raw File destination now include sort information as part of the metadata, and there are improved error messages for the Execute SQL task, along with the option to allow automatic type conversion in the Execute SQL task as well.
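To illustrate the new expression functions, assuming a delimited string such as "one,two,three", they behave roughly as follows (each line is a separate expression, with its evaluated result noted alongside; the variable name in the last line is hypothetical):

```
TOKENCOUNT("one,two,three", ",")          evaluates to 3
TOKEN("one,two,three", ",", 2)            evaluates to "two" (occurrence is 1-based)
REPLACENULL(@[User::MaybeNull], "n/a")    substitutes "n/a" when the variable is NULL
```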
From the point of view of migration: projects that were developed in SQL Server 2005, 2008, and 2008 R2 can be converted to the new project deployment model which, as we have already discussed, produces an .ispac file as output, which can then be deployed to the new SSIS catalog. Projects are converted by using the Integration Services Project Conversion Wizard, which is launched either from SSDT or from SQL Server Management Studio (SSMS). The wizard can update Execute Package tasks and replace package configurations with the new parameters.
So be aware that migration of Data Transformation Services (DTS) solutions is not supported. All DTS components have been removed from the SQL Server 2012 release: the DTS Migration Wizard is gone, the Execute DTS 2000 Package task has been removed, and DTS support in SSMS has been removed. In addition, the ActiveX Script task has been removed.
So we include a demonstration that is in a separate recording that will show you how to develop and monitor a SQL Server 2012 Integration Services solution. This is a great demonstration to see how a legacy project developed in SQL Server 2008 R2 can be converted to the project model and how the new configuration with package parameters can be used effectively. We then go ahead and deploy that project to the SSIS catalog and we have the opportunity to execute and explore the environment through the built-in reports.
For developer opportunities, what do we have in Integration Services 2012? You can manage the SSIS catalog by using the public T-SQL API, or there is also a .NET interface. This opens up opportunities to develop management tools and solutions, develop custom monitoring and dashboard reports and embed them into SSMS, embed functionality into solutions – for example, package execution can be launched from an application – and automate project deployment as part of an installer package. Consistent with previous releases of Integration Services, you can develop dynamic packages or Integration Services extensions by using the object model. This could include custom tasks, custom data flow components (transformations, data sources, and destinations), data source connectors, and ForEach enumerators.
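As a small example of building custom monitoring on the public views, a query over the documented catalog.executions view might look like the following sketch; the view and its columns are part of the SSISDB catalog, while the TOP count and ordering are arbitrary choices for this illustration.

```sql
-- Most recent package executions on the server, newest first.
SELECT TOP (20)
    e.execution_id,
    e.folder_name,
    e.project_name,
    e.package_name,
    e.start_time,
    e.end_time,
    e.status   -- 1 created, 2 running, 3 canceled, 4 failed, 5 pending,
               -- 6 ended unexpectedly, 7 succeeded, 8 stopping, 9 completed
FROM SSISDB.catalog.executions AS e
ORDER BY e.start_time DESC;
```

A query like this could back a custom SSMS report or an application dashboard, drilling from a failed execution into catalog.event_messages for the detail.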
So in summary, we see that in this fourth release of Integration Services there have been many, many improvements around the developer experience to enhance usability and productivity; improvements for deployment, configuration and management with the SSIS catalog; and numerous other miscellaneous enhancements, many asked for by customers and now delivered in this upcoming release with SQL Server 2012. In addition, there is a straightforward, and optional, conversion path from Integration Services 2005, 2008, and 2008 R2.
We’ll leave you with some resources, and I think a great resource, particularly for becoming productive quickly with Integration Services 2012, is the new Getting Started window that provides you links to videos and resources. In addition, the TechEd 2011 recordings provide an introduction to what’s new in SQL Server – formerly known by the code name Denali, but now officially known as SQL Server 2012. We have the Integration Services team blog and, of course, Books Online for Integration Services and SQL Server.
There is a hands-on lab you can help yourself to that explores many of the new features of Integration Services; it should be accessible from where you located this video. Thank you very much for watching this presentation.