FIGURE 11.3
Text qualifiers are needed when commas occur in the data of a comma-delimited text file. Use the Transform Data task for these files.
The second way to create a format file is to use the bcp utility interactively. Open a Command Prompt and type in a bcp command. The following command could be used to generate the format file in Listing 11.1:

bcp pubs.dbo.stores out c:\temp\stores.txt -Usa

The bcp utility will ask you a number of questions about the fields in this bulk copy. One of the last questions you will be asked is whether or not you want to create a format file. If you say yes, you will be asked for the host filename, which is used as the name of the format file that will be created.
Reconciling Differences Between the Source and the Destination
By default, a bulk insert takes data from the fields of a source file and puts it into the same number of fields, using the same order, in the data destination. If you don't have the same number of fields or if the fields are in a different order, you usually have three options:

• Use a view in place of the destination table. Create the view so that its fields line up with the fields of the source text file. This is usually the easiest option to implement.

• Use a format file. This option is usually harder to implement, but it gives the most flexibility.

• Change the destination table so its fields match the fields in the text file.
Extra Fields in the Data Destination Table
You may have fields in the destination table that do not exist in the source text file, as shown in the following example.

The destination is the Stores table in the Pubs database, which has the following fields:

stor_id, stor_name, stor_address, city, state, zip

The source text file is missing the last three fields:

1110    Eric the Read Books    788 Catamaugus Ave.
2220    Barnum's               567 Pasadena Ave.
3330    News & Brews           577 First St.
You could use the following view as the destination for this Bulk Insert task:
create view vwStoresForBulkInsertFewerFields as
select stor_id, stor_name, stor_address from stores
This code and the code for the following create table and create view items are on the book's CD.

If you want to handle the missing fields with a format file instead, you could follow these steps:
1. Create a temporary table that has the same structure as the source data file:

create table tmpStoresForBulkInsertFewerFields (
	[stor_id] [char] (4) NOT NULL ,
	[stor_name] [varchar] (40) NULL ,
	[stor_address] [varchar] (40) NULL )
2. Generate a format file using the temporary table as the destination for the Bulk Insert task. Your generated format file will look like this:

8.0
3
1 SQLCHAR 0 4 "" 1 stor_id SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 40 "" 2 stor_name SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 40 "\r\n" 3 stor_address SQL_Latin1_General_CP1_CI_AS
3. Add the missing fields in the order they appear in the destination, using 0 for the column length and 0 for the column order field.

4. If you have a row delimiter (in the example, the newline character), move that to the last line.

5. Change the number in the second row of the format file to the number of fields in the destination table.

When you are done, your format file should look like Listing 11.3.
LISTING 11.3 This Format File Accommodates Extra Fields in the Data Destination Table

8.0
6
1 SQLCHAR 0 4 "" 1 stor_id SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 40 "" 2 stor_name SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 40 "" 3 stor_address SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 0 "" 0 city SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 0 "" 0 state SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 0 "\r\n" 0 zip SQL_Latin1_General_CP1_CI_AS
The files for this example are on the book's CD as FewerFieldsInSource.txt and FewerFieldsInSource.fmt.
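The same format file can also be used with the equivalent Transact-SQL BULK INSERT command. As a rough sketch (the c:\temp\ paths are assumptions; substitute the locations where you copied the files from the CD):

BULK INSERT pubs.dbo.stores
FROM 'c:\temp\FewerFieldsInSource.txt'
WITH (FORMATFILE = 'c:\temp\FewerFieldsInSource.fmt')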
Rearranging Fields When Moving from Source to Destination
It's easier when you have the same fields in the source and the destination, but they're in a different order. For example, you could have a text file to import into stores that has the correct six fields, but the field order in this text file is stor_name, stor_id, stor_address, city, state, zip:

Eric the Read Books    1100    788 Catamaugus Ave.    Seattle     WA    98056
Barnum's               2200    567 Pasadena Ave.      Tustin      CA    92789
News & Brews           3300    577 First St.          Los Gatos   CA    96745

The view that you could create to use as the destination table is as follows:
create view vwStoresForBulkInsertRearrange as
select stor_name, stor_id, stor_address, city, state, zip from stores
If you want to do the rearranging with a format file, start by generating the normal format file, which will look like Listing 11.4.

LISTING 11.4 A Generated Format File

7.0

LISTING 11.5 Switching the Numbering in Column 6 Reorders Fields as They Enter the Destination Table

7.0
6
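As a sketch of the reordering that Listing 11.5 describes (tab delimiters and the column widths of the pubs stores table are assumed here; treat the exact values as illustrative, not as the listing itself), the rows of such a format file could look like this:

1 SQLCHAR 0 40 "\t" 2 stor_name SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 4 "\t" 1 stor_id SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 40 "\t" 3 stor_address SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 20 "\t" 4 city SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 2 "\t" 5 state SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 5 "\r\n" 6 zip SQL_Latin1_General_CP1_CI_AS

The sixth column maps each field in the text file to its column number in the stores table, so the file's first field (stor_name) lands in the table's second column.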
Extra Fields in the Source Text File
If the text file being used as the source for a Bulk Insert task has more fields than the destination, using a view is not an option. The easiest way to handle this situation is to create the extra fields in the destination table. If you don't want to do that, you can use a format file.

In this example, your source text file has the six fields for the Stores table but also has three extra fields—stor_type, stor_descript, and manager_name:
1111,John Doe,Eric the Read Books,788 Catamaugus Ave.,Seattle, WA,98056,discount,good books
2222,Dave Smith,Barnum’s,567 Pasadena Ave.,Tustin, CA,92789,historical,better books,
3333,Jane Doe,News & Brews,577 First St.,Los Gatos, CA,96745,current events,best books
You could follow these steps:
1. Create a temporary table that has the same structure as the source data file:

create table tmpStoresForBulkInsertExtraFields (
	[stor_id] [char] (4) NOT NULL,
	[manager_name] char(40) NULL,
	[stor_name] [varchar] (40) NULL,
	[stor_address] [varchar] (40) NULL,
	[city] [varchar] (20) NULL,
	[state] [char] (2) NULL,
	[zip] [varchar] (50) NULL,
	[stor_type] char(40) NULL,
	[stor_descript] char(40) NULL )
2. Generate a format file using the temporary table as the destination for the Bulk Insert task. Your generated format file will look like this:

7.0
9

3. Change the column order field to 0 for the extra fields that do not exist in the destination table, so they are skipped when the data is loaded into the destination.

When you're done, the format file should look like Listing 11.6.

LISTING 11.6 Adding Additional Fields with a Format File

7.0
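A format file of this kind maps the three extra source fields to destination column 0 so they are skipped. As a rough sketch (the field widths come from the temporary table above and the comma delimiters from the sample data; treat it as an illustration rather than the exact listing):

1 SQLCHAR 0 4 "," 1 stor_id SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 40 "," 0 manager_name SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 40 "," 2 stor_name SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 0 40 "," 3 stor_address SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 0 20 "," 4 city SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 0 2 "," 5 state SQL_Latin1_General_CP1_CI_AS
7 SQLCHAR 0 50 "," 6 zip SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 0 40 "," 0 stor_type SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 0 40 "\r\n" 0 stor_descript SQL_Latin1_General_CP1_CI_AS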
The files for this sample are on the book's CD as ExtraFieldsInSource.txt and ExtraFieldsInSource.fmt.
Other Properties of the Bulk Insert Task
The Bulk Insert task has many additional properties. Most of them can be set on the Options tab of the Bulk Insert Task Properties dialog, as shown in Figure 11.4.

FIGURE 11.4
Many settings on the Options tab of the Bulk Insert Task Properties dialog greatly affect performance.

The code sample at the end of this chapter shows how to set all these properties in Visual Basic code.
Check Constraints
When this option is selected, the data is checked for compliance with all constraints as it is added to the destination table. By default, constraints are ignored when adding records with a Bulk Insert:

Default value: False
Effect on performance: Decreases performance when selected
Object property: CheckConstraints
Equivalent parameter of the Bulk Insert command: CHECK_CONSTRAINTS
Equivalent parameter of bcp: -h "CHECK_CONSTRAINTS"
You enable constraints in Transact-SQL code with the CHECK CONSTRAINT parameter of the ALTER TABLE statement. You can disable them with the NOCHECK CONSTRAINT parameter. Selecting or not selecting this property implements identical behavior for the Bulk Insert task, although other data modifications taking place at the same time will still have those constraints enforced.
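As a sketch, using the stores table from this chapter's examples:

-- Disable constraint checking before a series of data modifications
ALTER TABLE stores NOCHECK CONSTRAINT ALL
-- Re-enable constraint checking afterward
ALTER TABLE stores CHECK CONSTRAINT ALL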
The Bulk Insert task runs more quickly if the constraints are not checked. You can create an Execute SQL task that checks for and processes any records that have been entered into the table that violate the table's constraints. Set this Execute SQL task to take place upon the successful completion of the Bulk Insert task.
NOTE

You can write an UPDATE command that causes all the constraints to be enforced and all the update triggers to fire. The command will fail if any record in the table violates one of the constraints.

All the update triggers will be run by this command. If you take all your insert triggers and also make them update triggers, this code activates all the triggers that were missed during the Bulk Insert. If any of the triggers fails to be successfully completed, this update command will also fail.

You need more complex code to clean up the data if it fails this constraint and trigger test.
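One common form of such a command is a dummy update that sets columns equal to themselves. A sketch against the stores table (this exact statement is an illustration, not code from the book's CD):

UPDATE stores
SET stor_id = stor_id, stor_name = stor_name, stor_address = stor_address,
    city = city, state = state, zip = zip
-- Updating every column forces the constraints that involve those columns
-- to be rechecked and causes the table's update triggers to fire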
Keep Nulls
Selecting this option causes null values to be inserted into the destination table wherever there are empty values in the source. The default behavior is to insert the values that have been defined in the destination table as defaults wherever there are empty fields.

Default value: False
Effect on performance: Improves performance when selected
Object property: KeepNulls
Equivalent parameter of the Bulk Insert command: KEEPNULLS
Equivalent parameter of bcp: -k
A Bulk Insert task that keeps nulls could run faster. You can create an Execute SQL task after the Bulk Insert that will apply the table's defaults. Here is a SQL statement that puts the default value into all the PhoneNumber fields that have empty values:

Update tblCustomer
Set PhoneNumber = Default
Where PhoneNumber Is Null

This strategy assumes that there are no records in the PhoneNumber field where you intentionally want to place a Null value.

Enable Identity Insert
This option allows the insertion of values into an Identity column in the destination table.

Default value: False
Effect on performance: Negligible
Object property: KeepIdentity
Equivalent parameter of the Bulk Insert command: KEEPIDENTITY
Equivalent parameter of bcp: -E
There are three possible ways to handle a Bulk Insert into a table that has an identity column:
• If you want to ignore the values for the identity column in the source data file, leave the default setting of False for this property. The table's identity column will be filled with automatically generated values, as in a normal record insert.

• If you want to keep the values for the identity column that are in your source data file, select this option. SQL Server sets the IDENTITY_INSERT option on for the Bulk Insert and writes the values from the text file into the table.

• If your text file does not have a field for the identity column, you must use a format file. This format file must indicate that the identity field is to be skipped when importing data, as sketched after this list. The table's identity column will be filled with the automatically generated values.
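Following the convention used in Listing 11.3, the format file row for an identity column that has no field in the text file uses 0 for the column length and 0 for the column order. For a hypothetical identity column named row_id (not a column in the pubs stores table), such a row might look like this:

4 SQLCHAR 0 0 "" 0 row_id SQL_Latin1_General_CP1_CI_AS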
Table Lock
SQL Server has a special locking mechanism that is available for bulk inserts. Enable this mechanism either by selecting this property or by using sp_tableoption to set the "table lock on bulk load" option to True.

Default value: False
Effect on performance: Significantly improves performance when selected
Object property: TableLock
Equivalent parameter of the Bulk Insert command: TABLOCK
Equivalent parameter of bcp: -h "TABLOCK"
When this special locking mechanism is enabled, a bulk insert acquires a bulk update lock. This lock allows other bulk inserts to take place at the same time but prevents any other processes from accessing the table.

If this property is not selected and the "table lock on bulk load" option is set to False, the Bulk Insert will acquire individual record locks. This significantly reduces the speed of the Bulk Insert task.
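A sketch of the sp_tableoption alternative, using the stores table from this chapter's examples:

EXEC sp_tableoption 'stores', 'table lock on bulk load', 'true'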
Sorted Data
By default, the Bulk Insert task processes the records in the data file as if they were in no particular order. Setting this property to true improves the performance of a bulk insert if the following three requirements are met:

• A clustered index exists on the table.
• The data file is in the same order as that clustered index.
• The order specified by the SortedData property matches the ordering of the table's clustered index.

Default value: Not selected. Empty string for property value.
Effect on performance: Improves performance when selected, but only if all the requirements for its proper use are met
Object property: SortedData, which holds the string specifying the sort order
Equivalent parameter of the Bulk Insert command: ORDER
Equivalent parameter of bcp: -h "ORDER (<Ordering String>)"
If the table does not have a clustered index, or an ordering other than the clustered index is specified, this property is ignored.

The ordering string is constructed in the same way as the syntax of the ORDER BY clause in a SQL statement. If the ordering of customers were alphabetical by city and oldest to youngest within a city, the ordering string would be

City, Age DESC
Code Page
This option specifies the code page that has been used for the data in the source file. This property affects the Bulk Insert only in cases where there are characters with values less than 32 or greater than 127.

Default value: OEM
Other possible values: ACP, RAW, a specific code page number
Effect on performance: Usually none
Object property: CodePage
Equivalent parameter of the Bulk Insert command: CODEPAGE
Equivalent parameter of bcp: -C
Data File Type
There are two choices to make in this property—the choice between char and native data types, and the choice between regular character fields and Unicode character fields.

If you have Unicode data in your data, you must use widechar or widenative to bulk insert your data. The native data types are used for files that have been created by bulk copying data out of SQL Server with bcp. If you are using text files to transfer data between two SQL Server databases, using native mode improves performance.

Default value: char

Here are all the possible values, with their constants:
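These constants come from the DTSBulkInsert_DataFileType enumeration in the DTS object library; the numeric values are listed here for reference and are worth confirming against the type library:

char: DTSBulkInsert_DataFileType_Char (0)
native: DTSBulkInsert_DataFileType_Native (1)
widechar: DTSBulkInsert_DataFileType_WideChar (2)
widenative: DTSBulkInsert_DataFileType_WideNative (3)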
Object property: DataFileType
Equivalent parameter of the Bulk Insert command: DATAFILETYPE
Equivalent parameter of bcp: -n for native, -w for wide character
Insert Commit Size
By default, all records are inserted into the destination table as a single transaction. This property allows for fewer records to be included in each transaction. If a failure takes place during the Bulk Insert, all inserts in the current transaction are rolled back. If some batches have already been committed, those records stay in the destination database.

Default value: Not selected. Batch size is 0, indicating that all records are to be inserted in a single batch.
Object property: BatchSize
Equivalent parameter of the Bulk Insert command: BATCHSIZE
Equivalent parameter of bcp: -b

This is one of the places where the Bulk Insert properties do not exactly match the parameters of the Bulk Insert Transact-SQL command. Two parameters are available in the Transact-SQL command that are not available when you're doing a Bulk Insert task. KILOBYTES_PER_BATCH and ROWS_PER_BATCH are both used by SQL Server to perform the Bulk Insert more efficiently.
Maximum Errors
This property specifies the maximum number of allowable errors before the Bulk Insert task is terminated. (It is not possible to set this property in the interface. It must be done in code or with Disconnected Edit.)
NOTE

I have not been able to use this property with the Bulk Insert task. No matter what values I set for the MaximumErrors property and the BatchSize property, the task still fails with the first error.

Default value: 10
Effect on performance: None
Object property: MaximumErrors
Equivalent parameter of the Bulk Insert command: MAXERRORS
Equivalent parameter of bcp: -m
Only Copy Selected Rows, Starting with Row, and Stopping at Row
These properties allow you to choose to include only a particular range of records from the source data file in your bulk insert.

Default values: Not selected, 0, and 0. All the records in the file are included in the bulk insert.
Effect on performance: None
Object properties: FirstRow and LastRow
Equivalent parameters of the Bulk Insert command: FIRSTROW and LASTROW
Equivalent parameters of bcp: -F and -L
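As a sketch of the corresponding options on the Transact-SQL BULK INSERT command (the file path and row numbers are made up for illustration):

BULK INSERT pubs.dbo.stores
FROM 'c:\temp\stores.txt'
WITH (FIRSTROW = 2, LASTROW = 1001)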
Creating a Bulk Insert Task in Visual Basic
I have created a Visual Basic procedure, fctCreateBulkInsertTask, that creates a connection, a step, a task, and a custom task for a Bulk Insert task.

You can find the code for this procedure in several forms on the book's CD in the directory for Chapter 11:

• In a Visual Basic project, with files CreateBulkInsertTask.vbp, CreateBulkInsertTask.frm, and CreateBulkInsertTask.bas
• Modified for VBScript as CreateBulkInsertTask.scr
• In a DTS package, CreateBulkInsertTask.dts. Load this package into the Package Designer and execute it. The package will be saved in SQL Server storage. Open that package and you will see the Bulk Insert task. The package can be run repeatedly to create more Bulk Insert tasks. The new tasks will not be visible in the Package Designer until you close the package and then reopen it.

The code for fctCreateBulkInsertTask is shown in Listing 11.7. The procedure needs some utility functions, which are included with the code listings on the CD. The project requires a reference to the Microsoft DTSPackage Object Library.
LISTING 11.7 The Visual Basic Code to Create a Bulk Insert Task

Option Explicit

Public Function fctCreateBulkInsertTask( _
    pkg As DTS.Package2, _
    Optional sBaseName As String = "BulkInsertTask", _
    Optional sDataSource As String = "(local)", _
    Optional sConnectionUserID As String = "", _
    Optional sConnectionPassword As String = "", _
    Optional sCatalog As String = "pubs", _
    Optional sDestinationTableName As String = "stores", _
    Optional sDataFile As String = "", _
    Optional sExistingConnection As String = "", _
    Optional lBatchSize As Long = 0, _
    Optional bCheckConstraints As Boolean = False, _
    Optional sCodepage As String = "", _
    Optional lDataFileType As Long = 0, _
    Optional sFieldTerminator As String = "\t", _
    Optional sRowTerminator As String = "\n", _
    Optional sFormatFile As String = "", _
    Optional lFirstRow As Long = 0, _
    Optional lLastRow As Long = 0, _
    Optional bKeepIdentity As Boolean = False, _
    Optional bKeepNulls As Boolean = False, _
    Optional lMaximumErrors As Long = 10, _
    Optional sSortedData As String = "", _
    Optional bTableLock As Boolean = False) As String

On Error GoTo ProcErr

Dim con As DTS.Connection2
Dim stp As DTS.Step2
Dim tsk As DTS.Task
Dim cus As DTS.BulkInsertTask

'Check to see if the selected Base name is unique
sBaseName = fctFindUniqueBaseName(pkg, sBaseName)

If sExistingConnection = "" Then

    'Create connection for Bulk Insert Destination
    Set con = pkg.Connections.New("SQLOLEDB")

    With con
        .ID = fctNextConnectionID(pkg)
        .Name = "con" & sBaseName
        .DataSource = sDataSource
        .Catalog = sCatalog
        .UserID = sConnectionUserID
        .Password = sConnectionPassword

        'If User ID is empty string, use trusted connection
        If sConnectionUserID = "" Then
            .UseTrustedConnection = True
        Else
            .UseTrustedConnection = False
        End If
    End With

    pkg.Connections.Add con

Else
    Set con = pkg.Connections(sExistingConnection)
End If

'Create task and custom task
Set tsk = pkg.Tasks.New("DTSBulkInsertTask")
Set cus = tsk.CustomTask

With cus
    'Set ConnectionID
    .ConnectionID = con.ID

    'Properties
    .Name = "tsk" & sBaseName
    .Description = sBaseName

    'Set to values provided by the Package Designer
    .DataFile = sDataFile
    .DestinationTableName = sDestinationTableName
    .FormatFile = sFormatFile
    .FieldTerminator = sFieldTerminator     'Tab
    .RowTerminator = sRowTerminator         'New line character
    .CheckConstraints = bCheckConstraints   'False
    .KeepNulls = bKeepNulls                 'False
    .KeepIdentity = bKeepIdentity           'False
    .TableLock = bTableLock                 'False
    .SortedData = sSortedData               'Not sorted
    .Codepage = sCodepage
    .DataFileType = lDataFileType           'char
    .BatchSize = lBatchSize
    .MaximumErrors = lMaximumErrors
    .FirstRow = lFirstRow
    .LastRow = lLastRow
End With

pkg.Tasks.Add tsk

'Create step for task
Set stp = pkg.Steps.New

With stp
    .Name = "stp" & sBaseName
    .Description = sBaseName
    .TaskName = tsk.Name
End With

pkg.Steps.Add stp

fctCreateBulkInsertTask = stp.Name

Set con = Nothing
Set tsk = Nothing
Set cus = Nothing
Set stp = Nothing

ProcExit:
    Exit Function

ProcErr:
    MsgBox Err.Number & " - " & Err.Description
    fctCreateBulkInsertTask = ""
    GoTo ProcExit

End Function
Conclusion
You can't use the Bulk Insert task in every situation. If you don't need the extra speed, it's often not worth the effort to make it work. But you'll find that when you need to load large text files into SQL Server, this task provides the best combination of speed and convenience.
CHAPTER 12
The Execute SQL Task
IN THIS CHAPTER
• When to Use the Execute SQL Task
• Creating the Execute SQL Task
• Writing Queries for Different Database Systems
• Using Input Parameters in Execute SQL Tasks
• Using Output Parameters for Row Values
• Using an Output Parameter for the Rowset
• Dynamically Modifying the SQL Statement
• Using the Execute SQL Task to Execute a DTS Package from a Remote Server
• Creating an Execute SQL Task in Visual Basic
Microsoft has significantly improved and extended the value of the Execute SQL task in SQL Server 2000. You can now do the following:
• Use global variables to dynamically modify the query used in the task
• Use global variables to receive the value of fields for one record returned by a SELECT query.

• Use a global variable to receive a reference to the recordset returned by a SELECT query. This recordset can then be referenced in ActiveX scripts as if it were an ADO recordset.
When to Use the Execute SQL Task
The transformation tasks allow you to perform rapid row-by-row processing of your data. The Execute SQL task gives you the power of SQL-oriented set processing, which will usually be even faster. If you can write your data transformation as a SQL statement and you don't need to use special processing for individual rows, you can usually use an Execute SQL task (a set-oriented sketch follows the list below).

You can use the Execute SQL task for executing a wide variety of queries, as long as you are using a user account with sufficient permissions:
• Individual statements or batches of SQL statements
• Data retrieval queries—SELECT
• Data modification queries—INSERT, UPDATE, and DELETE

• Queries that load data from one table into another—INSERT…SELECT and SELECT INTO

• Data definition queries—CREATE, DROP, and ALTER

• Data access control queries—GRANT, DENY, and REVOKE
• Stored procedures
• DTS packages, by using SQL Server’s OLE Automation stored procedures
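As a sketch of the kind of set-oriented statement that suits this task (the table and column names here are made up for illustration):

INSERT INTO tblCustomerSummary (CustomerID, OrderCount, TotalSales)
SELECT CustomerID, COUNT(*), SUM(UnitPrice * Quantity)
FROM tblOrderDetail
GROUP BY CustomerID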
Creating the Execute SQL Task
You can create an Execute SQL task in the Package Designer or with code. The last section of this chapter shows how to create an Execute SQL task using code.

The Import/Export Wizard creates a variety of Execute SQL tasks. It uses these tasks to drop tables, create tables, and delete data from tables. You cannot use the wizard to create Execute SQL tasks for other purposes.

The Package Designer's Execute SQL Task Properties dialog is shown in Figure 12.1. It gives you three ways to set the SQL statement:

• Write it in the SQL Statement box.
• Use the Browse button to load the query from a file
• Use the Build Query button to create a query using a visual interface
FIGURE 12.1
The Execute SQL Task Properties dialog gives you several ways to create a query.

TIP

When I'm developing queries for a SQL Server database, I usually use the Query Analyzer to create my SQL statements. The Query Designer in SQL Server 2000 provides an excellent development environment for creating and testing queries. After the query or query batch is working the way I want it to, I load it into the Execute SQL Task Properties dialog.
You have to select a DTS connection that has been previously defined in the package. You can provide a description and a value for the command timeout. You also have buttons for parsing the query and providing parameters for it.

The Execute SQL task has very few properties. Besides the Name and Description properties, the ExecuteSQLTask object in SQL Server 7.0 has only these four properties:

• SQLStatement—The text of the query.

• ConnectionID—The ID of the connection used for the query.

• CommandTimeout—The length of time in seconds that the task waits for a response from the connection. The default value is 0, which causes the task to wait forever.

• CommandProperties—A pointer to the collection of OLE DB properties for the connection.

The extended SQL Server 2000 object, ExecuteSQLTask2, has three additional properties, which implement the ability to use parameters with the Execute SQL task:

• InputGlobalVariableNames—A semicolon-delimited list of the names of the global variables used as parameters in the query.

• OutputAsRecordset—The name of the global variable that is to be assigned an object reference to the recordset returned by the query.

• OutputGlobalVariableNames—A semicolon-delimited list of the names of the global variables that are to receive the data values returned by the query.

The SQLStatement property has been modified in ExecuteSQLTask2 so that it can include question marks, which act as placeholders for the input parameters.
Writing Queries for Different Database Systems
You have to write the query for the Execute SQL task using the SQL dialect of the data connection. If you are using the task with an Oracle database, for example, you have to use SQL that can be understood by Oracle.

You can check the syntax of the query as you are creating it by using the Parse Query button. The task does no parsing of the query itself. Rather, it passes the query to the OLE DB provider for parsing.

You cannot use the Execute SQL task with connections that do not support SQL, such as text files.
Using Input Parameters in Execute SQL Tasks
SQL Server 2000 allows you to use parameters in your SQL statement. These parameters are filled with values from global variables. By using parameters, you can easily modify the text of a query as the DTS package is executing.

The most common use of input parameters is to provide values for filters in a WHERE clause. In the following example from the Northwind sample database, orders shipped by a particular carrier in a particular time period are loaded into a separate table for analysis. The time period and shipper are set with parameters. The DTS package for this example is on the CD in DemoInputParameters.dts:

Insert tblShipperReview
Select *
From orders
Where ShippedDate >= ?
And ShippedDate < DateAdd(d,1,?)
And ShipVia = ?

Figure 12.2 shows the Parameter Mapping dialog, which you use to map these parameters to the appropriate global variables. If the global variables don't exist, you can open the Global Variables dialog to create them.
FIGURE 12.2
You map the input parameters to the appropriate global variables in the Parameter Mapping dialog.

After mapping the global variables, you can look at the InputGlobalVariableNames property using Disconnected Edit. The value of this property will be the following string:

ShippedDateStart;ShippedDateEnd;ShipVia

You can set the value of the input global variables in a variety of ways:
• In a Dynamic Properties task (perhaps from values in a text file or an INI file, or from values retrieved by a query)

• In an ActiveX Script task

• In a transformation ActiveX script, based on values found in the data being transformed

• As output parameters from another Execute SQL task

• Manually, by selecting Properties on the Package menu, choosing the Global Variables tab, and typing in a value for the parameter

• From another DTS package with the Execute Package task

• In the parameters of the DTSRun command line that executes the package

The sample application uses an ActiveX Script to check if the three global variables have valid values. If they do not, values are assigned to them.

A DTSRun command that could be used to execute the package from SQL Server storage with selected values for the global variables would be as follows:

DTSRun /S "(local)" /N "DemoInputParameters" /A "ShippedDateStart":"7"="2/1/1997" /A "ShippedDateEnd":"7"="7/1/2000" /A "ShipVia":"2"="1" /W "0" /E

This file is on the CD in a batch file called DemoInputParameters.bat.
Using Output Parameters for Row Values
The Parameter Mapping dialog has a second tab that you can use to set output parameters for the Execute SQL task, as shown in Figure 12.3.

Each field in the record returned by a SELECT query can be assigned to a global variable. You can map none, some, or all of the fields. You capture the values of the fields by mapping them to global variables.

When you use output parameters for row values, you can only capture values from one record. The next section of this chapter will show you how to capture the whole recordset in a global variable.
FIGURE 12.3
You capture values from particular fields by mapping them to output parameters.

If you have multiple queries in a batch, the Execute SQL task will use the first SQL statement that returns a result as the source for the output parameters. If you want, you can gather values together in local variables and assign them at the end of the batch, as in this example from Northwind:
Declare @cAvgPriceBeforeUpdate money
Declare @cAvgPriceAfterUpdate money

Set NoCount On

Set @cAvgPriceBeforeUpdate =
    (Select AVG(UnitPrice) As AvgUnitPrice From Products)

Update Products
Set UnitPrice = UnitPrice * 1.05
Where UnitsInStock <= 10
And UnitsInStock > 5

Update Products
Set UnitPrice = UnitPrice * 1.1
Where UnitsInStock <= 5

Set @cAvgPriceAfterUpdate =
    (Select AVG(UnitPrice) As AvgUnitPrice From Products)

Select @cAvgPriceBeforeUpdate As AvgPriceBeforeUpdate,
    @cAvgPriceAfterUpdate As AvgPriceAfterUpdate

The DTS package for this example is on the CD in DemoOutputParameters.dts.
NOTE

This example will not work without NoCount turned on. If you don't do this, the Execute SQL task will consider the report of rows modified by the first UPDATE statement as the query that should be used to set the output parameters.
NOTE

There are ways around the limitation of output parameters capturing values from only one record. For a SQL Server connection, you could open a cursor and assign the values of different records to different variables and return those values. Or, as in the following code sample, you could return three of the distinct values found in a particular field:

Declare @sName1 varchar(30), @sName2 varchar(30), @sName3 varchar(30)

set @sName1 = (
    select top 1 au_lname from authors
)
set @sName2 = (
    select top 1 au_lname from authors
    where au_lname <> @sName1
)
set @sName3 = (
    select top 1 au_lname from authors
    where au_lname <> @sName1
    and au_lname <> @sName2
)

Select @sName1, @sName2, @sName3

I wouldn't recommend that you use this strategy—the rowset output parameter would normally be better.
Using an Output Parameter for the Rowset

You can also capture the entire rowset returned by an Execute SQL task in a single global variable. You set the rowset parameter in the Parameter Mapping dialog, as shown in Figure 12.4.

This global variable can then be used as an object reference to a disconnected ADO recordset. You can browse this recordset in an ActiveX script.
FIGURE 12.4
You can capture the output of an Execute SQL query as a disconnected ADO recordset.

For example, if you wanted to examine the suppliers in the Northwind sample database with the most sales during a particular time period, you could use the following query:
select top 5 s.SupplierID, s.CompanyName,
    Convert(Numeric(15,2),
        SUM(od.UnitPrice * od.Quantity * od.Discount)) As TotalSales
from Suppliers s
inner join Products p
    on s.SupplierID = p.SupplierID
inner join [Order Details] od
    on p.ProductID = od.ProductID
inner join Orders o
    on od.OrderID = o.OrderID
where o.OrderDate >= ?
    and o.OrderDate < DateAdd(d,1,?)
group by s.SupplierID, s.CompanyName
order by SUM(od.UnitPrice * od.Quantity * od.Discount) DESC
If you assigned the output parameter for the rowset to a global variable named rstTopSupplier, you could reference this data in an ActiveX script with the following code:

Option Explicit

Function Main
    Dim rst, fld, msg

    Set rst = DTSGlobalVariables("rstTopSupplier").Value
    rst.MoveFirst

    Do Until rst.EOF = True
        msg = ""
        For Each fld In rst.Fields
            msg = msg & fld.Name & vbTab & fld.Value & vbCrLf
        Next
        MsgBox msg
        rst.MoveNext
    Loop

    Main = DTSTaskExecResult_Success
End Function

The DTS package for this example is on the CD in DemoOutputRowset.dts.
Dynamically Modifying the SQL Statement
The use of parameters has greatly increased the flexibility and usefulness of the Execute SQL task, but sometimes it is still useful to modify a task's SQL statement directly with code. For instance, you might want to execute the same SQL code on two tables that have the same definition but different names. You cannot change a table name in the SQL statement with a parameter, but you can dynamically change the text of the SQL statement as the package is being executed.
The three most likely ways to change the SQL statement are as follows:
• Use an ActiveX Script task to construct the new statement, and set the SQLStatement property of the Execute SQL task in that ActiveX script. This was the only way to dynamically modify the SQLStatement property in SQL Server 7.0.

• Use an ActiveX Script task to construct the new statement, but actually assign it to the SQLStatement property by using the Dynamic Properties task.

• Use the Dynamic Properties task to assign the new SQLStatement property to the text in a particular file.