SQL Server MVP Deep Dives

BI for the Relational Guy
Erin Welker


Certain basic terms, many of which are defined differently by different people, are used in business intelligence. I will submit my definitions, which are based on my background, to serve as a reference for this chapter.

- Data warehouse—A relational store of data that serves as a basis for reporting queries and/or a source for OLAP cubes. It consists of tables that support multiple subject areas in the organization. It is often designed using dimensional modeling techniques and, in most Microsoft shops, is housed in SQL Server. This term is often interchanged with data mart, and in most cases data mart is more accurate. Data warehouse just sounds so much more substantial. I tend to use data warehouse even when referring to a data mart, because it tends to be used generically to cover both, and so that is the term used throughout this chapter.

- Data mart—Can be thought of as a mini data warehouse that is specific to a particular subject area. Subject areas tend to align themselves with organizational departments, such as Sales, Finance, or Human Resources.

- Dimensional model—Represents business data in terms of dimensions and facts. Dimensional models are represented as star or snowflake schemas. (The section "Dimensional modeling" later in this chapter will cover this topic in more detail.)

- Business intelligence (BI)—Generally, a process and infrastructure that facilitates decision-making based on business data. BI is too often thought of in technical terms; it is far more driven by business.

- Data mining—The process of discovering valuable trends in historical data that can provide insight into future events, based on a predetermined set of factors. A well-known example is on Amazon.com, where books are recommended to you based on your past buying patterns compared to those of other customers who have purchased the same books.

- ETL (extract, transform, and load)—The process of moving data from a source system, usually online transactional processing (OLTP), transforming it into the data schema represented in the data warehouse, and loading it into data warehouse tables. ETL will usually also initiate cube loading. The transformation stage can include various processes, such as converting codes, cleansing the data, and looking up surrogate keys from dimension tables.

- OLAP (online analytical processing)—A process that allows a user to quickly analyze data using common techniques known as slicing, dicing, and drillthrough. In the Microsoft world, OLAP is provided via Analysis Services cubes.
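To tie the ETL and surrogate-key definitions together, here is a minimal T-SQL sketch of the key-lookup step in a load. All table and column names (stgSales, DimCustomer, FactSales) are hypothetical, not from the chapter:

```sql
-- A sketch of the ETL surrogate-key lookup (hypothetical names).
-- Staged rows carry the business key (CustomerCode); the fact table
-- stores the surrogate key (CustomerKey) found in the dimension.
INSERT INTO dbo.FactSales (CustomerKey, OrderDateKey, SalesAmount)
SELECT d.CustomerKey,
       CONVERT(int, CONVERT(char(8), s.OrderDate, 112)),  -- yyyymmdd date key
       s.SalesAmount
FROM   dbo.stgSales    AS s
JOIN   dbo.DimCustomer AS d
       ON d.CustomerCode = s.CustomerCode;  -- business-key lookup
```

In practice this step would also handle business keys missing from the dimension, but that error path is omitted here for brevity.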


Really, what is so different?


If you’ve had any exposure to operational reporting, you’ll already know many of the differences between reporting systems and traditional OLTP systems. Some of these are shown in table 1.

The difference is even more fundamental. OLTP applications are designed based on a discrete set of specifications. Specific data is to be collected, and there are clear patterns about who will enter the data, at what point in the business process, and using what method. The first step to designing a business intelligence solution is to take several steps back to understand the business at its core: Why does it exist? What is its mission? How does the business plan to achieve its mission? What key performance indicators (KPIs) need to be measured to assess success? A business intelligence solution needs to be able to address not just the needs of today, but those of the future, and that can only be accomplished by obtaining a core understanding of the underlying business processes.

I remember a past client who had chosen to implement a replicated OLTP data scheme for all of their reporting needs. They were suffering from numerous repercussions of this decision, including tempdb capacity issues, slow query times, and the inability to scale. When asked why they were not open to discussion about a business intelligence solution that provided more efficient analysis via OLAP cubes, they cited a prior attempt at a BI application that only addressed the queries for which it was designed. When the questions (queries) changed, the cube did not contain the information necessary to respond, and the whole project was aborted. This is why it is so critical to model the data warehouse based on the business, not on the specific reporting needs of the day.

NOTE One of the hardest things for a relational DBA to come to grips with is the redundancy involved in data warehousing. It’s disk intensive, to be sure. Often, a copy of a subset of the data is made for staging prior to loading the data warehouse, then there is the data warehouse itself, plus the cube store. This redundancy can be mitigated somewhat in the data warehouse design, but it’s best to come to terms with the idea of redundancy as soon as possible. One exciting benefit is the potential to archive data from the operational system as it is loaded into the data warehouse, making the OLTP system more lean.

Table 1 OLTP versus reporting environment characteristics


Approach

The following is a high-level view of how a business intelligence project should be approached. This is intended to provide an overview to contrast with the approach taken in typical OLTP development projects.

1. Determine overall strategy—The general approach to a business solution is to develop an overall strategy to the data warehouse, determining how departments interact with each other and developing a high-level plan for how each subject area will be built out. In practice, I find that most companies skip this step. Departments in an organization tend to vary in their readiness for data warehousing, and cooperation from all departments is critical for making this step possible.

2. Address a subject area—Each subject area should be addressed in great detail, fleshing out the relevant dimensions and developing one or more star schemas to represent the business segment. This is done by conducting interviews with business subject-matter experts.

TIP One common pitfall I have found is clients insisting that the IT staff knows all there is to know about the business. It is true that they are intimate with the business rules that underlie the technology solutions that run much of the business, but that should not be confused with a core understanding of the business, including insights into where the business is heading. IT personnel are a valuable resource for knowing where data is housed and how to best get it into the data warehouse. The data model should be based on interviews with stakeholders within the departments represented in the subject area.

3. Develop the dimensional model—Developing a dimensional model that represents the business based on the information gathered in the preceding step is paramount in a successful business intelligence solution. It’s important to get this step right. An indication of a well-designed model is its ability to accommodate changes easily.

4. Extract, transform, and load—When the dimensional model has been established, it is time to determine data sourcing, or how to best populate the model. ETL processes need to be designed to accommodate the initial loading of the data warehouse, as well as ongoing incremental loads, which will, hopefully, be able to isolate new data in the source system from data that has been previously loaded.

5. Develop the cube—Cube design usually closely follows the dimensional design, which is one reason for the focus on the dimensional design. Analysis Services provides easy mechanisms for understanding the dimensional model and building dimensions and measure groups.
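The incremental loads mentioned in step 4 are often driven by a watermark recorded after each successful load. A minimal T-SQL sketch of the extract side, with hypothetical table and column names (etl.LoadControl, src.SalesOrder):

```sql
-- Watermark pattern (hypothetical names): pull only source rows that
-- changed since the last successful load of this fact table.
DECLARE @LastLoad datetime;

SELECT @LastLoad = LastLoadDate
FROM   etl.LoadControl
WHERE  TableName = 'FactSales';

SELECT SalesOrderID, CustomerCode, OrderDate, SalesAmount
FROM   src.SalesOrder
WHERE  ModifiedDate > @LastLoad;   -- isolate new/changed data only
```

After a successful load, the process would update etl.LoadControl with the new watermark so the next run starts where this one left off.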

The remaining steps involve data validation, the automation of remaining processes, and more. This is a simplified, high-level description of the approach to building a business intelligence solution. For more information, the best text I have found on the topic is The Data Warehouse Lifecycle Toolkit by Ralph Kimball and others (Wiley Publishing, 2008). Note that this book will only take you through the implementation of the relational data warehouse. For information regarding the implementation of Analysis Services cubes, as well as other things to consider when using the Kimball approach in a Microsoft BI implementation, check out The Microsoft Data Warehouse Toolkit by Joy Mundy and Warren Thornthwaite (Wiley Publishing, 2006).

Dimensional modeling

I’d like to spend some more time on dimensional modeling, because it is core to the implementation of a quality data warehouse. It can also be difficult to grasp for those used to modeling tables in third normal form. Why build a dimensional model in the first place? What’s wrong with duplicating the source tables?

First of all, end users have a difficult time navigating tables in third normal form. A normalized model is intended to support the fast and accurate input of data, with minimal redundancy. OLTP table and column names are usually cryptic, and several tables may have to be joined together in order to create a query. The application also has built-in cryptic business rules that a user would have to know about, such as WHERE ActiveInd = 'A', or WHERE EntityCode = 'XYZ'. Dimensional models make more sense to users because they more closely match how they view the business. They provide a flexible means of accessing the data. On top of all that, dimensional designs respond more quickly to queries that span large amounts of data. SQL Server often recognizes star schemas and optimizes accordingly.
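To make the contrast concrete, here is a sketch of the same question asked both ways. The embedded rules (ActiveInd, EntityCode) come from the chapter; all table and column names are hypothetical:

```sql
-- OLTP style: cryptic names, several joins, plus business rules the
-- user must simply know about.
SELECT c.CustNm, SUM(ol.ExtAmt) AS TotSales
FROM   dbo.tblOrdLn AS ol
JOIN   dbo.tblOrd   AS o  ON o.OrdID  = ol.OrdID
JOIN   dbo.tblCust  AS c  ON c.CustID = o.CustID
WHERE  o.ActiveInd = 'A' AND c.EntityCode = 'XYZ'
GROUP BY c.CustNm;

-- Star-schema equivalent: business-friendly names, one fact table,
-- simple surrogate-key joins.
SELECT dc.CustomerName, SUM(f.SalesAmount) AS TotalSales
FROM   dbo.FactSales   AS f
JOIN   dbo.DimCustomer AS dc ON dc.CustomerKey = f.CustomerKey
GROUP BY dc.CustomerName;
```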

Dimensional models are implemented as star or snowflake schemas. I will only address the star schema, because the snowflake schema can be considered a variation on the star schema. A star schema consists of from one to many dimensions (one dimension would be highly unlikely) that are related to a single fact table. The dimensions represent ways of looking at the data—the who, what, where, when, and why. The fact table has implicit foreign keys to each of these dimensions, as well as a collection of facts that are numerical. For example, a fact table may represent sales in terms of amount, quantity, tax, discount, and so on. The numbers in a fact table can then be aggregated, usually summed, across aspects of the dimensions to answer queries about what has happened in the past.

Let’s take an example of a star schema that represents retail sales. The dimensions would be things like customer (who), product (what), sales territory (where), date (when), and promotion (why). The star schema might look something like figure 2.

An important thing to note is that the fact table joins to the dimension tables on surrogate keys, which are usually IDENTITY columns in SQL Server, intended to abstract the row from the business key. Always use surrogate keys (except on the Date dimension), even if you don’t think you need one right now. Surrogate keys are useful in slowly changing dimensions and when a dimension spans multiple entities, each of which may assign distinct business keys. For example, Human Resources may use one key for an employee, such as a social security number, and a Sales system may assign its own business key to an employee. Using a surrogate key will tie the two together across applications, facilitating richer analysis.
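A minimal T-SQL sketch of such a dimension and fact table, using IDENTITY surrogate keys as described above (all names and columns are hypothetical):

```sql
-- Dimension: IDENTITY surrogate key abstracts the row from the
-- business key, so multiple source systems can share one dimension.
CREATE TABLE dbo.DimCustomer (
    CustomerKey  int IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    CustomerCode varchar(20)   NOT NULL,         -- business key
    CustomerName nvarchar(100) NOT NULL
);

-- Fact table: foreign keys to each dimension plus numeric facts.
CREATE TABLE dbo.FactSales (
    CustomerKey int   NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
    DateKey     int   NOT NULL,   -- Date dimension typically keeps a readable key
    SalesAmount money NOT NULL,
    Quantity    int   NOT NULL
);
```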

TIP Even if you are developing a dimensional model that currently supports a single department in the organization, be sure to keep the entire organization in mind to facilitate future enhancements. Shared, or conformed, dimensions facilitate analysis across departmental fact tables. For example, an Employee dimension might include employee salary, manager, and hire dates (Human Resource information) as well as sales region assignments (Sales information). Storing both sets of data enables the analysis of hiring programs and manager-mentoring effectiveness on sales performance.

Cubes, anyone?

You could stop after creating the data warehouse. Some do. Analysis Services, or any OLAP technology, requires a new skill set to develop and maintain cubes. But a cube solution vastly improves the analytical power and usability of the resulting solution.

Figure 2 A star schema



Users no longer have to know how to write T-SQL or be constrained by static reports. There are also a number of tools that consume OLAP data sources—Excel (shown in figure 3), PerformancePoint, Reporting Services, and Panorama, just to name a few. And the query response speed can go from many minutes (or hours) for a SQL query to seconds from a cube. The power of online analysis is the ability to quickly ask a question, get an answer, and then ask another question based on the first result.

Analysis Services cubes are specifically designed to retrieve thousands or millions of data points and aggregate them quickly. Some of this performance is accomplished through aggregations, which precalculate the data at predefined intersections of dimensions. Think of it as building several indexed views to provide summarized information in SQL Server. The difference in query time often differs by orders of magnitude.
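The indexed-view analogy can be sketched in T-SQL as follows. The names are hypothetical; SCHEMABINDING and COUNT_BIG(*) are requirements SQL Server places on indexed views:

```sql
-- An indexed view that precalculates a summary, much as Analysis
-- Services aggregations precalculate dimension intersections.
CREATE VIEW dbo.vSalesByProduct
WITH SCHEMABINDING
AS
SELECT ProductKey,
       SUM(SalesAmount) AS TotalSales,
       COUNT_BIG(*)     AS RowCnt    -- required in an indexed view with GROUP BY
FROM   dbo.FactSales
GROUP BY ProductKey;
GO

-- Materialize the view: the unique clustered index stores the
-- precomputed rows, so queries can read the summary directly.
CREATE UNIQUE CLUSTERED INDEX IX_vSalesByProduct
    ON dbo.vSalesByProduct (ProductKey);
```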

There is a common misconception regarding where cube data is stored. Cubes have their own separate data store in addition to the data warehouse. This can be greatly minimized using ROLAP (relational OLAP) partitions, but this is universally considered a bad idea except for small real-time partitions, due to performance. MOLAP (multidimensional OLAP) is the preferred storage option, which results in a redundant data store that is specifically tuned for OLAP query performance. Cube processing is the activity that loads data into the cube from the data source.

Figure 3 MS Excel 2007 Pivot Table sourcing Analysis Services cube


Resist cutting corners in regard to storing data in the data warehouse. It might be tempting to prune data from the data warehouse once it is contained in the cube, but this would remove your ability to reprocess the associated cube partitions, and that data would eventually be lost, because cubes must be reprocessed if associated dimension structures change. Again, the relational DBA approach is all about economizing on disk usage, but that mindset is at odds with the data warehouse implementation.

The Microsoft BI stack is represented in figure 4.

How do I get started?

This chapter was intended to give you a basic introduction to business intelligence and to explain why it differs so dramatically from the traditional OLTP approach. From here, my best recommendation is to start the way I started, with a book I cannot recommend enough: The Data Warehouse Lifecycle Toolkit by Ralph Kimball and others. One of the things that I like about it is that it not only explains the what, but it also explains the why. I occasionally deviate from the approaches recommended in the book, but only after reading the why to be sure there was nothing I was missing. This book is not technical; it is more specific to the methodology, which is what is most lacking in technicians coming from an OLTP environment. It does contain some great information on dimensional modeling, including techniques for handling common scenarios that arise when trying to represent business data in a dimensional model.

Figure 4 The Microsoft BI stack

It is difficult to make a universal recommendation about how to get your feet wet. One option is to develop your own cube to represent a solution to a problem. One of my favorite projects was to build a cube to analyze how disk space was being used on my computer. I have a difficult time identifying files that are duplicated in several folders, and building such a cube helped me identify where most of my space was going. (I freed up about 30 percent of my disk space based on this analysis!) Another implementation could analyze server resource consumption. In fact, such a solution is available for download as a PerformancePoint sample (see “Scorecards and Dashboards for your IT Department” at http://www.microsoft.com/bi/department/department.aspx?id=tab1), so you can validate your approach.

If there is a data warehouse project in development or already implemented at your company, you may be able to find a way to become a team member and learn from other practitioners with more experience. And there are courses available from the major training vendors to get you started.

Summary

I hope you have gained some insight into the various aspects of business intelligence and learned how it differs from legacy OLTP applications. This should give you the fundamentals, so you can decide if this is something you want to pursue. I have found business intelligence projects to be some of the most rewarding in my professional career. Historically, IT projects have typically been about getting information entered into computer systems, so there is no greater reward than to see the excitement on a business user’s face when they see how easy it can be to make some actionable knowledge out of years of stored data.

About the author

Erin Welker has spent 25 years in information technology development, management, database administration, and business intelligence. She began working with SQL Server in version 1.11, and has worked with Analysis Services, SSIS, DTS, and Reporting Services since their inception. Erin has consulted at several Fortune 500 and other well-known companies, developing business intelligence solutions with a specialization in performance. She loves to share her passion for SQL Server and BI through authoring and speaking.


SQL Server 2008 Reporting Services
William Vaughn

I want to share some of the Reporting Services technology implemented in SQL Server 2008—at least those features that make a difference to application developers. Thanks to your (and my) suggestions, Reporting Services is better this time. Thankfully, they left in almost all of the good features and added a few more. That said, there are still a number of things I would like improved—but I’ll get to that. Yes, there are a few Reporting Services features that are still under development. These include the real MicrosoftReportViewer control that can process the second-generation Report Definition Language (RDL) produced by the Reporting Services 2008 Report Designers.

Why should developers care about Reporting Services?

Over the last 35 years I’ve written lots of programs and taught many developers how to build best practice applications. Many of these programs were simply ways to present data to ordinary end users or extraordinary corporate executives, so they could better understand, assimilate, and leverage the information being presented. Initially these reports were heavily processed and formatted data dumps to paper, which were often discarded as fast as the systems could print them. Later, reports were implemented as programs—often as Windows Forms applications, but more recently they might be based on ASP, Windows Presentation Foundation (WPF), or Silverlight platforms. Yes, sometimes developers were forced to use one of the early Microsoft and third-party Visual Studio report development tools, but after a period of frustration, some decided it was easier to take up a career in coal mining than to face the rigors of one of these troublesome reporting paradigms. All that has changed—mostly due to innovations in Microsoft Reporting Services.



Let’s get started with an overview of Reporting Services in terms that anyone can understand. Later on I’ll show how Visual Studio in one of its many manifestations can serve as a powerful tool to create your reports, get them deployed to the Reporting Services server, or include them in your ASP, Windows Forms, or other projects.

What is Reporting Services?

Reporting Services is just that: a service that runs (starting with SQL Server 2008) as a self-hosted service on a network server, as illustrated in figure 1. The SQL Server 2000 and 2005 versions of Reporting Services run as a web service hosted by Internet Information Services (IIS).

NOTE It’s best if Reporting Services runs on a dedicated system, because in high-stress production environments it can consume considerable resources that can hobble the performance of a DBMS engine sharing the same resources.

When a report is deployed to a Reporting Services service instance, its RDL file is compressed into a binary large object (BLOB) and stored in the SQL Server Reporting Services Catalog, where it waits to be rendered by a SOAP request from your application, the Report Manager, SharePoint Services, or by referencing the right URL.

Now let’s look more closely at the Report Server. Reporting Services processes requests for reports as follows:

- The Report Server extracts the RDL by virtual directory path and name from the SQL Server Reporting Services report catalog. It’s then decompressed (and decrypted) and passed to the Report Processor.

- The Report Processor works like a language interpreter—the RDL is the script that drives its operations. Its job begins by extracting the connection strings from the SQL Server report catalog tables and passing them, along with the queries and report parameters, to the Data Processing Extension. Yes, a report can reference many data sources and can contain many queries—some of which might require input parameters.

Figure 1 SQL Server Reporting Services architecture

- The Data Processing Extension opens the connection(s), passing in the appropriate credentials, and executes the report queries using any parameters specified in the RDL. Yes, the SELECT statements used to fetch report data or to populate parameter pick-lists are embedded in the report RDL. The Report Processor subsequently merges data from the named columns extracted from the query rowsets with the RDL-defined report elements and passes the results to the Report Rendering extension.

- The Report Rendering extension works like a printer but with a number of specifically targeted output devices; the default is HTML so reports can be shown in Internet Explorer. (No, Reporting Services reports don’t render consistently in Firefox or Chrome browsers.) In addition, in SQL Server 2008 the Report Rendering extension can also output the report to PDF, TIFF, Excel, CSV, and Word as well.

When an application or a user requests a report, the RDL might require the user or the Report Processor to supply the values for one or more report parameters. These parameters are often used to focus the report data on a specific subset or set display options as managed by RDL-resident expressions. Capturing parameters requires Reporting Services to render appropriate dialog boxes in the browser to capture these parameters, as shown in figure 2.

NOTE When starting the Report Manager for the first time (or after a period of inactivity), be prepared for a wait. It can take 30 to 90 seconds to get the Reporting Services functionality compiled and to render the initial menu of available reports.

Figure 2 Report parameters as generated by the Report Processor



As I discuss later, when you set up your report with the Visual Studio Business Intelligence (BI) tools, you can specify as many query and report parameters as needed, as well as default values to use in place of user-provided values. After all required parameters are supplied, the user can click the View Report button to render the report—and repeat this as many times as needed. It’s up to your code to make that process efficient—in some cases, the Report Processor might not need to re-run the query, which can save considerable time.

Using Visual Studio to create an RDL report

SQL Server Reporting Services is available with all editions of SQL Server, including the SQL Server 2008 Express with Advanced Services edition. Included with SQL Server is a version of Visual Studio used to create CLR executables and Reporting Services reports (among other tasks). I’ll use this pared-down business intelligence development toolkit to create, preview, and deploy a sample report. If you have any version of Visual Studio above Express, you’ll be able to create and preview reports as well—even without Reporting Services, but I’ll talk about that option a bit later.

NOTE SQL Server Compact editions do not support Reporting Services. For more info, see http://www.microsoft.com/express/sql/download/default.aspx.

Setting up Reporting Services is an important step, but we don’t have enough room to devote a dozen pages to this process. Thankfully, it’s easier than ever, so I don’t anticipate that you’ll have significant problems.

In this example, I’ll be using Visual Studio 2008 SP1, which includes the ability to create Reporting Services reports. Nope, without SP1 you won’t be able to do so, as SP1 activates all of the Reporting Services BI functionality. In Visual Studio 2005 you could build, test, and deploy reports. However, you can’t open Visual Studio 2005 BI report projects in Visual Studio 2008 without SP1.

So follow along and I’ll walk you through the process of creating your own report. Relax, it will be fun—I promise.

1. Start Visual Studio 2008 (SP1) and choose New Project.

2. Choose Business Intelligence Projects from the New Project dialog box.

3. Choose Report Server Project Wizard. This launches a wizard that I expect you’ll use once. After that you’ll reuse the RDL file it generates to create other reports.

4. Before clicking OK, point to an appropriate project save path and name the project. I’m calling mine SQL Bible Report. Click OK.

5. After the initial Welcome screen (where you clicked Next) you’re ready to start defining where the data for your report is sourced. Sure, the data can come from anywhere—anywhere that can be seen with a .NET provider, including object linking and embedding (OLE) DB and Open Database Connectivity (ODBC). This means data can come from SQL Server, Analysis Services, SAP, Oracle, a flat file, a third-party database visible with a custom ODBC driver or OLE DB provider, or even a JET/Access database.

6. Name your data source so you’ll recognize it later. No, I don’t recommend DataSource1. Next, choose the Type from the drop-down list. I chose SQL Server.

7. Fill in the Connection string by clicking on the Edit… button, or type it in yourself if you’re sure how to code it. I’m connecting to AdventureWorks2008.

8. Set the Credentials default to the Security Support Provider Interface (SSPI), which is fine for development. Peter (my coauthor of Hitchhiker’s Guide to SQL Server 2000 Reporting Services) and I do not recommend using trusted authentication for production reports, for many good reasons.

9. If you plan to have several reports in the Visual Studio project that use the same ConnectionString, go ahead and click Make This a Shared Data Source. Remember, the data source name you use here might be shared by other reports in other projects, so check with your report DBA.

10. Click Next to open the Report Wizard’s Design the Query dialog box. Enter the data source–specific SQL to return the rowset (just one) used to populate your report, or click the Query Builder to get the wizard to help build it. This launches the ever-familiar Query Designer we’ve used for years. However, this time you’re creating a report, not an updatable TableAdapter. This means you might not need nearly all of the columns or even the Primary Key columns.

NOTE To make your report run faster and keep your DBA happy, be sure to choose only the columns needed and include a WHERE clause in your query to focus the rowset on only the data needed by the report.

11. For my report, I chose a few columns from the AdventureWorks2008 Production.Products table. I also set up a WHERE clause that returns rows where the SellStartDate is between a given range of dates, as shown in figure 3.

12. After the query is done, click Next to proceed to the report layout dialog boxes. These give you the option to specify a Tabular or Matrix report.

Figure 3 Specifying the report query with a WHERE clause
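Based on the columns and WHERE clause described in step 11, the report query might look something like the following sketch (the exact table and column list in the book’s figure may differ):

```sql
-- A focused report query: only the needed columns, with a WHERE
-- clause driven by two report parameters.
SELECT Name, ProductNumber, ListPrice, SellStartDate
FROM   Production.Product
WHERE  SellStartDate BETWEEN @StartDate AND @EndDate;
```

Reporting Services detects the @StartDate and @EndDate references and generates matching report parameters automatically.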



13. Click Next to specify which column values are to be used to specify report page and group breaks, as shown in figure 4. Note that I chose not to include the Weight in this report. This means the query will fetch data that’s not needed—not a good idea. Click Next to continue.

14. Choose the Table layout mode (stepped or blocked) and whether you want subtotals. Click Next to continue.

15. Choose the Table style (the color used as accents in the report) and click Next.

16. Stop. This is a critical point, and if you don’t get this right, nothing will work correctly. For some reason, the wizard has not detected that I installed a named instance during SQL Server setup and chose to use SSL web traffic encryption. This means you must change the Report Server path in the Choose the Deployment Location dialog box to point to the correct Reporting Services instance. In my case, I installed SQL Server 2008 on my BETAV1 system with the instance name of SS2K8. Note that the virtual directory (VD) is named ReportServer (the default VD name) followed by an underscore and the instance name, as follows: BETAV1/ReportServer_SS2K8.

But before you jump in and change the Report Server name, I suggest you see how Reporting Services Setup initialized the virtual directory name by starting the SQL Server 2008 Reporting Services Configuration tool. Log into your Reporting Services instance and choose the Web Service URL tab (on the left). If the Report Server virtual directory is properly installed, it will provide the Report Server Web Service URLs at the bottom of the dialog box, as shown in figure 5.

Figure 4 Specifying the report page, group breaks, and detail elements

NOTE These settings are dependent on whether or not you had a proper SSL certificate in place when Reporting Services was installed. In my case, I created a certificate for BETAV1 beforehand. Notice that the Reporting Services configuration tool can also see my laptop’s Reporting Services installation. You can click on these URLs to test if Reporting Services is running properly and to be taken to the Report Server virtual directory.

Yes, you’ll need to change the Report Server name URL each and every time you create a report from the wizard. No, you won’t want to run the wizard again. In most cases you’ll leverage existing reports and report projects where these settings are maintained.

We’re almost ready to move forward, but before we do, consider the name of the deployment folder—this is where Visual Studio will deploy the report. Consider that anyone with access to an explorer might be able to see your Reporting Services Virtual Directory (VD) and the Report Manager, so as you start creating reports, others will be able to see your unfinished reports as you learn to use Reporting Services.

You can deal with this problem in a couple of ways. First, you should work with Report Manager and your DBA to prevent access to reports under development. Next, it makes sense to create a directory to place the work-in-progress reports, and use rights management to hide these from casual viewers. This makes sure that unauthorized people don’t run reports they shouldn’t. More importantly, it prevents people from bringing the system to its knees by running reports that consume all of the available resources, or those that are not ready for use.

17. Now we’re ready to click Next, which brings us to the last dialog box. Here you can name the report, view the Report Summary, and choose to preview the report after the wizard is done. Name the report, check the Preview checkbox, and click Finish.

WARNING After you click Finish, you’re done. You won’t be able to rerun the wizard to alter the report later.

Figure 5 Reporting Services Configuration Manager Web Service URLs report


18. At this point you’re taken back to Visual Studio, where your report layout is shown in the Design window, the Report Data window is exposed, and the (report) Preview tab is exposed. You are ready to execute the report and render it locally (as shown in figure 6).

No, you don’t need to have Reporting Services up and running at this point—that won’t be necessary until you’re ready to deploy the report.

Sure, the Visual Studio BI tools include a Report Designer that helps developers (or trained end users) build reports by hand. No, this does not involve setting type and getting ink all over your clothes (been there, done that). It means working with a drag-and-drop paradigm to add appropriate report control elements to a report and drag columns from pre-built DataSet objects exposed in the Report Data window.

Using the Visual Studio 2008 Report Designer

If you’re familiar with SQL Server 2005 Reporting Services, you’ll see a number of important differences in the Visual Studio 2008 BI toolset. Note that there is no Data tab—it’s been replaced by the far more sophisticated Report Data window, which is used to manage Built-in Fields (something new for Reporting Services 2008), Parameters, Images, and the data sources created for the report. Sure, you can pull in data from any number of data sources for a single report. There are other changes as well, like the incorporation of the Tablix report element that combines functionality of the Table and Matrix report elements. This permits you to group by rows and columns as well as manage RichText data in your reports.

Figure 6 Visual Studio BI project with the newly generated report


After you’re ready to see your report rendered and populated with data, click the Preview tab on the Report Design window. At this point, the local renderer in Visual Studio processes the RDL just as would be done by the Reporting Services service. It displays the report just as the user would see it (well, almost).

Managing report parameters

If your report requires any query (or report) parameters, the report is rendered with UI prompting dialog boxes to capture these values from the user, as shown in figure 7. Where did these parameters come from? Well, remember that the query WHERE clause called for two dates to be provided before the query can run. When the Report Processor finds parameters embedded in the report, it automatically generates the UI elements to capture the parameters—even if it means building an interactive drop-down list for acceptable parameter values.
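To make the mechanism concrete, here is a sketch of the kind of query that would trigger these prompts. The table and column names assume the AdventureWorks sample database and are illustrative, not taken from this chapter’s actual report:

```sql
-- Hypothetical report query: @StartDate and @EndDate become report
-- parameters, and the Report Processor generates prompt UI for both.
SELECT Name, ProductNumber, ListPrice, SellStartDate
FROM   Production.Product
WHERE  SellStartDate BETWEEN @StartDate AND @EndDate
ORDER  BY SellStartDate;
```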

NOTE Report parameters don’t initially have a preset default value. Wouldn’t it make sense to provide default values or even a drop-down list of acceptable values? Unfortunately, the wizard has not evolved to the point of capturing default values for parameters. Thankfully, this is fairly easy to configure here in the Visual Studio BI tools or in the Report Manager.

You can configure how parameters are displayed, managed, populated, and validated through a Report Parameter Properties window. Figure 8 illustrates the dialog box exposed by drilling into the Report Data window and a specific Report Parameter. Here you’re given the option to set default values, provide a hard-coded list of permissible values, define an expression to compute the default value, or specify a query to populate a list of permissible values (and more).
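Behind the dialog box, these settings end up in the report’s RDL as a ReportParameter element. A minimal sketch follows; the element names match the RDL schema, but the parameter name and the expression-based default are illustrative:

```xml
<ReportParameter Name="StartDate">
  <DataType>DateTime</DataType>
  <Prompt>Start Date</Prompt>
  <DefaultValue>
    <Values>
      <!-- Expression-based default: one month before today -->
      <Value>=DateAdd("m", -1, Today())</Value>
    </Values>
  </DefaultValue>
</ReportParameter>
```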

If you define default values for all your report parameters, the Report Processor does not wait to capture parameters before rendering the report. This means it’s important to provide default values that don’t return all available rows. Let users decide which part of the data they need to see.

Dundas offerings

Microsoft Partners like Dundas have also helped Microsoft add new functionality to the BI tools. As you develop your reports you’ll find an improved Chart report element as well as a Gauge element. Dundas also provides a host of data visualization products to enrich your reports that are exposed through the Reporting Services extensibility model.

Figure 7 The report rendered in the Preview tab

NOTE Report performance is directly tied to the amount of work the Report Processor needs to perform to build and render the report. This means that parameters that focus the report query on specific, relevant rows are critical to a best-practice design.

Deploying your report

When you’re happy with how the report behaves and appears to the user, you’re ready to deploy it to a specific Reporting Services instance. Yes, you chose that instance name when you ran the Report Wizard, but let’s assume that you did not use the Report Wizard to create your report (and I don’t expect you will after you have built your first report). In any case, it does not hurt to double-check these settings in the report project property page, as shown in figure 9.

Notice that you’re able to set whether or not Visual Studio overwrites existing server-side data sources when deploying the report. This should be left as False, as it permits your DBA to properly reset the data source when it comes time to bring your report into a production directory.
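These project-level deployment settings are persisted in the report project (.rptproj) file. A sketch of the relevant fragment is shown below; the option names match those used by the 2008 BI tools, while the server URL and folder names are illustrative:

```xml
<Configuration>
  <Name>Debug</Name>
  <Options>
    <!-- Leave False so deployment never clobbers server-side data sources -->
    <OverwriteDataSources>false</OverwriteDataSources>
    <TargetServerURL>http://BETAV1/ReportServer</TargetServerURL>
    <TargetReportFolder>SQL Bible Report</TargetReportFolder>
    <TargetDataSourceFolder>Data Sources</TargetDataSourceFolder>
  </Options>
</Configuration>
```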

Verify the TargetServerURL to make sure it correctly addresses your Reporting Services server instance, as I discussed earlier.

Figure 8 Setting a default value for report parameters

Figure 9 Setting the report deployment properties


Assuming the Reporting Services instance is started and accessible from your development system, you’re ready to deploy the report. From the Solution Explorer window, right-click an individual report (.RDL) or the report project and choose Deploy. If the TargetServerURL is set correctly and the Reporting Services server instance is running and properly initialized, your report should deploy in a few seconds.

Using the Report Manager

After your report is deployed, you’re ready to see how it is rendered in a browser by invoking it from Report Manager.

WARNING If you plan to use Report Manager, it assumes you have not installed the SharePoint Services extensions. This configuration option is an either/or deal—you can use one or the other, but not both.

Again, I suggest using the SQL Server Reporting Services Configuration utility to verify that your Report Manager URL is correctly configured, as shown in figure 10.

I find it’s easiest to click on the appropriate URL and save it in my IE favorites. Doing so launches the Report Manager as shown in figure 11—eventually (it can take 90 seconds to launch).

NOTE I like to think of the Report Manager as two tools, one to view directories and launch selected reports and another to manage the reports and Reporting Services configuration.

Figure 10 Verifying the Report Manager URL

Figure 11 The Report Manager home directory


In this case we’ll be navigating to the new SQL Bible Report path and launching the new report we just deployed. Click on the SQL Bible Report icon and choose Products by Sell Date (or the path and report name you chose for your report). This instructs the Reporting Services service to start the Report Processor, which processes the report and sends the rendered HTML back to the browser. The report that’s shown in the browser should appear similar to the report previewed in the Visual Studio BI project, but because these are two different rendering engines (one client-side and one server-side), you can expect some subtle differences—especially on complex reports.

In the 2008 version of Reporting Services, you can now render reports to Microsoft Word as well as the previously supported XML, CSV, PDF, MHTML, Excel, and TIFF formats. You can also navigate to a specific page, zoom in, find specific items, and print the report.
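The same server-side renderers can also be driven directly through Reporting Services URL access, bypassing the Report Manager UI. A sketch of such a request follows (shown on multiple lines for readability); the server name and report path echo this walkthrough, and the parameter values are illustrative:

```
http://BETAV1/ReportServer?/SQL Bible Report/Products by Sell Date
    &rs:Command=Render&rs:Format=PDF
    &StartDate=1/1/2002&EndDate=12/31/2003
```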

Using the Visual Studio MicrosoftReportViewer control

No discussion of Reporting Services would be complete without mentioning the latest implementation of the (still evolving) Visual Studio ReportViewer (MRV) control. Unfortunately, Visual Studio 2008 SP1 does not include an update for this control despite its name change. It’s expected that the real version of the MicrosoftReportViewer control won’t appear until Visual Studio 2010 ships. Be that as it may, it’s still an important technology, first introduced in Visual Studio 2005.

The MRV control was made possible by leveraging the innovative work done by the Microsoft engineers working on the Visual Studio BI tools. Because they had to create a stand-alone Report Processor that previewed the report on the developer’s system, all they had to do was expose the reporting interface as a class (not to minimize the work required to accomplish this). The end result is the MicrosoftReportViewer control that developers can use to leverage some RDL reports. Visual Studio also contains report-authoring tools that create local client-side reports and persist them as Report Definition Language Client-side (RDLC) files. But there’s a hitch. First, consider that there are three types of RDL report file formats:

ƒ First-generation RDL as implemented by Reporting Services 2005

ƒ First-generation RDLC as implemented by Visual Studio 2005 BI tools

ƒ Second-generation RDL as implemented by Reporting Services 2008

Because there are few significant differences between first-generation RDL and RDLC report files, they are easily transmogrified. Unfortunately, there is no support for second-generation RDL local reports in the Visual Studio 2008 MRV control. However, you can display server-hosted second-generation reports. We expect this to be updated with the next major revision of Visual Studio or as a separate release from the SQL Server team sometime in 2010.
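To give a sense of the server-hosted scenario that does work, here is a minimal sketch of pointing the MRV control at a deployed report from a Windows Forms application. The namespace, properties, and ProcessingMode values are part of the ReportViewer API; the server URL and report path are carried over from this walkthrough and would need to match your own deployment:

```csharp
using System;
using Microsoft.Reporting.WinForms;

// Sketch: render a server-hosted report in the ReportViewer control.
var viewer = new ReportViewer();
viewer.ProcessingMode = ProcessingMode.Remote;
viewer.ServerReport.ReportServerUrl = new Uri("http://BETAV1/ReportServer");
viewer.ServerReport.ReportPath = "/SQL Bible Report/Products by Sell Date";
viewer.RefreshReport();
```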

Let’s see how the MRV control works. Consider that the MRV Report Processor does not attempt to run any queries. It assumes that you pass in the following (basic) items to the control before you can expect it to render the report:
