SharePoint 2010 Tips, Part 59


Developer Dashboard

While this book is squarely aimed at SharePoint administrators, we need to cover a new piece of functionality called the developer dashboard. Despite what the name may suggest, it's not just for developers. The developer dashboard is a dashboard that shows how long it took for a page to load, and which components loaded with it. A picture is worth 1,000 words, so Figure 15-8 probably explains it better.

FIGURE 15-8

This dashboard is loaded at the bottom of your requested web page. As you can see, the dashboard is chock full of information about the page load. You can see how long the page took to load (708.77 ms) as well as who requested it, its correlation ID, and so on. This info is useful when the helpdesk gets those ever-popular "SharePoint's slow" calls from users. Now you can quantify exactly what "slow" means, as well as see what led up to the page load being slow. If Web Parts were poorly designed and did a lot of database queries, you'd see it here. If they fetched large amounts of SharePoint content, you'd see it here. If you're really curious, you can click the link on the bottom left, "Show or hide additional tracing information," to get several pages' worth of information about every step that was taken to render that page.

Now that you're sold on the developer dashboard, how do you actually use it? Like we mentioned before, it is exposed as a dashboard at the bottom of the page when it renders. The user browsing the page must be a site collection administrator to see the developer dashboard, and it must be enabled in your farm. By default it is shut off, which is one of the three possible states. It can also be on, which means the dashboard is displayed on every page load. Not only is that tedious when you're using SharePoint, but it also has a performance penalty. The third option, ondemand, is a more reasonable approach. In ondemand mode the developer dashboard is not on, but it's warming up in the on-deck circle, waiting for you to put it in the big game. When the need arises, a site collection administrator can turn it on by clicking the icon indicated in Figure 15-9. When you are finished with it, you can put it back on the bench by clicking the same icon.

How do you go about enabling the developer dashboard to make this possible? You have two options: you can use sad, old STSADM, or you can use shiny, new Windows PowerShell. The following code shows both ways of enabling it.

Using STSADM:

stsadm -o setproperty -pn developer-dashboard -pv on

stsadm -o setproperty -pn developer-dashboard -pv off

stsadm -o setproperty -pn developer-dashboard -pv ondemand

Using Windows PowerShell:

$dash = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings;
$dash.DisplayLevel = 'OnDemand';
$dash.TraceEnabled = $true;
$dash.Update()

Notice that at no point do you specify a URL when you're setting this; it is a farm-wide setting. Never fear, though; only site collection administrators will see it, so hopefully it won't scare too many users if you have to enable it for troubleshooting.
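If you later need to confirm which mode the farm is in, you can read the same settings object back. A quick sketch from the SharePoint 2010 Management Shell:

# Check the current developer dashboard mode (Off, On, or OnDemand)
$dash = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings
$dash.DisplayLevel
$dash.TraceEnabled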

Logging Database

Microsoft has always made it pretty clear how it feels about people touching the SharePoint databases. The answer is always a very clear and concise, "Knock that off!" They didn't support reading from or writing to SharePoint databases, period. End of story. That became a problem, however, because not all of the information administrators wanted about their farm or servers was discoverable in the interface, or with the SharePoint Object Model. This resulted in rogue administrators, in the dark of night, quietly querying their databases, hoping to never get caught.

FIGURE 15-9

SharePoint 2010 addresses this by introducing a logging database. This database is a repository of SharePoint events from every machine in your farm. It aggregates information from many different locations and writes it all to a single database. This database contains just about everything you could ever want to know about your farm. Even better, you can read from and write to this database if you would like, as the schema is public. Do your worst to it; Microsoft doesn't care.

Microsoft's reason for forbidding access to databases before was well intentioned. Obviously, writing to a SharePoint database potentially puts it in a state where SharePoint can no longer read it and render the content in it. We all agree this is bad. What is less obvious, though, is that reading from a database can have the same impact. A seemingly innocent but poorly written SQL query that only reads values could put a lock on a table, or the whole database. This lock would also mean that SharePoint could not render out the content of that database for the duration of the lock. That's also a bad thing. This logging database, however, is just a copy of information gathered from other places and is not used to satisfy end user requests, so it's safe for you to read from it or write to it. If you destroy the database completely, you can just delete it and let SharePoint re-create it. The freedom is invigorating.
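To give you a feel for what reading from it looks like, here is a minimal read-only sketch in PowerShell. The server name, the WSS_Logging database name, and the dbo.RequestUsage view are all assumptions based on a default installation; substitute whatever your farm actually uses.

# Pull a handful of rows from the logging database (read-only)
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=localhost;Database=WSS_Logging;Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT TOP 10 * FROM dbo.RequestUsage"
$table = New-Object System.Data.DataTable
$table.Load($cmd.ExecuteReader())
$conn.Close()
$table | Format-Table -AutoSize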

Figure 15-10 shows some of the information that is copied into the logging database.

FIGURE 15-10

Configuring the Logging Database

How do you use this magical database and leverage all this information? By default, health data collection is enabled. This builds the logging database. To view the settings, open SharePoint Central Administration and go into the now-familiar Monitoring section. Under the Reporting heading, click "Configure usage and health data collection," as shown in Figure 15-11.


FIGURE 15-11

Let's start our tour of the settings at the top. The first checkbox on the page determines whether the usage data is collected and stored in the Logging database. This is turned on by default, and here is where you would disable it, should you choose to.
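The same switch can be flipped from PowerShell if you prefer. A minimal sketch, assuming you run it from the SharePoint 2010 Management Shell:

# Inspect the current usage service settings
Get-SPUsageService | Format-List *

# Turn usage data collection off (use -LoggingEnabled:$true to turn it back on)
Set-SPUsageService -LoggingEnabled:$false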

The next section enables you to determine which events you want reported in the log. By default, all eight events are logged. If you want to reduce the impact logging has on your servers, you can disable events for which you don't think you'll want reports. You always have the option to enable events later. You may want to do this if you need to investigate a specific issue. You can turn the logging on during your investigation, and then shut it off after the investigation is finished.
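If you would rather script those toggles than click through the page, the usage definition cmdlets cover the same ground. A hedged sketch; "Page Requests" is only an example name, so check the output of Get-SPUsageDefinition for the exact names in your farm.

# List the usage event types and their current state
Get-SPUsageDefinition

# Disable an event you don't need reports for, then re-enable it later
Set-SPUsageDefinition -Identity "Page Requests" -Enable:$false
Set-SPUsageDefinition -Identity "Page Requests" -Enable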

The next section determines where the usage logs will be stored. By default they are stored in the Logs directory of the SharePoint Root, along with the trace logs. The usage logs follow the same naming convention as the trace logs, but have the suffix .usage. As with the trace logs, it's a good idea to move these logs off of the C:\ drive if possible. You can also limit the amount of space the usage logs take, with 5GB being the default.
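Both of those settings can also be changed with Set-SPUsageService. A minimal sketch; D:\UsageLogs is a placeholder path and must already exist on every server in the farm.

# Move the .usage logs off the C:\ drive and keep the default 5GB cap
Set-SPUsageService -UsageLogLocation "D:\UsageLogs" -UsageLogMaxSpaceGB 5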

The next section, Health data collection, seems simple enough: just a checkbox and a link. The checkbox determines whether SharePoint will periodically collect health information about the members of the farm. The link takes you to a list of timer jobs that collect that information. When you click the Health Logging Schedule link, you're taken to a page that lists all of the timer jobs that collect this information. You can use this page to disable the timer jobs for any information you don't want to collect. Again, the more logging you do, the greater the impact on performance. Figure 15-12 shows the health data collection timer jobs.

FIGURE 15-12

Clearly, SharePoint collects a vast amount of information. Not only does it monitor SharePoint-related performance, such as the User Profile Service Application Synchronization Job, it also keeps track of the health of non-SharePoint processes, like SQL. It reports SQL blocking queries and DMV (dynamic management view) data. Not only can you disable the timer jobs for information you don't want to collect, you can also decrease how often they run, to reduce the impact on your servers.
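You can get at the same list of collector jobs from PowerShell, which is handy when you want to script these changes. A sketch, assuming the collectors still carry their default "Diagnostic Data Provider" display names:

# List the health data collector timer jobs and their schedules
Get-SPTimerJob | Where-Object { $_.DisplayName -like "Diagnostic Data Provider*" } | Format-Table DisplayName, Schedule

# Disable a collector you don't want, for example the SQL blocking queries provider
Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Blocking Queries*" } | Disable-SPTimerJob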

The next section of the Configure web analytics and health data collection page is the log collection schedule, which enables you to configure how frequently the logs are collected from the servers in the farm, and how frequently they are processed and written to the logging database. This lets you control the impact the log collection has on your servers. The default setting collects the logs every 30 minutes, but you can increase that to reduce the load placed on the servers.
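The collection jobs themselves are ordinary timer jobs, so their schedules can also be adjusted in PowerShell. A sketch, assuming the import job still carries its default display name:

# Find the usage log import job and run it hourly instead of every 30 minutes
$job = Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Usage Data Import*" }
Set-SPTimerJob -Identity $job -Schedule "hourly between 0 and 59"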

The final section of the page displays the SQL instance and database name of the reporting database itself. The default settings use the same SQL instance as the default content database SQL instance, and use the database name WSS_Logging. Although the page recommends using the default settings, there are some pretty good reasons to change its location and settings. Considering the amount of information that can be written to this database, and how frequently that data can be written, it might make sense to move this database to its own SQL server. While reading from and writing to the database won't directly impact end user performance, the amount of usage this database could see might overwhelm your SQL server, or fill up the drives that also house your other SharePoint databases. If your organization chooses to use the logging database, keep an eye on the disk space that it uses, and the amount of activity it generates. On a test environment with about one month's worth of use by one user, the logging database grew to over 1GB. This database can get huge. If you need to alter those settings, you can do so in Windows PowerShell with the Set-SPUsageApplication cmdlet. The following PowerShell code demonstrates how to change the logging database's location:

Set-SPUsageApplication -DatabaseServer <Database server name> -DatabaseName <Database name> [-DatabaseUsername <User name>] [-DatabasePassword <Password>] [-Verbose]

Specify the name of the SQL server or instance where you would like to host the logging database. You must also specify the database name, even if you want to use the default name, WSS_Logging.

If the user running the Set-SPUsageApplication cmdlet is not the owner of the database, provide the username and password of an account that has sufficient permissions. Because this database consists of data aggregated from other locations, you can move it without losing any data. It will simply be repopulated as the collection jobs run.
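For example, to point the logging database at a dedicated SQL instance (the server and database names below are placeholders, not requirements):

# Move the logging database to its own SQL instance
Set-SPUsageApplication -DatabaseServer "SQL02\Logging" -DatabaseName "WSS_Logging"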

To get the full list of PowerShell cmdlets that deal with the Usage service, use Get-Command as follows:

get-command -noun spusage*

Consuming the Logging Database

We've talked a lot about this logging database, what's in it, and how to configure it, but we haven't yet covered how you can enjoy its handiwork. There are many places to consume the information in the logging database. The first place is Central Administration. Under Monitoring ➪ Reporting are three reports that use information in the logging database. The first link is View administrative reports. Clicking that link takes you to a document library in Central Administration that contains a few canned administrative reports. Out of the box there are only search reports, as shown in Figure 15-13, but any type of report could be put here. Microsoft could provide these reports, or they can be created by SharePoint administrators.

The documents in this library are simply web pages, so click any of them to see the information they contain. These particular reports are very handy for determining the source of search bottlenecks. This enables you to be proactive in scaling out your search infrastructure. You are able to see how long discrete parts of search take, and then scale out your infrastructure before end users are affected.

FIGURE 15-13

The next set of reports in Central Administration are the health reports. These reports enable you to isolate the slowest pages in your web app, and the most active users per web app. Like the search reports, these reports enable you to proactively diagnose issues in your farm. After viewing details about the slowest pages being rendered, you can take steps to improve their performance. Figure 15-14 shows part of the report. To view a report, click the Go button on the right.

The report shows how long each page takes to load, including minimums, maximums, and averages. This gives you a very convenient way to find your trouble pages. You can also see how many database queries the page makes. This is helpful, as database queries are expensive operations that can slow down a page render. You can drill down to a specific server or web app with this report as well, since the logging database aggregates information from all the servers in your farm. Pick the scope of the report you want and click the Go button. The reports are generated at runtime, so it might take a few seconds for them to appear. After the results appear, you can click a column heading to sort by those values.


FIGURE 15-14

The third and final set of reports in Central Admin that are fed from the logging database are the Web Analytics reports. These reports provide usage information about each of your farm's web applications, excluding Central Admin. Clicking the View Web Analytics reports link takes you to a summary page listing the web apps in your farm, along with some high-level metrics like total number of page views and total number of daily unique visitors. Figure 15-15 shows the Summary page. When you click a web application on the Summary page, you're taken to a Summary page for that web app that provides more detailed usage information. This includes additional metrics for the web app, such as referrers, total number of page views, and the trends for each, as shown in Figure 15-16.

The web app summary report also adds new links on the left. These links enable you to drill further down into each category. Each new report has a graph at the top, with more detailed information at the bottom of the screen. If you want to change the scope of a report, click Analyze in the ribbon. This shows the options you have for the report, including the date ranges included. You can choose one of the date ranges provided or, as shown in Figure 15-17, choose custom dates.


FIGURE 15-15

FIGURE 15-16


FIGURE 15-17

This gives you the flexibility to drill down to the exact date you want. You can also export the report out to a CSV file with the Export to Spreadsheet button. Because this is a CSV file, the graph is not included, only the dates and their values. These options are available for any of the reports after you choose a web app.

As mentioned, the web analytics reports do not include Central Administration. While it's unlikely that you'll need such a report, they are available to you. The Central Admin site is simply a highly specialized site collection in its own web app. Because it is a site collection, usage reports are also available for it. To view them, click Site Actions ➪ Site Settings. Under Site Administration, click Site web analytics reports. This brings up the same usage reports you just saw at the web app level. You also have the same options in the ribbon, with the exception of being able to export to CSV. Figure 15-18 shows the browser report for Central Admin.

Because these reports are site collection web analytics reports, they are available in all site collections as well as in Central Admin. This is another way to consume the information in the logging database. You can view the usage information for any site collection or web; just open Site Actions ➪ Site Settings to get the web analytics links. You have two similar links: Site Web Analytics reports and Site Collection Web Analytics reports. These are the same sets of reports, but at different scopes. The site collection–level reports are for the entire site collection. The site-level reports provide the same information but at the site (also called web) level. You have a further option of scoping the reports to that particular site, or that site and its subsites. Figure 15-19 shows the options available at the site level.
