Thursday, October 29, 2009

How to automate SSRS reports deployment or How to create a setup for SSRS Reports deployment

Today, on 29th October 2009, I had the chance to present the use of a tool called RSScripter for deployment of SSRS reports at the London SQL Server Users Group. SQL Server MVP Jamie Thomson presented a nice session explaining the use of MSBuild to automate SSRS report deployment, and in line with the same topic I got a few minutes to present my know-how on the subject.

As documented on the product site, "Reporting Services Scripter is a .NET Windows Forms application that enables scripting and transfer of all Microsoft SQL Server Reporting Services catalog items to aid in transferring them from one server to another. It can also be used to easily move items en masse from one Reporting Services folder to another on the same server. Depending on the scripting options chosen, Reporting Services Scripter can also transfer all catalog item properties such as Descriptions, History options, Execution options (including report specific and shared schedules), Subscriptions (normal and data driven) and server side report parameters."

It has a very nice GUI and the options are pretty self-explanatory. There are a few steps one needs to follow to create a deployment setup for the reports using this utility.

[Screenshot: RSScripter user interface]


Step 1: Configure the following


a) On the Globals tab, point to the correct path of the RS.exe location


b) The target report server URL where you intend to deploy your SSRS reports


c) The SSRS solution or project file under which all the SSRS reports are added, or select the individual reports that you want to target. You can also click the Get Catalog button to list the reports already present on the server you have configured.

d) The output location where you want the setup files to be created


e) Select Generate Report Script on the Report Tab

f) Select the Specify New Parent Folder / Create Folder Hierarchy option on the General tab, as per your requirement.

g) Use the other properties as required to fine-tune the deployment settings. Detailed information about the effect of each of these properties can be found in the tool's readme file, hosted on its product site.

Step 2: Click the Script button to create setup files

Step 3: Execute the generated setup batch file, and this deploys all the reports.

Behind the scenes, the batch file runs RS.exe against the script file (a VB.NET .rss file) generated for each SSRS report. Each script contains the code that recreates the corresponding catalog item through the Reporting Services web service and sets all the properties of that item that were collected through the user interface.
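To make this concrete, here is a minimal sketch, in Python, of what the generated batch file effectively does: loop over the script files in the output folder and run each one through RS.exe against the target server. The RS.exe path, server URL, output folder and .rss extension are assumptions for the example; in practice you simply run the batch file that RSScripter generates.

# Minimal sketch only: roughly what the generated batch file does.
# The paths, server URL and file extension below are assumptions, not RSScripter output.
import glob
import subprocess

RS_EXE = r"C:\Program Files\Microsoft SQL Server\100\Tools\Binn\rs.exe"  # assumed RS.exe location (Globals tab)
TARGET_SERVER = "http://myreportserver/ReportServer"                     # assumed target report server URL
OUTPUT_FOLDER = r"C:\RSScripterOutput"                                   # output location chosen in Step 1 (d)

for script in glob.glob(OUTPUT_FOLDER + r"\*.rss"):
    # -i: script file to execute, -s: report server to run it against
    result = subprocess.run([RS_EXE, "-i", script, "-s", TARGET_SERVER])
    if result.returncode != 0:
        print("Deployment failed for", script)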

The biggest assets of this utility are that it is freeware and that it uses only script files, an MS-DOS batch file, and RS.exe to create a deployment setup. The output can also be version controlled using any version control software. The setup created by this utility is very transparent to understand: there are no extra assemblies (.dll), and because it builds on RS.exe and plain script files, no additional piece of code or software is required to interpret it.

If there is still someone left who is a fan of hand coding, all the information on how to create deployment scripts for each different type of SSRS object can be found on this page.

Thursday, October 22, 2009

What is a Dashboard, What is a KPI, Real World Example of Performance Dashboard

Most of those who know what a Dashboard and/or a Key Performance Indicator (KPI) is must have read some bookish examples explaining that it is similar to the dashboard in your car or a flight cockpit. One might even have seen pictures of what a dashboard looks like, but only once you use one do you realise how management can take strategic decisions based on the top-level, summarized data of the organization.

It's said that "A Picture is worth a thousand words", and I say that "A Dashboard is worth a thousand queries". You can get all the definitions and pictures of a Dashboard from Wikipedia or some books. But if you want a real feel for what a dashboard looks like, there are two great places to take lessons from:

1) MS BI VPC: This one is a good example for BI developers, as it comes with great examples of everything that can be developed using the Microsoft BI stack of technologies. It also includes a dashboard for AdventureWorks Cycles, created using the PerformancePoint Monitoring module. The VPC comes loaded with evaluation versions of all the software that has one Business Intelligence feature or another, from SQL Server and MS Excel to PerformancePoint.

2) Real-time Example: This is one of the best ways to learn what a dashboard looks like and how it helps management take strategic decisions based on information presented in the form of a decision support system. Someone who is new might not be able to dive quickly into BI tools, but there is a very nice site where one can see a dashboard in live action.

This website is called the Federal IT Dashboard. It is a magnificent website and the dashboard presentation is majestic. It is published by the US government to show the public its IT spending across different portfolios. Whether you are a seasoned BI developer or a novice BI aspirant, I recommend taking a look at this website to get a real feel for how a dashboard can make a difference and what an interactive dashboard can look like.

Tuesday, October 20, 2009

Project Gemini is now SQL Server PowerPivot for Excel and SharePoint

I posted yesterday about the arrival of a new tool called PowerPivot, but I didn't realize that it is Gemini. Via a post on Chris Webb's blog, I read in a post on the Microsoft SharePoint Team Blog that at the SharePoint Conference they announced that the official name for “Gemini” is SQL Server PowerPivot for Excel and SharePoint.

Below is an excerpt from the post where they announced this:

Historically, business intelligence has been a specialized toolset used by a small set of users with little ad-hoc interactivity. Our approach is to unlock data and enable collaboration on the analysis to help everyone in the organization get richer insights. Excel Services is one of the popular features of SharePoint 2007 as people like the ease of creating models in Excel and publishing them to server for broad access while maintaining central control and one version of the truth. We are expanding on this in SharePoint 2010 with new visualization, navigation and BI features. The top five investment areas:

1. Excel Services – Excel rendering and interactivity in SharePoint gets better with richer pivoting, slicing and visualizations like heatmaps and sparklines. New REST support makes it easier to add server-based calculations and charts to web pages and mash-ups.

2. Performance Point Services – We enhanced scorecards, dashboard, key performance indicator and navigation features such as decomposition trees in SharePoint Server 2010 for the most sophisticated BI portals.

3. SQL Server – The SharePoint and SQL Server teams have worked together so SQL Server capabilities like Analysis Services and Reporting Services are easier to access from within SharePoint and Excel. We are exposing these interfaces and working with other BI vendors so they can plug in their solutions as well.

4. “Gemini” – “Gemini” is the name for a powerful new in memory database technology that lets Excel and Excel Services users navigate massive amounts of information without having to create or edit an OLAP cube. Imagine an Excel spreadsheet rendered (in the client or browser) with 100 million rows and you get the idea. Today at the SharePoint Conference, we announced the official name for “Gemini” is SQL Server PowerPivot for Excel and SharePoint.

5. Visio Services – As with Excel, users love the flexibility of creating rich diagrams in Visio. In 2010, we have added web rendering with interactivity and data binding including mashups from SharePoint with support for rendering Visio diagrams in a browser. We also added SharePoint workflow design support in Visio.

Reference: cwebbbi.spaces.live.com

Monday, October 19, 2009

Data Analysis Tool / Add-In to extract and develop Business Intelligence using Excel and SQL Server: PowerPivot

It looks like Microsoft is all set to deliver a new baby into the world of Business Intelligence, and it is named PowerPivot. Below is an excerpt from the PowerPivot product site:

Overview:

PowerPivot for Excel 2010 is a data analysis tool that delivers unmatched computational power directly within the application users already know and love—Microsoft Excel. It provides users with the ability to analyze mass quantities of data and IT departments with the capability to monitor and manage how users collaborate by integrating seamlessly with Microsoft SharePoint Server 2010 and Microsoft SQL Server 2008 R2.

BI Offerings from PowerPivot:

Give users the best data analysis tool available: Build on the familiarity of Excel to accelerate user adoption. Expand the existing capabilities with column-based compression and in-memory BI engine, virtually unlimited data sources, and new Data Analysis Expressions (DAX) in familiar formula syntax.

Facilitate knowledge sharing and collaboration on user-generated BI solutions: Deploy SharePoint 2010 to provide the collaboration foundation with all essential capabilities, including security, workflows, version control, and Excel Services. Install SQL Server 2008 R2 to enable support for ad-hoc BI applications in SharePoint, including automatic data refresh, data processing with the same performance as in Excel, and the PowerPivot Management Dashboard. Your users can then access PowerPivot workbooks in the browser without having to download workbooks and data to every workstation.

Increase BI management efficiency: Use the PowerPivot Management Dashboard to manage performance, availability, and quality of service. Discover mission-critical applications and ensure that proper resources are allocated.

Provide reliable access to trustworthy data: Take advantage of SQL Server Reporting Services data feeds to encapsulate enterprise systems and reuse shared PowerPivot workbooks as data sources in new analyses.

Wednesday, October 14, 2009

SSRS Performance Optimization and Analysis

Recently, I came across a very good article on SSRS Performance Analysis by Tom Mills using a tool called Fiddler. As the homepage of this tool states, "Fiddler is a Web Debugging Proxy which logs all HTTP(S) traffic between your computer and the Internet. Fiddler allows you to inspect all HTTP(S) traffic, set breakpoints, and "fiddle" with incoming or outgoing data. Fiddler includes a powerful event-based scripting subsystem, and can be extended using any .NET language."

This article demonstrates the process of debugging performance issues of SSRS reports at the client end. When everything seems okay, from the execution of the stored procedures behind the report to the network and report server configuration, one needs to look at what is going over the wire, i.e. what is going out of and coming into the client network. This issue is not specific to SSRS reporting: in any web-based, server-side application development, when the web application has to be debugged for performance issues, or even benchmarked against certain parameters, the same method is used. And the method is placing an HTTP proxy or HTTP sniffer between the client and the server.

Fiddler can be installed on any of the client machines, and any HTTP activity caused by report execution on that machine and network can be logged and investigated. Its most promising feature is, firstly, that it is freeware. The tool also comes with several add-ons that are well suited to analyzing the content and requests flowing over the wire, with all the minute details that can possibly be captured. It's a very nice tool for performance debugging, and in my view BIDS should have shipped with this kind of tool. Fiddler is developed using Microsoft .NET and also provides the flexibility to extend it with code.
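To illustrate the idea of looking at what goes over the wire, the sketch below times and sizes a single report request from a client machine, routing it through Fiddler's local proxy (which listens on 127.0.0.1:8888 by default) so the traffic also shows up in Fiddler. The server name, report path and output format are assumptions for the example, and it assumes the report server accepts the request without extra authentication handling.

# Minimal sketch: measure one SSRS URL-access request from the client side.
# Server, report path and format are placeholders; authentication is not handled here.
import time
import urllib.request

proxy = urllib.request.ProxyHandler({"http": "http://127.0.0.1:8888"})  # Fiddler's default listener
opener = urllib.request.build_opener(proxy)

url = ("http://myreportserver/ReportServer"
       "?/Sales/MonthlySales&rs:Command=Render&rs:Format=HTML4.0")      # assumed report and format

start = time.time()
response = opener.open(url)
body = response.read()
elapsed = time.time() - start

print("HTTP status   :", response.status)
print("Bytes received:", len(body))
print("Elapsed (sec) :", round(elapsed, 2))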

Thumbs up for this tool and for the article, which is itself a great tutorial explaining the methodology of performance debugging SSRS reports from a client network.


Reference: Tom Mills Blog

Tuesday, October 13, 2009

Use of Data Source Views in SSIS


I have rarely seen Data Source Views (DSVs) being used in an SSIS solution. The biggest trait of a DSV is that, once it is defined, it does not refresh by itself. In one way that is desirable, and in another it is not. It is harmful in that, if the underlying source structure changes, the package gets no notification of it until you explicitly hit the refresh button. But once refreshed, the DSV picks up all the changes from the underlying data source and updates the view.

The main advantage of using a DSV in an SSIS solution is centralizing the connection logic and the scope of the database objects to be used. All the packages in the solution can use the same DSV and connect only to objects within its scope. If the logic changes, the change cascades easily to the packages using the DSV. One DSV also always runs under one connection object, which means that by enforcing the use of a DSV, the uncontrolled creation of private database connections is implicitly controlled. At times it also helps package development continue even when the server is offline or disconnected.

And if one needs to connect to database objects that are not included in the DSV, a connection can always be created from the shared data source on which the DSV is built. This is a direct connection to the database, without the filtering of database objects that a DSV provides. All in all, a DSV is a good way to centralize and control connection usage in an SSIS solution.

Tuesday, October 06, 2009

Common mistakes or errors while working with Derived Column transformation component

1) Many times columns are compared with some value to determine a derived column value. But if the column contains NULL values, this component breaks because it cannot make the comparison. So always check that the column you are comparing is not NULL using the ISNULL function, and then combine that check with the rest of the formula using the logical AND operator, for example (!ISNULL(Status) && Status == "Y") ? "Active" : "Inactive" (Status being just an illustrative column name). This makes your formula bulletproof.

2) When you change an existing formula, this component automatically changes the data type to Unicode (DT_WSTR) unless it is explicitly cast back to ANSI (DT_STR). This sudden change can break components further along the flow, such as Sort, which detect any metadata change. A better option can be to edit the formula in the Advanced Editor to get around this buggy behaviour of the transform. This is especially the case in SSIS 2008.

3) While developing a formula, many get confused by the return types. Make sure that across the formula the return type is the same data type irrespective of which branch of the condition is taken. Even if the intention is to return NULL, cast the NULL to the specific data type that the other part of the logic returns, for example NULL(DT_STR, 50, 1252) when the other branch returns an ANSI string of length 50.

Sunday, October 04, 2009

Data Cleansing Tutorial : What is Data Cleansing

I have worked on data migration projects where I was part of data cleansing activities using SSIS. One of the readers of my blog requested that I post some tutorials on Data Cleansing, so based on my experience I would like to answer a few questions about it. The point of interest is "What is Data Cleansing?", and the next question is "How can Data Cleansing be commenced and concluded?"

When is Data Cleansing required?

Generally, Data Cleansing is required in a systems-integration flavor of project: for example, when different systems that are logically related, like back-office and front-office systems, are brought together, or when unrelated systems, like different business applications that are independent in their own right but function under a common umbrella, are merged. In both examples one thing is common: these systems operate differently and are subject to integration.

What is Data Cleansing?

Data in the above-mentioned systems is not always as expected, or it is not of the quality required to be fed into the new system. For example, there can be OLTP applications or front-office systems where data is not entered precisely, due to operator error or limited system design. These data issues need to be fixed, mostly in the host or an intermediate system, so that the data meets the requirements of the new system, where a particular level of data quality is required.

How to do Data Cleansing, using SQL Server in particular?

There are different ways of doing Data Cleansing and, to the best of my knowledge, no particular theory or formula for it. But one thing applies universally to any data cleansing exercise: the data should be profiled thoroughly. The level of data cleansing you can achieve is directly proportional to how thoroughly the data is profiled. The hard part is usually the profiling rather than the cleansing, as profiling takes business knowledge combined with technical analysis of the data. Once the details of what needs fixing in the data have been catalogued, it is generally not that hard to fix those issues using an ETL tool like SSIS. SSIS comes with a dedicated Data Profiling task and a dedicated viewer for the profiled output.
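As a rough illustration of the kind of basic profile that feeds a cleansing exercise, the sketch below counts nulls, distinct values and maximum lengths per column in a flat-file extract. The file name, and treating empty strings as nulls, are assumptions for the example; on a real project the SSIS Data Profiling task, or SQL against the source tables, would produce a far richer profile.

# Minimal, illustrative column profiling of a flat-file extract.
# The file name and the definition of "null" (empty value) are assumptions.
import csv
from collections import defaultdict

null_counts = defaultdict(int)
distinct_values = defaultdict(set)
max_lengths = defaultdict(int)

with open("customer_extract.csv", newline="") as source:
    for row in csv.DictReader(source):
        for column, value in row.items():
            if value is None or value.strip() == "":
                null_counts[column] += 1
            else:
                distinct_values[column].add(value)
                max_lengths[column] = max(max_lengths[column], len(value))

for column in sorted(distinct_values.keys() | null_counts.keys()):
    print(column,
          "- nulls:", null_counts[column],
          "distinct:", len(distinct_values[column]),
          "max length:", max_lengths[column])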