Sunday, February 27, 2011

Considerations for selecting a technology stack for your BI Solution

Once your functional requirements are ready and your BAs head your way with PowerPoint slides full of fancy graphics of complex graphs and visualizations, you can consider it a sign that you are about to be hit with the question of which technology to use to develop such a visually appealing solution. Selecting the right technology stack is a part of every solution development effort, and it is almost a given that SQL Server 2008 R2 and SharePoint 2010 are NOT always the right answer. There are certain factors one needs to consider, especially if you are acting as the technical architect for the project. Cost is one of the most important parameters, and until it gets figured out, there is no point in even selecting a technology stack. Apart from cost, a few such considerations from my viewpoint are listed below:

1) Analytical Visualizations: List down all the types of visualizations required by your solution. Many a time, business analysts get carried away by visualizations even though what they need to analyze is very simple, so get this sorted out first. A very basic example is the "Tree Map": PPS 2010 and SSRS 2008 R2 still do not have this visualization. If the visualization limitations of PPS and SSRS are too severe for your requirements, you might want to use another reporting solution and continue with SSIS, SSAS and the DB Engine from SQL Server 2008 R2.

2) Technology Integration: Mostly, whatever you develop with any technology needs to be deployed on a collaborative platform like SharePoint. If the technology / component cannot integrate with your collaboration platform, it is useless. Many organizations mandate a security configuration using ADFS 2.0 for single sign-on, and suppose the third-party tool you selected is not known to play well with claims-based authentication: if you select it anyway, you would be signing a contract to get yourself roasted.

3) Time to Market: PPS 2010 might be able to cater to all your requirements, but ramping up a team with trained skills in PPS 2010 development can be a challenge. If you plan to go to market with your solution on a very aggressive timeline, skills like MDX and PPS can be hard to find. And even if you already have skilled resources, it would still take time to design, develop, test and deploy a solution built from scratch. You might want to use off-the-shelf, ready-made tools / components for a part of your solution.

4) Ease of deployment: Most IT-enabled service providers and consumers have their own infrastructure where they host their IT solutions and services. This infra can be as huge as a few data centers in different cities, with hardware like SANs, RAID, load balancers and clusters. The technology you select should fit into this topology as well as have the potential to exploit the capacity of the existing infra.

5) Performance: Almost every solution provider or consumer has been burnt, a little or a lot, by performance challenges. This factor is on the cards for almost every technology stack before it even gets nominated for evaluation. So the technology you plan to consider should already have some benchmarked performance results, or you should consider creating POCs to extract some form of performance results to compare against standard benchmarks (see the sketch after this list).

6) User Training: You might want to select a tool that is rich in features and visually appealing, for example Tableau. But if your user base consists of power users who just want ready dashboards without any effort, and who would discuss the results of these dashboards on a collaborative medium like SharePoint, then even though you have the best tool and it might pass all of the above criteria, it might not pass this parameter.
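As a minimal sketch of the POC idea in point 5, at the database tier you can time a representative query on each candidate platform and compare the results against your benchmarks. Everything below is a hypothetical placeholder: dbo.FactSales and the aggregation simply stand in for whatever workload is representative for your solution.

-- Minimal T-SQL timing harness for a performance POC (illustrative only).
-- dbo.FactSales and the query are hypothetical placeholders; substitute a
-- representative query from your own workload and run it on each candidate.
SET STATISTICS IO ON;    -- report logical / physical reads per statement
SET STATISTICS TIME ON;  -- report CPU and elapsed time per statement

DECLARE @start DATETIME2 = SYSDATETIME();

SELECT ProductKey, SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY ProductKey;

SELECT DATEDIFF(MILLISECOND, @start, SYSDATETIME()) AS ElapsedMs;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;

Run the same script a few times on each stack under evaluation, discard the first (cold cache) run, and record the remaining numbers in your comparison sheet.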

In summary, while selecting the technology stack for your solution, ensure that you go through a rigorous analysis across all aspects, not just the feature set.

Thursday, February 24, 2011

Considerations for using ADFS with SSRS and Sharepoint

Single Sign-On is generally perceived as a shining beacon of architectural excellence in most corporations that have an array of IT-based services facilitating a variety of business functions and operations. Windows Identity Foundation / Active Directory Federation Services is one of the mechanisms to facilitate single sign-on, i.e. integrated security across domains using a single authentication credential.

SSRS and SharePoint are generally used together, either in native or SharePoint integrated mode, to make reports available on a collaborative platform. When a claims-based authentication mechanism like ADFS is used, users try to access reports hosted on SharePoint, and the claims token (which, with a particular ADFS configuration, also gets passed in the form of cookies) is passed from domain to domain through the federation site. The biggest issue is a limitation of this topology / design: making the token reach SSRS so that it can identify the user. I have witnessed exactly such a scenario, and the resolution adopted was to use Windows Integrated Security for the site in question and abandon ADFS for it.

As a technical architect, you generally will not find yourself within the limited periphery of technologies like MS BI and SharePoint only. You need a good understanding of the technologies that touch your solution from any corner, as "Security" is one of the verticals in your architecture design and diagram. Remember that if you are a technical architect, you cannot wash your hands of the matter with the theory that "I am an architect, and I have no relation with technology !!!". This holds true to the extent that the statement is made during solution design, but immediately after solution design one might be required to implement the solution and find the right technology stack to implement it with. If you continue to make this statement even at that phase, the answer you can expect from your program management is "W. T. F...." :)

Here are some resources that can help you understand this limitation, and ADFS in general, in a better way.

1) SharePoint Adventures : Why isn't Claims working with SSRS?

2) SharePoint Adventures : How to identify if you are using Claims Authentication

3) Identity Developer Training Kit - Use this kit to download technical documentation related to Identity Framework

4) Creating VM Lab environment to test ADFS 2.0 and Step by Step Guides to setup ADFS 2.0

5) ADFS 2.0 Home page

Sunday, February 20, 2011

Free AddIn to create Strategy Maps / innovative data driven visualizations

Strategy maps are a vital, but often one of the more ignored, components of an MIS dashboard in an MS BI solution development methodology. Visio is one of the primary means in the MS BI solution stack to create data-driven diagrams, which are also known as strategy maps when used and developed in the context of dashboard requirements. Visio is of course not a free tool, and it's not the best tool for business users to create visualizations from their data, which is generally stored in MS Office document formats. Visio is not dedicated to the creation of strategy maps, though it facilitates them; it is meant for a variety of purposes, yet a need is still felt for a dedicated designer for the creation of strategy maps. Such a designer can/should facilitate custom-tailoring innovative visualizations as per your business needs.

One of the partially FREE tools for creating data-driven visualizations is BeGraphic. Tons of different types of charts can be created from your data stored in Excel and PowerPoint. The repository is so rich that one can even learn what kinds of business charts / analytical data visualizations exist in the industry. It also has very broad support for the data sources from which it can fetch data, and it offers 10,000 interactive maps for free. I feel that using this tool, you can add a flavor of geospatial support without depending upon the Bing Maps control. It also offers a variety of gauges, meters, sparklines and other infographics for free. In summary, in my experience and knowledge, this is customization for data visualizations provided at its best, and in my opinion it's a must-check-out tool.

Thursday, February 17, 2011

Fast Track Data Warehouse 3.0 - Implementation, Tools, and Planning

Fast Track Data Warehouse (FTDW) 3.0 was announced a few days back. After reading the official announcement, some of the points that made me happy were:

1) Partners are now encouraged and enabled to create their own flavors of reference architectures. When you let businesses come out with their own creativity / business differentiators, you build an ecosystem of partner businesses attached to your product. This effectively gives your product more strength to survive in a competitive environment.

2) ISVs like WhereScape have already come up with a solution supporting Fast Track Data Warehouse. I interpret this announcement to mean that WhereScape has developed tools and/or software to develop and/or maintain FTDW.

3) The business division to which I belong, Avanade, along with other system integrators, is offering services for FTDW.

4) The FTDW home page shares useful planning tools like the Fast Track 3.0 System Sizing Tool, the Fast Track 3.0 Schema Wizard and more.

When you deep dive into the technical details, many of these documents will look like a typical Matrix screen saver to your brain. The reason is that, without the experience of being involved in a real-world implementation, or with almost no background in storage systems, it's hard to plan the configuration of FTDW.

At least what you can do is study the reference architectures and keep the tools in your repository for any future use. FTDW is not a very common implementation, and it's perfectly normal to be confused by it at first sight. Probably what the industry demands is a book on FTDW !

Monday, February 14, 2011

Planning MS BI shared environment for production systems

Multinational corporate clients generally have data centers hosting their BI environments, and different production systems are hosted on these environments. Edition upgrade is a hard fact of the IT world, and everyone, right from developers to production systems, needs to go through this change. The reason I term this a hard fact is that synchronizing with these upgrades is quite demanding.

Business analysis is becoming more competitive, predictive and intelligent with the passage of time, and this demands more intelligent means of analyzing data. In a corporate environment there are many discrete business units, each requiring, developing and hosting its own set of applications on a common BI environment hosted in data centers and managed by a centralized operations team. I know of many clients who started hosting their BI environments on the SQL Server 2005 platform, forgetting that this would not be the ultimate version. In a shared environment many applications are hosted on the same SQL Server instance, which might be managed using a cluster on SAN volumes.

The real challenge comes when a few applications require upgrades and the rest do not vote in favor. The challenging question then is how to isolate your application and instance from the 50-odd applications hosted on the same instance. A significant amount of hardware resources (multi-core motherboards, dedicated servers, SAN volumes, load balancers, clusters, etc.) and human resources (DBAs and operations support staff) is invested in maintaining the environment. Added complexity arises from the security configuration, as corporate environments have single sign-on configured using mechanisms like NTLM / ADFS. Asking for a separate instance is almost guaranteed to fail approval.

In my experience, I have observed certain exercises that can help avoid such circumstances. I am sharing a few of them below:

1) Decouple each service to the maximum extent possible. In raw words, at the very least do not install all the services like SSIS / SSAS / SSRS on the same box.

2) Document the features used in all your artifacts, so that you are always in control of what an upgrade can do to which entities. For example, documenting the features used in each report is beneficial when you are upgrading from 2005 to 2008 R2. If possible, try to host entities with common feature usage on dedicated instances, without ending up with a dozen instances. Use your intelligence and create only as many as you can afford to manage.

3) As soon as a new edition is out, create a track of deprecated, discontinued, upgraded, breaking and value-added features (see the sketch after this list for a quick way to spot deprecated features in use). Creating a risk analysis sheet should be a regular exercise at the release of each new edition. Digging a well only when faced with a fire is a bad exercise; starting to plan your migration strategy only when faced with the urgency of an upgrade is a considerably late exercise and reflects a lack of long-term planning.

4) DBAs generally freak out at the installation of even a tool as tiny as Upgrade Advisor when it comes to installing it on a cluster, and a SQL Server instance shared by different applications makes them completely paranoid. So it's always better to keep green zones on production servers for the installation of tools, as part of contracts / SLAs, so that shared stakeholders do not convert your SQL Server production instance into a DMZ.

5) Take extra care with components that are considered shared by the SQL Server architecture itself. Most designers ignore the little detail that SSIS is a shared component in the MS BI stack, and this can be catastrophic when you are faced with an upgrade in an environment where 20 applications are using the same SSIS instance.

6) Use this guide as detailed reference material to assess the risk that you might already be running with your existing production systems.
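As a companion to point 3 above, SQL Server 2008 and later count the usage of deprecated features since the last service restart and expose the counts through a performance counter DMV; on a 2005 instance you may need to rely on Upgrade Advisor instead. A minimal sketch, assuming you have VIEW SERVER STATE permission on the instance:

-- List deprecated features that have actually been used on this instance
-- since the last service restart (the counters reset on restart).
SELECT instance_name AS deprecated_feature,
       cntr_value    AS usage_count
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Deprecated Features%'
  AND cntr_value > 0
ORDER BY cntr_value DESC;

Since the counters reset whenever the service restarts, sample them over a representative business period before concluding that a feature is unused.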

If you have more tips to add to this post, please feel free to share your comments or email me your viewpoint.

Tuesday, February 08, 2011

SSRS 2005 to SSRS 2008 R2 Migration Strategy, SSIS 2005 to SSIS 2008 R2 Migration Strategy, SSAS 2005 to SSAS 2008 R2 Migration Strategy

Product edition upgrades and the migration of solution artifacts from a lower to a higher edition are quite a challenge and need careful planning. The more experience you have with different migrations, the better you can anticipate possible problems. However deep one's experience may be, data and platform migration is one area where one can always expect surprises. Everyone has a first time, and in a migration you would want to be sure that you have all the supporting tools and some higher-level strategy in mind to design your migration. Below are some guidelines that I have found useful in my career.

1) Firstly, collect all the tools, at least the freewares, that can help you in your migration analysis. SQL Server 2008 R2 ships with SQL Server Upgrade Advisor, which can be the best starting point; this tool is also a part of the SQL Server 2008 R2 Feature Pack. You can learn more about it from here. This tool covers all areas, right from the database engine to SSAS.

2) When you start your design, make a clear distinction between whether you want to perform an in-place upgrade, or create a new instance -> deploy the solution on the new instance -> ensure synchronization between the old and new instances -> abandon the old instance. Check out this article for some more info.

3) Keep in view where you plan to do the upgrade, i.e. on the same box, in the same domain, or across different servers and different domains. This throws up the challenge of security configuration.

4) Environment configuration needs to be planned for each service separately. For example, SSIS packages can be expected to pick up configuration settings from different sources like environment variables, configuration files, databases and others, while SSRS configuration might reside in config files for both the report server and Report Manager. Virtualization is the key factor in testing all such scenarios.

5) Finally, the biggest risk factor needs to be calculated, i.e. identifying the right sample to test on the targeted edition. SQL Server 2005 came with its first mature BI offering; since then, several components of different services have undergone architectural changes, several features have been discontinued, several features have behavioral changes and several features are guaranteed to break when migrating from a lower to a higher edition.

a) Deprecated Features in SQL Server Reporting Services
b) Discontinued Functionality in SQL Server Reporting Services
c) Breaking Changes in SQL Server Reporting Services
d) Behavior Changes in SQL Server Reporting Services

I prefer creating a consolidated list of these features. All the reports should then be analysed to check whether any of them use these features; those that do qualify as the sample to be tested on the targeted edition. This sampling exercise not only generates the right size of sample to test, but the same sample also acts as the acceptance testing procedure.
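One way to automate that analysis, as a rough sketch: the report server stores each report's RDL in the Catalog table of the ReportServer database, so you can search it for the feature strings you consolidated. 'CustomReportItem' below is only a hypothetical example of such a search string; substitute entries from your own list, and treat this as a read-only diagnostic query.

-- Find reports whose stored RDL mentions a given feature string.
-- Run in the ReportServer catalog database (adjust the name if your
-- installation uses a non-default catalog database name).
SELECT c.Path,
       c.Name
FROM dbo.Catalog AS c
WHERE c.Type = 2  -- 2 = report
  AND CAST(CAST(CAST(c.Content AS VARBINARY(MAX)) AS XML) AS NVARCHAR(MAX))
      LIKE '%CustomReportItem%';

The reports this returns are exactly the ones that qualify for the test sample, and the output doubles as the checklist for the acceptance testing routine described below.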

Generally, production environments are handled by an operations team, and they remain in charge of the migration too. Development teams need to confirm that the migration was successful and works as expected. The successful functioning of the sample identified in the above exercise acts as the acceptance testing routine, a contract that needs to be agreed between the development and operations teams in advance, before the migration is performed. Keep in view that this exercise needs to be performed for each service individually: DB Engine, SSIS, SSAS and SSRS.

It's a very brief list, but these points can at least help you align your strategy in some direction when you are totally blank on how to plan your migration. If you have better tips that can add value to this post, please feel free to share your comments.

Monday, February 07, 2011

Download free SQL Server Tools for SQL development, data inspection and schema analysis

My experience has always been that a developer's best friends are his debugging and development tools and books, while his worst enemies are his ignorance and a lack of discipline in keeping himself updated with the pace of evolution in technology. Platforms like CodePlex and the SQL Server community are like a boon to developers. You will not find every platform / environment pampering enough to offer tools that help you in your assignments, as generally every automation comes at a price.

We already know some of the famous free SQL Server tools / utilities / add-ins like BIDSHelper, SSMS Tools Pack, RSScripter, SQL Search, Internals Viewer for SQL Server and a few free tools from Idera. A fresh stock of promising new FREE tools is joining the bandwagon. Atlantis Interactive is offering a set of free SQL Server tools that support SQL development, schema dependency analysis and data store analysis. These tools are:

1) SQL Everywhere
2) Schema Inspector
3) Data Inspector
4) Data Space Analyser
5) Data Surf
6) Schema Surf

Qure is a product from DBSophic that caters to database performance analysis and tuning in its own unique way. Following in the footsteps of Red Gate, they too have come out with a free Trace Analyzer, which can be downloaded from here. Offering free tools / books to the developer community is a trend that product vendors follow to promote their brand and line-of-business applications, and it helps developers make their lives easier. I am not of the opinion that this is any kind of service to the community; rather, it's strategic business promotion that works symbiotically for vendors and clients.


Reference: Aaron's Blog

Tuesday, February 01, 2011

MS BI and Azure solution development using virtualization tools

Virtualization has become an essential part of the toolkit of every IT professional who needs to deal with multiple systems / environments. While working on an MS BI solution, especially on a personal machine, virtualization is very much required to provide a sandboxed environment for the different software that we use as a part of solution development. Betas, freewares, third-party utilities, community projects, open source applications and other such software are generally safe to install in a sandboxed environment, so that they do not inadvertently disturb the balance of your host system. Some of the tools that I generally use in my research and development environment are as below:

1) Virtual PC 2007 SP1 / Virtual PC for Windows 7, depending upon the kind of OS you use. Keep in view that you can install VPC on Windows 7 only if you have Windows 7 Professional or a higher edition.

2) VirtualBox: It might come as a surprise to those who do not use VPC that VPC does not support 64-bit guest operating systems. So if you intend to use a 64-bit OS for the latest editions of software like SharePoint 2010 / SQL Server Denali, which require a server OS or a 64-bit OS, use VirtualBox, which is a freeware.

3) Azure Emulators: The Windows Azure SDK comes with two emulators, the Windows Azure Compute Emulator and the Windows Azure Storage Emulator. If you do not have these emulators, the only option remaining is to sign up for an Azure account, for which of course you need to pay. A brief tutorial on using these emulators can be read from here.

4) Freely available evaluation VHDs from Microsoft can save a lot of the time, effort and resources that you would otherwise spend on the installation and configuration of software during an R&D or similar phase.

I have multiple VPC and VirtualBox images on my Windows 7 machine, and this helps me sandbox each solution / software combination that I configure on a particular virtual machine. Try it out for yourself.