Friday, July 30, 2010

Difference between SaaS, PaaS, Cloud Computing and Grid Computing in Microsoft BI and Analytics

BI & Analytics space is filled up with a lot of terminologies that sounds very similar and are often used interchangeably by many vendors. But in general when a client comes up with certain requirements, a clear definition of these terminologies helps where a particular product fits in the overall solution design. Also this helps to get an understanding whether a BI & Analytics vendor has all the capabilities for supporting the type of solution that fits the requirement. Below is a list of such terms and their meanings in my understanding and experience.

1) Grid Computing: In raw words, this term means "computing power". The more computing power you require, the more computing resources you add to the grid. This may sound like cloud computing theory, and in a sense it is: grid computing can be used to power cloud computing, or it can itself be offered as a service on the cloud.

2) Cloud Computing: It can be seen as a service where you scale resources in and out as per requirements and pay per use. At the lowest level these resources are computing power and storage, and more services can be built on top of them. The vendor providing the cloud environment hosts these services at its data centers; clients subscribe and pay as per their needs. Windows Azure and SQL Azure are examples of this.

3) PaaS: This stands for Platform as a Service. The platform could be a BI & Analytics platform, for example, or other platforms such as Data Cleansing, Data Integration, Data Profiling and Data Mining. I see Project Houston and SQL Azure together as a PaaS offering for data storage from Microsoft on the Windows Azure cloud computing environment.

4) SaaS: This stands for Software as a Service. In the BI & Analytics arena, SaaS BI is an environment usually powered by vendor-owned data centers or a third-party cloud computing environment, with the SaaS BI vendor providing the platform to end clients for BI & Analytics. Demand for SaaS BI is growing fast due to its key advantages: almost no hardware or maintenance cost, on-demand scaling of resources in or out, pay-per-use billing, and fast BI & Analytics delivered through a user-friendly interface aimed at business users. In simple words, business users get a web-based application through which they can meet their BI & Analytics requirements just by making their data available to it.

In my view, Microsoft does not have any SaaS BI offering as of now. Microsoft has a cloud offering on the Windows Azure platform, and the Windows Azure Appliance Solution sounds even better at the moment. Project Houston, SQL Azure, Windows Azure and Powerpivot, if made available on the cloud (i.e. available on the Azure platform with no on-premises installation required), could together act as a SaaS BI solution, provided all of these work in an integrated fashion. SaaS also provides for the deployment of deliverables created by business users on a collaborative, web-based platform; in the MS BI stack this is presently done via Sharepoint. An example of the kind of collaborative environment required on the web is Office Web Apps on Skydrive, where you can create and share documents on the web.

A well-known example of a product based on the SaaS BI model is the "BI OnDemand" solution from SAP BusinessObjects Crystal Solutions.

Thursday, July 29, 2010

Self-service Business Intelligence on SaaS Platform Alternatives (Powerpivot Alternatives)

Self-service Business Intelligence is one of the growing markets these days, and business users are looking for the flexibility and capability to analyze data and design reports or dashboards at will. Products or services that provide dashboarding solutions but are really IT developers' toys, leaving business users dependent on IT staff for almost anything these products facilitate, are miles away from self-service BI.

Any MS BI professional would be aware of Powerpivot, the "managed" self-service BI tool for business users in the Microsoft BI stack. But it is not the only star in the constellation of self-service BI tools. Below are the alternatives (to my knowledge, and in no specific order) worth exploring; one or more of these products will fit your needs depending on the stage of BI maturity your enterprise is at.

A few of these are cost-effective, a few suit enterprises that are just starting out in the BI space, a few stand out for in-memory performance, some for better visualizations, and some for overall partner support and product integration. I am reserving a detailed comparison of some or all of these for a future post. If asked for my top 3 favorites, as an MS BI professional I would pick Powerpivot, Qlikview and PivotLink.

1) Powerpivot

2) Qlikview

3) PivotLink

4) Tibco Spotfire

5) IBM Cognos TM1

6) Advizor

7) Altosoft

8) Vizubi

9) SAP BusinessObjects BI OnDemand

Monday, July 26, 2010

Panorama Novaview with Powerpivot for NON managed self-service BI

Recently, someone made me aware of an interesting article titled "Powerpivot & Analysis Services - The value of both", authored by a Panorama Software product manager. The article explains how Analysis Services helps an enterprise model the analytical solution for its known requirements, and how Powerpivot helps extend that solution through self-service. It also describes how Novaview can help business users build something comparable to the capabilities of Performancepoint Services, i.e. KPIs, charts, etc.

If I think of a poor man's analytical BI solution, I think first about cost. Cost can be gauged in terms of expensive licenses for feature-rich editions, and the IT staff required to build and maintain the BI solution for the volatile needs of business users. PPS is part of the Enterprise Edition of Sharepoint 2010, and it does not fall into the category of self-service BI, as a business user cannot be expected to develop a dashboard using PPS. An economical recipe for a poor man's analytical solution is to use Powerpivot both as a data source (by building cubes with Powerpivot) and as the analysis engine for front-end tools like Novaview. This can eliminate the need for Sharepoint 2010 Enterprise Edition; Sharepoint Foundation Edition can be used for collaboration around deliverables developed with tools like Novaview. This is the first part of the savings.

The second part of the savings comes from the fact that these tools are claimed to be easy to learn and are targeted at business users rather than IT service providers. The dependency on IT staff to manage and extend the solution becomes less, which effectively translates to savings.

Of course, these tools cannot be a complete replacement for Sharepoint 2010. But if you need a middle path, where business users build just a few dashboards, KPIs or charts, and you do not want to invest in Sharepoint 2010 because you are not sure you would exploit the full potential of services like PPS, Excel Services and Visio Services, Powerpivot with Novaview is one option to try out. Powerpivot is a managed self-service BI tool, but by adding a layer of tools like Novaview and using Powerpivot as a data source, the managed quotient can be reduced to a fair extent. I would not stress which kinds of enterprises would benefit from this recipe, but I am sure there are enterprises that want to start slowly and steadily before they hire full-fledged IT staff or a solution vendor to design a huge enterprise-class analytical solution using tools like SSAS and Sharepoint 2010 Enterprise Edition. Those enterprises should give this recipe a thought.

Sunday, July 25, 2010

Missing equivalent of Deployment Server Edition property in SSIS

I believe that for anything to become the best, it cannot get everything right in the first phase of its evolution. SSIS has evolved nicely from the 2005 version to the 2008 version and has filled most of the gaps present in the older version. Still, I feel that to take SSIS to the next level, there are a few features worth appreciating and borrowing from other products.

One such feature is the "Deployment Server Edition" project property of SSAS. When the edition you develop against differs from your target deployment edition, this property in BIDS helps with exactly that issue while you work on an SSAS project. If you set it, and you are developing on a higher edition of the product (for example, Enterprise) while your target deployment is Standard Edition, the property stops you from using features in your development environment that are not part of your deployment edition.

To the best of my knowledge, no such equivalent property is available in the SSIS project development environment in BIDS. So if I develop using the Enterprise edition of SSIS and use components like Fuzzy Lookup, and then deploy to a Standard edition server, the package will fail and only then will I discover that I used an Enterprise-only feature. Until an equivalent is available in SSIS, this is one point that should be kept on your SSIS package unit test list; a minimal pre-deployment check is sketched below.
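
As an illustration of that unit-test item, here is a minimal sketch that queries the target server's edition before packages are deployed. The server name is a placeholder, and the check is deliberately simple; the edition string alone cannot catch every edition-specific feature, so treat it as a prompt for a manual package review rather than a guarantee.

```csharp
using System;
using System.Data.SqlClient;

class DeploymentEditionCheck
{
    static void Main()
    {
        // "TARGETSERVER" is a placeholder for the server the packages will be deployed to.
        using (var conn = new SqlConnection(
            "Data Source=TARGETSERVER;Initial Catalog=master;Integrated Security=SSPI"))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "SELECT CAST(SERVERPROPERTY('Edition') AS NVARCHAR(128))", conn))
            {
                string edition = (string)cmd.ExecuteScalar();
                Console.WriteLine("Target deployment edition: " + edition);

                // Components like Fuzzy Lookup need Enterprise (or Developer) edition,
                // so flag anything else for a manual review of the packages.
                bool enterpriseClass =
                    edition.StartsWith("Enterprise", StringComparison.OrdinalIgnoreCase) ||
                    edition.StartsWith("Developer", StringComparison.OrdinalIgnoreCase);

                if (!enterpriseClass)
                    Console.WriteLine("Warning: review packages for Enterprise-only features before deploying.");
            }
        }
    }
}
```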

If you want to compare features by edition across all the services and features available in SQL Server, the best link for that is here.

Wednesday, July 21, 2010

Project Houston CTP and Microsoft Quadrant

Today Microsoft announced CTP1 of the project code-named "Houston". Almost everyone who keeps tabs on SQL Azure will be aware of this news. By definition, Project Houston is a web-based Silverlight interface, managed and hosted in Azure data centers, that facilitates access to and management of SQL Azure data and databases. More about it can be read in this post on the SQL Azure Team blog. Jamie Thomson has generously shared credentials that can be used to connect to a SQL Azure database using project "Houston", and those can be found here.

Regarding the user interface of project Houston, I see in it a scaled-down reflection of the tool "Quadrant", which is part of SQL Server Modeling Services. The user interface is pretty basic as of now, but if it were made similar to Quadrant, I feel it would be a far better way to browse data from SQL Azure than just giving it an SSMS-style user interface. I was very impressed with the UI of Quadrant, and as project Houston is a Silverlight-based tool, it has the potential to go well beyond the traditional grid or Windows-like user interface for browsing and managing data and databases.

Those who are not aware of what Quadrant is and how it looks can go through an article I authored on it here.

Monday, July 19, 2010

MS BI / Data Warehousing Hardware Estimation Tools

HP is one of the leaders in manufacturing performance-oriented hardware for data warehousing and related solutions. In both the Parallel Data Warehouse and Windows Azure Platform Appliance solutions, HP has been one of the hardware vendor partners.

This is well-known information. What many may not be aware of is that HP also provides a set of free tools that help in sizing hardware and can be very valuable for estimation tasks. The first tool HP offered, some time back, was called "HP BI Sizer for SQL Server 2005/2008". HP now provides three more tools that help with different kinds of estimation in different environments. All of these tools are wizard-based: at each step you answer a few questions about your environment, and at the end of the wizard you are given the results and the recommended hardware capacity. As these are HP tools, the recommendations are of course HP servers or HP hardware, but the benefit is that you get an estimate of the hardware capacity your target environment would require.

The great news is that these tools are FREE! Download and check them out:

1) HP Fast Track Data Warehouse Sizer for Microsoft SQL Server 2008

2) HP Business Intelligence Sizer for SQL Server 2005/2008

3) HP ProLiant Transaction Processing Sizer for Microsoft SQL Server 2005/2008

4) HP Integrity Server Transaction Processing Sizer for Microsoft SQL Server 2005/2008

Friday, July 16, 2010

Configuring Kerberos with Sharepoint 2010 for Business Intelligence related features

One characteristic that sets me apart from many famous bloggers is that I am very miserly with appreciation, and I do not appreciate anything at face value. SQL Server is not behind any competing product, but there is still a lot to take away and learn from other products. I find many bloggers posting about how they are in love with SQL Server and every aspect of it. My bread and butter also depends on the MS BI and Sharepoint BI stack of technologies, but when you really love a product, you focus your energies on making it a world-class product. You may not be on the product's marketing, sales or development team, but you start feeling like it when your goals are aligned. In my view, even if you are not a famous blogger with millions of hits and thousands of followers, if you are making even a small contribution to nurture the product that earns your bread and butter, then you, or rather WE, are as valuable as anyone else. In summary, limit the brainless songs of praise and treat your bread-winning product as a businessman would treat his business.

Coming to the subject of this post, Microsoft has released a whitepaper titled "Configuring Kerberos Authentication for Microsoft Sharepoint 2010 Products". Sharepoint 2010 has changed the face of Business Intelligence with the addition of Performancepoint Services and Visio Services, alongside Excel Services. Identity delegation and impersonation, if not configured properly, is one of the issues that can block the entire access and execution mechanism of a BI solution.

With Microsoft, keeping pace is really hard, whether it is the ongoing releases of newer versions of technologies like .NET or staying up to date with the tons of new information released in the form of whitepapers. This whitepaper contains a lot of how-to topics related to Sharepoint and Kerberos, which directly impact BI solution design. I have always feared how I would deal with a project where the entire set of BI products needs to be configured with Kerberos, and for times like that, reference material like this is extremely useful.

Below are some of the general and BI scenarios covered in this whitepaper:

Scenario 1: Core Configuration
Scenario 2: Kerberos Authentication for SQL OLTP
Scenario 3: Identity Delegation for SQL Analysis Services
Scenario 4: Identity Delegation for SQL Reporting Services
Scenario 5: Identity Delegation for Excel Services
Scenario 6: Identity Delegation for Power Pivot for SharePoint
Scenario 7: Identity Delegation for Visio Services
Scenario 8: Identity Delegation for PerformancePoint Services
Scenario 9: Identity Delegation for Business Connectivity Services


You can download this whitepaper from here.

Thursday, July 15, 2010

Free SQL Server Ebook on Dynamic Management Views

If you ever hear of a free SQL Server ebook, the first thing you should do is turn towards the Red Gate website and find out whether they have come out with a new one. This time too, the free ebook news I am going to share with you comes courtesy of Red Gate.

The book is on dynamic management views (DMVs) and is called the SQL Server DMV Starter Pack. What Red Gate says about it is as follows: "The SQL Server DMV Starter Pack will de-mystify the process of collecting information to troubleshoot SQL Server problems using Dynamic Management Views." I had a quick glance through this ebook, and it is the kind of reference material I would like to keep in my MS BI project booster kit. It is an 82-page ebook and explains the use of 28 different DMVs.

You can download this ebook from here.

Tuesday, July 13, 2010

SQL Azure on Windows Azure Platform Appliance Solution

The SQL Azure Team blogged today about the availability of two new offerings: first, the Windows Azure Platform Appliance Solution, and second, the availability of SQL Azure on it. Before I pen down the point I want to make, let me try to put it in one line.

If you don't find the cloud, the cloud will find you! In my view, for those who are not ready to adapt to the cloud computing environment because they have the impression that the cloud is too far away to reach their working environment or technology and put their job at stake, the Windows Azure Platform Appliance Solution is the new giant ready to reach your premises, technology and jobs.

This solution breaks the barrier enterprises feel about using or migrating to the cloud, namely letting data leave their premises. With this appliance solution, enterprises can have their own private cloud in-house with Windows Azure and SQL Azure.

This is one of the most unique cloud-based solutions I know of. After the Parallel Data Warehouse offering, this is the second appliance-based offering from Microsoft related to SQL Server. The appliance partners that participated in the Parallel Data Warehouse solution have participated in this solution too.

What most of us will be interested in is the question, "How does this impact my technology / job?" Below are a few points from the solution home page:

1) The Windows Azure platform appliance consists of Windows Azure, SQL Azure and a Microsoft-specified configuration of network, storage and server hardware.

2) The appliance is designed for service providers, large enterprises and governments and provides a proven cloud platform that delivers breakthrough datacenter efficiency through innovative power, cooling and automation technologies.

3) The Microsoft Windows Azure platform appliance is different from typical server appliances in that it involves hundreds of servers rather than just one or a few nodes. It is designed to be extensible: customers can simply add more servers, depending on their needs, to scale out their platform.

4) The appliance is currently in Limited Production Release to a small set of customers and partners.

Based on the above points, it is not hard to see that this is just a starting point. If it goes well, one can expect enterprises to build private cloud environments hosted in their own datacenters using the Windows Azure Platform Appliance solution. I feel this is a new VMware in the making for the cloud platform, and as of now I do not know of a competing product for this kind of solution. Once it creeps into enterprises, major Microsoft-based technologies like .NET and SQL Server will definitely mutate to adapt to this environment.

These winds of change indicate that future versions of technologies like SSIS and SSRS, and now with this solution even SSAS, can be imagined operating on the cloud, since SQL Azure is also part of this solution. If you are not willing to be cloud-ready and you work with Microsoft-based solutions, I do not see the time being far off when you start risking your job profile, because "the cloud will have found you, since you never thought of finding the cloud"! I look forward to my readers sharing their thoughts on this viewpoint with me.

Sunday, July 11, 2010

SSIS Synchronous Script transformation is missing output paths enumerator

Everyone knows what a Script Transformation is in SSIS. You make the output synchronous by configuring the SynchronousInputID and ExclusionGroup properties. Once these properties are set and you edit the script, you will find a DirectRowTo[OutputPath] method available on the Row object for each output path. Everything is fine up to this point, but the real issue comes now.

As long as I want to redirect a row to one particular output path, there is one method per output path and all is well. But if I intend to route the row to all the output paths, there is no enumerator readily available for iterating over the output paths that I could use for this purpose (to the best of my knowledge). In my view, there should be a collection implementing the IEnumerable and/or IEnumerator interface that could be used in a loop, without the need to write code for each and every output path. .NET folks will know this pattern very well; DB / BI folks not well versed in .NET programming may find it alien.

Still, it is not a show-stopper. A simple workaround is to wrap the calls in a function, coded once, and use that function wherever you need to direct rows to all outputs. When a new output path is added, you modify just the function instead of touching code in individual places; a sketch of this workaround is below. I still wish this enumerator were made available. Do you think so too?
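
For illustration, here is a minimal sketch of that workaround in a C# synchronous Script Component. The output path names (ValidRows and AuditRows) are hypothetical; BIDS generates one DirectRowTo&lt;OutputName&gt; method on the Row object for each output you define, so the helper below is the single place to maintain.

```csharp
// Inside the ScriptMain class of a synchronous Script Component
// (SynchronousInputID set and ExclusionGroup > 0).
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Send every incoming row down all configured output paths.
    DirectToAllOutputs(Row);
}

private void DirectToAllOutputs(Input0Buffer Row)
{
    // One call per output path; when a new output path is added,
    // only this helper needs to change.
    Row.DirectRowToValidRows();
    Row.DirectRowToAuditRows();
}
```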

Thursday, July 08, 2010

Challenges for cloud based SSIS

These days I am busy convincing the right people in my management to build a Noah's Ark. This is to develop readiness for the biggest change I see coming as of today: Microsoft's entry into cloud-based computing. This wave is very strong and is propagating through all the major Microsoft products that form the key constituents of a solution design.

The next obvious question is: what is this Noah's Ark? It is the set of design patterns for adapting the entire Microsoft BI stack of technologies to operate with the cloud-based SQL Server, i.e. SQL Azure, reducing the need for and dependency on on-premises SQL Server wherever appropriate. In summary, readiness for business intelligence on the cloud. SQL Azure is nowhere near the capability of SQL Server, but it is evolving at a very fast pace. Enterprises have already started casting their eyes on Azure, and the basic attractions are cost-effectiveness and ease of maintenance. I can foresee a time in the near future when solution providers bid for a project with two flavors of architecture: one the traditional solution design, the other a cloud flavor of the same architecture. If your prospective client is desperately looking for a cost-effective architecture, a cloud-based architecture can be a silver bullet, and an RFP biased towards the traditional flavor of architecture is sure to fall flat in the client's view.

Coming back to the subject of this post: as of this writing, SSIS is not available on SQL Azure or the Azure platform. But I expect to see this need addressed and the feature added in the near future. As we know, SSIS is mostly about in-memory transformation of data, and memory is the fuel of SSIS. On the cloud / Azure platform, the tradeoff for gaining flexibility in hardware scalability and administration is giving up absolute control over hardware administration. As of now, ETL using SSIS is still tightly coupled to hardware and memory, and it is very interesting to think about what SSIS on the cloud / Azure platform would look like. Imagine each and every data flow transform working on the cloud; I cannot wait to see how similar or different ETL development on a cloud platform will be compared to the way we develop the same on SQL Server.

Theoretically, one thing seems very clear: the tight coupling with hardware will have to be removed for SSIS to be hosted on a cloud platform. That could mean a very major architecture-level change, effectively the birth of an ETL system in a new world named the cloud / Azure platform, and we might still call this ETL tool / system / service SSIS!

Tuesday, July 06, 2010

Certification for Spatial Development using Bing Maps

I came across a very interesting Microsoft certification, which I think can be a real value addition to your spatial intelligence skills. Generally we come across certifications covering an entire product or a programming language, but rarely do we come across a certification on just a control or component. I will stop my usual habit of talking in riddles and come straight to the point.

70-544 is a new certification titled "Bing Maps Platform, Application Development". Details of the skills measured for this exam can be viewed here. Below are the higher-level points:
  • Retrieving Location Solution Data
  • Analyzing Location Data
  • Displaying Location Solution Data
  • Administering Your Location Platform
  • Manipulating the Map
  • Creating and Customizing the Map Display

Though this exam is geared towards Bing Maps development from an application development perspective, as per its description, I see a different angle to it. In SSRS 2008 R2, the Bing Maps control has been introduced. Much of what can be done programmatically can be done through properties of the Map control, but there may be situations where you require more customization than the Map control supports, and you might want to plug the result into your PPS dashboard on Sharepoint.

I have done some Bing Maps application development for R&D purposes, and in my experience it does not require very advanced .NET skills. Pursuing this exam would also mean learning concepts like geocoding, reverse geocoding, geo-fencing and mapping data to geographical locations, and more about how to use Bing Maps to analyze, extract and present information and intelligence on a map in an appealing manner.

The Bing Maps control in SSRS 2008 R2 is just a starting point, and I have every hope that it will go a long way. This track can be quite a booster to your spatial intelligence skill set in general, as well as to your use of Bing Maps from a long-term viewpoint. I am open to conflicting views and curious to hear them from my readers.

Monday, July 05, 2010

Why a SQL Developer or a BI Developer should learn .NET (DotNet)

As of this writing, I work as a Technical Lead designing applications that involve the Microsoft Business Intelligence stack of technologies, and I specialize in the same, but this has not been my background from the start of my career. Like every fresh graduate from an average college, I started my career struggling to find a break in the fiercely competitive Indian IT industry. I was a patron of .NET in those days and spent a few years in application programming using .NET. The peak was when I delivered a .NET-based solution as a project lead for a Fortune 500 company. I completed my Microsoft Certified Solution Developer track over a period of two years, and I worked on applications that involved different BI tools like Netik (you may not even have heard of it, but it was a very popular tool in the banking domain in those days), Business Objects and others. Some of the reasons why I feel every database professional needs to learn .NET are as below:

1) To protect yourself and your database from becoming a slave of the application: The general tendency of application developers, architects and designers (in my experience) is to lean towards keeping the application in shape rather than the database, since the application, not the database, is the face of the solution to the client. Effectively, application developers start taking higher precedence in deciding how requirements are implemented. Many times, as a SQL developer or SSRS reports developer, you might want to challenge a requirement because you feel it could be implemented just as well inside the application, with no need to stretch yourself and your database. But you cannot back that position if you do not understand .NET programming at all.

To give a real-life example from my experience, I was once asked to implement complex logic that would have required a recursive CTE looping over a huge dataset, when in fact the application never used that ordering at all, since the control used to display the dataset could not populate itself based on it. My .NET experience helped me detect this gap and convince the application team that they did not need this from the database, which worked to my benefit as well as my database's.

2) .NET is the programming backbone of the Microsoft BI stack: SSIS and SSRS have controls and components that use VB.NET and/or C# as the programming language. Also, if you want to create a class or control library to extend an existing control or component, or create a new one, knowledge of .NET is a must.

3) Performance optimization: This might come as a shock, but let me explain. When .NET was born, its class libraries were not as powerful or feature-rich as they are today. This evolution has brought new ways of data access such as the ADO.NET Entity Framework, Language Integrated Query, the Data Access Application Block, and others I may not even know of. When your database performance optimization attempts are exhausted, scaling hardware is tried as the final way to boost performance. But most of the time, what is really lacking is the .NET programming skill to teach the application how to be nice to the database.

4) Error control: Yes, this too comes as a surprise to many. Application developers might think of the database as just a box into which data can be packed as required, but you as a database professional know how to keep your database happy. For example, an application developer might pass blank values where no input was received from the user, but you as a database professional understand that NULL is different from a blank ('') value. With an average working knowledge of .NET programming, you would immediately advise the application programmer to pass the DBNull.Value constant instead of such values; a minimal sketch follows.
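
As an illustration, here is a minimal sketch of that advice in ADO.NET. The table, column and parameter names are hypothetical, invented only for the example.

```csharp
using System;
using System.Data.SqlClient;

// Send a real NULL to the database instead of a blank string
// when the user supplied no input.
class CustomerWriter
{
    static void SaveMiddleName(SqlConnection openConnection, int customerId, string middleName)
    {
        using (var cmd = new SqlCommand(
            "UPDATE dbo.Customer SET MiddleName = @MiddleName WHERE CustomerId = @CustomerId",
            openConnection))
        {
            cmd.Parameters.AddWithValue("@CustomerId", customerId);

            // Blank input becomes NULL in the database, not an empty string.
            cmd.Parameters.AddWithValue("@MiddleName",
                string.IsNullOrEmpty(middleName) ? (object)DBNull.Value : middleName);

            cmd.ExecuteNonQuery();
        }
    }
}
```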

I am not biased towards the application, OLTP or BI development roles; I have been in all of them at some point in my career. Being a bridge between application programmers and database developers is a real challenge and a very interesting role in itself. With a working knowledge of application programming in .NET, a database or BI professional can smoothly swing across any stream, right from the application to the data warehouse, in a Microsoft-based solution. I have been in this role, and the experience has been very interesting. If you have been in such a role, or want to be, and have any feedback or questions on this, feel free to share them with me.

Thursday, July 01, 2010

Business Intelligence on Cloud

Business Intelligence on high-end hardware like HP ProLiant servers is a well-known, well-practiced pattern. But business intelligence on the cloud is not even a common thought yet. On my previous post, one of my blog readers asked whether I have used SSIS as a data source for SSRS where the SSRS application runs in the cloud. I thought I would blog about it so I can share my thoughts with everyone.

To the best of my knowledge, none of the BI-related services like SSIS, SSRS or SSAS exists on the cloud as of now. Presently, SQL Azure can be used as lightweight storage for some BI-related requirements. Some examples of probable uses are below:

1) Many times, either a temporary or a permanent staging area is required in ETL solutions, and the use of that kind of staging is volatile. SQL Azure can be one of the best options for this requirement.

2) Many enterprise-level applications are used and implemented across geographies, and these applications often need replication and synchronization of data; effectively, the on-premises federated SQL Server database servers need to be synchronized. SQL Azure can be used as a server facilitating standard or customized synchronization of these database servers.

3) To answer my blog reader's query: SSIS can still be used as a data source for SSRS applications. SSRS cannot run on the cloud, as that service is not available yet, but data can be stored on SQL Azure and accessed from an SSIS package, which can in turn be used as a data source for SSRS reports. SSRS can also access data from SQL Azure directly, and one might want to use SSIS where multiple relational and/or non-relational data sources are involved in creating a dataset. Connecting to SQL Azure in either case is plain ADO.NET, as the sketch after this list shows.
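
For illustration, here is a minimal sketch of reading from a SQL Azure staging table over ADO.NET, the same connection pattern an SSIS ADO.NET source or an SSRS data source would rely on. The server, database, credentials and table name are placeholders, not a real environment.

```csharp
using System;
using System.Data.SqlClient;

class SqlAzureReader
{
    static void Main()
    {
        // SQL Azure expects the user@server form of the login and an
        // encrypted connection; all names below are placeholders.
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net;" +
            "Database=StagingDB;" +
            "User ID=etluser@yourserver;Password=YourPassword;" +
            "Encrypt=True;";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = new SqlCommand("SELECT TOP 10 * FROM dbo.StagingOrders", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Print the first column of each staged row, just to prove connectivity.
                    Console.WriteLine(reader[0]);
                }
            }
        }
    }
}
```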

Given the pace at which the SQL Azure feature set is filling out, availability of the business intelligence stack on the cloud does not seem too far away. My articles on SQL Azure can be read here.