Business Intelligence Blogs

View blogs by industry experts on topics such as SSAS, SSIS, SSRS, Power BI, Performance Tuning, Azure, Big Data and much more! You can also sign up to post your own business intelligence blog.

Power BI and Big Data

If you’ve worked in the wide and diverse field of information technology for almost any amount of time, it probably hasn’t taken you long to discover that the one constant in IT is that the technologies and strategies involved change faster than you can learn them. And if you work in business intelligence like I do, you don’t have to look very far at all to see change. The Microsoft Power BI team rolls out a software update every month! If I want to stay current on the technology, I have to really be on top of things.

About ten years ago, when Hadoop was first being developed at Yahoo, I don’t think anyone could have anticipated the size of the ripples (more like cannonball-sized splashes) that access to Big Data could and would have on the IT industry. Hadoop (and other advances in hardware and software technologies) gave us something we never had before: the ability to access and report on data in real time at a scale never previously imagined, which lets an organization identify and understand trends and patterns in the data and gain previously unknown insights. The organizations that are able to leverage big data will be the organizations that leave their competition in the dust.

Set Up and Configure the Hortonworks Sandbox in Azure

Not only does Power BI Desktop give us the ability to connect to the Hadoop Distributed File System (HDFS) for reporting, we can also mash that data up with other, more traditional and structured data sources with minimal effort. But that’s not what this blog post is all about. This post is about setting up a virtual machine in Azure running Hadoop and connecting to our Hortonworks Sandbox with Power BI Desktop :).

The first thing to do if you don’t have access to a Hadoop cluster is to set up the Hortonworks Sandbox on Azure. The good news is it’s free (for the duration of the trial) and it’s super easy. Just follow the instructions at this link to set up the Hortonworks Sandbox.

Hadoop in Azure

Once that’s set up, you’ll need to add mapping for the IP address and host name to your hosts file. Devin Knight has a blog on this that you’ll find helpful.

Connecting to Hadoop with Power BI Desktop

Once your Hortonworks Sandbox is set up, you’re ready to connect to Hadoop with Power BI Desktop. Start up Power BI Desktop and click Get Data. Scroll down, select Hadoop File (HDFS) and click Connect.

Get Data with Power BI

From there you can follow the rest of the wizard to load the data into the semantic model.

Load Data with Power BI

Once the data is loaded, you’ll need to modify the query to navigate to the data you wish to use in your model.

In Power BI Desktop, go to the Home ribbon and click Edit Queries.
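Behind the scenes, every step you take in the Query Editor is recorded as a Power Query (M) expression. As a rough sketch (the sandbox host name, folder and file name below are placeholders, not the actual values from my environment), the generated query looks something like this:

```m
let
    // Hdfs.Files returns a table listing the files found under the given path;
    // the host and folder here are placeholders for your own sandbox
    Source = Hdfs.Files("hdfs://sandbox.hortonworks.com/user/hue/data"),
    // Navigate to the file you want, then parse its binary content (CSV here)
    DataFile = Source{[Name = "mydata.csv"]}[Content],
    Parsed = Csv.Document(DataFile)
in
    Parsed
```

From there, the usual transformation steps (renaming columns, setting data types and so on) get appended to the same let expression as you work in the Query Editor.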

Read more


Three Best Practices for Power BI

Since the release of Power BI Desktop this past week, I’ve been spending my extra time digging into the application, focusing on learning and experimenting as much as I can. When my wife has been watching Law and Order: SVU reruns at night after the rug rats are in bed, I’ve been right there next to her designing Power BI dashboards like the total data nerd that I am. When my kids have been taking their naps during the weekend, I’ve been writing calculations in the model for my test dashboards. And when I’ve been riding in the car back and forth to work, I’ve been thinking of new things to do with Power BI Desktop.

Since I’ve been spending a decent amount of time with Power BI Desktop, I thought I’d take a moment to share three things to know and remember when designing your Power BI models and dashboards that I think will help you make the most of this tool and be effective at providing the data your business needs to succeed.

1. Optimize your Power BI Semantic Model

It probably hasn’t taken you long to figure this one out if you’ve built Power Pivot/Tabular models, and if not, it won’t take long once you start developing Power BI dashboards. The visualizations in Power BI and Power View are heavily metadata driven, which means that column names, table or query names, formatting and more are surfaced to the user in the dashboard. So if you’re using a really wacky naming convention in your data warehouse for your tables, like “dim_Product_scd2_v2”, and the column names aren’t much better, those naming conventions are going to be shown to the users in the report visualizations and field list.

For example, take a look at the following report.

Power BI Dashboard without formatting

Notice anything wonky about it? Check the field names, report titles and number formatting. Not very pretty, is it? Now take a look at this report.

Power BI Dashboard with formatting

See the difference a little cleaned-up metadata makes? All I did was spend a few minutes giving the fields user-friendly names and formatting the data types. This obviously makes a huge difference in the way the dashboard appears to the users. By the way, I should get into the movie production business. ;)

My point is that the names of columns, formatting, data types, data categories and relationships are all super important to creating clean, meaningful and user-friendly dashboards. The importance of a well-defined semantic model cannot be overstated, in my opinion. A good rule of thumb is to spend 80% to 90% of your time on the data model (besides, designing the reports is the easy part).

I’d also like to mention the importance of the relationships between the objects in the semantic model. Chances are you will have a small group of power users who will want to design their own dashboards to meet their job’s requirements, and that’s one of the beauties of Power BI. But when users begin developing reports, they may query your model in unexpected ways that will generate unexpected behaviors and results. I only mention this because the relationships between the objects in the model will impact the results your users see in their reports. Double-check your relationships and ensure that they are correct, especially after you add new objects to the model since the

Read more

Power BI Fantasy Football Player Stats Dashboards for Download

Every year at Pragmatic Works some coworkers, including consultants, marketing staff, support team members, software development staff and project management, partake in a company fantasy football league. And with the recent release of the new Power BI Desktop, I thought: what better way is there to prepare to completely annihilate my coworkers and friends in an imaginary, nonsensical game than by creating some nifty Power BI dashboards based on last year’s player stats as recorded by Yahoo! Sports? So I thought I’d walk you through some of the steps I followed to leverage the Yahoo! Sports NFL player stats page as a data source and some of the query transformations I applied to prepare the data for reporting.

Power BI dashboard with Power BI Desktop

Click here to download my Fantasy Football Dashboards Power BI .pbix file.

If you’re completely new to Power BI Desktop, I highly suggest you watch my video walkthrough of Power BI Desktop or read my blog post which walks you through each step of creating your first Power BI dashboards with Power BI Desktop. Last Friday, I also blogged about my three best practices for designing a killer Power BI solution, so take a look at that.

To create these dashboards, I simply navigated to the Yahoo! Sports NFL stats page, found the page for each position I’m interested in for this fantasy football season, and copied the URL to my clipboard. In Power BI Desktop, click Get Data and then use the Web data source option. Then all you have to do is paste the URL into the text box and click OK.

Get data from web with Power BI Desktop

Then select the HTML table that contains your data and click Edit. We need to edit our query because there are some issues with the data. By clicking Edit, we can apply transformations to our query which will allow us to do things like rename columns, remove unwanted columns, modify data types, create custom columns and much more.

Get data from web with Power BI Desktop

One thing you’ll notice in the above screen grab is that the column names are in the first row, so we need to fix that.

On the Home ribbon of the Query Editor, just click the Use First Row As Headers button. Pre
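That single click is recorded as a Table.PromoteHeaders step in the query. Here is a hedged sketch of how that step and a couple of typical follow-ups look in M (the query name and the column names are made up purely for illustration):

```m
let
    // RawPlayerStats stands in for the HTML table returned by the Web source
    Source = RawPlayerStats,
    // "Use First Row As Headers" generates this step
    PromotedHeaders = Table.PromoteHeaders(Source),
    // Typical follow-ups: fix the data types and drop unwanted columns
    ChangedTypes = Table.TransformColumnTypes(PromotedHeaders, {{"Yds", Int64.Type}}),
    RemovedColumns = Table.RemoveColumns(ChangedTypes, {"Notes"})
in
    RemovedColumns
```

Every transformation you apply in the Query Editor appends another step to this same let expression, which is what makes the query easy to review and tweak later.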

Read more

Power BI Tip: Use the Treemap Chart as a Colorful Slicer

Power BI Desktop has been generally available for over a week now, and some of the pros out there have already come up with some pretty cool tricks.

But if you’re looking for a way to spice up your report filtering with a little color, try using the Treemap chart as a slicer for fields that contain only a few unique values. At this point, Power BI doesn’t offer any customization options for the Slicer visualization (although I’m sure that is coming down the pipe in a future release). This option won’t work terribly well if the field you would like to use as a slicer has more than a dozen or so unique members, but you can experiment with it and see what you can come up with. Here’s my Treemap slicer in action.

tree map slice in action

To multi-select tiles in the Treemap slicer, just hold Ctrl as you click. To reselect

This little trick relies on the natural cross filtering between data regions in the Power BI dashboards. First I created a measure that calculates the distinct count of the field that I wish to use as my slicer. In this case the field is Genre.

Power BI Distinct Count DAX calculation

Then I added a Treemap chart to the report using the field Genre as the Group value and the measure Distinct Count Genre as the Values.
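For reference, the measure itself is a one-liner in DAX. This is just a sketch; the table name Movies is an assumption, so substitute your own table and column:

```dax
-- Distinct count of the slicer field; 'Movies' is a placeholder table name
Distinct Count Genre = DISTINCTCOUNT ( Movies[Genre] )
```

Using a distinct count as the Values field keeps every tile the same relative size, which is what makes the Treemap usable as a slicer.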


Then just resize the Treemap visualization so that the squares are about evenly sized. There are a few ways you can arrange it, but just play around with it and see what you can come up with.

Power BI Dashboard with Treemap Slicer


What do you think? Leave me a comment below and let me know. Or if you’ve got a neat Power BI trick you’d like to share, let me know, as well. I love to hear new ideas! Thanks for reading!

Read more

Importing Excel Power View Dashboards into Power BI

If your organization is now a Power BI customer, congratulations. You’re now ready to create some very cool dashboards, integrate disparate and disconnected data sources and take advantage of Power BI’s ability to modify and transform your data, build interactive and dynamic dashboards and then share them with your team and organization. But until you create your dashboards to take advantage of the new visualization types and other improvements, you can easily import any existing Power View sheets in Excel into your Power BI site.

Power View dashboard in Excel ready for Power BI goodness

Above you’ll see an example of a Power View dashboard that I will import into my Power BI site.

Importing Power View into Power BI

To import Power View sheets into Power BI, navigate to your team’s Power BI site and click the Get Data button at the bottom left.


Then select where your data exists. In my case, I have my Power View reports in Excel saved in my OneDrive folder, so I’ll select Files.



You’ll have to log into your OneDrive account. Once you’ve done that, just navigate to where your Excel file is and then click Connect.

After the import is complete, you’ll see your new Report and Dataset on the left in your My Workspace explorer. Open the new Report based on your Power View sheet. Now I can browse and interact with my Power View dashboard within my Power BI site.


I can also pin visualizations from my Power View report to a Power BI dashboard alongside other visualizations.


And now that we have our Power View report imported and pinned to a dashboard, we can use Q&A to quickly generate visualizations using the imported data set.


Pretty cool stuff! It’s nice to know that I can easily jump from my legacy Power View dashboards to the new Power BI.

It’s important to note that in order to import Excel workbooks into your Pow

Read more

#PowerBI and #SSAS Tabular: A Natural Fit with the Power BI SSAS Connector

SSAS Tabular and Power BI

In late June, the Microsoft Power BI team released the Microsoft Power BI Analysis Services Connector. The Power BI SSAS Connector allows your deployed Power BI reports to utilize your on-prem SSAS data sources. It’s super easy to set up and can be downloaded for free! And who doesn’t love “free”?

Download the MS Power BI SSAS Connector here

Why Use the Power BI SSAS Connector?

Power BI and SSAS

One of the advantages of using Power BI is that the tool has the ability to connect to an incredibly wide variety of data sources, including SQL Server Analysis Services instances. With that in mind, what’s the purpose of using the Power BI SSAS Connector? Why not just load our SSAS data into our Power BI semantic model like we do with our Access, Excel, CSV and web data and then schedule the Power BI semantic model to refresh? That’s a good question.

Power BI: Live & Prime Time with SSAS Tabular

First of all, by utilizing the Power BI SSAS Connector, we are granted a live connection to our SSAS instance. What this means is that every time a user interacts with a filter, slicer, chart or other data visualization, Power BI quickly generates a DAX query behind the scenes, which is sent to your on-prem SSAS Tabular model. Currently, Power BI users are restricted in how often data sources are refreshed: if you’re a free Power BI user you’re limited to one data refresh per day, and if you’re a Power BI Pro user you’re limited to hourly data refreshes. By leveraging the live connection to your SSAS Tabular instance, you can update the data in your Tabular model as often as you are able.

Because the Power BI SSAS Connector allows you to have a live connection to your SSAS Tabular model, this also means that your users experience less latency between updates to their data. Without the live connection to SSAS, each day the users would have to wait for the SSAS Tabular model to be processed and then for the Power BI semantic model to be refreshed. With the live connection, as soon as the Tabular model has finished processing, the Power BI users have access to the most current data instantly. Data is available to your users with potentially much less time between data refreshes.

Currently, the Power BI SSAS Connector only supports live connections to SSAS Tabular instances, although I would expect a future update to support live connections to SSAS multidimensional cubes.

Enterprise Data in Power BI

SSAS Tabular model partitions

Currently, Power BI semantic models are restricted by a data capacity limit. If you’re a free Power BI user you’re limited to 1 GB per user, and if you’re a Power BI Pro user you’re limited to 10 GB per user. This can be a

Read more

Twitter Analysis with #PowerBI & Plus One

Earlier this week Christopher Finlan put together this awesome Datazen dashboard using Plus One. Christopher has been doing a lot of cool things with Datazen so I recommend that you do like I did and subscribe to his blog. But Christopher’s cool work with Plus One inspired me to create my own Social Media dashboard using Plus One, as well.

powerbi search complete

Plus One has created this nifty little desktop application that you can download and install on your computer. Once you’ve set the app up, all you need to do is enter a search query. In my case, I wanted to see what people were doing and saying with Power BI on Twitter. Plus One can only recover the previous seven days of data, so you’ll need to periodically refresh your search or schedule the search, which you can do easily with the Plus One application.

access db datasource

After running the search, the results of your query are saved in an Access database on your machine in the folder C:\Users\username\Documents\Plus One Social. And then all you have to do is use Power BI Desktop to suck the data into your Power BI semantic model and start building some awesome dashboards.

Here’s the report and dashboard I created in Power BI. I haven’t scheduled the Plus One application to refresh the data, so I don’t have any trend reports yet, but I did create some snapshot visualizations to gain insights into who is talking about Power BI and in what context by analyzing the accompanying hashtags. Pretty cool stuff!






Leave your feedback down below! I’d love to see what kind of dashboards you can come up with using Plus One so feel free to leave a comment and a link to your blog, as well!

Read more

Cleaning Your #PowerBI Power Query Code

Over the weekend I found this nifty tool called Power Query Management Studio. Someone shared it on Twitter, and you’ve probably seen the link to download the tool on TechNet. Basically, this tool is a fancy Excel workbook that allows you to easily clean up your Power Query code and insert it back into your Excel workbook or Power BI semantic model. It’s pretty nifty and easy to use, so I figured I’d give you a quick rundown on using it to clean up the Power Query code in my Fantasy Football & NFL stats Power BI model, which you can download here.

To begin using the Power Query Management Studio, download it here.

I want to use this tool to clean up the Power Query code in my Power BI model, so the first thing I’ll do is open the model in Power BI Desktop. Next, we need to capture the Power Query queries; to do this, I’ll click the smiley face icon at the very top of Power BI Desktop and click Send Frown.


A little Send Feedback dialog box will pop up. Uncheck the Include Screenshot box (since we really don’t care about that) and leave the Include Formulas box checked. This will allow us to see the Power Query queries. Click OK.


When you click OK, an email will open that includes your Power Query queries in the body. The code is broken up into queries separated by semicolons, so you can easily see each query.


I copied everything below the line “section Section1;”. Once you’ve copied that code to your clipboard, open the Power Query Management Studio Excel workbook. Clear the sheet called CodePaste (but don’t delete the table) and paste your Power Query queries into the table like so. Then click the Refresh All button up top in the Data ribbon of Excel.
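For context, the pasted code is a Power Query “section document”: each query from the model appears as a shared member terminated by a semicolon. The query names and sources below are made up purely to illustrate the shape of what you paste:

```m
section Section1;

// Each query from the workbook or model appears as a "shared" member
shared Quarterbacks = let
    Source = Web.Page(Web.Contents("http://sports.yahoo.com/nfl/stats/")),
    Data = Source{0}[Data]
in
    Data;

shared Teams = let
    Source = Excel.Workbook(File.Contents("C:\Data\Teams.xlsx"))
in
    Source;
```

The workbook parses these shared members back out, which is how it can line up comments and steps per query.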


After a few moments, the Excel workbook will have completed its magic. There are a few sheets in the workbook I’ll point out that you’ll find useful.


CommentTransfer: T

Read more

Here’s the New #Excel 2016 Chart Types!

The Office 2016 Public Preview is now available for download! Included in the preview of Excel 2016 are a handful of new chart types, and since I’m a huge fan of awesome data visualizations, I thought I’d take a few moments to play around with them and share my experience so you can have a better idea of what to expect in the next version of Excel. If you’re a data & visualizations nerd like me, you’re probably pretty excited!

Now, one thing to be aware of with these new chart types is that if you attempt to create one on top of data in a pivot table, you’ll get a warning like the one seen here. In order to use these charts, you’ll need to create them on top of data that is not in a pivot table (at least for the time being).

Box and Whisker Chart

The Box & Whisker chart is a really nice visualization for getting a quick look at the distribution of data, including outliers, mean, range and quartiles. In the chart below, I pulled in some data with Power Query and performed some analysis on the yards per game for the top four running backs from last season.


You also have control over the chart formatting through some options specific to the Box & Whisker chart type.


Waterfall Chart

The Waterfall chart was just added to Power BI, so you’ve probably already had a look at that visualization. The neat thing about the Waterfall chart is that it allows us to see how the small pieces of a whole contribute to the total. For instance, below I have a Waterfall chart that shows the play stats from the New England Patriots’ first drive in their conference championship game against the Indianapolis Colts, which resulted in a touchdown. Using this chart I can see how each play contributed to the total yards gained on the drive. Pretty cool!


A Waterfall chart could be really useful for monitoring changes in inventory or for viewing balance sheet data.

Sunburst Chart

The Sunburst chart is good for viewing hierarchical data. So if you wanted to view how individual accounts contribute to their parent accounts in a balance sheet, the Sunburst chart could be a really interesting way to visualize that type of data.

Here I’m using the Sunburst chart to analyze the receivers of a few different teams by player and position.

Read more


Refreshing Excel Power Query & Pivot Tables with SSIS and Task Factory

With SSIS 2014 and earlier, there is currently no native way to refresh an Excel workbook that includes Power Query queries. That functionality is rumored to be included with SQL Server 2016, but if you’re currently running SQL Server 2014 or 2012, you are out of luck. That’s why Pragmatic Works put together the Excel Power Refresh component for SSIS.

Configure the Excel Power Refresh Task in SSIS

Configuring the Excel Power Refresh Task is pretty straightforward. There’s not a lot of complexity to this component, which is a good thing.

First create a Connection Manager to your Excel 2013 file that includes your Power Query queries. In my case I have an Excel workbook that has some Power Query queries that query for some data that I used for a blog post on the new Excel 2016 chart types.

Once you create your connection to the Excel file, use the Data Connections and Pivot Table Sheets to select the queries and pivot table sheets that you wish to refresh.


And now I can schedule the refresh of any Power Query data connections or Pivot Tables with SSIS.


Very cool!


Download the free trial of Task Factory here.


If you have any questions or would like some more information on Task Factory, feel free to send me an email or leave a comment below. Thanks!

Read more

Power BI Tip: Use a Scatter Chart to Create a Calendar Report

Power BI Desktop Scatter Chart

The Scatter chart in Power BI and Excel is a very useful chart for visualizing three different metrics in tandem. But with a little bit of work, you can use a Scatter chart to create a calendar chart for visualizing your metrics across the days of an individual month.

New to Power BI Desktop? Start here!

To configure a Scatter chart to mimic a calendar-type report, you need the following:

1) An attribute for the day number of the week (1,2,3,4,5,6,7).
2) An attribute for the day number of the month (1,2,3…29,30,31).
3) An attribute for the week number of the month (1,2,3,4,5,6).
4) An attribute for sorting the week numbers in reverse order.
5) A business metric you wish to represent in the report.

Most of these items you can get from a traditional date dimension. In this example, I’m utilizing the Adventure Works DW database which has a date dimension table.

To set up the Scatter chart correctly, configure the visualization as seen in this screen shot from Power BI Desktop.


I want to point out a couple things here. First, you can optionally add a field to the Legend to differentiate between the weekend and weekdays or to identify holidays, as seen below.

Power BI Desktop Scatter Chart

Secondly, in order to display the calendar in the correct order, we actually need to reverse the order of the week numbers so that the first week of the month is numerically higher than the last week of the month. To do this I used a T-SQL CASE statement to populate the new column in the Adventure Works date dimension table:

[WeekNumberOfMonth]  AS 
    ((datepart(week,[FullDateAlternateKey])-datepart(week,dateadd(month,datediff(month,(0),[FullDateAlternateKey]),(0))))+(1)),
[WeekNumberOfMonthReverse]  AS 
    (case (datepart(week,[FullDateAlternateKey])-datepart(week,dateadd(month,datediff(month,(0),[FullDateAlternateKey]),(0))))+(1) 
        when (1) then (6) 
        when (2) then (5) 
        when (3) then (4) 
        when (4) then (3) 
        when (5) then (2) 
        when (6) then (1) 
    end)
Read more

Setting Up an HDInsight Cluster (No Scripts Required)

Let me start by saying, I am not a fan of scripting. It definitely has its place, and a lot of my peers really like it. It is often the easiest way to get functionality out from software vendors such as Microsoft, and PowerShell is an incredibly powerful tool which can do just about anything. However, therein lies the problem for me. Scripting solves a lot of problems; however, I just wanted to set up and use a basic HDInsight cluster to create some Power BI demos (posts coming soon). So I started the journey of finding the scripts and trying to understand the syntax, and so on. Then I went to the Azure Portal instead; here is what I did to set up my cluster and load data with no scripting required. My goal was simply to get a working demo platform up. Would I necessarily recommend this path for production work? I’m not sure yet. But now I can work with HDInsight with considerably less work required to set up the environment.

HDInsight Cluster No Script Setup Requirements

You need an Azure account. You can sign up for a free account if you like, and if you have an MSDN subscription you should have some credit available as well.

HDInsight Cluster No Script Setup

Once you have your account created, go to the Azure Portal; we will be doing our setup from there. During the process we will be creating a storage account (if this is your first run in Azure, you may choose to set up a resource group as well) and the HDInsight cluster. Be aware that the cluster has compute costs and the storage has storage costs. At the end we will remove the cluster to save your compute time.

Create the Storage Account

This step can be done during the HDInsight cluster creation, but this limits your ability to share data across clusters. If you are just trying it for fun, you can do this during the cluster set up.

Click the + symbol on the portal, then Data + Storage, then Storage Account. This will open a blade with the set up instructions for a storage account.


When you create your account you will have some options to fill in:

  • Name: this name will need to be a unique name, e.g., joescoolhdinsight
  • Pricing tier: The pricing tier is really important if you are using a limited plan or if you plan to keep the data for a long time. If you are planning to use this as a demo, I would select Locally Redundant as that is the lower cost plan.
  • Resource Group: The resource group lets you organize your Azure assets. This is for your benefit, so if you want to keep all of the HDInsight components together, you could create a group for that or stick with the default.
  • Subscription: This lets you choose the subscription you want to use.
  • Location: Be sure to select a location close to you that supports HDInsight. Check to see what Azure services are supported in each region.
  • Diagnostics: This is optional. If you are looking into the diagnostics or need to prep for production, you will find this useful. In most cases, we would not turn this on for demos.

Click Create and it will create your storage account. This may take a few minutes. The notifications section on the portal will alert you when this has been completed. Once that is complete, we will continue with setting up the cluster.

Create a SQL Database for a Metastore

Read more

Uploading Files to an HDInsight Cluster (No Scripting Required)

As I noted in my first post, I am not a fan of scripting. In that post we set up a cluster without using scripts. Now we are going to look at how to upload files without scripts. While this will work for our demo and learning purposes, I would encourage you to use scripting to handle production-level loads, or even if you just want to upload a lot of files. While I am not a fan, that does not mean scripting might not be the better overall tool. However, when I am trying to learn new functionality or work with a system using other tools (in this case Power BI), I find that methods such as these help me be productive sooner.

Prepping to Load Data Into Your New HDInsight Cluster

A key difference between standard Hadoop and HDInsight is file management. With HDInsight, you can load files into Azure Storage and they can be consumed by the HDInsight cluster. Keeping with the No Scripting Required mantra, we will be using a graphical interface to load files into Azure Storage. There are a number of options out there; you need one of them installed. For our example, we will be using the freeware version of CloudBerry Explorer for Azure Blob Storage. Once you have your tool of choice installed, you are ready to get some files.

At this point, you need some files to load. I am using some data I created for another demo. My data is in 7 files of daily receipts for my restaurant for a week in March. Once you have the data, we can load that into the cluster.

Loading Data Into Your New HDInsight Cluster

As noted above, the next steps use CloudBerry Explorer to load our data. In this case, I just copied the folder with my files over to Azure Storage once I connected the tool to Azure.


Once that is done, we will look at working with the data in Hadoop and with Hive.

Creating an External Hive Table and Querying It

You can create two types of tables using Hive – internal and external. An internal table loads the data into a Hive database. An external table applies a schema to the data without moving it. I will be creating an external table. I like this concept because it applies schema to the files that have been uploaded and allows other tools to interact with that data using HiveQL. When you drop an external table, the data remains because the table represents structure only.

In order to help everyone through this (in particular me), the next sections walk through the steps I took to create my table and select data from it. (This is not a detailed look at Hive, but rather a focus on the process of making HDInsight data available using HiveQL.)

Understanding the Files

The first step was to document the structure of the data in the files. Here is the data that I had in each of the files in column order:

  • Ticket Number – int
  • Ticket Date – date
  • Hour of the Day – int
  • Seat Number – int
  • App Amount – int
  • Entrée Amount – int
  • Non Alcoholic Amount – int
  • Alcoholic Amount – int

My structure was fairly simplistic. Each file represented a day.

Creating the Table

Now that I had the structure, I needed to work out the table DDL. (Reference: htt
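While the excerpt cuts off here, the DDL for an external table over delimited files like these follows a standard HiveQL shape. Here is a hedged sketch based on the column list above; the table name, field delimiter and storage location are assumptions you would adjust to match your own upload:

```sql
-- Sketch only: names, delimiter and LOCATION are placeholders
CREATE EXTERNAL TABLE receipts (
    ticket_number        INT,
    ticket_date          DATE,
    hour_of_day          INT,
    seat_number          INT,
    app_amount           INT,
    entree_amount        INT,
    non_alcoholic_amount INT,
    alcoholic_amount     INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'wasb:///receipts/';
```

Because the table is external, dropping it later removes only the schema; the receipt files themselves stay in Azure Storage, just as described above.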

Read more

Using Power BI with HDInsight Part 1: Power Query and Files

With the rise of HDInsight and other Hadoop based tools, it is valuable to understand how Power BI can help you take advantage of those big data investments. If you need to set up a cluster to work with, check out my previous posts on Setting Up an HDInsight Cluster and Loading Data Into Your New HDInsight Cluster. These posts show how to do this with no scripting required. If you prefer to script, there are a number of resources with sample scripts on doing the same work.

In this article, I will focus on using Power Query to get data from the Hadoop file structure in HDInsight. I will be using Excel 2013 with the Power Query add-in, along with the restaurant data I loaded as described in the previous posts linked above, which walk through the process of creating a cluster and loading up data.

Connecting to HDInsight

First, open a new Excel workbook and click the Power Query tab. Once there, you can find the Azure HDInsight source in the From Other Sources dropdown. Select that option to open the following dialog:


You will need your storage account name to continue, followed by the storage account key. Once you have entered the key, the Navigator pane opens on the right side of Excel.


It should show the name of your cluster and the default container name. Double-click the container name to open the Power Query window, which lists every file in the container. Even though the data is organized into folders, the view shows all of the files. If you have a large number of files and don't want to scroll to find them, you can click the down arrow on the Folder Path column and use the text filter to find the folder you are looking for.


Now I have the files I want to use showing in Power Query. If you click a Binary link, it will open a copy of that single file; however, that is not how we want to work with the data, since we have multiple files. (If you did this, remove the steps back to the Filtered Rows step in the Applied Steps section.)


In order to work with them all together we need to Combine Binaries.
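The Combine Binaries button generates Power Query M behind the scenes. As a rough sketch only — assuming the query has already been filtered down to the files you want, and that the files are comma-delimited text — the generated steps amount to something like this:

```
let
    // FilteredFiles is a placeholder for the query's table of files
    // after the Folder Path filter (not a real step name here)
    Combined = Binary.Combine(FilteredFiles[Content]),
    // Parse the combined bytes as delimited text
    Parsed = Csv.Document(Combined)
in
    Parsed
```

Binary.Combine concatenates the file contents into one binary value, which is why this approach works best when every file shares the same layout, as the daily files here do.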

Read more


Time Intelligence Filters in PerformancePoint 2010

  • 10 September 2012
  • Author: Mike Milligan
  • Number of views: 17657

The time intelligence filters provided with PerformancePoint 2010 give developers an easy way to let users specify time periods using common English terms such as "Last 6 months", "Same Period Last Year", "Rolling 3 months", and so on. These filters can be linked inside your dashboard to control Excel Services reports, SSRS reports, scorecards, and analytic grids and charts. Behind the filters are formulas based on the Simple Time Period Specification (STPS).

In the text that follows, I hope to demonstrate the use of these concepts:

  1. Setting up and using Time Intelligence with both tabular and multi-dimensional data sources
  2. Using Time Intelligence with KPIs and Scorecards
  3. Using Time Intelligence with Analytic Grids, Charts, Excel Services reports, and SSRS reports
  4. Using both types of Time Intelligence Filters, the standard time intelligence filter and the connection formula
  5. Using the TI Connection formula to provide users with a From Date To Date range functionality

I have modified the AdventureWorksDW2008 relational database by adding three views: one to extend our date dimension, one to extrapolate data to the current date, and one to use as a tabular data source. I also made some pretty massive changes to the Adventure Works cube to simplify my demonstration process.

You can download those views and the XMLA for the altered SSAS database, here.

Once the cube was processed I had accomplished two things:

  1. My date dimension starts at the beginning of the year (a best-practice recommendation for working with time intelligence in PerformancePoint, but not required).
  2. My fact table has data through the current date.

In order for the time intelligence formulas to work properly, certain things must be set up on the data source connection. We will create two data connections for these examples: one multidimensional data source pointing to our cube, and one tabular data source pointing to a view that combines the data I need from the relational database. The process to add time intelligence to these two data sources is similar, but not identical.

Open PerformancePoint and create a new data connection by right-clicking the Data Connections folder in the workspace browser and selecting "New Data Source." Select "Tabular List" and then "SQL Server Table." In the table field, select the view included in the project files above named "vw_InternetSalesTabularExample." Select the Time tab and select the options checked below in the screenshot.


Next, create the data connection to the OLAP cube. Right-click the Data Connections folder and select "New Data Source," but this time select the "Multidimensional" tab and "Analysis Services." Below is a screenshot demonstrating what the Time tab should look like once complete.


Now that we have both data connections set up, I will first demonstrate using time intelligence with a tabular source. The tabular data source can only be used with filters, KPIs, and scorecards. Right-click your PerformancePoint content list in the workspace browser and select "New," then "KPI." Name the KPI "Internet Sales."

Within the KPI, rename Actual to MTD and rename Target to MTDLY. Click the data mappings for MTD, change the source, select "SalesAmount" in the measure dropdown and click the "New Time Intelligence Filter" button. Enter the following formula into the dialog:


Now, change the data mappings for MTDLY, change the source, again select "SalesAmount" in the measure drop down, click the "New Time Intelligence Filter" button and enter the following formula into the dialog:


Now, right-click your PerformancePoint content list folder in the workspace browser and select "New," then "Scorecard." In the scorecard template dialog, select the "Tabular" tab and "SQL Server Table." Click OK and the wizard will walk you through the next steps. Select your tabular data source and click Next. Click the Select KPI button and choose the Internet Sales KPI we just created. Click Next a couple of times and then click Finish.

Delete the MTD column by right-clicking on it. Then right-click MTDLY and select Metric Settings. Rename it to "MTDLY vs MTD" and select Actual in the "Additional data value" drop down. Click OK. Drag ProductSubCat above "MTDLY vs MTD." Drag "Order Country" to be the parent of the Internet Sales KPI. Click the Edit tab and then the Update button. Your screen should look like this:


To create a scorecard using the multidimensional data source, the steps are pretty much identical.

For the next demonstration we will create a dashboard with an Analytic Grid that uses a standard time intelligence filter. Start out by creating an analytic grid using the SSAS data source that looks like the one in the screenshot below.


Notice the "Date Calendar" is in the background of the grid. Now create a new time intelligence filter using the SSAS data source. We will enter the following Formula/Display Name combinations by clicking the Add Formula button for each one.



Clicking the preview button will show the MDX behind the formulas.


Note: The second row says no results because I ran this on Feb 29th, 2012. There was not a Feb 29th in 2011.

Notice the difference between the monthtodate, yeartodate, quartertodate, and fullmonth formulas and their SSRS-compatible counterparts. SSRS cannot handle the periods-to-date formulas, so I used an alternate range syntax to accomplish the same thing.
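The formula screenshots are not reproduced here, but to illustrate the idea, STPS lets you express a to-date period either with a periods-to-date keyword or with an equivalent explicit range. The pairs below are examples of my own in that style, not the exact list from the original screenshots:

```
-- periods-to-date keyword     -- range-syntax alternative (SSRS-safe)
YearToDate                     Year.FirstMonth:Month      (YTD by month)
MonthToDate                    Month.FirstDay:Day         (MTD by day)
QuarterToDate                  Quarter.FirstMonth:Month   (QTD by month)
```

The range form spells out "from the first period of the year/month/quarter through the current period," which is what makes it usable where the keywords are not supported.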

Now create a new dashboard and drag the Analytic Grid to the design surface. Then drag the TI standard filter to the dashboard and connect it to the grid by selecting Member Unique Name from the TI filter and dropping it onto the drop zone in the analytic grid space. When the connection dialog comes up, make sure you select "Date Calendar" in the "Connect to" drop down.


Deploy your dashboard and test out the filter.

Now, we will create a TI Connection Formula filter. Right-click in your workspace browser and select New, then Filter, but this time select the time intelligence connection formula filter. Add your OLAP data source, click Next, and then Finish. Next, create a new dashboard or a new page on your existing dashboard. We'll use the same analytic grid, but this time hook the TI connection formula filter up to the dashboard. Everything is the same as the screenshot above, except this time click the "Connection formula" button and enter the following into the dialog:


This says: starting from the selected date, go back 6 quarters (1.5 years) and aggregate from there up through 3 quarters back. So if today is 1/1/2012, 6 quarters back is 7/1/2010 (1.5 years away); that is the start of our range.

3 quarters back from 1/1/2012 is 4/1/2011.

Deploy the new dashboard, open SQL Server Management Studio's cube browser and verify the results.





Note: select DATEADD(quarter,-6,'1/1/2012'), DATEADD(quarter,-3,'1/1/2012') returns 7/1/2010 and 4/1/2011. So why did I filter from 7/1/2010 to 6/30/2011 above? Because we are working in quarters: 4/1/2011 starts a new quarter, and that entire quarter is included in the results. If this doesn't make sense to you, try an example using month-6:month-3 instead.
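Written out in full, the quick T-SQL sanity check from the note above looks like this:

```sql
-- Verify the quarter arithmetic (T-SQL)
SELECT DATEADD(quarter, -6, '2012-01-01') AS RangeStart,  -- 2010-07-01
       DATEADD(quarter, -3, '2012-01-01') AS RawEnd;      -- 2011-04-01
-- Because the TI formula works in whole quarters, the quarter that begins
-- on 2011-04-01 is included in full, so the effective end of the filtered
-- range is 2011-06-30.
```

In other words, DATEADD gives you the boundary dates, while the TI formula snaps the end boundary out to the close of its quarter.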

Next I will demonstrate how to give your users a From Date parameter and a To Date parameter for range-based queries using PerformancePoint reports. You can get similar functionality from a single date parameter with the Multi-Select Tree display method; however, you and/or your users may prefer range-based parameters.

The first step is to copy your existing Analytic Grid and give it a new name; we will work from this copy. Open the copy and click the Query tab. Locate this line of code and position your cursor just before the [Date].[Calendar] part.

WHERE ( [Measures].[Internet Sales Amount], [Date].[Calendar].DEFAULTMEMBER )

Type FromDate in the parameter text box and click the insert button. Create another parameter in the same fashion by typing ToDate in the textbox and clicking insert. Modify the where clause so it looks like this:

WHERE ( [Measures].[Internet Sales Amount], HIERARCHIZE( {<<FromDate>>:<<ToDate>>} ) )

Create two new filters, one called TI From Date, and another called TI To Date using the time intelligence connection formula filter.

Now create a new page in your dashboard, drag the copied analytic grid to the design surface, and then drag on the two TI connection formula filters. Connect both filters to the analytic grid, making sure to connect them to the proper parameters. Use Day:Day in both connection formulas. Publish your dashboard and test.

Next we will create a new report in PerformancePoint that connects to an SSRS report with a date calendar parameter. We can create two dashboard pages to demonstrate this functionality: one using the standard TI filter, and one using the TI connection formula filter. The only caveat you need to be aware of is that the members in the standard TI filter that use the ...toDate or ...Full(Month/Quarter) syntax will not work. You will have to use the alternate syntax described earlier to get that same functionality.




The YTD (Non SSRS) filter produces the error:

  • Default value or value provided for the report parameter 'DateCalendar' is not a valid value. (rsInvalidReportParameter)


Note: These numbers match the earlier example we did using the Analytic Grid.

Next we will hook the TI filters up to an Excel Services report. The only issue I had when preparing this demonstration was an error that occurred when previewing the dashboard: "Attempted request on an invalid state. Unable to perform the operation." Google to the rescue!

Basically, I had to uncheck the box in SharePoint Central Administration for my Excel Services application that says 'Refresh warning enabled.'

I also received an error when I used the YTD (Non SSRS) filter with Excel Services.


'An error occurred while attempting to set one or more parameters in this workbook. As a result, none of the parameters have been set. Click Ok to return to the workbook.'


TI Standard Filter with Excel Services report


TI connection formula filter with Excel Services report. Numbers match previous examples.

It should be noted that the 'Show Details' and 'Additional Actions' features are grayed out when using the TI filters with a date dimension in the background of the analytic grid and chart. One (not very good) workaround is to put the date dimension on the rows or columns to get this functionality back. The reason this workaround is not very good is that your report no longer looks the same.

Example using analytic grid:

The dashboard page looks like this:

Right clicking a cell has 'Show Details' and 'Additional Actions' grayed out.


I created a copy of the original and dragged the date calendar from the background to the rows underneath the geography to demonstrate what it would take to get the 'Show Details' functionality back.


Not too bad using the YTD (Non SSRS) filter. (Not too good either...)


But, change it to last 10 days, and it becomes very ugly, very fast.


Moving the date calendar above the Geo in the rows helps a bit in some cases.

So, in a nutshell: there are some caveats when working with the time intelligence features in PerformancePoint. Overall, though, they add some great functionality to your dashboard with minimal effort.

Miscellaneous Facts

Suppose you have a data source with multiple time dimensions and you want to use both. The solution is to create a new data source for each time dimension you want to use in your PPS solution. For example, if the cube you are using has both calendar year and fiscal year dimensions, you can create two data sources using the same server and cube information, with the only difference being the time dimension selected in the Time tab of each data source. When creating KPIs or filters, select the data source with the time dimension that makes sense for that object.

Colon (:)
The colon is used to indicate a range of dates. For example, the statement Day-1:Day-7 selects all the days between yesterday and a week ago, inclusive.

Comma (,)
The comma is used to combine two members. For example, the statement Day-1,Day-7 selects yesterday and the day one week ago as two distinct dates.

You can create two kinds of Time Intelligence dashboard filters:
1. Time Intelligence dashboard filters that include a list of dynamic time periods that you specify
2. Time Intelligence Connection Formula dashboard filters that use a calendar control to specify information as of a particular date. When you create a Time Intelligence Connection Formula dashboard filter, you do not specify a formula until you connect that filter to a report or a scorecard.

Periods-to-date are a new type of TI formula added in Office 14. The result of a to-date period is an aggregation of all time periods up to the last completed full period; incomplete time periods are automatically excluded. By default, they are evaluated at the lowest level of granularity in the data source. For example, if the most granular time period in the data source is days, then the month-to-date expression will aggregate all days from the beginning of the month through the last completed full day of the month. (The opposite is true for standard time periods: they automatically include incomplete periods.)

Periods-to-date are not compatible with SSRS (from personal experience).


Here are some links that helped me put this blog post together.

PerformancePoint Relative Date Time Intelligence with Current Date Time

How to use Time Intelligence Filters with Excel Services or How to Pass a Range Selection into your Excel Report

From Date To Date in PerformancePoint Analytical Chart

Time Intelligence Post Formula Filter Template in PerformancePoint Server

PerformancePoint Time Intelligence - BI for the Masses

Create a Time Intelligence Filter by Using Dashboard Designer

Time Intelligence Differences Between Grids and Scorecards
