Business Intelligence Blogs

View blogs by industry experts on topics such as SSAS, SSIS, SSRS, Power BI, Performance Tuning, Azure, Big Data and much more! You can also sign up to post your own business intelligence blog.

«November 2015»

DirectQuery in Power BI Desktop

In the latest Power BI Desktop release, a new Preview feature allows you to connect using DirectQuery to either SQL Server or Azure SQL Database.  DirectQuery is a really neat feature that lets you point to the live version of the data source rather than importing the data into a data model in Power BI Desktop. 

Normally, when you want an updated dataset in Power BI Desktop, you have to manually click the refresh button (this can be automated in the Power BI Service), which initiates a full reimport of your data.  This refresh can take a variable amount of time depending on how much data you have.  For instance, if you’re refreshing a very large table you may be waiting quite a while to see the newly added data. 

With DirectQuery data imports are not required because you’re always looking at a live version of the data.  Let me show you how it works!

Turning on the DirectQuery Preview

Now, because DirectQuery is still in Preview, you must first activate the feature by navigating to File->Options and settings->Options->Preview Features, then check DirectQuery for SQL Server and Azure SQL Database.


Once you click OK you may be prompted to restart Power BI Desktop to use the feature.

Using DirectQuery in Power BI Desktop

Next, make a connection to either an on-premises SQL Server or an Azure SQL Database.

Go to the Home ribbon and select Get Data then SQL Server.


Provide your Server and Database names, then click OK. ***Do not use a SQL statement; it is not currently supported with DirectQuery.***


From the Navigator pane choose the table(s) you would like to use.  I’m just going to pick the DimProduct table for this example and then click Load.  You could select Edit and that would launch the Query Editor where you could manipulate the extract.  This would allow you to add any business rules needed to the data before visualizing it.


Next you will be prompted to select how you want to connect to the data. Again, Import means the data

Read more

The Big Data Blog Series

Over the last few years I’ve been speaking a lot on the subject of Big Data. I started by giving an intermediate session called “Show Me Whatcha’ Workin’ With”. This session was designed for people who had attended a one hour introductory session that showed you how to load data, to look at possible applications … Continue reading The Big Data Blog Series

SQL Internals Reading Data Records Part 3: Fixed Length Columns

  • 29 June 2012
  • Author: BradleyBall
  • Number of views: 3932

Hello Dear Reader and welcome back to part 3 of our ongoing series on how to Read a Data Record. 

It is Friday just before the weekend, and now we are getting to the part of the record that means a great deal to us. 

Not that the Tag Bytes and Null Bitmap Offset weren’t important, but now we are finally out of the metadata and into OUR DATA.  The data that we as DBAs store, or that our users store.

So I’ll dispense with the turn of the phrase and dive directly into our data.



First let’s update our chart so we know what part of the Data Record we are tackling.  Once again these images come by way of Paul Randal (@PaulRandal | Blog) and the MCM Video series on Data Structures and the good people from Microsoft.


Our Fixed Length Columns are any data type that always uses the defined amount of space: BIGINT, INT, FLOAT, DECIMAL, MONEY, SMALLMONEY, CHAR, NCHAR, BIT, REAL, DATETIME, DATETIME2, DATETIMEOFFSET, and BINARY.  As mentioned, these fields use 100% of the space assigned to them regardless of the length of the data stored within them.  For example, if I define a CHAR(500) and we insert ‘Some Product’, which is 12 characters long at 1 byte per character, that’s 12 bytes of storage and 488 bytes of leftover space.  An INT always uses 4 bytes even if the number would fit in 1 byte.  Before I break down into a Compression blog, let’s just go straight into our record.
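As a quick sanity check on that arithmetic, here is a small sketch in plain Python (just illustrating the padding math, not SQL Server itself):

```python
# A CHAR(500) column always stores 500 bytes; shorter values
# are padded out with spaces (byte 0x20).
value = "Some Product"             # 12 characters, 1 byte each
padded = value.ljust(500)          # what the CHAR(500) column effectively holds
wasted = len(padded) - len(value)  # trailing padding

print(wasted)  # 488 bytes of leftover space
```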


The scripts we’ve been using in our previous blogs created a fixed-length record of 1504 bytes, which is a little much to put on a page, but the second table we created yesterday, with just two fields taking up 9 bytes, works out well.  So here’s the code if you need it.


CREATE TABLE fixedRecord
(
     myID INT
    ,mychar CHAR(5)
);

INSERT INTO dbo.fixedRecord(myID, mychar)
VALUES(1, 'X');


Don’t forget we’ll use DBCC IND to get our page number.


DBCC IND(demoInternals, 'fixedRecord', 1)


Then we’ll set Trace Flag 3604 on and use DBCC PAGE to look at the output.


DBCC TRACEON(3604);
DBCC PAGE('demoInternals', 1, 282, 3);


Here is the output, greatly pared down to just what is relevant to us. 

0000000000000000:   10000d00 01000000 58202020 20020000           ........X    ...


The record we inserted was (1, 'X'); our 1 will be highlighted in red and our X in orange.  Remember that the output shows each byte as a pair of hexadecimal digits, grouped into 4-byte segments.  So everything in red makes up a perfect 4-byte segment because it is an INT data type, while our CHAR(5) takes up all of one segment plus one more hexadecimal pair, in orange. 


Now that we have them grouped we can start translating them.  The trailing 0s in 01000000 are the unused high-order bytes of the integer field (SQL Server stores integers little-endian, so the low-order byte comes first).  Our value 01 translates to 0x01, which equals 1; if you want, you can convert it to validate.  Our next column will be a little more interesting: 58202020 20.  The first part we are interested in is 58; we’ll use our converter tool to convert that from hex to binary, 01011000.  Our next step is to find out what that binary value translates to in text.  For that we’ll use another conversion tool; just plug in 01011000 and we see that it is binary for the letter X.
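You can reproduce those conversions without an external tool; a few lines of Python do the same job:

```python
code = int("58", 16)        # hex 58 -> decimal 88
bits = format(code, "08b")  # '01011000', the binary form of 0x58
char = chr(code)            # ASCII 88 is the letter 'X'

print(code, bits, char)
```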


“So Balls”,  you say, “What are all the 202020’s for then?”


Dear Reader, your keen powers of perception never fail to amaze me!  The 202020 20s are the white space left over in the CHAR(5) field.  Remember, we only need 1 byte of space for our letter X, but our CHAR(5) takes up 5 bytes no matter what.  If you take a 20 and place it into our handy hex converter you get the binary value 00100000, which when converted to ASCII is a simple space.   So 58 is our X, and 202020 20 are the 4 trailing spaces in the record.




While we have a choice between the CHAR and VARCHAR data types, and thus whether we waste white space or not, we do not have that option with integers or dates and times, which is a perfect reason to start learning about SQL Server compression and reclaim some of that wasted space.


Thanks again for stopping by Dear Reader.





Categories: Analysis Services