Archive for January, 2009

The BI Bombshell

January 28, 2009

Microsoft’s recent announcement that it was discontinuing PerformancePoint Server as a stand-alone product seems to have provoked the usual tide of Redmond bashing from all quarters. “Oh, they’re abandoning all their BI customers!” – Really? Last time I checked, SQL Server, Analysis Services, Integration Services and Reporting Services were still on the market. And Monitoring and Analytics will be part of the next SharePoint. We don’t have a planning module anymore, though.


So why the fuss? Because it’s MS, that’s why. In a stunning and equally shameless display of opportunism, one BI vendor had, within minutes, an entire spiel on its website about how it would step in to ease the pain of Microsoft’s BI customers. Funny, I didn’t see any mention of a planning module there though.


While it is disappointing that planning is canned, overall it looks positive for Microsoft BI. The platform provides tremendous value, and allows customers to create a base on top of which they can add other tools of their choice, if needed. And then there’s Gemini…

Categories: SQL Server

Percentage Aggregations in Analysis Services

January 23, 2009

Aggregations, as we all know, are what give cubes an advantage at query time. But while summing the lower levels of data to produce an aggregate is fine for a lot of measures, some measures are semi-additive. Now obviously there are options in Analysis Services for semi-additive measures, such as Min, Max and Distinct. What about percentage aggregations? (Or average values?)

Click on the Cube Structure tab, select a measure and you can set this behaviour in the properties. Thing is, if you have percentage data in your cube, you want the aggregate at the next level up to be an average of the child level. There is an option in the properties called AverageOfChildren, but be warned, it doesn’t always work the way you would think! If you’re expecting it to return the equivalent of a simple AVG() with a GROUP BY in SQL, it doesn’t. It seems to average ONLY across the time dimension!

So what to do? Well, we need to implement a calculated member that does a simple sum/count. It will apply this at all levels and give us the behaviour that we require. Let’s use good old AdventureWorksDW. I didn’t find any measures with percentage values, so I’ve decided to create a fact table of my own, using the following query:

SELECT
    SUM(FactRS.SalesAmount) AS SalesAmount,
    SUM(FactRS.TaxAmt) AS TaxAmt,
    SUM(FactRS.OrderQuantity) AS OrderQty,
    MIN(FactRS.SalesTerritoryKey) AS SalesTerritoryKey,
    MIN(DimTime.TimeKey) AS TimeKey,
    SUM(FactRS.SalesAmount) / FactSQ.SalesAmountQuota * 100 AS PercTargetAchieved
FROM dbo.FactResellerSales AS FactRS
INNER JOIN dbo.DimTime
    ON FactRS.OrderDateKey = DimTime.TimeKey
INNER JOIN dbo.FactSalesQuota AS FactSQ
    ON FactSQ.CalendarYear = DimTime.CalendarYear
    AND FactSQ.CalendarQuarter = DimTime.CalendarQuarter
    AND FactSQ.EmployeeKey = FactRS.EmployeeKey
GROUP BY FactRS.EmployeeKey,
    DimTime.CalendarYear,
    DimTime.CalendarQuarter,
    FactSQ.SalesAmountQuota

I’ve created a simple cube linking this fact table to the SalesTerritory, Employee and Time Dimensions.

Now one of the nicer things about Analysis Services is that it always tries to do things for you – it will normally create a measure named after your fact table plus “Count”, in this case Fact Sales Count. If it doesn’t create it, you can create it manually by including a running value field and setting that field’s aggregation to Sum.

Now we need to add the calculated member. Click on the “Calculations” tab, then on the little calculator icon to start. Name your calculated member, select Measures as the parent hierarchy, and in the expression box enter the calculation. This would be:
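As a sketch – assuming the measure names generated from the fact table above, Perc Target Achieved and Fact Sales Count – the expression would be along these lines:

```mdx
// Sum of the percentage column divided by the row count; both base
// measures aggregate as Sum, so the ratio holds at every level of the cube.
// The IIF guards against division by zero on empty cells.
IIF([Measures].[Fact Sales Count] = 0,
    NULL,
    [Measures].[Perc Target Achieved] / [Measures].[Fact Sales Count])
```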


In other words, the sum of the percentage field divided by the sum of the Count field. That should do it: process and browse the cube and verify the results.

So there you go: a bit of work, but it seems to do the trick. Now if only we could get a built-in function to do this for us…

Update 24/01/2009: As a colleague pointed out to me, doing a sum/count of a percentage value won’t always yield the correct answer. What you have to do in that case is bring in the base data and write a calculated member that returns a percentage value at all levels by performing the calculation on the base measures.
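To see why, here is a small numeric sketch (the figures are made up): averaging pre-computed percentages weights every member equally, while dividing the aggregated base measures weights each member by its quota.

```python
# Hypothetical data: two employees with different quotas.
sales = [50.0, 150.0]
quotas = [100.0, 200.0]

# Naive approach: sum/count of the pre-computed percentage column.
percs = [s / q * 100 for s, q in zip(sales, quotas)]
naive = sum(percs) / len(percs)           # (50 + 75) / 2 = 62.5

# Correct approach: aggregate the base measures first, then divide.
correct = sum(sales) / sum(quotas) * 100  # 200 / 300 * 100 ≈ 66.67

print(naive, correct)
```

The two answers only agree when every member has the same denominator, which is exactly why the base measures have to come into the cube.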

Categories: Analysis Services

Custom Code in Reports – Part 1

January 7, 2009

One of the nicer, although I suppose under-utilised, features of Reporting Services is the ability to enhance your reports with a bit of custom code. Why do this? Well, you could implement some sort of difficult business logic in .NET code, and your report fields, parameters and filters could interact with it to take advantage of that functionality. Now there are two ways to implement custom code:

  1. Use embedded code in your report. This is simple to implement and the code is deployed in your RDL.

  2. Create a custom assembly which is called by the report. A bit more work I suppose, but this is very powerful.

Let’s start off with the embedded code. A simple example – I have here a sales report off AdventureWorks. This uses the same query that I’ve shown previously on this blog and is a matrix with a product category grouping on the rows and the year on the columns.

Right, so let’s say that we would like to calculate the mean and standard deviation of these figures and stick them at the end of each row. One of the nice features of the tablix is that you can click on the cell of a column and select Insert Column – Outside Group – Right. That will create a parallel column at the end of the tablix.

To work out the mean value, the formula is simply ∑x / n (the sum of x divided by the number of values). A simple function that could return that for us would be:

Public Function Mean(ByVal Sum As Decimal, ByVal N As Integer) As Decimal
    ' The mean is the sum of the values divided by the number of values
    Return Sum / N
End Function
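For the standard deviation column mentioned above, a similar function could be sketched – this is an assumption on my part, using the population formula rewritten in terms of the sum and the sum of squares, both of which a report can supply as aggregates:

```vb
' Population standard deviation from the sum, the sum of squares and the count:
' Sqrt( SumSq/N - (Sum/N)^2 )
Public Function StdDev(ByVal Sum As Double, ByVal SumSq As Double, ByVal N As Integer) As Double
    Dim Mean As Double = Sum / N
    Return Math.Sqrt(SumSq / N - Mean * Mean)
End Function
```

You would feed it Sum(Fields!SalesAmount.Value), Sum(Fields!SalesAmount.Value * Fields!SalesAmount.Value) and the same count used for the mean.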

To implement the function in your report, right-click anywhere outside the design area, select Report Properties and go to the Code tab. Paste the above function into the code area – you can have more than one function here.

Now to call the function from a cell we use the Code object, so in this example it would be:
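As a sketch – the dataset name “SalesDataset” and the field names here are assumptions based on the query described earlier – the cell expression would be along these lines:

```vb
=Code.Mean(Sum(Fields!SalesAmount.Value), CountDistinct(Fields!CalendarYear.Value, "SalesDataset"))
```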


As you can see, I’m supplying the Sum of my sales value and a Count Distinct of the Year values (across the entire dataset) as the two arguments to the function. This is simple enough and will result in the following output:

Now I know what you’re thinking – you’re thinking we didn’t need a function for that simple calculation. Well, the real value in this comes into play for something a bit more complex, something that you don’t want to write from scratch in that little cell. And what if you need the same calculation for many cells in the same report? You create one function and call it from many places in the report. It’s quite useful, although just be careful about writing too many of these functions in your reports – maintenance can quickly become a nightmare, especially if these functions are hidden in there and not documented. In part 2, we’ll do something a lot more interesting, that is, call a custom component from a report.

Categories: Reporting Services