Archive

Archive for the ‘Analysis Services’ Category

Using Excel to connect to Analysis Services

February 24, 2011

A customer mailed me with a problem last week: they had some Excel workbooks set up (normal Excel, not PowerPivot) that sourced data from Analysis Services cubes, and some users could not get the workbooks to refresh the data.

In this situation many people quickly point the finger at the poor Analysis Services guys, but there are some things that you can check on the Excel side. Remember that when you create the connection in the Excel wizard, it generates a connection string similar to what you would create in Visual Studio (for example, when you are connecting from Reporting Services to a cube).

So my first port of call in a situation like this would be to check the connection string being generated. To do this, go to:

1) Data (on the ribbon)

2) Then Connections

3) Select the connection that you are using and click on Properties

4) Click on the Definition tab

In the “Connection string” box you will see the connection string, and it is here that you can play with the various options. In my customer’s case, I suspected that a security issue was preventing the data refresh, so I added “Persist Security Info=True;” to the connection string, and that did the trick. It should have been there by default, though.

I’ve had some customers remove the “MDX Compatibility=1” and “MDX Missing Member Mode=Error” settings and it helped fix problems, but remember they are there for a reason. You can also try other tricks here, like increasing the timeout and packet size: “Command Timeout=180” (or however many seconds you wish to specify), although once again, do so with caution.
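To give you an idea of what you are looking at, a connection string with the options mentioned above might look something like the one below. The server and database names are made up for illustration, and the provider version (MSOLAP.4 here) will depend on the client libraries installed:

Provider=MSOLAP.4;Integrated Security=SSPI;Persist Security Info=True;Initial Catalog=Adventure Works DW;Data Source=MyServer;MDX Compatibility=1;Safety Options=2;MDX Missing Member Mode=Error;Command Timeout=180;Packet Size=32767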

Categories: Analysis Services

Analysis Services UDM – very much alive

November 16, 2010

There seems to be some anger after the recent announcements at SQL PASS. The big announcement from an Analysis Services perspective was the ability to create what’s called the BI Semantic Model (BISM), basically a relational model that uses the VertiPaq engine for performance.

Almost immediately, people were complaining that after years of investment in the Analysis Services UDM model (and MDX skills), they were not happy to throw all of that away. Well, you don’t have to. I think there was a knee-jerk reaction here: the UDM is very much alive and well, and people will still be able to write MDX. In fact, one presentation showed that in future you might be able to query the BISM with MDX, so there is a roadmap for the MDX language.

The idea behind this new model seems to be to bring in some simplicity; it aims to be less complex than the UDM. The UDM will still be there, and will be more powerful but more complex. Perhaps the UDM will be used for tier-1 solutions in future?

If anything, Analysis Services as a product just got a whole lot better, with a second modeling option now available to you. This knee-jerk reaction takes me back to the PerformancePoint announcement: some people don’t realize that PerformancePoint is still alive, and actually better, since you now get it free with SharePoint Server Enterprise.

If you want to read more, here’s TK Anand’s post. He does mention that there have been some improvements in UDM for Denali.

If anything, I’m surprised that nobody picked up on the Project Crescent limitation where you can only source data from a BISM model. That was more disappointing for me. However, let’s be honest:

  • The finished product hasn’t launched yet.
  • It’s an end-user tool, so once again simplicity is the focus.

As always, send me your thoughts.

Categories: Analysis Services

Percentage Aggregations in Analysis Services

January 23, 2009

Aggregations, as we all know, are what give cubes an advantage at query time. But while summing the lower levels of data to produce an aggregate is fine for a lot of measures, some measures are semi-additive. Now, obviously there are options in Analysis Services for semi-additive measures, such as Min, Max and DistinctCount. But what about percentage aggregations (or average values)?

Click on the Cube Structure tab, select a measure, and you can set this behaviour in the properties. The thing is, if you have percentage data in your cube, you want the aggregate at the next level up to be an average of the child level. There is an option in the properties called AverageOfChildren, but be warned: this doesn’t always work the way you think it would! If you’re expecting it to return the equivalent of a simple AVG() with a GROUP BY as in SQL, it doesn’t. It seems to average ONLY across the time dimension!

So what to do? Well, we need to implement a calculated member that does a simple sum/count. It will apply this at all levels and give us the behaviour that we require. Let’s use good old AdventureWorksDW. I didn’t find any measures with percentage values, so I decided to create a fact table of my own, using the following query:

-- Aggregate reseller sales to employee/quarter level and join each
-- group to its matching sales quota, so that we can work out the
-- percentage of target achieved
SELECT
    SUM(FactRS.SalesAmount) AS SalesAmount,
    SUM(FactRS.TaxAmt) AS TaxAmt,
    SUM(FactRS.OrderQuantity) AS OrderQty,
    MIN(FactRS.SalesTerritoryKey) AS SalesTerritoryKey,
    MIN(DimTime.TimeKey) AS TimeKey,
    FactRS.EmployeeKey,
    DimTime.CalendarQuarter,
    DimTime.CalendarYear,
    FactSQ.SalesAmountQuota,
    SUM(FactRS.SalesAmount) / FactSQ.SalesAmountQuota * 100 AS PercTargetAchieved
FROM dbo.FactResellerSales AS FactRS
INNER JOIN dbo.DimTime
    ON FactRS.OrderDateKey = DimTime.TimeKey
INNER JOIN dbo.FactSalesQuota AS FactSQ
    ON FactSQ.CalendarYear = DimTime.CalendarYear
    AND FactSQ.CalendarQuarter = DimTime.CalendarQuarter
    AND FactSQ.EmployeeKey = FactRS.EmployeeKey
GROUP BY FactRS.EmployeeKey,
    DimTime.CalendarQuarter,
    DimTime.CalendarYear,
    FactSQ.SalesAmountQuota

I’ve created a simple cube linking this fact table to the SalesTerritory, Employee and Time Dimensions.

Now, one of the nicer things about Analysis Services is that it always tries to do things for you: it will normally create a count measure named after your fact table plus “Count”, in this case Fact Sales Count. If it doesn’t create it, you can create it manually by including a field that always holds the value 1 and setting that field’s aggregation to Sum.

Now we need to add the calculated member. Click on the “Calculations” tab, then on the little calculator icon to start. Name your calculated member, select Measures as the parent hierarchy, and enter the calculation in the Expression box. This would be:

Sum([Measures].[PercentageMeasure])/Sum([Measures].[FactCount])

In other words, the sum of the percentage field divided by the sum of the count field. This should do it; process and browse the cube and verify the results.
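If you prefer typing the script directly on the Calculations tab (Script View), the equivalent statement would look something like the sketch below. The measure names are assumptions based on how the wizard named things in my cube, so adjust them to match yours:

// Ratio of two Sum-aggregated measures: because both the numerator and
// the denominator aggregate correctly at every level, the ratio gives a
// proper average at every level too
CREATE MEMBER CURRENTCUBE.[Measures].[Avg Perc Target Achieved]
 AS [Measures].[Perc Target Achieved] / [Measures].[Fact Sales Count],
FORMAT_STRING = '#,0.00',
VISIBLE = 1;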


So there you go, a bit of work, but it seems to do the trick. Now if only we could get a built-in function to do this for us…

Update 24/01/2009: As pointed out to me by a colleague, doing a sum/count of a percentage value won’t always give the correct answer, since it is an unweighted average. What you have to do in that case is bring in the base data and write a calculated member that returns a percentage value at all levels by performing the calculation on the base measures.
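For this example, that would mean aggregating the sales and quota amounts first and only taking the ratio at the end. A sketch, again with assumed measure names:

// Ratio of sums rather than sum of ratios: each level gets the
// correctly weighted percentage, and a zero quota returns NULL
// instead of a division error
CREATE MEMBER CURRENTCUBE.[Measures].[Perc Target Achieved All Levels]
 AS IIF([Measures].[Sales Amount Quota] = 0, NULL,
     [Measures].[Sales Amount] / [Measures].[Sales Amount Quota] * 100),
FORMAT_STRING = '#,0.00',
VISIBLE = 1;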

Categories: Analysis Services