
Scale-Out Querying with Analysis Services

Today, I re-read this best practices article on MSDN – Scale-Out Querying with Analysis Services

The architecture they talk about makes sense: multiple web servers on the front end, a “query” layer, and a processing layer. One thing I don’t get, though, is how a setup like that can only support 30-60 concurrent users. I would like to see the numbers behind the average concurrency based on real usage: how many users of the reports/applications actually generate concurrent queries? Are connections reused? Etc.

It seems there could also be a better way than robocopying files around and starting and stopping services, but maybe not. It looks like they have an SSIS package that handles most of it, which is cool.

Sometimes I wonder, though, whether these setups are also biased toward maximizing the number of SQL Servers (and licenses) you need to get something like this going. It would be nice to see a study on this from a third party, for sure.


SSASUtils: SQL Server Analysis Services Utilities – CodePlex

So, today I decided to try out Microsoft’s site, CodePlex. I recently created a utility app to process dimensions and partitions in SSAS (SQL Server Analysis Services) because I didn’t like the way it works in SSMS (SQL Server Management Studio). I never used the pre-SP2 version, but I gather you could process multiple objects at the same time: just hit process and move on to the next one. In SP2, the process dialog is modal, so you can’t get around it; you have to wait for the object to finish processing – yuck!

Also, I just wanted to try out CodePlex and see how it works, what I like and don’t like. Since becoming a contractor doing Data Warehousing and BI stuff, I miss coding in .NET (and any other language), so I want to keep that up in my free time. This is a result of that. Hopefully once I get used to CodePlex I will contribute to some other Open Source .NET Projects, sounds like fun to me!

Here is the link to the project: http://www.codeplex.com/SSASUtils

Some of the glue code for connecting and processing I gleaned from other examples and projects I have seen around the web – some in VB.NET, some that didn’t even work, etc. Hopefully more SSAS utilities can be added to this project and it can become much more useful than it is now (although I already use it pretty much every day).

CodePlex: I like it so far. There are a few things that could be done better, but I am using the Team Explorer source control, which fits nicely into VS2005 – really cool. I will probably post more on CodePlex later. If I get the urge, I might create projects for some other stuff I have lingering around my hard drive (FFMC, PocketBlogger, other random utils). I am anxious to see what the community participation is like; I tried GotDotNet back in the day and it was weak, so we will see with CodePlex.


SSAS: Create a Calculated Measure That Always Gets the Value for "Yesterday"

If you want to create a calculated measure in SSAS (SQL Server Analysis Services) that will always get the value for yesterday, this is what I have come up with:

 

CREATE MEMBER CURRENTCUBE.[MEASURES].[My Yesterday Measure]
 AS (StrToMember('[Time].[Calendar Date].&[' + VBA!Format(VBA!DateAdd("d", -1, VBA!Date()), 'yyyy-MM-dd') + 'T00:00:00]'), [Measures].[My Cool Value]),
FORMAT_STRING = "$#,##0.000",
VISIBLE = 1;

What is really cool is that you can test this before you even create the measure, using an MDX query:

 

WITH
   MEMBER Measures.[Yesterday Measure Test1] 
   AS ([Time].[Calendar Date].&[2007-07-09T00:00:00],[Measures].[My Cool Value])
   MEMBER Measures.[Yesterday Measure Test2] 
    AS (StrToMember('[Time].[Calendar Date].&[' + VBA!Format(VBA!DateAdd("d", -1, VBA!Date()), 'yyyy-MM-dd') + 'T00:00:00]'), [Measures].[My Cool Value]), FORMAT_STRING = "$#,##0.000"
SELECT {
Measures.[Yesterday Measure Test1]
,Measures.[Yesterday Measure Test2]
} ON COLUMNS
FROM [My Cube]

and now you can take that calculated measure and tweak it for any number of days back (30, etc.).
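For example, here is a minimal sketch of a 30-days-back variant; it just swaps -1 for -30 in the DateAdd call (the measure names come from the example above):

CREATE MEMBER CURRENTCUBE.[MEASURES].[My 30 Days Ago Measure]
 AS (StrToMember('[Time].[Calendar Date].&[' + VBA!Format(VBA!DateAdd("d", -30, VBA!Date()), 'yyyy-MM-dd') + 'T00:00:00]'), [Measures].[My Cool Value]),
FORMAT_STRING = "$#,##0.000",
VISIBLE = 1;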

Cool! 🙂


SSAS 2005 – Named Sets – TopCount – Static/Dynamic and Ordering

Named sets in SSAS 2005 (SQL Server Analysis Services 2005) are a good thing. You can set up named sets for things like “Last 3 months” or “Yesterday”, which is really nice for end users. The nature of dates means you can make the MDX dynamic even though the named set itself is still “static”.
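As a rough illustration, a "Yesterday" set could reuse the StrToMember trick from the calculated measure post above (the dimension and key format are assumed to match that example); keep in mind a regular named set like this is still only evaluated when the MDX script runs:

CREATE SET CURRENTCUBE.[Yesterday]
 AS { StrToMember('[Time].[Calendar Date].&[' + VBA!Format(VBA!DateAdd("d", -1, VBA!Date()), 'yyyy-MM-dd') + 'T00:00:00]') };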

You can also create sets when running MDX queries and use them later in the same query; these are dynamic. The problem is, when you are working on your cube, you cannot define dynamic named sets in the “Calculations” tab – the named sets there are static. If you try to define a named set that, for instance, gets the “Top X Salespeople by Revenue”, you can create it, but when you deploy your cube the set expression gets evaluated for only one slice of the cube, usually just the default (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=817566&SiteID=1   and http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1304964&SiteID=1).
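For reference, a query-scoped version of the set used later in this post would look something like this (a sketch, using the same hypothetical cube and hierarchy names):

WITH SET [Top 3 Salespeople by Revenue] AS
    TOPCOUNT([Employees].[Salespeople].MEMBERS, 3, [Measures].[Revenue])
SELECT
    {[Measures].[Revenue]} ON COLUMNS,
    [Top 3 Salespeople by Revenue] ON ROWS
FROM
    [MyCube]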

I did read on a site that in SSAS 2008 you will be able to create dynamic named sets in the cube editor, so that is good.

One other thing with named sets: if you create a named set using TopCount and bring it into an Excel pivot table, you will see the Top X values you are looking for (sliced incorrectly, as I stated above, but still there). The thing is, they are sorted by the set expression, not the numeric expression, and I haven’t found a way to get them to sort by the numeric expression. Take the example from above: you have the Top 3 Salespeople by Revenue and you want to see them sorted by revenue. If you run an MDX query using TopCount

SELECT
{[Measures].[Revenue]} ON COLUMNS,
{TOPCOUNT([Employees].[Salespeople].MEMBERS,3,
    [Measures].[Revenue])} ON ROWS
FROM
    [MyCube]

you will see it ordered correctly in the query results:

John Smith           $1000
Amanda Jones      $950
Homer Simpson     $500

etc

Now, if you take that TopCount expression out of the query –

TOPCOUNT([Employees].[Salespeople].MEMBERS,3,
    [Measures].[Revenue])

and create a named set with it in the MDX script of the cube editor

CREATE SET CURRENTCUBE.[Top 3 Salespeople by Revenue]
 AS TOPCOUNT([Employees].[Salespeople].MEMBERS,3,[Measures].[Revenue]) ;        

or even like this, as the forum threads above suggest:

CREATE SET CURRENTCUBE.[Top 3 Salespeople by Revenue]
 AS TOPCOUNT(Existing [Employees].[Salespeople].MEMBERS,3,[Measures].[Revenue]) ;  

When you pull that named set into your pivot table in Excel, the members won’t be ordered by Revenue but by name:

Amanda Jones      $950
Homer Simpson     $500
John Smith           $1000

Funky stuff. My recommendation: if you are going to need dynamic named sets like the ones created above, just create an SSRS (SQL Server Reporting Services) report instead, or wait until SSAS 2008 comes out.

Note:

Now, the above is just what I have found. There might be some crazy way in MDX (or somewhere else) that I am unaware of to get this to work, but I can’t find anything that lets me do what I want.
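One thing that might be worth experimenting with, though I haven't verified whether Excel honors it, is wrapping the TopCount in an explicit Order() call so the set itself is sorted by the measure:

CREATE SET CURRENTCUBE.[Top 3 Salespeople by Revenue]
 AS Order(
        TOPCOUNT([Employees].[Salespeople].MEMBERS, 3, [Measures].[Revenue]),
        [Measures].[Revenue],
        BDESC   // sort by the measure, descending, breaking the natural hierarchy order
    );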


SSAS: AverageOfChildren and Semi-additive measure requires a time dimension

In SSAS (SQL Server Analysis Services), if you have a measure group with some measures that have an AggregateFunction of Sum and some of AverageOfChildren, you will see a red squiggly line on the measure group, and if you highlight it (or build the cube), you will get this error:

Semi-additive measure requires a time dimension.

The problem starts with the AggregateFunction property of your measures: some are additive, some are semi-additive. You can see which are which by looking at MSDN here. What I found will get rid of the error (but still might not help you) is this: go to the properties of your Time dimension and check its Type. By default, dimensions are created with type “Regular”. If you change your Time dimension to type “Time” and rebuild, you won’t get any errors, and you can deploy.

Now, this might be all good, but in the case I ran into it still didn’t work for me. AverageOfChildren only averages over the Time dimension – not what you would expect from looking at it. Say you have a measure that you want averaged (AVG) instead of summed; you would think that picking AverageOfChildren as the AggregateFunction property would do the trick. Well, it doesn’t. If you do this, deploy, and browse your cube, you will still see that the values are summed. WTF, right? Well, like it sparsely says in the docs, it only applies to the Time dimension.

In order to get the average of a measure, here is the best way I have found to go about it:

1) Create a measure from the field with AggregateFunction Sum
2) Create a measure from the field with AggregateFunction Count
3) Create a calculated member (measure) for the field that divides the Sum from step 1 by the Count from step 2 (see the sketch below)
3a) You can guard against zeros in the count by using IIF()
3b) For best performance, check your NON_EMPTY_BEHAVIOR; if it applies to you, it should speed up your results tremendously. Usually you can just list the measures you are using in your calculation and it should be fine, from what I have seen (I am no expert, though)
4) Make sure to hide your Sum and Count measures if you don’t want your clients to use them, by setting their Visible property to false
5) Build, Deploy, Process, Report
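Here is a minimal sketch of step 3 (plus 3a and 3b) as it might look in the cube's MDX script; the measure names ([Sale Amount Sum], [Sale Amount Count], [Avg Sale Amount]) are hypothetical stand-ins for the Sum and Count measures from steps 1 and 2:

CREATE MEMBER CURRENTCUBE.[Measures].[Avg Sale Amount]
 AS IIF([Measures].[Sale Amount Count] = 0,
        NULL,   // avoid divide-by-zero when there are no rows
        [Measures].[Sale Amount Sum] / [Measures].[Sale Amount Count]),
FORMAT_STRING = "#,##0.00",
NON_EMPTY_BEHAVIOR = { [Measures].[Sale Amount Sum], [Measures].[Sale Amount Count] },
VISIBLE = 1;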

 


SSAS: Deploying Cube OLE DB Error – Login Failed

If you are deploying/building a cube in SQL Server Analysis Services (SSAS) and you get tons of errors, and littered throughout them you see

OLE DB error: OLE DB or ODBC error: Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'.; 28000.

You should take a look at your data sources and, on the impersonation tab, play around with the different options. Usually it is set to “Use the service account” or “default”. I have tried “Use the credentials of the current user” but got errors back when deploying, saying that the impersonation method was invalid for processing. I have found that “Use a specific user name and password” works. As I sit and wait for the cube I just finally got to deploy to finish, I figured I would write this up in a post. Your mileage may vary.


Executing MDX – Errors

So, a while back I fired up Visual Studio 2005 and was making some Reporting Services reports. I needed to do some custom MDX and was getting errors trying to execute it. WTF, right?

Query preparation failed. -> Error in the application. (msmgdsrv)

Nice error, as always. So I did some digging and actually found a hack around it, since running the update that was supposed to fix it didn’t seem to want to work.

Brian Welcher has a blog post describing the hack to work around it:
“To manually correct this, ensure the versions of msmdlocal.dll and msmgdsrv.dll in the following directories match the versions in the %Program Files%\Common Files\System\Ole DB directory.

  • %Program Files%\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies
  • %Program Files%\Microsoft SQL Server\90\Tools\Binn\VSShell\Common7\IDE
  • %Program Files%\Microsoft SQL Server\MSSQL.*\Reporting Services\ReportServer\bin”

I did that and it worked, which is great. But now I sometimes see some odd behavior with SQL and Visual Studio, and I always wonder if something is jacked because of changing those DLLs. Well, hopefully I don’t see the issue the next time I reload my machine.

After installing the Office 2007 Beta, SQL 2005, then the Excel add-ons, and God knows what else, it’s just amazing anything works on my machine. Fun stuff!!


SSRS, SSAS – Auto Pick Yesterday

If you have ever written any Reporting Services reports off a cube (Analysis Services) and want a drop-down parameter that automatically picks yesterday, this is for you. There is probably a more elegant way to do this, but it works.

="[Time].[Calendar Date].&[" & Year(DateAdd("d",-1,Now())) & "-" & IIf(Len(CStr(Month(DateAdd("d",-1,Now())))) < 2, "0" & Month(DateAdd("d",-1,Now())), Month(DateAdd("d",-1,Now()))) & "-" & IIf(Len(CStr(Day(DateAdd("d",-1,Now())))) < 2, "0" & Day(DateAdd("d",-1,Now())), Day(DateAdd("d",-1,Now()))) & "T00:00:00]"

Remember, single-digit months and days need a leading zero in front of them in the member key.