
SQL Server 2005 Performance Dashboard Reports

Check these out. http://www.microsoft.com/downloads/details.aspx?familyid=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en

If you are a DBA, these are pretty much money. I have seen so many custom implementations and DBAs trying to build reports like this, and if you have SQL Server 2005 SP2, you can download the package, run a SQL script, and then you have these reports. They give you insight into pretty much everything going on with your server. Why pay $1000's for a product that gives you the same info when it is built right into SQL Server!!
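For reference, installing it really is just running the setup script the download drops on disk, then opening the main report from SSMS. Something like this (the path below is what I believe the installer uses by default – adjust if you installed elsewhere; run it in SSMS with SQLCMD Mode turned on):

-- Hypothetical default install path from the Performance Dashboard download; adjust as needed.
-- In SSMS: Query > SQLCMD Mode, then execute against the instance you want to monitor.
:r "C:\Program Files\Microsoft SQL Server\90\Tools\PerformanceDashboard\setup.sql"

Then right-click the server in SSMS, Reports > Custom Reports, and browse to the main dashboard .rdl in that same folder.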


SQL 2005 SSAS Deployment Wizard – Where Did My Roles Go?

Ok, say you are using SSAS 2005 and deploying with the Deployment Wizard instead of BIDS – probably because you want to retain your partitions or something. If you use it and choose the option to retain your roles, the deployment will go through, and then no one will be able to get to your SSAS database or cubes. What gives?

See this link on Microsoft. It doesn't make sense to me, but I guess in Microsoft speak "retain roles and members" really equals "screw up roles and members and leave you scratching your head".


SQL 2005: OpenRowset, Dynamic MDX and Variable Scope

So, the other day I had to create something in T-SQL that called an MDX query using OpenRowset – this is pretty easy to do; you can wrap a T-SQL query around the OpenRowset and pull out the values you need. That was fine while the MDX query was a static string.

The format of the query would be like this:

SELECT * FROM OpenRowset('MSOLAP', 'DATA SOURCE=MySSASServer; Initial Catalog=MySSASDB;', 'MyMDXQuery HERE') as a

Now, the * will give you the columns from your MDX query; in the example above, MyMDXQuery HERE would be replaced with your actual MDX query.
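One aside worth knowing before the gotcha (hedged – the exact column names depend on your query): the MSOLAP provider names each result column after the member's unique name, brackets and all, so if you want a specific column instead of *, you have to use a quoted identifier, something like:

SELECT CONVERT(DECIMAL(10, 3), a."[Measures].[MyMeasure]") AS MyMeasureValue
FROM OpenRowset('MSOLAP',
     'DATA SOURCE=MySSASServer; Initial Catalog=MySSASDB;',
     'SELECT {[Measures].[MyMeasure]} ON COLUMNS FROM [MyCube]') as a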

The problem comes in if you want your MDX query (which is a string) to contain some variable, so that you can pass something into the OpenRowset (say a date, or some other value)…

The problem is, you need to execute the whole query (not just the OpenRowset) as a string, and the scope of variables is lost. You cannot declare a variable outside the T-SQL string you want to EXEC, set it inside the statement, and then use it afterwards. This makes it tough to get data out of the OpenRowset execution. If you are just executing the T-SQL and getting a result set back for a report or something, it works without any of what follows; but if you need a scalar value back, or something to use in a query later in your proc, then you need a workaround. I tried different solutions, and this was the only one I could get to work. Like I said, declaring a variable beforehand and trying to use it in the EXEC won't work. A RETURN won't work either – it gives you an error saying it isn't allowed in that scope, or something similar. Here is an example of what does work – using a temp table.

DECLARE @TSQL varchar(max)

-- The temp table is created OUTSIDE the dynamic SQL so it survives the EXEC scope
CREATE TABLE #results
(
  mytempresult DECIMAL(10, 3)
)

-- @SomeDynamicString is whatever MDX fragment you are passing in (declared elsewhere in your proc)
SET @TSQL = '
DECLARE @myVar AS DECIMAL(10, 3)

-- MSOLAP names the column after the measure unique name, hence the quoted identifier
SELECT @myVar = a."[Measures].[MyMeasure]"
FROM OpenRowset(
''MSOLAP'',
''DATA SOURCE=MySSASServer; Initial Catalog=MySSASDB;'',
''WITH
   MEMBER Measures.[MyMeasure]
  AS (' + @SomeDynamicString + ')
SELECT
{[Measures].[MyMeasure]} ON COLUMNS
FROM [MyCube]
'')
as a

INSERT INTO #results VALUES (@myVar)
'

EXEC ( @TSQL )

DECLARE @myVarForReal AS DECIMAL(10, 3)
SELECT  @myVarForReal = mytempresult
FROM    #results

DROP TABLE #results

As you can see, I CREATE the temp table outside the T-SQL string, then declare a variable inside the T-SQL statement, set it from my OpenRowset call (into which I pass some other variable, @SomeDynamicString), and then insert that value into my temp table.

I then EXEC that T-SQL statement, grab my variable for real from the temp table, and drop the temp table. You would think you could just reference @myVar after the EXEC, but it doesn't exist there, and if I declare it outside the T-SQL string, it will be empty afterwards – it never gets set when I EXEC the T-SQL.
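As an aside, sp_executesql with an OUTPUT parameter is the other standard way to get a scalar out of dynamic SQL, and it might save you the temp table. I haven't verified it against OpenRowset myself, so treat this as a sketch using the same made-up names as above:

DECLARE @TSQL2 nvarchar(max), @myVar2 DECIMAL(10, 3)

SET @TSQL2 = N'
SELECT @myVarOut = a."[Measures].[MyMeasure]"
FROM OpenRowset(
''MSOLAP'',
''DATA SOURCE=MySSASServer; Initial Catalog=MySSASDB;'',
''SELECT {[Measures].[MyMeasure]} ON COLUMNS FROM [MyCube]'')
as a'

-- OUTPUT parameters cross the EXEC boundary, unlike plain variables
EXEC sp_executesql @TSQL2, N'@myVarOut DECIMAL(10, 3) OUTPUT', @myVarOut = @myVar2 OUTPUT

SELECT @myVar2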

Just a "gotcha" if you ever run into executing dynamic MDX from T-SQL and need to get a scalar value back from the MDX… whew 🙂


SSRS and SSAS Formatting Dates

Ok, add this one to the annoying bin. If you are writing SSRS (SQL Server Reporting Services) reports off an SSAS (SQL Server Analysis Services) cube, and there is a time dimension, you need to format the date parameters in the SSRS report to match the SSAS date format, as a string.

I started out doing this a while ago, like so:

="[Time].[Calendar Date].&[" & Year(DateAdd("D",-1,DateTime.Now())) & "-" & IIf(Len(Month(DateAdd("D",-1,DateTime.Now()))) < 2, Month(DateAdd("D",-1,DateTime.Now())), "0" & Month(DateAdd("D",-1,DateTime.Now()))) & "-" & IIf(Len(Day(DateAdd("D",-1,DateTime.Now())).ToString()) < 2, "0" & Day(DateAdd("D",-1,DateTime.Now())).ToString(), Day(DateAdd("D",-1,DateTime.Now())).ToString()) & "T00:00:00]"

Wow. Ugly, hard to read, hard to debug, and it kept running into issues (notice the month padding logic is even backwards compared to the day logic).

I decided to finally just write a function to do the same thing and make sure it is correct once and for all.

Function GetSSASDate(ByVal DateToRun As DateTime) As String
    Dim result As String = "[Time].[Calendar Date].&["

    result = result & Year(DateToRun).ToString() & "-"

    ' Pad single-digit months with a leading zero
    If Month(DateToRun).ToString().Length < 2 Then
        result = result & "0"
    End If

    result = result & Month(DateToRun).ToString() & "-"

    ' Pad single-digit days with a leading zero
    If Day(DateToRun).ToString().Length < 2 Then
        result = result & "0"
    End If

    result = result & Day(DateToRun).ToString() & "T00:00:00]"
    Return result
End Function

Then I go to my report properties, Custom Code, and add that function; in my parameter I call it like so:

=Code.GetSSASDate(DateAdd(DateInterval.Day,-1,DateTime.Now()))

Much cleaner, easier, and just all-around good. If I wanted to take this a step further, I would make an assembly with some useful functions and deploy it to my SSRS server, but I will save that for a later time.
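(Side note, from memory and untested: VB's Format function does the zero padding on its own, so the whole function could probably collapse to a one-liner like the one below – I just prefer having it in Custom Code where I can fix it in one place.)

="[Time].[Calendar Date].&[" & Format(DateAdd(DateInterval.Day, -1, DateTime.Now()), "yyyy-MM-dd") & "T00:00:00]"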


SSAS Cubes – Speeding up Queries and Performance

So, you have an SSAS 2005 (SQL Server Analysis Services) cube, and it's not performing how you want it to. What do you do?

The first thing to look at is your attribute relationships (http://technet.microsoft.com/en-us/library/ms174557.aspx). Make sure you don't have redundant relationships (AS should warn you as well). By default, when you create a cube and go through the wizard, only the default attribute relationships are set up, meaning every attribute is related directly to the key attribute. If you have related attributes, you want to set up those relationships!! This helps in a couple of ways. First, when creating hierarchies, it won't warn you that you don't have relationships set up. Second, your aggregations will actually have a better design: Analysis Services can take the relationships into account, and your aggregations should hopefully rock!

What’s next? Look at your reports and queries off the cube. Can you redesign them? Are you running some crazy query? Here is a good tip for using sp_create_plan_guide to influence query plans.

My suggestion is to run SQL Profiler and see how your cube is being accessed. Are the aggregations being used? Are things being read from cache? Use BIDS Helper to take a look at your aggregations, see the row counts, and make sure everything is working as you planned.

Ok, what else? Cache warming. (http://cwebbbi.spaces.live.com/blog/cns!7B84B0F2C239489A!994.entry) After running Profiler, or even setting up something automated to capture the queries being run into another table, you can use this info to "warm your cache": basically, after you process your cube, automatically run the most frequently used queries, so that the next time users run them, the results come straight from cache.
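A cheap way to wire that up, riffing on the OpenRowset trick from my earlier post (the table, server, and catalog names here are all made up): dump the captured MDX into a table and have a SQL Agent step replay it after processing finishes.

-- Hypothetical: dbo.WarmupQueries holds MDX text captured from Profiler
DECLARE @mdx nvarchar(max), @tsql nvarchar(max)

DECLARE warmup CURSOR FOR
    SELECT QueryText FROM dbo.WarmupQueries

OPEN warmup
FETCH NEXT FROM warmup INTO @mdx
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Run each query purely to populate the SSAS cache; results are discarded
    SET @tsql = N'SELECT * FROM OpenRowset(''MSOLAP'',
        ''DATA SOURCE=MySSASServer; Initial Catalog=MySSASDB;'',
        ''' + REPLACE(@mdx, '''', '''''') + ''') as a'
    EXEC (@tsql)
    FETCH NEXT FROM warmup INTO @mdx
END

CLOSE warmup
DEALLOCATE warmup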

Balancing your cache, aggregations, and storage modes is something you seriously want to dig into. Should you be purely MOLAP? MOLAP with proactive caching? Purely ROLAP, or one of the designs in between? You want to minimize data latency.

Some of these things often get overlooked when first designing a cube, and they can be tough to change once things are deployed to end users. Also, SSAS 2005 isn't the best at letting you tweak some of this stuff. I watched a presentation on SSAS 2008 by Donald Farmer, and things look A LOT better in SSAS 2008, which I cannot wait for!

But that is a post for another day 🙂


Scale-Out Querying with Analysis Services

Today, I re-read this best practices article on MSDN – Scale-Out Querying with Analysis Services

The architecture they talk about makes sense: multiple web servers on the front end, a "query" layer, and a processing layer. One thing I don't get, though, is how a setup like that can only support 30-60 concurrent users. I would like to see the numbers on average concurrent users based on real usage – how many users on the reports/applications actually generate concurrent queries? Are connections reused? Etc.

It seems there could also be a better way than robocopying files around and starting and stopping services, but maybe not. It looks like they have an SSIS package that kind of handles it all, which is cool.

Sometimes I wonder, though, if these setups are biased in a way that maximizes the number of SQL Servers (and licenses) you need to get something like this going. It would be nice to see a study on something like this from a 3rd party, for sure.


Business Intelligence and Analysis Services in the "Real World"

A reader sent me an email this weekend:

I wonder if I could ask your advice as a BI / data warehousing specialist. I have been studying Analysis Services recently, having read a couple of books which step me through the process of building the cubes etc., but as I don't come from a DB background, one thing that is not clear to me is: how does one determine that they need to use BI / Analysis Services etc. in the real world? Like you, I am a .NET developer with a background of building thick client apps, and am familiar with creating layered architectures etc., building on frameworks like NHibernate to abstract out the DB stuff into my more familiar object world. My question is, how does one generally interface with this data warehousing / Analysis Services stuff in the real world? I am looking for answers from people who have used these technologies in anger, and not, like me, from canned textbook example scenarios. Thanks for your time, it would be appreciated.

And I wrote a response, but after reading it over I figured I would post it here, as it is pretty general.

Basically, what I have seen is this. You make your application (web or windows) that saves transactional data, or logs, or something like that. End users usually want reports off that data. At first, developers report directly off the OLTP database. It becomes slow and unusable once tons of data piles up, so developers tweak timeouts, things like that. The next step is custom summarizations into other tables, and then archiving off transactional data. Usually, because developers don't know about data warehousing/BI, everything is custom up to this point.

Eventually, someone realizes "hey, there is a better way to do this", which involves a structured ETL using stored procs, SSIS, or something similar. Also, instead of ad-hoc custom summarization tables, a well designed data warehouse (OLAP) database is key.

From there, you can write your reports off your OLAP database, which is OK because the data is summarized, scrubbed, etc. But you really give it an adrenaline boost when you create a cube off that data warehouse: it takes care of the summarization, the relationships, all of that. You put the reporting in the hands of your end users (Excel 2007, for example) – let them pivot and slice and dice the data. It is all ready and set for them to do it, with really nothing you have to do on your end except make sure the cube is processed regularly off the DW.

You are basically abstracting your OLTP data up two levels. In reality, you should be able to query your OLTP data for a time frame and get, say, revenue, and then query the OLAP database and the cube and get the same results. With your cube, you can still access data from it in your .NET apps using ADOMD.NET, which is cool as well, or you can write Reporting Services reports directly off the cube, which is a lot easier than writing custom .NET reports.

So, for interfacing with your data warehouse: the best options to get data in are usually SSIS packages or stored procedures, no .NET coding really. To get data out, you would probably use Reporting Services, or you can query it with SqlClient in .NET apps like you would a regular OLTP database.

For the cube, you can get data out using ADOMD.NET in your .NET apps, you can query it with MDX or XMLA in Management Studio, or you can write Reporting Services reports, but the best client is Excel 2007 – it is built to be tied to SSAS 2005, so it really works out well.
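To make the "query the cube from .NET" part concrete, here is a minimal ADOMD.NET sketch (server, catalog, cube, and measure names are placeholders; it assumes a reference to Microsoft.AnalysisServices.AdomdClient):

Imports Microsoft.AnalysisServices.AdomdClient

Module CubeQueryExample
    Sub Main()
        ' Placeholder connection info - point this at your own SSAS server/database
        Using conn As New AdomdConnection("Data Source=MySSASServer;Initial Catalog=MySSASDB")
            conn.Open()
            Dim mdx As String = "SELECT {[Measures].[My Cool Value]} ON COLUMNS FROM [MyCube]"
            Using cmd As New AdomdCommand(mdx, conn)
                ' ExecuteCellSet returns the whole cellset; this query has a single cell
                Dim cs As CellSet = cmd.ExecuteCellSet()
                Console.WriteLine(cs.Cells(0).FormattedValue)
            End Using
        End Using
    End Sub
End Module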

One thing about all these technologies (SSIS, SSRS, SSAS, AMO, etc.) is that there really isn't a lot online as far as examples and documentation, or it is scattered. It is getting better, though. Most of the stuff I do, I just have to figure out on my own through trial and error, but that is OK.

Maybe not the cover-all response, but it covers what I have seen in the past, from my own experience and others'.


SSASUtils: SQL Server Analysis Services Utilities – CodePlex

So, today I decided to try out Microsoft's CodePlex site. I recently created a utility app to process dimensions and partitions in SSAS (SQL Server Analysis Services) because I didn't like the way it works in SSMS (SQL Server Management Studio). I don't know for sure because I never used it, but I guess pre-SP2 you could process multiple objects at the same time – just hit process, move on to the next one. In SP2, the process dialog is modal, so you can't get around it; you have to wait for the object to finish processing – yuck!

Also, I just wanted to try out CodePlex and see how it works, what I like and don't like. Since becoming a contractor doing data warehousing and BI stuff, I miss coding in .NET (and every other language), so I want to keep that up in my free time. This project is a result of that. Hopefully, once I get used to CodePlex, I will contribute to some other open source .NET projects – sounds like fun to me!

Here is the link to the project: http://www.codeplex.com/SSASUtils

Some of the glue code for connecting and processing I gleaned from other examples and projects I have seen around the web – some in VB.NET, some that didn't even work, etc. Hopefully more SSAS utils can be added to this project and it can become way more useful than it is now (although I do use it pretty much every day as it stands).
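For the curious, the heart of that glue is AMO (Analysis Management Objects). A stripped-down sketch of processing a dimension looks roughly like this (server, database, and dimension names are placeholders; it assumes a reference to Microsoft.AnalysisServices):

Imports Microsoft.AnalysisServices

Module ProcessExample
    Sub Main()
        Dim server As New Server()
        server.Connect("MySSASServer") ' placeholder server name

        ' Look up the database and dimension by name
        Dim db As Database = server.Databases.FindByName("MySSASDB")
        Dim dimToProcess As Dimension = db.Dimensions.FindByName("MyDim")

        ' ProcessUpdate picks up dimension changes without invalidating partitions
        dimToProcess.Process(ProcessType.ProcessUpdate)

        server.Disconnect()
    End Sub
End Module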

CodePlex: I like it so far. There are a few things that could be done better, but I am using the Team Explorer source control, which fits nicely into VS2005 – really cool. I will probably post more on CodePlex later. If I get the urge, I might create projects for some other stuff I have lingering around my hard drive (FFMC, PocketBlogger, other random utils). I am anxious to see what the community participation is like; I tried GotDotNet back in the day and it was weak. We will see with CodePlex.


Reporting Services Scripter: Sync Reporting Services Instances and Objects

The other day, I was tasked with moving all objects from one SQL Server Reporting Services instance to another. I know you can move the database itself, but then you run into issues with encryption keys, etc. – I just wanted to move the objects. Pretty much everything in SSRS is extensible, so I knew I could write something to do it, but before I went out and coded it, I Googled to see if someone else already had something. Turns out there is an awesome utility: Reporting Services Scripter. Works like a charm!


SSAS: Create a Calculated Measure always getting the value for "Yesterday"

If you want to create a calculated measure in SSAS (SQL Server Analysis Services) that always gets the value for yesterday, this is what I have come up with:

CREATE MEMBER CURRENTCUBE.[MEASURES].[My Yesterday Measure]
 AS (StrToMember('[Time].[Calendar Date].&[' + VBA!Format(VBA!DateAdd("d", -1, VBA![Date]()), 'yyyy-MM-dd') + 'T00:00:00]'), [Measures].[My Cool Value]),
FORMAT_STRING = "$#,##0.000",
VISIBLE = 1;

What is really cool is that you can test this before you even create the measure, using an MDX query:

WITH
   MEMBER Measures.[Yesterday Measure Test1]
   AS ([Time].[Calendar Date].&[2007-07-09T00:00:00], [Measures].[My Cool Value])
   MEMBER Measures.[Yesterday Measure Test2]
    AS (StrToMember('[Time].[Calendar Date].&[' + VBA!Format(VBA!DateAdd("d", -1, VBA![Date]()), 'yyyy-MM-dd') + 'T00:00:00]'), [Measures].[My Cool Value]), FORMAT_STRING = "$#,##0.000"
SELECT {
Measures.[Yesterday Measure Test1]
,Measures.[Yesterday Measure Test2]
} ON COLUMNS
FROM [My Cube]

And now you can take that calculated measure and tweak it for any number of days back (30, etc.) – for example, see the sketch below.
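A 30-days-back flavor is the same member with just the offset changed (same placeholder names as above):

CREATE MEMBER CURRENTCUBE.[MEASURES].[My 30 Days Ago Measure]
 AS (StrToMember('[Time].[Calendar Date].&[' + VBA!Format(VBA!DateAdd("d", -30, VBA![Date]()), 'yyyy-MM-dd') + 'T00:00:00]'), [Measures].[My Cool Value]),
FORMAT_STRING = "$#,##0.000",
VISIBLE = 1;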

Cool! 🙂
