ADOMD.NET: Could not load file or assembly Microsoft.AnalysisServices.AdomdClient

If you are developing a .NET/ASP.NET solution using ADOMD.NET locally and everything works great, but when you release to a production server you end up with this error:

Could not load file or assembly 'Microsoft.AnalysisServices.AdomdClient, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.

You need to install the “Feature Pack for Microsoft SQL Server 2005” which you can find here: http://www.microsoft.com/downloads/details.aspx?FamilyID=50b97994-8453-4998-8226-fa42ec403d17&DisplayLang=en

Microsoft BI: Creating Local OLAP Cubes using XMLA and ascmd.exe

Most people, when using OLAP cubes, are hitting the live version that is located on SQL Server Analysis Services (SSAS). They hit it with Excel, or some other reporting tool, etc. I have blogged previously about offline cubes and how you could possibly use them.

In that earlier post I talked about creating them with Excel, and that works great for end users. But what about automating the process? What about filtering dimension members, measure groups, etc.? Now that you can use the Panorama gadget for Google Apps/iGoogle (http://google-pivot-tables.blogspot.com/2008/04/panorama-analytics-gadget-for-google.html) to upload .cub files and do BI in the cloud, how cool is that!

Well, one option is to purchase CubeSlice – http://www.localcubetask.com/ – and use that to create your .cub files. CubeSlice works great, and is a good option if you want something with a ton of options and ease of use.

You can also create .cub files using the CREATE GLOBAL CUBE syntax in MDX, and you can also use XMLA to create a .cub. Options galore! Chris Webb blogged a few years ago about using XMLA to create .cub files here – http://cwebbbi.spaces.live.com/blog/cns!7B84B0F2C239489A!877.entry – using the MDX Sample Application (I believe that comes with SQL 2000?).
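
If you just want a taste of the CREATE GLOBAL CUBE route, here is a minimal sketch of what the MDX looks like (you run it while connected to the source SSAS database; the cube, measure, and dimension names below are made up, and STORAGE points at the .cub file you want created):

CREATE GLOBAL CUBE SalesLocal
STORAGE 'C:\MyOfflineCube.cub'
FROM [Sales]
(
    MEASURE [Sales].[Sales Amount],
    DIMENSION [Sales].[Product],
    DIMENSION [Sales].[Date]
    (
        LEVEL [Month]
    )
)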

What I was looking for was an easy way to use CREATE GLOBAL CUBE or XMLA and automate it. CubeSlice is nice, but there has to be a way to replicate (at least in a basic way) what it is doing. XMLA seems more powerful to me, so that was the route I chose. I didn't want to install the MDX Sample App from SQL 2000, and as Chris Webb says in his blog (and other things I read), the problem with using XMLA to create your .cub in SQL Server Management Studio is that you can't specify a connection string – you have to connect to an SSAS instance. With the MDX Sample App you can specify a local file, then run the XMLA and it will create the .cub file. So I just needed to replicate that.

I have also blogged about using ascmd.exe here. ascmd comes with the SSAS samples for SQL 2005 (usually located here: C:\Program Files\Microsoft SQL Server\90\Samples\Analysis Services\Administrator\ascmd). You just need to compile it and you can use it to execute XMLA. So I decided to try that. I created an XMLA file to create my cube, and then executed it with ascmd.exe:

ascmd -S c:\MyOfflineCube.cub -i c:\MyOfflineCube.xmla

In seconds, you have an offline cube. Now, in your XMLA you can have filters, etc. You could create an XMLA file with some kind of variable in it, set up a replace task to swap that variable for whatever you want filtered (maybe based on location, or employee, etc.), and then kick out personal cubes for your users.

One thing I didn't really get into is how to get your XMLA script. You could figure it out yourself, but I actually just used CubeSlice for this example – you can create an offline cube with CubeSlice, see the XMLA it generates, and use that. Maybe creating the XMLA script manually would be good info for another blog post 🙂

SQL 2005, SSAS 2005: Using ascmd.exe To Create SQLAgent Jobs That Give You Completion Status

In SQL 2005, you can create SQL Agent jobs. They can be scheduled, have multiple steps, alert, notify, etc. Pretty great setup. There are some downfalls, though.

Like in order to call a job from a job, you need to start the second job with T-SQL. Thing is, it doesn't run synchronously. It runs asynchronously, which really stinks if you want to wait and see whether the second job completes successfully or not.
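
For example, this is really all a "call another job" step boils down to (the job name here is made up):

-- Queues the job and returns immediately; it does NOT wait for
-- 'Nightly Cube Process' to finish, so the calling step can report
-- success even if the second job later fails.
EXEC msdb.dbo.sp_start_job @job_name = N'Nightly Cube Process';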

Another thing: if you call an XMLA script from a SQL Agent job and the XMLA query or command fails, the SQL Agent job still reports success – that's no good! What can you do? Use ascmd.exe!

ascmd.exe is a utility you have to build yourself: download the SQL Server samples from CodePlex (http://www.codeplex.com/SqlServerSamples/Release/ProjectReleases.aspx?ReleaseId=4000) and build the ascmd solution to get ascmd.exe. A few notes: the samples are VS2005 projects, and I don't have that installed, so I had to open the solution with VS2008. The project is also set up to be digitally signed, and the build couldn't find the .snk file, so I turned signing off. After I had it built, I did some testing locally and made sure it worked the way I wanted it to.

You can test by just calling it from the cmd line, for example:

ascmd.exe -S SSASServerName -d SSASDatabaseName -i MyXMLA.xmla

You can use -Q to try to pass in the XMLA directly, but then you have to handle all the special characters and whatnot, which is a pain. Now, if you just put this on your C: drive (or wherever) and then create a SQL Agent job to call this command, the job will fail if ascmd.exe reports back a failure. Exactly what we want!!

You will notice in the SQL Agent job step setup that you can specify the process exit code of a successful command – 0 is the default, and that is what we want.
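
If you would rather script the job than click through the UI, here is a rough T-SQL sketch against msdb (the job/step names and paths are just placeholders for wherever you put ascmd.exe and your .xmla file):

USE msdb;
GO

-- Create the job (placeholder name)
EXEC dbo.sp_add_job @job_name = N'Run XMLA via ascmd';

-- CmdExec step: the step fails whenever the process exit code is anything
-- other than the configured success code (0 by default)
EXEC dbo.sp_add_jobstep
    @job_name  = N'Run XMLA via ascmd',
    @step_name = N'Execute XMLA',
    @subsystem = N'CmdExec',
    @command   = N'C:\ascmd\ascmd.exe -S SSASServerName -d SSASDatabaseName -i C:\ascmd\MyXMLA.xmla';

-- Attach the job to the local server so SQL Agent will actually run it
EXEC dbo.sp_add_jobserver @job_name = N'Run XMLA via ascmd';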

Now, get some XMLA that you know works, put it in the .xmla file and test it – the job should succeed. Then change something in the XMLA (like the CubeID) to some fake value and test again – the job should fail, and you can have it alert your DBAs or whoever.

Pretty sweet, but I wish SQL Agent would handle failed XMLA like a failed query and report failure. I am not sure if SQL 2008 does that or not, but it would make life a lot easier. Otherwise you could be scratching your head trying to find where stuff is failing, looking in logs, etc., but not seeing any failures. The only way you would be able to tell is to run the XMLA manually, ugh 🙂


Server Move: Hosting My Own Site(s)

Well, I have been on HostMySite for a few years now, and I just wasn’t liking it. I didn’t have control, and couldn’t do everything I wanted to. Last week I signed up with ServerBeach and I have my own dedicated server. I am running Windows 2003, IIS, SQL, MySQL, PHP, etc.

For now I changed my DNS from Active-Domain to GoDaddy, since GoDaddy supports SPF records (for Google Talk federation, etc.). I might change that up again in the future.

Transferring a domain takes way too long! It took like 4 days overall. Also, I first went with EasyCGI, a VPS solution, but the box couldn't get to the Internet! I put in tickets, etc., and they just wouldn't fix it, so I canceled and went to ServerBeach. I am liking them so far; they had the server provisioned in about 4 hours.

One thing I can do now as well is write a site for myself, import all the log files (firewall, SMTP, all the sites, FTP, etc.), and have a nice little reporting solution.

ServerBeach is good, very professional. If you want to sign up, use referrer code BW27Q37B6D  (http://www.serverbeach.com/)

To get PHP/MySQL/WordPress running, I loosely followed these three tutorials:

How to Install PHP on IIS 6.0
How to Install MySQL On IIS 6.0
How to Install WordPress on IIS 6.0

For my ASP.NET site, it was just a matter of copying over files and configuring IIS a little. For my blog, I used this db backup plugin (http://www.ilfilosofo.com/blog/wp-db-backup/), and then once I had my WordPress files over and MySQL/PHP running, I connected to the DB I created and ran the SQL backup script, which created everything the way I needed it for WordPress.

For mail, I use the built-in SMTP in Windows 2003. I just had to turn on relaying, but because of spammers I only allowed the local server itself to send mail, and it works. I tested it using this (http://support.microsoft.com/kb/323350) and it worked.

Other than that, it's pretty much basic server admin stuff. Firewall, FTP, etc. Everything is running smooth… for now 🙂

I will just have to make sure I have backups!


SQL Server 2005 Performance Dashboard Reports

Check these out. http://www.microsoft.com/downloads/details.aspx?familyid=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en

If you are a DBA, these are pretty much money. I have seen so many custom implementations and DBAs trying to build reports like this, and if you have SQL Server 2005 SP2, you can download this, run a SQL script, and then you have these reports. They give you insight into pretty much everything going on with your server. Why pay $1000s for a product that gives you the same info when it is built right into SQL Server!!

SQL 2005: Using Dynamic Management Views (DMVs) to View Status of Database and Log Restores

SQL 2005 introduced some cool "dynamic management views" (DMVs) that let a DBA see what is happening on their server. Previously you had to use undocumented system tables and information schemas to get the info, and a lot of it wasn't even available. Recently I was doing some backup/restore work on a database and wanted to know when it would finish. The first question I asked was: why isn't this built into the SSMS view of a DB? When it says "DatabaseName (Restoring)", why couldn't they just add the % done at the end or something? Anyway, if you want to see what is restoring on your server and where it is at, go ahead and use this query:

use master
go

SELECT
    percent_complete AS 'PctComplete',
    start_time AS 'StartTime',
    command AS 'Command',
    b.name AS 'DatabaseName',
    DATEADD(ms, estimated_completion_time, GETDATE()) AS 'EstimatedEndTime',
    (estimated_completion_time / 1000 / 60) AS 'EstimatedMinutesToEnd'
FROM sys.dm_exec_requests a
INNER JOIN sys.databases b ON a.database_id = b.database_id
WHERE command LIKE '%restore%'
AND estimated_completion_time > 0

I did notice that it says 100% complete but the restore still takes another 20-30 seconds to finish, so be warned there. Also, if you are restoring a full database, the database name shows up as master (from what I have seen, it was a fresh restore). If you are just restoring logs, it will show the name of the database where the logs are being restored.
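
As a side note, sys.dm_exec_requests reports progress for backups as well, so if you want one query that covers both, you can just widen the WHERE clause (a small tweak of the query above):

SELECT
    percent_complete AS 'PctComplete',
    command AS 'Command',
    b.name AS 'DatabaseName',
    DATEADD(ms, estimated_completion_time, GETDATE()) AS 'EstimatedEndTime'
FROM sys.dm_exec_requests a
INNER JOIN sys.databases b ON a.database_id = b.database_id
WHERE (command LIKE '%restore%' OR command LIKE '%backup%')
AND estimated_completion_time > 0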

Take this and create a quick SSRS (Reporting Services) report. Or even better, schedule it to run every 2 minutes or so, and once it sees some records, kick off a report, or make a cool little desktop app with balloon popups and statuses so as a DBA you can see what is going on with your restores. Maybe I will create that app one Saturday if I am bored or something 🙂

SQL 2005 SSAS Deployment Wizard – Where Did My Roles Go?

Ok.. so you are using SSAS 2005 and the Deployment Wizard (not BIDS) – probably because you want to retain your partitions or something. But anyways, if you use it and choose the option to retain your roles, you will deploy, and then no one will be able to get to your SSAS DB or cubes. What gives?

See this link on Microsoft. It doesn't make sense to me, but I guess in Microsoft speak "Retain roles and members" really equals "screw up roles and members and leave you scratching your head".