Categories
Business Intelligence

App Store Pivot Viewer – ZoomAppy

Good ideas are hard to stop from happening. After dorking around with Pivot Viewer, I was thinking of things I could “pivot”, and the iTunes App Store was one of them. There is no real API for it, though – some people out there have created unofficial APIs, and you could scrape the web, but nothing solid.

Looks like someone took that idea and ran with it, called ZoomAppy (http://zoomappy.com/). Right now it looks like it is just the App Store, but they have more in store.

I agree that PivotViewer is “Business Intelligence”, but it is a different way of thinking. In traditional BI, you look at metrics/measures like sales, inventory, etc. With PivotViewer, you are looking at “objects” and filtering them based on properties: teams, bikes, apps, cars, people. Also, PivotViewer doesn’t give you any type of aggregation besides counts, so it is limited in that regard.

What else would be a good candidate for Pivot Viewer?


Categories
Geeky/Programming Product Reviews

Tool of the Day: Sysinternals ProcMon

Funny how you might go forever without using a given tool, and then some days you end up using it twice. Sysinternals Procmon was that tool today.

It is the successor to the old Sysinternals utilities Filemon and Regmon.

What does it do? It monitors all processes and services and watches what they are doing on your system: file and registry activity, opens/closes, reads/writes, which user, result status, etc. You can filter and pause and find out pretty much anything going on in Windows.

So, early in the day, I ran into a website issue – a site not loading in IIS, no idea what was going on. Fire up Procmon and filter to the website directory on disk. Lo and behold, the site was trying to impersonate a user, and that user didn’t have permissions. The site still didn’t work after that was fixed, and if I had dug a bit more with Procmon, I would have found that the impersonated user also needed rights to the ASP.NET Temporary Files folder – but after seeing the impersonation and the site still not working, I guessed it didn’t have rights to the temp folder.

Scenarios like the one above are the kind of thing I have seen people waste a support call with Microsoft on.

Second thing today: trying to install an extension to SSRS. The installer wasn’t even seeing that SSRS was installed, yet it clearly was, and functioning correctly on the box. Somehow the installer must have been reading something, or looking somewhere, and not finding what it expected. Procmon to the rescue. Fire it up and watch msiexec.exe. Watching the registry reads, it finds the SSRS instance names, then looks in a registry area keyed by that instance name and tries to find more details. It was failing because there were no registry keys in the second location (for whatever reason) – but there was info in the first location, the same info it was looking for. I exported those keys, changed the registry path, and imported them. I re-ran setup, it found the instance this time, and I could install the extension.

Without Procmon I would have been flying blind, or just guessing randomly at what to do. It could have been hours on tech support with a vendor or, again, a support call with Microsoft.

Procmon saved the day. Check it out and try to use it where applicable in your day-to-day troubleshooting.

Categories
Business Intelligence Geeky/Programming Reviews

Microsoft Live Labs Pivot Viewer – Rich Internet Application

So, I previously blogged about using PivotViewer in your web applications, but you can also just consume Pivot collections using the “Pivot” tool from Microsoft Live Labs.

You can download it here.

What does this tool offer? Well, first, it has a library/homepage of collections you can browse.

You can do some slicing and dicing on a collection of Presidents, or athletes, or Sports Illustrated covers. This tool and technology really fascinates me. It is “Business Intelligence”, but in a different way – it is based on “objects” (images) instead of “metrics”. I like it.

What are some cool things I think this could be used for? A company directory? An online catalog? The beer selection at Eddie’s? The list goes on and on…

Additionally, there is now an add-in for SQL Server Reporting Services and SharePoint 2010, which you can download here.

Once I have an environment where I can test that setup, I will blog about it.

Categories
Business Intelligence SQLServerPedia Syndication

Where does PowerPivot Fit?

Now that SQL Server 2008 R2 and Excel 2010 are out, you can get PowerPivot (http://powerpivot.com/) and create your own in-memory cubes!

…. Or something like that.

I still haven’t figured out where PowerPivot fits in a business scenario. Why?

Well, first you have what are now being called the “old school” BI users, who use Excel to connect to an SSAS cube, create fancy pivots and reports, and maybe convert to formulas and build some nice reports/dashboards… analytics.

Then you have people who only consume canned/standardized reports, through SSRS mostly, or maybe Excel Web Services… but they don’t create, just consume.

You might even have power users, who take Report Builder and create those SSRS reports for other users. Awesome.

But then you have this tool, PowerPivot. What can you do with it? Hit databases (mostly… cubes and other sources as well), bring back data, relate it, and create pivot reports/graphs off of it.

But you had better be pretty dang advanced as a business user to use PowerPivot. I could count on one hand the users (of all those I have dealt with over the last 10 years) I would feel comfortable giving it to without ending up with more of a headache.

What do I think is still missing from the Microsoft BI toolset? Look at Business Objects and Web Intelligence: Universes. You create a Universe off of a data source and expose it to users, and they can create reports, ad-hoc analysis, whatever, off of it.

It is kind of like the ever-elusive “Report Model” in the Microsoft stack that no one ever uses, ever will use, or has any reason to use. But in BO, they make it useful.

I don’t see PowerPivot taking the place of a Report Model/Universe, so where does it fit? IT analysts making “pre-cubes” before you actually build cubes for users who just want to hit them with Excel and not care about anything else?

Or people who just want to create their own cubes in silos. Tell me how that lends itself to “one version of the truth”?

Either way, we will see how it evolves and hopefully find some good use for it. 🙂

Categories
Business Intelligence Geeky/Programming SQLServerPedia Syndication

SQL Server Agent – Query To Find Long Running Jobs

I use SQL Agent a ton. It is my go-to place to schedule jobs and tasks. The problem is, there are pieces “missing”: if a job hangs, you can’t have it automatically stop after a given amount of time; it doesn’t alert on changes to a job; etc.

I asked on Twitter what the DBAs think. @SQLRockstar thinks OpsMgr is the way to go, which probably would be great – if I had it. I need a quick and dirty solution.

What I did was query the sysjobs, sysjobactivity, and sysjobhistory tables to get some kind of “look” into the running jobs. That way, if a job that normally takes 45 minutes has been running for 5 hours, I should be able to catch it early on – trying to be more proactive instead of reactive.

SELECT
	j.job_id AS 'JobId',
	j.name AS 'JobName',
	ja.start_execution_date AS 'StartTime',
	ja.stop_execution_date AS 'StopTime',
	art.AvgRunTimeOnSucceed,
	DATEDIFF(ss, ja.start_execution_date, GETDATE()) AS 'CurrentRunTime',
	-- actual duration, only meaningful once the job has finished
	CASE WHEN ja.stop_execution_date IS NOT NULL THEN
			DATEDIFF(ss, ja.start_execution_date, ja.stop_execution_date) ELSE 0 END AS 'ActualRunTime',
	CASE
		WHEN ja.stop_execution_date IS NULL THEN 'JobRunning'
		-- flag completed runs that went more than 5% over (avg + stdev)
		WHEN DATEDIFF(ss, ja.start_execution_date, ja.stop_execution_date)
			> (art.AvgRunTimeOnSucceed + art.AvgRunTimeOnSucceed * .05) THEN 'LongRunning-History'
		ELSE 'NormalRunning-History'
	END AS 'JobRun',
	CASE
		WHEN ja.stop_execution_date IS NULL THEN
			-- still running: compare elapsed time so far against the threshold
			CASE WHEN DATEDIFF(ss, ja.start_execution_date, GETDATE())
						> (art.AvgRunTimeOnSucceed + art.AvgRunTimeOnSucceed * .05) THEN 'LongRunning-NOW'
			ELSE 'NormalRunning-NOW'
		END
		ELSE 'JobAlreadyDone'
	END AS 'JobRunning'
FROM msdb.dbo.sysjobactivity ja
	INNER JOIN msdb.dbo.sysjobs j ON ja.job_id = j.job_id
	INNER JOIN (
	-- average successful runtime per job, plus one standard deviation;
	-- run_duration is stored as an HHMMSS integer, so convert it to seconds
	SELECT job_id,
	AVG
	((run_duration/10000 * 3600) + ((run_duration%10000)/100*60) + (run_duration%10000)%100)
	+
	STDEV
	((run_duration/10000 * 3600) + ((run_duration%10000)/100*60) + (run_duration%10000)%100) AS 'AvgRunTimeOnSucceed'
	FROM msdb.dbo.sysjobhistory
	WHERE step_id = 0 AND run_status = 1	-- job outcome rows, successful runs only
	GROUP BY job_id) art
	ON j.job_id = art.job_id
WHERE
	-- currently running jobs, plus today's completed jobs that ran longer than a minute
	(ja.stop_execution_date IS NULL) OR
	(DATEDIFF(ss, ja.start_execution_date, ja.stop_execution_date) > 60
	AND
	CAST(LEFT(ja.start_execution_date, 11) AS DATETIME) = CAST(LEFT(GETDATE(), 11) AS DATETIME))
ORDER BY ja.start_execution_date DESC

The inner query looks at the history to get the average runtime, and I add the standard deviation to that to make it more realistic, in case you have a one-off that throws the average off. Of course, this all depends on you having more than 3 entries in your job history as well.

I then just compare the average to the current runtime (if the job is executing) and to the actual (if it is complete).

You could use this query in a report that you check throughout the day to catch long running jobs. You could also use it to return just the ones that are currently running and over the time threshold, and have it alert you, so you can catch them right away.

I tested this on a few servers running SQL 2005, but I am guessing it will work on SQL 2008 as well. Find and kill those long running jobs!!
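
If you want the alerting piece, here is a minimal sketch of the idea in C# – assuming you save the query above to a file named LongRunningJobs.sql next to the exe and swap in your own server name; how you deliver the alert (Database Mail, SMTP, event log) is up to you:

using System;
using System.Data.SqlClient;
using System.IO;

class LongRunningJobCheck
{
    static void Main()
    {
        // the long-running-jobs query from above, saved next to the exe (assumed file name)
        string sql = File.ReadAllText("LongRunningJobs.sql");

        using (var conn = new SqlConnection("Data Source=MyServer;Initial Catalog=msdb;Integrated Security=SSPI"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // only care about jobs running right now and over the threshold
                    if ((string)reader["JobRunning"] == "LongRunning-NOW")
                    {
                        // swap in your alert mechanism of choice here
                        Console.WriteLine("ALERT: {0} has been running for {1} seconds",
                            reader["JobName"], reader["CurrentRunTime"]);
                    }
                }
            }
        }
    }
}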

Categories
Business Intelligence SQLServerPedia Syndication

Office 2010: Excel 2010, What-If Analysis aka Microsoft Finally has Built in Cube/OLAP Writeback!

Digging into the blog post from earlier this summer, I wanted to see what was new and exciting in Excel 2010.

Recently I have been working on a cube, and we want to be able to budget right from the cube. There are also many other cubes/scenarios where the ability to write back to the cube would be awesome. Some BI tools have had this for many years! Microsoft had something similar with an Excel 2002/2003 add-in, but it was removed. There are also many 3rd-party tools that allow this, and you could write your own macros or VBA/.NET code to do it as well – but what was always missing was the ability to do writeback directly from an Excel (OLAP) PivotTable. With Excel 2010, this functionality finally shows up.

First, you need a cube, and you need to enable writeback on a partition. This automatically creates a table in the relational database that holds your data warehouse. Let’s say you have a table FactBudgets and you enable writeback: SSAS will create a table WriteTable_Budgets. This is a trivial example I went through to test this functionality, but I just wanted to exhibit the writeback feature.

[screenshot: 01_writeback]

Turn on writeback, deploy and process your SSAS cube, and then open Excel 2010. Connect to your cube, and then on the Options tab of the PivotTable ribbon menu there is a button to turn on “What-If Analysis”. Turn it on. 🙂
[screenshot: 02_whatif]

Once you turn on the setting, you can begin writing values back to your cube, right from Excel. If you click on a cell of your writeback measure, you can just change it. In the screenshot below, the little purple triangle on the value I changed tells me it differs from what is in the data source.

[screenshot: 03_valuechanged]
[screenshot: 04_valuechangedmenu]

You can see it is telling me the value changed, and I can discard the change, among other options. After you have your values set, you want to publish them, which you do back on the PivotTable ribbon menu:

[screenshot: 05_publishchanges]

As you can see, Microsoft has finally created a viable solution for writing values back into your OLAP cubes without the need for 3rd-party software or coding. Finally!

One thing to note: if you try to write back to a value that is in a partition that doesn’t have writeback enabled, you will get an error.

[screenshot: 06_error]

Now, think of the possibilities with SSAS OLAP writeback and Excel, now that we can actually use it out of the box!
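
As a side note, if you ever want to drive the same kind of writeback from code instead of Excel, the underlying mechanism is the MDX UPDATE CUBE statement. Here is a minimal sketch using ADOMD.NET – the connection string, cube, measure, and member names are all made up for illustration, and note that Excel batches changes and only persists them when you hit Publish, so check how your client handles the writeback transaction:

using Microsoft.AnalysisServices.AdomdClient;

class WritebackSketch
{
    static void Main()
    {
        // all object names here are hypothetical – point at your own cube
        using (var conn = new AdomdConnection("Data Source=localhost;Catalog=BudgetDemo"))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                // spread 50000 evenly across the leaf cells under this tuple
                cmd.CommandText =
                    "UPDATE CUBE [Budget] " +
                    "SET ([Measures].[Budget Amount], [Date].[Calendar Year].&[2010]) = 50000 " +
                    "USE_EQUAL_ALLOCATION";
                cmd.ExecuteNonQuery();
            }
        }
    }
}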

Categories
Business Intelligence Geeky/Programming SQLServerPedia Syndication

SSASMeta – C# App to Log Info About SSAS Objects

I manage some servers that have many cubes. OK, a lot of cubes (60+ on one). I needed some way to output a report of last processed time, last schema update, etc. Now, there are about 5 different ways to do this (one being the SSAS Stored Procedure Project), but this is what I came up with: I wrote a 100-line C# app that takes a server name, loops through the SSAS databases, cubes, measure groups, partitions, and dimensions, and logs info about them.

Here is a C# snippet of a function that just outputs to the console; the app I actually run logs the info to a SQL Server database, and then I write reports off of that.

// requires a reference to the AMO assembly, Microsoft.AnalysisServices
using Microsoft.AnalysisServices;

private static void LogSSASInfo(string serverName)
{
    var server = new Server();
    server.Connect(serverName);

    foreach (Database database in server.Databases)
    {
        // EstimatedSize is in bytes, so divide by 1024 for KB
        Console.WriteLine(database.Name + " " + database.LastUpdate + " " + database.EstimatedSize / 1024 + " " + database.CreatedTimestamp);

        foreach (Cube cube in database.Cubes)
        {
            Console.WriteLine("     Cube: " + cube.Name + " " + cube.LastProcessed + " " + cube.LastSchemaUpdate);

            foreach (MeasureGroup measureGroup in cube.MeasureGroups)
            {
                Console.WriteLine("         Measure Group: " + measureGroup.Name + " " + measureGroup.LastProcessed);

                foreach (Partition partition in measureGroup.Partitions)
                {
                    Console.WriteLine("             Partition: " + partition.Name + " " + partition.LastProcessed);
                }
            }
        }

        foreach (Dimension dimension in database.Dimensions)
        {
            Console.WriteLine(" Dimension: " + dimension.Name + " " + dimension.LastProcessed);
        }

        Console.WriteLine("");
        Console.WriteLine("------------------------------------------------");
        Console.WriteLine("");
    }

    server.Disconnect();
}

As you can see, it isn’t the most elegant code in the world, but it works. To get this working in your project, you need to reference the Microsoft.AnalysisServices assembly.

[screenshot: ssasmeta]

Use your imagination – you could wrap that function in an app and log info for all the SSAS instances on your network, as shown below. There have been a few times in the last year where I found a cube or measure group not updating correctly, and a report like the one I can get now will help deal with that challenge.
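
For example, a minimal entry point that wraps the function above – server names come from the command line here, but you could just as easily read them from a table:

// Hypothetical entry point: pass one or more SSAS instance names as arguments,
// e.g. SSASMeta.exe SERVER01 SERVER02\INST1
static void Main(string[] args)
{
    foreach (string serverName in args)
    {
        Console.WriteLine("================ " + serverName + " ================");
        LogSSASInfo(serverName);
    }
}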

Categories
Business Intelligence Geeky/Programming

ADOMD.NET: Could not load file or assembly Microsoft.AnalysisServices.AdomdClient

If you are developing a .NET/ASP.NET solution using ADOMD.NET locally and everything works great, but when you release to a production server you end up with this error:

Could not load file or assembly ‘Microsoft.AnalysisServices.AdomdClient, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91’ or one of its dependencies. The system cannot find the file specified.

You need to install the “Feature Pack for Microsoft SQL Server 2005”, which you can find here: http://www.microsoft.com/downloads/details.aspx?FamilyID=50b97994-8453-4998-8226-fa42ec403d17&DisplayLang=en

Categories
Business Intelligence Geeky/Programming SQLServerPedia Syndication

ETL Method – Fastest Way To Get Data from DB2 to Microsoft SQL Server

For a while, I have been working on figuring out a “better” way to get data from DB2 to Microsoft SQL Server. There are many different options, approaches, and environments; this one is mine, and your mileage may vary.

Usually, when pulling data from DB2 to any Windows box, the first thing you might think of is ODBC. You can use the Microsoft DB2 driver (which works, if you are lucky enough to get it configured and working), the IBM iSeries Client Access ODBC driver (which works well), or another 3rd-party ODBC driver. Using ODBC, you can access DB2 with a ton of different clients: Excel, WinSQL, any 3rd-party SQL tool, a MSSQL linked server, SSIS, etc. ODBC connects just fine and will work for “querying” needs. Also, with the drivers you might install, you can usually set up an OLE DB connection if your client supports it (SSIS, for example) and query the data using OLE DB – this works as well, but there are some caveats, which I will talk about.

In comes SSIS, the go-to ETL tool for MSFT BI developers. You want to get data from DB2 to your SQL Server data warehouse, or wherever. You try an OLE DB connection source, but it is clunky, weird, and sometimes doesn’t work at all (PrimeOutput errors, anyone?). If you do manage to get OLE DB configured and working, you are still probably missing out on some performance gains compared to the method I am going to describe.

Back to SSIS, using ODBC. It works. You have to create an ADO.NET ODBC connection and use a DataReader source instead of an OLE DB source. Everything works fine, except one thing: it is slow! Need proof?

http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/162e55e5-b64b-423e-94c1-dd764ca1f683

http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=96977

http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/cfade7e7-50d5-4447-9821-35c5d5ae1b66

http://www.sqlservercentral.com/Forums/Topic702042-148-1.aspx

http://www.sqlservercentral.com/Forums/Topic666993-148-1.aspx

OK, enough links. But if you do read them: SQL 2000 DTS is faster than SQL 2005/2008 SSIS. WTF? The best I can guess is that it is because of the .NET wrapper around ODBC – DTS uses “native” ODBC.

So, now what? Do we want to use DTS 2000? No. What to do, though?

Well, after a few days of research and just exploring around, I think I have found a good answer… replace DB2 with SQL Server… just kidding. Here is what you need to do:

Install the IBM Client Access tools. There is a tool called “Data Transfer From iSeries Server”; the actual exe is “C:\Program Files\IBM\Client Access\cwbtf.exe”.

[screenshot]

This little tool allows you to set up data transfers from your DB2 system to multiple output choices (Display, Printer, HTML, and Text). We want to export to a text file on our filesystem. You have to set up a few options, like the filename, etc. In “Data Options” you can set up a WHERE statement, aggregates, etc.

If you output to a file, you can go into “Details” and choose a file type, etc. I use ASCII Text, and then in the “ASCII file details” I uncheck all the checkboxes. You set up your options, hit the “Transfer data from iSeries” button, and it extracts the data to the file you chose in the filename field. Pretty sweet. But this is a GUI – how can I use this tool in an automated fashion? I am not going to run it manually. Well, you are in luck.

If you hit the “Save” button, it will save a .dtf file for you. If you open this .dtf file in a text editor, you will see all the options defined in text, in a faux INI style. Awesome, we are getting somewhere.

Now, how do you run this from a cmd prompt? Well, we are in luck again. Dig around in C:\Program Files\IBM\Client Access and you will find a little exe called “rxferpcb”.

[screenshot]

This tool allows you to pass in a “request” (aka a .dtf file) and a userid/password for your DB2 system, and it will execute the transfer for you. Sweet!

Now what do we do from here?

1) Create an SSIS package.

2) Create an Execute Process task that calls rxferpcb and passes in your arguments.

3) Create a BULK INSERT task, and load up the file that the Execute Process task created. (Note: you have to create a .FMT file for the fixed-width import. I created a .NET app that loads the .FDF file (the transfer description) and auto-creates a .FMT file for me, plus a SQL CREATE statement as well – saving time and tedious work.)

Now take 2 minutes and think about how you could make everything generic/expression/variable driven, and you have yourself a sweet little SSIS package to extract any table from DB2 to text and bulk load it.
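
If you would rather drive steps 2 and 3 from code instead of SSIS tasks, here is a minimal C# sketch of the idea. All paths, table names, and file names are made up, and I am assuming rxferpcb takes the request, userid, and password in that order (that is how it works on my install – check yours):

using System;
using System.Data.SqlClient;
using System.Diagnostics;

class Db2Extract
{
    static void Main()
    {
        // Step 2: execute the saved transfer request
        // (argument order assumed: request file, user id, password)
        var psi = new ProcessStartInfo
        {
            FileName = @"C:\Program Files\IBM\Client Access\rxferpcb.exe",
            Arguments = "\"C:\\Extracts\\MYTABLE.dtf\" MYUSER MYPASSWORD",
            UseShellExecute = false
        };
        using (var proc = Process.Start(psi))
        {
            proc.WaitForExit();
            if (proc.ExitCode != 0)
                throw new Exception("rxferpcb failed, exit code " + proc.ExitCode);
        }

        // Step 3: bulk load the fixed-width text file using the .FMT format file
        using (var conn = new SqlConnection("Data Source=MySqlServer;Initial Catalog=Staging;Integrated Security=SSPI"))
        using (var cmd = new SqlCommand(
            @"BULK INSERT dbo.MYTABLE
              FROM 'C:\Extracts\MYTABLE.txt'
              WITH (FORMATFILE = 'C:\Extracts\MYTABLE.fmt', TABLOCK)", conn))
        {
            conn.Open();
            cmd.CommandTimeout = 0;  // big loads take a while
            cmd.ExecuteNonQuery();
        }
    }
}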

[screenshot]

What is so great about the .dtf files is that you can modify them with a text editor, which means you can create/modify them programmatically. Think: setting WHERE statements for incremental loads, etc.
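
For example, here is a quick sketch of patching a .dtf before running it. The “Where” key name is hypothetical – open one of your saved .dtf files to see what your version of Client Access actually writes:

using System;
using System.IO;

static class DtfPatcher
{
    // Rewrite the WHERE clause line in a .dtf transfer request.
    // "Where" as the key name is an assumption – inspect a real .dtf
    // from your install to find the actual INI-style key to replace.
    public static void SetWhereClause(string dtfPath, string whereClause)
    {
        string[] lines = File.ReadAllLines(dtfPath);
        for (int i = 0; i < lines.Length; i++)
        {
            if (lines[i].StartsWith("Where", StringComparison.OrdinalIgnoreCase))
                lines[i] = "Where = " + whereClause;  // hypothetical key format
        }
        File.WriteAllLines(dtfPath, lines);
    }
}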

[screenshot]

As you can see from the two screenshots above, that is all there is – everything is expression/variable driven: a full load and an incremental load. Using nothing but .dtf files, rxferpcb, and a little .NET app I wrote (to automatically create DTFs for incrementals – WHERE statements – plus truncate, delete, and bulk insert), I can load any table from DB2 to SQL by just setting 3 variables in a parent package.

After you wrap your head around everything I just went over, stop and think about this: the whole DTF/data transfer mechanism is exposed in a COM API, “Data Transfer Automation Objects”.

http://www-912.ibm.com/s_dir/slkbase.NSF/643d2723f2907f0b8625661300765a2a/0c637d6b03f927ff86256a710076ab22?OpenDocument

With that information at your disposal, you could really do some cool things. Why not create an SSIS source adapter that wraps that COM object, dumps the rows directly into the SSIS buffer, and then does an OLE DB insert or bulk insert using the SQL Server destination?

I have found in my tests that I can do a full, complete load of a 100+ million row table in about 6-7 hours, and a 30-40 million row table in about 4 hours – 2 to extract, 2 to BULK INSERT. Again, your mileage may vary depending on the width of your table, network speed, disk I/O, etc. To compare: with ODBC, just pulling and inserting 2 million records was taking over 2 hours – I didn’t wait around for it to finish. Pulling 2 million records with the method described in this blog takes about 3-5 minutes (or less!).

I know I have skimmed over most of the nitty-gritty details in this post, but I hope I have conveyed, at a high level, that ODBC/OLE DB just aren’t as fast as the method here – I have spent a lot of time over the last few weeks comparing and contrasting performance and manageability. Now, if I could just get that DB2 server upgraded to SQL Server 2008… Happy ETL’ing!

Categories
Geeky/Programming Life Product Reviews

Apple Time Capsule Rocks – Microsoft needs to make one.

Since I make a living on (and love working with) Windows, SQL, Windows Server, Office, Exchange, etc., it is kind of weird that I am a Mac guy. Never thought it would happen. iPhone, Mac, MacBook Pro, Apple TV, Time Capsule, accessories…

Anyways, since Ella was born, I have TONS of pics and videos of her on my laptop, and it auto-backs-up to my Apple Time Capsule. Well, iMovie auto-imports videos from iPhoto, and then removes them? I am not sure, but long story short, I was missing some videos from when we got home from the hospital.

So what do I do? Freak out? No… just fire up Time Machine on my Mac, go back to February, and there is my iPhoto Library. I restore it and get the videos back.

Now, I just need to offload it offsite somewhere. There is MacMini Colo – Transport – http://www.macminicolo.net/transport/ – which looks promising; just a little too much $$, but maybe my next step.

But really, Windows needs this. We have a few Windows machines here at home, and I just feel like, umm, stuff is volatile. Thank god for Flickr… that is all I have to say. I know there is Windows Home Server, but that is another BOX, and updates, and whatever. I just want a device, a dumb device. I know I can get a NAS and whatever, but I just want simple. Where is the Microsoft Time Capsule?