Categories
Geeky/Programming

Create a Word Cloud From Your Twitter Feed

I love playing with data, and my own data makes it even more fun. Wordle has been around for a long time, and so has Twitter (in Internet years, anyway). I have always been fascinated by word clouds and visualizing text patterns.

I figured there has got to be some analyzer for your Twitter stream, and I am sure there are a ton, but I didn’t stumble upon any with some easy Googling, so I did it the hard way.

First goal? Get your Twitter feed and/or data somehow. There are multiple ways to do this, but I stumbled upon a pretty cool site, http://tweetbook.in, that lets you create an eBook from your Twitter feed and favorites. It lets you publish out as a PDF or XML file, so I figured that would work. It is a busy site and you may have to wait to get in, but once you do, you just OAuth it up to Twitter and grab your data.

Now, once you have your data, you need to do something with it. The data is in XML, so you need to parse out the parts you want. For instance, I wanted to analyze my “favorites,” so I wanted to get the text out of the XML. Here is my first favorite on Twitter (by the way, it will only go back 3,200 tweets, I think – I only have 2,600 or so faves):

<favorite>
  <id>881539697</id>
  <text>A "Manager" class is like my grandmother's junk drawer.</text>
  <created_at>Fri Aug 08 14:23:56 +0000 2008</created_at>
  <source>web</source>
  <user>jeremydmiller</user>
</favorite>

Well, I just want to grab that <text> value, and without having to do any real programming or PowerShell or C#, I fired up trusty old LINQPad (more on this tool in future posts for sure). I then just wrote a quick little query against the XML file, like so:

var xml = XElement.Load(@"c:\fave.xml");

var query =
  from e in xml.Elements()
  select e.Element("text").ToString().Replace("<text>","").Replace("</text>","").Replace("RT ","");

query.Dump();

As you can see, I am just loading up the XML file and doing some text cleanup (removing the <text> tags and removing RT’s – the old retweet syntax, which muddies up the results). Note that in LINQPad you need to change the query type to C# Statements instead of the default C# Expression.
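As an aside, if you don’t want to hand-strip the tags, XElement has a Value property that returns just the inner text, and it decodes entities like &quot; for you too. A quick sketch of the same query that way (same idea, untested against the tweetbook export):

var xml = XElement.Load(@"c:\fave.xml");

var query =
  from e in xml.Elements()
  select e.Element("text").Value.Replace("RT ", "");

query.Dump();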

Once I had the values I wanted in the results, I did a quick CTRL+A, CTRL+C (it still baffles me how many people don’t know CTRL+A is “Select All”), pasted it into Notepad++ to view it and clean up some HTML characters (quotes, etc.), and then pasted it into Wordle. Here is what I got back:

[Image: Wordle word cloud of my Twitter favorites]

You can see I really like to favorite SQL Server, Microsoft, iPhone, blogs, sqlpass, Google, SharePoint, Twitter, and pretty much everything geeky. Pretty dang cool. Why doesn’t Twitter offer something like this? I think it would be cool. What other cool things have you done with your “data” – and what cool things would you like to see?

Categories
Geeky/Programming Product Reviews

Winforms DevExpress Grid

In the current agile sprint for the dev team I manage, we decided to start replacing ListViews (custom ones at that) and grids with the XtraGrid from DevExpress.

Now, normally I shy away from 3rd party controls, or at least want to vet them, but I knew there was no way we could get the same functionality out of the default grid in .NET. One of the guys on the team did a story the sprint before to do a proof of concept comparing various grids and showing the pros and cons. DevExpress came out ahead in both functionality and performance.

What we are seeing now is huge gains. With the new grid, functionality we couldn’t even begin to think of building ourselves is there by default: grouping, searching, filtering, FAST performance, print preview and formatting, etc. Tons of options.

In fact, there are TOO many options. It makes it hard to digest all the possibilities – what we want to turn off, how to integrate with existing forms, etc. A good problem to have.

All I know is, if you are doing any serious .NET work (WinForms, WPF, even Web/Silverlight), it might make sense to take a look at DevExpress. Focus on your business rules and integrating the other parts of your systems, not reinventing the wheel with a crazy custom grid.

In the coming months I will try to talk about other areas that you can “outsource” to 3rd parties, where it makes sense. (And no, I don’t get anything from DevExpress for this post. Just calling it how I see it).

Categories
Geeky/Programming Work

VB6 to .NET Migration: Preparation and Migration

In my previous post on this topic, VB6 to .NET Migration: Decision and Analysis, I talked about the decision to migrate instead of rewrite, and also the analysis portion of the project.

Once you are ready to go, what happens? You need to prepare for the migration process. What ArtInSoft needs is test cases and test scripts, so they can run through your application on their end in QA (the VB6 version) and then, once they migrate it, run through the same scripts again (on the .NET version).

My guess is most shops have some kind of test scripts, but nothing too extensive. This is where we were scrambling: we had things all over the place and not very complete test scripts/cases. This ended up biting us in the end, as a lot of the app wasn’t covered by test cases, so it was hard for ArtInSoft to test and migrate it without tons of exploratory testing.

Since the migration, we have shored up our test scripts and test cases by using a QA firm, Beta Breakers, but that is for another post.

After ArtInSoft gets your test scripts, they run through them and ask any questions. In the meantime, their dev team is migrating your code and looking for areas they can automate in the migration that aren’t in the main scope of their migration tool.

ArtInSoft sets up a SharePoint portal for your project so you can share documents and other things, as well as a discussion forum for “technical queries” – things they need to know, at a technical level, about what they should do during the migration.

As you can guess, many things in .NET aren’t the same as in VB6 – the way some controls react, etc. They want to know what to do in those cases. Also, you might have some crazy things going on in VB6 that have no direct translation to .NET, so they need to know what to do there. Overall, though, it shouldn’t be too bad.

Along the way, you can get the migrated code every once in a while as they work, look at it, and get it into source control on your end if you’d like. Also, up to a certain point you can send minor changes to ArtInSoft, so you don’t have to have a code freeze for the entire migration. Obviously at a given point you can’t be sending changes; if you have them, you can do a “change control” request, but it is going to cost you, which is expected.

As they work on the migration, you can work on other things on your end, whatever you’d like. Towards the end of the migration (development), you will send 1-2 people down to Costa Rica to help finish the migration and deep dive into testing and development.

The final phase of the migration comes when your team becomes the “testing team” and you do User Acceptance Testing (UAT). You have a specified amount of time to test on your end and send bugs back to ArtInSoft. This can be a good back and forth for 1-2 months, depending on the size of your application. Once you are OK with the code and app, you sign off and the code is in your hands. Next up.. your own “fixing” period.

Categories
Geeky/Programming

Japanese Character Sets – Input, Identification and Comparison

As a byproduct of some work our team is doing, we needed to test and handle different character sets for Asian languages (Katakana, Hiragana, etc). A guy on the team came up with a pretty sweet proof-of-concept test application so we can trial combinations and see how things work based on ignoring widths, etc., even with some database interaction. We figured this would be useful to others in the .NET community, so we are giving it back.
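To give a flavor of what those comparisons boil down to in .NET (this is my own minimal sketch, not code from the project): CompareInfo supports options like IgnoreWidth and IgnoreKanaType, which is how you treat half-width/full-width and hiragana/katakana forms as equal.

using System;
using System.Globalization;

class KanaCompareDemo
{
    static void Main()
    {
        string halfWidth = "ｶﾀｶﾅ";   // half-width katakana
        string fullWidth = "カタカナ"; // full-width katakana
        string hiragana = "かたかな";  // hiragana

        CompareInfo ci = CultureInfo.GetCultureInfo("ja-JP").CompareInfo;

        // 0 means "equal" - half-width and full-width forms match when width is ignored
        Console.WriteLine(ci.Compare(halfWidth, fullWidth, CompareOptions.IgnoreWidth));

        // hiragana and katakana match when kana type is ignored
        Console.WriteLine(ci.Compare(fullWidth, hiragana, CompareOptions.IgnoreKanaType));
    }
}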

Check out http://jpcharsets.codeplex.com/ for more info, but basically you can play around with the different character sets to your heart’s content and look at the source code behind the comparisons of the different sets. Enjoy!

Categories
Geeky/Programming

VB6 to .NET Migration: Decision and Analysis

We recently completed a huge project migrating a fairly decent-sized code base from Visual Basic 6 (VB6) to C# .NET. Yes, I said “migrated,” not rewritten – let’s get that out of the way up front. The code was migrated.

How did we go about doing this? Well, I’d like to detail it out over a few posts here and in the near future.

First, you need to make the decision to migrate instead of rewrite. Not very often do dev shops get the chance to do a rewrite, and even when they do, most rewrites go over time and budget and end up losing business logic and functionality. That is why the decision to migrate instead of rewrite is such a good one.

So, after making that decision – which hopefully you do early, or at least before you are in the middle of a rewrite (as we were) – you can get started. There are a few options, and I won’t go into complete detail on them, but just go over them at a high level:

1. Use the Visual Studio migration tool, convert the code yourself, and go at it alone.

2. Look at the various companies out there that do migrations, pick one, and create a project that way.

We went with #2. After some research it was pretty obvious who the main player out there was: ArtInSoft. After researching them, watching their webcasts, and doing as much reading and looking into them as we could, we contacted them.

Their model is pretty simple: they send someone out to you for a week (on your dime) and they analyze the VB6 project to see how much it would take to migrate.

What I learned shortly after talking with ArtInSoft and bringing one of their analysts in is that ArtInSoft actually created the VB6-to-.NET conversion wizard in Visual Studio. It is a lightweight version of the VBUC (Visual Basic Upgrade Companion) that they licensed to Microsoft. The wizard in Visual Studio is very limited, though: it will only migrate to VB.NET and has many other limitations, but it does still work. If you had a very small project (in lines of code and complexity), you could easily use the wizard in Visual Studio, but most projects aren’t simple and have many lines of code. That is where you would want to use ArtInSoft’s “enterprise” version of the VBUC, as well as assistance from ArtInSoft.

Back to analysis. What they do is actually migrate your project, find any ancillary projects you might have, and make sure everything gets converted. Their VBUC puts EWI’s – Errors, Warnings and Issues – as comments in the newly converted code.

They then go through the code and do finds on all the EWI’s, dump them out and parse them, put them into Excel, and pivot them to analyze how many of each EWI you have. That way they can estimate how much effort/time/money it would take to do your project. They have done this many times, so they have it pretty much down to a science.

An example would be: you have 2,500 instances of warning XYZ, they know it takes 2 minutes to fix one of those, and they are usually all unique – so that is roughly 5,000 minutes (about 83 hours) to resolve them all. Add that up across every EWI type and you have the total effort.

After doing their analysis, they give you a project estimation, and then it is up to you to decide if you want to move forward, and when.

 

To Be Continued…

Categories
Work

Looking to Hire…

I have been managing two different groups @ Trek (Business Intelligence and .NET Software Development) for a while now, and we have some openings we are trying to fill, so that is why I am putting this out here.

First role: looking to hire a Microsoft .NET Windows Forms developer, or someone with web experience looking to get into Windows Forms and eventually WPF/Silverlight, as well as Windows Services. C# is the language.

Second role: looking for a Microsoft Business Intelligence Developer/DBA – SSAS/SSIS/SSRS, DBA experience preferred. Working on cubes, ETL’s, reports, and DBA stuff.

Shoot me an email at steve_novoselac@trekbikes.com with your resume and info.

Categories
Geeky/Programming

Facebook Graph API – Getting Friends and Gender in C#

I recently blogged about the Facebook Graph API, and with the Facebook C# SDK you can start making applications against it.

After I had my Facebook app set up, I started making a C# Console application to just get my friends and see what I could do. Here is a snippet to get my friends and their gender.

// Snippet from a console app's Main (needs using System.Linq; for ElementAt/Count)
string token = "<your access token here>";

Facebook.FacebookAPI api = new Facebook.FacebookAPI(token);

JSONObject f = api.Get("/me/friends");

// The response has one entry, "data", whose value is the array of friends
KeyValuePair<string, JSONObject> friends = f.Dictionary.ElementAt(0);

for (int i = 0; i < friends.Value.Array.Count(); i++)
{
    Console.WriteLine("Friend #" + i.ToString());

    JSONObject friend = api.Get("/" + friends.Value.Array[i].Dictionary["id"].String);

    Console.WriteLine(friend.Dictionary["name"].String);

    try
    {
        Console.WriteLine(friend.Dictionary["gender"].String);
    }
    catch (System.Collections.Generic.KeyNotFoundException)
    {
        Console.WriteLine("No Gender Specified");
    }

    Console.WriteLine();
}

Console.ReadLine();

There is probably a better way to do this – getting the JSONObject back and then pulling values out of it, I just kind of brute forced it. Also, for handling friends that don’t have information set, the quick and dirty way was to just catch the exception. I know there has to be a better way, but for now it works.
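One cleaner option, assuming the SDK’s Dictionary property is a normal generic dictionary (it appears to be): use TryGetValue instead of catching the exception.

// Check for the key up front instead of using exceptions for flow control
JSONObject gender;
if (friend.Dictionary.TryGetValue("gender", out gender))
{
    Console.WriteLine(gender.String);
}
else
{
    Console.WriteLine("No Gender Specified");
}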

Categories
Geeky/Programming

Fun with the Facebook Graph API

Lazy Sunday afternoon, so I decided to dig a bit into the Facebook Graph API. What is the Graph API? It allows you to create an application, use OAuth to connect, and then get information about yourself and your friends and do things in Facebook using JSON objects and requests. An easy way to read/write from Facebook.

But the kicker seems to be getting things started. Facebook has documentation and links and wikis all over the place, so it is hard to get a handle on what you want or need to do.

First you need to create an application. Once you do that you need to set up some things so you can actually use Facebook data.

Once you have that done, you can see your apps here

You will want to take note of a few things and change some settings. First might be to put your app in “sandbox” mode.

Also, under “Authentication” you might want to change the Authentication Callback URLs, and then under “Connect” change your “Connect URL” – you want it in the form http://blah.com/

So to test your app and see what you can do, you need to know your

ApplicationId
API Key
Secret

Now, you can make unauthenticated calls to the Graph API. Try it:

https://graph.facebook.com/56011561

That’s me. You should be able to see public info on me (don’t try it in IE, it pukes; Chrome was working for me. You might need to be logged into FB).

Anyways, if you want your app to be able to get more than the “public” unauthorized view, you need to make some calls and get some access tokens..

First you need to get an “access code”

https://graph.facebook.com/oauth/authorize?client_id=<your app id>&redirect_uri=<your redirect url>&scope=user_photos,email,user_birthday,user_online_presence,offline_access,friends_birthday,friends_education_history,friends_hometown,friends_location,friends_relationships,friends_religion_politics,friends_likes,friends_interests,friends_groups

You can see in that URL, you have to supply your app id and your redirect url. Keep it simple: your redirect url should be something like http://blah.com/ .. don’t have querystring params.

If you have things set up, facebook should authenticate you, and ask you to allow your app to have permissions to pretty much “EVERYTHING” on your profile and friends info. To see what “extended permissions” you can use, see here: http://developers.facebook.com/docs/authentication/permissions

You *should* get redirected to your redirect_url with a param in the url called code=

Grab that code, you will need it.

Now to get the access_token

https://graph.facebook.com/oauth/access_token?client_id=<your app id>&redirect_uri=<your redirect url>&client_secret=<your secret>&code=<your access code>

Once you hit the above URL with the code from step one, you will get an access token.

Take that token, and then try:

https://graph.facebook.com/me?access_token=<your token>

You can also get your friends’ ids:

https://graph.facebook.com/me/friends?access_token=<your token>

Once you get your friends list, you can use their ids and get info back from the Graph.
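If you want to script those same manual calls instead of pasting URLs into a browser, a few lines of C# with WebClient will do it. This is just a sketch of the calls above; the token placeholder is whatever you got back from the access_token step:

using System;
using System.Net;

class GraphDemo
{
    static void Main()
    {
        string accessToken = "<your access token>"; // from the steps above

        using (WebClient client = new WebClient())
        {
            // Same as hitting the URLs in the browser; both return raw JSON strings
            Console.WriteLine(client.DownloadString(
                "https://graph.facebook.com/me?access_token=" + accessToken));

            Console.WriteLine(client.DownloadString(
                "https://graph.facebook.com/me/friends?access_token=" + accessToken));
        }
    }
}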

If you get errors or did something wrong, you might have screwed up your first requests. There are sites and forums saying to add type=client_cred - don't do that; it will give you a shorter access token, which doesn't work.

Once you have all that working *MANUALLY*, then check out Facebook's official C# SDK for the Graph - http://github.com/facebook/csharp-sdk

Basically then you can replace the access token there with your token and test things.

Then just tie it all together in an app so you can do it programmatically, but that is for another blog post 🙂

Categories
Business Intelligence Geeky/Programming

Microsoft Silverlight PivotViewer: Getting Started and Business Case

I have been reading about Microsoft’s PivotViewer lately, and decided to try to get it going for myself. What is PivotViewer? Think of it as visual data slicing through a web page.

What you do is take some data, tie records to images, and then publish out your “collection,” which you can consume via a web page using the Silverlight PivotViewer control. One awesome example of this is here: http://netflixpivot.cloudapp.net/. But what I have really been trying to wrap my head around is how to use this as a “business” tool. Because it is easily doable technically, but you have to have a *reason* to do it.

Working with widgets and customers and locations – what do you do? There are two things I could think of quickly: one, peruse your “master data” very fast and visually; the other, look at some kind of metrics for your widgets, or logos of the customers you might sell to. Well, you could..

  • Master Data/Catalog – Show pictures of your widgets, and create filters (they call them facets) for things like size, color, weight, model, etc. You have “one” of each and you just want to see what you offer. Almost could be a pretty sweet online catalog browser.

  • Sales/Metrics – Do the same as a master data catalog, but allow filtering by some kind of metric. Shipped items over a given time or something.

  • Something Else I Haven’t Thought Of?

Anyways, the first thing you should do before anything else is get some kind of data feed. Run a query, get some data from somewhere. Start small to test: 500-1000 records.

Then the fun begins. Starting from absolute scratch..

OK, yeah, tons of setup. The biggest thing is that in IIS you need to set some MIME types: .cxml, .dzi, and .dzc need to be “text/xml”.

Once you have all that set up, you can do two things: create your collection, and create your app. Create a blank Silverlight project first.

Once you have that, there isn’t a ton you have to do to get things going with PivotViewer.

  1. MainPage.xaml
     Add in your MainPage.xaml a namespace line for Pivot, and add the control. Your end MainPage.xaml should look something like this (the stock namespaces plus the Pivot one; the Jun10 CTP control lives in the System.Windows.Pivot assembly):

     <UserControl x:Class="SilverlightApplication1.MainPage"
         xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
         xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
         xmlns:Pivot="clr-namespace:System.Windows.Pivot;assembly=System.Windows.Pivot">
         <Grid x:Name="LayoutRoot" Background="White">
             <Pivot:PivotViewer x:Name="pvWidgets" />
         </Grid>
     </UserControl>

  2. Reference Assemblies
     For good measure, just reference them all, located here: C:\Program Files\Microsoft SDKs\Silverlight\v4.0\PivotViewer\Jun10\Bin

  3. Load Collection
     pvWidgets.LoadCollection("http://localhost/SilverlightApplication1.Web/MyWidgets.cxml", null);
    

Note: you have to make the web part of your project IIS-based instead of using the built-in development web server. Why? Because the .cxml HAS to be hosted on a web server; it just works that way.
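For context, the code-behind ends up tiny. Something like this (a sketch assuming the XAML above, with the control named pvWidgets):

using System.Windows.Controls;

namespace SilverlightApplication1
{
    public partial class MainPage : UserControl
    {
        public MainPage()
        {
            InitializeComponent();

            // Load the published collection; the second argument is an optional
            // initial viewer state - null just loads the collection as-is
            pvWidgets.LoadCollection("http://localhost/SilverlightApplication1.Web/MyWidgets.cxml", null);
        }
    }
}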

Now you need to create your collection. You can use a command line tool they offer on the PivotViewer site, and they also have a C# library for automating things, but it is best to first just do it manually. So I used the tool they have as an add-in for Excel. It adds a nice little “Pivot Collections” tab.

You can use this to put some data in; you probably want to add more columns than what they give you by default. For my test I just used the same image for all records to get started. I have a feeling the biggest barrier to entry for corporate BI teams and developers is going to be the imagery. You usually don’t have someone on your BI team who knows how to use Photoshop well and do all the high-res imagery, so you are handcuffed there.

For testing’s sake, I published my collection to the root of my website, with the name “MyWidgets”.

I loaded up my web page, and I could slice and dice by all the columns I had in my collection, visually. Pretty dang awesome. (Note: I just made some fake data based on attributes I am used to seeing, with an image of a bike, to see what it would look like – the goal was figuring out how this would work in conjunction with current BI offerings (cubes/Pivot Tables, SSRS, PowerPivot, etc).)

Now, think about where you could take this. Each “image” is clickable and brings up the image right in front of you. You could have all the specs of that widget there, and a link to “buy,” or deeper analytics for that widget.

Some other things I found out: using the Excel tool for Pivot Collections is dog slow, especially with a ton of records. It has to process the images for the “deep zoom” technology and it just takes a while. Like, hours.

There are tons of possibilities here with PivotViewer, both for an external website and also internal corporate business intelligence. It will give people another way to delve into the data and turn it into information.

Categories
Business Intelligence Geeky/Programming SQLServerPedia Syndication

SSIS – Custom Control Flow Component – Execute SQL Job And Wait

Sometimes you have some pretty complex ETL’s going in SSIS, and you might have multiple projects/solutions that need to call other SSIS packages or SQL Agent jobs – a pretty big production going on. An ETL solution that needs to kick off other packages can either import those into the solution or call them where they lie on the file system/SQL Server. You might have to call some SQL Agent jobs, and most often these are async calls (you don’t need to wait for them to come back). This works nicely; I do this all the time. The Execute SQL Server Agent Job Task in SSIS works nice, or you can just call the SQL statement to execute a job. Either way, it kicks off the job, comes back successful right away, and doesn’t care if the job actually succeeds. You might want this in some scenarios, and the built-in functionality works great.

But what if you want to call an existing SQL Agent job and actually wait for it to finish (success or failure)? There isn’t anything I could see built into SSIS to do this; sp_start_job is asynchronous, so you are out of luck there. I figured I could call sp_start_job, then create a for loop in SSIS and just check the status every X seconds/minutes, but I would have to either make that a package I could use everywhere or reproduce the same logic in multiple solutions, so I shied away from that approach.

What I decided to do was build a custom SSIS control flow task in .NET that will execute a SQL Agent job, check the status, and wait until it finishes. A disclaimer: this is going to be a lot of code 🙂 and it could be improved (but what couldn’t?) – this was a 1.5-2 hour experiment.

First, I created a VS2008 C# class library. I tried adding a UI to my task, but I couldn’t get it working, so there is some code there for that but it’s commented out.

Here is what my solution looks like:

[Screenshot: the solution structure in Visual Studio]

import the correct namespaces:

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.SqlServer.Dts.Runtime;
using System.Net;
using System.Net.NetworkInformation;
using System.Xml;
using Microsoft.SqlServer.Dts.Runtime.Design;
using System.Data.SqlClient;

Next, you need to create the actual skeleton/wrapper for your component. You can see I have two properties: job name and server name. It could be expanded to take a connection string or use an existing connection in SSIS; I wasn’t that ambitious. The “Execute” method basically just calls some helper functions and waits for the result.

namespace ExecuteSQLJobAndWaitControlTask
{

    [DtsTask(
        Description = "Execute SQL Job And Wait",
        DisplayName = "Execute SQL Job And Wait",
        TaskContact = "Steve Novoselac",
        TaskType = "SSIS Helper Task",
        RequiredProductLevel = DTSProductLevel.None)]
    public class ExecuteSQLJobAndWaitControlTask : Task, IDTSComponentPersist

    {
        private string _jobName;
        private string _serverName;

        /// <summary>
        /// The sql job name
        /// </summary>
        public string JobName
        {
            get { return _jobName; }
            set { _jobName = value; }
        }

        /// <summary>
        /// The sql server name
        /// </summary>
        public string ServerName
        {
            get { return _serverName; }
            set { _serverName = value; }
        }

        public override DTSExecResult Execute(Connections connections, VariableDispenser variableDispenser, IDTSComponentEvents componentEvents, IDTSLogging log, object transaction)
        {
            try
            {
                StartJob();

                // Give the agent a moment to actually start the job, then poll until it stops running
                System.Threading.Thread.Sleep(5000);

                do
                {
                    System.Threading.Thread.Sleep(5000);
                } while (IsJobRunning());

                if (DidJobSucceed())
                {
                    return DTSExecResult.Success;
                }
                else
                {
                    return DTSExecResult.Failure;
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
                return DTSExecResult.Failure;
            }
        }

        public override DTSExecResult Validate(Connections connections, VariableDispenser variableDispenser, IDTSComponentEvents componentEvents, IDTSLogging log)
        {
            if (string.IsNullOrEmpty(_serverName) || string.IsNullOrEmpty(_jobName))
            {
                componentEvents.FireError(0, "You must specify a JobName and ServerName in the properties", "", "", 0);
                return DTSExecResult.Failure;
            }
            else
            {
                return DTSExecResult.Success;
            }
        }

        void IDTSComponentPersist.LoadFromXML(System.Xml.XmlElement node, IDTSInfoEvents infoEvents)
        {
            if (node.Name != "ExecuteSQLJobAndWaitTask")
            {
                throw new Exception(string.Format("Unexpected task element when loading task - {0}.", "ExecuteSQLJobAndWaitTask"));
            }
            else
            {
                this._jobName = node.Attributes.GetNamedItem("JobName").Value;
                this._serverName = node.Attributes.GetNamedItem("ServerName").Value;
            }
        }

        void IDTSComponentPersist.SaveToXML(System.Xml.XmlDocument doc, IDTSInfoEvents infoEvents)
        {
            XmlElement taskElement = doc.CreateElement(string.Empty, "ExecuteSQLJobAndWaitTask", string.Empty);

            XmlAttribute jobNameAttribute = doc.CreateAttribute(string.Empty, "JobName", string.Empty);
            jobNameAttribute.Value = this._jobName.ToString();
            taskElement.Attributes.Append(jobNameAttribute);

            XmlAttribute serverNameAttribute = doc.CreateAttribute(string.Empty, "ServerName", string.Empty);
            serverNameAttribute.Value = this._serverName.ToString();
            taskElement.Attributes.Append(serverNameAttribute);

            doc.AppendChild(taskElement);
        }

And then I have some helper methods; this is where the meat and potatoes are for this task. Of course I could factor out the connection string, etc. Like I said, it was a quick thing :). The heart of it is starting the job, checking if it is still running, and then checking if it succeeded. Pretty simple.


        private bool DidJobSucceed()
        {
            SqlConnection dbConn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" + ServerName);
            SqlCommand dbCmd = new SqlCommand("exec msdb.dbo.sp_help_job @job_name = N'" + JobName + "' ;", dbConn);
            dbConn.Open();

            SqlDataReader dr = dbCmd.ExecuteReader();
            dr.Read();
            // sp_help_job's last_run_outcome: 1 = succeeded
            int status = Convert.ToInt32(dr["last_run_outcome"]);
            dr.Close();

            dbConn.Close();

            if (status == 1)
            {
                return true;
            }
            else
            {
                return false;
            }
        }

        private bool IsJobRunning()
        {

            SqlConnection dbConn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" + ServerName);
            SqlCommand dbCmd = new SqlCommand("exec msdb.dbo.sp_help_job @job_name = N'" + JobName + "' ;", dbConn);
            dbConn.Open();

            SqlDataReader dr = dbCmd.ExecuteReader();
            dr.Read();
            // sp_help_job's current_execution_status: 1 = executing
            int status = Convert.ToInt32(dr["current_execution_status"]);
            dr.Close();

            dbConn.Close();

            if (status == 1)
            {
                return true;
            }
            else
            {
                return false;
            }

        }

        private void StartJob()
        {
            SqlConnection dbConn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" + ServerName);
            SqlCommand dbCmd = new SqlCommand("EXEC dbo.sp_start_job N'" + JobName + "' ;", dbConn);
            dbConn.Open();
            dbCmd.ExecuteNonQuery();
            dbConn.Close();
        }
    }
}

Now, to install this you need to register it in the GAC (global assembly cache) and then copy it to the DTS\Tasks folder. Depending on whether you have VS2005 or VS2008 (or both), your gacutil path might be different.

cd\
c:
cd C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin
gacutil /uf "ExecuteSQLJobAndWaitTask"
gacutil /if "C:\Projects\SSISCustomTasks\ExecuteSQLJobAndWait\bin\Debug\ExecuteSQLJobAndWaitTask.dll"
copy "C:\Projects\SSISCustomTasks\ExecuteSQLJobAndWait\bin\Debug\ExecuteSQLJobAndWaitTask.dll" "C:\Program Files\Microsoft SQL Server\90\DTS\Tasks"

I have found that once you have done that, you need to actually restart your SSIS service to make it work, but then you can use it in new Visual Studio SSIS packages.

[Screenshot: the Execute SQL Job And Wait task in the SSIS Toolbox]

Once you drag it onto your package, you can set the JobName and ServerName properties (from the Properties window – remember, no GUI) and it should run.

Some notes:

If you kill the job, the SSIS task will fail (obviously). If you kill the SSIS package, the job will keep running. Maybe a future enhancement will be to capture the SSIS package fail/cancel and kill the job. Maybe 🙂
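If I ever do that enhancement, the kill itself would just be another helper along the lines of the ones above, calling sp_stop_job (a sketch, untested):

        private void StopJob()
        {
            // sp_stop_job is the msdb counterpart to sp_start_job; it requests the job to stop
            SqlConnection dbConn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" + ServerName);
            SqlCommand dbCmd = new SqlCommand("EXEC msdb.dbo.sp_stop_job N'" + JobName + "' ;", dbConn);
            dbConn.Open();
            dbCmd.ExecuteNonQuery();
            dbConn.Close();
        }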

Attached is the source code for the task (Vs2008 C#) https://onedrive.live.com/redir?resid=ac05d3c752d3b50a!187358&authkey=!APJ06uEMqWiLlIM&ithint=file%2crar

This has been tested with BIDS/VS2005. I take no responsibility if this blows up your system, computer, server, the world, etc.

Happy ETL’ing!