Figured out today that in SSRS, if you add a filter to your dataset and want to use “Like”, the wildcard character is * instead of the % you might be used to in SQL. Sometimes it is just the little things 🙂
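For example, assuming a hypothetical DimProduct table with a ProductName column, the same contains-style match looks like this in T-SQL versus an SSRS dataset filter:

-- T-SQL: % is the wildcard
SELECT ProductName FROM DimProduct WHERE ProductName LIKE '%Bike%'
-- SSRS dataset/tablix filter: use * instead
--   Expression: [ProductName]   Operator: Like   Value: *Bike*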
In many shops, I am guessing there are multiple servers. Development, Production, maybe a Staging, etc.
With SQL Server Analysis Services (SSAS) and SQL Server Integration Services (SSIS), you set up data sources – connections to databases. In SSAS you usually have a connection that you build your data source view off of, and in SSIS you have connections that you push data to and pull data from.
Another thing, in SSAS you can “deploy” right from Visual Studio (BIDS). All these things have a server name. What we have run into is this:
You develop on your local machine, pointing at development server. You deploy to development, your connections are pointing to development, and everything works great. When you deploy to production (usually planned, every 2 weeks, or whatever) you deploy your stuff and what ends up happening?
In SSIS your config files should have a connection string (or however you store it) and it should point to production. But in SSAS, if you deploy from BIDS, your data source will have to change and in the cube project properties you need to change your deployment server.
I have seen it countless times: a cube, or a connection in SSIS without a config, running in production yet pointing at development. We keep our dev data fresh, or very close, so sometimes we don’t even notice, but then it happens: something weird is reported, we dig into it, and we find the erroneous connection string.
Here is my solution to the problem:
Developers – go to C:\Windows\System32\Drivers\Etc and open your Hosts file with Notepad or another text editor. Then add a couple of entries:
#production
#xxx.xxx.xxx.xxx datawarehouse
#development
yyy.yyy.yyy.yyy datawarehouse
where xxx.xxx.xxx.xxx is the IP of your production system and yyy.yyy.yyy.yyy is the IP of your dev system. The # is the comment symbol. You can see above I have everything commented out except the line for the dev system. But notice each entry points to “datawarehouse”, so if I ping or connect to “datawarehouse” from Management Studio, or whatever, it goes to the IP that is not commented out.
Now, do the same on each server, but only add the line that corresponds to that server in its hosts file, or better yet just use
127.0.0.1 datawarehouse
Now, when you deploy to either server, as long as your connections, etc. are set to connect to “datawarehouse”, you ensure they will always connect to the local server. Brilliant!
Last night I configured SQL Server Master Data Services (MDS) on a test box. It looks good so far. I ran into a few issues with the box/setup that I had to tweak in order to get it working. I had to allow “handlers” and “modules” in the applicationHost.config on the machine. IIS was also inadvertently set up with only anonymous access, which was an issue; after I got Windows auth installed and turned on, everything seemed to work.
The app/system itself is pretty slick. Very basic, but it lets you do complex things. Once you get some users set up and a few models (think: Product, Customer), you can add entities (think: Category1, Category2, etc.), set up hierarchies, business rules, and so on.
I haven’t played much more with it, but it seems like it could get the job done. I would say some things aren’t intuitive enough. For example, it could say “drag this over to this area”, but there is nothing telling you what to do; it’s kind of guess and check.
I’m excited to see where MDS goes.
Just updated my work laptop to Windows 7. Sweet right? Except now the fun of getting everything to work.
First issue I ran into: a web app in development could not load “Microsoft.AnalysisServices.AdomdClient” because it was looking for version “9.0.242.0” – the version you get with Office 2007 / SQL 2005. But I have Office 2010 and the SQL 2008 stack on this new build, hence the issue. Of course I could install the SQL 2005 drivers, but I can also redirect the assembly in the Web.config.
I was also getting some weird “permission denied” errors for third-party assemblies when trying to debug/run it locally. Since the web app is impersonating a user, I figured I needed to give that user rights to the Temporary ASP.NET Files folder, which worked, but in Win7 (and Vista?) the path is different; it is
%LocalAppData%\Temp\Temporary ASP.NET Files
I gave the user modify and my app finally builds!
Hopefully not too many more roadblocks going forward..
SQL 2005 introduced DDL triggers, which let you catch events from your DDL statements. Hopefully most DBAs are capturing them, at least to a table, and have some kind of report on who is adding, changing, and deleting which objects.
What I wanted to do was capture that, but also keep an automatic running log in SVN (Subversion) source control. Here is how I went about it.
First, you need SVN set up somewhere. We use Unfuddle (http://unfuddle.com/) – which we also use for Agile BI stuff, but that is another post 🙂 – and Unfuddle also lets us do SVN in the cloud, but you could do it locally or internally or however you’d like. I had to create a user for this to do the automatic commits, and give it rights to commit to the repo.
Second, you probably already have DDL triggers set up to capture events on your databases and write them to a table; writing to a central DB on a server is probably a good idea. Given that, or something similar, I then created a trigger on that table to capture the INSERT action, take the record that was just inserted, parse what I need, and automatically commit to SVN.
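If you don’t have that logging piece yet, here is a minimal sketch of what it could look like, assuming a log table shaped the way the revision-history trigger further down reads it (an XML column named EventInstance); your existing table and trigger names will likely differ:

CREATE TABLE dbo.DDLEventLog
(
DDLEventLogId INT IDENTITY(1,1) PRIMARY KEY,
EventInstance XML NOT NULL, -- raw EVENTDATA() from the DDL event
InsertedOn DATETIME NOT NULL DEFAULT GETDATE()
)
GO

-- database-level DDL trigger that just dumps EVENTDATA() into the log table
CREATE TRIGGER CaptureDDLEvents
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
SET NOCOUNT ON;
INSERT INTO dbo.DDLEventLog (EventInstance)
VALUES (EVENTDATA());
END
GO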
But first, you want to script out all your objects for a database to a folder, say C:\DBSchemas
I created a folder structure to help me
DBSchemas\ServerName\DatabaseName\ObjectType
and then scripted out all the objects to each folder for ObjectType (Tables, StoredProcedures, Functions, and Views)
Once that was done, I did an initial import into SVN and made the folder I was working with a working copy. Then the fun starts.
I created a couple of procs (which I found online, links to the blogs are below) to CREATE and DELETE files from T-SQL using OLE automation
CREATE/APPEND (http://sqlsolace.blogspot.com/2009/01/ole-automation-write-text-file-from.html)
CREATE PROCEDURE [Utils].[usp_OLEWriteFile] (@FileName VARCHAR(1000), @TextData NVARCHAR(MAX), @FileAction VARCHAR(12))
AS
BEGIN
DECLARE @OLEfilesytemobject INT
DECLARE @OLEResult INT
DECLARE @FileID INT

EXECUTE @OLEResult = sp_OACreate 'Scripting.FileSystemObject', @OLEfilesytemobject OUT
IF @OLEResult <> 0 PRINT 'Error: Scripting.FileSystemObject'

-- check if file exists
EXEC sp_OAMethod @OLEfilesytemobject, 'FileExists', @OLEresult OUT, @FileName

-- if file exists
IF (@OLEresult = 1 AND @FileAction = 'APPEND') OR (@OLEresult = 0)
BEGIN
    IF (@FileAction = 'CREATENEW') PRINT 'New file specified, creating...'
    IF (@OLEresult = 1 AND @FileAction = 'APPEND') PRINT 'File exists, appending...'
    IF (@OLEresult = 0 AND @FileAction = 'APPEND') PRINT 'File doesnt exist, creating...'

    -- open file
    EXECUTE @OLEResult = sp_OAMethod @OLEfilesytemobject, 'OpenTextFile', @FileID OUT, @FileName, 8, 1
    IF @OLEResult <> 0 PRINT 'Error: OpenTextFile'

    -- write Text1 to the file
    EXECUTE @OLEResult = sp_OAMethod @FileID, 'WriteLine', Null, @TextData
    IF @OLEResult <> 0 PRINT 'Error : WriteLine' ELSE PRINT 'Success'
END

IF (@OLEresult = 1 AND @FileAction = 'CREATENEW') PRINT 'File Exists, specify APPEND if this is the desired action'

EXECUTE @OLEResult = sp_OADestroy @FileID
EXECUTE @OLEResult = sp_OADestroy @OLEfilesytemobject
END
GO
DECLARE @Result INT
DECLARE @FSO_Token INT
EXEC @Result = sp_OACreate 'Scripting.FileSystemObject', @FSO_Token OUTPUT
EXEC @Result = sp_OAMethod @FSO_Token, 'DeleteFile', NULL, 'C:\delete-me-file.txt'
EXEC @Result = sp_OADestroy @FSO_Token
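The revision-history trigger below also calls a dbo.usp_OLEDeleteFile proc. A minimal sketch that wraps the delete script above into a proc matching that call (the name and single @FileName parameter are assumed from the trigger, and error handling is kept to the same bare PRINT style as the write proc):

CREATE PROCEDURE dbo.usp_OLEDeleteFile (@FileName VARCHAR(1000))
AS
BEGIN
DECLARE @Result INT
DECLARE @FSO_Token INT

-- create the FileSystemObject, delete the file, then clean up
EXEC @Result = sp_OACreate 'Scripting.FileSystemObject', @FSO_Token OUTPUT
IF @Result <> 0 PRINT 'Error: Scripting.FileSystemObject'

EXEC @Result = sp_OAMethod @FSO_Token, 'DeleteFile', NULL, @FileName
IF @Result <> 0 PRINT 'Error: DeleteFile'

EXEC @Result = sp_OADestroy @FSO_Token
END
GO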
You need to make sure OLE Automation is on. You need to make sure that the account you are running SQL as has modify rights to your DBSchemas folder.
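Both OLE Automation and xp_cmdshell (which the trigger below shells out to for the svn commands) are off by default. A quick sketch of turning them on with sp_configure, assuming you are OK with the security implications on that server:

-- enable the surface-area options this approach relies on
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ole Automation Procedures', 1;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;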
But the crux of the solution is the trigger that gets the DDL info, writes/deletes the files, and does the svn add/delete/commit. This is some ugly one-hour SQL script craziness, tons of IF statements, etc. It could be improved, but it works and it is a start; it can be modified and tweaked to do whatever you want. Note: if your SVN repo isn’t authenticated, you don’t need the username/password on the svn commands.
You can see it gets the DDL, checks the event (and I have it limited to one database), then checks what type of object and what operation it is: for an add, it adds the file and commits; for an alter, it deletes the file, recreates it, and commits; and for a drop, it does an svn delete and commit. Pretty easy 🙂
CREATE TRIGGER DDLRevisionHistory
ON dbo.DDLEventLog
AFTER INSERT
AS
BEGIN
SET NOCOUNT ON;
DECLARE @EventType VARCHAR(50)
DECLARE @DatabaseName VARCHAR(50)
DECLARE @ServerName VARCHAR(50)
DECLARE @ObjectName VARCHAR(100)
DECLARE @SchemaName VARCHAR(10)
DECLARE @CommandText VARCHAR(MAX)
DECLARE @LoginName VARCHAR(50)
SELECT
@EventType = EventInstance.value('(//EventType)[1]', 'varchar(50)'),
@DatabaseName = EventInstance.value('(//DatabaseName)[1]', 'varchar(50)'),
@ServerName = EventInstance.value('(//ServerName)[1]', 'varchar(50)'),
@ObjectName = EventInstance.value('(//ObjectName)[1]', 'varchar(50)'),
@SchemaName =EventInstance.value('(//SchemaName)[1]', 'varchar(50)'),
@CommandText = EventInstance.value('(//TSQLCommand//CommandText)[1]', 'varchar(max)'),
@LoginName = EventInstance.value('(//LoginName)[1]', 'varchar(50)')
FROM inserted
DECLARE @filepath VARCHAR(8000)
SET @filepath = 'C:\DBSchemas\' + @ServerName + '\' + @DatabaseName + '\'
IF (
@EventType = 'CREATE_VIEW' OR @EventType = 'ALTER_VIEW' OR @EventType = 'DROP_VIEW'
OR @EventType = 'CREATE_TABLE' OR @EventType = 'ALTER_TABLE' OR @EventType = 'DROP_TABLE'
OR @EventType = 'CREATE_PROCEDURE' OR @EventType = 'ALTER_PROCEDURE' OR @EventType = 'DROP_PROCEDURE'
OR @EventType = 'CREATE_FUNCTION' OR @EventType = 'ALTER_FUNCTION' OR @EventType = 'DROP_FUNCTION'
)
AND @DatabaseName = 'YourDatabase' BEGIN
-- write out new file to correct folder
IF CHARINDEX('VIEW',@EventType) > 0 BEGIN
SET @filepath = @filepath + 'Views\' + @SchemaName + '.' + @ObjectName + '.View.sql'
END
IF CHARINDEX('TABLE',@EventType) > 0 BEGIN
SET @filepath = @filepath + 'Tables\' + @SchemaName + '.' + @ObjectName + '.Table.sql'
END
IF CHARINDEX('PROCEDURE',@EventType) > 0 BEGIN
SET @filepath = @filepath + 'StoredProcedures\' + @SchemaName + '.' + @ObjectName + '.StoredProcedure.sql'
END
IF CHARINDEX('FUNCTION',@EventType) > 0 BEGIN
SET @filepath = @filepath + 'Functions\' + @SchemaName + '.' + @ObjectName + '.UserDefinedFunction.sql'
END
IF CHARINDEX('CREATE',@EventType) > 0 BEGIN
-- create file
EXEC dbo.usp_OLEWriteFile @filepath,@CommandText,'CREATENEW'
-- svn add
DECLARE @instrAdd VARCHAR(4000)
SET @instrAdd='svn add ' + @filepath + ' --username dbschema --password yourpassword'
EXEC xp_cmdshell @instrAdd
-- svn commit
DECLARE @instrCommitAdd VARCHAR(4000)
SET @instrCommitAdd='svn commit ' + @filepath + ' --message "added by '+ @LoginName +'" --username dbschema --password yourpassword'
EXEC xp_cmdshell @instrCommitAdd
END
IF CHARINDEX('ALTER',@EventType) > 0 BEGIN
--delete and readd file
EXEC dbo.usp_OLEDeleteFile @filepath
EXEC dbo.usp_OLEWriteFile @filepath,@CommandText,'CREATENEW'
-- svn commit
DECLARE @instrCommitChange VARCHAR(4000)
SET @instrCommitChange='svn commit ' + @filepath + ' --message "changed by '+ @LoginName + '" --username dbschema --password yourpassword'
--PRINT @instrCommitChange
EXEC xp_cmdshell @instrCommitChange
END
IF CHARINDEX('DROP',@EventType) > 0 BEGIN
-- svn delete
DECLARE @instrDel VARCHAR(4000)
SET @instrDel='svn delete ' + @filepath + ' --username dbschema --password yourpassword'
EXEC xp_cmdshell @instrDel
-- svn commit
DECLARE @instrCommitDel VARCHAR(4000)
SET @instrCommitDel='svn commit ' + @filepath + ' --message "deleted by '+ @LoginName +'" --username dbschema --password yourpassword'
EXEC xp_cmdshell @instrCommitDel
END
END
END
As you can see, you can create a homegrown revision history of your DDL objects in SQL. I have tested this on the basic operations (no renames, etc.) using the GUI, but if you do use it, you might want to wrap it all in exception handling just to be on the safe side.
Happy DBA’ing 🙂
I use SQL Agent a ton. It is my go-to place to schedule jobs and tasks. Problem is, there are pieces “missing”. For example, if a job hangs, you can’t have it auto-stop after a given amount of time. It doesn’t alert on changes to a job, etc., etc.
I asked on Twitter what the DBAs think; @SQLRockstar thinks OpsMgr is the way, which probably would be great, if I had it. I need a quick and dirty solution.
What I did was query the sysjobs, sysjobactivity, and sysjobhistory tables to get some kind of “look” into the running jobs. That way, if a job that is normally 45 minutes has been running for 5 hours, I should be able to catch it early on, just trying to be more proactive instead of reactive.
SELECT
    j.job_id AS 'JobId',
    name AS 'JobName',
    start_execution_date AS 'StartTime',
    stop_execution_date AS 'StopTime',
    avgruntimeonsucceed,
    DATEDIFF(s, start_execution_date, GETDATE()) AS 'CurrentRunTime',
    CASE WHEN stop_execution_date IS NOT NULL
         THEN DATEDIFF(ss, start_execution_date, stop_execution_date)
         ELSE 0 END 'ActualRunTime',
    CASE
        WHEN stop_execution_date IS NULL THEN 'JobRunning'
        WHEN DATEDIFF(ss, start_execution_date, stop_execution_date) > (AvgRunTimeOnSucceed + AvgRunTimeOnSucceed * .05)
            THEN 'LongRunning-History'
        ELSE 'NormalRunning-History'
    END 'JobRun',
    CASE
        WHEN stop_execution_date IS NULL THEN
            CASE WHEN DATEDIFF(ss, start_execution_date, GETDATE()) > (AvgRunTimeOnSucceed + AvgRunTimeOnSucceed * .05)
                 THEN 'LongRunning-NOW'
                 ELSE 'NormalRunning-NOW'
            END
        ELSE 'JobAlreadyDone'
    END AS 'JobRunning'
FROM msdb.dbo.sysjobactivity ja
INNER JOIN msdb.dbo.sysjobs j ON ja.job_id = j.job_id
INNER JOIN
(
    SELECT job_id,
           AVG((run_duration / 10000 * 3600) + ((run_duration % 10000) / 100 * 60) + (run_duration % 10000) % 100)
         + STDEV((run_duration / 10000 * 3600) + ((run_duration % 10000) / 100 * 60) + (run_duration % 10000) % 100) AS 'AvgRuntimeOnSucceed'
    FROM msdb.dbo.sysjobhistory
    WHERE step_id = 0 AND run_status = 1
    GROUP BY job_id
) art ON j.job_id = art.job_id
WHERE (stop_execution_date IS NULL)
   OR (DATEDIFF(ss, start_execution_date, stop_execution_date) > 60
       AND CAST(LEFT(start_execution_date, 11) AS DATETIME) = CAST(LEFT(GETDATE(), 11) AS DATETIME))
ORDER BY start_execution_date DESC
The inner query looks at the history to get the average runtime, and I added the standard deviation to that to make it more realistic, in case you have a one-off that throws the average off. Of course this all depends on you having more than 3 entries in your job history as well.
I then just compare the average to the current runtime (if the job is executing) and to the actual (if it is complete).
You could use this query in a report you check throughout the day to catch long-running jobs, or you could filter it down to just the jobs that are currently running and over the time threshold and have it alert you, so you can catch them right away; a sketch of that is below.
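For the alerting piece, here is a quick and dirty sketch, assuming Database Mail is already configured, the profile and recipient are made up, and you have wrapped the query above in a view (here called dbo.LongRunningJobCheck, also a made-up name) in whatever utility database the check runs from:

IF EXISTS (SELECT 1 FROM dbo.LongRunningJobCheck WHERE JobRunning = 'LongRunning-NOW')
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DBA Mail',            -- hypothetical Database Mail profile
        @recipients   = 'dba@yourcompany.com', -- hypothetical recipient
        @subject      = 'Long running SQL Agent job detected',
        @query        = 'SELECT JobName, StartTime, CurrentRunTime FROM dbo.LongRunningJobCheck WHERE JobRunning = ''LongRunning-NOW''';
END

Schedule that in its own Agent job every few minutes and you at least get an email when something blows past its normal runtime.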
I tested this on a few servers running SQL 2005, but I am guessing it will work on SQL 2008 as well. Find and kill those long running jobs!!
Crazy error today. In Analysis Services (SSAS), some jobs got hung, a sync command got hung, and what ended up happening is that a cube on the default instance was totally unusable.
Try to delete, rename, resync, redeploy, anything, and I would get the error:
Errors in the metadata manager. The dimension with ID of..
Couldn’t do anything. The solution? I had to stop the SSAS service, go to the OLAP data directory under my SQL install and remove the folder for that cube. I then started the SSAS Service, and the SSAS Database was gone (finally!). I redeployed and reprocessed and everything is working again. Whew 🙂
Sometimes you have some pretty complex ETLs going in SSIS, and you might have multiple projects/solutions that need to call other SSIS packages or SQL Agent jobs; a pretty big production going on. You might have an ETL solution that needs to kick off other packages, and you can either import those into your solution or call them where they live on the file system/SQL Server. You might also have to call some SQL Agent jobs, and most often those are async calls (you don’t need to wait for them to come back), which works nicely; I do this all the time. The Execute SQL Server Agent Job Task in SSIS works nicely, or you can just call the SQL statement to execute a job; either way it kicks off the job, comes back successful right away, and doesn’t care whether the job actually succeeds. You might want this in some scenarios, and the built-in functionality works great.
But what if you want to call an existing SQL Agent job and actually wait for the job to finish (success or failure)? There isn’t anything built into SSIS that I could see to do this, and sp_start_job is asynchronous, so you are out of luck there. I figured I could call sp_start_job, then create a For Loop in SSIS and just check the status every X seconds/minutes, but I would have to either make that a package I could use everywhere or reproduce the same logic in multiple solutions, so I shied away from that approach.
What I decided to do was build a custom SSIS control flow task in .NET that will execute a SQL Agent job, check the status, and wait until it finishes. A disclaimer: this is going to be a lot of code 🙂 and it could be improved (but what couldn’t?) – this was a 1.5-2 hour experiment.
First, I created a VS2008 C# class library. I tried adding a UI to my task, but I couldn’t get it working so there is some code there for that but it’s commented out.
Here is what my solution looks like.
Import the correct namespaces:
using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.SqlServer.Dts.Runtime;
using System.Net;
using System.Net.NetworkInformation;
using System.Xml;
using Microsoft.SqlServer.Dts.Runtime.Design;
using System.Data.SqlClient;
Next, you need to create the actual skeleton/wrapper for your component. You can see I have two properties: job name and server name. It could be expanded to take a connection string or use an existing connection in SSIS; I wasn’t that ambitious. The “Execute” method basically just calls some helper functions and waits for the result.
namespace ExecuteSQLJobAndWaitControlTask
{
[DtsTask(
Description = "Execute SQL Job And Wait",
DisplayName = "Execute SQL Job And Wait",
TaskContact = "Steve Novoselac",
TaskType = "SSIS Helper Task",
RequiredProductLevel = DTSProductLevel.None)]
public class ExecuteSQLJobAndWaitControlTask : Task, IDTSComponentPersist
{
private string _jobName;
private string _serverName;
/// <summary>
/// The sql job name
/// </summary>
public string JobName { get { return _jobName; } set { _jobName = value; } }

/// <summary>
/// The sql server name
/// </summary>
public string ServerName { get { return _serverName; } set { _serverName = value; } }

public override DTSExecResult Execute(Connections connections, VariableDispenser variableDispenser, IDTSComponentEvents componentEvents, IDTSLogging log, object transaction)
{
    try
    {
        StartJob();
        System.Threading.Thread.Sleep(5000);

        do
        {
            System.Threading.Thread.Sleep(5000);
        } while (IsJobRunning());

        if (DidJobSucceed())
        {
            return DTSExecResult.Success;
        }
        else
        {
            return DTSExecResult.Failure;
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        return DTSExecResult.Failure;
    }
}

public override DTSExecResult Validate(Connections connections, VariableDispenser variableDispenser, IDTSComponentEvents componentEvents, IDTSLogging log)
{
    if (string.IsNullOrEmpty(_serverName) || string.IsNullOrEmpty(_jobName))
    {
        componentEvents.FireError(0, "You must specify a JobName and ServerName in the properties", "", "", 0);
        return DTSExecResult.Failure;
    }
    else
    {
        return DTSExecResult.Success;
    }
}

void IDTSComponentPersist.LoadFromXML(System.Xml.XmlElement node, IDTSInfoEvents infoEvents)
{
    if (node.Name != "ExecuteSQLJobAndWaitTask")
    {
        throw new Exception(string.Format("Unexpected task element when loading task - {0}.", "ExecuteSQLJobAndWaitTask"));
    }
    else
    {
        this._jobName = node.Attributes.GetNamedItem("JobName").Value;
        this._serverName = node.Attributes.GetNamedItem("ServerName").Value;
    }
}

void IDTSComponentPersist.SaveToXML(System.Xml.XmlDocument doc, IDTSInfoEvents infoEvents)
{
    XmlElement taskElement = doc.CreateElement(string.Empty, "ExecuteSQLJobAndWaitTask", string.Empty);

    XmlAttribute jobNameAttribute = doc.CreateAttribute(string.Empty, "JobName", string.Empty);
    jobNameAttribute.Value = this._jobName.ToString();
    taskElement.Attributes.Append(jobNameAttribute);

    XmlAttribute serverNameAttribute = doc.CreateAttribute(string.Empty, "ServerName", string.Empty);
    serverNameAttribute.Value = this._serverName.ToString();
    taskElement.Attributes.Append(serverNameAttribute);

    doc.AppendChild(taskElement);
}
And then I have some helper methods; this is where the meat and potatoes are for this task. Of course I could factor out the connection string, etc. Like I said, it was a quick thing :). The heart of it is starting the job, checking if it is still running, and then, once it finishes, whether it succeeded. Pretty simple.
private bool DidJobSucceed()
{
SqlConnection dbConn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" + ServerName);
SqlCommand dbCmd = new SqlCommand("exec msdb.dbo.sp_help_job @job_name = N'" + JobName + "' ;", dbConn);
dbConn.Open();
SqlDataReader dr = dbCmd.ExecuteReader();
dr.Read();
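// last_run_outcome = 1 means the last run succeeded (per msdb.dbo.sp_help_job)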
int status = Convert.ToInt32(dr["last_run_outcome"]);
dr.Close();
dbConn.Close();
if (status == 1)
{
return true;
}
else
{
return false;
}
}
private bool IsJobRunning()
{
SqlConnection dbConn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" + ServerName);
SqlCommand dbCmd = new SqlCommand("exec msdb.dbo.sp_help_job @job_name = N'" + JobName + "' ;", dbConn);
dbConn.Open();
SqlDataReader dr = dbCmd.ExecuteReader();
dr.Read();
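// current_execution_status = 1 means the job is still executing (per msdb.dbo.sp_help_job)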
int status = Convert.ToInt32(dr["current_execution_status"]);
dr.Close();
dbConn.Close();
if (status == 1)
{
return true;
}
else
{
return false;
}
}
private void StartJob()
{
SqlConnection dbConn = new SqlConnection("Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=msdb;Data Source=" + ServerName);
SqlCommand dbCmd = new SqlCommand("EXEC dbo.sp_start_job N'" + JobName + "' ;", dbConn);
dbConn.Open();
dbCmd.ExecuteNonQuery();
dbConn.Close();
}
}
}
Now, to install this you need to register it in the GAC (global assembly cache) and then copy it to the DTS\Tasks folder. Depending on whether you have VS2005 or VS2008 (or both), your gacutil path might be different.
cd c:\
cd C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin
gacutil /uf "ExecuteSQLJobAndWaitTask"
gacutil /if "C:\Projects\SSISCustomTasks\ExecuteSQLJobAndWait\bin\Debug\ExecuteSQLJobAndWaitTask.dll"
copy "C:\Projects\SSISCustomTasks\ExecuteSQLJobAndWait\bin\Debug\ExecuteSQLJobAndWaitTask.dll" "C:\Program Files\Microsoft SQL Server\90\DTS\Tasks"
I have found once you have done that, you need to actually restart your SSIS service to make it work, but then you can use it in new Visual Studio SSIS packages.
Once you drag it onto your package, you can set the JobName and ServerName properties (from the Properties window – remember, no GUI), and it should run.
Some notes:
If you kill the job, the SSIS task will fail (obviously). If you kill the SSIS package, the job will keep running. Maybe a future enhancement will be to capture the SSIS package fail/cancel and kill the job. Maybe 🙂
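For reference, stopping the job from T-SQL is a one-liner the task could issue if it ever catches the package being cancelled (the job name here is just a placeholder):

EXEC msdb.dbo.sp_stop_job @job_name = N'YourJobName' ;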
Attached is the source code for the task (VS2008 C#): https://onedrive.live.com/redir?resid=ac05d3c752d3b50a!187358&authkey=!APJ06uEMqWiLlIM&ithint=file%2crar
This has been tested with BIDS VS2005. I take no responsibility if this blows up your system, computer, server, the world, etc.
Happy ETL’ing!
Sometimes I just wonder why things that are useful in previous versions get removed. In SQL 2000, in DTS, there was an ODBC destination: you could write (insert, update, etc.) to a foreign (non-Microsoft SQL) system easily. DB2 – no prob. MySQL – heck yea. For whatever reason, in SQL 2005 Integration Services (SSIS) that ability was removed (in SQL 2008 SSIS there is an ADO.NET Destination that can write to ODBC destinations, so they brought some of that functionality back).
I need to write to a MySQL database, pulling data from a SQL Server database, using SSIS 2005. What are the options? Well, the best I could come up with was a Script Component destination in my Data Flow. This is how I did it:
1) Create a new SSIS package, throw a Data Flow on the Control Flow, and add your connections (let’s say SourceDB_SQL for your source data and DestDB_MySQL for your MySQL destination; it needs to be an ADO.NET connection, and you need to install the MySQL connector – I installed this one: http://dev.mysql.com/downloads/connector/odbc/5.1.html)
2) In your Data Flow, create your OLE DB Source and get your query returning data, then throw a Script Component on the Data Flow and make it a Destination.
3) Open the Script Component, set your input columns up, and then your Connection Manager
4) Open the actual script, and you just have to add a few lines of code:
' Microsoft SQL Server Integration Services user script component
' This is your new script component in Microsoft Visual Basic .NET
' ScriptMain is the entrypoint class for script components
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports System.Data.Odbc
Public Class ScriptMain
Inherits UserComponent
Dim mySQLConn As OdbcConnection
Dim sqlCmd As OdbcCommand
Public Overrides Sub AcquireConnections(ByVal Transaction As Object)
mySQLConn = CType(Me.Connections.MySQLDatabase.AcquireConnection(Nothing), OdbcConnection)
End Sub
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
sqlCmd = New OdbcCommand("INSERT INTO steve_test(ShipTo, YearValue) VALUES(" & Row.ShipTo & ", '" & Row.YearValue & "')", mySQLConn)
sqlCmd.ExecuteNonQuery()
End Sub
Public Overrides Sub ReleaseConnections()
Me.Connections.MySQLDatabase.ReleaseConnection(mySQLConn)
End Sub
End Class
Run it and you are done! Easy. Now you can write data into MySQL using SQL 2005 SSIS (or any ODBC destination if you can get it to work :))
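If you want to recreate the trivial example, the MySQL destination table could look something like this (the types are a guess, based only on how the INSERT in the script quotes the values):

CREATE TABLE steve_test (
  ShipTo INT,            -- inserted unquoted in the script, so assumed numeric
  YearValue VARCHAR(10)  -- inserted quoted, so assumed character
);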
Digging into the blog post from earlier this summer, I wanted to see what was new and exciting in Excel 2010.
Recently I have been working on a cube, and we want to be able to budget right from the cube. There are also many other cubes/scenarios where the ability to write back to the cube would be awesome. Some BI tools have had this for many years! Microsoft had something similar with an Excel 2002/2003 add-in, but it has been removed. There are also many 3rd-party tools that allow this, and you could write your own macros or VBA/.NET code to do it as well, but what was always missing was the ability to do writeback directly from an Excel (OLAP) PivotTable. With Excel 2010, this functionality finally shows up.
First, you need a cube, and you need to enable writeback on a partition. This will automatically create a table in the database where you have your data warehouse. Let’s say you have a table FactBudgets; when you enable writeback, SSAS will create a table WriteTable_Budgets. This is a trivial example I went through to test this functionality; I just wanted to exhibit the writeback feature.
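Once values get published, the writeback table just accumulates delta rows. A quick (hypothetical) way to peek at what Excel wrote, using the table name from the example above; the actual columns depend on your measure group and dimension keys, so SELECT * is the safe way to look:

SELECT TOP 10 * FROM dbo.WriteTable_Budgets;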
Turn on writeback, deploy and process your SSAS cube, and then open Excel 2010. Connect to your cube, and then in the PivotTable ribbon menu, on options, there is a button to turn on “What-If Analysis”. Turn it on. 🙂
Once you turn on the setting, you can begin writing back values to your cube, right from Excel. If you click on a cell in your writeback value, you can just change it. In the screenshot below, the value I changed has a little purple triangle, which tells me it has a changed value from what is in the data source.
You can see it’s telling me the value changed, and I can discard the change, among other options. After you have your values set, you want to publish them, which you do back on the PivotTable ribbon menu.
As you can see, Microsoft has finally created a viable solution for writing back values into your OLAP cubes without the need for 3rd party software or coding. Finally!
One thing to note, if you try to write back to a value that is in a partition that doesn’t have writeback enabled, you will get an error.
Now, think of the possibilities with SSAS OLAP writeback and Excel, now that we can actually use it out of the box!