
Reporting Services: Can't Uninstall – The setup failed to read IIsMimeMap table. The error code is -2147024893

Ran into this error tonight trying to uninstall SQL Server Reporting Services. I'm not sure if it is specific to Vista or happens on XP and other OSes as well, but the fix is to stop IIS and then re-run the uninstall.
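If you would rather do it from the command line, that is all the fix amounts to (iisreset is the standard IIS control utility; run it from an administrator prompt):

iisreset /stop
REM re-run the Reporting Services uninstall here
iisreset /start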


SSAS: Changing Object IDs Breaks Report Builder Reports

Ugh. Found this one out the hard way. Usually, changing underlying object ids in SQL Server Analysis Services shouldn't cause any harm. You might have some XMLA that processes dims and measure groups, and if so you would have to update those, but all Reporting Services reports, Excel 2007 pivot tables, and MDX should keep working. What breaks? Report Builder reports.

You can build reports with Report Builder (the link to open Report Builder is in Reporting Services) off a cube model. They are pared-down reports; you can't do as much as you can with SSRS, but for advanced end users, they do the trick.

Thing is, they use SemanticQuery XML behind the scenes for the query, the data source points to the model, and the XML is built off the object ids of the cube. Ugh again. Even worse, all parameters that were set as drop-down list ("in this list") type are converted to IN formulas, and all the other params are converted to formulas. Graphs break. Matrix reports break. Tabular reports break. It just sucks. They shouldn't build the query off the underlying ids of objects; they should build it off the displayed names, like everything else does. Whew 🙂


SQL Server 2005 Performance Dashboard Reports

Check these out. http://www.microsoft.com/downloads/details.aspx?familyid=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en

If you are a DBA, these are pretty much money. I have seen so many custom implementations and DBAs trying to build reports like this, and if you have SQL Server 2005 SP2, you can download that, run a SQL script, and then you have these reports. They give you insight into pretty much everything going on with your server. Why pay thousands for a product that will give you the same info when it is built right into SQL Server?!
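For what it's worth, the install boils down to running the setup.sql that comes with the download and then pointing SSMS at the main report. A quick sketch, assuming the default install path from the download (adjust the server name and path for your environment):

sqlcmd -S MyServer -E -i "C:\Program Files\Microsoft SQL Server\90\Tools\PerformanceDashboard\setup.sql"

Then in SSMS, right-click the server, choose Reports, Custom Reports, and browse to performance_dashboard_main.rdl in that same folder.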


SQL 2005: Using Dynamic Management Views (DMVs) to View Status of Database and Log Restores

SQL 2005 introduced some cool "dynamic management views" (DMVs) that let a DBA see what is happening on their server. Previously you had to use undocumented system tables and information schemas to get the info, and a lot of the info wasn't even available. Recently I was doing some backup/restore work on a database and wanted to know when it would finish. The first question I ask is: why isn't this built into the SSMS view of a database? When it says "DatabaseName (Restoring)", why couldn't they just add the % done at the end or something? Anyway, if you want to see what is restoring on your server and where it is at, go ahead and use this query:

use master
go

SELECT
    a.percent_complete AS 'PctComplete',
    a.start_time AS 'StartTime',
    a.command AS 'Command',
    b.name AS 'DatabaseName',
    DATEADD(ms, a.estimated_completion_time, GETDATE()) AS 'EstimatedEndTime',
    (a.estimated_completion_time / 1000 / 60) AS 'EstimatedMinutesToEnd'
FROM sys.dm_exec_requests a
INNER JOIN sys.databases b ON a.database_id = b.database_id
WHERE a.command LIKE '%restore%'
    AND a.estimated_completion_time > 0

I did notice that it says 100% complete but the restore still takes about 20-30 seconds to finish after that, so be warned. Also, if you are restoring a full database, the DatabaseName column will show master (from what I have seen, on a fresh restore). If you are just restoring logs, it will show the name of the database where the logs are being restored.

Take this and create a quick SSRS (Reporting Services) report. Or, even better, schedule it to run every 2 minutes or so and kick off a report once it sees some records, or make a cool little desktop app with balloon popups and statuses, so as a DBA you can see what is going on with your restores. Maybe I will create that app one Saturday if I am bored or something 🙂
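As a rough sketch of that scheduled-check idea (the PRINT is just a placeholder for whatever report kickoff or notification you would wire up):

IF EXISTS (SELECT 1
           FROM sys.dm_exec_requests
           WHERE command LIKE '%restore%'
             AND estimated_completion_time > 0)
BEGIN
    -- a restore is in flight; replace this with your alert or report logic
    PRINT 'Restore in progress'
END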


SQL 2005 SSAS Deployment Wizard – Where Did My Roles Go?

OK, say you are using SSAS 2005 and the Deployment Wizard (not BIDS), probably because you want to retain your partitions or something. Anyway, if you use it and choose the option to retain your roles, you will deploy, and then no one will be able to get to your SSAS DB or cubes. What gives?

See this link on Microsoft. It doesn't make sense to me, but I guess in Microsoft speak, "Retain roles and members" really equals "screw up roles and members and leave you scratching your head."


SQL Server: sp_add_job – parameter passing

I am sure other SQL procedures take parameters the same way, but recently I have been dealing with sp_add_job. You can call the procedure and pass in parameters:

EXEC msdb.dbo.sp_add_job @job_name = 'My Job Name'

for example. But you cannot do this:

DECLARE @test AS VARCHAR(50)

SET @test = ' Test '

EXEC msdb.dbo.sp_add_job @job_name = 'My Job Name' + @test

You will get an error, because you cannot concatenate a string (or use any expression) while passing it in as a parameter. What you need to do is this:

DECLARE @test AS VARCHAR(50)

SET @test = 'My Job Name Test'

EXEC msdb.dbo.sp_add_job @job_name = @test

* Note: there are other params you would normally pass to the procedure; I just passed the one for example's sake.
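The general pattern, sketched below: do any concatenation into a variable first, then pass the finished variable as the parameter (the job name and suffix here are just illustrative):

DECLARE @suffix AS VARCHAR(50)
DECLARE @jobName AS SYSNAME

SET @suffix = ' Test'

-- the concatenation happens in a plain assignment...
SET @jobName = 'My Job Name' + @suffix

-- ...and the already-built value is what gets passed
EXEC msdb.dbo.sp_add_job @job_name = @jobName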


.NET – Fastest Way to Load Text File To SQL – SqlBulkCopy

An exercise in coding: loading a text file to SQL. There are a billion different ways to do it, and depending on your source data format you can parse the file a million ways too. But how fast can you get data from disk to SQL using .NET code (VB.NET or C#)? This post is going to show the differences in some of the ways you can load data from disk to SQL.

I am sure I could do more, but this is a good sampling. Let's assume a 1,000,000 row file, comma separated, with 3 columns: string, int, string of variable length. Let's assume our destination is SQL Server 2005, table already created, no keys or anything on the table.

We will call our table TestData. Our test app will be a VB.NET console application, running on the same box as SQL 2005. Now, there are many ways to read files; a few are reading them line by line, ReadToEnd(), and using the JET engine to read in a CSV. From the testing I have been doing, all of these seem to work fairly fast (maybe a comparison of these is for another blog post), so for brevity's sake let's just say they are comparable. I chose 3 methods of inserting the data.

1) StreamReader.ReadLine, Insert Line By Line

Sub Method1()
    Dim i As Long = 0

    Dim sr As StreamReader = New StreamReader(filename)
    Dim line As String = sr.ReadLine()
    Dim dbConn As SqlConnection = New SqlConnection(ConfigurationManager.ConnectionStrings("MyDB").ToString())
    Dim dbCmd As SqlCommand = New SqlCommand()
    dbCmd.Connection = dbConn

    Do
        ' the naive approach: build the INSERT by hand and run it row by row
        Dim fields() As String = line.Split(","c)

        dbCmd.CommandText = "INSERT INTO dbo.TestData (Column1,Column2,Column3) " & _
        " VALUES ('" & fields(0) & "'," & fields(1) & ",'" & fields(2) & "')"

        dbConn.Open()
        dbCmd.ExecuteNonQuery()
        dbConn.Close()
        i = i + 1
        line = sr.ReadLine()

    Loop While Not String.IsNullOrEmpty(line)

End Sub

2) StreamReader.ReadLine, Batch Insert With DataAdapter

Sub Method2()
    Dim i As Long = 0
    Dim dbConn As SqlConnection = New SqlConnection(ConfigurationManager.ConnectionStrings("MyDB").ToString())
    Dim sr As StreamReader = New StreamReader(filename)
    Dim line As String = sr.ReadLine()

    Dim strArray As String() = line.Split(","c)
    Dim dt As DataTable = New DataTable()
    Dim row As DataRow

    ' unnamed columns get the default names Column1..ColumnN,
    ' which happen to match the destination table
    For Each s As String In strArray
        dt.Columns.Add(New DataColumn())
    Next

    Do

        row = dt.NewRow()
        row.ItemArray = line.Split(","c)

        dt.Rows.Add(row)

        i = i + 1
        line = sr.ReadLine()

    Loop While Not String.IsNullOrEmpty(line)

    Dim dataAdapter As New SqlDataAdapter()

    dataAdapter.SelectCommand = New SqlCommand("SELECT TOP 1 Column1,Column2,Column3 from dbo.TestData", dbConn)

    ' the command builder generates the InsertCommand from the SELECT
    Dim cmdBuilder As SqlCommandBuilder = New SqlCommandBuilder(dataAdapter)

    dbConn.Open()

    dataAdapter.Fill(dt)

    ' batch the inserts, 1000 rows per round trip
    dataAdapter.UpdateBatchSize = 1000
    dataAdapter.Update(dt)

    dbConn.Close()
End Sub

3) StreamReader.ReadLine, SqlBulkCopy

Sub Method3()
    Dim i As Long = 0
    Dim dbConn As SqlConnection = New SqlConnection(ConfigurationManager.ConnectionStrings("MyDB").ToString())
    Dim sr As StreamReader = New StreamReader(filename)
    Dim line As String = sr.ReadLine()

    Dim strArray As String() = line.Split(","c)
    Dim dt As DataTable = New DataTable()
    Dim row As DataRow

    ' with no explicit column mappings, SqlBulkCopy maps columns by ordinal
    For Each s As String In strArray
        dt.Columns.Add(New DataColumn())
    Next

    Do

        row = dt.NewRow()
        row.ItemArray = line.Split(","c)

        dt.Rows.Add(row)

        i = i + 1
        line = sr.ReadLine()

    Loop While Not String.IsNullOrEmpty(line)

    ' TableLock takes a bulk update lock on the table, which speeds up the load
    Dim bc As SqlBulkCopy = New SqlBulkCopy(dbConn, SqlBulkCopyOptions.TableLock, Nothing)
    bc.DestinationTableName = "TestData"

    bc.BatchSize = dt.Rows.Count

    dbConn.Open()
    bc.WriteToServer(dt)
    dbConn.Close()
    bc.Close()
End Sub

The results of the 3 methods are surprising. The thing is, most people are going to use Method1 because it is the first thing you think of doing, and maybe the easiest to code (everyone learns loops in school, etc). Nitpickers will say "use a stored proc", and in best practice yes, but that saves minimal time, so for the sake of the example bear with it.

Method2 is less intuitive and really tricky to get working (at least I had some issues with it), but once it works, it makes a little more sense than Method1.

Method3 is something hardly anyone ever hears about or uses, but once they do, they never go back.

Side note: about 5 years ago I worked on a program that inserted huge files, and they were taking 10-20 minutes apiece. I was using VB6, and converted the line-by-line insert to use BCP from code and got it down to 2-3 minutes, which was good. So I knew about BCP and BULK INSERT; I just didn't know the equivalent was built into .NET. Now I do. Anyway, on to the results.

Method 1 – 14.69 minutes to insert 1 million records

Method 2 – 7.88 minutes to insert 1 million records

Method 3 – 0.28 minutes to insert 1 million records

So yeah. Wow. That is not a typo for Method 3: roughly 17 seconds. Now, give Method2 some credit, it cut Method1's time roughly in half, but Method3 just KILLS them both. Man, just awesome. When you run something like that and see that kind of performance, you can't help but smile.

A few caveats and gotchas:

Method2 – the UpdateBatchSize property I have set to 1000. If you set it to 0, there is no limit on the batch size. I tried that and it locked my machine up. Yikes.

Method3 – the SqlBulkCopyOptions make a difference: TableLock speeds up the operation. The BatchSize here I have set to the number of rows. It might run differently with different batch sizes (I really didn't experiment with that), but adding the copy options and BatchSize sped up the operation.

So, the fastest way I have found in .NET to load data from files to SQL, hands down: SqlBulkCopy. Since the load now only takes 17 seconds, the next step is to try the different file reading methods and see what the times are there; like I said, maybe that will be my next post. 🙂

p.s. yes I am up at 2:00 AM posting this, insomnia anyone? In any event I got to watch the repeat of the Nevada Democratic Debate 🙂


SQL Server 2005 Books Online Scoped Search

Stumbled upon this today

http://search.live.com/macros/sql_server_user_education/booksonline/

Pretty sweet: using search macros, you can search Books Online… well, ONLINE 🙂

Here is a search for DATEDIFF

http://search.live.com/results.aspx?q=DATEDIFF&go=Search&form=QBJK&q1=macro%3Asql_server_user_education.booksonline

Handy little macro if you don't have SQL BOL installed. You can always search Google or MSDN to find the info, but this seems "scoped" to just what you need from Books Online. Pretty cool if you ask me.


SQL 2005: OpenRowset, Dynamic MDX and Variable Scope

So, the other day I had to create something in T-SQL that called an MDX query using OpenRowset. This is pretty easy to do; you can wrap a T-SQL query around the OpenRowset and get back the values you need. This was fine while the MDX query was a static string.

The format of the query would be like this:

SELECT * FROM OpenRowset('MSOLAP', 'DATA SOURCE=MySSASServer; Initial Catalog=MySSASDB;', 'MyMDXQuery HERE') as a

Now, the * will give you the columns from your MDX query; in the example above, MyMDXQuery HERE would be replaced with your actual MDX query.

The problem comes in if you want your MDX query (which is a string) to contain some variable, so that you can pass something into the OpenRowset (say a date, or some other value)…

The problem is, you need to execute the whole query (not just the OpenRowset) as a string, and the scope of variables is lost. You cannot declare a variable outside the T-SQL string you EXEC, set it inside the statement, and then use it after. This makes it tough to get data out of the OpenRowset execution. Now, if you are just executing the T-SQL and getting a result set back for a report or something, it will work without doing what I am doing here; but if you need a scalar value back, or something to use in a query later in your proc, then you need to do this. I tried different solutions and this was the only one I could get to work. Like I said, declaring a var beforehand and trying to use it in the EXEC won't work. Also, RETURN won't work; it gives you an error saying it isn't allowed in that scope, or something similar. Here is an example of what does work: using a temp table.

 

DECLARE @TSQL varchar(max)

-- @SomeDynamicString is whatever dynamic MDX fragment you are passing in;
-- the value here is just a placeholder for the example
DECLARE @SomeDynamicString varchar(max)
SET @SomeDynamicString = '[Measures].[Some Existing Measure]'

CREATE TABLE #results
(
  mytempresult DECIMAL(10, 3)
)

-- note: the MSOLAP provider returns the MDX member name as the column name,
-- so it is referenced as a quoted identifier below
SET @TSQL = '
DECLARE @myVar AS DECIMAL(10,3)

SELECT @myVar = CONVERT(DECIMAL(10, 3), a."[Measures].[MyMeasure]")
FROM OpenRowset(
''MSOLAP'',
''DATA SOURCE=MySSASServer; Initial Catalog=MySSASDB;'',
''WITH
   MEMBER Measures.[MyMeasure]
   AS (' + @SomeDynamicString + ')
SELECT
{[Measures].[MyMeasure]} ON COLUMNS
FROM [MyCube]
'') as a

INSERT INTO #results VALUES (@myVar)
'

EXEC ( @TSQL )

DECLARE @myVarForReal AS DECIMAL(10, 3)
SELECT  @myVarForReal = mytempresult
FROM    #results

DROP TABLE #results

As you can see, I CREATE the temp table outside the T-SQL variable; then I declare a var inside the T-SQL statement, set it from my OpenRowset call (which has some other var, @SomeDynamicString, concatenated into the MDX), and insert that value into my temp table.

I then EXEC that T-SQL statement, grab my variable for real from the temp table, and drop the temp table. You would think you could just reference @myVar after the EXEC, but it doesn't exist there; and if you declare it outside the T-SQL string, it will still be empty afterwards, because it doesn't get set when you EXEC the T-SQL.

Just a "gotcha" if you ever run into executing dynamic MDX from T-SQL and need to get a scalar value back from the MDX… whew 🙂


SQL: Moving log files for an existing database to another drive

So today Chris asked me how to move log files or data files for a SQL database to another drive. My first response was to use the GUI: just detach, move the files, and reattach. But he wanted to move just the log files, and I don't think that is very intuitive with the GUI, if it is even possible.

The KB article from Microsoft shows you how to do it with SQL commands (sp_attach_db and sp_detach_db), which is really the correct way, although it doesn't say how to do it if you have multiple log files or data files.

Books online shows us:

sp_attach_db [ @dbname= ] 'dbname', [ @filename1= ] 'filename_n' [ ,...16 ]

[ @filename1= ] 'filename_n' is the physical name, including path, of a database file. filename_n is nvarchar(260), with a default of NULL. Up to 16 file names can be specified. The parameter names start at @filename1 and increment to @filename16. The file name list must include at least the primary file. The primary file contains the system tables that point to other files in the database. The list must also include any files that were moved after the database was detached.
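Putting that together, here is a minimal sketch for moving just the log file (the database name and paths are made up for the example; move the physical .ldf between the detach and the attach):

USE master
GO

-- detach the database first
EXEC sp_detach_db @dbname = 'MyDB'

-- move MyDB_log.ldf to the new drive, then reattach, listing the primary
-- data file plus the log file at its new location
EXEC sp_attach_db @dbname = 'MyDB',
    @filename1 = 'D:\SQLData\MyDB.mdf',
    @filename2 = 'E:\SQLLogs\MyDB_log.ldf'
GO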

After all that, you should be good to go, moving your data and log files around all you want! And as a side note, if you have SQL 2000, use Query Analyzer to connect and run the commands; if you have SQL 2005, use SQL Server Management Studio (SSMS) or SQL Server Management Studio Express (SSMSE).
