Thursday, 28 July 2011
In writing a quick and simple app the other day, I had a requirement to link directly to a report hosted on a Reporting Services instance. The tricky part was working with directories and parameters for the report.
It was for a report called "Product Detail" in a directory called "Product Reports", taking a single parameter called ProductID. The following seemed to work:
http://myServer/ReportServer_INSTANCE/Pages/ReportViewer.aspx?%2fProduct+Reports%2fProduct+Detail&rs:Command=Render&ProductID=24
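If you need to hand these links out in bulk, the same pattern can be built up in T-SQL. A minimal sketch, assuming a hypothetical dbo.Products table and the server and report names above; note that "/" in the report path is encoded as %2f and spaces as +:
DECLARE @BaseUrl NVARCHAR(200)
DECLARE @ReportPath NVARCHAR(200)
SET @BaseUrl = N'http://myServer/ReportServer_INSTANCE/Pages/ReportViewer.aspx'
SET @ReportPath = N'%2fProduct+Reports%2fProduct+Detail' -- "/" encoded as %2f, spaces as +
SELECT p.ProductID,
@BaseUrl + N'?' + @ReportPath + N'&rs:Command=Render&ProductID=' + CAST(p.ProductID AS NVARCHAR(10)) AS ReportLink
FROM dbo.Products AS p -- dbo.Products is a placeholder table for this example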
Thursday, 21 July 2011
T-SQL: Service Broker to alert errors
I read this interesting blog post by Jamie Thomson a few weeks back and it inspired me to write up this article.
I've been "kicking the tyres" of Service Broker myself on and off over the past few months, as it's an area of SQL Server I don't often see used in anger. I've put together a sample here which demonstrates how you might use Service Broker to alert your DBAs to any application errors in the database using Database Mail (specifically dbo.sp_send_dbmail).
1) Error gets logged to a central application error table
2) The error is passed to Service Broker for delivery
3) Service Broker delivers error via email in a nice format.
/*
#################################################################
-- configure Service Broker Objects
#################################################################
*/
ALTER DATABASE GlobalDB
SET ENABLE_BROKER
WITH ROLLBACK IMMEDIATE
GO
USE GlobalDB
GO
-- initially a messagetype
CREATE MESSAGE TYPE [ApplicationError]
VALIDATION = NONE;
GO
-- and a contract to send messages of that type
CREATE CONTRACT [ApplicationErrorContract]
(
[ApplicationError] SENT BY INITIATOR
);
GO
-- a queue to send the messages to
CREATE QUEUE [ApplicationError_Snd_Q];
GO
-- and a service that manages that queue
CREATE SERVICE [ApplicationError_Snd_Svc]
ON QUEUE [ApplicationError_Snd_Q];
GO
-- we also need a queue to receive messages into
CREATE QUEUE [ApplicationError_Rcv_Q];
GO
-- and a service that manages that queue
CREATE SERVICE [ApplicationError_Rcv_Svc]
ON QUEUE [ApplicationError_Rcv_Q]([ApplicationErrorContract]);
GO
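-- optionally, sanity check that the broker objects above exist and the database
-- is broker enabled before moving on
SELECT name, is_broker_enabled FROM sys.databases WHERE name = N'GlobalDB';
SELECT name FROM sys.service_message_types WHERE name = N'ApplicationError';
SELECT name FROM sys.service_contracts WHERE name = N'ApplicationErrorContract';
SELECT name FROM sys.services WHERE name LIKE N'ApplicationError%Svc';
SELECT name, is_receive_enabled, activation_procedure FROM sys.service_queues WHERE name LIKE N'ApplicationError%Q';
GO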
/*
#################################################################
-- we'll need to configure database mail here
-- to use this as-is you'll need a default profile configured
#################################################################
*/
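-- for reference, a minimal default profile setup might look something like this;
-- the account, profile and SMTP details below are placeholders - adjust for your environment
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Database Mail XPs', 1;
RECONFIGURE;
GO
EXEC msdb.dbo.sysmail_add_account_sp
@account_name = 'DBA Alerts',
@email_address = 'dba@example.com',
@mailserver_name = 'smtp.example.com';
EXEC msdb.dbo.sysmail_add_profile_sp
@profile_name = 'DBA Alerts Profile',
@description = 'profile used by the error alerting sample';
EXEC msdb.dbo.sysmail_add_profileaccount_sp
@profile_name = 'DBA Alerts Profile',
@account_name = 'DBA Alerts',
@sequence_number = 1;
-- make it the public default so sp_send_dbmail can be called without @profile_name
EXEC msdb.dbo.sysmail_add_principalprofile_sp
@profile_name = 'DBA Alerts Profile',
@principal_name = 'public',
@is_default = 1;
GO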
/*
#################################################################
-- configure database objects
#################################################################
*/
USE GlobalDB
GO
-- a generic table for holding error logs
CREATE TABLE [dbo].[tblErrorLog]
(
[LogID] [int] IDENTITY(1,1) NOT NULL,
[ErrorMessage] [nvarchar](2000) NULL,
[ErrorNumber] [int] NULL,
[ErrorSeverity] [int] NULL,
[ErrorState] [int] NULL,
[ErrorLine] [int] NULL,
[ErrorProcedure] [nvarchar](555) NULL,
[DatabaseName] [nvarchar](555) NULL,
[Date_Created] [datetime] NULL,
[Created_By] [nvarchar](555) NULL
)
GO
-- this function will take the error message from SB, parse it and return it in
-- HTML format for the send mail task
CREATE FUNCTION dbo.GetErrorEmailBody(@Msg XML)
RETURNS NVARCHAR(MAX)
AS
BEGIN
DECLARE @Errors TABLE (
LogID INT,
ErrorMessage NVARCHAR(1000),
ErrorProcedure NVARCHAR(1000),
Created_By NVARCHAR(1000),
Date_Created DATETIME
)
INSERT INTO @Errors
SELECT tbl.col.value('LogID[1]','int'),
tbl.col.value('ErrorMessage[1]','nvarchar(1000)'),
tbl.col.value('ErrorProcedure[1]','nvarchar(1000)'),
tbl.col.value('Created_By[1]','nvarchar(1000)'),
tbl.col.value('Date_Created[1]','datetime')
FROM @Msg.nodes('//Errors') AS tbl(col)
DECLARE @tableHTML NVARCHAR(MAX) ;
SET @tableHTML =
N'<H1>Application Errors</H1>' +
N'<table border="1">' +
N'<tr><th>Log ID</th><th>Error Message</th>' +
N'<th>Procedure</th><th>Date Created</th>' +
N'<th>Created By</th></tr>' +
CAST ( ( SELECT TOP 50 td = LogID, '',
td = ISNULL(ErrorMessage,''), '',
td = ISNULL(ErrorProcedure,''), '',
td = Date_Created, '',
td = Created_By, ''
FROM @Errors
ORDER BY Date_Created DESC
FOR XML PATH('tr'), TYPE
) AS NVARCHAR(MAX) ) +
N'</table>' ;
RETURN @tableHTML;
END
GO
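-- a quick standalone test of the formatter with a hand-built sample message
-- (the values below are made-up sample data)
DECLARE @Sample XML
SET @Sample = N'<Errors>
<LogID>1</LogID>
<ErrorMessage>Divide by zero error encountered.</ErrorMessage>
<ErrorProcedure>dbo.TestSB</ErrorProcedure>
<Created_By>MyDomain\MyUser</Created_By>
<Date_Created>2011-07-21T10:00:00</Date_Created>
</Errors>'
SELECT dbo.GetErrorEmailBody(@Sample) AS EmailBody
GO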
-- a stored procedure to send data to the send queue
CREATE PROCEDURE dbo.usp_SB_Snd_ApplicationError
(
@LogID INT
)
AS
BEGIN
DECLARE @MessageBody XML
SELECT @MessageBody = (
SELECT DISTINCT LogID, ErrorProcedure, ErrorNumber, ErrorMessage, Date_Created, Created_By
FROM dbo.tblErrorLog
WHERE LogID = @LogID
FOR XML PATH ('Errors'), TYPE
)
DECLARE @Handle UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @Handle
FROM SERVICE [ApplicationError_Snd_Svc] TO SERVICE 'ApplicationError_Rcv_Svc'
ON CONTRACT [ApplicationErrorContract]
WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @Handle
MESSAGE TYPE [ApplicationError](@MessageBody);
END
GO
-- this procedure will receive messages and process them
CREATE PROCEDURE dbo.usp_SB_Rcv_ApplicationError
AS
BEGIN
SET NOCOUNT ON;
DECLARE @Handle UNIQUEIDENTIFIER;
DECLARE @MessageType SYSNAME;
DECLARE @Message XML
DECLARE @BodyTable NVARCHAR(MAX);
RECEIVE TOP (1) @Handle = conversation_handle,
@MessageType = message_type_name,
@Message = message_body
FROM [ApplicationError_Rcv_Q];
IF(@Handle IS NOT NULL AND @Message IS NOT NULL)
BEGIN
-- parse the message xml to get the error details
-- build the HTML table for send_dbmail
SET @BodyTable = (SELECT dbo.GetErrorEmailBody(@Message));
-- send the mail on the default profile
EXEC msdb.dbo.sp_send_dbmail
@recipients='richard.brown@barrhibb.com',
@subject = 'Errors',
@body = @BodyTable,
@body_format = 'HTML' ;
END CONVERSATION @Handle;
END
END
GO
-- this procedure handles system messages (end dialog and error notifications) returned to the send queue
CREATE PROCEDURE dbo.[usp_SB_Snd_ProcessResponses]
AS
DECLARE @conversation_handle UNIQUEIDENTIFIER
DECLARE @message_body XML
DECLARE @message_type_name SYSNAME
DECLARE @error_message NVARCHAR(3000)
WHILE 1 = 1
BEGIN
SET @message_type_name = NULL;
WAITFOR (
RECEIVE TOP (1)
@conversation_handle = conversation_handle,
@message_body = CAST(message_body AS XML),
@message_type_name = message_type_name
FROM dbo.[ApplicationError_Snd_Q]), TIMEOUT 1000;
IF @message_type_name IS NULL BREAK; --no more messages
IF @message_type_name = 'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
BEGIN
END CONVERSATION @conversation_handle;
END
ELSE
BEGIN
IF @message_type_name = 'http://schemas.microsoft.com/SQL/ServiceBroker/Error'
BEGIN
-- should log expected errors here
END CONVERSATION @conversation_handle;
END
ELSE
BEGIN
-- All other message types are unexpected here - log error
SET @error_message = N'<Error>Unexpected message type</Error>';
END CONVERSATION @conversation_handle
WITH ERROR = 1
DESCRIPTION = @error_message;
END
END
END
GO
-- a generic logging procedure
CREATE PROCEDURE [dbo].[usp_LogError]
AS
DECLARE @LogID INT
INSERT INTO dbo.tblErrorLog
(
ErrorMessage, ErrorNumber, ErrorSeverity, ErrorState,
ErrorLine, ErrorProcedure, DatabaseName, Date_Created, Created_By
)
SELECT
ERROR_MESSAGE() AS ErrorMessage
,ERROR_NUMBER() AS ErrorNumber
,ERROR_SEVERITY() AS ErrorSeverity
,ERROR_STATE() AS ErrorState
,ERROR_LINE () AS ErrorLine
,ERROR_PROCEDURE() AS ErrorProcedure
,DB_NAME()
,GETDATE()
,SUSER_SNAME()
SET @LogID = SCOPE_IDENTITY();
EXEC dbo.usp_SB_Snd_ApplicationError @LogID;
GO
/*
#################################################################
-- activate the queues
#################################################################
*/
-- let's activate the queues
ALTER QUEUE [ApplicationError_Rcv_Q] WITH ACTIVATION
(
STATUS = ON,
MAX_QUEUE_READERS = 1,
PROCEDURE_NAME = dbo.usp_SB_Rcv_ApplicationError,
EXECUTE AS OWNER
);
GO
ALTER QUEUE [ApplicationError_Snd_Q] WITH ACTIVATION
(
STATUS = ON,
MAX_QUEUE_READERS = 1,
PROCEDURE_NAME = dbo.[usp_SB_Snd_ProcessResponses],
EXECUTE AS OWNER
);
GO
-- TEST --
CREATE PROCEDURE dbo.TestSB
AS
-- force an error
BEGIN TRY
SELECT 1/0
END TRY
BEGIN CATCH
EXEC dbo.usp_LogError
END CATCH
GO
EXEC dbo.TestSB
GO
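If the test email doesn't arrive, a few quick checks will usually show where the message got stuck; something along these lines (assuming the objects above are in place):
-- confirm the error was logged
SELECT * FROM dbo.tblErrorLog
-- messages that could not be delivered, along with the reason
SELECT * FROM sys.transmission_queue
-- the receive queue should normally be empty once the activation procedure has run
SELECT *, CAST(message_body AS XML) AS MessageBody FROM [ApplicationError_Rcv_Q]
-- Database Mail history
SELECT * FROM msdb.dbo.sysmail_allitems ORDER BY send_request_date DESC
GO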
Labels: Service Broker, SQL, SQL 2008, SQL Server, SQL2008, T-SQL
Wednesday, 13 July 2011
Admin: Enable .Net 3.5 SP1 on Windows 2k8
The .NET Framework 3.5 SP1 is a prerequisite for installing SQL Server 2008 onwards, so I'm posting this here as a reminder to myself of how to get it onto a Windows Server 2008 box, as you need to enable it via PowerShell.
1) Run Powershell with Administrator privileges
2) Execute Import-Module ServerManager
3) Execute Add-WindowsFeature AS-NET-Framework
This is what it should look like:
Enjoy.
Labels: .Net Framework, Admin, DBA, Powershell, Setup, SQL, Windows, Windows 2008
Tuesday, 12 July 2011
T-SQL: ORDER BY on SELECT not INSERT
This post is contributing to T-SQL Tuesday #20 hosted by Amit Banerjee (blog | @banerjeeamit) on the topic of Best Practices.
I was recently called in to help one of the developers with an inconsistency they were having with an SSRS report. Essentially, the report was "randomly" producing incorrect results, yet it was impossible to faithfully recreate the issue. As is a developer's way, he was on the attack against SQL Server, suggesting it was a bug with the software. I must admit, I was a bit confused and my mind was wandering way ahead of itself, checking for Service Packs or hotfixes that might address the issue.
And then I remembered to go back to first principles and check that the basics were all covered, so I opened up the function used in the report.
CREATE FUNCTION [dbo].[tvf_GetData](@Id INT)
RETURNS
@Results TABLE (
[ID] INT NULL,
[Output_ID] INT NULL,
[Value] DECIMAL (30, 15) NULL
)
AS
BEGIN
INSERT INTO @Results
SELECT Id, OutputId, Val
FROM dbo.tbl1
WHERE Id = @Id
ORDER BY Id -- ORDERING HERE ON THE INSERT
INSERT INTO @Results
SELECT Id, OutputId, Val
FROM dbo.tbl2
WHERE Id = @Id
ORDER BY Id -- ORDERING HERE ON THE INSERT
RETURN
END
GO
The report was highly dependent on the results of the above query coming back in a particular order. As you can see from my comments in the function body, the developer had attempted to force the order of the data by inserting it into the temporary table in a specific order. The mistaken belief that the data would come out in the same order as it went in was at the crux of this issue. The query calling this function did not have an explicit ORDER BY clause and so it was left to the optimiser to make the call on how the data was returned. In this instance, the ORDER BY clauses on the insert statements are more of a hindrance than a help as they may adversely affect performance (ordering can be an expensive operation to perform) without adding any benefit.
The simple way of resolving the issue was just to add an explicit ORDER BY clause when querying the function:
SELECT *
FROM [dbo].[tvf_GetData](@Id)
ORDER BY Id ASC
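For completeness, the redundant ORDER BY clauses can come out of the function itself, leaving the ordering entirely to the caller. A tidied version might look like this (my rewrite of the function above, rather than the developer's final code):
ALTER FUNCTION [dbo].[tvf_GetData](@Id INT)
RETURNS
@Results TABLE (
[ID] INT NULL,
[Output_ID] INT NULL,
[Value] DECIMAL (30, 15) NULL
)
AS
BEGIN
INSERT INTO @Results
SELECT Id, OutputId, Val
FROM dbo.tbl1
WHERE Id = @Id -- no ORDER BY on the INSERT
INSERT INTO @Results
SELECT Id, OutputId, Val
FROM dbo.tbl2
WHERE Id = @Id -- ordering is left to the calling SELECT
RETURN
END
GO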
Ordering Best Practice
I don't care how many times I hear things along the lines of "you don't need an ORDER BY clause if you have a clustered index". I'm of the opinion that if you don't explicitly specify an ORDER BY clause then you are taking your chances if you rely on data coming out in a particular order. Sure, there may be empirical evidence to suggest that not using one doesn't make a difference in certain scenarios, but I like to err on the side of caution.
Not using one essentially puts you at the mercy of the query optimiser, and although it's a very mature and effective engine, its behaviour is out of your control. Should MS release a patch which affects how it interprets queries, then your whole system would be compromised. If you have an explicit ORDER BY clause then your chances of being affected are negligible.
Friday, 8 July 2011
SQLPeople: I'm a SQL Person!!
Thursday the 7th July 2011 was a big day for me. I officially became a SQL Person. No, it's not the first time I had a job whose core responsibilities revolved around SQL Server, nor even the first time I'd written a SQL query. No, this was the day when I had my interview published on SQLPeople.
SQLPeople is the brainchild of Andy Leonard (Blog | Twitter) and is a project for the SQL community. The SQL community is a vibrant, rewarding and educational group to be a part of, and SQLPeople adds yet another dimension. At the moment, it's principally a series of blogs about people within the SQL community and an opportunity to put some personality around those guys who you may only ever see in the context of the MSDN Forums, blogs or #SQLHelp. However, I can see this concept growing and growing, and I believe there has already been a first event in the US. Andy has done a great job.
I'm proud to be a part of SQLPeople and it's a reaffirmation that working in the SQL Server arena can be a rewarding experience.
Read my interview here.
Thursday, 7 July 2011
Powershell: List Local Users & Groups
I recently encountered a permissions issue on SQL Server, the root cause of which was a user being removed from a local Windows group on a production server. This got me thinking of yet another use for PowerShell: listing out the local users and groups on a particular server.
ListLocalGroupMembership.ps1
$computerName = Read-Host 'Enter computer name or press <Enter> for localhost'
if ($computerName -eq "") {$computerName = "$env:computername"}
$computer = [ADSI]"WinNT://$computerName,computer"
$groups = $computer.psbase.Children | Where-object { $_.psbase.schemaclassname -eq 'group' }
foreach ($group in $groups)
{
$users = $group.psbase.invoke("Members")
$group.name
write-host "----------------------------------------------------"
if ($users -ne $null)
{
foreach ($user in $users)
{
$user.GetType().InvokeMember("Name","GetProperty",$null,$user,$null)
}
}
else
{
write-host "-----EMPTY-----" -foreground "red"
}
write-host "----------------------------------------------------"
write-host ""
}
The basis of this script was taken from here.
You could easily take this a stage further and log this data to a repository, and then alert on any changes made.
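On the SQL Server side, the repository needn't be anything more than a simple table. A rough sketch (the table and column names are just illustrative, assuming the PowerShell output is loaded in once per run):
-- one row per group member captured on each collection run
CREATE TABLE dbo.tblLocalGroupMembers
(
RunDate DATETIME NOT NULL, -- one value per collection run, supplied by the loader
ComputerName NVARCHAR(255) NOT NULL,
GroupName NVARCHAR(255) NOT NULL,
MemberName NVARCHAR(255) NOT NULL
)
GO
-- members present in the previous run but missing from the latest run, i.e. removals to alert on
DECLARE @Latest DATETIME, @Previous DATETIME
SELECT @Latest = MAX(RunDate) FROM dbo.tblLocalGroupMembers
SELECT @Previous = MAX(RunDate) FROM dbo.tblLocalGroupMembers WHERE RunDate < @Latest
SELECT ComputerName, GroupName, MemberName
FROM dbo.tblLocalGroupMembers
WHERE RunDate = @Previous
EXCEPT
SELECT ComputerName, GroupName, MemberName
FROM dbo.tblLocalGroupMembers
WHERE RunDate = @Latest
GO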
Labels: Computer Management, Configuration, DR, Group Membership, Members, Powershell, SQL, Windows
Tuesday, 5 July 2011
Denali: SSMS has Improved look and feel
One of the first things I noticed with the new version of SQL Server (Denali) was the changes to Management Studio and the development experience. While not as ground breaking as the move from Query Analyzer to SSMS when SQL 2005 was released, there is definitely further movement towards a common development platform for programmers.
Here is the initial splash page you get when booting up SSMS:
The main thing I noted here was the comment in the bottom right corner which explicitly states "Powered by Visual Studio". To me, this is a clear indication of the direction MS is heading with its database toolset.
And here we have a shot of what using SSMS for development looks like:
Notice here how we have much more varied colour coding for keywords, and specifically for variables and object names. This is a vast improvement on the generic black text in previous versions. Another feature, visible at the bottom left of the shot just above the results pane, is a zoom dropdown allowing you to quickly increase the size of the text in your query window. There is also one for the results pane, although this shot doesn't show it.
Development Environment Convergence
We've already seen Visual Studio creeping into the SQL Server arena with the Business Intelligence Development Studio (BIDS) which is essentially the VS shell with SQL Server plugged in and I can see MS trying to consolidate their development environments still further. This makes a lot of sense for software devs who will often be developing database enabled applications and need to flit between their .Net code and database code. Having a more uniform and familiar environment makes the experience that much smoother and hopefully more efficient.
Labels: C#, Denali, Developer, development, Microsoft, SQL 2005, SQL 2008, SQL Denali, SQL Developer Tools, SQL Server, SSMS, Visual Studio