Posts

Showing posts from 2009

My Adventures in Codeplex

In October 2009 I created two CodePlex projects which I would like to tell you about: SQL Server Reporting Services MSBuild Tasks (ssrsmsbuildtasks). A few years ago I created some MSBuild tasks to help deploy reports for my project; details can be found here. Since then my tasks have been reused in a few other projects within my company. However, lately we have been doing a few projects which make use of Reporting Services integrated mode with SharePoint, which meant that my tasks were unusable, as they only work with a native mode report server. So I have been updating them to include support for an integrated mode report server. Before adding the integrated mode support, I also took some time to rewrite my original native mode tasks. The rewrite reflects an improved understanding of MSBuild, making more use of Item Groups and metadata to set the report server properties of the report items being deployed. I placed the tasks within code
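As a rough sketch of the Item Group approach described above, each report item can carry its deployment settings as metadata rather than as separate task parameters. The item and metadata names below are hypothetical, not the actual names used by ssrsmsbuildtasks:

```xml
<!-- Hypothetical sketch only: one item per report, with the report
     server properties attached as metadata on the item itself. -->
<ItemGroup>
  <ReportFile Include="Reports\SprintBurndown.rdl">
    <TargetFolder>/Scrum Reports</TargetFolder>
    <DataSourceName>TfsOlapReportDS</DataSourceName>
  </ReportFile>
</ItemGroup>
```

A deployment task can then batch over `@(ReportFile)` and read each item's metadata with the `%(TargetFolder)` syntax, so adding a report is a one-line change to the project file.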

SfTS V3 Beta 2: Helping to see the wood for the trees

Within any application lifecycle management methodology, reports are a useful tool to help see how well, or not, a project is progressing. Within the Scrum methodology, the reports used for tracking progress are the burndown reports. In the Scrum for Team System process templates for Team Foundation Server, these reports are available to help with managing the project. With a manually run burndown report, when an unexpected trend appears in the burndown there is an option to manually add an annotation. This helps during a retrospective when discussing why the trend oddity occurred. In previous versions of the Scrum for Team System templates, our burndown reports would only show each day's end-of-day aggregated totals. Any unexpected totals would have to be investigated to understand the underlying cause. Previously this would have involved either using Team Foundation Explorer to review the history and comments, or using Excel Pivot Services to query the

Unit Testing Report within Reporting Services: My Theory

I have been working with Microsoft SQL Server Reporting Services since its first release, within many of my projects. One of the biggest tasks that I find with writing a report is testing it and making sure the data that is displayed is correct. My method of developing and testing reports is as follows:

1. Write the queries for creating the report data set(s) outside Reporting Services, with the parameters defined as they will be configured within the report.
2. Define the parameters that the report data set would create and assign a valid value.
3. Execute the queries.
4. Inspect the results to make sure that they meet the expected results.
5. Repeat with a new valid value, or fix any issues.

After two or even three passes that have worked, I take the queries and put them into report data sets. Then I define the report layout. Then I preview the report to make sure it's still correct. Then I release to our testers for
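The first few steps above can be sketched as a plain T-SQL script run outside the report designer; the parameter, value, and table names here are invented for illustration:

```sql
-- Stand-in for the report parameter: declare it by hand, assign a
-- known-valid value, then run the dataset query and inspect the output.
DECLARE @pTeamProject NVARCHAR(100) = N'AdventureWorks';  -- assumed value

SELECT wi.Title,
       wi.RemainingHours
FROM   dbo.WorkItemSnapshot AS wi    -- assumed table for illustration
WHERE  wi.TeamProject = @pTeamProject;
```

Once the script returns the expected rows for a few parameter values, the `DECLARE` lines are dropped and the remaining query is pasted into the report data set, where the report's own parameter definitions take over.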

SfTS V3: The Beta 1 report newcomers

As mentioned in my previous blog post, there are some new reports available in the Scrum for Team System V3 Beta 1 template. I would like to give a high-level overview of the new reports, as full report guidance will be available with the template explaining in more detail how to use them. The new Diagnostic report isn't really a new report, more of a spin-off from another report. Version 2.x included a Version report that provided details about the status of the Team Foundation Server warehouse process. This covered information like:

- When the process last ran
- When the process finished loading data into the warehouse
- How often the warehouse process would run
- When the cube was last changed

Team Foundation Server 2010 provides more information about the Team Foundation Server warehouse process, including:

- Whether the warehouse is blocked from running
- Whether the OLAP processing is blocked from running
- The last full process started and ended time
- The last incremental pro

SfTS V3 Operation: Reports facelift

For the last few months I, with the help of fellow colleague Kate Begley, have been working on writing the reports for the Scrum for Team System V3 process template; Beta 1 has now been released, and details are available from http://blogs.conchango.com/sfts/archive/2009/07/28/scrum-for-team-system-v3-beta-programme.aspx. There are a number of changes, from a reporting perspective, introduced in the Team Foundation Server 2010 platform and the Scrum for Team System V3 process template, such as: warehouse schema changes; support for Reporting Services 2008 only; a sophisticated multi-team support model; and a new QA model. As a result the reports have been completely rewritten, so there is a limited number in this beta release. However, I would like to highlight some of the smaller updates that have been applied to help the readability and usability of the reports, which are as follows: a reporting site folder structure has been created to make browsing the report

Reporting Services 2008 Report automatic upgrade gotcha.

For the Scrum for Team System project I have been focusing on the reports, and I came across an issue caused by the way the Reporting Services 2008 "automatic upgrade" feature upgraded our reports. The Scrum for Team System template contains over 20 reports, all written for Reporting Services 2005. This was because Team Foundation Server 2008 only worked on SQL Server 2005 and Reporting Services 2005. Then in August 2008, Team Foundation Server 2008 Service Pack 1 was released, which added support for Team Foundation Server 2008 to run on the SQL Server 2008 platform. Reporting Services 2008 introduces a new report definition language schema. However, Reporting Services does have a backward compatibility mechanism, as described below: the report server validates the report definition file against the included reference to the RDL namespace, which specifies the version of the report definition schema that is used. The report is automatically upgraded the first
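The version check works off the default namespace declared on the report's root element. For example, a Reporting Services 2005 report definition opens like this (content elided):

```xml
<!-- The xmlns value is what the report server inspects to decide
     whether the report needs upgrading to the 2008 schema. -->
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2005/01/reportdefinition">
  <!-- report body, data sets, parameters, etc. -->
</Report>
```

A report authored against the 2008 schema carries the corresponding `2008/01/reportdefinition` namespace instead, which is why a 2005-namespaced file triggers the automatic upgrade path on first use.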

Some practical SQL Spatial tips.

I have just finished a project where I made a lot of use of SQL Spatial to do some processing and loading into the database. Here are some of the lessons I learned:

- The first method that needs to be called after instantiating a SqlGeographyBuilder object is SetSrid(), followed by BeginGeography() and BeginFigure().
- Before sending a SqlGeometry object to SQL Server, use the STIsValid() method to ensure that the geometry object is valid. I created a console application which loaded line data (using the OS coordinate system) from a flat file. Some of the line data within the file wasn't as correct as I was led to believe: the application created the SqlGeometry object in .NET, but SQL Server then raised an error when it received the object and tried to save it into the table.
- The STPointN() function is 1-based, which is stated in Books Online.
- If your query needs to select some of the derived data from the functions, you can help performance by using persisted computed columns on t
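As a sketch of that last tip, a derived value such as a line's length can be stored once in a persisted computed column instead of being recomputed by every query; the table and column names here are hypothetical:

```sql
-- Hypothetical table: persist the derived length of each line so
-- queries selecting it do not re-evaluate STLength() on every row.
ALTER TABLE dbo.RouteLine
    ADD LengthUnits AS Shape.STLength() PERSISTED;
```

This assumes the expression is deterministic (spatial methods such as STLength() are), which is a requirement for the PERSISTED option.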

Working With Reporting Services Multiple Value Parameters

A couple of colleagues of mine were working on a complicated Reporting Services project. They had one report with a parameter which allowed a user to select multiple numeric values, and the problem they were facing was how to pass the selected values into the query. They were looking for a way within SQL Server to split a string by a delimiter, as they were using a stored procedure to access the data, and multiple-value parameters, regardless of parameter type, are passed to a stored procedure as a string of comma-separated values. One thing they weren't aware of was that with a multiple-value parameter, the behaviour differs depending on how the data is being retrieved, i.e. a stored procedure versus an embedded SQL statement. If the data set uses a direct query to gather the data, then the multi-value parameter can be used in the query with an IN clause, like so: WHERE column1 IN (@pMultiValueParam). What happens is that before the query is sent to SQL Server, the report server substitutes
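The two behaviours can be illustrated side by side; the table name and the numbers table are assumed for illustration:

```sql
-- Embedded dataset query: before sending the query, the report server
-- expands the multi-value parameter into a literal list, e.g.
-- WHERE column1 IN (1, 2, 5).
SELECT *
FROM   dbo.Orders                 -- assumed table
WHERE  column1 IN (@pMultiValueParam);

-- Stored procedure: the same parameter arrives as a single string
-- ('1,2,5'), so it has to be split. One classic 2005/2008-era split
-- uses a numbers table (assumed to exist as dbo.Numbers(n)):
DECLARE @list NVARCHAR(MAX) = N'1,2,5';

SELECT SUBSTRING(@list, n, CHARINDEX(',', @list + ',', n) - n) AS value
FROM   dbo.Numbers
WHERE  n <= LEN(@list)
  AND  SUBSTRING(',' + @list, n, 1) = ',';
```

The split query returns one row per delimited value ('1', '2', '5'), which can then be joined against the target table inside the procedure.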

How to load spatial data into SQL Server 2008 from .Net

I have been working on a project which made use of the spatial data type geography within SQL Server 2008. An issue that I had was how to load the geography data from a KML file into SQL Server 2008, and currently there are no out-of-the-box tools to do this. There is a 3rd-party tool, Safe FME, which offers either its own tool or components which extend Integration Services. This was overkill for my needs, as I only had to do it once. So I wrote a console application which parses the KML file to extract the point data, converts it to a SqlGeography type, and stores it in the database. To use the SqlGeography C# type you need to add a reference to the following namespace: Microsoft.SqlServer.Types. This can be found in the following DLL: Microsoft.SqlServer.Types.dll. Then use the following code to create the C# SqlGeography type:

// use SqlGeographyBuilder to help create the SqlGeography type
SqlGeographyBuilder geographyBuilder = new SqlGeographyBuilder();
s
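The truncated snippet above can be fleshed out into a small helper along these lines. This is a sketch of the approach, not the original code: the method name and sample SRID handling are assumptions, and it requires a reference to Microsoft.SqlServer.Types.dll.

```csharp
using System.Globalization;
using Microsoft.SqlServer.Types;

static class KmlPointLoader
{
    // Build a SqlGeography point from a KML coordinate string, which
    // KML stores as "longitude,latitude[,altitude]".
    public static SqlGeography PointFromKmlCoordinates(string coordinates)
    {
        string[] parts = coordinates.Split(',');
        double lon = double.Parse(parts[0], CultureInfo.InvariantCulture);
        double lat = double.Parse(parts[1], CultureInfo.InvariantCulture);

        var builder = new SqlGeographyBuilder();
        builder.SetSrid(4326);                        // WGS84, as used by KML
        builder.BeginGeography(OpenGisGeographyType.Point);
        builder.BeginFigure(lat, lon);                // note: latitude first
        builder.EndFigure();
        builder.EndGeography();
        return builder.ConstructedGeography;
    }
}
```

To save the value from ADO.NET, pass it as a SqlParameter with SqlDbType.Udt and UdtTypeName set to "geography".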

Business Intelligence with SQL Server 2008 Geometry Data Type.

I am currently working on a project which is automating a business intelligence process based upon images and image recognition. The process is as follows:

1. Take a photograph.
2. Put the photograph through the image recognition software and record the recognition data.
3. Process the recognition data into meaningful business metrics.
4. Produce a report on the metrics.

The part of the process, from above, that I am helping with is processing the recognition data (the hit data) into meaningful business data. This process simply takes the hit data (which is a point), then creates some square blocks, which I use to group the hit data and perform some metric operations. The metrics currently use data about the area and the density of hits within the area. To help process this data I decided to make use of the SQL Server 2008 spatial type: geometry. Firstly, I would like to point out that I didn't have to use the geometry type. As I am currently dealing with square areas, I can group
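The square-block grouping can be sketched directly in T-SQL by snapping each hit point to a grid cell; the table, column, and block size below are assumed for illustration:

```sql
-- Bucket each hit point into a square block and report the hit count
-- per block. STX/STY are the point's coordinates on a geometry value.
DECLARE @blockSize FLOAT = 100.0;   -- assumed block edge length

SELECT FLOOR(Hit.STX / @blockSize) AS BlockX,
       FLOOR(Hit.STY / @blockSize) AS BlockY,
       COUNT(*)                    AS HitCount
FROM   dbo.RecognitionHit           -- assumed table; Hit is a geometry point
GROUP  BY FLOOR(Hit.STX / @blockSize),
          FLOOR(Hit.STY / @blockSize);
```

Dividing HitCount by the block area then gives the density metric mentioned above, one row per occupied square.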