I have been working with Microsoft SQL Server Reporting Services since its first release, across many of my projects. One of the biggest tasks I find with writing a report is testing it and making sure the data that is displayed is correct. My method of developing and testing reports is as follows:
- Write the queries that will create the report data set(s) outside Reporting Services, including the parameters that will be configured within the report.
- Define the parameters that each report data set requires and assign a valid value to each.
- Execute the queries.
- Inspect the results to make sure they match what was expected.
- Repeat with a new valid value or fix any issues.
- After one, two, or even three successful passes, I take the queries and put them into report data sets.
- Then I define the report layout.
- Then I preview the report to make sure it’s still correct.
- Then I release to our testers for testing.
With this amount of manual testing there is the potential for mistakes to creep past, for several reasons:
- The test data often isn’t diverse enough to exercise all the logic in the queries.
- The report data has a lot of rows, not all of which are inspected.
- The testing effort was underestimated, so when time runs out the amount of testing is reduced.
I did, at the beginning of last year (2008), look into a way of testing the reports with an automated tool. I was looking for a way to take the information that the report was displaying and compare it to some expected results. The need to test the report itself, and not just the underlying queries, was for the following reasons:
- The report could be doing more data manipulation outside the SQL queries that have been unit tested, via:
- Embedded reporting calculations
- Embedded VB.Net code
- External class libraries code
(all of the above could be unit tested separately but could still produce incorrect information if wired up incorrectly)
- Test visibility settings for sections of the reports.
- The report could be using MDX, and there isn’t an easy unit testing framework for MDX.
My Project Reasons:
- All the reports used embedded SQL and MDX.
- I was writing reports against a 3rd-party data warehouse and its processes.
- There was no database project in which to use a unit testing framework for SQL.
Searching the internet I came across this article: Using Visual Studio 2005 to Perform Load Testing on a SQL Server 2005 Reporting Services Report Server, which gives a guide on how to stress test reports. This is helpful to ensure that the reporting solution/architecture can handle the demands placed upon it. I also thought that this method of testing could be used for two other forms of testing:
- Simple unit testing: ensure that the report can render. Placing this within a build process that adds data to the reporting database and then runs the web test to check whether the report renders can help highlight changes to the database that break the build. This can happen when the reports are based on an application database used for simple reporting and that database is maintained by an ORM tool (which at least one of our projects is doing).
- Build verification testing: does the report render? Running a web test after deployment would highlight any errors in the deployment of the solution or of the infrastructure.
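A render check like this boils down to requesting the report through Reporting Services URL access (the `rs:Command=Render` / `rs:Format` query parameters) and asserting on the response. A minimal sketch in Python follows; the server name, report path and parameter are hypothetical placeholders:

```python
from urllib.parse import quote, urlencode

def build_render_url(server_url, report_path, fmt="XML", params=None):
    """Build an SSRS URL-access request that renders a report.

    server_url  -- e.g. "http://myserver/ReportServer" (hypothetical)
    report_path -- e.g. "/Sales/MonthlySummary" (hypothetical)
    """
    query = {"rs:Command": "Render", "rs:Format": fmt}
    query.update(params or {})
    return "%s?%s&%s" % (server_url, quote(report_path), urlencode(query))

# A build-verification check could then be as simple as requesting the URL
# and asserting on the HTTP status code, e.g.:
#
#   import urllib.request
#   assert urllib.request.urlopen(url).status == 200, "report failed to render"

url = build_render_url("http://myserver/ReportServer", "/Sales/MonthlySummary",
                       params={"Year": "2009"})
```

This only proves the report renders without error, which is exactly the limitation discussed below: it says nothing about whether the rendered data is correct.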
I helped to get this method of testing implemented, for the reasons mentioned above, on an application project, where it did help to capture and highlight builds that broke the reports. However, this method didn’t fully meet my requirements, as the web tests were only inspecting the HTTP responses, not the data within the reports.
It wasn’t until recently that I started looking at unit testing reports again, because on my current project testing the reports is one of the hardest jobs: the reports are developed on top of a platform, so I have no control over the flow of the data. When I looked again at the beginning of this year (2009), I still didn’t find much that would help me, so I went back to the drawing board to design my own approach. I remembered that in June 2008 I had used Reporting Services in an interesting way: I wrote a report whose only purpose was to generate a dynamic XML document for a web page to consume and bind to another control (more information can be found here: Could HTTP Soap / T-SQL endpoint be replaced by Reporting Services XML Reports?). So I took a further look into the XML rendering options, to see how reports that use charts, which my current project heavily utilises, would render into XML, and found that all the values of the data points within the charts were exposed. This meant that I could test the reports in the way that I intended.
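To illustrate why the exposed data points matter: once the chart is rendered as XML, its values can be read back programmatically. The fragment below is a simplified, hypothetical example of such output (the real element names depend on the report’s data element settings), parsed with Python’s standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified fragment of what the XML renderer might emit for
# a chart; actual element names come from the report definition's settings.
rendered = """
<Report>
  <Chart1>
    <DataPoint Value="120" />
    <DataPoint Value="95" />
    <DataPoint Value="143" />
  </Chart1>
</Report>
"""

root = ET.fromstring(rendered)
# Pull every chart data point value out of the rendered output.
values = [float(dp.get("Value")) for dp in root.iter("DataPoint")]
```

With the values extracted like this, they can be checked against whatever the known test data should produce.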
My next challenge was to design a process that would render the report into XML and then perform some sort of test. The test itself was easy: if the report produces an XML document, then I should be able to determine how the report will render that XML based upon the expected results for a known set of data within the data store the report is querying, the known parameter values passed to the report, and the XML rendering settings for the report. So I decided to compare the report’s XML output against a prepared XML document. The next problem was how to do the comparison, as there wasn’t a tool that I could find. I asked a few of my more .NET-savvy colleagues as well, with no joy.
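The kind of element-by-element comparison I was after can be sketched in a few lines; this is a minimal illustration of the idea (comparing tags, attributes, text and children recursively), not the tool eventually used:

```python
import xml.etree.ElementTree as ET

def xml_equal(a, b):
    """Recursively compare two elements on tag, attributes, text and children."""
    if a.tag != b.tag or a.attrib != b.attrib:
        return False
    if (a.text or "").strip() != (b.text or "").strip():
        return False
    if len(a) != len(b):
        return False
    return all(xml_equal(c1, c2) for c1, c2 in zip(a, b))

# Hypothetical expected output vs. two rendered results.
expected = ET.fromstring("<Report><Total>42</Total></Report>")
actual   = ET.fromstring("<Report><Total>42</Total></Report>")
changed  = ET.fromstring("<Report><Total>43</Total></Report>")
```

A real harness would also want to report *where* the documents differ, not just that they do.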
Then I remembered that SQL Server Integration Services (SSIS) has an XML task that can perform an XML compare, comparing the elements’ attributes and, more importantly, their values. So I set about creating an SSIS package to do the following:
- Loop through a control file that contains the location of the reports to test, the required parameters and values, and the location of the expected XML output.
- Download the reports in XML format from the Report Server.
- Use the XML task to compare the results from the report server against the expected results.
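The control file itself is just a list of reports, parameters and expected-output locations. The layout below is my own hypothetical sketch (shown being read in Python for clarity; in the actual package an SSIS Foreach Loop container would do the looping):

```python
import xml.etree.ElementTree as ET

# Hypothetical control-file layout: one <Test> per report, with optional
# <Parameter> children and a path to the expected XML output.
control_file = """
<Tests>
  <Test report="/Sales/MonthlySummary" expected="expected/monthly.xml">
    <Parameter name="Year" value="2009" />
  </Test>
  <Test report="/Sales/YearToDate" expected="expected/ytd.xml" />
</Tests>
"""

tests = []
for test in ET.fromstring(control_file).iter("Test"):
    params = {p.get("name"): p.get("value") for p in test.iter("Parameter")}
    tests.append((test.get("report"), params, test.get("expected")))
# Each tuple drives one iteration: render the report with the parameters,
# then compare the result against the expected XML document.
```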
Based on some initial testing, this approach works for testing my reports and the results were good. However, I haven’t been able to fully implement my approach yet, for the following reasons:
- Need to work on creating the test data for the project.
- Need to work on getting the test data through the 3rd party process.
- Need to work out how to deal with time, as many of my reports display times differently.
- Need to work out how to make it part of the build/test process.
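For the time issue above, one possible approach (a sketch, not something I have implemented yet) is to mask anything that looks like a date/time in both documents before comparing, so runs on different days still match:

```python
import re

# Matches ISO-style dates with an optional time portion, e.g.
# "2009-06-01" or "2009-06-01 09:15:00". Other display formats used by the
# reports would need their own patterns.
TIMESTAMP = re.compile(r"\d{4}-\d{2}-\d{2}(?:[T ]\d{2}:\d{2}(?::\d{2})?)?")

def mask_times(xml_text):
    """Replace date/time values with a fixed token before comparison."""
    return TIMESTAMP.sub("@TIME@", xml_text)

a = mask_times("<RunDate>2009-06-01 09:15:00</RunDate>")
b = mask_times("<RunDate>2009-06-02 17:40:21</RunDate>")
```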
I have also recently found that my approach is similar to a tool that is available to buy from Innosphere.