9 Apr 2010

Configuring Excel Services & PowerPivot on a multi-server topology

I have been working with a couple of colleagues, James Dawson and Russell Seymour, on installing and configuring PowerPivot within a SharePoint 2010 Beta 2 farm.

We used the following instructions to install PowerPivot on one of the application-tier servers: http://msdn.microsoft.com/en-us/library/ee210616(SQL.105).aspx.

After following the instructions we were able to verify the installation, as we had a SharePoint site with a PowerPivot gallery.  We managed to upload and open a couple of Excel workbooks, with embedded PowerPivot data, in the web browser.  We were also able to connect to the PowerPivot service and could see the PowerPivot data files loaded. However, when I tested the slicers I was greeted with the following error:

The data connection uses Windows Authentication and Excel Services is unable to delegate user credentials. The following connections failed to refresh: Sandbox

We did some research and found the following blog post: http://powerpivotgeek.com/2009/12/11/excel-services-delegation/.  I implemented the changes it recommended to our Excel workbooks and tried again, this time getting the following error:

The data connection uses None as the external data authentication method and Unattended Services has not been configured on Excel Services. The following connections failed to refresh: Sandbox

However, by the time of this error we had already configured Excel Services with the Secure Store Service application identity that we had created.

The problem proved to be with the configuration of the service applications. Initially, each service application had its own application pool running under its own domain user.

The solution in the end was to run all of the service applications under the SharePoint Web Services System application pool.
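A quick way to review which application pool each service application is running under is from the SharePoint 2010 Management Shell. This is a sketch, not from the original post, and it assumes the Microsoft.SharePoint.PowerShell snap-in is available; note that not every service application type exposes an ApplicationPool property.

```powershell
# Sketch: list each service application and the application pool it runs
# under, to spot any that are not using "SharePoint Web Services System".
# Run on a farm server as a farm administrator.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Get-SPServiceApplication | ForEach-Object {
    # Guard against service application types without an ApplicationPool
    # property.
    $pool = $_.PSObject.Properties['ApplicationPool']
    $poolName = if ($pool -and $pool.Value) { $pool.Value.Name } else { '(n/a)' }
    '{0} -> {1}' -f $_.DisplayName, $poolName
}
```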

9 Mar 2010

Using PowerShell to understand a replication topology

I recently had to investigate a client's replication topology.  I needed to understand the publications, subscribers and the articles. The problems I was facing were as follows:

  • The replication topology was only defined on the production infrastructure.
  • There was no documentation about the replication topology.
  • I had no way of re-creating the replication topology.
  • I only had a short amount of time.

I was able to script the replication topology from SQL Server Management Studio.  This was partially helpful, as I now had a script which could be used to review the replication topology.  However, this ended up being a fairly large file, and it would have been time-consuming to review all the information.

So I decided to write a PowerShell script (using SQL Server PowerShell for the Invoke-Sqlcmd cmdlet) that would do the following steps:

  • Parse the file to find the lines containing the following stored procedures:
    • sp_addarticle.
    • sp_addsubscription.
    As these contain the most useful parameters for understanding the replication topology.
  • Parse each line to find the key parameters and their values.
  • Insert the information into tables within a database.
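The steps above can be sketched roughly as follows. This is an illustrative reconstruction rather than the original script: the input file name, server name, and the review database and table (ReplReview, dbo.ScriptParameter) are all hypothetical, and it assumes Invoke-Sqlcmd is available (for example from the sqlps host).

```powershell
# Keep only the lines that call the interesting replication procedures.
$lines = Get-Content 'replication.sql' |
    Where-Object { $_ -match 'sp_addarticle|sp_addsubscription' }

foreach ($line in $lines) {
    $proc = if ($line -match 'sp_addsubscription') { 'sp_addsubscription' }
            else { 'sp_addarticle' }

    # Capture each @parameter = value pair; values are either N'...' string
    # literals or unquoted tokens running up to the next comma.
    $pairs = [regex]::Matches($line, "@(\w+)\s*=\s*(N'[^']*'|[^,]+)")
    foreach ($m in $pairs) {
        $name    = $m.Groups[1].Value
        $value   = $m.Groups[2].Value.Trim() -replace "^N'(.*)'$", '$1'
        $escaped = $value -replace "'", "''"   # escape quotes for T-SQL

        # Load the pair into a review table (hypothetical database/table)
        # so the topology can be explored with ordinary queries.
        Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'ReplReview' `
            -Query "INSERT INTO dbo.ScriptParameter (ProcName, ParamName, ParamValue) VALUES ('$proc', '$name', '$escaped')"
    }
}
```

With the parameters in a table, finding duplicated articles or publications sharing the same article list becomes a simple GROUP BY.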

I chose PowerShell because this was a perfect fit for a scripting language: manipulating a text file and running regular expressions over strings.

This helped me see that articles were repeated across publications; some publications had identical articles but different subscribers.

A copy of the PowerShell script is available here.

22 Jan 2010

How To: Emailing the Sprint Burndown report

The Scrum for Team System process template includes reports to help manage the progress of a project.  The onus is on the team to run them - by opening them - and review them.  This is the default behaviour of Team Foundation Server, against which our process template is built; however, as the Team Foundation Server platform uses SQL Server Reporting Services for its reporting software, the reports can actually be pushed out to the teams via e-mail.

I have already described how to do this for version 2.x (and the same steps apply to version 1.x) in the following post:  How to E-mail the Sprint Burndown Chart to yourself or your team.

For version 3.0 (for which Beta 2 is now available for download from the following link: http://scrumforteamsystem.com/cs/forums/4554/ShowPost.aspx ) things are a little different.

Firstly, if not already configured, the Reporting Services server will need to be set up to point to an SMTP server. This can be done using the Reporting Services Configuration Manager and updating the e-mail settings within that tool.   You can find the tool on your TFS report server under: Start Menu > Programs > Microsoft SQL Server 2008 > Configuration Tools > Reporting Services Configuration Manager
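For reference, the e-mail settings that the Reporting Services Configuration Manager writes end up in the e-mail delivery extension section of RSReportServer.config. A minimal fragment might look like the following; the server name and sender address are placeholders:

```xml
<!-- Fragment of RSReportServer.config, inside the "Report Server Email"
     delivery extension's Configuration element. -->
<RSEmailDPConfiguration>
  <SMTPServer>smtp.example.com</SMTPServer>
  <SendUsing>2</SendUsing> <!-- 2 = deliver through a network SMTP server -->
  <From>tfsreports@example.com</From>
</RSEmailDPConfiguration>
```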


To create the subscription you need to use Report Manager (to gain access to this, right-click on the report folder within Team Explorer):


Once Report Manager has loaded, you will be at the root of your team project's report folder, with a list of reports and folders. Click on the Sprint Burndown report to view it. After the report has rendered, select the Subscriptions tab and, when the page has loaded, click on New Subscription.

Then complete the necessary details of how and when the report should be delivered.  If your team project has only one work stream, all the report parameters can be set to use the report defaults.  Otherwise, set the release and work stream parameters to the values you require; the remaining parameters can be left at the report defaults:


When there is only one work stream, the report can work out the release, the work stream and then the current sprint.  When there are multiple work streams, by default the report will pick up the current release but then default to the first work stream, working out the current sprint for that work stream.  To get the sprint burndown for the other work streams, repeat the steps to create a new subscription and select the required release and work stream.  After a release has finished, the subscriptions for the multiple work streams will need to be updated to the next release and work streams.