SPO CSOM Error: For security reasons DTD is prohibited in this XML document

I found myself encountering the following error when authenticating to SharePoint Online using CSOM from PowerShell:

Exception calling “ExecuteQuery” with “0” argument(s): “For security reasons DTD is prohibited in this XML document. To enable DTD processing set the DtdProcessing property on XmlReaderSettings to Parse and pass the settings into XmlReader.Create method.”
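For context, the error surfaced from a few lines of CSOM PowerShell along these lines (a sketch – the DLL paths, site URL, and account are placeholders for my environment):

```powershell
# Load the CSOM assemblies (paths vary with the installed SDK version)
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://tenant.sharepoint.com")
$password = Read-Host -AsSecureString -Prompt "Password"
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("me@domain.co.uk", $password)
$ctx.Load($ctx.Web)
$ctx.ExecuteQuery()   # the DTD error is thrown here for federated accounts
```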

I believe there are a number of causes for this issue, some of which are firewall and ISP related. The fix below may therefore only resolve a subset of the cases where this error arises, even under seemingly identical circumstances.

In my scenario, the issue only arose when the credentials I was passing were federated. That is, when the username was *not* in the form <me>@<domain>.onmicrosoft.com but rather something like <me>@<domain>.co.uk. It is also possible that the issue resolves itself after a single successful authentication has occurred. Try providing credentials for a *.onmicrosoft.com account, and if that works, try again with a federated account. This is discussed more later.

I used Fiddler to compare the request/response trace from a successful authentication with one where this error occurs. It turns out that somewhere internally a request is made to msoid.<full-domain>, where <full-domain> is the bit after the @ symbol in the username provided. When the username is of the *.onmicrosoft.com variety, this lookup fails (Fiddler reports a 502, no DNS entry, with no response body) and the authentication proceeds successfully. In the other case, the ‘msoid’ URL resolves and a response with a body is returned.
In my case the response was a 301 (moved permanently), although I have read of cases where a 200 (success) was received. The important point is that the response, success or otherwise, has an HTML body containing a DTD (Document Type Declaration), which in turn produces the rather unhelpful error message.

So how do you fix it? One way is to provide an entry in your hosts file which ensures that the msoid URL will be invalid. I found that providing a local host entry for it worked. On Windows, your hosts file can be found at C:\Windows\System32\drivers\etc\hosts.

I added a line which looks like the following (the loopback address, then the msoid host for my federated domain):	msoid.<domain>.co.uk
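If you prefer to script the change, something like this appends the entry (a sketch – run from an elevated prompt, substitute your own federated domain, and back the file up first):

```powershell
# Append a loopback entry for the msoid host (run elevated; back up the file first)
$hosts = Join-Path $env:SystemRoot "System32\drivers\etc\hosts"
Add-Content -Path $hosts -Value "`tmsoid.mydomain.co.uk"
```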

And it worked! Intriguingly, I found that if I then removed this line from my hosts file, SharePoint Online authentication from PowerShell continued to work. It is for this reason that I suggested trying a *.onmicrosoft.com account first at the beginning of this post – just in case it resolves the issue for you without touching the hosts file. Please comment if you have any success (or otherwise) with that approach.

Hope this helps! Good luck.

SharePoint licensing limitations: Standard vs Enterprise Features and Kiosk Users

When discussing Office 365 licensing, there are a number of things that, as an architect or developer, you must be aware of.

EDIT: A handy Excel tool for checking which features are available for different licence types can be found here: Office 365 service comparison

Buying kiosk licences

Creating solutions for limited users
Kiosk users are the cheapest users, with the greatest restrictions. However, the limitations placed on these users really are quite manageable in many circumstances and shouldn’t cause you particular worry when developing a solution for them. The key points to remember when providing a solution to these users are:

  • They don’t have a user profile. They can still view the ‘My Settings’ page, but not the ‘About Me’ page. These users still have the full set of user profile properties which can be set by an administrator or via AD synch and programmed against.
  • They can only use Office Web Apps in READ mode. They can, however, edit documents with a client version of the appropriate Office application. Kiosk users with a K2 licence (as opposed to K1) can also edit documents using OWA.
  • They can’t be administrators at the tenant or site collection level. However they can be granted Full Control permissions.

Be aware of the feature set available in Production
Many features will not be present or will not work under some licensing schemes. The primary issue which I have encountered is around content rollup. Licences which do not support the Enterprise feature set do not support the Content Search Web Part. You can use the Results Script Web Part instead, but remember that the display templates used are not transferable: Content Search Web Part display templates reference the Srch JavaScript namespace, which will not be present when using the Results Script Web Part.
There are obviously many other Enterprise features which I won’t mention explicitly, but have a browse over the list below:


The following features are available only in SharePoint Server 2013—Enterprise Edition:

  • Access Services
  • BCS: Rich Client Integration
  • BCS: Tenant-level external data log
  • Custom Site Provisioning
  • InfoPath Forms Services
  • Analytics Platform
  • Improved Self-Service Site Creation
  • eDiscovery
  • Preservation hold library
  • Video Search
  • WCM: Catalog
  • WCM: Cross-site publishing
  • WCM: Faceted navigation
  • WCM: Image Renditions
  • WCM: Multiple Domains
  • WCM: Topic Pages
  • Business Intelligence Center
  • Calculated Measures and Members
  • Data Connection Library
  • Decoupled PivotTables and PivotCharts
  • Excel Services
  • Field list and Field Support
  • Filter Enhancements
  • Filter Search
  • PerformancePoint Services
  • PerformancePoint Services (PPS) Dashboard Migration
  • Power View
  • PowerPivot
  • Quick Explore
  • Scorecards & Dashboards
  • Timeline Slicer
  • Visio Services
  • Content Search Web Part
  • Custom entity extraction
  • Extensible content processing
  • Query rules—advanced actions
  • Search vertical: “Video”
  • Tunable Relevancy

Script Editor only runs JavaScript in edit mode

If you need to embed script into a content editable page in SharePoint 2013/Online, you may decide to use the new Script Editor web part. There are often many preferable ways to add script to a page (e.g. via the master page, a custom action, custom control, the ScriptLink property, etc.) however this is an easy option for demo purposes or when deployment activities are out of scope.

There is a gotcha for those who like to skip attributes that may have seemed verbose in the past. Any JavaScript which you include via the Script Editor web part must be wrapped in a <script> tag, otherwise it will be rendered as text. However, if you fail to provide the type='text/javascript' or language='javascript' attribute on the script tag, then the code will run while the page is in edit mode but will fail to execute once the page is saved and then viewed.

Note the language attribute, you need it!

CAVEAT: As I stated in the first paragraph, this is often not the best way to add script to a page.

New Button Order and Default Content Type is lost after moving a site

If you have customised the new button order and/or default content type for a list or library, then expect these changes to be lost if that site is moved. SharePoint bug! You may have done this in order to change the content types that appear under a list’s new button, or to change the content type that is used by default (i.e. when a user just clicks the new document icon rather than selecting a content type from the drop-down).

A library with a restricted set of content types

As you can see in the image above, I have a library configured with a restricted set of content types available under the new button. After moving the site (using Site Settings > Content and Structure) these customisations are lost. See the next image.

The same library having been moved to a new location

So that’s the issue, but what can you do to fix the situation if moving sites is something that you need to support regularly? Note that this isn’t the only issue you may encounter when moving sites.

With the luxury of a farm solution this can be fixed using a web event receiver. Using the WebMoving event you can store the list new button order information (e.g. in a web property bag), and then in the WebMoved event this information can be read and applied. I don’t have a code example of this, as in my situation it was suitable to apply a static new button order to lists based on the site definition and list template.
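If scripting the fix after a move is acceptable, the same idea can be expressed against the server object model, using the SPFolder.ContentTypeOrder and UniqueContentTypeOrder properties which control the new button. This is a sketch only – the URL, list title, and content type names are placeholders:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Reapply a static new button order to a library after its site has moved
# (sketch: URL, list title and content type names are placeholders)
$web = Get-SPWeb "http://server/sites/site/movedweb"
$list = $web.Lists["Documents"]
$folder = $list.RootFolder
# Keep only the desired content types, in the desired order
$wanted = @("Policy", "Procedure")
$folder.UniqueContentTypeOrder = @($folder.ContentTypeOrder | Where-Object { $wanted -contains $_.Name })
$folder.Update()
$web.Dispose()
```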

If you are unable to deploy a farm solution (e.g. SharePoint Online) then this issue is more difficult to solve. I assume that you can write remote event receivers for web events and apply the same logic as above (I haven’t tried this). Otherwise I can’t think of another reasonable solution beyond writing a control with client JavaScript that ensures the new button order upon viewing the page. However this would require knowledge of what the new button order needs to be, and hence would be more complex to implement if users are free to change it.

Move Site (SPWeb) Operation fails with list view threshold exception

When using Move Site via Site Content and Structure to move a site to another location in the site hierarchy you may find it fails with one of the following errors (depending on where you look):

Operation to Move 'old site URL' to 'new site URL' failed

"MoveWebs.Move catches SPException : The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator."

"Move Operation under site 'Site Name' failed in the Content and Structure tool. Details in ULS logs"


Increasing the list view threshold alleviates the issue however I have not managed to figure out exactly why the limit is reached. I have encountered this when none of the lists in the site being moved, or any of its child sites, have breached the list view threshold. In fact the largest list was less than 2000 items with the list view threshold at the default 5000.

Perhaps it’s due to the aggregated total of items being moved? This makes some sense, as the list view threshold is in place to prevent the SQL table locks which occur when more than 5000 rows are queried. As all the lists in a site collection are stored in a single table, it makes sense that the same limitation would apply here.

If moving large sites is something that you need to do, I would suggest doing it out-of-hours if possible as these thresholds are in place for a good reason. Make sure that you utilise the administration list view threshold rather than increasing the one which restricts the majority of users. Performing actions that require an increased list view threshold may cause serious performance issues.
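For reference, both thresholds live on the web application object and can be inspected (and, for the move window, temporarily raised) from the SharePoint Management Shell. A sketch – the URL and the new value are placeholders:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Inspect the thresholds on the web application (sketch; URL is a placeholder)
$wa = Get-SPWebApplication "http://server"
$wa.MaxItemsPerThrottledOperation            # the standard threshold (default 5000)
$wa.MaxItemsPerThrottledOperationOverride    # the auditor/administrator threshold

# Temporarily raise the override for the move window, then restore it afterwards
$wa.MaxItemsPerThrottledOperationOverride = 50000
$wa.Update()
```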

Please comment if you have anything to add to this.

Breadcrumb (SiteMapPath) is wrong after moving a site

I have been finding that when moving a site (SPWeb) to a different location in the site hierarchy of a site collection, the breadcrumb would often (though not always) be incorrect once the site had been moved.

If you didn’t know, SiteMapProviders are cached in the SharePoint object cache. I’ll put the sporadic nature of the issue down to the natural refresh cycle of the object cache, but honestly I’m not completely sure why it doesn’t go wrong all the time. The important bit is that there is a way to ensure that the breadcrumb is refreshed correctly every time. For the sake of completeness, the control in question is the SiteMapPath control with its SiteMapProvider property set to CurrentNavSiteMapProviderNoEncode in the custom master page (if you want to read more about this you could start here).


If you are suffering this issue I suspect that mentioning the object cache was enough to put you on the right path but I’ll spell it out just in case.

If you run a few lines of code in a WebMoved event receiver (I blogged briefly about attaching these here), you can force the object cache to be refreshed whenever a site is relocated. Be warned that if you have a site that leverages the object cache (e.g. via the cross-list query object, or the content query web part which utilises it), those operations will need to re-cache and may have some performance impact.

SiteCacheSettingsWriter writer = new SiteCacheSettingsWriter(site);
writer.SetFarmCacheFlushFlag();
writer.Update();

You may need to fetch the SPSite object that is passed into the constructor from within an elevated privileges block to ensure the current user is allowed to perform this action. Obviously that depends on expected audience for this action.
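If you would rather clear the flag ad hoc than from an event receiver, the SiteCacheSettingsWriter call can also be made from PowerShell. A sketch – the site URL is a placeholder, and the Microsoft.SharePoint.Publishing assembly must be available:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Flush the object cache for a site collection (sketch; URL is a placeholder)
$site = Get-SPSite "http://server/sites/mysite"
$writer = New-Object Microsoft.SharePoint.Publishing.SiteCacheSettingsWriter($site)
$writer.SetFarmCacheFlushFlag()
$writer.Update()
$site.Dispose()
```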

PerformancePoint Dashboard Designer fails to create data connection

The following error occurs when attempting to make a data connection using the PerformancePoint Dashboard Designer and the SQL Server Table data source template:

An unexpected error has occurred. Additional details have been logged for your administrator.

The last thing you see before the error

Before I continue, I want to make it clear that if you encounter this issue then you have not performed the initial configuration/installation correctly for your needs, and you should revisit the installation guides. In my situation I’m just hacking together a dev environment which requires the use of PerformancePoint. To reiterate, better solutions to this issue exist in the form of installation guides. I wrote this because when I googled for a quick-and-dirty resolution to this issue I couldn’t find any references to the error message.

Back to the issue at hand – of course this is a configuration issue. The issue for me was that the PerformancePoint service application was configured to run in a general ‘SPServices’ application pool, whose application pool account (correctly) does not have access to the PerformancePoint database. To get around this I configured the service application to run in its own application pool, where the application pool account is a new domain account granted access to the PerformancePoint database. This meets the principle of least privilege, which we must all strive to uphold! (I actually just reused the ‘SPFarm’ (god) domain account as the application pool account to get it working in a dev environment, but that’s the theory…)
You will most likely want to use a different account when you configure the Unattended Service Account.
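For the dedicated application pool, the relevant cmdlets look something like this (a sketch – the pool name and account are placeholders, and the account still needs rights on the PerformancePoint database):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Fetch the domain account (it must already be registered as a managed
# account) and create a dedicated service application pool for it
$account = Get-SPManagedAccount "DOMAIN\ppsservice"
New-SPServiceApplicationPool -Name "PerformancePoint Application Pool" -Account $account
# The service application can then be switched to this pool from
# Central Administration (Manage Service Applications > Properties)
```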

Note on Analysis Services
By following the above steps I managed to get the SQL Server Table data source working, but the Analysis Services data source was still throwing up the same error dialog. Upon setting the PerformancePoint service application properties, a dialog prompts you to install the PowerPivot for SharePoint installation package. After doing this, not only was I still getting the same error dialog when attempting to create an Analysis Services data source, but I could no longer create a SQL Server Table data source either. Running the PowerPivot for SharePoint 2013 Configuration tool resolved this issue – obvious, really.

ULS Log entries (included for SEO purposes)

  • Log categories: PerformancePoint Services, Database
  • System.Data.SqlClient.SqlException (0x80131904): Cannot open database "WSS_Content" requested by the login. The login failed. Login failed for user 'DOMAIN\apppoolaccount'.
  • SQL Database 'WSS_Content' on SQL Server instance 'SERVER' not found. Additional error information from SQL Server is included below. Cannot open database "WSS_Content" requested by the login. The login failed. Login failed for user 'DOMAIN\apppoolaccount'.
  • An unexpected error occurred. Error 7451.
  • No windows identity for DOMAIN\apppoolaccount.
  • Unable to load custom data source provider type: Microsoft.PerformancePoint.Scorecards.DataSourceProviders.AdomdDataSourceProvider, Microsoft.PerformancePoint.Scorecards.DataSourceProviders.Standard, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.AnalysisServices.AdomdClient, Version=, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. File name: 'Microsoft.AnalysisServices.AdomdClient, Version=, Culture=neutral, PublicKeyToken=89845dcd8080cc91'


Cancel all workflows in site collection efficiently

Here you will find a script to cancel all list workflows (not site workflows) running in a given site collection, in the most efficient manner possible (that I can think of) while still only using the SharePoint API. A better approach still would be to query the SharePoint workflows SQL table to identify the running workflows, so as to avoid iterating all of the site collection’s webs and lists; unfortunately there is no ‘GetAllRunningWorkflows(SPSite)’ method available via the API. I imagine this approach should be satisfactory in the majority of cases though.
There are a number of posts on the web with code somewhat similar to this, or at least with code that aims to achieve the same outcome. Of all the posts I found, every one performed this function in a very inefficient manner, iterating the SPListItemCollection of every list in the site collection. This may be fine in many circumstances, but I wanted something that would run faster and with less strain on the server.

I have achieved this by checking for workflow associations on a list before iterating its items, and by querying the list items so that, where a workflow column exists, only items that actually need processing are returned at all. When returning the list items, I query with ViewFieldsOnly so that less data is returned.
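To make the shape of that concrete, here is a condensed sketch of the core loop (not the full script – parameterisation, recursion into child webs, and the CAML filter built with GetNestedCaml are omitted; $web is assumed to be an SPWeb already in scope):

```powershell
# Condensed sketch: cancel running workflows only on lists that actually
# have workflow associations, fetching only the fields we need
foreach ($list in $web.Lists)
{
    if ($list.WorkflowAssociations.Count -eq 0) { continue }

    $query = New-Object Microsoft.SharePoint.SPQuery
    $query.ViewFieldsOnly = $true
    $query.ViewFields = "<FieldRef Name='ID' />"
    # In the full script, $query.Query is set to a CAML filter (built with
    # GetNestedCaml) over the workflow status columns, so only items with
    # running workflows are returned

    foreach ($item in $list.GetItems($query))
    {
        foreach ($workflow in $item.Workflows)
        {
            [Microsoft.SharePoint.Workflow.SPWorkflowManager]::CancelWorkflow($workflow)
        }
    }
}
```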

This script also accepts an optional parameter that specifies which workflow associations should be canceled if you are not looking to cancel all of the workflow associations but just those of a specific name.

NB: The script contains a reference to a helper function, GetNestedCaml, which I have defined in a separate post which can be found here.

As the script sample is quite large I suggest clicking the ‘view raw’ link at the bottom of the sample to view it.

Dynamically nesting CAML query statements

Here is a short PowerShell function that can be used when you need to dynamically generate CAML queries with many logically joined statements. I actually adapted it from a C# implementation I wrote (which is probably more useful…), but as you can rewrite it in C# very easily I won’t bother posting it twice.

As CAML logical join operators (And, Or) can only compare two statements, when many statements need to be compared you must nest them which is what this function achieves. The $join parameter should be passed as “And” or “Or” and the $fragments parameter should be passed as an array of CAML statement strings such as:
@("<Eq><FieldRef Name='Title' /><Value Type='Text'>title</Value></Eq>", "<Eq><FieldRef Name='FileLeafRef' /><Value Type='Text'>name.docx</Value></Eq>")

# Define method for nesting CAML query statements
function GetNestedCaml([array]$fragments, [string]$join)
{
    if ($fragments.Length -lt 1)
    {
        return [string]::Empty
    }
    elseif ($fragments.Length -eq 1)
    {
        return $fragments[0]
    }
    elseif ($fragments.Length -eq 2)
    {
        return "<$join>" + $fragments[0] + $fragments[1] + "</$join>"
    }

    # Join the fragments in pairs, then recurse over the joined pairs
    $joinFrags = @()
    $baseJoinCount = [int][Math]::Floor($fragments.Length / 2)
    for ($i = 0; $i -lt $baseJoinCount; $i++)
    {
        $baseIndex = (2 * $i)
        $fragsToJoin = @($fragments[$baseIndex], $fragments[$baseIndex + 1])
        $joinFrags += GetNestedCaml $fragsToJoin $join
    }
    # Carry any odd fragment through to the next level
    if ($fragments.Length % 2 -ne 0)
    {
        $joinFrags += $fragments[$fragments.Length - 1]
    }
    return GetNestedCaml $joinFrags $join
}
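For example, joining three filters with Or (the field values here are purely illustrative):

```powershell
$fragments = @(
    "<Eq><FieldRef Name='Title' /><Value Type='Text'>a</Value></Eq>",
    "<Eq><FieldRef Name='Title' /><Value Type='Text'>b</Value></Eq>",
    "<Eq><FieldRef Name='Title' /><Value Type='Text'>c</Value></Eq>"
)
GetNestedCaml $fragments "Or"
# The first two fragments are joined inside an <Or>, and that pair is then
# joined with the third fragment inside an outer <Or>
```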