PowerShell Add-Type Error: Could not load file or assembly XXX or one of its dependencies.

If you attempt to run the PowerShell Add-Type cmdlet with a Path parameter that references an assembly which has been emailed or downloaded from the internet (e.g. Add-Type -Path "C:\Microsoft.SharePoint.Client.dll"), then it is likely that you will encounter the following error:

Could not load file or assembly ‘file:///’ or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515)

To confirm that the following steps are appropriate, check the inner exception:

PS> $error[0].Exception.InnerException

“An attempt was made to load an assembly from a network location which would have caused the assembly to be sandboxed in previous versions of the .NET Framework. This release of the .NET Framework does not enable CAS policy by default, so this load may be dangerous. If this load is not intended to sandbox the assembly, please enable the loadFromRemoteSources switch. See http://go.microsoft.com/fwlink/?LinkId=155569 for more information.”

Not using SharePoint Client Components?

First, check the execution policy under which PowerShell is running by using the Get-ExecutionPolicy cmdlet. If you are not running under an Unrestricted policy, try running the Add-Type cmdlet again after setting the execution policy to Unrestricted (Set-ExecutionPolicy "Unrestricted"). If this resolves the issue, then you'll need to look into getting the assembly signed by a trusted publisher if you can't continue to run PowerShell in this lowered state of security.
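For example, a quick check you can revert afterwards (Set-ExecutionPolicy typically needs an elevated prompt):

# Check the current execution policy
Get-ExecutionPolicy

# Temporarily relax the policy, then retry the load
Set-ExecutionPolicy "Unrestricted"
Add-Type -Path "C:\Microsoft.SharePoint.Client.dll"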

Next, ensure that the assembly hasn't been blocked. The easiest way to check this is to right-click on the assembly and look at the bottom of the General tab of the Properties dialog: if an Unblock option is present, use it. There is also the Unblock-File cmdlet to achieve this action via PowerShell.
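For example (Unblock-File requires PowerShell 3.0 or later):

# Remove the 'downloaded from the internet' zone flag from the assembly
Unblock-File -Path "C:\Microsoft.SharePoint.Client.dll"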

Unblocking an assembly

Finally, PowerShell supports a config file which by default is not present. It allows configuration of a few things, such as the supported .NET versions and, importantly for us, whether or not PowerShell is allowed to load from remote sources. In order to add or update this file, ensure that you are logged in as a local administrator. The config file needs to reside alongside powershell.exe, so by default it ought to be created here: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe.config

Note: Do not confuse this with powershell_ise.exe.config although you may want to update this as well if you use ISE.

The content of the powershell.exe.config should be as follows. If you already have a file present, ensure that you merge the XML rather than just replacing it.

<?xml version="1.0"?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319"/>
    <supportedRuntime version="v2.0.50727"/>
  </startup>
  <runtime>
    <loadFromRemoteSources enabled="true"/>
  </runtime>
</configuration>

Using SharePoint Client Components?

If you are adding the SharePoint Client assemblies required for the execution of CSOM requests then you should take a different approach. Rather than bundling these assemblies with your script, instead ensure that the correct SharePoint Client Components package is installed on the executing machine and reference the assemblies from their default location:

Download Client Components:
SharePoint Online Client Components
SharePoint 2013 Client Components

Reference Accordingly:
C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\assembly.dll
C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\assembly.dll
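For example, assuming the SharePoint Online (v16) Client Components package is installed, loading the two core client assemblies looks like this:

# Reference the client assemblies from the Client Components install location
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"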

Paul.

SPO CSOM Error: For security reasons DTD is prohibited in this XML document

I found myself encountering the following error when authenticating to SharePoint Online using CSOM from PowerShell:

Exception calling “ExecuteQuery” with “0” argument(s): “For security reasons DTD is prohibited in this XML document. To enable DTD processing set the DtdProcessing property on XmlReaderSettings to Parse and pass the settings into XmlReader.Create method.”

I believe there are a number of causes for this issue, some of which are firewall and ISP related, so the following may only resolve a subset of the cases where this error arises, even under apparently identical circumstances.

In my scenario, I found that this issue was only arising when the credentials I was passing were being federated. That is, when the username was *not* in the form <me>@<domain>.onmicrosoft.com but rather something like <me>@<domain>.co.uk. It is also possible that this issue resolves itself after a single successful authentication has occurred. Try providing credentials for a *.onmicrosoft.com account, and if that works, try again with a federated account. This is discussed more later.

I used Fiddler to compare the request/response trace from a successful authentication and one where this error occurs. It turns out that somewhere internally a request is made to msoid.<full-domain>, where <full-domain> is the bit after the @ symbol of the username provided. In the case where this value is of the *.onmicrosoft.com variety, a 502 error (no DNS entry) is returned with no response body and the authentication proceeds successfully. In the other case, the 'msoid' URL is resolved and a response with a body is returned.
In my case the response was a 301 (permanent redirect), however I have read of cases where a 200 (success) has been received. Importantly, the response, success or otherwise, returns an HTML body containing a DTD (Document Type Declaration), which in turn produces the rather unhelpful error message.

So how do you fix it? One way is to provide an entry in your hosts file which ensures that the msoid URL no longer resolves to anything meaningful. I found that providing a local host entry for it worked. Your hosts file can be found here:
C:\Windows\System32\drivers\etc

I added a line which looks like the following:

127.0.0.1        msoid.<domain>.co.uk

And it worked! Intriguingly, I found that if I then removed this line from my hosts file, SharePoint Online authentication from PowerShell continued to work. It is for this reason that I suggested trying a *.onmicrosoft.com account first at the beginning of this post – just in case it resolves the issue for you without touching the hosts file. Please comment if you have any success (or otherwise) with that approach.
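If you would rather script the hosts change than edit the file by hand, a minimal sketch from an elevated PowerShell prompt (substitute your own federated domain) looks like this:

# Append the msoid entry to the hosts file (requires an elevated prompt)
Add-Content -Path "C:\Windows\System32\drivers\etc\hosts" -Value "127.0.0.1        msoid.<domain>.co.uk"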

Hope this helps! Good luck.

Setting Default Page Layout, Available Page Layouts, and Available Web Templates via CSOM (JSOM)

The publishing infrastructure provides a site with many features, including page layouts. A default page layout, a restricted set of available page layouts, and a restricted set of available web templates can all be set. In the UI this is configured on the Page layouts and site templates page under Site Settings.

In order to perform these configurations programmatically, methods are provided as part of the server side object model via the PublishingWeb object. However, setting these properties via CSOM is less obvious: the equivalent methods are not available, at least not at the time of writing (it is worth noting that the CSOM gets extended as part of CUs as well as in SPO).
The property information is stored solely as web properties. I have verified this by reflecting the Microsoft.SharePoint.Publishing assembly and confirming that the server side object model methods perform no action other than setting these web properties.

The following JSOM code demonstrates how to set the web properties with valid XML values.

The methods can be called like this:

// The page layout is defined by its filename
var filename = "NewsArticlePage.aspx";
setDefaultPageLayout(filename, onSuccess, onFailure);


// The page layouts are defined by an array of filenames
var filenames = ["NewsArticlePage.aspx"];
setAvailablePageLayouts(filenames, onSuccess, onFailure);

// The web templates are defined by their long name;
// the GUID is that of the feature which deploys the web template
var webTemplates = ["{BDACFFF5-05DF-4446-9907-B4C39F15F1D7}#WT_VanitySite"];
setAvailableWebTemplates(webTemplates, onSuccess, onFailure);

 Note – these samples reference both the jQuery and underscore libraries.
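If you would rather drive the same mechanism from the PowerShell CSOM, it boils down to writing the publishing web properties directly. The sketch below assumes the __DefaultPageLayout and __PageLayouts property names and XML formats written by the server side object model; inspect the AllProperties of a web that has already been configured through the UI to confirm the exact values before relying on this.

# $ctx is an existing ClientContext for the target web
$web = $ctx.Web
$props = $web.AllProperties
$ctx.Load($props)
$ctx.ExecuteQuery()

# Assumed property names and XML formats - verify against a correctly configured web
$props["__DefaultPageLayout"] = '<layout guid="<layout-guid>" url="_catalogs/masterpage/NewsArticlePage.aspx" />'
$props["__PageLayouts"] = '<pagelayouts><layout guid="<layout-guid>" url="_catalogs/masterpage/NewsArticlePage.aspx" /></pagelayouts>'
$web.Update()
$ctx.ExecuteQuery()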

Key difficulties deploying a SharePoint Online solution using CSOM

I have been developing a console app that utilises the SharePoint C# CSOM to deploy a solution to SharePoint Online (a.k.a. Office 365 SharePoint). The solution involves more than just a WSP (although it has one of those too). I encountered a few difficulties during this process and this post will discuss them:

  • (Re)creating a site collection
  • Importing a large-ish taxonomy
  • Uploading and installing a sandboxed solution (that contain only declarative elements)
  • Hooking up of taxonomy and (root site) lookup columns
  • Pre-creating a number of sites with specific features enabled (including the root site)

Before I go any further, for those of you reading this before doing something similar yourselves, please be aware of the following constraints which caught me by surprise:

  • You can't leverage the same import taxonomy function that is available in Term store management. If you already have files in that format you will need some custom code (I have an example later on), or you may want to import from a more robust XML formatted document instead.
  • The CSOM does not support uploading or activating sandboxed solutions! However, there is a CodePlex project that assists with this. I also include the dll later in the post, which I have rebuilt with references to the latest v16 Microsoft.SharePoint.Client dlls.
  • The CSOM does not support activating web scoped features! You can activate site scoped features but not web scoped ones. You need to use web templates to achieve this. Again, I will cover this in more depth later on.

Deleting and recreating a site collection

The initial step of the deployment process involves creating a new site collection (having first deleted it if required). In order to perform actions at this scope (tenant) you cannot create your client context in the usual manner, with a reference to the site collection, as it may not yet exist, and the site collection delete and empty-recycle-bin operations require the tenant scope too. Instead you must create the client context by passing in the tenant admin site URL.
This is the one that looks like this: https://<tenant>-admin.sharepoint.com

You can then create a Microsoft.Online.SharePoint.TenantAdministration.Tenant object by passing the 'admin' client context to its constructor. This object requires a reference to the Microsoft.Online.SharePoint.Client.Tenant assembly, which is available by downloading and installing the SharePoint Server 2013 Client Components SDK. The assembly can then be found here: C:\Program Files\SharePoint Client Components\16.0\Assemblies
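A minimal PowerShell sketch of wiring this up (the core Microsoft.SharePoint.Client assemblies are assumed to be loaded already, and $securePassword is a placeholder SecureString):

# Load the tenant administration assembly from the Client Components SDK location
Add-Type -Path "C:\Program Files\SharePoint Client Components\16.0\Assemblies\Microsoft.Online.SharePoint.Client.Tenant.dll"

# Create a client context against the tenant admin site, not the target site collection
$adminCtx = New-Object Microsoft.SharePoint.Client.ClientContext("https://<tenant>-admin.sharepoint.com")
$adminCtx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("<me>@<domain>.onmicrosoft.com", $securePassword)

# The Tenant object exposes the site collection create/delete operations
$tenant = New-Object Microsoft.Online.SharePoint.TenantAdministration.Tenant($adminCtx)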

The tenant object provides the methods required to perform the create and delete site collection actions. This process involves a lot of waiting about for deletion to complete, and then provisioning to complete. Unfortunately you can’t continue with other actions until this has occurred. I found this to take upwards of three minutes.

A link to the relevant code that I used to achieve this can be found here: https://gist.github.com/paulryan/cbfaa966571d6a9cdb8b

Importing taxonomy

As mentioned above, you can't pass those CSV files directly to the CSOM and have it import it all for you. In my scenario we had already developed dozens of term sets in the form of these CSV files during a discovery phase, so it was important that I could support the import of taxonomy in this form. I wrote code to support the import of these files, but only to the point that it meets my immediate requirements. Please use the following as a rough guide only, as it is not fully featured (or tested beyond the happy path).

The code I wrote to support this can be found here: https://gist.github.com/paulryan/e461f8bac28336b05109#file-importtaxonomycsom-cs
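For orientation, the core taxonomy CSOM calls involved look roughly like this in PowerShell (a sketch only; it assumes Microsoft.SharePoint.Client.Taxonomy.dll is loaded and uses placeholder names and GUIDs):

# $ctx is a ClientContext; get the default term store for the site collection
$session = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($ctx)
$termStore = $session.GetDefaultSiteCollectionTermStore()
$ctx.Load($termStore)
$ctx.ExecuteQuery()

# Create a group, a term set and a term (names and GUIDs are placeholders)
$group = $termStore.CreateGroup("Demo Group", [Guid]::NewGuid())
$termSet = $group.CreateTermSet("Demo Term Set", [Guid]::NewGuid(), 1033)
$term = $termSet.CreateTerm("Demo Term", 1033, [Guid]::NewGuid())
$termStore.CommitAll()
$ctx.ExecuteQuery()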

Uploading and activating a sandboxed solution

There is a CodePlex project that provides this functionality (as well as some authentication utilities), as I mentioned above. It performs web requests against the UI pages, and I am very glad someone else has already done this before me! It was originally created when SharePoint 2010 was present in the cloud and so references the v14 versions of the Microsoft.SharePoint.Client assemblies. If you don't mind maintaining references to both v14 and v16 assemblies then this might be fine. I have instead rebuilt the source, having replaced the references with the v16 equivalents.

You can download it here: SharePointOnline.Helper.dll

FYI: v14 is SharePoint 2010, v15 is SharePoint 2013, v16 is SharePoint 2013 Online specific

Activating web features

Actually, there isn't a lot more to say here other than that you must use web templates if you need to create sites with features enabled as part of the deployment process, as this can't (currently) be done using the CSOM. I would recommend using the web template for nothing other than activating features, and putting all other declarative elements in a feature. This will provide the best upgrade experience in the future.
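As an illustration, creating a sub site from a sandboxed web template via the CSOM looks roughly like this in PowerShell (the template name uses the {feature GUID}#Name format shown earlier; the URL and title are placeholders):

# $ctx is a ClientContext for the parent web
$webInfo = New-Object Microsoft.SharePoint.Client.WebCreationInformation
$webInfo.Url = "child-site"
$webInfo.Title = "Child Site"
$webInfo.Language = 1033
$webInfo.UseSamePermissionsAsParentSite = $true
$webInfo.WebTemplate = "{BDACFFF5-05DF-4446-9907-B4C39F15F1D7}#WT_VanitySite"
$newWeb = $ctx.Web.Webs.Add($webInfo)
$ctx.ExecuteQuery()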

Hooking up taxonomy columns

The best place to start is almost certainly Chris O'Brien's blog on this topic. As I have the luxury of being able to run further deployment code after uploading/activating the sandboxed solution, I opted to avoid having to rebuild the solution for various environments and instead hook up the columns using the CSOM and a mapping. There is a catch with this though.

If your list instance is built from a list template which defines the managed metadata columns, then updating the site column via the CSOM fails to push down the new SspId. To get around this, DO NOT include managed metadata column definitions as part of the list definition (in the fields element). When you then run the CSOM to update the site columns, it will update the content type and add the column to the list instance with the correct SspId.
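A rough PowerShell sketch of the hook-up itself (the New-Object 'cast' is the PowerShell stand-in for ClientContext.CastTo; the field name and the $termStoreId/$termSetId GUID variables are placeholders for your own mapping):

# $ctx is a ClientContext; bind a managed metadata site column to the correct term store and term set
$field = $ctx.Web.Fields.GetByInternalNameOrTitle("MyManagedMetadataColumn")
$taxField = New-Object Microsoft.SharePoint.Client.Taxonomy.TaxonomyField($ctx, $field.Path)
$taxField.SspId = $termStoreId      # the term store ID for this environment
$taxField.TermSetId = $termSetId    # the target term set ID
$taxField.UpdateAndPushChanges($true)
$ctx.ExecuteQuery()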

Good luck building your SharePoint Online CSOM deployment framework!

Running SP.UI.Status.addStatus on page load

This post is about a specific issue when running the SP.UI command 'addStatus' on page load, as well as a short discussion of the JavaScript page life cycle.

Initial Problem
When running the SP.UI.Status.addStatus command upon page load, the status message was being hidden almost immediately in Chrome but worked as expected in IE.

Scenario
We have a concept of 'archived' sites. A control is embedded on the page layout for site home pages (default.aspx) which renders JavaScript that should display a 'this site is archived' message. The message should be displayed using SP.UI.Status.addStatus and was initially implemented as follows (the script is rendered dynamically using an ASP.NET literal control; I'll just consider the final output):

// This is an example of what NOT to do
ExecuteOrDelayUntilScriptLoaded(function () {
    SP.UI.Status.addStatus("This is an archived site", "removed for brevity");
}, 'sp.js');

This worked as expected in IE, however the status message would appear for a split second and then disappear in Chrome.

Investigation

The first piece of the puzzle: What is happening?
After getting deep into Chrome DevTools I found the script responsible for hiding the status message. SharePoint utilises the document.onreadystatechange handler to run a function called fnRemoveAllStatus. I think you can guess what it achieves. Why this is being run at this point is beyond me. Importantly, I don't want to prevent it running in case it serves a purpose that I'm unaware of.

The second piece of the puzzle: How does that work?
If a function is assigned to document.onreadystatechange it will be run as many as four times (depending on when in the cycle the assignment occurs), once for each transition through the following sequence of states:

  1. ‘uninitialized’
  2. ‘loading’
  3. ‘interactive’
  4. ‘complete’

Good practice would have the function check for the current state and only act once, when in the correct state. Naturally this logic is absent from the fnRemoveAllStatus function.

The third piece of the puzzle: What is running when?
$(document).ready vs $(window).load vs ExecuteOrDelayUntilScriptLoaded
The difference between these options in regards to what we have just discussed is that $(document).ready and ExecuteOrDelayUntilScriptLoaded run when document.readyState is ‘interactive’ whereas $(window).load runs when document.readyState is ‘complete’.

Laying the final puzzle piece: Why’s it working in IE but not Chrome?
When running the code in IE, rather than executing the script block during the 'interactive' readyState, it was being executed after transitioning to the 'complete' readyState, which meant that it was running after the fnRemoveAllStatus call, as we desire. I believe this happens because the sp.js file is being added via a script link control with 'LoadAfterUI' set to true, which is only understood by IE. I haven't investigated this last point; if I am wrong please leave a comment about it below.

Solution
So, the solution is rather simple once you have this understanding. Wrap the command in $(window).load to ensure it occurs after the fnRemoveAllStatus method is called during the transition into the ‘complete’ state. Like this:

$(window).load(function() {
    ExecuteOrDelayUntilScriptLoaded(function () {
        SP.UI.Status.addStatus("This is an archived site", "removed for brevity");
    }, 'sp.js');
});

NB: ExecuteOrDelayUntilScriptLoaded will also load scripts which are marked to load on-demand. If you are testing in Chrome you may begin to believe it is unnecessary to use ExecuteOrDelayUntilScriptLoaded when using $(window).load, as all scripts have loaded by then. This is true for browsers other than IE; in IE we must use this function to ensure that the script is loaded at all.

As a final note, I'd like to add that apart from this specific case I would suggest using $(document).ready rather than $(window).load, as it will mean that the page loads faster (unless of course your script requires all resources to be loaded before acting, e.g. you are working with images of undefined sizes).