A history of extensibility in SharePoint and how to choose a model

I recently wrote a post for my employer about the recent history of SharePoint extensibility models. It also touches on how we as a company settled on the model with which we are currently delivering our Intranet/Digital-Workplace solution. I discuss the Feature Framework, Farm and Sandboxed Solutions, the SharePoint Add-in Model, the SharePoint Framework, Remote Provisioning, and more.

Check it out here: A history of extensibility in SharePoint and how Fresh chose a model


Executing JS in an RSS Viewer web part in SharePoint

Back in SharePoint 2010 the RSS Viewer web part (a.k.a. RSS Aggregator web part, RSSAggregatorWebPart) supported the use of XSL templates which contained script tags. There were a few funny things you had to do in order to get it working, but it did work. Come SharePoint Online, it is not possible to add a script tag to your XSL templates (I believe this ‘issue’ exists in SharePoint 2013 as well). More accurately, script tags can be added and will be rendered to the page, but the script will not be executed.

In this day and age the correct answer is very likely: “why are you using this web part?” or “XSL, are you mad?”. Both are very strong arguments but, regardless, for the record: you can code (hack) your way around this issue.

Use the onerror handler of an image tag to execute your JS. Ensure that the img tag is rendered as the last element in your template if you plan on using the JS to modify the DOM.

<img style='display:none' src='#' onerror='console.log("boo!")' />
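
For context, here is a minimal sketch of how this might sit within an XSL template. The template match and element names are illustrative rather than taken from the web part's stylesheet; the important part is that the img tag comes after the markup your script depends on.

<xsl:template match="/">
  <!-- render the feed items first -->
  <div class="rss-items">
    <xsl:apply-templates select="rss/channel/item" />
  </div>
  <!-- last element in the template: onerror fires once the markup above is in the DOM -->
  <img style='display:none' src='#' onerror='console.log("feed rendered, safe to modify the DOM")' />
</xsl:template>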

If you are using SharePoint 2010 check this out:
https://sharepoint.stackexchange.com/questions/9720/calling-javascript-within-xsl-after-transformation-in-rss-web-part

Paul.

“SafeQueryPropertiesTemplateUrl” error when calling the SharePoint _api

If you call the SharePoint 2013 REST API from your applications, ensure that any requests originating from the client are sent to the current web’s base URL to avoid a SafeQueryPropertiesTemplateUrl error.

If the current site is
https://tenant.sharepoint.com/sites/mysitecollection/subsite1/subsite2 then it is very important that you submit API requests as
https://tenant.sharepoint.com/sites/mysitecollection/subsite1/subsite2/_api
and NOT as any of:

  • https://tenant.sharepoint.com/_api or
  • https://tenant.sharepoint.com/sites/mysitecollection/_api or even
  • https://tenant.sharepoint.com/sites/mysitecollection/subsite1/_api

The reason for this is that the current user must have access to the site addressed by the base URL of the API request (the bit before the _api). If the user cannot access this site then the request will fail. Unfortunately, it doesn’t fail in the manner you might expect (i.e. with a 401 access denied response); a request that fails in this manner will return a 500 error. The specific exception details are as follows:

Exception class:
Microsoft.Office.Server.Search.REST.SearchServiceException

Exception message:
The SafeQueryPropertiesTemplateUrl "The SafeQueryPropertiesTemplateUrl "{0}" is not a valid URL." is not a valid URL.

Error received when attempting to hit the _api endpoint from a web to which the current user does not have access
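
As a hedged illustration (the endpoint and query text are arbitrary), building the request URL from _spPageContextInfo keeps the request rooted at the current web:

// build REST requests from the current web's base URL,
// not from the tenant or site collection root
var apiBase = _spPageContextInfo.webAbsoluteUrl + "/_api";
var xhr = new XMLHttpRequest();
xhr.open("GET", apiBase + "/search/query?querytext='sharepoint'");
xhr.setRequestHeader("Accept", "application/json;odata=verbose");
xhr.onload = function () {
  console.log(xhr.status); // 200, rather than the 500 described above
};
xhr.send();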

Paul.

HTTP 405 Error in .NET Web App

I was recently told that a web app I had developed was returning an HTTP 405 error upon being freshly deployed. It took me way too long to realise that the cause of the issue came down to missing files. Specifically, the complete folder structure had been deployed, however the files at the top level of the web root were missing. These files are rather critical.

They are the web.config and global.asax

If you are seeing this error, ensure these files have been deployed correctly and aren’t corrupt as a first port of call.

Receiving an HTTP 405 in IE11

For aiding findability, this is how the HTTP 405 error presents in each browser:

  • Chrome: The page you are looking for cannot be displayed because an invalid method (HTTP verb) is being used.
  • IE: HTTP 405 The website has a programming error. This error (HTTP 405 Method Not Allowed) means that Internet Explorer was able to connect to the website, but the site had a programming error.
  • Edge: HTTP 405 error That’s odd… Microsoft Edge can’t find this page

Paul.

Azure CDN integration with SharePoint, cache control headers max-age, s-maxage

After recently implementing an Azure-based solution to mitigate SharePoint Online’s poor image rendition performance by utilising Azure CDN (see Chris O’Brien’s post on this issue, see Fran R’s post on other Image Rendition issues) I’ve reached a few conclusions regarding setting appropriate cache control headers. It is important to reach a practical balance between performance and receiving updates to files.


Before continuing it is important to understand the fundamental building blocks when using a CDN. At any time a file can be present in three location types: the blob (the source file), the CDN endpoint(s), and users’ browser caches. In the case of Azure CDN, the source file must be a blob in Azure Blob Storage. Depending on the CDN and its configuration, the file may be cached at many (dozens of) CDN endpoints dispersed around the globe. Without a CDN, the only consideration is the cache timeout for files stored in the user’s browser cache. With a CDN, we must also consider the cache timeout between the CDN endpoint and the source file.

Another important point to call out is that CDNs generally only push content to an endpoint when it is first requested: on-demand. This incurs a delay for the first user to request that asset from a given endpoint, while the source blob is transferred to the endpoint. The impact of this will differ depending on the file size and the distance between the source blob and the CDN endpoint. It is this re-fetching from the source that a longer s-maxage value reduces (discussed below).

Relevant cache control headers

Definitions

  • max-age : Defines the period during which the client will use the cached file without contacting the server. ‘Client’ refers to a user’s browser cache as well as a CDN.
  • s-maxage : If provided, overrides max-age for shared caches such as CDNs only
  • public : Explicitly marks the file as not user specific, so it may be cached by shared caches
  • no-transform : Proxy servers may compress or encode images to improve performance or reduce bandwidth usage. This header prevents this from occurring. It is preferable to avoid this header, assuming that you can spare the effort to ensure the files being served are not affected adversely.

A good summary of the many remaining cache control headers that I didn’t feel were relevant to this post can be found here:
A beginner’s guide to HTTP cache headers

In practice

  • For an image that has been previously requested:
    • When s-maxage has not expired and max-age has not expired, server responds with 200 (OK), the file is not downloaded again [0ms]
    • When s-maxage has not expired but max-age has expired, server responds with 304 (not modified), the file is not downloaded again [<100ms]
    • When s-maxage has expired but max-age has not expired, server responds with 200 (OK), the file is not downloaded again [0ms]
    • When s-maxage has expired and max-age has expired and the blob has not changed, server responds with 304 (not modified), the file is not downloaded again [<100ms]
    • When s-maxage has expired and max-age has expired and the blob has changed, server responds with 200 (OK), the file is downloaded again [download image]
  • A request for an image will return 200 (OK) until max-age has expired and then 304 (not modified) for every subsequent request until the blob is updated. Once updated, this process repeats
  • If an existing image is updated, the longest a user can wait to see the updated image is
    • Without clearing browser cache: max-age + s-maxage
    • With clearing browser cache: s-maxage
  • If a user views an image from the CDN for the first time, it is only guaranteed to be the latest version of that image if the blob hasn’t been updated within the last s-maxage period
  • SharePoint library images are served with a max-age of 24 hours
  • As SharePoint library images are not served via a CDN they have an effective s-maxage of 0

My recommendations

Keeping all of the above in mind, I feel that the most important factor is to replicate the experience that users expect from images served from the SharePoint environment. This can be presented as a couple of simple rules:

  1. max-age + s-maxage = 24 hours = 86400 seconds
  2. s-maxage is as low as possible whilst satisfying bandwidth and performance targets (especially for locations most distant to the source blob)

For a recent SharePoint/CDN solution, I used the following cache control headers:

  • max-age: 23 hours
  • s-maxage: 1 hour
  • public
  • no-transform

Which looks like this:
no-transform,public,max-age=82800,s-maxage=3600

Setting the cache headers served by Azure CDN and Azure Blob Storage

When working with cache control headers in Azure, they are set on the blob itself; this is not a CDN configuration setting.
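
As a sketch of how this looks with the Azure PowerShell storage cmdlets (the account, container, and blob names are illustrative assumptions):

# cache control is a property of the blob itself, not of the CDN endpoint
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $key
$blob = Get-AzureStorageBlob -Container "images" -Blob "renditions/banner.jpg" -Context $ctx
$blob.ICloudBlob.Properties.CacheControl = "no-transform,public,max-age=82800,s-maxage=3600"
$blob.ICloudBlob.SetProperties()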

Paul.

PowerShell recursion with delegate functions, iterate all lists in all webs

There are many ways to iterate a collection in PowerShell. I just really like using delegate functions. This approach is not native PowerShell but utilises the .NET Action class as a function parameter. Using a delegate function approach, it is possible to create a recursive loop that can be very easily reused in the future just by providing an alternative Action.

The example code I provide below demonstrates how to create a delegate function in PowerShell, how to write a function that accepts one as a parameter, and some ready-made samples for iterating SharePoint objects, specifically all webs or all lists. I am using some specific SharePoint objects in these samples; however, the fundamental pattern can be used to effectively iterate any recursive structure.

  • foreachDecendentWeb : perform an action on every web below the provided web
  • foreachListInWeb : perform an action on every list in the provided web
  • foreachListInWebAndAllDecendentWebs : perform an action on every list in the current web and all descendant webs

Delegates

Some notes

The below script references ‘TopOfScript.ps1’, which is specifically related to calling SharePoint CSOM from PowerShell. Read about it here on sharepointnutsandbolts.

Making the call, providing the delegate
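
The original gist is not reproduced here, so below is a minimal sketch of the pattern: cast a script block to a typed .NET Action and pass it to the recursive function. It assumes that $ctx (the CSOM client context) and $rootWeb have been set up by TopOfScript.ps1.

# define the delegate: a script block cast to a typed .NET Action
$printListTitle = [System.Action[Microsoft.SharePoint.Client.List]] {
    param($list)
    Write-Host $list.Title
}

# pass the delegate to a recursive function (defined in the next snippet)
foreachListInWebAndAllDecendentWebs $rootWeb $printListTitle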

The utility scripts: recursive functions accepting delegate parameters
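
Again, this is a hedged sketch rather than the original gist: the CSOM collection loading is standard, but the parameter names and the reliance on a script-scope $ctx are assumptions.

# perform an action on every web below the provided web
function foreachDecendentWeb($web, [System.Action[Microsoft.SharePoint.Client.Web]]$webAction) {
    $ctx.Load($web.Webs)
    $ctx.ExecuteQuery()
    foreach ($subWeb in $web.Webs) {
        $webAction.Invoke($subWeb)
        foreachDecendentWeb $subWeb $webAction   # recurse into the child web
    }
}

# perform an action on every list in the provided web
function foreachListInWeb($web, [System.Action[Microsoft.SharePoint.Client.List]]$listAction) {
    $ctx.Load($web.Lists)
    $ctx.ExecuteQuery()
    foreach ($list in $web.Lists) {
        $listAction.Invoke($list)
    }
}

# perform an action on every list in the current web and all descendant webs
function foreachListInWebAndAllDecendentWebs($web, [System.Action[Microsoft.SharePoint.Client.List]]$listAction) {
    foreachListInWeb $web $listAction
    # GetNewClosure captures $listAction so that the script block still
    # resolves it when invoked from inside foreachDecendentWeb
    foreachDecendentWeb $web ([System.Action[Microsoft.SharePoint.Client.Web]]{
        param($w) foreachListInWeb $w $listAction
    }.GetNewClosure())
}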

Paul.

jsLink: how to display a custom ‘no items’ message

If you use jsLink to override the rendering of list views then you may have noticed that your custom jsLink no longer renders a message when there are no items returned in the view. I am going to discuss with code samples how to display a ‘no items’ message – or at least help you stop overriding it.

You have complete control over how list items are rendered using jsLink

If, alternatively, you have a ‘no items’ message being displayed and just want to modify the text, try this link.

If you don’t know what jsLink is then it is worth learning about it. Try this link.

What am I doing wrong?

Chances are you are making the same mistake that many people make. It is a mistake that has been replicated again and again online; it doesn’t break anything, but it does prevent the display of the ‘no items’ message and the paging control. When you override Templates.Header you DO NOT need to override Templates.Footer in order to close tags which you opened in the header.

Although doing so seems to make sense, you can rest assured knowing that tags you open in the header will be closed auto-magically after the item templates have completed rendering. In fact, the footer template is rendered in a different table cell to the header and item templates when this all hits the page. Think of the footer template as a distinct block that is rendered after everything else rather than the end of the same block.

By overriding the footer template you are also inadvertently overriding the ‘no items’ message and the list view paging control. You can see exactly what you are overriding by inspecting the default values of the templates. Below is a snippet from clientrenderer.js which shows the default footer template.

So what should you do?

If you just want the default no items message and can get away with not overriding the footer template (as in the first code snippet), then great – you are all done.

If you want a custom message then check out the link at the very top of the article (in summary: renderCtx.ListSchema.NoListItem = "Nada, nothing, zilch";).

If you want to override the footer template or perhaps you want the message to appear within a wrapper tag defined in the header or you want some custom logic behind which message to display then you can do that too – keep reading.

Doing it yourself

I’ve written a utility function, based on the logic in the OOTB footer template, that makes it easier to manage the ‘no items’ text. This function does NOT replicate the paging functionality. If you need paging and are overriding the footer template then you will need to replicate the paging functionality as well; look into clientrenderer.js to find out how MSFT do this.
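
The original snippet is not reproduced below, so treat the following as a hedged reconstruction. renderCtx.ListData.Row and the template override mechanism are standard jsLink constructs, but the conditions in the if-else block (the picture library template id check and the ‘searched’ flag) are illustrative; inspect clientrenderer.js for the exact properties.

// returns the 'no items' markup, or an empty string when the view has items
function getNoItemsMessage(renderCtx) {
  if (renderCtx.ListData.Row.length > 0) {
    return ''; // items were returned, render nothing extra
  }
  // illustrative: 109 is the picture library template id
  if (renderCtx.ListTemplateType == 109) {
    return '<div class="ms-textSmall">There are no pictures to show in this view</div>';
  }
  // illustrative: a branch for 'no results after searching within the list'
  if (renderCtx.ListSchema.HasSearchedForValue) {
    return '<div class="ms-textSmall">Your search returned no results</div>';
  }
  return '<div class="ms-textSmall">There are no items to show in this view of the list</div>';
}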
Looking at this snippet you can see the if-else block where you can define custom messages for different list templates, or for when the lack of results has occurred only after a search term was provided. This sample should not be considered the definitive version; it just does a basic job in line with what happens by default.

Below are two examples of how you may want to use this. The first overrides the footer template, and the second overrides the header template. The advantage of putting this code in the header template is that it allows you to wrap the ‘no items’ message in the same wrapper tags that you defined for the main content.
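
A hedged sketch of both approaches follows (the wrapper class name is hypothetical):

// example 1: render the message from a custom footer template
// (remember: overriding the footer also overrides paging, see above)
SPClientTemplates.TemplateManager.RegisterTemplateOverrides({
  Templates: {
    Footer: function (renderCtx) {
      return getNoItemsMessage(renderCtx);
    }
  }
});

// example 2: render the message from the header template so that it lands
// inside the same wrapper tag as the item markup
SPClientTemplates.TemplateManager.RegisterTemplateOverrides({
  Templates: {
    Header: function (renderCtx) {
      // tags opened here are closed automatically after the item templates render
      return '<div class="my-list-wrapper">' + getNoItemsMessage(renderCtx);
    }
  }
});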

Paul.

For aiding findability:

  • There are no items to show in this view of the list
  • Your search returned no results
  • Some items might be hidden. Include these in your search
  • Still didn’t find it? Try searching the entire site.

Upload Centre vs OneDrive Synch

The following is a quick summary of both the Office Upload Centre and the OneDrive for Business Synch Client, and a discussion of why it is safe to have them both running simultaneously.

Office Upload Centre
Upload Centre manages the offline cache that is used when SharePoint documents are opened in the appropriate client application. By integrating with Office it enables co-authoring and some other ‘integration’ features, such as sharing from the Office client applications, and it ensures that changes are synched back to SharePoint even if the connection is lost.

Upload Centre in the Windows tray
Office Upload Centre

OneDrive for Business Synch Client
OneDrive synch ensures that OneDrive documents which have been synchronised to a local folder remain synchronised. It also acts as the synch client used for the synchronisation of other SharePoint content, such as a Document Library in a team site. It does not integrate directly with Office applications. NOTE: Microsoft are upgrading the synch engine to improve reliability, but only for OneDrive (it is probably safe to assume that it will reach the rest of SharePoint eventually).

OneDrive Synch

When do they meet?
The closest these features get to interacting with one another is when a document stored in OneDrive is opened in the client application. In this scenario it is Upload Centre that ensures changes are persisted to the document. Once this occurs, it is OneDrive Synch that ensures the change is replicated to the synchronised copy stored locally. If the document is edited in any other way then Upload Centre will not be involved.
Both products are a core part of the Microsoft cloud suite and although you can disable them, you can also be confident that they will work together.

Paul.

DateTime validation message colour

There is a minor style bug in SharePoint 2013 (including SharePoint Online). The error message on a required DateTime field is not displayed in a manner consistent with other control validation errors. No, it’s not just you, and no, it’s not due to some conflicting CSS – it is a SharePoint bug.

Specifically, I am referring to the page layout edit experience. When a user fails to provide a value for a required DateTimeField control, the validation message (‘You must specify a value for this required field.’) is shown in the default text colour.


For all other validation messages the SharePoint controls add the ms-formvalidation class, which sets a single CSS rule applying the red colour. As this is the only rule which the ms-formvalidation class sets, a colour rule is all that needs to be applied to fix the issue.

I use the following CSS selector to resolve this issue:
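
The original snippet is not included here; as a hedged sketch, it is along the following lines. The selector is an assumption based on the DateTime control’s markup and may need adjusting for your page layouts; #bf0000 is the red that ms-formvalidation applies in SharePoint 2013.

/* colour the DateTime field's validation message like other validation errors */
.ms-dtinput + span {
    color: #bf0000;
}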

Paul.

Dynamically generating complex pre-refined search result page URLs

A while ago I blogged about creating a static link to a pre-refined (pre-filtered) search page. This post follows that idea to its natural conclusion by providing a number of JavaScript functions which can dynamically create search result page URLs. These URLs will look something like this:

https://tenant.sharepoint.com/search#Default=%7B%22k%22%3A%22article%22%2C%22r%22%3A%5B%7B%22n%22%3A%22RefinableString20%22%2C%22t%22%3A%5B%22%5C%22%C7%82%C7%824275696c64%5C%22%22%5D%2C%22o%22%3A%22OR%22%2C%22k%22%3Afalse%2C%22m%22%3A%7B%22%5C%22%C7%82%C7%824275696c64%5C%22%22%3A%22Build%22%7D%7D%2C%7B%22n%22%3A%22RefinableString21%22%2C%22t%22%3A%5B%22%5C%22%C7%82%C7%824c6f6e646f6e%5C%22%22%5D%2C%22o%22%3A%22OR%22%2C%22k%22%3Afalse%2C%22m%22%3A%7B%22%5C%22%C7%82%C7%824c6f6e646f6e%5C%22%22%3A%22London%22%7D%7D%5D%7D

The provided scripts support filtering on:

  • a search term
  • multiple refiners
  • multiple values for a refiner, or
  • any combination of the above

It would be worth reading the intro of my earlier article to get a better understanding of what is happening in the snippets provided in this post.

Default Enterprise Search Centre

OF NOTE:

  • As the most common usage will surely be to produce search result page URLs that are refined on a single value, I have written an ‘overload’ function that simplifies calling the method in this scenario
  • The ‘search page URL’ can be provided to the functions in a number of ways including:
    • “/search” : a server-relative URL to the web; the default page for that web will be used. In the case of an Enterprise Search Centre this will be the ‘Everything’ search results page
    • “/search/Pages/peopleresults.aspx” : a server-relative URL to a specific results page
    • Use an absolute URL if you are out of the context of the SharePoint Online tenant in which the search page resides. This will be true for provider hosted add-ins (apps)
    • If you are writing your own refiner, then pass an empty string and set window.location.hash to the result of the function
  • This script has no dependencies on other libraries (jQuery, SP.js, etc)
  • The hex encoded string must be UTF-8 encoded, whereas JavaScript strings are natively UTF-16. The particular scenario where this raised an issue for me was the full-width ampersand character, which is often used instead of a standard ampersand as it is XML friendly. ‘unescape’ returns a UTF-8 encoded string and is used to force the required encoding. Thanks to ecmanaut for this solution
  • I took inspiration for the stringToHex method from a post by pussard

The functions:

var getPreRefinedSearchPageUrl = function (searchPageUrl, searchTerm, managedPropertyName, managedPropertyValue) {
  return getComplexPreRefinedSearchPageUrl({
    searchPageUrl: searchPageUrl,
    searchTerm: searchTerm,
    refiners: [
      {
        managedPropertyName: managedPropertyName,
        managedPropertyValues: [
          managedPropertyValue
        ]
      }
    ]
  });
};

// input:
// {
//   searchPageUrl: "/search/Pages/results.aspx",
//   searchTerm: "",
//   refiners: [
//     {
//       managedPropertyName: "RefinableString08",
//       managedPropertyValues: [
//         "Human Resources"
//       ]
//     }
//   ]
// }
var getComplexPreRefinedSearchPageUrl = function (data) {
  var searchObj = {
    "k": data.searchTerm,
    "r": []
  };
  for (var i = 0; i < data.refiners.length; i++) {
    var refiner = data.refiners[i];
    var searchObjRefiner = {
      "n": refiner.managedPropertyName,
      "t": [],
      "o": "OR",
      "k": false,
      "m": {}
    };
    for (var j = 0; j < refiner.managedPropertyValues.length; j++) {
      var refinerValue = refiner.managedPropertyValues[j];
      // Force UTF8 encoding to handle special characters, specifically full-width ampersand
      var managedPropertyValueUTF8 = unescape(encodeURIComponent(refinerValue)); 
      var managedPropertyValueHex = stringToHex(managedPropertyValueUTF8);
      var managedPropertyValueHexToken = "\"ǂǂ" + managedPropertyValueHex + "\"";
      searchObjRefiner.t.push(managedPropertyValueHexToken);
      searchObjRefiner.m[managedPropertyValueHexToken] = refinerValue;
    }
    // push once per refiner, after all of its values have been added
    searchObj.r.push(searchObjRefiner);
  }
  var searchObjString = JSON.stringify(searchObj);
  var searchObjEncodedString = encodeURIComponent(searchObjString);
  var url = data.searchPageUrl + "#Default=" + searchObjEncodedString;
  return url;
};

var stringToHex = function (tmp) {
  // convert each character code of the string to its hex representation
  var d2h = function (d) {
    return d.toString(16);
  };
  var str = '',
    i = 0,
    tmp_len = tmp.length,
    c;
  for (; i < tmp_len; i += 1) {
    c = tmp.charCodeAt(i);
    str += d2h(c);
  }
  return str;
};

These are examples of how to call the functions that are defined above.

var complexUrl = getComplexPreRefinedSearchPageUrl({
  searchPageUrl: "/search/Pages/results.aspx",
  searchTerm: "article",
  refiners: [
    {
      managedPropertyName: "RefinableString20",
      managedPropertyValues: [
        "Build", "Land"
      ]
    },
    {
      managedPropertyName: "RefinableString21",
      managedPropertyValues: [
        "London"
      ]
    }
  ]
});
var basicUrl = getPreRefinedSearchPageUrl("/search/Pages/results.aspx", "", "RefinableString20", "Build");

Paul.