Ignite 2017 Updates Webinar

I recently joined Jeremy Thake of Hyperfish and my colleague David Bowman to host a webinar focussed mainly on some of our favourite announcements to come out of Ignite 2017. I have included a list of my recommended viewing from the conference at the end of this post.

The webinar was aimed primarily at a non-technical audience, so it doesn’t get deep into many particularly technical topics, but it does give a good overview of some of the more important announcements of Ignite 2017, in my humble opinion.

Ignite 2017 banner image

You can view the webinar here: Harness the power of Office 365

I discuss:

  • Required information fields
  • Files that need attention view
  • Bulk update of file metadata
  • Metadata prompt on file upload
  • Filter panel updates
  • Improving support for large lists and libraries
  • Column formatters
  • Hub sites
  • New Yammer web part
  • Custom modern themes
  • Site designs
  • Multi-geo
  • Groupify – create Office 365 groups from existing team sites
  • Group naming policies and management
  • Microsoft Teams Admin Centre
  • Microsoft Teams data storage and information protection

Ignite 2017: Recommended viewing:

Hope this helps you get up to date in record time!

Paul.

A history of extensibility in SharePoint and how to choose a model

I recently wrote a post for my employer about the recent history of SharePoint extensibility models. It also touches on how we as a company settled on the model with which we are currently delivering our Intranet/Digital Workplace solution. I discuss the Feature Framework, Farm and Sandboxed Solutions, the SharePoint Add-in Model, the SharePoint Framework, Remote Provisioning, and more.

Check it out here: A history of extensibility in SharePoint and how Fresh chose a model

SharePoint extensibility model, Generic image

Office 365 CDN – Some Notes and Sample Scripts

The Office 365 CDN (Content Delivery Network) may be activated to host SharePoint Online files in a more globally accessible manner. The general premise is that static assets can be served to users from a location closer to them than the data centre in which the Office 365 tenant is located.

I won’t go into the real benefits of this beyond saying that my limited testing at this point leads me to believe that the performance impact of using a CDN will be negligible for the vast majority of users/organisations. This is because the volume of data which can be served via the CDN is not a significant proportion of the data impacting page load speed.

Regardless, the documentation around how to get started with the Office 365 CDN is decent. A good place to start is this link.

Private CDN with auto-rewrite. Image credit to Microsoft (https://dev.office.com/blogs/general-availability-of-office-365-cdn)

A few gotchas I’ve noticed

  • Fetching an image rendition using the width query string parameter does NOT correctly return the image rendition as configured. It simply scales the image to the specified width (i.e. no cropping or positioning is performed).
  • If all users are located in the same region as the Office 365 tenant, turning on the CDN may reduce performance due to CDN priming (replication of files to the CDN) and will complicate updates to files which are replicated (e.g. JavaScript in the Style Library).
  • Search web parts must be configured with ‘Loading Behaviour’ set to ‘Sync option: Issue query from the server’ in order for the auto-rewrite of CDN-hosted files to occur. This is true for display templates as well as for the value of the PublishingImage managed property.

Office 365 CDN PowerShell Samples

I’ve got some sample PowerShell below showing how to activate the Office 365 CDN (there are private and public CDNs; you can use either or both) and associate origins with it (an origin is a document library which will be replicated to the CDN).

I’ve also got a simple sample of how to remove all origins, as there is not a single cmdlet for this. It is worth noting that although an enabled CDN with no origins is functionally identical to a disabled CDN (i.e. no files are being replicated), they are not the same from a configuration perspective.
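
If you want to check the current state of the CDN configuration at any point, the standard cmdlets will report it (this assumes an existing Connect-SPOService session; swap Private for Public as needed):

# Inspect current CDN state, registered origins, and policies
Get-SPOTenantCdnEnabled -CdnType Private
Get-SPOTenantCdnOrigins -CdnType Private
Get-SPOTenantCdnPolicies -CdnType Private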

Please note that these are just sample scripts and have not been parameterised as you may require.

Calling the PowerShell functions:

$cdnType = "Private" # Private or Public
$serverRelSiteCollectionUrl = "/sites/mysite" # site collection URL or * for all site collections

Authenticate-PowerShell

Set-CdnConfiguration $serverRelSiteCollectionUrl $cdnType

#Remove-CdnConfiguration $cdnType # This removes all origins but the CDN is still enabled
#Set-SPOTenantCdnEnabled -CdnType $cdnType -Enable $false # This disables the CDN

The PowerShell functions:

Function Authenticate-PowerShell() {
	[string]$tenantUrl = "https://TENANT-admin.sharepoint.com"
	[string]$adminUsername = "USER@TENANT.onmicrosoft.com"
	[string]$adminPassword = "PASSWORD"

	# Ensure module is loaded
	if ((Get-Module Microsoft.Online.SharePoint.PowerShell).Count -eq 0) {
		Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
	}

	$secureAdminPassword = $(convertto-securestring $adminPassword -asplaintext -force)
	$cred = New-Object -TypeName System.Management.Automation.PSCredential -argumentlist $adminUsername, $secureAdminPassword
	Connect-SPOService -Url $tenantUrl -credential $cred
}

Function Set-CdnConfiguration($serverRelSiteCollectionUrl, $cdnType){
    #LogWaiting "Configuring CDN"
    #LogInfo ""

    $fileTypes = "GIF,ICO,JPEG,JPG,JS,PNG,CSS,EOT,SVG,TTF,WOFF"
    $cdnOrigins = @(
        "$serverRelSiteCollectionUrl/_catalogs/masterpage", 
        "$serverRelSiteCollectionUrl/style library",  
        "$serverRelSiteCollectionUrl/sitecollectionimages",
        "$serverRelSiteCollectionUrl/publishingimages"#,
        #"$serverRelSiteCollectionUrl/news/publishingimages"
    )

    # Enable cdn WITHOUT default origins
    $suppressOutput = Set-SPOTenantCdnEnabled -CdnType $cdnType -Enable $true -NoDefaultOrigins -Confirm:$false

    # Configure cdn origins (incl ensure default origins)
    Ensure-CdnOrigin $cdnType $cdnOrigins

    # Extend list of file types
    $suppressOutput = Set-SPOTenantCdnPolicy -CdnType $cdnType -PolicyType IncludeFileExtensions -PolicyValue $fileTypes

    #LogSuccess "done"

    # Print status
    Get-SPOTenantCdnOrigins -CdnType $cdnType
}

Function Ensure-CdnOrigin($cdnType, $originUrls){
  $originUrls | ForEach {
    $oUrl = $_
    try {
      #LogWaiting "Adding origin: $oUrl"
      # -ErrorAction Stop makes failures terminating errors so the catch block below can handle them
      $suppressOutput = Add-SPOTenantCdnOrigin -CdnType $cdnType -OriginUrl $oUrl -Confirm:$false -ErrorAction Stop
    }
    catch {
      if($Error[0].Exception.ToString().Contains("The library is already registered as a CDN origin")) {
        # already present, do nothing
      }
      else {
        #LogError $Error[0]
        Exit
      }
    }
    #LogSuccess "done"
  }
}

Function Remove-CdnConfiguration($cdnType){
	Get-SPOTenantCdnOrigins -CdnType $cdnType | ForEach {
		$suppressOutput = Remove-SPOTenantCdnOrigin -CdnType $cdnType -OriginUrl $_ -Confirm:$false
	}
}

Paul.

Executing JS in an RSS Viewer web part in SharePoint

Back in SharePoint 2010 the RSS Viewer web part (a.k.a. RSS Aggregator web part, RSSAggregatorWebPart) supported the use of XSL templates which contained script tags. There were a few funny things you had to do in order to get it working, but it did work. Come SharePoint Online, it is not possible to add a script tag to your XSL templates (I believe this ‘issue’ exists in SharePoint 2013 as well). More accurately, script tags can be added and will be rendered to the page, but the script will not be executed.

In this day and age the correct answer is very likely “why are you using this web part?” or “XSL, are you mad?”. Both very strong arguments, but regardless, for the record, you can code (hack) your way around this issue.

Use the onerror handler of an image tag to execute your JS. Ensure that the img tag is rendered as the last element in your template if you plan on using the JS to modify the DOM.

<img style='display:none' src='#' onerror='console.log("boo!")' />
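
For example, to post-process the rendered items once they are in the DOM, the end of the template could look something like the following sketch (the rss-item class is an assumption – substitute whatever markup your XSL template actually emits):

<div class='rss-item'>...item markup rendered by the template...</div>
<img style='display:none' src='#' onerror='(function(){
  // runs after the items above exist in the DOM
  var items = document.querySelectorAll(".rss-item");
  for (var i = 0; i < items.length; i++) { items[i].className += " enhanced"; }
})();' />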

If you are using SharePoint 2010 check this out:
https://sharepoint.stackexchange.com/questions/9720/calling-javascript-within-xsl-after-transformation-in-rss-web-part

Paul.

OAuth Implicit Flow in JS without ADAL.js

This post provides a lightweight implementation of the OAuth implicit flow grant for obtaining an access token. Implicit flow is appropriate when the current user is authenticated to a common identity provider (e.g. Azure Active Directory a.k.a AAD) and the client (the environment requesting the token) is not secure. A great example of this is making a call to the Microsoft Graph from a page in SharePoint Online using only JavaScript.

The ADAL.js library exists as an authentication solution specifically for working against AAD as the identity provider. Unfortunately, it is currently not well maintained and is overcomplicated. EDIT: ADAL.js has been updated multiple times since this post was first written and I would now recommend using it. From a user experience perspective, the implementation discussed in this post avoids the need to redirect in order to authenticate – it happens seamlessly in the background via a hidden iframe.

Azure Active Directory
  • A great article on the OAuth grants, agnostic of implementation, can be found here.
  • Thanks to my colleague Paul Lawrence for writing the first iteration of this code.
  • This code has a dependency on jQuery, mostly just for promises. I know, old school. I expect I’ll write an es6/2016 version of this soon enough but it shouldn’t be a challenge to convert this code yourself.
  • As I know I’ll get comments about it if I don’t mention it: this code doesn’t send and verify a state token as part of the grant flow. This is optional as far as the OAuth specification is concerned, but it should be done as an additional security measure – a sketch of what this could look like follows this list.
  • Although I’m a Microsoft stack developer and have only tested this with AAD as the identity provider, I believe it should work for any identity provider that adheres to the OAuth specification for authentication. You would need to play around with the authorisation server URL, as login.microsoftonline.com is specifically for authenticating to AAD. I’d love feedback on this.
  • By definition, the OAuth implicit flow grant does not return a refresh token. Furthermore, the access token has a short lifetime (an hour, I believe) and credentials must be re-entered before additional access tokens can be obtained via the implicit flow grant. The code provided in this post handles this by returning a URL which can be used to re-authenticate when a request fails. This URL can be placed behind a link, or redirection could be forced to occur automatically.
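
On the state token point, here is a rough sketch of what adding one might involve. This is not part of the library below; getAuthorizeUri and getQueryStringParameterByName are the helper functions defined in it, and the sessionStorage key is arbitrary:

// Inside acquirePassiveToken, before building the authorize URL:
// generate and stash a random state token, then append it to the URL
var expectedState = Math.random().toString(36).substring(2);
window.sessionStorage.setItem("candc_adal_state", expectedState);
var authUri = getAuthorizeUri(params, params.redirectUrl) + "&state=" + encodeURIComponent(expectedState);

// Inside parseAccessTokenFromIframe, after reading frameHref:
// verify the returned state before trusting the token
var returnedState = getQueryStringParameterByName("state", frameHref);
if (returnedState !== window.sessionStorage.getItem("candc_adal_state")) {
    // possible CSRF - discard the response
    deferred.reject("State token mismatch.");
    return;
}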

The following code snippet is an example of using this implicit flow library to call into the Microsoft Graph from within the context of a SharePoint Online page.
You will need to provide the app ID of your own AAD app. And don’t forget that you need to enable implicit flow via the app manifest (the oauth2AllowImplicitFlow property) and associate the correct delegated permissions.
This code should work not only against the Microsoft Graph but also against SharePoint Online endpoints and other AAD-secured resources, such as Azure services or your own AAD-secured and CORS-enabled web API.
[See the note above about identity providers other than AAD.]

var aadAppClientId = "8BE5AA0E-F900-4BDF-A7CF-71B3CC53B78E";
var resource = "https://graph.microsoft.com";
var query = "/v1.0/me/events";
var tokenFactory = new CC.CORE.Adal.AppTokenFactory(aadAppClientId, resource);
tokenFactory.ExecuteQuery(query)
.done(function (response) {
	// Success!
})
.fail(function (response) {
	// NOTE: Provide a link to renew an expired or yet to be approved session:
	// "Sorry, your session has expired or requires your approval. 
	// <div><a href='" + response.authorizeUrl + "'>Click here to sign in</a></div>";
});

Here is the implicit flow library code itself.

var CC = CC || {};
CC.CORE = CC.CORE || {};

CC.CORE.Log = function (errMsg) {
    // console.log is undefined in IE10 and earlier unless in debug mode, so must check for it
    if (typeof window.console === "object" && typeof console.log === "function") {
        console.log(errMsg);
    }
};

CC.CORE.Adal = (function () {
    "use strict";

    var appTokenFactory = function (aadAppClientId, resource) {
        // redirectUrl is the URL which the iframe will redirect to once auth occurs.
        // we use blank.gif as it is a very low payload
        var redirectUrl = _spPageContextInfo.webAbsoluteUrl + "/_layouts/images/blank.gif";

        // NOTE on security: include the userId in the cache key to prevent the case where a user logs out but
        // leaves the tab open and a new user logs in on the same tab. The first user's calendar
        // would be returned if we didn't associate the cache key with the current user.
        var cacheKey = "candc_cache_adal_" + _spPageContextInfo.userId + "_" + aadAppClientId + "_" + resource;

        this.params = {
            clientId: aadAppClientId,
            redirectUrl: redirectUrl,
            resource: resource,
            cacheKey: cacheKey
        };

        var getAuthorizeUri = function (params, redirectUrl) {
            var authUri = "https://login.microsoftonline.com/common/oauth2/authorize" +
                            "?client_id=" + params.clientId +
                            "&response_type=token" +
                            "&redirect_uri=" + encodeURIComponent(redirectUrl) +
                            "&resource=" + encodeURIComponent(params.resource);
            return authUri;
        };

		var getQueryStringParameterByName = function (name, url) {
			name = name.replace(/[\[\]]/g, "\\$&");
			var regex = new RegExp("[?&#]" + name + "(=([^&#]*)|&|#|$)");
			var results = regex.exec(url);
			if (!results) return null;
			if (!results[2]) return '';
			return decodeURIComponent(results[2].replace(/\+/g, " "));
		};
		
        // create iframe, set its href, set listener for when loaded
        // to parse the query string. Deferred returns upon parse of query string in iframe.
        var acquirePassiveToken = function (params) {
            var deferred = jQuery.Deferred();

            // create iframe and inject into dom
            var iframe = jQuery("<iframe />").attr({
                width: 1,
                height: 1,
                src: getAuthorizeUri(params, params.redirectUrl)
            });
            jQuery(document.body).append(iframe);

            // bind event handler to iframe for parse query string on load
            iframe.on("load", function (iframeData) {
                parseAccessTokenFromIframe(iframeData, deferred);
            });

            return deferred.promise();
        };

        // handle iframe once it has loaded
        var parseAccessTokenFromIframe = function (iframeData, deferred) {
            // read the iframe href
            var frameHref = "";
            try {
                // this will throw a cross-domain error for any issue other than success
                // as the iframe will display the error on the login.microsoft domain
                frameHref = iframeData.currentTarget.contentWindow.location.href;
            }
            catch (error) {
                deferred.reject(error);
                return;
            }

            // parse iframe query string parameters
            var accessToken = getQueryStringParameterByName("access_token", frameHref);
            var expiresInSeconds = getQueryStringParameterByName("expires_in", frameHref);

            // delete the iframe, and event handler.
            var iframe = jQuery(iframeData.currentTarget);
            iframe.remove();

            // resolve promise
            deferred.resolve({
                accessToken: accessToken,
                expiresInSeconds: expiresInSeconds
            });
        };

        // get the most recent token from the cache, or if not available,
        // fetch a new token via iframe
        var getToken = function (params) {
            var deferred = jQuery.Deferred();
            
            // check for cached token
            var tokenFromCache = CC.CORE.Cache.Get(params.cacheKey);
            if (!tokenFromCache) {
                // fetch token via iframe
                acquirePassiveToken(params)
                .done(function (tokenFromIframe) {
                    CC.CORE.Log("ADAL: Fetched token from iframe.");
                    // expire cache a minute before token expires to be safe
                    var cacheTimeout = (tokenFromIframe.expiresInSeconds - 60) * 1000;
                    CC.CORE.Cache.Set(params.cacheKey, tokenFromIframe, cacheTimeout);
                    // resolve the promise
                    deferred.resolve(tokenFromIframe);
                })
                .fail(function (error) {
                    // Logs when rejection is caught
                    deferred.reject(error);
                });
            }
            else {
                CC.CORE.Log("ADAL: Fetched token from cache.");
                // resolve the promise
                deferred.resolve(tokenFromCache);
            }
            return deferred.promise();
        };

        this.ExecuteQuery = function (query, additionalHeaders) {
            var deferred = jQuery.Deferred();
            var params = this.params;
            // get token from cache or via iframe
            getToken(params)
            .done(function (token) {
                // submit request with token in header
                var ajaxHeaders = {
                    'Authorization': 'Bearer ' + token.accessToken
                };
                if (typeof additionalHeaders === "object") {
                    jQuery.extend(ajaxHeaders, additionalHeaders);
                }
                jQuery.ajax({
                    type: "GET",
                    url: params.resource + query,
                    headers: ajaxHeaders
                }).done(function (response) {
                    deferred.resolve(response);
                }).fail(function (error) {
                    deferred.reject({
                        error: error
                    });
                });
            })
            .fail(function (error) {
                CC.CORE.Log('ADAL error occurred: ' + error);
                deferred.reject({
                    error: error,
                    authorizeUrl: getAuthorizeUri(params, window.location.href)
                });
            });
            return deferred.promise();
        };
    };

    return {
        AppTokenFactory: appTokenFactory
    };

})();

And here is the definition of the cache functions used above. Nothing special here; this could be swapped out for any cache implementation, or removed altogether if caching is truly unnecessary or a security concern.

var CC = CC || {};
CC.CORE = CC.CORE || {};

CC.CORE.Cache = (function () {
    var defaultCacheExpiry = 15 * 60 * 1000; // default is 15 minutes
	var aMinuteInMs = (1000 * 60);
	var anHourInMs = aMinuteInMs * 60;
	
    var getCacheObject = function () {
        // Using session storage rather than local storage as caching benefit
        // is minimal so would rather have an easy way to reset it.
        return window.sessionStorage;
    };

    var isSupportStorage = function () {
        var cacheObj = getCacheObject();
        var supportsStorage = cacheObj && JSON && typeof JSON.parse === "function" && typeof JSON.stringify === "function";
        if (supportsStorage) {
            // Check for dodgy behaviour from iOS Safari in private browsing mode
            try {
                var testKey = "candc-cache-isSupportStorage-testKey";
                cacheObj[testKey] = "1";
                cacheObj.removeItem(testKey);
                return true;
            }
            catch (ex) {
                // Private browsing mode in iOS Safari, or possible full cache
            }
        }
        CC.CORE.Log("Browser does not support caching");
        return false;
    };

    var getExpiryKey = function (key) {
        return key + "_expiry";
    };

    var isCacheExpired = function (key) {
        var cacheExpiryString = getCacheObject()[getExpiryKey(key)];
        if (typeof cacheExpiryString === "string" && cacheExpiryString.length > 0) {
            var cacheExpiryInt = parseInt(cacheExpiryString, 10);
            if (cacheExpiryInt > (new Date()).getTime()) {
                return false;
            }
        }
        return true;
    };

    var get = function (key) {
        if (isSupportStorage()) {
            if (!isCacheExpired(key)) {
                var valueString = getCacheObject()[key];
                if (typeof valueString === "string") {
                    CC.CORE.Log("Got from cache at key: " + key);
                    if (valueString.indexOf("{") === 0 || valueString.indexOf("[") === 0) {
                        var valueObj = JSON.parse(valueString);
                        return valueObj;
                    }
                    else {
                        return valueString;
                    }
                }
            }
            else {
                // remove expired entries?
                // not required as we will almost always be refreshing the cache
                // at this time
            }
        }
        return null;
    };

    var set = function (key, valueObj, validityPeriodMs) {
        var didSetInCache = false;
        if (isSupportStorage()) {
            // Get value as a string
            var cacheValue = undefined;
            if (valueObj === null || valueObj === undefined) {
                cacheValue = null;
            }
            else if (typeof valueObj === "object") {
                cacheValue = JSON.stringify(valueObj);
            }
            else if (typeof valueObj.toString === "function") {
                cacheValue = valueObj.toString();
            }
            else {
                alert("Cannot cache type: " + typeof valueObj);
            }

            // Cache value if it is valid
            if (cacheValue !== undefined) {
                // Cache value
                getCacheObject()[key] = cacheValue;
                // Ensure valid expiry period
                if (typeof validityPeriodMs !== "number" || validityPeriodMs < 1) {
                    validityPeriodMs = defaultCacheExpiry;
                }
                // Cache expiry
                getCacheObject()[getExpiryKey(key)] = ((new Date()).getTime() + validityPeriodMs).toString();
                CC.CORE.Log("Set in cache at key: " + key);
                didSetInCache = true;
            }
        }
        return didSetInCache;
    };

    var clear = function (key) {
        var cache = getCacheObject();
        cache.removeItem(key);
        cache.removeItem(getExpiryKey(key));
    };

    return {
        Get: get,
        Set: set,
        Clear: clear,
        IsSupportStorage: isSupportStorage,
        Timeout: {
            VeryShort: (aMinuteInMs * 1),
            Default: (anHourInMs * 2),
            VeryLong: (anHourInMs * 72)
        }
    };
})();
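
For completeness, using the cache directly looks like this (the key and value here are arbitrary examples):

// Cache an object for one minute, read it back, then remove it
CC.CORE.Cache.Set("example_key", { name: "example" }, CC.CORE.Cache.Timeout.VeryShort);
var cached = CC.CORE.Cache.Get("example_key"); // returns the object, or null once expired
CC.CORE.Cache.Clear("example_key");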

I welcome your comments, especially from anyone who gives this a go outside of Office 365 and the Microsoft stack.

Paul.

tslint, VS Code, and the SharePoint Framework

The new SharePoint Framework (SPFx) is currently in developer preview. In order to really get into it and start making great new web parts, a developer needs to get a handle on TypeScript. The initial preview iteration of SPFx shipped with very strict linting rules (tslint) and forced (in)experienced developers to follow many best practices regarding not just TypeScript but es6/es2015 conventions as well. This was done by reporting linting errors as build failures as part of the Gulp build chain.

Later drops of the SharePoint Framework have relaxed these linting rules, but it is still less than ideal to be prompted about these issues only at transpile/compile-time. The set of linting rules used in the build process is defined in a tslint.json file within the root config folder.

The tslint file that is provided by the SPFx generator

When it comes to developing SPFx web parts I have found Visual Studio Code to be great: it is lightweight, has an integrated terminal, has GitHub support, and has extensions – notably a nice tslint extension. Unfortunately this extension does not support the JSON format nor all of the rules specified in the tslint file provided by the SPFx generator.

The TSLint VS Code extension

So here it is, my SPFx tslint file for use in VS Code. Just drop this file in the root of your src directory.

Add this tslint file to the root of the src folder

The following file is based on a core set of rules from SPFx Drop 2 with the incompatible rules removed, and I’ve taken some liberty by adding my own preferred rules. Of course you can change these as you need; a list of the rules which the extension supports can be found here. I have also included an ‘extended’ version of the tslint file that is provided in the config folder further below.

tslint.json in the src folder:
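
As an indication of the shape of the file, here is a minimal illustrative version (the rule selection below is just a small sample of rules the extension supports – use your own preferred set):

{
  "rules": {
    "class-name": true,
    "no-duplicate-variable": true,
    "no-eval": true,
    "no-unused-expression": true,
    "semicolon": true,
    "triple-equals": true
  }
}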

tslint.json in the config folder:

Paul.

User photos in Office 365

The user photo story in Office 365 is not so straightforward. Photos are stored in Active Directory (AD) on-premises, Azure Active Directory (AAD), Exchange Online (EXO), SharePoint Online (SPO), and at first glance possibly elsewhere as well (where does my Delve profile picture live? What about my Skype for Business (SfB) avatar?).

I have put together a flow diagram to represent how this actually works. It aims to demonstrate where user photos are stored and where different applications fetch user photos from (if they don’t store the images), and leads to some recommendations about user photo synchronisation.

Please note the date of this article (August 2016) and be conscious that Office 365 is changing rapidly; the following recommendations may have changed (e.g. prior to the Delve user profile page, the SharePoint user profile page referenced images stored in SharePoint rather than Exchange – changes such as these will continue to occur).

User photos: the diagram

User photos flow in Office 365

Where applications store and fetch user photos

On-premises AD DS, in the thumbnailPhoto attribute
  • Comments: 100Kb maximum.
  • Size: recommended to be 96×96 or 48×48.
  • Is source? Yes.

Azure AD, in the thumbnailPhoto attribute
  • Comments: 100Kb maximum. Usually synced from AD DS via Azure AD Connect.
  • Size: recommended to be 96×96 or 48×48.
  • Is source? No. Synced from AD.

Exchange Online, as a property of the mailbox
  • Comments: 500Kb maximum. Provided manually by users, or a bulk import can be scripted if source photos can be located and named appropriately. If not provided, Exchange will reference the AAD thumbnailPhoto in some instances, but only if the thumbnailPhoto is less than 10Kb. Does not sync back to AD.
  • Size: recommended to be 648×648.
  • Is source? Yes.

SharePoint Online ‘User Photos’ library
  • Comments: Three renditions of the EXO photo are automatically created in SharePoint after upload to EXO. It generally takes up to 72 hours to see changes to the EXO photo here; sometimes we see that a user must ‘touch’ their profile before the sync will be performed. NOTE: Updating the user profile photo via the Delve profile actually updates the EXO profile photo and does not perform any action directly in SharePoint Online.
  • Size: Small is 48×48, Medium is 72×72, Large changes depending on the source image but is always square (I have seen as small as 120×120 and as large as 300×300; the PnP image upload solution uploads these as 200×200).
  • Is source? No. Synced from EXO.

Skype for Business
  • Comments: Does not store any images. Uses the high resolution Exchange image if available, otherwise uses the AD thumbnailPhoto.
  • Size: the EXO image or the AD thumbnailPhoto.
  • Is source? No. Read from EXO.

Delve user profile
  • Comments: Does not store any images. Uses the high resolution Exchange image if available, otherwise uses the AD thumbnailPhoto.
  • Size: the EXO image or the AD thumbnailPhoto.
  • Is source? No. Read from EXO.

Yammer
  • Comments: Also stores its own photo. Out of scope of this discussion for now.
  • Is source? Yes.

Likely issues and resolutions

Issue: The Exchange Online user photo is low quality (and in turn so are the SPO photo and SfB photo).
Resolution: The source image coming from AD was/is low quality. EXO user photos can be updated by users individually, or if high res source photos are available the import can be scripted. Source images should be jpg of 648×648 (resizing and compression can also be scripted).

Issue: The Exchange Online user photo is high quality but the SfB photo is low quality.
Resolution: High resolution photos from Exchange will be used as long as both Exchange and SfB/Lync are of new enough versions (2013 or greater) and SfB is configured to allow all photos (not just those from AD). NB: if a user doesn’t have a mailbox (e.g. not licenced) then they will be displayed using the AD photo.

Issue: There is no Exchange Online user photo (and in turn there is no SPO photo or SfB photo).
Resolution: A photo has not been imported to the user’s EXO mailbox, and the AAD thumbnailPhoto either doesn’t contain an image or that image is greater than 10Kb. Import of photos up to 500Kb to the EXO mailbox can be scripted (the source images could be on a file share, or in AAD).

Issue: Changes to user photos are reflected quickly in Exchange and Skype but take days to replicate to SPO.
Resolution: Exchange to SPO synchronisation is a periodic process and can take up to 72 hours. A custom solution can perform this replication on demand (e.g. at the same time EXO user photos are set).

Issue: User photos changed in other systems which update AD are not reflected in EXO, SPO, or SfB (e.g. a user in an on-premises SharePoint farm updates their user photo).
Resolution: When AD is updated, it is synchronised with AAD, but that is as far as it gets, as the “sync” from AAD to EXO is a one-off import rather than a true sync. It is unlikely to be desirable to create a custom sync relationship here, as users will want to be able to update EXO directly and won’t want their photos overwritten.

Issue: User photos updated in EXO aren’t replicated to other systems which share an AD (e.g. an on-premises SharePoint farm).
Resolution: The user photo in EXO is not synced back to AD – it can’t be consistently, as the AD thumbnailPhoto attribute only supports photos up to 100Kb where EXO supports larger images. There is potential for a custom solution to sync images back to AD after having resized/compressed them to under 100Kb – however, the general recommendation is that the optimal AD thumbnailPhoto is 10Kb and 96×96.

Recommendations

Use Exchange user photos as the master. Allow users to update their user photos but pre-populate their user photo if possible and before end users are provided any access to the system.

If high resolution photos are available, script the import of them (648×648) to Exchange Online (see the Set-UserPhoto cmdlet and the sample script below). These will then be visible in Exchange, in Skype and, once processed, in SharePoint Online. In a dispersed environment this may have to be managed by many teams rather than trying to compile a single list of all user photos.

Users may then update their user profile photo directly via Outlook or indirectly via their Delve profile.

If synchronisation back to AD is required in order to serve other applications (e.g. an on-premises SharePoint farm) then a custom solution could provide synchronisation from EXO to AD, but this process should compress and shrink images, as the recommended size of thumbnailPhoto images is only 96×96 and 10Kb.

Sample usage of the Set-UserPhoto cmdlet
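
A minimal sketch (this assumes an Exchange Online remote PowerShell session is already established; the paths and addresses are placeholders):

# Set a single user's photo from a local jpg
$photoBytes = [System.IO.File]::ReadAllBytes("C:\photos\jane.doe.jpg")
Set-UserPhoto -Identity "jane.doe@contoso.com" -PictureData $photoBytes -Confirm:$false

# Bulk import: one photo per user, files named by alias
Get-ChildItem "C:\photos\*.jpg" | ForEach {
	Set-UserPhoto -Identity ($_.BaseName + "@contoso.com") -PictureData ([System.IO.File]::ReadAllBytes($_.FullName)) -Confirm:$false
}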

Paul.


Azure AD app wildcard Reply URL

Azure AD apps (a.k.a Azure Active Directory apps, a.k.a AAD apps) are an essential component when interacting with Office 365 data outside of SharePoint – Mail, Calendar, Groups, etc.

As an O365 developer I have found myself writing JavaScript code against AAD apps (using ADAL.js) and often, especially during development, found myself entering a long list of Reply URLs. Reply URLs must be specified for any location from which authentication to AAD occurs. From a practical standpoint this results in someone (an Azure administrator) having to update the list of Reply URLs every time a web part is inserted into a page or a new site is provisioned which relies on an Azure AD app.

If this is not done, the user is redirected to an Azure login failure with ‘The reply address … does not match the reply addresses configured for the application’.

Error when the Reply URL is not correctly specified

Perhaps the following is documented elsewhere but I have not come across it – a Reply URL can be specified using wildcards!

Using a wildcard Reply URL when configuring an AAD app

Probably the most common use for this is to end a Reply URL with an asterisk (wildcard), which will permit any URL beginning with the characters that precede it.

e.g. https://tenant.sharepoint.com/*
This example would support any URL coming from any page in SharePoint Online from within the named tenant.

It is also possible to use the wildcard character elsewhere in the Reply URL string.
e.g. https://*.sharepoint.com/*
This example would support any URL coming from any page in SharePoint Online from within *any* tenant.

Armed with this knowledge, be responsible and limit strictly how it is utilised. The implementation of Reply URL is a security feature and it is important that only trusted locations are allowed to interact with your app. I recommend only using wildcard Reply URLs in development environments.

Paul.

Creating dynamic links to Delve pages

Delve, as part of the Office 365 suite, provides a number of useful pages for finding content or people that are trending around you or that you recently interacted with. Often, as a Developer, these pages are the perfect target for “See More” links as part of customisations written using the Office Graph. Or perhaps as an administrator you would like to configure a promoted link on a team site home page to navigate to a user’s ‘Your Recent Documents’ page in Delve, for example.

Delve recent document page
The Delve Recent Documents page. Note that the URL contains the user’s AAD object ID.

Delve Links – a minor problem

When you visit pages that show content relevant to a specific user (such as Your Recent Documents or the Recent Documents page for another user), the URL of that page contains a query string variable ‘u’ whose value is the Azure Active Directory (AAD) object ID of the user. Azure Active Directory is the identity provider that backs Office 365 and is out of the scope of this post. If this parameter is not provided then Delve falls back to the Delve homepage. I would have preferred it to just use the current user when the parameter is not present, but no, this is how it works.

So, you could fetch the AAD object ID by calling into the User Profile Service (see Vardhaman Deshpande’s post), however this is not necessary.

Delve Links – an easy resolution

The ‘u’ query string parameter can be substituted with the ‘p’ query string parameter, where the value of ‘p’ is the user’s account name – the email address with which they log in.

This value is present on any SharePoint 2013+ page via the JavaScript variable: _spPageContextInfo.userLoginName
This can be utilised as follows:

var mySiteHostUrl = "https://<tenant>-my.sharepoint.com"; // replace <tenant> with your tenant name
var pageKey = "liveprofilemodified"; // liveprofilemodified='Recent Documents', liveprofileworkingwith='People page'
var delveUrl = mySiteHostUrl + "/_layouts/15/me.aspx" + "?v=" + pageKey + "&p=" + _spPageContextInfo.userLoginName;

Delve Links – side note

This value is present as the Office Graph property: AccountName
The AAD object ID is present as the Office Graph property: AadObjectId

Paul.

Convert an existing plain text note field/column to rich text

If you create a SharePoint site column (a note field in this case), associate it with a site content type, and then associate that content type with a list in a sub site, the site column will be available in that library. Obviously, right?

However, when you update the site column (and push all changes to lists and libraries) not *all* of the changes you make are in fact pushed down. An example of this is the setting that dictates whether a note field should allow rich text or enforce plain text. If you change this setting at the site column level it will *not* propagate to libraries which already exist. New instances of the column (say, if you associated the content type with a list for the first time) will be configured correctly, but existing list-level instances are not updated. NOTE: This is only true for properties specific to the particular column type; common properties such as ‘required’ will be pushed down to existing instances of the column at the list level.

Configuring a SharePoint note field to support rich text

So you want to change a list-level instance of a plain text note column to a rich text note column (or vice versa, or otherwise change column-specific properties of another field type)? You need to do it for every list where the column is in use. That would be very tedious to do via the SharePoint UI – but you can’t anyway, as the UI only supports changing the set of common field properties (type, required, hidden, etc).

In comes PowerShell. Below you will find a script which updates a plain text note column to be a rich text note column. It is important to note that this script only updates the list-level columns and not the site column. This means that after running the script, new instances will continue to inherit the site column configuration.

The script takes advantage of recursion using delegate functions which is an approach I blogged about here: PowerShell Recursion with Delegate Functions

Credit also to Chris O’Brien’s topofscript.ps1 for the CSOM integration bit: Using CSOM in PowerShell scripts with Office 365

The script is written for SharePoint Online (and assumes that the SharePoint Online Client Components SDK is installed) but for this to work on-premises you would only need to update the referenced assemblies (v15 for 2013) and modify the code which passes the credentials.
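
At its core, the per-list update looks something like the sketch below, simplified to a single known list (the full script wraps this in the recursive sub-site traversal described above; the URL, credentials, and names are placeholders). It assumes the plain text field’s schema contains RichText="FALSE", which is typical for a note field created as plain text:

Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://TENANT.sharepoint.com/sites/mysite")
$securePassword = ConvertTo-SecureString "PASSWORD" -AsPlainText -Force
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("USER@TENANT.onmicrosoft.com", $securePassword)

# Load the list-level instance of the note field
$field = $ctx.Web.Lists.GetByTitle("Documents").Fields.GetByInternalNameOrTitle("MyNoteField")
$ctx.Load($field)
$ctx.ExecuteQuery()

# Flip the plain text flag in the field's schema XML and update the list-level field only
$field.SchemaXml = $field.SchemaXml -replace 'RichText="FALSE"', 'RichText="TRUE" RichTextMode="FullHtml"'
$field.Update()
$ctx.ExecuteQuery()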

Paul.