User photos in Office 365

The user photo story in Office 365 is not so straightforward. Photos are stored in Active Directory (AD) on-premises, Azure Active Directory (AAD), Exchange Online (EXO), SharePoint Online (SPO), and, at first appearance, possibly elsewhere as well (where does my Delve profile picture live? What about my Skype for Business (SfB) avatar?).

I have put together a flow diagram to represent how this actually works. It aims to demonstrate where user photos are stored and where different applications fetch user photos from (if they don’t store the images), and leads to some recommendations about user photo synchronisation.

Please note the date of this article (August 2016) and be conscious that Office 365 is changing rapidly, so the following recommendations may themselves have changed (e.g. prior to the Delve user profile page, the SharePoint user profile page referenced images stored in SharePoint rather than Exchange; changes such as these will continue).

User photos: the diagram

User photos flow in Office 365

Where applications store and fetch user photos

On-premises AD DS, in the thumbnailPhoto attribute

Size: 100Kb maximum; recommended to be 96×96 or 48×48.

Is source? Yes.

Azure AD, in the thumbnailPhoto attribute

Comments: Usually synced from AD DS via Azure AD Connect.

Size: 100Kb maximum; recommended to be 96×96 or 48×48.

Is source? No – synced from AD.

Exchange Online, as a property of the mailbox

Comments: Provided manually by users, or a bulk import can be scripted if source photos can be located and named appropriately. If not provided, Exchange will reference the AAD thumbnailPhoto in some instances, but only if the thumbnailPhoto is less than 10Kb. Does not sync back to AD.

Size: 500Kb maximum; recommended to be 648×648.

Is source? Yes.

SharePoint Online ‘User Photos’ library

Comments: Three renditions of the EXO photo are automatically created in SharePoint after upload to EXO. It generally takes up to 72 hours for changes to the EXO photo to appear here, and sometimes a user must ‘touch’ their profile before the sync will be performed. NOTE: Updating the user profile photo via the Delve profile actually updates the EXO profile photo and does not perform any actions directly in SharePoint Online.

Size: Small is 48×48, Medium is 72×72, Large changes depending on the source image but is always square – I have seen as small as 120×120 and as large as 300×300 (the PnP image upload solution uploads these as 200×200).

Is source? No – synced from EXO.

Skype for Business

Comments: Does not store any images. Uses the high resolution Exchange image if available, otherwise uses the AD thumbnailPhoto.

Size: EXO image or AD thumbnailPhoto.

Is source? No – read from EXO.

Delve user profile

Comments: Does not store any images. Uses the high resolution Exchange image if available, otherwise uses the AD thumbnailPhoto.

Size: EXO image or AD thumbnailPhoto.

Is source? No – read from EXO.

Yammer

Comments: Stores its own photo; out of scope of this discussion for now.

Is source? Yes.

Likely issues and resolutions

Issue: The Exchange Online user photo is low quality (and in turn so are the SPO and SfB photos).

Resolution: The source image coming from AD was/is low quality. EXO user photos can be updated by users individually or, if high resolution source photos are available, the import can be scripted. Source images should be JPGs of 648×648 (resizing and compression can also be scripted).

Issue: The Exchange Online user photo is high quality but the SfB photo is low quality.

Resolution: High resolution photos from Exchange will be used as long as both Exchange and SfB/Lync are of new enough versions (2013 or greater) and SfB is configured to allow all photos (not just those from AD). NB: If a user doesn’t have a mailbox (e.g. not licensed) then they will be displayed using the AD photo.

Issue: There is no Exchange Online user photo (and in turn there is no SPO or SfB photo).

Resolution: A photo has not been imported to the user’s EXO mailbox, and the AAD thumbnailPhoto either doesn’t contain an image or that image is greater than 10Kb. Import of photos up to 500Kb to the EXO mailbox can be scripted (the source images could be on a file share, or in AAD).

Issue: Changes to user photos are reflected quickly in Exchange and Skype but take days to replicate to SPO.

Resolution: Exchange to SPO synchronisation is a periodic process and can take up to 72 hours. A custom solution can perform this replication on demand (e.g. at the same time the EXO user photos are set).

Issue: User photos changed in other systems which update AD are not reflected in EXO, SPO, or SfB (e.g. a user in an on-premises SharePoint farm updates their user photo).

Resolution: When AD is updated, it is synchronised with AAD, but that is as far as it gets: the “sync” from AAD to EXO is a one-off import rather than a true sync. It is unlikely to be desirable to create a custom sync relationship here, as users will want to be able to update EXO directly and won’t want their photos overwritten.

Issue: User photos updated in EXO aren’t replicated to other systems which share an AD (e.g. an on-premises SharePoint farm).

Resolution: The user photo in EXO is not synced back to AD – it can’t be consistently, as the AD thumbnailPhoto attribute only supports photos up to 100Kb whereas EXO supports larger images. There is potential for a custom solution to sync images back to AD after resizing/compressing them to under 100Kb; however, the general recommendation is that the optimal AD thumbnailPhoto size is 96×96 and 10Kb.

Recommendations

Use Exchange user photos as the master. Allow users to update their own photos, but pre-populate photos where possible, before end users are given any access to the system.

If high resolution photos are available, script the import of high resolution photos (648×648) to Exchange Online (see Set-UserPhoto and the sample script below). These will then be visible in Exchange, in Skype and, once processed, in SharePoint Online. In a dispersed environment this may have to be managed by many teams rather than trying to compile a single list of all user photos.

Users may then update their user profile photo directly via Outlook or indirectly via their Delve profile.

If synchronisation back to AD is required in order to serve other applications (e.g. an on-premises SharePoint farm) then a custom solution could provide synchronisation from EXO to AD, but this process should compress and shrink images as the recommended size of thumbnailPhoto images is only 96×96 and 10Kb.

Sample usage of the Set-UserPhoto cmdlet
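A minimal sketch, assuming an established Exchange Online PowerShell session; the account name, file paths, and the UPN-based file naming convention are illustrative placeholders rather than part of any original script.

# Set a single user's photo from a 648x648 JPG
Set-UserPhoto -Identity "adele.vance@contoso.com" -PictureData ([System.IO.File]::ReadAllBytes("C:\Photos\adele.vance@contoso.com.jpg")) -Confirm:$false

# Bulk import: expects the files in the folder to be named <upn>.jpg
Get-ChildItem -Path "C:\Photos" -Filter "*.jpg" | ForEach-Object {
    Set-UserPhoto -Identity $_.BaseName -PictureData ([System.IO.File]::ReadAllBytes($_.FullName)) -Confirm:$false
}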

Paul.


Convert an existing plain text note field/column to rich text

If you create a SharePoint site column (a note field in this case), associate it with a site content type, and then associate that content type with a list in a sub site, the site column will be available on that list. Obvious, right?

However, when you update the site column (and push all changes to lists and libraries) not *all* of the changes you make are in fact pushed down. An example of this is the setting that dictates whether a note field should allow rich text or enforce plain text. If you change this setting at the site column level it will *not* propagate to lists and libraries which already exist. New instances of the column (say, if you associated the content type with a list for the first time) will be configured correctly, but existing list-level instances are not updated. NOTE: This is only true for properties specific to a particular column type; common properties such as ‘required’ will be pushed down to existing instances of the column at the list level.

Configuring a SharePoint note field to support rich text

So you want to change a list-level instance of a plain text note column to a rich text note column (or vice versa, or otherwise change column-specific properties of another field type)? You need to do it for every list where the column is in use. That would be very tedious via the SharePoint UI, but you can’t do it that way anyway: the UI only supports changing the set of common field properties (type, required, hidden, etc).

In comes PowerShell. Below you will find a script which updates a plain text note column to be a rich text note column. It is important to note that this script only updates the list-level columns and not the site column. This means that after running the script, new instances will continue to inherit the site column configuration.

The script takes advantage of recursion using delegate functions which is an approach I blogged about here: PowerShell Recursion with Delegate Functions

Credit also to Chris O’Brien’s topofscript.ps1 for the CSOM integration bit: Using CSOM in PowerShell scripts with Office 365

The script is written for SharePoint Online (and assumes that the SharePoint Online Client Components SDK is installed) but for this to work on-premises you would only need to update the referenced assemblies (v15 for 2013) and modify the code which passes the credentials.
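The full script isn’t reproduced here; below is a condensed sketch of the approach for a single web, flipping the RichText attribute in each list-level field’s SchemaXml. The site URL and field internal name are placeholders, and the recursion across descendant webs (via the delegate-function approach referenced above) is omitted for brevity.

# Condensed sketch: switch a list-level plain text note field to rich text.
# Assumes the SharePoint Online Client Components SDK is installed.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$siteUrl = "https://tenant.sharepoint.com/sites/site"   # placeholder
$fieldInternalName = "MyNoteField"                      # placeholder

$cred = Get-Credential
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($cred.UserName, $cred.Password)

$lists = $ctx.Web.Lists
$ctx.Load($lists)
$ctx.ExecuteQuery()

foreach ($list in $lists)
{
    $field = $list.Fields.GetByInternalNameOrTitle($fieldInternalName)
    $ctx.Load($field)
    try { $ctx.ExecuteQuery() } catch { continue }   # this list doesn't contain the column

    # Rewrite the list-level schema so the note field allows rich text
    if ($field.SchemaXml -match 'RichText="FALSE"')
    {
        $field.SchemaXml = $field.SchemaXml -replace 'RichText="FALSE"', 'RichText="TRUE"'
        $field.Update()
        $ctx.ExecuteQuery()
        Write-Host "Updated $fieldInternalName on list $($list.Title)"
    }
}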

Paul.

PowerShell recursion with delegate functions, iterate all lists in all webs

There are many ways to iterate a collection in PowerShell. I just really like using delegate functions. This approach is not native PowerShell but utilises the .NET Action class as a function parameter. Using a delegate function approach, it is possible to create a recursive loop that can be very easily reused in the future just by providing an alternative Action.

The example code I provide below demonstrates how to create a delegate function in PowerShell and how to write a function that accepts one as a parameter, and it provides some ready-made samples for iterating SharePoint objects, specifically all webs or all lists. I am using some specific SharePoint objects in these samples; however, the fundamental pattern can be used to iterate over effectively any recursive structure.

foreachDecendentWeb : perform an action on every web below the provided web
foreachListInWeb : perform an action on every list in the provided web
foreachListInWebAndAllDecendentWebs : perform an action on every list in the current and all descendant webs

Delegates
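The original code isn’t shown here; as a minimal sketch, a ‘delegate’ in PowerShell is just a scriptblock cast to a .NET Action type (this example assumes the CSOM assemblies are already loaded):

# A scriptblock cast to System.Action[T] serves as the delegate.
# This sample action simply prints the title of whatever list it is handed.
$printListTitle = [System.Action[Microsoft.SharePoint.Client.List]]{
    param($list)
    Write-Host $list.Title
}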

Some notes

The below script references ‘TopOfScript.ps1’; it is specifically related to calling SharePoint CSOM from PowerShell. Read about it here on sharepointnutsandbolts.

Making the call, providing the delegate
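An illustrative call (the parameter order is my reconstruction, not necessarily that of the original script), assuming $ctx is a connected ClientContext obtained via TopOfScript.ps1:

# Print the title of every list in the web and all webs below it
foreachListInWebAndAllDecendentWebs $ctx $ctx.Web $printListTitle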

The utility scripts, recursive functions accepting delegate parameters
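A sketch of those recursive utilities under the same CSOM assumptions (the signatures are illustrative reconstructions):

# Perform $action on every web below the provided web
function foreachDecendentWeb($ctx, $web, [System.Action[Microsoft.SharePoint.Client.Web]]$action)
{
    $subWebs = $web.Webs
    $ctx.Load($subWebs)
    $ctx.ExecuteQuery()
    foreach ($subWeb in $subWebs)
    {
        $action.Invoke($subWeb)
        foreachDecendentWeb $ctx $subWeb $action
    }
}

# Perform $action on every list in the provided web
function foreachListInWeb($ctx, $web, [System.Action[Microsoft.SharePoint.Client.List]]$action)
{
    $lists = $web.Lists
    $ctx.Load($lists)
    $ctx.ExecuteQuery()
    foreach ($list in $lists)
    {
        $action.Invoke($list)
    }
}

# Perform $action on every list in the current and all descendant webs
function foreachListInWebAndAllDecendentWebs($ctx, $web, [System.Action[Microsoft.SharePoint.Client.List]]$action)
{
    foreachListInWeb $ctx $web $action
    $subWebs = $web.Webs
    $ctx.Load($subWebs)
    $ctx.ExecuteQuery()
    foreach ($subWeb in $subWebs)
    {
        foreachListInWebAndAllDecendentWebs $ctx $subWeb $action
    }
}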

Paul.

PowerShell Add-Type Error: Could not load file or assembly XXX or one of its dependencies.

If you are attempting to run the PowerShell command Add-Type with a Path parameter that references an assembly which has been emailed or downloaded from the internet (e.g. Add-Type -Path "C:\Microsoft.SharePoint.Client.dll") then it is likely that you will encounter the following error:

Could not load file or assembly ‘file:///’ or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515)

To confirm that the following steps are appropriate, check the inner exception:

PS> $error[0].Exception.InnerException

“An attempt was made to load an assembly from a network location which would have caused the assembly to be
sandboxed in previous versions of the .NET Framework. This release of the .NET Framework does not enable
CAS policy by default, so this load may be dangerous. If this load is not intended to sandbox the assembly,
please enable the loadFromRemoteSources switch. See http://go.microsoft.com/fwlink/?LinkId=155569 for more
information.”

Not using SharePoint Client Components?

First, check the execution policy under which PowerShell is running. This can be done by running the Get-ExecutionPolicy cmdlet. If you are not running under an Unrestricted policy, try running the Add-Type cmdlet again after setting the execution policy to Unrestricted (Set-ExecutionPolicy “Unrestricted”). If this resolves the issue but you can’t continue to run PowerShell in this lowered state of security, you’ll need to look into getting the assembly signed by a trusted publisher.

Next, ensure that the assembly hasn’t been blocked. The easiest way to check this is by right-clicking on the assembly and looking on the General tab for an option to unblock the assembly at the bottom of the dialog. There is also the Unblock-File cmdlet to achieve this action via PowerShell.

Unblocking an assembly
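The same can be done in PowerShell (the path is a placeholder):

Unblock-File -Path "C:\Microsoft.SharePoint.Client.dll"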

Finally, PowerShell supports a config file which, by default, is not present. It allows configuration of a few things, such as the supported .NET versions and, importantly for us, whether or not PowerShell is allowed to load assemblies from remote sources. In order to add or update this file, ensure that you are logged in as a local administrator. The config file needs to reside adjacent to the PowerShell exe file, so by default it ought to be created here: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe.config

Note: Do not confuse this with powershell_ise.exe.config, although you may want to update that file as well if you use the ISE.

The content of the powershell.exe.config should be as follows. If you already have a file present, ensure that you merge the XML rather than just replacing it.

<?xml version="1.0"?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319"/>
    <supportedRuntime version="v2.0.50727"/>
  </startup>
  <runtime>
    <loadFromRemoteSources enabled="true"/>
  </runtime>
</configuration>

Using SharePoint Client Components?

If you are adding the SharePoint Client assemblies required for the execution of CSOM requests then you should take a different approach. Rather than bundling these assemblies with your script, ensure that the correct SharePoint Client Components package is installed on the executing machine and reference the assemblies from their default location:

Download Client Components:
SharePoint Online Client Components
SharePoint 2013 Client Components

Reference Accordingly:
C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\assembly.dll
C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\assembly.dll
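For example, to load the SharePoint Online (v16) client assemblies from their installed location (the client runtime pair below is the usual minimum for CSOM scripts; adjust the version folder for on-premises):

Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"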

Paul.

SPO CSOM Error: For security reasons DTD is prohibited in this XML document

I found myself encountering the following error when authenticating to SharePoint Online using CSOM from PowerShell:

Exception calling “ExecuteQuery” with “0” argument(s): “For security reasons DTD is prohibited in this XML document. To enable DTD processing set the DtdProcessing property on XmlReaderSettings to Parse and pass the settings into XmlReader.Create method.”

I believe that there are a number of causes for this issue, some of which are firewall and ISP related, so the following may only resolve a subset of the cases where this error arises, even under seemingly identical circumstances.

In my scenario, I found that this issue was only arising when the credentials I was passing were being federated. That is, when the username was *not* in the form <me>@<domain>.onmicrosoft.com but rather something like <me>@<domain>.co.uk. It is also possible that this issue resolves itself after a single successful authentication has occurred. Try providing credentials for a *.onmicrosoft.com account and, if that works, try again with a federated account. This is discussed more later.

I used Fiddler to compare the request/response trace from a successful authentication and one where this error occurs. It turns out that somewhere internally a request is made to msoid.<full-domain>, where <full-domain> is the bit after the @ symbol in the username provided. In the case where this value is of the *.onmicrosoft.com variety, a 502 error (no DNS entry) is returned with no body and the authentication proceeds successfully. In the other case, the ‘msoid’ URL is resolved and a response with a body is returned.
In my case the response was a 301 (moved permanently), however I have read of cases where a 200 (success) has been received. Important to note is that the response, success or otherwise, returns an HTML body containing a DTD (Document Type Declaration), and this in turn produces the rather unhelpful error message.

So how do you fix it? One way is to provide an entry in your hosts file which ensures that the msoid URL will be invalid. I found that providing a local host entry for it worked. Your hosts file can be found here:
C:\Windows\System32\drivers\etc

I added a line which looks like the following:

127.0.0.1        msoid.<domain>.co.uk

And it worked! Intriguingly, I found that if I then removed this line from my hosts file, SharePoint Online authentication from PowerShell continued to work. It is for this reason that I suggested trying a *.onmicrosoft.com account first at the beginning of this post – just in case it resolves the issue for you without touching the hosts file. Please comment if you have any success (or otherwise) with that approach.

Hope this helps! Good luck.

Web Event Receivers: Attach using PowerShell

The following is a simple PowerShell script for attaching SharePoint web event receivers. It is clever enough to check whether they have already been attached and hence avoid duplication.

There are no “gotchas” or anything really worth noting here. This example is pretty much the same as the dozens of list item event receiver examples out there, just targeting a different event receiver type. I am only posting it because I was so surprised that I had to write it myself after failing to find an example to ‘steal’ from the internet – maybe I wasn’t looking hard enough.

The first half of the script is the bit you’ll be interested in (if you are interested in any of it!), the second half is just a usage example.

# Definition of a function to attach web event receivers
# if they are not already added
Function AddWebEventReceivers($webER) 
{   
  $assembly = "*fully_qualified_assembly_name*"
  $class = "*class_name_including_namespace*"

  # Only attach receivers if they haven't already been added
  # You can make the check more specific by checking the Type
  # property as well if required
  $existingER = $webER.EventReceivers | Where { $_.Class -eq $class }
  if($existingER -eq $null -or $existingER.length -eq 0)
  {
    $webER.EventReceivers.Add("WebMoved", $assembly, $class)
  }
}

# Iterate all webs and attach the web event receivers to
# sites based on a certain web template
$site = Get-SPSite "*webAppUrl*"
$allWebs = $site.AllWebs
foreach($web in $allWebs)
{
  try 
  {
    # Only act on certain sites
    if($web.WebTemplateId -eq 100009 -and $web.Configuration -eq 2)
    {
      AddWebEventReceivers($web)
    }
  }
  catch [System.Exception]
  {
    $errorMessage = $Error[0]
    Write-Host "Failed: $errorMessage" -NoNewline -F Red
  }
  finally
  {
    $web.Dispose()
  }
}
$site.Dispose()

Cancel all workflows in site collection efficiently

Here you will find a script to cancel all list workflows (not site workflows) running in a given site collection in the most efficient manner possible (that I can think of) while still only using the SharePoint API. A better approach still would be to query the SharePoint workflows SQL table to identify the running workflows, so as to avoid iterating all site collection webs and lists; unfortunately there is no ‘GetAllRunningWorkflows(SPSite)’ method available via the API. I imagine that this approach should be satisfactory in the majority of cases though.

There are a number of posts on the web with code somewhat similar to this, or at least with code that aims to achieve the same outcome. All of the posts I found performed this function in a very inefficient manner, iterating the SPListItemCollection for every list in the site collection. This may be fine in many circumstances, but I wanted something that would run faster with less strain on the server.

I have achieved this by checking for workflow associations on a list before iterating its items, as well as querying the list items and, where a workflow column exists, checking whether the list item needs to be returned at all. When returning the list items, I query with ViewFieldsOnly so that less data is returned.

This script also accepts an optional parameter that specifies which workflow associations should be canceled if you are not looking to cancel all of the workflow associations but just those of a specific name.

NB: The script contains a reference to a helper function, GetNestedCaml, which I have defined in a separate post which can be found here.

As the full script is quite large, only its core pattern is sketched below.
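A simplified sketch of that core (assuming the server-side object model; the CAML/ViewFieldsOnly optimisations and the optional association-name filter described above are omitted for brevity):

# Cancel all running list workflows in a site collection
$site = Get-SPSite "*siteCollectionUrl*"
foreach ($web in $site.AllWebs)
{
    foreach ($list in $web.Lists)
    {
        # Skip lists with no workflow associations at all
        if ($list.WorkflowAssociations.Count -eq 0) { continue }

        foreach ($item in $list.Items)
        {
            foreach ($workflow in $item.Workflows)
            {
                if (-not $workflow.IsCompleted)
                {
                    [Microsoft.SharePoint.Workflow.SPWorkflowManager]::CancelWorkflow($workflow)
                }
            }
        }
    }
    $web.Dispose()
}
$site.Dispose()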

Dynamically nesting CAML query statements

Here is a short PowerShell function that can be used when you need to dynamically generate CAML queries with many logically joined statements. I actually adapted it from a C# implementation I wrote (which is probably more useful…) but as you can rewrite it in C# very easily I won’t bother posting it twice.

As CAML logical join operators (And, Or) can only compare two statements, when many statements need to be compared you must nest them which is what this function achieves. The $join parameter should be passed as “And” or “Or” and the $fragments parameter should be passed as an array of CAML statement strings such as:
@("<Eq><FieldRef Name='Title' /><Value Type='Text'>title</Value></Eq>", "<Eq><FieldRef Name='FileLeafRef' /><Value Type='Text'>name.docx</Value></Eq>")

# Define a function for nesting CAML query statements
function GetNestedCaml([array]$fragments, [string]$join)
{
    if ($fragments.Length -lt 1)
    {
        return [string]::Empty
    }
    elseif ($fragments.length -eq 1)
    {
        return $fragments[0]
    }
    elseif ($fragments.length -eq 2)
    {
        return "<$join>" + $fragments[0] + $fragments[1] + "</$join>"
    }
    $joinFrags = @()
    $baseJoinCount = [int][Math]::Floor($fragments.Length / 2)
    for ($i = 0; $i -lt $baseJoinCount; $i++)
    {
        $baseIndex = (2 * $i)
        $fragsToJoin = @($fragments[$baseIndex], $fragments[$baseIndex + 1])
        $joinFrag = GetNestedCaml $fragsToJoin $join
        $joinFrags += $joinFrag
    }
    if ($fragments.length % 2 -ne 0)
    {
        $joinFrags += $fragments[$fragments.length - 1]
    }
    return GetNestedCaml $joinFrags $join
}
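For example, joining three statements with “Or” produces the nested structure below:

# Example usage: three conditions joined with Or
$fragments = @(
    "<Eq><FieldRef Name='Title' /><Value Type='Text'>a</Value></Eq>",
    "<Eq><FieldRef Name='Title' /><Value Type='Text'>b</Value></Eq>",
    "<Eq><FieldRef Name='Title' /><Value Type='Text'>c</Value></Eq>"
)
GetNestedCaml $fragments "Or"
# Returns: <Or><Or>[a][b]</Or>[c]</Or> where [x] is the corresponding <Eq> fragment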

Deployment failure causing Get-SPSite to fail

I run a scripted deployment process each time new (SharePoint) solutions are ready for QA, or any environment for that matter. We script a series of commands, specifically:
Uninstall-SPSolution, Remove-SPSolution, Add-SPSolution and finally Install-SPSolution.

Before running these commands, along with a number of other checks, we perform Get-SPSite to ensure the site is available and fail early if need be. If our custom solution has not been deployed then viewing any page under any of the web application’s site collections fails with an exception due to a failure to find one of the solution assemblies. This is because we have custom membership and claims providers which are defined in the solution assembly and are referenced in the web.config for the web application. Despite this, any SPSite and SPWeb objects can be obtained safely via PowerShell as forms authentication is not taking place.

So I was surprised to find the Get-SPSite check failing during deployment today with a failure to locate the assembly containing the membership and claims providers. I did not discover the root cause of why this suddenly occurred, but I will outline what it took to fix it.

In the end I was able to run Install-SPSolution -Force to recover from this situation, but not before stopping and starting the SharePoint Administration Service across all web servers in the farm. The service was not stopped on any of the machines; however, the install job was never completing, despite the timer job history in Central Administration stating that the job had been successful. Upon restarting this service on all servers, the Install-SPSolution command completed.

Credit to Andreas’ blog for putting me in the right direction.

Scripting WSP deployment

I won’t take this opportunity to evangelise the benefits of scripted deployments; I’m going to assume that you already do it (as you should be!) and provide a tiny bit of script that will identify when your deployment doesn’t run quite so smoothly. Before I do, I’ll briefly mention my experience as to why a deployment may fail.

I have found that by far the most common reason a SharePoint full-trust deployment fails is that one or more of the assemblies being deployed is locked in the GAC and cannot be removed/replaced. An IISRESET fixes this in the majority of cases (consider performing a remote IISRESET across all the farm’s SharePoint servers as part of your deployment process. Note to self: future blog topic…) and in the remainder of cases stopping related services (v4 Timer, SSRS etc) on the affected server will release the assembly. The easiest way to identify the servers at which the deployment failed is via CA:

Central Administration > System Settings > Farm Management :: Manage farm solutions > “Your WSP”

WSPs are deployed using a timer job. When performing deployment actions we need to wait on the timer job and, upon its completion, verify that the deployment status of the solution is as we expect. If we don’t wait for the action and ensure that it has run successfully, we run the risk of not detecting a failed deployment until we attempt to access the site. This can be a real time sink if we run lengthy scripts after deployment or are deploying a number of solutions in succession. The following snippet shows how easy it is to achieve this in PowerShell.
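A minimal sketch of that wait-and-verify step (the solution name and deployment switches are placeholders; the success check is an assumption based on SPSolution’s LastOperationResult values):

# Block until the solution's deployment timer job completes, then report the result
function WaitForSPSolutionJob([string]$solutionName)
{
    while ((Get-SPSolution -Identity $solutionName).JobExists)
    {
        Start-Sleep -Seconds 5
    }
    $solution = Get-SPSolution -Identity $solutionName
    Write-Host "$solutionName : $($solution.LastOperationResult) - $($solution.LastOperationDetails)"
    return ($solution.LastOperationResult -like "*Succeeded")
}

# Usage: kick off the deployment, then wait and verify
Install-SPSolution -Identity "*your_solution*.wsp" -GACDeployment -AllWebApplications
if (-not (WaitForSPSolutionJob "*your_solution*.wsp"))
{
    throw "Deployment failed - check the solution status in Central Administration"
}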

May your deployment failures be immediately detected.