Delete all list item and file versions from a site and sub sites using CSOM and PowerShell

The Versions property is not available on the ListItem class in the client object model as it is in the server object model. I landed on the article SharePoint – Get List Item Versions Using Client Object Model, which describes how to get a list item's versions using the method GetFileByServerRelativeUrl and the FileUrl property. The trick is to build the list item URL as "sites/test/Lists/MyList/30_.000", where 30 is the ID of the item whose version history needs to be retrieved. Using that information I created a PowerShell script that loops through all lists in the site collection and its sub sites and deletes all version history.
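As an illustration of that trick, a minimal CSOM sketch (assuming $ctx is an authenticated ClientContext, $list is the target list and $item is a list item that has already been loaded) might look like this:

# Build the file URL for the list item ("<list root folder>/<item id>_.000")
$ctx.Load($list.RootFolder)
$ctx.ExecuteQuery()
$fileUrl = $list.RootFolder.ServerRelativeUrl + "/" + $item.Id + "_.000"

# Retrieve the file behind the list item and load its version collection
$file = $ctx.Web.GetFileByServerRelativeUrl($fileUrl)
$versions = $file.Versions
$ctx.Load($versions)
$ctx.ExecuteQuery()

# Delete all previous versions, keeping only the current one
if ($versions.Count -gt 0)
{
    $versions.DeleteAll()
    $ctx.ExecuteQuery()
}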

The script below targets a SharePoint tenant environment.

Please note that I have used the script Load-CSOMProperties.ps1 from the blog post Loading Specific Values Using Lambda Expressions and the SharePoint CSOM API with Windows PowerShell to help query object properties, similar to lambda expressions in C#. The lines below demonstrate use of the Load-CSOMProperties.ps1 file, which I copied into the same directory as the script DeleteAllVersionsFromListItems.ps1.

Set-Location $PSScriptRoot

$pLoadCSOMProperties = (Get-Location).ToString() + "\Load-CSOMProperties.ps1"
. $pLoadCSOMProperties

  • Retrieve the ServerRelativeUrl property from the file object

Load-CSOMProperties -object $file -propertyNames @("ServerRelativeUrl")

You can download the script from TechNet.

SQL script batch execution using sqlcmd in PowerShell

There is often a mismatch between the needs of the development team (multiple discrete T-SQL files for separate concerns) and the release team (the requirement for a one-step automated deployment). The script bridges that gap by using sqlcmd.exe to run a batch of SQL scripts.

A text file lists all SQL files that need to run in a particular order, to avoid errors that may occur if there are dependencies between the scripts. Instead of using a text file, a number can be prefixed to the script names based on the order in which they need to run.
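For example, a Manifest.txt might contain the following (hypothetical file names), one script per line in execution order:

01_CreateTables.sql
02_CreateViews.sql
03_SeedData.sql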

The script expects two parameters –

  • Path of folder containing the set of T-SQL files (and the manifest file, see below)
  • Connection string

The script can be downloaded from the TechNet Gallery.

## Provide the path of the SQL scripts folder and the connection string
##.\SQLBatchProcessing.ps1 -SQLScriptsFolderPath "C:\Sql Batch Processing\SQLScripts" -ConnectionString "DEV-DB-01"
Param(
[Parameter(Mandatory=$true)][String]$ConnectionString ,
[Parameter(Mandatory=$true)][String]$SQLScriptsFolderPath
)
Set-ExecutionPolicy -ExecutionPolicy:Bypass -Force -Confirm:$false -Scope CurrentUser
Clear-Host
#check whether the SQL Script Path exists
$SQLScriptsPath = Resolve-Path $SQLScriptsFolderPath -ErrorAction Stop
#a manifest file must exist in the SQL scripts folder detailing the order in which the scripts need to run.
$SQLScriptsManifestPath = $SQLScriptsFolderPath + "\Manifest.txt"
#Find out whether the manifest file exists in the SQL Scripts folder
$SQLScriptsManifestPath = Resolve-Path $SQLScriptsManifestPath -ErrorAction Stop


#if the manifest file is found, iterate through each line and validate that the corresponding SQL script exists before running each of them
Get-Content $SQLScriptsManifestPath | ForEach-Object {
$SQLScriptsPath = $SQLScriptsFolderPath + "\" + $_.ToString()
Resolve-Path $SQLScriptsPath -ErrorAction Stop
}
$SQLScriptsLogPath = $SQLScriptsFolderPath + "\" + "SQLLog.txt"
Add-Content -Path $SQLScriptsLogPath -Value "***************************************************************************************************"
Add-Content -Path $SQLScriptsLogPath -Value "Started processing at [$([DateTime]::Now)]."
Add-Content -Path $SQLScriptsLogPath -Value "***************************************************************************************************"
Add-Content -Path $SQLScriptsLogPath -Value ""
Get-Content $SQLScriptsManifestPath | ForEach-Object {
$SQLScriptsPath = $SQLScriptsFolderPath + "\" + $_.ToString()
$text = "Running script " + $_.ToString();
Add-Content -Path $SQLScriptsLogPath -Value $text
sqlcmd -S $ConnectionString -i $SQLScriptsPath | Out-File -Append -FilePath $SQLScriptsLogPath
}
Add-Content -Path $SQLScriptsLogPath -Value "***************************************************************************************************"
Add-Content -Path $SQLScriptsLogPath -Value "End processing at [$([DateTime]::Now)]."
Add-Content -Path $SQLScriptsLogPath -Value "***************************************************************************************************"
Add-Content -Path $SQLScriptsLogPath -Value ""

Instantiate CRMServiceClient using current user’s login

The Get-CrmConnection cmdlet can be used to return a connection to a CRM instance. The syntax to call it is:

Parameter Set: OnLine
Get-CrmConnection [-OnLineType] <OnlineType> [[-Credential] <PSCredential> ] 
[-DeploymentRegion] <String> [[-ProfileName] <String> ] -OrganizationName <String> [ <CommonParameters>]

Parameter Set: OnPrem
Get-CrmConnection [-ServerUrl] <Uri> [[-Credential] <PSCredential> ] 
[-OrganizationName] <String> [[-HomeRealmUrl] <Uri> ] [[-ProfileName] <String> ] 
[ <CommonParameters>]

Parameter Set: UIOnly
Get-CrmConnection [[-InteractiveMode]] [ <CommonParameters>]

I wanted to get the CRM connection with the current user's credentials without any prompts. The first and second options require a PSCredential object, which can't be created from the currently logged-in user's credentials. The third option, with the switch InteractiveMode, displays a dialog box prompting for connection details. None of the three options was appropriate for the requirement.

The cmdlet returns an object of type Microsoft.Xrm.Tooling.Connector.CrmServiceClient. According to the MSDN article, it can be constructed using a NetworkCredential object.

The constructor’s definition in C#

public CrmServiceClient(
	NetworkCredential credential,
	AuthenticationType authType,
	string hostName,
	string port,
	string orgName,
	bool useUniqueInstance = false,
	bool useSsl = false,
	OrganizationDetail orgDetail = null
)

 

In PowerShell, the current user’s credentials can be retrieved using [System.Net.CredentialCache]::DefaultNetworkCredentials.

There is no way that [System.Net.CredentialCache]::DefaultNetworkCredentials can be converted to a PSCredential object.

The syntax below can be used to create a Microsoft.Xrm.Tooling.Connector.CrmServiceClient using the logged-in user's credentials. Replace the variables $serverName, $serverPort and $organizationName with the respective values.

 $crmConnection = New-Object `
 -TypeName Microsoft.Xrm.Tooling.Connector.CrmServiceClient `
 -ArgumentList ([System.Net.CredentialCache]::DefaultNetworkCredentials), 
 ([Microsoft.Xrm.Tooling.Connector.AuthenticationType]::AD),
 $serverName,
 $serverPort, 
 $organizationName, 
 $False,
 $False,
 ([Microsoft.Xrm.Sdk.Discovery.OrganizationDetail]$null)
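Once instantiated, the connection can be checked before use. A minimal sketch, assuming $crmConnection was created as above (IsReady, ConnectedOrgFriendlyName and LastCrmError are properties exposed by CrmServiceClient):

# Verify the connection succeeded before making any calls
if ($crmConnection.IsReady)
{
    Write-Host "Connected to organization:" $crmConnection.ConnectedOrgFriendlyName
}
else
{
    Write-Error "CRM connection failed: $($crmConnection.LastCrmError)"
}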

Instantiate SharePoint Client Context using current user credentials in PowerShell

In C# managed code, SharePoint Client Context can be created using System.Net.CredentialCache to pass logged in user credentials.

ICredentials credentials = CredentialCache.DefaultCredentials;
clientContext.Credentials = credentials;

I could not find anywhere how to achieve it in PowerShell.

If using the PnP PowerShell module, the switch parameter CurrentCredentials can be used with the cmdlet Connect-SPOnline.

Connect-SPOnline -Url "http://dev-sp-001a:1214/Teams/Legal" -CurrentCredentials
$ctx= Get-SPOContext

In most of my CSOM code that does not use PnP, I used to get the current user name using [Environment]::UserName

[Environment]::UserName

and prompt the current user to enter a password:

$AdminPassword = Read-Host "Enter password: " -AsSecureString

Lately I discovered that I could use System.Net.CredentialCache in PowerShell to pass the current user's credentials when instantiating the ClientContext object.

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
# Pass the current user's default network credentials instead of prompting
$Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
$ctx.Credentials = $Credentials
# Load the web to confirm the connection works
$web = $ctx.Web
$ctx.Load($web)
$ctx.ExecuteQuery()

This means you can start Windows PowerShell as a user with appropriate permissions to the SharePoint environment and run CSOM code without being prompted for credentials.

External Sharing of Documents with Nintex in SharePoint Online

The "Office 365 update item permissions" action available in Nintex Workflow updates item permissions. However, if external users are given permissions using this action, it fails with the message "failed to resolve user". The issue with "Office 365 update item permissions" is that it does not allow updating permissions for external users.

With the SharePoint 2016 remote API, sharing with external users can be achieved using the method UpdateDocumentSharingInfo from the class SPDocumentSharingManager, which fortunately can be called using REST.

The tenant and site collection sharing setting needs to be set to either "Allow external users who accept sharing invitations and sign in as authenticated users" or "Allow sharing with all external users, and by using anonymous links" to be able to add external users. If the first option is selected, the user needs to be given access to the site.


The workflow will work against a list that has the columns Email and SharePointLink.


The first step in the workflow is to get the request digest from the targeted site collection.

The steps to add in the workflow are as below

  • Add “App Step” action
  • Add “Web Request” action. Fill in the following fields
    • URL: <siteURL>/_api/ContextInfo
    • Method: POST
    • Content type: text/xml
    • Body: Choose the "Content" option and enter "{}" in the text field
    • UserName: an account that has access to the targeted site
    • Password: password of the above account
    • Store response content in: Create a variable "RequestDigest"
  • Add "Query XML" action. Fill in the following fields
    • XML source: Choose "Content" and pick the variable "RequestDigest"
    • XPath query: /d:GetContextWebInformation/d:FormDigestValue (a sample response is shown after this list)
    • Return result as: Text
    • Query result in: create a text variable strDigestInfo
  • Add "Log to History List" step. Print strDigestInfo to make sure a valid request digest is used.
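For reference, the /_api/ContextInfo call returns XML in the dataservices namespace, which is why the XPath above targets d:FormDigestValue. A trimmed, illustrative response looks roughly like this (values are made up):

<d:GetContextWebInformation xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">
  <d:FormDigestTimeoutSeconds>1800</d:FormDigestTimeoutSeconds>
  <d:FormDigestValue>0x2B65…,13 Jun 2017 10:00:00 -0000</d:FormDigestValue>
  <d:LibraryVersion>16.0.6518.1204</d:LibraryVersion>
  <d:WebFullUrl>https://tenant.sharepoint.com/sites/test</d:WebFullUrl>
</d:GetContextWebInformation>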

Please note that the workflow app permissions need to be amended to give full control access to the site collection.

The second step is to call the UpdateDocumentSharingInfo REST API method to grant the appropriate access to the external user.


  • Add “App Step” action
  • Add "Build Dictionary" step. Rename it to "Request Headers".
    • Add key "accept" and Value "application/json;odata=verbose"
    • Add key "content-type" and Value "application/json;odata=verbose"
    • Add key "X-RequestDigest" and Value {Variable:strDigestInfo}
    • Add Output to variable RequestHeadersPerm.


  • Add "Build Dictionary" step. Rename it to "Build Metadata".
    • Add key "type" and value "SP.Sharing.UserRoleAssignment".
    • Add Output to variable MetadataPerm.


  • Add "Build Dictionary" step. Rename it to "Build UserRoleAssignments".
    • Add key "__metadata" and Value of type Dictionary set to variable MetadataPerm
    • Add key "Role" and Value of type Integer set to 1. The Role property represents the level of permission to grant. Possible values are 1 = View, 2 = Edit, 3 = Owner, 0 = None.
    • Add key "UserId" and Value {Current Item:Email}
    • Add Output to variable UserRoleAssignments


  • Add "Add Item to Collection" step
    • Set Target collection to collection variable RoleAssignmentCol
    • Set Index to 0
    • Set Value to dictionary variable UserRoleAssignments
    • Set Output to RoleAssignmentCol


  • Add a "Build Dictionary" step and label it "Build Parameters". The JSON body these keys produce is sketched after this list.
    • Add Key "userRoleAssignments" and Value of type Dictionary set to variable RoleAssignmentCol
    • Add Key "resourceAddress" and Value {Workflow Context:Current site URL}{Current Item:SharePointLink}
    • Add Key "validateExistingPermissions" and Value of type Boolean set to No
    • Add Key "additiveMode" and Value of type Boolean set to Yes
    • Add Key "sendServerManagedNotification" and Value of type Boolean set to Yes
    • Add Key "customMessage" and Value "Document has been shared with you"
    • Add Key "includeAnonymousLinksInNotification" and Value of type Boolean set to Yes
    • Add Key "propagateAcl" and Value of type Boolean set to Yes
    • Add Output to variable ParametersPerm (referenced by the web service call below)
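Assembled, the dictionaries above produce a request body roughly like the following (values are illustrative):

{
  "userRoleAssignments": [
    {
      "__metadata": { "type": "SP.Sharing.UserRoleAssignment" },
      "UserId": "external.user@example.com",
      "Role": 1
    }
  ],
  "resourceAddress": "https://tenant.sharepoint.com/sites/legal/Shared Documents/Contract.docx",
  "validateExistingPermissions": false,
  "additiveMode": true,
  "sendServerManagedNotification": true,
  "customMessage": "Document has been shared with you",
  "includeAnonymousLinksInNotification": true,
  "propagateAcl": true
}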

 


  • Add a "Call HTTP Web Service" step to assign permissions using SP.Sharing.DocumentSharingManager.UpdateDocumentSharingInfo
    • Set Address to {Workflow Context:Current site URL}/_api/SP.Sharing.DocumentSharingManager.UpdateDocumentSharingInfo
    • Set Request Type to “Http Post”
    • Set Request Headers to variable RequestHeadersPerm
    • Set Request Content to variable ParametersPerm
    • Set Response Headers to variable ResponseHeadersPerm
    • Set Response Content to variable ResponseContentPerm
    • Set Response Status Code to ResponseStatusCode

 


You can use the "Log to History List" step to log ResponseStatusCode, ResponseContentPerm and ResponseHeadersPerm.

When the workflow runs, an email with a link to access the resource will be sent to the external user.


Apply Theme and Create Composed Look using CSOM and PowerShell

There are several ways a theme can be applied to a SharePoint site.

I tried the PnP command Set-SPOTheme as follows on a SharePoint 2013 on-premises instance.

Set-SPOTheme -ColorPaletteUrl $PathThemeColor -FontSchemeUrl  $PathThemeFont

Unfortunately I was getting the message 'Field or property "AlternateCssUrl" does not exist.'

The error is due to not having the latest patch installed. Unfortunately, the latest SharePoint CU can't be installed because we have heavily customised legacy applications migrated from SharePoint 2007 and SharePoint 2010 running in compatibility mode.

The other option was to use the add-in "DeployCustomTheme" from the PnP samples. However, that meant deploying the add-in before applying the theme, and the add-in catalog has not been enabled on the SharePoint farm.

Using PnP PowerShell commands, the custom theme and master page files can be uploaded to the folders "/_catalogs/theme/15" and "/_catalogs/masterpage" respectively.

Deploy Custom Color

Add-SPOFile -Path ".\Files\Themes\Palette_EDRMS.spcolor" -Folder "/_catalogs/theme/15"

Deploy Custom Font

Add-SPOFile -Path ".\Files\Themes\fontscheme_EDRMS.spfont" -Folder "/_catalogs/theme/15"

Deploy Custom Master Page html and preview

Add-SPOFile -Path ".\Files\Themes\MasterPage\seattle_edrms.html" -Folder "/_catalogs/masterpage"


Add-SPOFile -Path ".\Files\Themes\MasterPage\seattle_edrms.preview" -Folder "/_catalogs/masterpage"

After deploying the files, the custom theme can be applied using the PowerShell below. The custom composed look is created before the theme is applied using the CSOM method ApplyTheme.

$web.ApplyTheme($themeUrl, $FontSchemeUrl, $BckImageUrl, $true)
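For context, a minimal sketch of the whole sequence, assuming $ctx is a connected ClientContext and $themeUrl, $FontSchemeUrl, $masterUrl and $BckImageUrl hold the server-relative URLs of the files deployed above (names and values are illustrative):

# The composed looks gallery is the Design Catalog (list template 124)
$web = $ctx.Web
$composedLooks = $web.GetCatalog(124)
$item = $composedLooks.AddItem((New-Object Microsoft.SharePoint.Client.ListItemCreationInformation))
$item["Title"] = "EDRMS"
$item["Name"] = "EDRMS"

# URL columns expect FieldUrlValue objects
$themeUrlValue = New-Object Microsoft.SharePoint.Client.FieldUrlValue
$themeUrlValue.Url = $themeUrl
$item["ThemeUrl"] = $themeUrlValue
$fontUrlValue = New-Object Microsoft.SharePoint.Client.FieldUrlValue
$fontUrlValue.Url = $FontSchemeUrl
$item["FontSchemeUrl"] = $fontUrlValue
$masterUrlValue = New-Object Microsoft.SharePoint.Client.FieldUrlValue
$masterUrlValue.Url = $masterUrl
$item["MasterPageUrl"] = $masterUrlValue

$item["DisplayOrder"] = 100
$item.Update()
$ctx.ExecuteQuery()

# Apply the theme to the web; $BckImageUrl can be $null if no background image is used
$web.ApplyTheme($themeUrl, $FontSchemeUrl, $BckImageUrl, $true)
$ctx.ExecuteQuery()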

After running the script, the theme on the site is updated.


Import/Export Columns and Content Types using PnP PowerShell

It is useful to migrate site columns and content types from one site collection to another, maintaining the original GUIDs, under the following scenarios:

  1. SharePoint On Premise environment to SharePoint Online
  2. Within SharePoint Online, migrate to another tenant
  3. If Content Hub Feature is not used, migrate the contents to another site collection
  4. Keep a list for governance/change tracking purposes, e.g. checking into TFS

This article covers export/import of site columns and content types using the SharePointPnP2016PowerShell module, which needs to be installed to be able to use the relevant commands.

The following can be used to connect to a SharePoint 2013 site collection.

Connect-SPOnline -Url $SiteUrl -CurrentCredentials

If the script is run against SharePoint 2013 on-premises and you are logged in as a user who already has access to the site collection, use the parameter -CurrentCredentials. If the parameter is omitted you will be prompted to enter your credentials, which you might need to do if connecting to SharePoint Online.

The site columns and content types need to be exported as XML first before being imported back to a different site collection.

Export Site Columns to XML

Use the command Get-SPOField to retrieve all fields from the site collection. If a field is in the target group, e.g. "Custom Group", its SchemaXml property is copied to an XML file created in the location the script is run from.
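A minimal sketch of the export, assuming the connection above, a group named "Custom Group" and an output file name of my own choosing (depending on the PnP version, SchemaXml may need to be loaded explicitly):

# Collect the SchemaXml of every field in the chosen group into one XML document
$fieldsXml = "<Fields>"
Get-SPOField | Where-Object { $_.Group -eq "Custom Group" } | ForEach-Object {
    $fieldsXml += $_.SchemaXml
}
$fieldsXml += "</Fields>"
Set-Content -Path ".\SiteColumns.xml" -Value $fieldsXml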

The script can be downloaded from the TechNet article Export Column using SharePointPnP2016PowerShell to XML.

Export Site Content Types to XML

Use the command Get-SPOContentType to retrieve all content types in the site collection. If a content type is in the target group, e.g. "Portfolio DB", its SchemaXml property is copied to an XML file created in the location the script is run from.
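A similar sketch for content types, again with a hypothetical group and output file name:

# Collect the SchemaXml of every content type in the chosen group into one XML document
$ctXml = "<ContentTypes>"
Get-SPOContentType | Where-Object { $_.Group -eq "Portfolio DB" } | ForEach-Object {
    $ctXml += $_.SchemaXml
}
$ctXml += "</ContentTypes>"
Set-Content -Path ".\ContentTypes.xml" -Value $ctXml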

The script can be downloaded from the TechNet article Export Content Types using SharePointPnP2016PowerShell to XML.

Import Site Columns from XML

Use the command Add-SPOFieldFromXml -FieldXml $_.OuterXml to add a site column to the site collection.

For some of the exported site columns there was a Version attribute, which caused the error "The object has been updated by another user since it was last fetched" to be thrown on import. The Version attribute was removed from the XML using the method RemoveAttribute.

$_.RemoveAttribute("Version")

There are some challenges associating a term set with a managed metadata site column. Some additional steps are needed, and a consolidated sketch follows these steps:

– Import the SharePoint client DLLs

Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"

Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Taxonomy.dll"

– Query the site column added to site collection

Get-SPOField -Identity $_.ID

– Cast the field object into Microsoft.SharePoint.Client.Taxonomy.TaxonomyField

$taxonomyField = [Microsoft.SharePoint.Client.ClientContext].GetMethod("CastTo").MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField]).Invoke($Context, $field)

– Update the properties SspId and TermSetId

$taxonomyField.SspId = [Guid]$termStoreIdEle.Value.InnerText;

$taxonomyField.TermSetId = $termId;
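Putting those steps together, a rough sketch, assuming $Context is the ClientContext behind the PnP connection, $field is the column returned by Get-SPOField, and $termStoreId and $termSetId hold the target term store and term set GUIDs (variable names are illustrative):

# Cast the generic field to a TaxonomyField using the CSOM CastTo helper
$castTo = [Microsoft.SharePoint.Client.ClientContext].GetMethod("CastTo").MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField])
$taxonomyField = $castTo.Invoke($Context, @($field))

# Point the column at the correct term store and term set, then push the change
$taxonomyField.SspId = [Guid]$termStoreId
$taxonomyField.TermSetId = [Guid]$termSetId
$taxonomyField.Update()
$Context.ExecuteQuery()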

The script can be downloaded from the TechNet article Import Site Columns using SharePointPnP2016PowerShell to XML.

Import Content Types from XML

Use the command Get-SPOContentType -Identity $_.Name to retrieve a content type by name in the site collection. If the content type does not exist, use the command below to create it:

Add-SPOContentType -ContentTypeId $_.ID -Group $_.Group -Name $_.Name -Description $_.Description

To add fields to the content type use:

Add-SPOFieldToContentType -Field $_.Name -ContentType $spContentType.Name

If the field is required, use the switch parameter -Required:

Add-SPOFieldToContentType -Field $_.Name -ContentType $spContentType.Name   -Required

If the field is hidden, use the switch parameter -Hidden:

Add-SPOFieldToContentType -Field $_.Name -ContentType $spContentType.Name   -Hidden
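Tying these commands together, a hedged sketch of the import loop for a single exported content type ($ctNode, $fieldName, $isRequired and $isHidden are illustrative placeholders that would be read from the exported XML):

# Create the content type only if it does not already exist
$spContentType = Get-SPOContentType -Identity $ctNode.Name -ErrorAction SilentlyContinue
if ($spContentType -eq $null)
{
    $spContentType = Add-SPOContentType -ContentTypeId $ctNode.ID -Group $ctNode.Group -Name $ctNode.Name -Description $ctNode.Description
}

# Add the field reference, honouring the Required and Hidden flags
if ($isRequired)
{
    Add-SPOFieldToContentType -Field $fieldName -ContentType $spContentType.Name -Required
}
elseif ($isHidden)
{
    Add-SPOFieldToContentType -Field $fieldName -ContentType $spContentType.Name -Hidden
}
else
{
    Add-SPOFieldToContentType -Field $fieldName -ContentType $spContentType.Name
}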

The script can be downloaded from the TechNet article Import Content Types using SharePointPnP2016PowerShell to XML.

The full code is shown below