Connect Azure Functions To Office 365

In the past couple of weeks I’ve uploaded a few scripts to help manage Office 365 customer environments in bulk via delegated administration. These scripts work well for us, though they only work when they’re initiated by a delegated administrator here. Sure, we could set them up on a server as a scheduled task, though in the interest of keeping things in the cloud, we’re moving them to Azure Functions.

If you’re interested, the scripts I’ve posted so far regarding Delegated Administration are here:

What are Azure Functions?

The Azure Functions service is Microsoft’s Function as a Service (FaaS) offering. It’s similar to Google Cloud Functions or AWS Lambda, if you’ve used either of those. Basically, it lets you run standalone scripts or functions of a program in the cloud. One of Azure Functions’ benefits is that you don’t have to look after the underlying infrastructure; you can just add your code and you’re pretty much done. You can start an Azure Function using an HTTP or Azure Storage Queue trigger, or just set it to run on a timer. Azure Functions can run a variety of languages, though in this scenario, we’ll convert a simple Office 365 PowerShell script into a timer trigger function that runs each weekday.

Consumption Plan vs App Service Plan

For the number of functions we’ll be running, Azure Functions are pretty much free with a Consumption Plan. This plan gives you a monthly grant of 1 million executions and 400,000 GB-s of resource consumption, which we’ll be well under. However, Azure Functions can also run on top of a paid Azure App Service Plan – which we’ll be taking advantage of.
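To put that free grant in perspective, here’s a rough back-of-the-envelope estimate for a schedule like ours. The run time, memory size and run count below are illustrative assumptions, not measurements:

```powershell
# Rough monthly consumption for one 3-minute weekday run at 512 MB (illustrative assumptions)
$runsPerMonth  = 22      # weekday executions in a typical month
$secondsPerRun = 180     # 3 minutes per execution
$memoryGB      = 0.5     # 512 MB
$gbSeconds = $runsPerMonth * $secondsPerRun * $memoryGB
Write-Output "$gbSeconds GB-s used of the 400,000 GB-s monthly grant"   # 1980 GB-s
```

Even a schedule far heavier than this one stays comfortably inside the Consumption Plan grant; the reason we pay for an App Service Plan is the execution time limit, not cost.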

Why pay for an Azure App Service Plan to run Azure Functions?

One of the limitations of the (almost) free version of Azure Functions is that its executions have a five minute limit, after which they are terminated automatically. Apparently this is because the underlying virtual machines that run the functions are regularly recycled. Since some of our scripts can run longer than five minutes, we need to provision a small Azure App Service resource and run our Azure Functions on top of it. The VM that runs our App Service runs continuously and will support long-running functions.

Here’s what we want to achieve:

  1. Set up an Azure Function App running on an App Service Plan
  2. Connect an Azure Function to Office 365
  3. Modify an existing PowerShell script to run on an Azure function

In another post we’ll look at connecting Azure Functions to Azure Storage to use in reporting via Power BI, and triggers for Microsoft Flow.

How to set up a new Azure Function App

  1. Log on to the Azure Portal using an account with an active Azure subscription.
  2. Click the green + button on the left menu, search for Functions, then click Function App
  3. Click Create on the bottom right
  4. Complete the required fields for the Function App
  5. Choose to create a new Resource Group and Storage Account. For the Hosting Plan option, choose App Service Plan, then select an existing plan or create a new one. In my case, I chose an S1 plan, which is probably overkill; you’ll be able to get by with something much smaller.
  6. Once you’ve completed the required fields, click Create and wait for the deployment to complete
  7. After it’s finished deploying, open your Function App and click the + button to create a new function
  8. Choose Custom function at the bottom
  9. In the dropdown on the right, choose PowerShell
  10. Choose TimerTrigger-PowerShell and enter a name for your Azure Function
  11. For the Schedule, enter a cron expression. There used to be documentation at the bottom of the page on how to format these, though at the time of writing it hasn’t appeared. For a function that runs Monday to Friday at 9:30 AM GMT time, enter the following:
    0 30 9 * * 1-5

  12. Click Create. You’ll be greeted with an almost blank screen where you can start entering your PowerShell script. Before we do this, we’ll set up the Azure Function to connect to Office 365 and secure your credentials within the Function App.
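Since the inline schedule documentation wasn’t appearing at the time of writing, here’s the six-field NCRONTAB format that Azure Functions timer triggers use, with a few example schedules:

```
# {second} {minute} {hour} {day} {month} {day-of-week}
0 30 9 * * 1-5     # 9:30:00 AM, Monday to Friday (the schedule used above)
0 0 * * * *        # at the top of every hour
0 */15 * * * *     # every 15 minutes
0 0 9 * * 1        # 9:00 AM every Monday
```

Note that, unlike standard five-field cron, the first field is seconds, and the times are evaluated in the function app’s time zone (UTC by default).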

Set up your Azure Function to connect to Office 365

In this step, we’ll be doing the following:

Define and retrieve your FTP Details

The FTP Details of the Azure Function are needed to upload resources that the Azure Function requires to connect to Office 365.

Download, then upload the MSOnline PowerShell Module via FTP

Azure Functions have a lot of PowerShell Modules installed by default, though they don’t have the MSOnline module that lets us connect to Office 365. We’ll need to download the module on our local computer, then upload it into the Azure function. This method was borrowed from this article by Alexandre Verkinderen.

Secure your Office 365 Credentials within the Function App

Right now, Azure Functions don’t integrate with the Azure Key Vault service, and credentials stored directly within a function are kept in plain text, where anyone with access to the function can view them. Instead, we’ll store an AES-encrypted password in the function’s Application Settings and keep the encryption key in the function’s file system. This method was borrowed from this article by Tao Yang.

How to define and retrieve the FTP credentials for your Azure function app

  1. Click on the name of your function on the left menu
  2. Click Platform Features at the top, then click Deployment Credentials
  3. Define a username and password for your FTP credentials
  4. Next, under General Settings, click Properties
  5. Copy the FTP Host Name and make a note of it. You’ll need it to connect to the function’s storage via FTP and upload the MSOnline module

Download, then upload the MSOnline PowerShell Module via FTP

  1. Open PowerShell on your computer, then run the following command. Make sure there’s a folder called ‘temp’ on your C:\ drive.
    Save-Module msonline -Repository PSGallery -Path "C:\temp"

  2. Wait for it to download, then make sure it exists within C:\temp
  3. Open Windows Explorer, and connect to your function via FTP using the FTP Hostname and credentials we retrieved earlier
  4. Navigate to site/wwwroot/YourFunctionName, then create a new folder called bin
  5. Open the bin directory, and upload the MSOnline folder from your C:\temp directory
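To confirm the upload worked, you can temporarily paste a quick check into the function body (YourFunctionName is a placeholder for your own function’s name):

```powershell
# List the uploaded module files from within the function.
# D:\home is the persistent storage root for Azure App Service on Windows.
Get-ChildItem "D:\home\site\wwwroot\YourFunctionName\bin\MSOnline" -Recurse |
    Select-Object FullName
```

If the module files appear in the function’s log output, the FTP upload succeeded and the module path used later in the script will resolve.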

Secure your Office 365 Credentials within the Azure Function App

  1. On your computer, open PowerShell again and run the following commands. When you’re asked for your password, enter the password for the delegated admin account that you’ll use to manage your customers’ Office 365 environments. Make sure you press Enter again to run the final command to output the EncryptedPassword.txt file.
    $AESKey = New-Object Byte[] 32
    $Path = "C:\Temp\PassEncryptKey.key"
    $EncryptedPasswordPath = "C:\Temp\EncryptedPassword.txt"
    # Fill the key with cryptographically random bytes, then save it to the key file
    [Security.Cryptography.RNGCryptoServiceProvider]::Create().GetBytes($AESKey)
    Set-Content $Path $AESKey
    $Password = Read-Host "Please enter the password"
    $secPw = ConvertTo-SecureString -AsPlainText $Password -Force
    $AESKey = Get-Content $Path
    $EncryptedPassword = $secPw | ConvertFrom-SecureString -Key $AESKey
    $EncryptedPassword | Out-File -FilePath $EncryptedPasswordPath

    This will create two files in your C:\temp folder: an EncryptedPassword text file and a PassEncryptKey file. Be sure to delete the EncryptedPassword file once we’re done.

  2. Return to the FTP connection and create a directory called keys under the bin directory
  3. Upload the PassEncryptKey file into the keys directory
  4. Return to your Azure Function Platform Settings, then open Application Settings.
  5. Under Application Settings, create two new key-value pairs: one called user, which contains the username of your delegated admin account, and another called password, which contains the contents of your EncryptedPassword.txt file. Once you’ve added these, be sure to delete the EncryptedPassword.txt file from your computer.
  6. Before you leave Application Settings, update the Platform setting from 32 bit to 64 bit
  7. Wait for the settings to apply, then return to the Develop section of your Azure Function
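If the function later fails to build its credential, you can re-run the encryption steps and verify locally that the key decrypts the password. This round-trip check mirrors what the function will do (the username is a placeholder):

```powershell
# Round-trip check: decrypt EncryptedPassword.txt with PassEncryptKey.key,
# the same way the function rebuilds its credential.
$AESKey = Get-Content "C:\Temp\PassEncryptKey.key"
$EncryptedPassword = Get-Content "C:\Temp\EncryptedPassword.txt"
$secPw = $EncryptedPassword | ConvertTo-SecureString -Key $AESKey
$cred = New-Object System.Management.Automation.PSCredential ("[email protected]", $secPw)
$cred.GetNetworkCredential().Password   # should print the original password
```

Remember to delete both local copies of the EncryptedPassword file once you’ve confirmed it decrypts correctly.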

Modify your Office 365 PowerShell script for Azure Functions

  1. Update the variables at the top of the script to ensure they match your function name, module name and module version. For your existing scripts, you may need to update your Write-Host references to Write-Output. This sample script is a modified version of this one. It will set the default password expiration policy for all of your customers’ domains to never expire. You can use it, or create your own script under the # Start Script comment.
    Write-Output "PowerShell Timer trigger function executed at: $(Get-Date)"
    $FunctionName = 'SetPasswordExpirationPolicy'
    $ModuleName = 'MSOnline'
    $ModuleVersion = ''   # set this to the version folder name inside bin\MSOnline
    $username = $Env:user
    $pw = $Env:password
    # Import the MSOnline module uploaded via FTP
    $PSModulePath = "D:\home\site\wwwroot\$FunctionName\bin\$ModuleName\$ModuleVersion\$ModuleName.psd1"
    $res = "D:\home\site\wwwroot\$FunctionName\bin"
    Import-Module $PSModulePath
    # Build Credentials from the app settings and the uploaded AES key
    $keypath = "D:\home\site\wwwroot\$FunctionName\bin\keys\PassEncryptKey.key"
    $secpassword = $pw | ConvertTo-SecureString -Key (Get-Content $keypath)
    $credential = New-Object System.Management.Automation.PSCredential ($username, $secpassword)
    # Connect to MSOnline
    Connect-MsolService -Credential $credential
    # Start Script
    $Customers = Get-MsolPartnerContract -All
    $PartnerInfo = Get-MsolCompanyInformation
    Write-Output "Found $($Customers.Count) customers for $($PartnerInfo.DisplayName)"
    foreach ($Customer in $Customers) {
    	Write-Output "-----------------------------------------------"
    	Write-Output " "
    	Write-Output "Checking the Password Expiration Policy on each domain for $($Customer.Name)"
    	Write-Output " "
    	$domains = Get-MsolDomain -TenantId $Customer.TenantId | Where-Object {$_.Status -eq "Verified"}
    	foreach ($domain in $domains) {
    		$domainStatus = Get-MsolPasswordPolicy -TenantId $Customer.TenantId -DomainName $domain.Name
    		if ($domainStatus.ValidityPeriod -eq 2147483647) {
    			Write-Output "Password Expiration Policy is set for $($domain.Name) already"
    			$PasswordsWillExpire = $false
    			$MsolPasswordPolicyInfo = @{
    				TenantId = $Customer.TenantId
    				CompanyName = $Customer.Name
    				DomainName = $domain.Name
    				ValidityPeriod = $domainStatus.ValidityPeriod
    				NotificationDays = $domainStatus.NotificationDays
    				PasswordsWillExpire = $PasswordsWillExpire
    			}
    		}
    		if ($domainStatus.ValidityPeriod -ne 2147483647) {
    			Write-Output "Setting the Password Expiration Policy on $($domain.Name) for $($Customer.Name):"
    			Write-Output " "
    			Set-MsolPasswordPolicy -TenantId $Customer.TenantId -DomainName $domain.Name -ValidityPeriod 2147483647 -NotificationDays 30
    			$PasswordPolicyResult = Get-MsolPasswordPolicy -TenantId $Customer.TenantId -DomainName $domain.Name
    			if ($PasswordPolicyResult.ValidityPeriod -eq 2147483647) {
    				$PasswordsWillExpire = $false
    				Write-Output "Password policy change confirmed working"
    			}
    			if ($PasswordPolicyResult.ValidityPeriod -ne 2147483647) {
    				$PasswordsWillExpire = $true
    				Write-Output "Password policy change not confirmed yet, you may need to run this again."
    			}
    			$MsolPasswordPolicyInfo = @{
    				TenantId = $Customer.TenantId
    				CompanyName = $Customer.Name
    				DomainName = $domain.Name
    				ValidityPeriod = $PasswordPolicyResult.ValidityPeriod
    				NotificationDays = $PasswordPolicyResult.NotificationDays
    				PasswordsWillExpire = $PasswordsWillExpire
    			}
    		}
    	}
    }
  2. Click Run to manually start the script. You should see the following output under Logs.

Enabling the Unified Audit Log on all delegated Office 365 tenants via PowerShell

What is the Office 365 Unified Audit Log?

For security and compliance in Office 365, the Unified Audit Log is probably the most important tool of all. It tracks every user and account action across all of the Office 365 services. You can run reports on deletions, shares, downloads, edits, reads etc, for all users and all products. You can also set up custom alerting to receive notifications whenever specific activities occur.

For all of its usefulness, the most amazing thing about it is that it’s not turned on by default.

It can be extremely frustrating to come across a query or problem that could easily be resolved if you had access to the logs, only to find out they were never enabled in the first place. Here’s how to get it set up in your own organisation, or if you’re a Microsoft Partner, how to script it for all of your customers using Delegated Administration and PowerShell.

How to enable the Unified Audit Log for a single Office 365 tenant

If you’re only managing your own tenant, it’s quite simple to turn it on. You can do this in two ways.

How to enable the Unified Audit Log via the Security and Compliance Center for a single Office 365 tenant

  1. Visit the Security & Compliance Center as an Office 365 admin
  2. Click Search & investigation
  3. Click Audit log search
  4. If it’s not enabled you’ll see a link to Start recording user and admin activities. Click it to enable the Unified Audit Log.

How to enable the Unified Audit Log via PowerShell for a single Office 365 tenant

  1. Connect to Exchange Online via PowerShell as an administrator by following this guide
  2. Make sure your Office 365 tenant is ready for the Unified Audit Log by enabling Organization Customization:
    Enable-OrganizationCustomization
  3. Run the following command to enable the Unified Audit Log:
    Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true
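You can then confirm the change took effect with the matching Get cmdlet (it may take a little while to reflect):

```powershell
# Verify the Unified Audit Log is now enabled in this tenant
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled
```

This is the same check the bulk scripts below perform for each customer tenant before deciding whether to create a temporary admin.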

How to Enable the Unified Audit Log on Multiple Office 365 tenants using Delegated Administration via PowerShell

I’ve recently written a few posts on running bulk PowerShell operations across all of your customers’ Office 365 tenants.

Since the PowerShell command for enabling the Unified Audit Log is just one line, I assumed we’d be able to add it as a script block and run it across all of our Office 365 customers at once.

When I tried setting this up, it initially appeared to be working, though I soon received the following error:

The remote server returned an error: (401) Unauthorized.

It looks like Microsoft don’t allow you to run this particular script using Delegated Administration, though I’m not too sure why. You also can’t enable it via the Security & Compliance Center using your delegated admin credentials; it just seems to revert you back to the settings for your own Office 365 tenant.

In order to enable the Unified Audit Log, we’ll need to activate it using an admin within the customer’s Office 365 tenant. The remainder of this blog post contains the instructions on how to script this process.


Use the following scripts at your own risk. They are designed to temporarily create Global Admins with a standard password (chosen by you) in each of your customers’ environments. If all goes well, every admin that was created should be deleted automatically. If some tenants fail to enable the Unified Audit Log correctly, the new admin for those tenants will remain (I’ve included a script to remove these ones too). Also, see step 3 for a link to a script that reports on every unlicensed Office 365 Company Admin in your Office 365 tenants. Use it to verify that none of these temporary admins remain.

This process has three parts

  1. PowerShell Script One: Checking Unified Audit Log Status and creating admin users
  2. PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins
  3. PowerShell Script Three (And Optional Script): Removing unsuccessful admins and checking tenants for all unlicensed admins.

Things you should know beforehand

For the most part, these scripts work. Using these three scripts, I’ve enabled the Unified Audit Log on 227 of our 260 delegated Office 365 customers. However, there are a few error messages that can pop up, and a few reasons that will prevent it working for some Office 365 tenants at all.

Here are a few things to keep in mind:

  • It doesn’t work with LITEPACK and LITEPACK_P2 subscriptions

    In our case these are Telstra customers running the older Office 365 Small Business and Office 365 Small Business Premium subscriptions. You can run our Office 365 Delegated Tenant license report to identify these customers.

  • It does not work on customers that don’t have any subscriptions, or only have expired subscriptions

    It won’t work for Office 365 tenants that don’t have any Office 365 subscriptions, or if their subscriptions have expired. The script will fail for these organisations with the error: The tenant organization isn’t in an Active State. Complete the administrative tasks that are active for this organization, and then try again.

  • It does not work on customers that only have Dynamics CRM licenses

    This script doesn’t seem to run on customers that only have Dynamics CRM Online. It hasn’t been tested with customers that only have Dynamics 365.

  • You should wait before running the second PowerShell Script

    It can take a while for the temporary admin user to receive the appropriate permissions in your customers’ Office 365 organisations. If you run the second script too soon, the temporary admin may not be able to pull down all the Exchange Online cmdlets to perform the required tasks.

PowerShell Script One: Checking Unified Audit Log Status and creating admin users

This script uses your own delegated admin credentials. It creates a list of all of your Office 365 Customers and reports on their subscriptions. If they have at least one subscription (active or not) it attempts to run an Exchange Online cmdlet to check whether the Unified Audit Log is enabled. If it’s enabled, it does nothing and moves onto the next customer. If it’s disabled, it creates a new user, assigns it to the Company Administrator role and adds a row to a CSV with the tenant ID, customer name and user principal name.

To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.

Modify the placeholder variables at the top of the script and run it in PowerShell.

<# This script will connect to all delegated Office 365 tenants and check whether the Unified Audit Log is enabled. If it's not, it will create an Exchange admin user with a standard password. Once it's processed, you'll need to wait a few hours (preferably a day), then run the second script. The second script connects to your customers' Office 365 tenants via the new admin users and enables the Unified Audit Log ingestion. If successful, the second script will also remove the admin users created in this script. #>


# Here are some things you can modify:

# This is your partner admin user name that has delegated administration permission

$UserName = "[email protected]"

# IMPORTANT: This is the default password for the temporary admin users. Don't leave this as Password123, create a strong password between 8 and 16 characters containing Lowercase letters, Uppercase letters, Numbers and Symbols.

$NewAdminPassword = "Password123"

# IMPORTANT: This is the default User Principal Name prefix for the temporary admin users. Don't leave this as gcitsauditadmin, create something UNIQUE that DOESN'T EXIST in any of your tenants already. If it exists, it'll be turned into an admin and then deleted.

$NewAdminUserPrefix = "gcitsauditadmin"

# This is the path for the exported CSVs. You can change this, though you'll need to make sure the path exists. This location is also referenced in the second script, so I recommend keeping it the same.

$CreatedAdminsCsv = "C:\temp\CreatedAdmins.csv"

$UALCustomersCsv = "C:\temp\UALCustomerStatus.csv"

# Here's the end of the things you can modify.


# This script block gets the Audit Log config settings

$ScriptBlock = {Get-AdminAuditLogConfig}

$Cred = get-credential -Credential $UserName

# Connect to Azure Active Directory via Powershell

Connect-MsolService -Credential $cred

$Customers = Get-MsolPartnerContract -All

# Initialise the collection of customers that don't have the Unified Audit Log enabled

$UALDisabledCustomers = @()

$CompanyInfo = Get-MsolCompanyInformation

Write-Host "Found $($Customers.Count) customers for $($CompanyInfo.DisplayName)"

Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "

foreach ($Customer in $Customers) {

	Write-Host $Customer.Name.ToUpper()
	Write-Host " "

	# Get license report

	Write-Host "Getting license report:"

	$CustomerLicenses = Get-MsolAccountSku -TenantId $Customer.TenantId

	foreach ($CustomerLicense in $CustomerLicenses) {

		Write-Host "$($Customer.Name) is reporting $($CustomerLicense.SkuPartNumber) with $($CustomerLicense.ActiveUnits) Active Units. They've assigned $($CustomerLicense.ConsumedUnits) of them."

	}

	if ($CustomerLicenses.Count -gt 0) {

		Write-Host " "

		# Get the initial domain for the customer.

		$InitialDomain = Get-MsolDomain -TenantId $Customer.TenantId | Where {$_.IsInitial -eq $true}

		# Construct the Exchange Online URL with the DelegatedOrg parameter.

		$DelegatedOrgURL = "https://ps.outlook.com/powershell-liveid?DelegatedOrg=" + $InitialDomain.Name

		Write-Host "Getting UAL setting for $($InitialDomain.Name)"

		# Invoke-Command establishes a Windows PowerShell session based on the URL,
		# runs the command, and closes the Windows PowerShell session.

		$AuditLogConfig = Invoke-Command -ConnectionUri $DelegatedOrgURL -Credential $Cred -Authentication Basic -ConfigurationName Microsoft.Exchange -AllowRedirection -ScriptBlock $ScriptBlock -HideComputerName

		Write-Host " "
		Write-Host "Audit Log Ingestion Enabled:"
		Write-Host $AuditLogConfig.UnifiedAuditLogIngestionEnabled

		# Check whether the Unified Audit Log is already enabled and log status in a CSV.

		if ($AuditLogConfig.UnifiedAuditLogIngestionEnabled) {

			$UALCustomerExport = @{
				TenantId = $Customer.TenantId
				CompanyName = $Customer.Name
				DefaultDomainName = $Customer.DefaultDomainName
				UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
				UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
				DistinguishedName = $AuditLogConfig.DistinguishedName
			}

			$UALCustomersExport = @()
			$UALCustomersExport += New-Object psobject -Property $UALCustomerExport
			$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -NoTypeInformation -Path $UALCustomersCsv -Append

		}

		# If the Unified Audit Log isn't enabled, log the status and create the admin user.

		if (!$AuditLogConfig.UnifiedAuditLogIngestionEnabled) {

			$UALDisabledCustomers += $Customer

			$UALCustomerExport = @{
				TenantId = $Customer.TenantId
				CompanyName = $Customer.Name
				DefaultDomainName = $Customer.DefaultDomainName
				UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
				UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
				DistinguishedName = $AuditLogConfig.DistinguishedName
			}

			$UALCustomersExport = @()
			$UALCustomersExport += New-Object psobject -Property $UALCustomerExport
			$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -NoTypeInformation -Path $UALCustomersCsv -Append

			# Build the User Principal Name for the new admin user

			$NewAdminUPN = -join($NewAdminUserPrefix,"@",$($InitialDomain.Name))

			Write-Host " "
			Write-Host "Audit Log isn't enabled for $($Customer.Name). Creating a user with UPN: $NewAdminUPN, assigning user to Company Administrators role."
			Write-Host "Adding $($Customer.Name) to CSV to enable UAL in second script."

			New-MsolUser -TenantId $Customer.TenantId -DisplayName "Audit Admin" -UserPrincipalName $NewAdminUPN -Password $NewAdminPassword -ForceChangePassword $false

			Add-MsolRoleMember -TenantId $Customer.TenantId -RoleName "Company Administrator" -RoleMemberEmailAddress $NewAdminUPN

			$AdminProperties = @{
				TenantId = $Customer.TenantId
				CompanyName = $Customer.Name
				DefaultDomainName = $Customer.DefaultDomainName
				UserPrincipalName = $NewAdminUPN
				Action = "ADDED"
			}

			$CreatedAdmins = @()
			$CreatedAdmins += New-Object psobject -Property $AdminProperties
			$CreatedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -NoTypeInformation -Path $CreatedAdminsCsv -Append

			Write-Host " "

		}

	}

}

Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "


Write-Host "Admin Creation Completed for tenants without Unified Audit Logging, please wait 12 hours before running the second script."

Write-Host " "

See the Unified Audit Log status for your customers

One of the outputs of this script is the UALCustomerStatus.csv file. You can make a copy of it, and rerun the process at the end to compare the results.
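A quick way to run that comparison is with Compare-Object. The `_before` file name below is an assumption; use whatever you named your copy:

```powershell
# Compare the before/after Unified Audit Log status exports.
# UALCustomerStatus_before.csv is a hypothetical name for your saved copy.
$before = Import-Csv "C:\temp\UALCustomerStatus_before.csv"
$after  = Import-Csv "C:\temp\UALCustomerStatus.csv"
Compare-Object $before $after -Property TenantId, UnifiedAuditLogIngestionEnabled |
    Where-Object { $_.SideIndicator -eq "=>" } |      # rows as they appear in the new export
    Select-Object TenantId, UnifiedAuditLogIngestionEnabled
```

Tenants whose status changed will appear in the output with their new UnifiedAuditLogIngestionEnabled value.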

Browse the list of created admins

The script will also create a CSV containing the details for each admin created. This CSV will be imported by the second PowerShell Script and will be used to enable the Unified Audit Log on each tenant.

PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins

This script should be run at least a few hours after the first script to ensure that the admin permissions have had time to correctly apply. If you don’t wait long enough, your admin user may not have access to the required Exchange Online cmdlets.

You’ll need to update the password in this script to reflect the password you chose for your temporary admins in the first script.

To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.

Modify the placeholder variables at the top of the script and run it in PowerShell.

<# This script will use the admin users created by the first script to enable the Unified Audit Log in each tenant. If enabling the Unified Audit Log is successful, it'll remove the created admin. If it's not successful, it'll keep the admin in place and add it to another CSV. You can retry these tenants by modifying the $Customers value to import the RemainingAdminsCsv in the next run. #>


# Here are some things you can modify:

# This is your partner admin user name that has delegated administration permission

$UserName = "[email protected]"

# IMPORTANT: This is the default password for the temporary admin users. Use the same password that you specified in the first script.

$NewAdminPassword = "Password123"

# This is the CSV containing the details of the created admins generated by the first script. If you changed the path in the first script, you'll need to change it here.

$Customers = import-csv "C:\temp\CreatedAdmins.csv"

# This CSV will contain a list of all admins removed by this script.

$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"

# This CSV will contain a list of all unsuccessful admins left unchanged by this script. Use it to retry this script without having to start again.

$RemainingAdminsCsv = "C:\temp\RemainingAdmins.csv"


$Cred = Get-Credential -Credential $UserName

# Connect to Azure Active Directory so we can remove the temporary admins later

Connect-MsolService -Credential $Cred

foreach ($Customer in $Customers) {

	Write-Host $Customer.CompanyName.ToUpper()
	Write-Host " "

	$NewAdminUPN = $Customer.UserPrincipalName

	$secpasswd = ConvertTo-SecureString $NewAdminPassword -AsPlainText -Force

	$NewAdminCreds = New-Object System.Management.Automation.PSCredential ($NewAdminUPN, $secpasswd)

	Write-Host " "

	Write-Output "Getting the Exchange Online cmdlets as $NewAdminUPN"

	$Session = New-PSSession -ConnectionUri "https://ps.outlook.com/powershell-liveid/" `
	-ConfigurationName Microsoft.Exchange -Credential $NewAdminCreds `
	-Authentication Basic -AllowRedirection
	Import-PSSession $Session -AllowClobber

	# Enable the customization of the Exchange Organisation

	Enable-OrganizationCustomization -ErrorAction SilentlyContinue

	# Enable the Unified Audit Log

	Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true

	# Find out whether it worked

	$AuditLogConfigResult = Get-AdminAuditLogConfig

	Remove-PSSession $Session

	# If it worked, remove the Admin and add the removed admin details to a CSV

	if ($AuditLogConfigResult.UnifiedAuditLogIngestionEnabled) {

		# Remove the temporary admin
		Write-Host "Removing the temporary Admin"

		Remove-MsolUser -TenantId $Customer.TenantId -UserPrincipalName $NewAdminUPN -Force

		$AdminProperties = @{
			TenantId = $Customer.TenantId
			CompanyName = $Customer.CompanyName
			DefaultDomainName = $Customer.DefaultDomainName
			UserPrincipalName = $NewAdminUPN
			Action = "REMOVED"
		}

		$RemovedAdmins = @()
		$RemovedAdmins += New-Object psobject -Property $AdminProperties
		$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -NoTypeInformation -Path $RemovedAdminsCsv -Append

	}

	# If it didn't work, keep the Admin and add the admin details to another CSV. You can use the RemainingAdmins CSV if you'd like to try again.

	if (!$AuditLogConfigResult.UnifiedAuditLogIngestionEnabled) {

		Write-Host "Enabling Audit Log Failed, keeping the temporary Admin"

		$AdminProperties = @{
			TenantId = $Customer.TenantId
			CompanyName = $Customer.CompanyName
			DefaultDomainName = $Customer.DefaultDomainName
			UserPrincipalName = $NewAdminUPN
			Action = "UNCHANGED"
		}

		$RemainingAdmins = @()
		$RemainingAdmins += New-Object psobject -Property $AdminProperties
		$RemainingAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -NoTypeInformation -Path $RemainingAdminsCsv -Append

	}

	Write-Host " "
	Write-Host "----------------------------------------------------------"
	Write-Host " "

}


View the successful Office 365 admins that were removed

If the Unified Audit Log was enabled successfully, the newly created Office 365 admin will be automatically removed. You can see the results of this in the RemovedAdmins CSV.

See the remaining Office 365 admins that couldn’t enable the Unified Audit Log

If the Unified Audit Log couldn’t be enabled, the Office 365 admin will remain unchanged. If you like, you can use the RemainingAdmins CSV in place of the CreatedAdmins CSV and rerun the second script. In our case, some tenants that couldn’t be enabled on the first try were enabled on the second or third try.

Office 365 Admins That Remain Unchanged Since Unified Audit Log Enable Failed


PowerShell Script Three: Removing unsuccessful admins

Any tenants that weren’t able to have their Unified Audit Log enabled via PowerShell will still have the Office 365 admin active. This script will import these admins from the RemainingAdmins CSV and remove them.

Once removed, it will add them to the RemovedAdmins CSV. You can compare this to the CreatedAdmins CSV from the first script to make sure they’re all gone.

<# This script imports the admin users that the second script couldn't remove (the RemainingAdmins CSV) and removes them from each customer tenant. Each removed admin is added to the RemovedAdmins CSV, which you can compare against the CreatedAdmins CSV from the first script to confirm they're all gone. #>


# Here are some things you can modify:

# This is your partner admin user name that has delegated administration permission

$UserName = "[email protected]"

# This CSV contains a list of all remaining unsuccessful admins left unchanged by the second script.

$RemainingAdmins = import-csv "C:\temp\RemainingAdmins.csv"

# This CSV will contain a list of all admins removed by this script.

$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"


$Cred = get-credential -Credential $UserName

Connect-MsolService -Credential $cred

ForEach ($Admin in $RemainingAdmins) {

	$tenantID = $Admin.Tenantid

	$upn = $Admin.UserPrincipalName

	Write-Output "Deleting user: $upn"

	Remove-MsolUser -UserPrincipalName $upn -TenantId $tenantID -Force

	$AdminProperties = @{
		TenantId = $tenantID
		CompanyName = $Admin.CompanyName
		DefaultDomainName = $Admin.DefaultDomainName
		UserPrincipalName = $upn
		Action = "REMOVED"
	}

	$RemovedAdmins = @()
	$RemovedAdmins += New-Object psobject -Property $AdminProperties
	$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -NoTypeInformation -Path $RemovedAdminsCsv -Append
}
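To confirm that every admin created by the first script is now gone, you can compare the CreatedAdmins and RemovedAdmins CSVs by UserPrincipalName. A quick sketch, assuming the file paths used in the scripts above:

```powershell
# Compare the created and removed admin CSVs; anything in the output
# was created but never removed. Paths assume the earlier scripts.
$Created = Import-Csv "C:\temp\CreatedAdmins.csv"
$Removed = Import-Csv "C:\temp\RemovedAdmins.csv"

Compare-Object -ReferenceObject $Created -DifferenceObject $Removed -Property UserPrincipalName |
	Where-Object { $_.SideIndicator -eq "<=" } |
	Select-Object UserPrincipalName
```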


Want to see all the current Office 365 global administrators in your customers’ tenants?

To confirm that all of the created admins from these scripts have been removed, or just to see which global administrators have access to your customer tenants, you can run the scripts here. If required, there’s a second script that will block the credentials of the admins that you leave in the exported CSV.


Resolving issues when migrating from Telstra syndicated Office 365 to Office 365 via CSP

I ran a small migration on Friday last week that took a lot longer than planned due to a few unexpected issues. The customer was migrating from Telstra’s Office 365 service (AKA syndicated account) to a Microsoft Office 365 account licensed via the CSP model.

This is a pretty standard migration that we run often, though this time I ran into a number of issues.

Since it was a small migration (4 users), and it needed to be moved over in a short amount of time, I opted to do a standard PST migration on an Azure VM. The plan was as follows:

  1. Disconnect the domain from the existing tenant
  2. Add it to the new tenant by moving it to new name servers and verifying the DNS
  3. Create the required users on the new tenant
  4. Set up the Exchange accounts for the old users via their addresses and export the mail via PSTs
  5. Set up the Exchange accounts for the new users and import the mail

Here’s a few important details regarding the source tenant:

  • Office 365 licensed via Telstra on a syndicated account
  • Domain DNS hosted on Microsoft’s name servers

Here are some important details regarding the destination tenant:

  • Office 365 licensed via CSP model
  • Domain DNS hosted on external name servers

Problem #1: Cannot create a user with the same UPN as before

My first issue occurred when I started to set up the users on the new tenant. Three out of four worked correctly, though the fourth mailbox (unfortunately the most important one) would not create correctly on the new tenant.

I was given the error OrgIdMailboxRecentlyCreatedException and the message:

Hang on, we’re not quite ready

It looks like your account [email protected] was created 1 hour ago. It can take up to 24 hours to set up a mailbox.

Hang on, we're not quite ready

I did some testing and discovered that the error message only appears when I set the User Principal Name to match what it was on the old tenant ([email protected]). If I set it up to use a variation of the old name ([email protected]), it worked instantly.

I suspected this had to do with an internal updating issue in how Office 365 maps usernames to mailboxes, and decided to give it a bit of time.

In the meantime, the customer needed to be able to send and receive as their previous email address. So I logged onto Exchange via Powershell and ran the following cmdlets:

Set-Mailbox paulm -WindowsEmailAddress [email protected]
Set-Mailbox paulm -EmailAddresses [email protected], [email protected]

These cmdlets keep the user principal name as [email protected] while allowing the user to send and receive mail as [email protected].
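To double-check the change took effect, you can pull back the address properties for the mailbox (the alias here follows the example above; substitute your own user):

```powershell
# Confirm the primary SMTP address and the full address list after the change
Get-Mailbox paulm | Select-Object UserPrincipalName, WindowsEmailAddress, EmailAddresses
```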

Once this was confirmed working, I set it up on the user’s Outlook profile and kicked off the migration.

I left this running overnight, and in the morning faced a new problem.

Problem #2: Cannot send to BigPond addresses due to SPF issue

The next issue was related to recipients using Telstra BigPond addresses. Emails sent to external domains worked fine, though emails sent to BigPond accounts would fail instantly, returning a message stating that the message was rejected by the recipient email server.

The error message was <[email protected]> Sender rejected. IB506

SPF error

The SPF headers in the returned email state that there is no SPF record configured for the domain:

No spf record

Does not designate permitted sender hosts

I double checked the SPF records on the new name servers and they were configured correctly. I also sent test emails to my own external addresses, and the SPF headers were fine too.
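If you run into something similar, it can help to confirm what public resolvers see for the domain, independent of any internal name servers. A quick sketch using Resolve-DnsName (the domain is a placeholder, and 8.8.8.8 is just one public resolver):

```powershell
# Ask a public resolver directly for the domain's SPF TXT record
Resolve-DnsName "example.com" -Type TXT -Server 8.8.8.8 |
	Where-Object { $_.Strings -match "v=spf1" } |
	Select-Object -ExpandProperty Strings
```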

I’m not 100% sure what the cause of the issue is, though I suspect it’s because Microsoft used to manage the DNS for this domain via their internal Office 365 name servers. If the change takes a while to propagate throughout the system, it may still be looking at their internal name servers for records relating to this domain.

Solution: Switch Name Servers back to Microsoft

To resolve this issue, I transferred management of the DNS records back to Microsoft’s Office 365 name servers, and within about half an hour, all of the issues were resolved.

I was able to rename the user to the correct email ([email protected]) and users no longer had issues mailing BigPond accounts.

Since I was trying everything to resolve this issue, I’m not sure what the ultimate fix was, though it seems that switching the name servers back to Microsoft on the new tenant did the most good.

I suspect that this was a DNS propagation problem within Office 365, and may have been resolved in time anyway. If you’re currently experiencing it, moving the DNS to Microsoft’s Office 365 name servers may speed up the resolution.


How to set up Office 365 email on iPhone – Video updated for iOS 10

Apple has (very slightly) changed the way we add Office 365 email accounts to iOS 10. We’ve taken the opportunity to update our most popular how-to video. We’re also giving a big plug for Outlook for iOS. Microsoft have done a great job with this app, and we think it’s a must-have for Office 365 users. See how you can get your email under control in our 3-minute video:


How to add branding to Office 365 login screens

You can customise your Office 365 login screens via a service called Azure Active Directory (or Azure AD).

Microsoft Azure continues to transition to the new portal, and Azure AD is one of the last services to make the leap. Now that it’s in Preview on the new portal, we’ve made an updated video on how to easily brand your Office 365 login screens.


How to Install Office from Office 365 – Updated Video

As Office 365 evolves, we need to refresh our training materials. So here’s our updated video tutorial on how to install Office from Office 365.


Working with archive policies in Office 365 and SkyKick

SkyKick automates the migration of email from other platforms onto Office 365, though occasionally it needs a bit of help.

This is especially true when moving from Google Apps or Google for Work to Office 365. These mailboxes can bloat in size due to how both systems manage email folders.

Office 365 (Microsoft Exchange) stores email in folders, while Google assigns labels to email. The difference is that in Exchange an email can be in only one folder, while in Google an email can have multiple labels. When migrating from Google for Work to Office 365, SkyKick will create an Exchange folder for every Google label, and migrate emails that are assigned multiple labels into multiple folders.

This results in a bunch of duplication on the destination system.

When you’re using SkyKick to migrate large mailboxes from Google for Work, you may occasionally receive a message advising that the mailbox may exceed the storage limits on Office 365. While this message appears, synchronisation will be paused.

SkyKick May Exceed The Maximum Allowed in Office 365

In order to resume the migration for this mailbox you’ll need to do the following:

  1. Confirm you’re using the right retention policy
  2. Enable archiving on the mailbox.
  3. Ensure the archive is running.
  4. Mark the alert as completed.

What are Exchange Retention Policies?

In Exchange, each mailbox is assigned a Retention Policy that contains the retention settings for mail within the mailbox. Retention policies are made up of Retention Policy Tags.

Retention Policy Tags outline how long Exchange is going to keep a user’s mail before performing a specific action on it. For Retention Policy Tags, this action can be PermanentlyDelete, DeleteAndAllowRecovery or MoveToArchive.

You can also create Retention Policy Tags that only affect a specific type of folder, for example DeletedItems or JunkEmail. For a full list of options, see this Technet Article:

Why create your own Retention Policy?

When archiving is enabled on a mailbox, the default policy is to archive anything older than two years. This may be enough to get the migration running again, but just in case it’s not, you can create a new Retention Policy Tag with a shorter archive time limit, apply it to a new Retention Policy, then apply the policy to the user you want to archive mail for.

Alternatively, you can edit the default policy (known as Default MRM Policy) or its tags, though this will affect all users that have archiving enabled.
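Before editing the default policy, it’s worth checking exactly which tags it currently links. A quick check while connected to Exchange Online:

```powershell
# List the Retention Policy Tags linked to the Default MRM Policy
Get-RetentionPolicy "Default MRM Policy" |
	Select-Object -ExpandProperty RetentionPolicyTagLinks
```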

You can create a new Retention Policy and Retention Policy Tags via PowerShell or via the Exchange Control Panel. In Exchange Control Panel, these actions are performed under Compliance Management. In this tutorial we’ll be working in PowerShell.

Setting up a Retention Policy in PowerShell

This new Retention Policy will move any mail older than 1 year into a user’s archive. It will have one tag.

  1. Connect to Exchange Online via PowerShell as an Exchange Online Administrator
  2. Run the following PowerShell cmdlet
    New-RetentionPolicyTag "1 year move to archive" -Type All -RetentionEnabled $true -AgeLimitForRetention 365 -RetentionAction MoveToArchive

    Create New Retention Tag

  3. Create the new Retention Policy and link the tags
    New-RetentionPolicy "One Tag Policy" -RetentionPolicyTagLinks "1 year move to archive"

    Create New Retention Policy

  4. Assign a retention policy to a user
    Set-Mailbox -Identity UserAliasOrEmail -RetentionPolicy "One Tag Policy"

    Apply Policy To The User

  5. Confirm the Retention Policy was applied correctly by running:
    Get-Mailbox -Identity UserAliasOrEmail | ft Name,RetentionPolicy

    Confirm Policy Is Applied

Enable Archiving on a mailbox.

Once you’ve assigned the policy, you can enable archiving on the user’s mailbox. This can be done in the Exchange Control Panel under Recipients, Mailboxes on the right menu.

  1. In PowerShell, you can run the following cmdlet while connected to Exchange Online.
    Enable-Mailbox -Identity UserAliasOrEmail -Archive

Ensure the Archive is running.

  1. The Archive won’t run immediately, though you can force it along. You can check the size of the archive using the Get-MailboxStatistics cmdlet.
    Get-MailboxStatistics -identity UserAliasOrEmail -Archive
  2. By default, the cmdlet displays the Name, ItemCount, StorageLimitStatus and LastLogonTime of the mailbox. To see more info, append ‘| fl *‘ (minus the quotations) to the cmdlet.
    Get-MailboxStatistics -identity UserAliasOrEmail -Archive | fl *
  3. Your archive will probably be empty right now. To start the archive, run the following cmdlet.
    Start-ManagedFolderAssistant UserAliasOrEmail

    Force Archive To Run Using StartManaged Folder Assistant

  4. Now, if you run the Get-MailboxStatistics cmdlet a few more times, you’ll see the ItemCount increasing (provided, of course, that there’s email older than a year in the mailbox).

    Confirm Archive Is Running
  5. You can also run the cmdlet without -Archive, appending ‘| fl *‘, to see the full statistics for the user’s primary mailbox. Try it a few times and watch the TotalItemSize reduce as items are archived.
    Get-MailboxStatistics -Identity UserAliasOrEmail | fl *

    Get All Mailbox Statistics

    Get-MailboxStatistics -Identity UserAliasOrEmail -archive | fl *

    Archive Size

Mark the alert as complete

Once your archive has begun processing, you can return to SkyKick and mark the alert as complete. The migration for the mailbox will kick off again.


Forward email form entries into SharePoint Lists

EmailToSharePoint

A common requirement for our customers is to forward emails to SharePoint Online lists. This email data usually comes from website forms or enquiry pages, though there’s no out-of-the-box way to extract the form data from an email and upload it to separate columns in a SharePoint list.

Previously I was using Cloud2050 Email Sync, though it relied on software installed on a PC to work, and only worked while that PC was operational and Outlook was open.

Here’s a solution that operates completely in the cloud using Outlook Rules, Mailparser and Microsoft Azure Logic Apps.

The solution looks like this:

  1. Office 365 forwards email from your website’s form to your Mailparser address via an Outlook Rule or Exchange Transport Rule.
  2. Mailparser receives the email, extracts the form data and sends it to an Azure Logic App using a Generic HTTP Webhook.
  3. Your Azure Logic App receives the form data, connects to SharePoint Online and adds the form data into the appropriate SharePoint list columns.


  • Sign up for Mailparser – a free 30 day trial is available
  • Sign up for Microsoft Azure – use your Office 365 account, a free 30 day trial is available
  • A SharePoint List set up with the fields required for your form

Setting up MailParser

  1. Once you’ve signed up for Mailparser, sign in and click Create New InboxCreate New Inbox In
  2. Give it a name and add some notes:Name Mailparser Inbox
  3. You’ll be given an email address to forward your form emails to. Keep track of this address, as you’ll need it to receive the emails you send from Outlook or Exchange mail rules. Forward a couple of sample form emails to the address to get started.Get Mailparser Email
  4. Once your emails are received, you can set up your Parsing Rules:Add Mail Parsing Rules
  5. Usually, Mailparser will be able to automatically identify the field names and values from your forwarded email. If it doesn’t, click Try Something Else to give it some help; otherwise click OK, start with this.Automatic Mail Parsing Rule Set Up
  6. Now, we start setting up our Generic Webhook. Click Webhook Integrations on the left menu, then click Add New Integration.
    Click Webhook Integrations
  7. Click Generic Webhook.Click Generic Webhook
  8. Give it a descriptive name and type a sample URL into the Target URL field. We need to use a sample first so that we can copy the webhook’s JSON payload. We then use this JSON payload to help generate the actual Target URL from Azure Logic Apps in the next steps.Save And Test Webhook With Sample URL
  9. Next, click Save and test.
  10. Then Send test data. We expect this to fail, though it will give us the JSON payload.Send Test Data With Sample URL
  11. Copy the text from Body Payload into Notepad or Visual Studio Code.Sample URL Fails, Get Body Payload

Set up the Azure Logic App

  1. Log onto the Azure portal. If you don’t already have a subscription, you can sign up using your Office 365 account.
  2. Click New, search for Logic App, and click Logic AppSearch For Logic App
  3. Click CreateCreate Logic App
  4. Complete the fields, placing the Azure Logic App in the region of your choice. You can name the Resource group whatever you like, or use an existing one. Click Create.Enter Logic App Details
  5. Click Edit to start editing your logic app.Edit Logic App
  6. Search for Request and click the Request TriggerCreate Request Trigger
  6. Now you can use your copied JSON Body Payload from Mailparser as a reference for your Request Body JSON Schema. You’ll need to define the data type for each Key-Value Pair in your JSON payload. This allows you to use the separate fields in your Azure Logic App, and add the field data into the appropriate SharePoint columns. The syntax of the Request Body JSON Schema is as follows:
    {
        "type": "object",
        "properties": {
            "name": {
                "type": "string"
            },
            "email": {
                "type": "string"
            }
        },
        "required": ["name", "email"]
    }

You can use Visual Studio Code, Notepad++ or Notepad to edit this schema so that it describes your JSON Payload.

Replace the properties values with the name of the keys in your JSON payload. Not all fields need to be added to the required array, only the ones that you need to create a valid SharePoint list entry.

In my case, this JSON body becomes the following JSON Schema.JSON Body In Visual Studio Code
JSON Request Body Schema

  1. Paste the Schema into the Request Body Schema and click Save.Save Request To Get POST URL
  2. You will then receive the URL that you can use in Mailparser to send your requests:
  3. Next click + New step.Add New Step To Logic App
  4. Type SharePoint and click SharePoint – Create item.Create SharePoint List Item
  5. You may need to add a Connection to SharePoint Online. If you’re prompted, add a connection using an Office 365 account that has permission to write to the required SharePoint list. If you don’t have a SharePoint list available to accept the data, you’ll need to set one up now before proceeding.
  6. Next enter your site URL. The List Name drop down will be populated with the available lists. You should also see that the Outputs from the Request step are available to use.Enter SharePoint Site And List Details
  7. The list columns that can accept strings, as well as a few other column types, will be available for you to modify. Click in each relevant column and select the relevant output.Add Outputs To SharePoint List
  8. Once you’re finished, go back to the Request Step in your Logic App and copy the URL from the Request stepCopy Request URL
  9. Return to Mailparser, go back to Webhook Integrations, and click Edit.Edit Webhook Integration
  10. Paste the URL from your Logic App Request step into the Target URL.Update Webhook Target URL
  11. Click Save and test.
  12. Click Send test data.Test Custom Webhook
  13. You should receive a response code of 202 to confirm it was sent successfully.Confirm Webhook Works
  14. You can now check Azure Logic Apps to confirm that it ran correctly.Logic App Runs Correctly
  15. You should also see the new entry in your SharePoint Online list.New Item In SharePoint
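Before wiring up the mail rules, you can also exercise the Logic App endpoint directly from PowerShell. This is a sketch: the URL is the one you copied from the Request step, and the field names must match your own schema:

```powershell
# Post a sample payload to the Logic App request trigger; a 202 response
# means the run was accepted. URL and field values are placeholders.
$uri = "https://<your-logic-app-request-url>"
$body = @{ name = "Test User"; email = "test@example.com" } | ConvertTo-Json

$response = Invoke-WebRequest -Uri $uri -Method Post -ContentType "application/json" -Body $body
$response.StatusCode
```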

Setting up the Outlook Rule

Once you’ve confirmed it’s working, you can set up your mail rules in Outlook or via Exchange to automatically forward form emails to your Mailparser email address.

  1. Right click on an email sent via your web form. Click Rules, then Create rule.Right Click Rules Create Rule
  2. Choose a condition that matches all emails sent via your form, eg. Subject. Then click Advanced Options…Tick Subject Click Advanced Options
  3. Click Next.Click Next On Outlook Rule Wizard
  4. Tick forward it to people or public group, then click people or public group.Forward To People Or Public Group
  5. Enter the email address from Mailparser, click OK, then click Next twice.Paste Email From Mail Parser
  6. Turn on the rule, and choose whether you want to run it on mail already in the same folder.Turn On Outlook Rule
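If you’d rather not depend on an Outlook rule, the same forwarding can be done server-side with an Exchange transport rule. A sketch, assuming a subject filter and a placeholder parser address:

```powershell
# Server-side alternative to the Outlook rule: copy matching form emails
# to the parser address. Subject text and address are placeholders.
New-TransportRule -Name "Forward web form to parser" `
	-SubjectContainsWords "Website enquiry" `
	-BlindCopyTo "parser-inbox@example.com"
```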

And that’s it. From now on, any mail sent by your website’s form will be automatically forwarded into Mailparser, broken up into the relevant fields, and added to SharePoint Online. You can also use Azure Logic Apps to automate a bunch of other business processes. Check out the documentation here.

Similar services to Azure Logic Apps include Microsoft Flow, Zapier and IFTTT.

Data Location

Act now to move your Office 365 data to Australia

Microsoft has delivered Office 365 from their Australian datacenters since the end of May 2015.

It was a big deal at the time, and it’s still a major selling point for their cloud platform, especially amongst businesses that have strict data residency requirements.

If you’ve purchased Office 365 since May 31, 2015 with an Australian billing address, you’ll be accessing your services from the Australian datacenters already. If you purchased it before then, some of your services might have moved automatically, though some may still be delivered from the Asia Pacific region.

If you’d like to move, be quick – the option is only available until October 31, 2016.

How to request a move to Australia’s datacenters

To make sure your organisation’s data is being hosted in Australia, or to request a move, follow these instructions.

  1. Log into the Office 365 Admin portal as a Global Administrator
  2. Click Settings then Organization Profile
    Organization Profile
  3. See your current Data locationData Location
  4. To move your data click Edit under Data residency optionData Residency Option
  5. Click the switch to Yes, then click SaveChanging Your Data Residency Option
  6. Within 12 months from October 31 2016, your data will be migrated to the Australian datacenters. You will be notified once it’s complete.Data Migration Confirmation

Since it’s a complex operation, no exact date for your migration can be given. See this link for more info:


Simplify External Contact and Group Management with PowerShell

Connect To PowerShell First

We have a couple of customers that want to maintain distribution groups of external contacts that can be used company wide.

The way to do this as an Exchange admin is to create a Mail Contact for an external user first, and then add that mail contact to a distribution group. This can be quite an involved process, and you may not want to have users traversing the Exchange Admin Center to complete this sort of task.
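For reference, the manual process the script wraps boils down to two Exchange Online cmdlets. The contact name, address and group below are placeholders:

```powershell
# Create a Mail Contact for the external user...
New-MailContact -Name "Jane External" -ExternalEmailAddress "jane@example.com"

# ...then add the contact to a company-wide distribution group
Add-DistributionGroupMember -Identity "All Partners" -Member "jane@example.com"
```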

To make this easier, we’ve put together a PowerShell script that you can download here.

Assign the minimum permissions

Any Global Administrator will be able to run this PowerShell script, though if you want to give a user the ability to execute the commands, you’ll need to assign them to the appropriate role groups. These are Recipient Management and Organization Management. Keep in mind, even though these are the minimum permissions required to run this PowerShell script, they still enable the relevant user to do pretty much everything within Exchange. For a full list of the permissions granted, see these links:

To give a user the correct permissions, connect to Exchange Online via Powershell as a global administrator and run the following commands. Replace [email protected] with the identity of the relevant user.

Add-RoleGroupMember "Recipient Management" -Member [email protected]
Add-RoleGroupMember "Organization Management" -Member [email protected]

Running the PowerShell Script

Once the user has been granted access, they can run the PowerShell script under their own credentials.

  1. Download the script here. 
  2. Rename it with a file extension of .ps1 eg. DistributionGroups.ps1
    Rename Distribution Groups To DistributionGroups.ps1
  3. Run the script by right-clicking the file and choosing Run with PowerShellRight Click to Run With Powershell
  4. Press 1, then Enter to connect to Exchange Online. Press Enter again once the commandlets have downloaded.Connect To PowerShell First
  5. Follow the menu items within the PowerShell script to perform the following actions:
  • Add Mail Contacts to distribution groups
  • Get a list of distribution groups
  • Create a distribution group
  • Get a list of distribution group members
  • Remove a contact from all distribution groups