Using Logic Apps to Trigger Key Vault Rotation

Previously I’ve written about how we can use Azure Key Vault and PIM Groups as a secure password management solution. Something I didn’t cover at the time is the requirement in large environments to rotate passwords regularly. To achieve this rotation, we can leverage Azure Logic Apps to trigger email requests to rotate keys. The setup in my previous post will be used as a basis for the configuration below.

Configure Logic App

We can create a new Logic App in the Azure Portal; all we need is an Azure subscription (which we should already have from setting up the Key Vault previously).

Create the App in the Azure Portal as below.

When the Logic App is provisioned, navigate to the “Identity” blade, enable the System Assigned Identity and click Save.

Next, we need to grant permission on our Key Vault to the Logic App identity. Open the Key Vault and open up Access Policies. Create a new Access Policy and assign the appropriate access to the service principal. For full control use the “Key, Secret & Certificate Management” template.

Now, back in our Logic App, we can start building out our logic. Firstly, add a trigger such as a recurrence pattern to schedule the app to run.

As we want to use our managed identity, we can’t use the default Key Vault connector, so we will instead send an API request directly. Select the HTTP connector and then the HTTP action.

Fill in the HTTP connector as below with the following values:

  • Method: GET
  • URI: https://<KeyVaultName>.vault.azure.net/secrets/<secretname>/?api-version=7.1
  • Authentication Type: Managed Identity
  • Managed Identity: System Assigned Managed Identity
  • Audience: https://vault.azure.net
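To see what this action actually returns, the same call can be reproduced from PowerShell. The below is just a rough sketch for testing, assuming the Az.Accounts module is installed and the placeholders are swapped for your own Key Vault and secret names; the Logic App itself authenticates with its managed identity rather than your user token.

# Rough PowerShell equivalent of the Logic App HTTP action (for testing only)
Connect-AzAccount
$token = (Get-AzAccessToken -ResourceUrl "https://vault.azure.net").Token
$uri = "https://<KeyVaultName>.vault.azure.net/secrets/<secretname>/?api-version=7.1"
$secret = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }
# The last-rotated time is a Unix timestamp under the 'attributes' property
$secret.attributes.updated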

Now we can store our results in a variable for sending. Add an action for “Initialize Variable”. Here we can specify which values from our query we want; we are specifically looking for the “updated” attribute, but we can capture the secret itself and other attributes in the same way.

Now that we have the last updated date, we have a lot of options. We can add a Planner task, send a webhook to our Service Desk tool, send an email and so on. This final step I’ll leave up to you, but Logic Apps is extremely flexible and can interact with a massive number of systems.

Overall, adding this on top of the Key Vault functionality is an easy way to add value at a very low cost.

Onboarding Windows 10 Devices to the Microsoft 365 Compliance Portal

The Microsoft 365 Compliance Portal has a huge number of useful features which can be used with cloud services. I’ve previously posted about the new Compliance Manager tool and how it can help to assess the controls in place in the tenancy while also recommending improvements. There are also tools such as DLP, Unified Labelling and Trainable Classifiers which provide some really flexible ways of protecting data.

These features so far relate to how a user operates within the Microsoft 365 service, but we also have some cool functionality which we can extend to the end user’s device. We can leverage tools like Insider Risk Management and Endpoint DLP to extend our protection even further.

Prerequisites

To enable the device functionality, we first need to ensure we meet the prerequisites. Microsoft have published the below list for us to verify on our devices:

  1. Must be running Windows 10 x64 build 1809 or later.
  2. Antimalware Client Version is 4.18.2009.7 or newer. Check your current version by opening the Windows Security app, selecting the Settings icon, and then selecting About. The version number is listed under Antimalware Client Version. Update to the latest Antimalware Client Version by installing Windows Update KB4052623. Note: none of the Windows Security components need to be active; you can run Endpoint DLP independent of Windows Security status.
  3. The following Windows Updates are installed. Note: These updates are not a pre-requisite to onboard a device to Endpoint DLP, but contain fixes for important issues thus must be installed before using the product.
    • For Windows 10 1809 – KB4559003, KB4577069, KB4580390
    • For Windows 10 1903 or 1909 – KB4559004, KB4577062, KB4580386
    • For Windows 10 2004 – KB4568831, KB4577063
    • For devices running Office 2016 (and not any other Office version) – KB4577063
  4. All devices must be Azure Active Directory (Azure AD) joined, or Hybrid Azure AD joined.
  5. Install the Chromium-based Microsoft Edge browser on the endpoint device to enforce policy actions for the upload-to-cloud activity. See Download the new Microsoft Edge based on Chromium.
  6. If you are on Monthly Enterprise Channel of Microsoft 365 Apps versions 2004-2008, there is a known issue with Endpoint DLP classifying Office content and you need to update to version 2009 or later. See Update history for Microsoft 365 Apps (listed by date) for current versions. To learn more about this issue, see the Office Suite section of Release notes for Current Channel releases in 2020.

Enable Device Onboarding

When we have met the prerequisites in our environment, we can now enable Device Onboarding from the Compliance Portal. Navigate to https://compliance.microsoft.com and open up “Settings” then “Device Onboarding”.

From here, we turn on device onboarding and we’ll see that any of our devices already onboarded to Microsoft Defender for Endpoint will already be included… more on this in a bit. For now, click OK to enable Onboarding.

We might need to wait a few minutes for everything to kick in, but once it does we are ready to onboard machines.

In the onboarding section, we can see the list of onboarding options available to us; you might notice that the list looks kind of familiar. For now we’ll select Local Script as we are testing on a small scale, but there is a lot of flexibility in how we can deploy.

Select Local Script and download the package. Once it’s downloaded let’s open it up and see what it’s doing.

Opening up the downloaded script confirms the feeling of déjà vu we might have been having. The onboarding process isn’t unique to the Compliance Portal; we are onboarding to Microsoft Defender for Endpoint, which we may have already done in our tenancy. This makes sense, as Defender is the agent on the machine which actually enforces our controls.

Onboard a Device

OK, now that we have our onboarding script (or whatever method we chose earlier), we just need to run it on the device. For the script, we simply copy it to the machine and run it as an administrator.

We get the standard warning which we accept and the script will continue and onboard the machine for us.

On a larger scale I recommend using Microsoft Endpoint Manager / Intune for onboarding but for this demo the script has worked fine.

Verify The Machine Has Been Onboarded

After a minute or two we can hop back over to the Compliance portal and see our machine has been onboarded.
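If we want a local check as well, the Defender for Endpoint sensor exposes its state on the device. The sketch below assumes the standard service name and registry path used by Defender for Endpoint onboarding; run it in an elevated PowerShell session and look for a running Sense service and an OnboardingState value of 1.

# Local onboarding check (run from an elevated PowerShell session)
Get-Service -Name Sense | Select-Object Name, Status
Get-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows Advanced Threat Protection\Status" -Name OnboardingState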

If we have the licensing, we will also see the device in the Microsoft Defender for Endpoint page.

Now that the device is onboarded, we can use some of the device based features of the Compliance center. I’ll be going through some of these in subsequent posts!

Exchange Online Native Tenant to Tenant Migrations (Preview)

With the proliferation of Microsoft 365 as the collaboration platform of choice in the enterprise space, it’s rare to find a large organization that hasn’t undergone some form of tenant to tenant migration. This can be a result of mergers, acquisitions or divestitures. Microsoft have not previously had any native tooling to facilitate this and third parties such as BitTitan and Quest have built up some really slick products to help organizations manage this technical transition.

This has slowly begun to change with the Microsoft acquisition of Mover in 2019 to help facilitate file migrations to Office 365. Microsoft seem to be making more native migration functionality available as part of the service. The most mature of the migration tools is also the oldest, the native Exchange on-premises migration tools using Exchange MRS functionality. This has also been improved recently with the availability of the Exchange modern hybrid configuration, removing the need to open up on-premises endpoints to the cloud by leveraging application proxy technology.

This Exchange functionality has now been extended to cross-tenant migrations allowing the migration of mailboxes from one tenancy to another using the familiar Exchange migration tools.

Prepare for Migration

First we need to set up our environments for the tenant to tenant migration. To understand the configuration, Microsoft have published the below diagram which explains the process in detail:

From this diagram, we can see that the high-level components of the migration infrastructure are:

  • A Tenant relationship application registration in the destination tenancy with the below API permissions
    • Exchange: Mailbox.Migration
    • Graph: Directory.ReadWrite.All
  • Azure KeyVault stores the app secret details for this app
  • The Source Tenant grants consent to the tenant relationship app created in the destination tenant
  • A two way Organization Relationship
  • A mail enabled security group in the source tenant to scope mailboxes for migration

Luckily, Microsoft have automated a lot of this setup with PowerShell scripts located on GitHub:

Source – SetupCrossTenantRelationshipForResourceTenant.ps1

Target – SetupCrossTenantRelationshipForTargetTenant.ps1

Prepare the Target Tenant

To prepare the target tenant, download the SetupCrossTenantRelationshipForTargetTenant.ps1 script.

To run the setup script, ensure you have the ExchangeOnlineManagement, AzureAD (the Preview Module doesn’t seem to work) and AzureRM PowerShell modules installed. If you don’t, you can install them with the below commands:

Install-Module ExchangeOnlineManagement
Install-Module AzureRM
Install-Module AzureAD

Once the modules are installed, connect to Exchange Online with:

Connect-ExchangeOnline

Now we can finally run the first script. The following parameters are required:

  • -ResourceTenantDomain The mail domain of the source tenant
  • -ResourceTenantAdminEmail The email address for the admin account in the source tenant. Ensure this account has a valid mailbox.
  • -TargetTenantDomain the mail domain of the target tenant
  • -ResourceTenantId The source tenant Azure AD Directory ID
  • -SubscriptionId The Subscription ID to create the KeyVault in
  • -ResourceGroup A name for the KeyVault Resource Group
  • -KeyVaultName A name for the KeyVault
  • -CertificateName A name for the certificate
  • -CertificateSubject A certificate subject name: “CN=admin_seanmc”
  • -AzureAppPermissions The permissions to grant: Exchange, MSGraph
  • -UseAppAndCertGeneratedForSendingInvitation
.\SetupCrossTenantRelationshipForTargetTenant.ps1 -ResourceTenantDomain <Source Tenant mail domain> -ResourceTenantAdminEmail <Source Tenant Admin Account Email> -TargetTenantDomain <Target tenant domain> -ResourceTenantId <Source Tenant Directory ID> -SubscriptionId <Azure Subscription ID> -ResourceGroup "CrossTenantMoveRG" -KeyVaultName "adminseanmc-Cross-TenantMovesVault" -CertificateName "adminseanmc-cert" -CertificateSubject "CN=admin_seanmc" -AzureAppPermissions Exchange, MSGraph -UseAppAndCertGeneratedForSendingInvitation 

This script will prompt for destination tenant credentials twice during its run and then will pause, asking for you to grant consent to the new app registration. In Azure AD App Registrations, open the new app and grant consent to the API permissions.

When consent is granted, hit enter on the script to continue and set up the Organization relationship.

Finally, note down the Application ID that is saved to the $AppID variable in the PowerShell session. If you miss this you can get it from the Azure AD app registrations page also.

Prepare the Source Tenant

Now that the destination tenant is configured, we can move on to the source tenant. When running the previous script, we were asked for an admin email address in the source tenant. When we log into this account we will find a B2B invitation from the destination tenant admin. Open this mail and accept the invitation.

Next, accept the permission request from the application to allow it to pull mailbox data.

With the permissions in place, we now create a mail-enabled security group to manage our migration scope. All mailboxes to be migrated will be part of this group. To create a group you can run the below Exchange Online PowerShell Command in the source tenant.

New-DistributionGroup -Name t2tmigrationscope -Type Security

Then add any in-scope mailboxes to the group with the below command.

Add-DistributionGroupMember -Identity t2tmigrationscope -Member <Mailbox to add>
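If there are a lot of mailboxes in scope, they can also be added in bulk. A quick sketch, assuming a CSV file with a PrimarySmtpAddress column (the file and column names are placeholders):

# Bulk-add members to the migration scope group from a CSV
Import-Csv .\MigrationMailboxes.csv | ForEach-Object {
    Add-DistributionGroupMember -Identity t2tmigrationscope -Member $_.PrimarySmtpAddress
}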

With our scope in place, we can now prepare and run the source tenant preparation script. To run the script, we need the following parameters:

  • SourceMailboxMovePublishedScopes – This is our mail enabled security group created previously
  • ResourceTenantDomain – This is our source tenant mail domain
  • TargetTenantDomain – This is our target tenant mail domain
  • ApplicationId – This is the Application ID we noted during the target configuration
  • TargetTenantId – Azure AD Directory ID of the target tenant

With all of this information to hand, run the script SetupCrossTenantRelationshipForResourceTenant.ps1 as below:

.\SetupCrossTenantRelationshipForResourceTenant.ps1 -SourceMailboxMovePublishedScopes <security group identity> -ResourceTenantDomain <source tenant mail domain> -TargetTenantDomain <target tenant domain> -ApplicationId <AppID> -TargetTenantId <target tenant directory ID>

When this is complete, all permissions are granted and our Organization Relationship is in place, so we can move on to preparing our users.

Prepare Destination User Accounts

To migrate a mailbox cross-tenant, we need to have a valid mail user in the destination tenant. There are several attributes we need to ensure align between the two to make sure the migration is successful. To gather the required data, run the below command against the mailbox(es) you wish to move in the source tenant.

Get-Mailbox <mailbox> | Select-Object ExchangeGuid, ArchiveGuid, LegacyExchangeDN, UserPrincipalName, PrimarySmtpAddress

This will give an output similar to the below.

Use this output to create a new mail user in the destination tenant. This setup can vary depending on whether your destination environment is synchronized with Active Directory, but for a non-synchronized environment, the below commands in Exchange Online PowerShell should create the user with the appropriate attributes.

New-MailUser <alias> -ExternalEmailAddress <source tenant email> -PrimarySmtpAddress <destination tenant email> -MicrosoftOnlineServicesID <destination tenant username>    

Set-MailUser <alias> -ExchangeGuid <ExchangeGuid from source> -ArchiveGuid <ArchiveGuid from source> -EmailAddresses @{Add="x500:<LegacyExchangeDN from source>"}

Finally, once these attributes are present, give the new user(s) a valid Exchange Online license. If everything was done correctly, no Exchange Online mailbox will be provisioned when the user is licensed.
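As a rough illustration of the licensing step using the AzureAD module we installed earlier (the SKU part number and usage location below are examples; check Get-AzureADSubscribedSku for the SKUs available in your tenant):

# Assign an Exchange Online license to the new mail user (example SKU and usage location)
Connect-AzureAD
$sku = Get-AzureADSubscribedSku | Where-Object { $_.SkuPartNumber -eq "EXCHANGESTANDARD" }
$license = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicense
$license.SkuId = $sku.SkuId
$licenses = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicenses
$licenses.AddLicenses = $license
# Usage location must be set before a license can be assigned
Set-AzureADUser -ObjectId <destination tenant username> -UsageLocation "IE"
Set-AzureADUserLicense -ObjectId <destination tenant username> -AssignedLicenses $licenses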

With the account(s) created, finally all the prep work is done so we can now move on to testing migrations.

Start Cross-Tenant Migration Batch

Before starting the migration, we can create a comma-delimited CSV file so we can import our batch. The CSV only needs a single column named ‘EmailAddress’ and should specify each target tenant email address for our user batch.
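A minimal example of the contents of the batch file (the addresses are placeholders):

EmailAddress
<user1@targettenantdomain.com>
<user2@targettenantdomain.com>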

To create a new cross tenant migration request, we navigate to the new Exchange Admin Center at https://admin.exchange.microsoft.com from the destination tenant and open up the “Migration” section. From here we create a new migration batch and select “Migration to Exchange Online”.

Next we select the migration type “Cross tenant migration”

We can see the prerequisites we’ve worked through listed on the next page; since we’ve done all the work already, we can hit next.

On the next page, we select the migration endpoint our script configured and hit next.

Next, upload the CSV file we prepared earlier.

Finalize the standard move configuration settings.

Configure any scheduling we need to perform and finally hit “save” to kick off the migration batch.

When the batch is created, we’ll see the success page below and then we can check the status throughout via the migration batches or by PowerShell.
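A quick sketch of checking progress from Exchange Online PowerShell in the target tenant (the batch name is a placeholder):

# Monitor the cross-tenant batch and its users
Get-MigrationBatch | Format-Table Identity, Status, TotalCount, SyncedCount
Get-MigrationUser -BatchId <batch name> | Get-MigrationUserStatistics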

After a little while the migration is synced. We can complete it as we would with any other migration batch.

We have now successfully migrated from one Exchange Online Tenant to another with native tools. When this functionality goes GA, it could really change the way a lot of Organizations approach multi-tenant configurations and migrations. For more information on Tenant to Tenant migrations, see the official Microsoft documentation here: Cross-tenant mailbox migration – Microsoft 365 Enterprise | Microsoft Docs

Project Oakdale Renamed to Microsoft Dataverse for Teams

In a previous post I went through how we can use Project Oakdale to create some pretty flexible apps in Microsoft Teams. The platform, which allows a subset of CDS functionality to be used inside of Microsoft Teams, has just gone GA! With the GA release, we’ve also got a new name.

In line with the renaming of CDS (Common Data Service) to Microsoft Dataverse, we see Project Oakdale (essentially CDS lite) being rebranded as Microsoft Dataverse for Teams. Personally I always thought Project Oakdale was a bit of a stupid name… Long live Dataverse.

Cheesy names aside, Dataverse for Teams provides a great set of features that were previously locked behind a full Dataverse (or CDS) SKU. While it’s not on the level of full Dataverse, it’s a fantastic tool for small, team-scale low-code solutions.

For more information on the functionality available in Dataverse and Dataverse for Teams, see the launch notes here:

Reshape the future of work with Microsoft Dataverse for Teams—now generally available | Microsoft Power Apps

Ingest PST Files Into Exchange Online From A Custom Azure Storage Account

I hate when I see PST files being used as a business solution. It’s quite common when performing a discovery that you will find a host of users who heavily rely on PST files as their “Personal Archives”. This usually stems from a lack of storage space in an on-premises Exchange environment, with users being asked to clean up their mailboxes when they start getting larger. Even with increases in available storage, this practice never really went away for a lot of people.

It’s not always obvious to end users who have neatly collected twenty years’ worth of PST archives on their local C drive why this is such a bad idea (and no, OneDrive isn’t a solution for this). Generally, as part of an Exchange Online migration, I recommend collecting and ingesting the PST files into the user’s online archive.

PST Ingestion Service

The Exchange Online PST Ingestion Service has been around for a long time. It’s a fantastic tool that is really easy to use and helps organizations to work towards removing PST files from use without users losing any data they may have stored in them.

The service is quite straightforward and can be kicked off from the Compliance Center as long as you have been assigned the “Mailbox Import Export” role in Exchange Online. The ingestion service will provide a free Azure Storage blob to upload all the files to; it will also recommend using AzCopy for network upload and walk through the syntax for the upload process. A mapping file can then be provided to import each PST into the relevant mailbox or archive. This is a great, easy way of getting the PST files ingested into Exchange Online and made available to users within a secure, redundant solution.

For very large data sets, there is also the option of shipping a drive directly to Microsoft to avoid uploading all that data over the network. There is a small cost associated with this method but it is a valid option for very large organizations.

Using Custom Storage Accounts

The service provides an excellent method of ingestion for 99% of organizations; however, for some more complicated organizations with stricter data governance controls in place, this may not tick all the boxes. For those use cases, there is another method that’s not quite as well known.

The ingestion service really runs the Exchange cmdlet “New-MailboxImportRequest” in the background with a host of flags pointing to Azure Blob Storage, so we can cut out the middleman and run this command ourselves to import from locations other than the Microsoft-provided blob.

To do this, we obviously need to create a new storage account and blob container in our Microsoft Azure subscription. Once it’s created, we can use AzCopy to upload (or any other connection to the blob such as Storage Explorer). We can even use the new built-in Storage Explorer in the Azure Portal to upload our files.
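As an example, a minimal AzCopy v10 upload might look like the below. The account, container and file names are placeholders, and the account you sign in with needs blob write access on the container (alternatively, append a SAS token to the destination URL instead of using azcopy login).

# Upload a PST to the custom blob container with AzCopy v10
azcopy login
azcopy copy "C:\PSTExport\PSTData.pst" "https://<storageaccount>.blob.core.windows.net/<container>/PSTData.pst"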

Once data is uploaded, we need to generate a SAS URI for the import request to use.
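One way to generate the SAS token is with the Az.Storage PowerShell module. The sketch below uses placeholder account, key and container names and grants the read and list permissions the import needs:

# Generate a read/list SAS token for the container
$ctx = New-AzStorageContext -StorageAccountName "<storageaccount>" -StorageAccountKey "<account key>"
New-AzStorageContainerSASToken -Name "<container>" -Permission rl -ExpiryTime (Get-Date).AddDays(7) -Context $ctx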

Finally, we need to format our import command as below:

New-MailboxImportRequest -Name <Job Name> -Mailbox <user@contoso.com> -AzureBlobStorageAccountUri <Blob Endpoint / Folder Name / PST Name> -AzureSharedAccessSignatureToken <SAS Token> -TargetRootFolder <Root folder for import> -BadItemLimit <Amount of bad items before import fails> -LargeItemLimit <Amount of large items before import fails>

For example, for the above PST file the command would look like:

New-MailboxImportRequest -Name "demo import" -Mailbox sean@adminseanmc.com -AzureBlobStorageAccountUri "https://pstingestion.blob.core.windows.net/pstfiles/PSTData.pst" -AzureSharedAccessSignatureToken "?sv=2019-12-12&ss=b&srt=sc&sp=rwdlacx&se=2020-11-16T08:36:32Z&st=2020-11-16T07:36:32Z&spr=https&sig=o4FmfojEUfsf4b1rVssometextremovedhere" -TargetRootFolder "/" -BadItemLimit 20 -LargeItemLimit 20

Note: A target root folder of “/” will import into the top level of the mailbox, merging with existing folders.
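Once the request is submitted, its progress can be tracked from Exchange Online PowerShell:

# Track the import request
Get-MailboxImportRequest | Get-MailboxImportRequestStatistics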

For more information on ingesting PST files, see the Microsoft documentation on the ingestion service: Overview of importing your organization’s PST files – Microsoft 365 Compliance | Microsoft Docs.

Protect Corporate Data Within Windows 10 Apps With Windows Information Protection

With the massive shift towards user mobility and BYOD devices, it’s important to consider how we can help users be at their most productive while maintaining control over data. For mobile devices (Android, iOS) we have Microsoft Endpoint Manager (Intune) Mobile Application Management (MAM) Policies. MAM Policies work extremely effectively for BYOD devices and help provide the security needed by sandboxing the mobile applications used to connect to our corporate data.

For Windows devices, there are session control policies which can allow limited, read-only access via a web browser from any device and even put some complex rules in place to define what exactly can happen within that session. This is great for web access, but when users need some more flexibility around client applications, or when a web application just doesn’t meet the requirements, we need to add some control to how the application works in the context of the user’s device.

To meet this use case, we have Microsoft Windows Information Protection (WIP). WIP allows us to control how data moves throughout the end user device by designating data as either corporate or personal. Through the use of WIP enlightened apps, we can add controls around the locations where data will be protected (such as SharePoint Online or specific network shares), control whether non-enlightened apps can access data marked as corporate, control copy/paste functionality and also apply Microsoft Information Protection / Sensitivity Labels to data extracted from corporate locations.

NOTE: It’s important to note the limitations of WIP, as it is not a rock-solid DLP solution, but rather another layer in the stack of protections available.

Creating a WIP Policy

To test out WIP, first ensure that the MAM scope and URLs are up to date in Azure AD by navigating to ‘Azure AD’ -> ‘MDM & MAM’ -> ‘Microsoft Intune’. Verify the MAM user scope contains your target users and hit ‘Restore default MAM URLs’ if you have changed them previously.

Now we can create an App Protection Policy in the Microsoft Endpoint Manager Admin Portal. Navigate to ‘Apps’ -> ‘App Protection Policies’ and create a new Windows 10 Policy.

We create a new policy and choose whether it applies to enrolled (MDM) or unenrolled (MAM) devices. There are a few important differences to consider when deciding this, as outlined in the Microsoft documentation:

  • MAM has additional Access settings for Windows Hello for Business.
  • MAM can selectively wipe company data from a user’s personal device.
  • MAM requires an Azure Active Directory (Azure AD) Premium license.
  • An Azure AD Premium license is also required for WIP auto-recovery, where a device can re-enroll and re-gain access to protected data. WIP auto-recovery depends on Azure AD registration to back up the encryption keys, which requires device auto-enrollment with MDM.
  • MAM supports only one user per device.
  • MAM can only manage enlightened apps.
  • Only MDM can use BitLocker CSP policies.
  • If the same user and device are targeted for both MDM and MAM, the MDM policy will be applied to devices joined to Azure AD. For personal devices that are workplace-joined (that is, added by using Settings > Email & accounts > Add a work or school account), the MAM-only policy will be preferred but it’s possible to upgrade the device management to MDM in Settings. Windows Home edition only supports WIP for MAM-only; upgrading to MDM policy on Home edition will revoke WIP-protected data access.

Next we choose the targeted and excluded apps. Targeted apps will be the “enlightened” apps that we protect. Excluded apps will be able to access corporate data without restrictions applied.

We can then choose the mode our policy operates in. We can block moving data out of corporate locations straight away, allow a user to override the corporate classification, or run in a silent “monitoring” mode where we can view reporting and assess the impact of enabling WIP. Silent mode is recommended before enforcing the policy to ensure the impact is understood.

Next we configure the settings of our policy. We can also configure our proxy server so that endpoints which use a proxy are still protected, and we can assign an RMS template ID to corporate data to add an extra layer of security.

For now, we will just specify our cloud resources to protect as our SharePoint and OneDrive locations. We can add the following list of locations to our scope:

We add SharePoint and OneDrive as Cloud resources, specifying the URLs <contoso.sharepoint.com> and <contoso-my.sharepoint.com>.

Now when we assign our policy to a user we can see the added functionality to help protect corporate data.

User Experience

When our user now connects to corporate data using the apps specified, we can see WIP in action. The first thing to note is that in the policy we enabled the “Show the enterprise data protection icon” setting, which is off by default. This essentially tells the user that the app is working in a “corporate” context.

Clicking on the icon informs the user that they are working in a managed app.

When our user tries to download a file from our corporate location, they will see a briefcase icon indicating that the document came from the corporate environment. We also see the “File ownership” column in File Explorer, which tells us whether the file is corporate or personal data.

If we allowed overrides in our policy, the user could right-click to change the file ownership. We didn’t in our policy, so this is greyed out.

The user can, however, classify personal documents as corporate.

Once this txt file is classified, we can see the WIP icon appear in Notepad.

Extracting Corporate Data

With all this protection in place, let’s look at what happens when a user drops a corporate document into their personal OneDrive folder. As personal OneDrive is not listed as a corporate location, the user gets a message telling them that this action isn’t allowed.

We can see similar behavior when the user tries to copy to a removable drive or an unprotected network share.

Summary

WIP does not meet every use case and is not a complete protection solution. There are ways around the controls here for savvy users who really want to find them. It is, however, a great addition that helps protect users from making mistakes and makes it a little harder for someone to carelessly extract corporate data without thinking.

Overall a great technology and another layer of protection that is (relatively) seamless to end users. For more information on configuring WIP, check out the Microsoft Documentation for an in depth guide.

Coming Soon: Expiration For Guest User Access To SharePoint Online / OneDrive Documents

External file sharing is always a struggle to get right in Microsoft 365. Based on factors like company culture, industry and relationship to partner companies, there are a huge number of variables that will influence the exact policy and controls that are required to tailor the relevant features.

I can’t stress enough how much setting this up right, as early as possible will save time in the long run. Providing a flexible, secure and scalable solution to external collaboration is a key factor to setting up any Office 365 tenant. Luckily, there are a myriad of features available in Office 365 to help provide the governance we need.

When we need to protect key specific information types we have Microsoft Information Protection and Data-Loss Prevention. When we need to protect Identity and Authorization, we have things like B2B, Identity Governance, Identity Protection and Conditional Access. The list goes on.

These features (which mostly require licensing) are fantastic at meeting complex requirements but are not a replacement for a good baseline configuration of the toolset. One of the most important pieces of configuration we can get in place for our tenant with regards to file sharing and collaboration are the base sharing settings in the SharePoint Online Admin Center.

These settings control who we allow our users to share with, which users we allow to share externally and what controls apply to the guest users with whom information is shared.

A new addition to these controls, due in January 2021, is the ability to set an admin-level expiration on all OneDrive and SharePoint sharing links to guests. This is a nice feature to have and will help to avoid perpetual access to externally shared data.

This setting will be disabled by default on roll-out, so it is worth considering whether it will fit into your configuration and planning communication to users.
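When the feature reaches your tenant, it should also be configurable from the SharePoint Online Management Shell. A sketch, assuming the ExternalUserExpirationRequired and ExternalUserExpireInDays tenant parameters and using 60 days as an example value:

# Require expiration on guest access links and set the expiry window
Connect-SPOService -Url https://<tenant>-admin.sharepoint.com
Set-SPOTenant -ExternalUserExpirationRequired $true -ExternalUserExpireInDays 60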

How To Use Microsoft 365 Productivity Score To Drive Digital Transformation

With the recent public release of the Microsoft 365 Productivity Score, it has never been easier to assess your organization’s adoption of Microsoft collaboration tools, identify areas for improvement and plan to help users get the most out of the tools available.

The Productivity Score is a great baselining tool in the same vein as the Secure Score and Compliance Score, giving a numerical value based on a wide set of statistics, along with suggestions for improvement.

Enable the Productivity Score

The first step to understanding how Microsoft 365 tools are used in your organization is to enable the tool. This can be done from the Microsoft 365 Admin Center by navigating to ‘Reports’ -> ‘Productivity Score’. From here, we can enable the tool; this might take up to 24 hours to finish, and then we can start using it.

Once the Productivity Score dashboard is enabled, it will show up in this section of the Admin Center.

Score Breakdown

The overall Productivity Score is broken down into two ‘buckets’. The total score is the sum of the ‘People’ score and the ‘Technology’ score. You can track your score against the benchmark set by similar organizations.

This provides an ‘at a glance’ metric of how the Microsoft 365 toolset is being used in our organization. Along with the current score, we also see how our score has changed over time, which gives us insight into the usage trend in the organization.

People Score

The ‘People’ Score helps us track how our users are leveraging the tools available to them across the various classifications (‘Communication’, ‘Meetings’, ‘Content Collaboration’, ‘Teamwork’ and ‘Mobility’).

We can dive into one of these classifications to see the metrics that make up our score in a particular area. In ‘Communication’ for example, we can see the different methods our users are using to communicate.

For each metric, we can view content such as videos and articles to help promote the particular method of communication. This gives us access to some prepackaged user training and communications which can help maximize utilization.

We can also see a per-user breakdown of active communication methods, allowing us to identify areas of the business that need the most help.

Using the resources in the ‘People’ score, we can easily get a view of how the tools are being used, and by whom, and guide our adoption and change management efforts efficiently. In the above examples we can see that our efforts should focus on promoting and providing materials around Yammer much more than Teams chat. We can also see a subset of our users are actively using ‘@ mentions’; however, some users aren’t, despite active Teams chat usage.

Technology Score

In our ‘Technology’ score, we can see how technical aspects may be affecting user productivity. Here we see Endpoint Analytics from Microsoft Endpoint Manager (this needs to be enabled), the impact of network connectivity and the health of our Microsoft 365 Apps.

In the Microsoft 365 Apps health section for example, we can see the versions of the Office apps our users are connecting with and the associated update channels.

In this example, we can see quite a number of our users have unsupported versions of the Microsoft 365 Apps suite and the update channel is configured to ‘Semi-Annual’. This will keep the Microsoft 365 Apps health score low until those installs are brought up to date.

As with the ‘People’ score, we can dive into some useful resources around how to manage and improve this baseline, which will in turn increase our total score.

Summary

As with the Secure Score and Compliance Score, I do recommend that these metrics are taken with a pinch of salt and context is considered before circulating the score. For instance if corporate structure or policy blocks a large group of our users from using Teams, our score there will always be lower. If licensing models or technical restrictions dictate we are using Office 2019 in our organization, that will obviously affect our score.

The Productivity Score is a great addition to the toolset, providing some quick insights and prepackaged metrics that are easily consumed at C-Level in our organization. Overall it is nice to have this information readily available but please don’t rely on any of the Microsoft ‘Scores’ to accurately depict the nuances of your organization.

Protecting Exchange Online Mail With DKIM

DomainKeys Identified Mail (DKIM) is an email security standard which leverages public/private key pairs to protect mail in transit by signing messages as they leave the source environment. Recipients can then verify the sender by looking up the public key via a DNS query. Having DKIM set up for your mail domains goes a long way towards protecting your domain from being used in phishing/spoofing attacks.

Setting up DKIM on most mail systems requires a good deal of work, including key hosting, IIS configuration, firewall rules and DKIM add-ons or plugins. In Exchange Online, a lot of the heavy lifting around DKIM has been done for us by Microsoft: we leverage our tenant name (the “onmicrosoft.com” domain) as the default signing domain and point our custom domains back to that.

Setting Up DKIM in EOL

Setting up DKIM for your custom domains in Exchange Online is very straightforward. To enable DKIM for a domain, first run the below cmdlet in the Exchange Online Management Shell to create the signing configuration.

New-DkimSigningConfig -DomainName <DomainName.com> -KeySize 2048 -Enabled $True

This will give you an output of two CNAME records to configure. A DKIM record for Exchange Online will look like the below in the public DNS of your custom domain:

  • Record Type: CNAME; Host Name: selector1._domainkey; Value: selector1-domain-com._domainkey.clouddomain.onmicrosoft.com; TTL: 3600
  • Record Type: CNAME; Host Name: selector2._domainkey; Value: selector2-domain-com._domainkey.clouddomain.onmicrosoft.com; TTL: 3600
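If you need to pull these CNAME values again later, they are exposed on the signing configuration object in Exchange Online PowerShell:

# Retrieve the CNAME values generated for the domain
Get-DkimSigningConfig -Identity <DomainName.com> | Format-List Selector1CNAME, Selector2CNAME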

Once the records are in place, the DKIM configuration can be found in the ATP Policy section of the Security & Compliance Center, under ‘Additional Policies’ – ‘DKIM’.

Here we can select our domain and select ‘Enable’ to turn on DKIM signing for the domain, easy! Note that DNS may take a while to replicate, so we may need to wait before the records are detected and signing enables successfully.

We can also manually rotate the signing keys at intervals from this page.
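Both of these operations can also be run from Exchange Online PowerShell, which is handy for scripting:

# Enable signing once DNS has replicated, and rotate the keys on demand
Set-DkimSigningConfig -Identity <DomainName.com> -Enabled $true
Rotate-DkimSigningConfig -Identity <DomainName.com>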

DKIM is a great security measure to protect your email domains from being spoofed. Along with DKIM, make sure your SPF and DMARC records are up to date to provide your recipients with the best possible information about your source environment. If you have alternate mail sources, such as independent relays, make sure to factor them into your DKIM configuration also.

Finally, ensure that your mail system, whatever it may be, is inspecting the SPF, DKIM and DMARC of inbound mails and taking appropriate action when these policies aren’t met!

Project Moca – Organize Your Content in Outlook on the Web

With so many different productivity tools at our disposal today it can be hard to stay organized. We have apps like Planner and To-Do (which now roll up into Tasks in Teams) that can help track progress on work and remind us of our task list. We also have OneNote which is great for personal or shared note taking. We can flag or pin mails in outlook, pin channels in Teams and create collections in Edge. All of these tools help us work towards being organized and efficient but at times can feel disjointed, particularly when we are looking for flexibility for personal organization.

An underutilized tool that a lot of users swear by is the Notes in Outlook functionality, which essentially brings the Windows 10 Sticky Note function into the Outlook client. This functionality has been around for a long time but never really became a key feature as it was quite limited.

Microsoft’s new “Project Moca” tool aims to provide users with a flexible, user-driven organizational space which can be used for a large array of use cases. Project Moca, which is currently in preview, is based in Outlook on the Web and provides users with the ability to create a ‘space’ dedicated to a particular area such as projects, daily tasks, personal plans etc.

Enabling Project Moca in your Organization

To enable Project Moca for all users or a subset of users in your organization, we need to enable it in the OWA Mailbox Policy that applies to those users. If there are no custom policies, then we can set it in the default policy to enable it for all users. Run the below command in the Exchange Online Management Shell, entering the name of your desired policy:

Set-OwaMailboxPolicy <PolicyName> -ProjectMocaEnabled $true 

Once enabled it can take a little while to apply to all users.
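To confirm which policies have the feature switched on, the same property can be read back; a quick check:

# Check the Project Moca setting across OWA mailbox policies
Get-OwaMailboxPolicy | Format-Table Name, ProjectMocaEnabled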

Using Project Moca

Once it is enabled, you will find Project Moca in Outlook on the Web from the Outlook module switcher at the bottom left:

When we first navigate to Project Moca, we are given a list of templates to start from, or we can start a new space from scratch. Here I’ve selected a new Project Plan space where we can collect details about a new customer project:

We give the space a name and add some keywords and people to help identify the content relating to this project:

Now that our space is set up, we can begin adding content. We can add new buckets and post tasks, notes, documents, locations, weather, mails, URLs and events to our space and organize them into the buckets we create. We are also presented with a list of dynamically detected content based on the users and keywords we entered when we set up the space:

We can customize the space to look however we want.

We can add touches like color and icons to each bucket to help keep things organized and easy to understand.

While Project Moca won’t suit everyone, there is a lot of flexibility here to help the people who buy into it stay organized and capture a lot of information in one place. No doubt there will be improvements and additions over time, but even as a new feature there are already a lot of use cases for Project Moca.