Use PowerShell to report on Azure AD Enterprise Application Permissions

Many Microsoft customers are now taking steps to modernise and centralise SaaS app identity by using Enterprise Applications in Azure AD to provide authentication, provisioning and reporting services.

This can be done by administrators, who add applications to the Azure AD tenant and assign users to them, or by users (if you let them), who can self-service applications (think of the “Log in with Facebook / Google” buttons). Applications which are added are assigned certain permissions, which allow the application to access Azure AD properties via the Microsoft Graph API.

These permissions can be as simple as allowing the application to read the user’s display name, all the way to full access to every file the user can access in Office 365. You can see these permissions in the GUI by logging into portal.azure.com and navigating to Azure Active Directory > Enterprise Applications > [Application Name] > Permissions, as seen in the screenshot below. We can see that the Adobe Document Cloud application has been given admin consent to have full access to all files the user can access, and to sign in and read the user profile. You can see the full range of available permissions in the Microsoft Graph, and what they all mean, here.

[Screenshot: the Permissions blade for the Adobe Document Cloud application]

This GUI feature is great for looking at individual applications, but if you are allowing users to provide consent themselves, or you are making full use of the Enterprise Applications feature, you are likely to have many applications listed here, and checking them one by one using the GUI is not efficient.

As always, PowerShell is able to come to the rescue. If we connect to the AzureAD (v2) PowerShell module using Connect-AzureAD, we can export these permissions. Unfortunately, because of the way the data is presented, we need to do a little data massaging to make this useful.

Firstly, we need to get a list of all applications, and this can be done using:

Get-AzureADServicePrincipal -All $true | Select DisplayName,Homepage,ObjectID,AppDisplayName,PublisherName,ServicePrincipalType | Export-Csv c:\reports\azureadsp.csv -NoTypeInformation

This PS command will get a list of all the Service Principals (read: applications) you have configured, however it will not list the permissions. We need another cmdlet for that. The item we are most interested in for the Service Principal is the ObjectID, as this is the value we can use to map the Service Principal to the Permissions.

The next PS command we need is:

Get-AzureADOAuth2PermissionGrant -All $true | Select ClientID,Scope,ConsentType | Export-Csv c:\reports\oauthperms.csv -NoTypeInformation

This PS command will get a list of all the permissions granted in AzureAD. The important value here is the ClientID, which refers to the application, and the Scope, which refers to the permission level as described in the Graph Permissions article.

With this data we have two .csv files, and we need to compare the ObjectID from azureadsp.csv with the ClientID from oauthperms.csv. If we find a match, we need to copy the application’s display name across to the permissions data. Now I’m no Excel expert, and there are probably better ways of doing this, but this was my method.

I copied the columns from azureadsp.csv into oauthperms.csv. Let’s say the ObjectID values from azureadsp.csv ended up in column J. I then created a new column called Application Name at column A, and used an INDEX/MATCH formula to look for identical ObjectID and ClientID values; if a match was found, it populated the Application Name.

[Screenshot: the INDEX/MATCH formula in Excel]

The formula used looks like this:

=INDEX($H$2:$H$101,MATCH($B2,$J$2:$J$101,0))

Substituting the column names for logical names looks like this:

=INDEX($DisplayName$2:$DisplayName$101,MATCH($ClientID2,$ObjectID$2:$ObjectID$101,0))

This gives us a value in Application Name which shows which application has been given rights to the Microsoft Graph, and enables us to easily see and filter which permissions have been granted to which application. This can be used for management, reporting and security auditing purposes.
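If you’d rather avoid Excel altogether, the same match can be done with a few lines of PowerShell. This is just a sketch, assuming the two exports were saved to c:\reports\azureadsp.csv and c:\reports\oauthperms.csv as above; the output path and column name are my own choices:

```powershell
# Sketch: join the two CSV exports on ObjectID / ClientID without Excel
$spns   = Import-Csv c:\reports\azureadsp.csv
$grants = Import-Csv c:\reports\oauthperms.csv

# Build a hashtable mapping each Service Principal ObjectID to its DisplayName
$names = @{}
foreach ($spn in $spns) { $names[$spn.ObjectID] = $spn.DisplayName }

# Add an ApplicationName column to each permission grant by looking up its ClientID
$grants |
    Select-Object @{ n = 'ApplicationName'; e = { $names[$_.ClientID] } }, ClientID, Scope, ConsentType |
    Export-Csv c:\reports\permissionreport.csv -NoTypeInformation
```

This produces a single CSV with the application name alongside each granted Scope, ready for filtering.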

Hopefully this is useful for you, and if you think this could be improved upon please let me know in the comments!


Where’s Wally?

Over the last year, misstech.co.uk has, to my shame, been left in a corner to gather dust. This wasn’t my intention. I enjoy writing, and have thoroughly enjoyed seeing people use this site to find solutions to the problems they have come across during their migrations, deployments and installations. It wasn’t due to a lack of enthusiasm that I went quiet, but due to a couple of things, which I’d like to explain for you now.

For those of you with a keen eye, or who know me personally, you will have noticed that I changed both my name and gender over the last two years. The URL of the website changed too; it used to be doubledit.co.uk, a half-accidental double entendre using the initials of my old name. You’ll still see references to it in older posts. Now it’s probably no surprise to learn that being transgender and going through a gender transition takes a pretty hefty toll on time and energy, and to be honest, I had nothing left to give outside of my normal day to day life. Thankfully, I can say that most of the major change is behind me now, and instead I can enjoy what is ahead of me.

The second reason why misstech.co.uk has gone quiet, is that I had a fairly significant change in job. For the last year and a half, I have been working at Microsoft as a Premier Field Engineer. It was a big change and my focus has shifted from technical deployment and troubleshooting, to education and assessment. Although being a PFE requires deep technical knowledge of my product areas (Identity, Security and Networking in Office 365), it is also much less hands on. I simply wasn’t seeing the kind of problems I used to write about in my blog. In all honesty, I didn’t know what to write about.

The reason I am writing this post is because I want to try and reinvigorate this site. Whilst I can’t write articles to help the masses fix that one particular error message in that one particular situation, I can still indulge in my love of technology and modern computing.

I plan to write something at least once a month from now on. It may be on a totally random subject, like gaming or a personal IoT project. It might be about the evolution of the modern workplace, or even a classic technical blog on how to fix that big screen of red text. Either way, I want to use this little platform that I’ve carved out for myself in the corner of the internet.

I started misstech.co.uk because I love writing and I love technology, and that’s why I want it to continue 🙂

 

P.S. See if you can find Wally. Took me aaaaaaaages.

[Image: Where’s Wally crowd scene]

 

Report Email Traffic By The Hour

It’s a well-known fact that reporting is the sexiest topic in IT. To that end, I thought I’d post a quick one-liner about email flow reporting in your organisation. This came about following a request from one of my favourite customers, who needed a way to report on how much email was being sent and received out of hours.

Get-MailTrafficReport -StartDate 01/14/2018 -EndDate 01/22/2018 -AggregateBy Hour -EventType GoodMail | select Date,Direction,MessageCount | Export-csv C:\users\emily\Desktop\mailflowreport.csv

This PS command is run in Exchange Online PowerShell and results in a CSV which shows an hourly breakdown of email sent / received in a given time period. It’s possible to add specific times to the dates (e.g. “01/14/2018 05:00”). I used the -EventType GoodMail parameter to report only on accepted mail in this example. You can also filter on -Direction (Inbound or Outbound). Below is a snapshot of the results:

Date                 EventType  Direction  MessageCount
----                 ---------  ---------  ------------
15/01/2018 14:00:00  GoodMail   Inbound    430
15/01/2018 15:00:00  GoodMail   Inbound    230
15/01/2018 16:00:00  GoodMail   Inbound    187
15/01/2018 18:00:00  GoodMail   Inbound    57
15/01/2018 18:00:00  GoodMail   Outbound   124
15/01/2018 19:00:00  GoodMail   Inbound    34
15/01/2018 19:00:00  GoodMail   Outbound   87

The TechNet article on the Get-MailTrafficReport cmdlet is here.

This is a very versatile reporting function which can yield interesting data. This data can then be fed into PowerBI or a.n.other reporting tool to add some visual showmanship to the results!
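As a variation on the command above, the date parameters also accept times, so you can target just the out-of-hours window directly. A sketch (the times and output path are just examples):

```powershell
# Sketch: report only inbound mail received overnight
Get-MailTrafficReport -StartDate "01/14/2018 18:00" -EndDate "01/15/2018 06:00" `
    -AggregateBy Hour -EventType GoodMail -Direction Inbound |
    Select-Object Date, Direction, MessageCount |
    Export-Csv C:\reports\outofhoursmail.csv -NoTypeInformation
```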


Social Engineering and your users

All my customers want to talk about security these days. So much of our day to day work is done on the internet now, and the security landscape has changed significantly over the last 5-10 years. Our perimeter is no longer restricted to our LAN and firewalls, but instead lives with the users identity. Users are highly likely to use their corporate network password on other sites; who knows whether this other site has been breached or not?

Things like this scare us IT admins, and even scarier now is the gargantuan amount of misinformation and misdirection on the internet. From the advert that looks like a “next page” button to the link to an article which promises to show you the most amazing thing you’ve ever seen a koala do, users will literally click on anything. Today I want to share one of these examples. I noticed this morning that a friend of mine had shared this post on Facebook:

[Screenshot: Facebook post offering a free Recreational Vehicle]

Wow! A free RV. Amazing, right? But this post looked a little strange. Who can afford to give away an RV, let alone 15 of them? And what did “can’t be sold because they have been stock this year” mean? My curiosity got the better of me and I decided to click through. This is what I found:

[Screenshot: the Facebook page behind the post]

Now let me give you a few facts:

  • This page only had a single post. This one.
  • It had 83,045 shares and 35,015 comments. In 3 days. Both of these numbers went up by around 1,000 while I was writing this.
  • The page had nothing in the about section. No website, no contact details, nothing. Nada. Zip. It was basically an empty page with one post on it.

This was clearly a ruse and nobody was going to get themselves a free RV. I mean hey, maybe I’m wrong and an awful cynic who has been scarred by the internet. But in reality, this was more than likely a social engineering experiment or phishing scam. Maybe the “lucky winners” would be contacted and asked for some personal details so that they could claim their free prize? Maybe they were directed to a fake Facebook login page? And maybe their Facebook login password was the same as their corporate network password?

This post shows us all just how easy it is to get people to click on something, or believe something, on the internet. And this stuff is everywhere we look. As an internet user, we are faced with a constant stream of misinformation and misdirection, never quite knowing when something is real and when it isn’t.

Security has never been more important. Using web filtering, multi-factor authentication and implementing features for mail scanning like Safe Links and Safe Attachments (found in Exchange Online Protection) can help to a certain degree, but a very large part of this fight is user education. People should be taught that their first response should be one of doubt, not of excitement about the amazing thing they are about to see, or the prize they will never win.

It’s a dangerous world we live in. But at least I’ll have a new RV to protect myself.

Exchange 2016 on Server 2016 – A reboot from a previous installation is pending

Recently I was attempting to install Exchange 2016 on Server 2016. On attempting to run the setup.exe /preparealldomains /iacceptexchangeserverlicenseterms command, I was receiving a failure when checking prerequisites which stated that:

PS E:\> .\Setup.EXE /preparealldomains /iacceptexchangeserverlicenseterms

Performing Microsoft Exchange Server Prerequisite Check

Prerequisite Analysis FAILED

A reboot from a previous installation is pending. Please restart the system and then rerun Setup.
For more information, visit: http://technet.microsoft.com/library(EXCHG.150)/ms.exch.setupreadiness.RebootPending.asp

I had rebooted the server a few times and ensured that no restarts were pending.

In versions of Windows Server prior to Server 2016, I would look for the UpdateExeVolatile registry value and the PendingFileRenameOperations registry value under HKEY_LOCAL_MACHINE. However, these didn’t appear to be in their usual place. Eventually I searched the registry and discovered that PendingFileRenameOperations had moved to:

HKLM\System\ControlSet001\Control\Session Manager\PendingFileRenameOperations

The previous location of this key was:

HKLM\System\CurrentControlSet\Control\Session Manager\PendingFileRenameOperations

Removing the entries in PendingFileRenameOperations resolved the problem in this case.
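To check both locations without opening regedit, here’s a quick PowerShell sketch (note that PendingFileRenameOperations is a value under the Session Manager key, so we read the key and inspect the property):

```powershell
# Sketch: look for PendingFileRenameOperations under both control sets
$keys = 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager',
        'HKLM:\SYSTEM\ControlSet001\Control\Session Manager'

foreach ($key in $keys) {
    $pending = (Get-ItemProperty -Path $key -ErrorAction SilentlyContinue).PendingFileRenameOperations
    if ($pending) {
        Write-Output "Pending file renames found under $key :"
        Write-Output $pending
    }
}
```

If either location returns entries, clearing them (after checking what they refer to) should let the prerequisite check pass.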

 

VNET Peering – When to use and when not to use

VNET Peering has been available for almost a year now and has proved very useful and popular; for a long time it was the most requested feature. That said, as much as we would like to mesh all our Azure VNETs into one lovely firewalled network topology, this isn’t always possible or suitable.

The situations in which VNET peering (or its associated features) cannot be used are as follows:

  • VNETs in different regions cannot have a peering relationship
  • VNETs with overlapping address spaces cannot be peered
  • VNETs which are both created using the Classic Deployment model cannot be peered
  • VNETs which are created using mixed deployment models cannot be peered across different subscriptions (although this will be available in the future)
  • Both VNETs must be created using the Resource Manager Deployment Model for Gateway Chaining (using a gateway in a peered VNET) to function
  • There is a default limit of 10 VNET peers per VNET. This can be raised to a maximum of 50 using Azure Support requests

This still leaves many applicable situations whereby VNET peering can be very useful and can provide the hub and spoke, high speed, low latency network which your Azure subscription/s need.
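For completeness, here’s roughly what setting up a two-way peering looks like with the AzureRM PowerShell module. The VNET and resource group names are made up for the example:

```powershell
# Sketch: peer a hub and spoke VNET in both directions (Resource Manager model)
$hub   = Get-AzureRmVirtualNetwork -Name 'hub-vnet'   -ResourceGroupName 'rg-networking'
$spoke = Get-AzureRmVirtualNetwork -Name 'spoke-vnet' -ResourceGroupName 'rg-networking'

# Peering is not transitive, so each side needs its own peering object
Add-AzureRmVirtualNetworkPeering -Name 'hub-to-spoke' -VirtualNetwork $hub   -RemoteVirtualNetworkId $spoke.Id
Add-AzureRmVirtualNetworkPeering -Name 'spoke-to-hub' -VirtualNetwork $spoke -RemoteVirtualNetworkId $hub.Id
```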

 

ADFS Additional Authentication Rules – MFA Issue Type

ADFS claim / additional authentication rules can appear very complex and confusing, and that’s because they are! One thing that tripped me up recently relates to the issue section of a claim rule where MFA is specified. During a project, I created a rule from a template I had used for another customer. Upon saving the rule I found that it didn’t apply MFA as I was expecting, and instead caused an error message in ADFS during logon attempts.

The rule I had used was issuing a claim for the Azure MFA Server rather than the Azure MFA Cloud Service. To clarify, the difference in the claim type is as follows:

Azure Cloud MFA

=> issue(Type = "http://schemas.microsoft.com/claims/authnmethodsreferences", Value = "http://schemas.microsoft.com/claims/multipleauthn");

Azure MFA Server

=> issue(Type = "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod", Value = "http://schemas.microsoft.com/claims/multipleauthn");

 

This is an important distinction and needs to be considered when applying different types of authentication flows in ADFS.
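As an illustration of where this matters, here is the shape of additional authentication rule I was aiming for: trigger MFA (using the cloud service claim type) only for requests coming from outside the corporate network. The insidecorporatenetwork condition claim is a standard ADFS device claim; treat the rule as a sketch rather than a drop-in:

```
c:[Type == "http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork", Value == "false"]
=> issue(Type = "http://schemas.microsoft.com/claims/authnmethodsreferences", Value = "http://schemas.microsoft.com/claims/multipleauthn");
```

If the issue section had used the MFA Server claim type instead, this rule would fail in an environment using the Azure MFA cloud service, which is exactly the mistake described above.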