ADFS Certificate Renewal

I don’t usually do this, but I love this post so much I needed to tell my readers about it. ADFS has three certificates assigned to it, and it’s uncommon for the token-signing and token-decrypting certificates to be trusted 3rd-party certs; they are usually left as self-signed. Their usage scenario doesn’t demand a 3rd-party cert, and in all honesty using one would provide no tangible benefit. So whenever I deploy ADFS, only the Service Communications certificate uses a trusted 3rd-party cert.

The renewal of these self-signed certificates can be a pain and can easily be forgotten until it’s too late, so whenever I deploy ADFS for a customer, I always set the duration of these certificates to 100 years, using this handy guide!

http://www.expta.com/2015/03/how-to-update-certificates-for-ad-fs-30.html
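For reference, the gist of that guide as a hedged PowerShell sketch. CertificateDuration is measured in days, so 100 years is roughly 36500, and -Urgent rolls the new certs straight into use, so federation partners will need your new public keys afterwards:

    # Only works while AutoCertificateRollover is enabled (the default)
    Set-AdfsProperties -CertificateDuration 36500

    # Regenerate both self-signed certificates with the new lifetime
    Update-AdfsCertificate -CertificateType Token-Signing -Urgent
    Update-AdfsCertificate -CertificateType Token-Decrypting -Urgent

    # Confirm the new NotAfter dates
    Get-AdfsCertificate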

Azure Cloud App Discovery & PAC Files

Azure Cloud App Discovery is a seriously cool piece of technology. Being able to scan your entire computer estate for cloud SaaS applications in either a targeted or catch-all manner can really help uncover the ‘Shadow IT’ going on in your environment. Nowadays, a lack of local admin rights won’t necessarily stop users from adopting whichever cloud SaaS apps they think will increase their productivity, and users don’t generally think about the impact of using such applications, or the potential for data leakage.

But, as with lots of Microsoft’s other cloud technologies being launched left, right and centre at the moment, the Enterprise isn’t catered for as well as it might hope. Most Enterprise IT departments leverage some kind of web filtering or proxying. If yours uses transparent proxying, you can count your blessings, as Cloud App Discovery will work just fine. If you explicitly define a proxy in your internet settings, you can get around that by adding particular registry keys. However, if you are using a PAC file to control access to the internet, then unfortunately Cloud App Discovery will not work for you. This is a shame, as PAC files are, in my opinion, the best way to approach web proxying in an Enterprise, but that’s another story.

From what I have heard, a feature is in the works which will allow you to configure Cloud App Discovery agents to log their findings to an internal data collector. This collector can sit on a local server and upload data to Azure on your behalf, which is a much more elegant solution to the problem of collecting data from multiple machines. However, as far as I know, this feature is not available yet. I’ll be keeping my ear to the ground and will let you know if this changes.

If you are desperate to get your data collection up and running in the meantime, you could change to explicitly defined proxying and configure registry settings for your clients as per the following MS article:

https://azure.microsoft.com/en-gb/documentation/articles/active-directory-cloudappdiscovery-registry-settings-for-proxy-services/
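If you go that route, the explicit proxy itself is just the standard WinINET registry values. A sketch below, where proxy.contoso.com:8080 is a placeholder for your own proxy address (the Cloud App Discovery-specific keys are listed in the article above):

    # Per-user switch from PAC-based config to an explicitly defined proxy
    $is = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings'
    Set-ItemProperty -Path $is -Name ProxyEnable -Value 1 -Type DWord
    Set-ItemProperty -Path $is -Name ProxyServer -Value 'proxy.contoso.com:8080'
    # Drop any PAC reference so it no longer takes precedence
    Remove-ItemProperty -Path $is -Name AutoConfigURL -ErrorAction SilentlyContinue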

Cloud App Discovery is a feature of Azure Active Directory Premium, a toolkit designed to take your Azure Active Directory to new, cloudier heights. Azure AD Premium can be bought standalone, or comes bundled with the Enterprise Mobility Suite. I would highly recommend it to Office 365 customers as it can give you and your users some great new features which can help make your Azure AD the best it can be!

Edit: It looks like PAC file support has been added rather surreptitiously. No announcement was made, and the KB articles haven’t been updated. I happened to check the Change Log today and Release 1.0.10.1 includes an option to tweak your PAC file to support Cloud App Discovery. https://social.technet.microsoft.com/wiki/contents/articles/24616.cloud-app-discovery-agent-changelog.aspx

Alternatively, if you use a PAC file (Proxy Auto Configuration) to manage proxy configuration, please tweak your file to add https://policykeyservice.dc.ad.msft.net/ as an exception URL.

I’m yet to test this myself but it looks promising!
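PAC files are just JavaScript, so the tweak should amount to a one-line exception near the top of your FindProxyForURL function. Something like the following, where proxy.contoso.com:8080 stands in for whatever your existing rules return:

    function FindProxyForURL(url, host) {
        // Send the Cloud App Discovery endpoint straight out, bypassing the proxy
        if (shExpMatch(host, "policykeyservice.dc.ad.msft.net"))
            return "DIRECT";

        // ...your existing rules here...
        return "PROXY proxy.contoso.com:8080";
    }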

Shameless TechNet self promotion

I’ve just realised that I never shared this link. I wrote this technical piece for the TechNet UK Blog back in July 2015, and just thought I’d give it a bump. It covers the post-hybrid Office 365 landscape, and what you should be doing once you’ve migrated all your mailboxes (apart from getting yourself an ‘I am a cloud god’ mug).

Office 365 – The Journey Continues

Happy New Year

Eighteen days into 2016, and after a short flurry of new blog posts, I thought I’d check in and say Happy New Year! I hope all my readers had a great 2015 and have some epic plans for 2016. You will be hearing plenty from me about interesting project-related issues and fixes I come across. Hopefully the IT community can continue to help each other keep the lights on in an industry which is becoming increasingly vital to the day-to-day running of businesses worldwide.

Most people thought that 2015 was the year of the cloud, and indeed a lot of uptake was seen in this area. However, I’m convinced that 2016 is the year that the platform really matures and starts to make good on its promises. It’s true to say that lots of people spent a lot of time in 2015 working even harder than usual, trying to balance the day-to-day running of their existing on-premises infrastructure with various cloud migration projects, and this will surely continue in 2016. As we move through the year though, businesses who have already made this time investment should be able to let their IT departments stop worrying about keeping the lights on, and start planning new ways to make IT awesome for their end users.

For me, that’s what cloud is really about. It takes the hypervisor patching, hardware maintenance and repetitive tasks out of our schedules, and allows us to focus on making IT awesome. If enterprise IT is to evolve in the way it needs to in order to keep up with consumer tech and users’ expectations, we need to invest our time in discovering and deploying new and interesting technologies which will help our businesses succeed. Cloud computing can give us both the time and the technology to evolve the services we offer our users and customers. Machine learning and data analytics are making huge advances at the moment, and the Internet of Things is finally taking off, giving us ways to harvest data which can be used for a real purpose.

I’m very excited about what 2016 has to offer, and you should be too. See you on the other side =]


Exchange 2013 Hybrid – Content was blocked because it was not signed by a valid security certificate

Hello again. The last few days have given me lots of new things to do, so apologies if you are being inundated with blog posts!

So today I went to enable a new Exchange 2013 Hybrid configuration. I used the Start Menu launcher for ‘Exchange Administrative Centre’, which to be honest I don’t usually do. This took me to https://localhost/ecp/?ExchClientVer=15. I then went to Hybrid and enabled the Hybrid Configuration. I logged into Office 365 and was greeted by this friendly message of doom:

Content was blocked because it was not signed by a valid security certificate

This error is quite easily solved; do not use localhost as the server name when you access the ECP. Use your client access namespace instead. For example, if my CAS name was mail.misstech.co.uk, I would browse to https://mail.misstech.co.uk/ecp/?ExchClientVer=15.

Just be sure to put outlook.office365.com and your CAS name into your Intranet Zone too or you’ll then get an error about Cookies!

412 - Cookies are disabled
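If you’d rather script the zone change than click through Internet Options, the Local Intranet zone (zone 1) lives under the ZoneMap key. A sketch using the example names from this post; note that a ‘Site to Zone Assignment’ group policy will override per-user entries like these:

    $zm = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains'
    # outlook.office365.com -> Local Intranet (1) over https
    New-Item -Path "$zm\office365.com\outlook" -Force | Out-Null
    Set-ItemProperty -Path "$zm\office365.com\outlook" -Name https -Value 1 -Type DWord
    # mail.misstech.co.uk (the example CAS name above) -> Local Intranet
    New-Item -Path "$zm\misstech.co.uk\mail" -Force | Out-Null
    Set-ItemProperty -Path "$zm\misstech.co.uk\mail" -Name https -Value 1 -Type DWord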

Thanks for reading!

Exchange 2013 installation and annoying Outlook certificate Security Alert

I know, I know… why am I blabbering on about an Exchange version which is three years old?! The answer is that I still install it all the time, mainly for the purposes of Exchange Hybrid deployments. And this is probably old news to most of you, but if you didn’t know, Exchange 2013 can be particularly annoying when you first install it.

Once the install is completed, an SCP record is registered in Active Directory for your shiny new server (which still has all of its out of the box settings). If you faff around at all after the installation has completed, drinking tea and making merry at the water cooler, you will find that your users start moaning at you about the certificate errors they are receiving.

Outlook Certificate Error

This is because your new server has, without your consent, started merrily responding to Autodiscover and EWS requests made through Active Directory. This new server doesn’t have your public certificate installed, and is also using internal server names for its URLs.

What you need to do on Exchange 2013 to get around this is:

  • Install your trusted 3rd party SAN/wildcard certificate and assign it to the IIS service, then restart IIS (see the sketch after this list)
  • Configure, as a minimum, the Autodiscover, EWS and OAB Internal URLs to reflect your Exchange namespace
    • Set-WebServicesVirtualDirectory -Identity 'Servername\EWS (Default Web Site)' -InternalUrl 'https://namespace.domain.com/ews/exchange.asmx'
    • Set-OabVirtualDirectory -Identity 'Servername\OAB (Default Web Site)' -InternalUrl 'https://namespace.domain.com/OAB'
    • Set-ClientAccessServer -Identity Servername -AutoDiscoverServiceInternalUri 'https://autodiscover.domain.com/autodiscover/autodiscover.xml'
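For the first bullet, a minimal sketch using the standard Exchange Management Shell cmdlets; the PFX path and password prompt are placeholders for your own certificate:

    # Import the trusted 3rd-party cert and bind it to IIS in one pipeline
    $certPwd = Read-Host 'PFX password' -AsSecureString
    Import-ExchangeCertificate -FileData ([Byte[]](Get-Content -Path 'C:\certs\wildcard.pfx' -Encoding Byte -ReadCount 0)) -Password $certPwd |
        Enable-ExchangeCertificate -Services IIS
    # Restart IIS so the new binding takes effect
    iisreset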

This should mitigate the problem while you actually configure your server. Unfortunately it’s just part of the way Autodiscover works, and personally I’d rather it was this way round than having to remember to enable the SCP record at some point. Because knowing me, I’d forget.

You could also mitigate this problem by following the guidance in my blog post about Autodiscover optimisation and disabling SCP lookups temporarily for your users. If you are going to be using Hybrid in the future, this may be desirable anyway.
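If you do want to suppress SCP lookups temporarily, the usual lever is the ExcludeScpLookup registry value from Microsoft’s Autodiscover documentation. A hedged sketch for Outlook 2013 (the 15.0 hive; adjust for other versions):

    # Stop Outlook 2013 querying AD for SCP records for the current user.
    # Delete the value (or set it to 0) to re-enable SCP lookups later.
    $ad = 'HKCU:\Software\Microsoft\Office\15.0\Outlook\AutoDiscover'
    New-Item -Path $ad -Force | Out-Null
    Set-ItemProperty -Path $ad -Name ExcludeScpLookup -Value 1 -Type DWord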


Azure Backup and Resource Manager Virtual Machines

Ask anybody in the Azure teams at Microsoft and they will tell you that Resource Manager is the future of Microsoft’s Azure strategy. And I agree; it’s much more versatile and robust, and it finally gives us the ability to multitask rather than waiting for one task to complete before starting another. For all its benefits though, it is still an immature system. One big bugbear of mine is that you can’t change Availability Sets once a VM is created, but that’s not what I wanted to write about today.

Microsoft recommend that all new services in Azure should be created using the Azure Resource Manager model. Which is all well and good, unless you want to back those servers up using Azure Backup, in which case you will have a problem.

I recently attempted to do this and was happily running through the guide provided by Microsoft entitled Deploy and manage backup to Azure for Windows Server/Windows Client using PowerShell. This article includes the following warning:

Warning: Azure Backup

I didn’t have a backup provider yet so I tried to run the command, only to be told that:

    register-AzureProvider : The term 'register-AzureProvider' is not recognized as the name of a cmdlet,
    function, script file, or operable program. Check the spelling of the name, or if a path was included,
    verify that the path is correct and try again.
    At line:1 char:1
    + register-AzureProvider -ProviderNamespace Microsoft.DevTestLab
    + ~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : ObjectNotFound: (register-AzureProvider:String) [], CommandNotFoundException
        + FullyQualifiedErrorId : CommandNotFoundException

It seems that this cmdlet has been deprecated in Azure PowerShell v1.0.
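Sure enough, Azure PowerShell v1.0 renamed the Resource Manager cmdlets with an ‘Rm’ infix, so the equivalent call appears to be:

    # Log in to the ARM endpoints first
    Login-AzureRmAccount

    # The v1.0 replacement for the deprecated register-AzureProvider
    Register-AzureRmResourceProvider -ProviderNamespace Microsoft.DevTestLab

    # Check the registration state
    Get-AzureRmResourceProvider -ProviderNamespace Microsoft.DevTestLab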

Additionally, when looking into this error, I found the following tidbit of information on Microsoft’s Preparing your environment to back up Azure virtual machines page:

Backing up Azure Resource Manager-based (aka IaaS V2) virtual machines is not supported.

Bit of a showstopper, eh? Luckily my colleague and I simply wanted to schedule a backup of the ADFS database, so to work around this we added a secondary data disk to the VM and used Windows Server Backup to take a System State backup of the primary ADFS server. For those planning a more extensive Azure Backup strategy, you may need to rethink your use of v2 (Resource Manager) virtual machines.
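For the curious, the workaround itself was nothing fancy. A sketch, assuming the new data disk came up as E: (Windows Server Backup wants a dedicated target volume):

    # Install the Windows Server Backup feature if it isn't already present
    Install-WindowsFeature Windows-Server-Backup

    # One-off System State backup of the primary ADFS server to the data disk
    wbadmin start systemstatebackup -backupTarget:E: -quiet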

As with all things Azure, things change all the time, and I’m sure Microsoft will add support for this feature very soon.