I’ve recently been reviewing conditional access policies in Azure AD. Thankfully, this process has become much easier than in the early days, thanks to Azure Monitor integration and report-only conditional access policies, which let you properly pilot a configuration before going live.
I needed to grab an export of all sign-ins that were failing a particular report-only policy that was set up to block legacy authentication. This led me down the path of Azure Monitor and writing my first KQL query.
This KQL query grabs all sign-ins that have failed a report-only conditional access policy, and outputs the sign-in data alongside information about the policy in question:
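The query looks something like this (the projected column names are from the standard SigninLogs schema in Log Analytics – adjust them to suit the data you need):

```kql
SigninLogs
| mv-expand ConditionalAccessPolicies
| where ConditionalAccessPolicies.result == "reportOnlyFailure"
| project TimeGenerated, UserPrincipalName, AppDisplayName, ClientAppUsed,
    PolicyDisplayName = ConditionalAccessPolicies.displayName,
    PolicyResult = ConditionalAccessPolicies.result
```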
Uses the mv-expand operator to expand the ConditionalAccessPolicies collection that’s included with each sign-in’s data. The collection contains one object per conditional access policy in the Azure AD environment
Narrows down the list to only sign-ins where the result of a policy was a “reportOnlyFailure”
Uses the ‘project’ operator to retrieve only the data we’re interested in
From here, you can export the data to CSV and work your magic with it.
Every now and then I find myself wishing I had a documented copy of a clean Default Domain Policy GPO and Default Domain Controllers GPO lying around for reference.
I was setting up a Server 2016 AD lab in Azure today and thought I’d take the opportunity to save a copy of the GPO reports in HTML and PDF format while I was at it. Here they are, in case anybody’s interested:
Microsoft had to go and reinvent the wheel, replacing good ol’ calc.exe in Windows 10 back in late 2017. I can see why they did it – to make it touch-friendly.
I’ve seen an error that prevents the new calculator app from even loading in the first place. I experienced that error on my own machine after clearing out my Windows profile and logging on fresh:
I can’t work without a calculator app – I use it all the time – so I had to set off and try to find a solution. There are all sorts of involved solutions out there, but what worked for me was as simple as two lines of PowerShell (run as an admin):
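Something along these lines – a sketch of the usual re-registration approach for built-in Store apps (the exact package name filter is an assumption):

```powershell
# Re-register the Calculator app's manifest for all users
# (assumption: the standard Get-AppxPackage re-registration approach)
Get-AppxPackage -AllUsers Microsoft.WindowsCalculator |
    ForEach-Object { Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppxManifest.xml" }
```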
After that, my calculator app started working again:
Note: This post has been sitting in my drafts folder for almost a year, waiting for an additional screenshot. I decided to publish it today, but there may now be better solutions to this problem.
I inherited a Windows PKI setup that had the Root CA installed on a Windows Server 2008 R2 Domain Controller, with the root certificate signed with a SHA1 hash. That DC was in the process of being decommissioned, and I also wanted to move to a better PKI design.
I’d previously set up 2-tier Windows PKI infrastructures with offline Root CAs, so I knew that this was the route I was going to take again (note that this is for an SMB environment).
I would, however, highly recommend reading up on the topic before blindly following a guide. PKI is a complex topic, and you want to make the correct decisions up-front to avoid issues later on. Some additional recommended reading:
There are many recommendations around where to publish/advertise the AIA and CDP. Some of these include:
In the default location – LDAP and locally via HTTP on the CA server
To an internally-hosted web server, and then reverse-proxy connections from the Internet
To an externally-hosted web server
I’d already used Azure Blob Storage to store some other small files, so I thought I’d have a go at seeing whether it could be used for AIA and CDP storage. As it turns out, it’s quite easy to do, and you don’t even need to mess around with double-escaping like you would if you hosted on IIS or an Azure Web App:
Playing with AzCopy and Blob Storage as a PoC for PKI CRL storage. It's pretty handy! https://t.co/yrBzDyBEJJ
TL;DR: The CA saves the CRL files to the default location of C:\Windows\System32\CertSrv\CertEnroll, and AzCopy then copies them up to an Azure Blob Storage account that’s configured with a custom domain of pki.yourdomain.com
Here are the requirements to get this all set up:
CDP and AIA on Enterprise/issuing CA configured to save to the default C: location, and also advertise availability at http://pki.yourdomain.com
A folder in the blob storage named ‘pki’ (not necessary, but you’ll need to adjust the script if you don’t use this folder)
A SAS key with read/write/change access to blob storage only (don’t assign more access than necessary)
A scheduled task running hourly as NETWORK SERVICE to call the below PowerShell script
Ensure that NETWORK SERVICE has modify permissions to the log location (default is %ProgramData%\ScriptLogs\Invoke-UpdateAzureBlobPKIStorage.log)
You’ll need to manually copy your offline root CA certificate and CRL to the blob storage location. This script is designed to copy the much more frequent CRLs and Delta CRLs from your Enterprise CA to blob storage.
As it turns out, AzCopy is perfect for this because its /XO parameter copies only files that are newer than what’s already at the destination. That allows us to schedule the script to run hourly without incurring additional data transfer costs for files that already exist in the storage account.
I wrote a PowerShell script that does the following:
Checks that AzCopy is installed
Determines if the C:\Windows\System32\CertSrv\CertEnroll folder exists
Copies only changed files with extension .CRL to the blob storage account
Logs successful and failed transfers to %ProgramData%\ScriptLogs\Invoke-UpdateAzureBlobPKIStorage.log
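In outline, the scheduled script boils down to something like this (a minimal sketch – the storage URL, SAS token, and AzCopy install path are placeholders, and AzCopy 7.x command-line syntax is assumed):

```powershell
# Sketch of the hourly CRL sync to Azure Blob Storage
$azCopy = 'C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe'
$source = 'C:\Windows\System32\CertSrv\CertEnroll'
$dest   = 'https://yourstorageaccount.blob.core.windows.net/pki'   # placeholder
$sas    = '<least-privilege read/write/change SAS token>'          # placeholder
$log    = "$env:ProgramData\ScriptLogs\Invoke-UpdateAzureBlobPKIStorage.log"

if (-not (Test-Path -Path $azCopy)) { throw 'AzCopy is not installed.' }
if (-not (Test-Path -Path $source)) { throw 'CertEnroll folder not found.' }

# /XO skips files that are older than the copy already in blob storage,
# so re-running hourly only transfers new CRLs and Delta CRLs
& $azCopy /Source:$source /Dest:$dest /DestSAS:$sas /Pattern:*.crl /XO /Y /V:$log
```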
Use pkiview.msc on a domain-joined machine to check the status of your CDP and AIA
Generating a SAS with least-privilege for AzCopy to use. Note that you’ll need to set Allowed Protocols to HTTPS and HTTP, not HTTPS only
The script’s archive log, showing the successful transfer of the CRL and Delta CRL
As always, use this at your own risk and your mileage may vary. Please drop me a comment below if you have any questions, feedback, or run into issues with the script.
Most IT departments would have some type of service desk system in place, but are they using it for more than just the basic support scenarios and change control?
Any modern service desk system should also be able to schedule tickets and change requests, and perhaps even perform more advanced workflow functions.
I’m using the excellent Freshservice SaaS app, and I’ve recently been taking advantage of the scheduling and workflow features to automatically generate tickets to:
Schedule manual password changes for service accounts that don’t support the use of Managed Service Accounts (or gMSAs)
Schedule guest Wi-Fi network key rotation
Moving these types of tasks out of the minds and calendars of individual staff is important. It ensures that these sometimes critical actions continue regardless of staff turnover.
Another benefit is that within each scheduled ticket you can include clear written instructions on how to carry out the task. You also gain a long-term audit trail and notes for each time the task was carried out.
One final related note – you could also look into pointing your email security and other notifications to the service desk if you aren’t already doing so. Again, you’ll get a clear owner for each outstanding task, an audit trail of what was done, and you can assign priorities and SLAs. For example:
Testing FileSite 64-bit, I ran into an issue on my own PC. I had 64-bit Office 2016 installed, but the FileSite installer refused to continue and presented me with the following message:
In an attempt to locate the cause of the issue, I fired up the trusty Sysinternals Process Monitor, and set up a filter to capture activity from msiexec.exe. I then further refined that filter to capture only RegQueryValue operations, and re-ran the installer.
Sure enough, Process Monitor picked up some instances of the installer reading from the registry to determine the ‘bitness’ of Office and other iManage products. In my case, there was a lingering registry entry that led the installer to conclude that I still had the 32-bit version of FileSite installed:
Because I didn’t have any iManage products installed at the time, it was safe for me to delete the entire HKLM\SOFTWARE\WOW6432Node\Interwoven reg key.
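In PowerShell, that clean-up is a one-liner (only safe if, like me, you have no iManage products left installed):

```powershell
# Remove the lingering 32-bit iManage registry key
# (only do this when no iManage products remain installed)
Remove-Item -Path 'HKLM:\SOFTWARE\WOW6432Node\Interwoven' -Recurse -Force
```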
Today I became aware of this interesting/potentially dangerous default behaviour in Internet Explorer when you use a proxy configuration PAC/WPAD file. Yes, I know that WPAD is a bad idea for other reasons, too.
To quote the IEInternals blog: “One sometimes surprising aspect of proxy scripts is that they impact the Internet Explorer Security Zone determination…. if a proxy script is in use and returns DIRECT, the target site will be mapped to the Local Intranet Zone.”
This is a non-issue if your PAC file only bypasses the proxy server for internal sites, but if you for some reason need to bypass the proxy for an external site, it suddenly runs outside of Protected Mode, without the protections that the default Internet zone settings provide.
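For illustration, here’s a minimal PAC file that sends all HTTPS traffic DIRECT, bypassing the proxy (the proxy hostname is a placeholder):

```javascript
function FindProxyForURL(url, host) {
    // Bypass the proxy for all HTTPS traffic. With the default IE settings,
    // these destinations are then mapped to the Local Intranet zone.
    if (url.substring(0, 6) == "https:") {
        return "DIRECT";
    }
    // Everything else goes via the proxy (placeholder hostname)
    return "PROXY proxy.yourdomain.com:8080";
}
```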
Here’s a test with the settings in the default state, and the PAC file instructing all HTTPS traffic to BYPASS the proxy:
The solution to this is to ensure that the following box is unchecked.
This setting can be found in Internet Explorer under Internet Options > Security (tab) > Local Intranet > Sites (button)
In a corporate environment, you can disable this “feature” via GPO, under Computer/User Configuration > Policies > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page > Intranet Sites: Include all sites that bypass the proxy server
Disabling via GPO will result in the checkbox being greyed out:
Another test run after making the above changes, showing the correct zone assignment:
Post-publishing footnote:
I discovered that you also need to ensure that Automatically detect intranet network is not checked.
This can be achieved via GPO under Computer/User Configuration > Policies > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page > Turn on automatic detection of intranet (set to disabled)
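As far as I can tell, both checkboxes map to per-user DWORD values under the ZoneMap registry key, so outside of GPO you could set them like this (the value names are an assumption – verify on a test machine by toggling the checkboxes and watching the key):

```powershell
# Per-user equivalents of the two checkboxes (value names assumed; verify first)
$zoneMap = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap'

# 'Include all sites that bypass the proxy server' -> unchecked
Set-ItemProperty -Path $zoneMap -Name 'ProxyBypass' -Value 0 -Type DWord

# 'Automatically detect intranet network' -> unchecked
Set-ItemProperty -Path $zoneMap -Name 'AutoDetect' -Value 0 -Type DWord
```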
On the odd occasion that I need to use variables within Group Policy Preferences, I sometimes find myself wishing that there was a blog post that lists out exactly what the variables resolve to.
For example, does the %ProgramFilesDir% value include a trailing backslash? Or do I need to include one myself?
Sure, you can press F3 to bring up the list of variables, but it doesn’t provide example values:
I decided to use Group Policy Preferences itself to generate a list of the variables and their values. This was achieved through the INI file extension:
I’ve exported these preference items to XML, so you can import them into a fresh GPO and test for yourself. Get the files here.
I couldn’t get the User preferences extension to generate an INI file, and ran out of time to troubleshoot, but here are all the variables pertaining to a Computer policy (I’ve obfuscated some values):
Apologies for the image-based table. WordPress.com doesn’t make inserting tables particularly easy.
I still find Custom Views useful when troubleshooting on individual workstations, and I’d recently been wondering if it was possible to push them out via GPP or similar. I started creating some views manually, as a test, but it was taking too long.
I’d recently been working on implementing Palantir’s WEF/WEC setup, and wondered whether I could leverage their legwork to automate the creation of these custom views.
The script I came up with took a fraction of the time to write, as opposed to the manual method. It does the following:
Downloads the Palantir ‘windows-event-forwarding’ repo in ZIP format into a temporary folder
Extracts the Event Log query out of each file in the ‘wef-subscriptions’ folder, and turns it into an appropriately-named custom Event Viewer view (XML) file in %PROGRAMDATA%\Microsoft\Event Viewer\Views
I love how simple PowerShell makes it to work with XML.
The script needs to be run as an admin in order to create the view files in %PROGRAMDATA%, unless you change the output path in the $templateStoragePath variable. It’ll also need to be able to connect to the Internet to download the ZIP file from GitHub.
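In outline, the approach looks like this – a simplified sketch, not the full script (the ViewerConfig wrapper is abbreviated here, and the subscription XML element names are assumptions based on the repo’s format):

```powershell
# Simplified sketch: turn Palantir WEF subscriptions into Event Viewer custom views
$zipUrl  = 'https://github.com/palantir/windows-event-forwarding/archive/master.zip'
$tempZip = Join-Path $env:TEMP 'wef.zip'
$tempDir = Join-Path $env:TEMP 'wef'
$templateStoragePath = "$env:ProgramData\Microsoft\Event Viewer\Views"

Invoke-WebRequest -Uri $zipUrl -OutFile $tempZip
Expand-Archive -Path $tempZip -DestinationPath $tempDir -Force

Get-ChildItem -Path $tempDir -Recurse -Filter *.xml |
    Where-Object { $_.Directory.Name -eq 'wef-subscriptions' } |
    ForEach-Object {
        # Each subscription file carries its event log query in a CDATA section
        [xml]$subscription = Get-Content -Path $_.FullName -Raw
        $query = $subscription.Subscription.Query.InnerText.Trim()
        $name  = $subscription.Subscription.SubscriptionId

        # Wrap the query in the custom view schema (abbreviated) and save it
        $view = "<ViewerConfig><QueryConfig><QueryNode><Name>$name</Name>$query</QueryNode></QueryConfig></ViewerConfig>"
        $view | Out-File -FilePath (Join-Path $templateStoragePath "$name.xml") -Encoding utf8
    }
```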
I’ve started storing my scripts in my PowerShell GitHub repo rather than as GitHub Gists, and it’s harder to embed them on WordPress.com. View the code via the link below:
There’s so much that can be done with the built-in Windows tools to prevent commodity malware or ransomware attacks before you even spend a cent on third-party tools. All of these things can (and should) be combined to create a good multi-layered strategy:
Using newer Office features to prevent execution of macros in files downloaded from the Internet
Patching applications, keeping them up-to-date
Preventing script hosts and command interpreters from connecting to the Internet
The last point has been on my to-do list for some time now. I was again reminded of it the other day while watching Sami Laiho’s recent Microsoft Ignite session about PAWs.
A lot of email-delivered malware begins with a macro or via DDE attack, and then attempts to connect to the Internet to pull down more nasties.
Today I came across this great blog post by Branden, in which he describes a handy method to prevent applications from communicating with hosts out on the Internet, while still allowing them to communicate within the internal network.
I set about manually creating a list of outbound firewall rules, including a whole bunch to mitigate the application whitelisting bypasses highlighted by the brilliant Casey Smith. Doing this via the GUI is painful, and I wouldn’t wish it on anybody:
Here’s a screenshot of PowerShell connecting to the web, before putting the firewall rule in place:
And here’s one taken after I enabled the firewall rule:
But PowerShell can still connect to an internal web server:
There are obviously going to be exceptions to these rules, for example to enable your IT staff to access Azure AD or other cloud-based services via PowerShell, but those things should be done from dedicated administrative hosts anyway. This ruleset is more for the general user population.
When the time came to think about sharing this ruleset here on my blog, I discovered that it’s possible to export the rules from the registry and re-import them elsewhere; however, that approach has its own potential issues.
I instead created the following PowerShell script that will generate all of the appropriate rules using the New-NetFirewallRule cmdlet. It’s also much easier to review this script to see what it does, rather than read a registry export file.
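As a taste, here’s the shape of one such rule (the display name is illustrative – the script creates one rule per executable, and the four ranges are simply everything outside the RFC1918 private address space):

```powershell
# Block powershell.exe from reaching anything outside the RFC1918 private ranges
New-NetFirewallRule -DisplayName 'Block Outbound (Internet) - powershell.exe' `
    -Direction Outbound -Action Block -Profile Any `
    -Program '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe' `
    -RemoteAddress '0.0.0.0-9.255.255.255',
                   '11.0.0.0-172.15.255.255',
                   '172.32.0.0-192.167.255.255',
                   '192.169.0.0-255.255.255.255'
```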
You could extend this script to apply the rules directly to the appropriate GPO by using the -GPOSession parameter on the New-NetFirewallRule cmdlet.
As usual, run at your own risk, and test thoroughly before deploying:
Create-MitigationFirewallRules – Creates Windows Firewall rules to mitigate certain app whitelisting bypasses and to prevent command interpreters from accessing the Internet
.DESCRIPTION
A script to automatically generate Windows Firewall with Advanced Security outbound rules
to prevent malware from being able to dial home.
These programs will only be allowed to communicate to IP addresses within the private IPv4 RFC1918 ranges: