Managing DNS Aging and Scavenging settings using PowerShell

Aging and scavenging of DNS records is a topic that is pretty well covered on the web. I’m not looking to rehash all that information in this post, but here are some resources for whoever wants to do the reading:

  • This post has a good “primer” for DNS aging and scavenging and the steps for implementing it.
  • This post gives a real life example of how unscavenged records impact authentication mechanisms in Windows
  • This post explains how the configuration of aging and scavenging can be done, either via GUI or batch command line.

I’ll paint the bigger picture for the environment I’m working on right now, perhaps a good example of how typical Windows infrastructure services are set up in global corporations.

  • AD-integrated DNS zones that replicate to all DCs in the forest; the zones allow secure updates only. This means that…
  • …since we run local DHCP services at all locations in the infrastructure, we need to standardise the DHCP scope lease time to a single value for the Windows client scopes before enabling DNS Aging + Scavenging on all our DNS zones (the other scopes we don’t care about: those clients can’t add/update records in DNS, they’re not domain joined and the zone only allows secure updates). Link #2 gives the correlation between DHCP lease time and DNS aging/scavenging of records. A quick way to survey the current lease times is sketched right after this list.
  • We also have the clients register their own DNS records, not the DHCP server itself (this hasn’t come up for change until now).
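
Since standardising the DHCP lease time comes first, here is a minimal sketch of how you could survey the current lease duration of every IPv4 scope, assuming the Windows Server 2012 DhcpServer module is available; the server name is purely illustrative:

# Assumption: 'dhcp1.contoso.com' is one of your DHCP servers and the DhcpServer module is installed
Get-DhcpServerv4Scope -ComputerName 'dhcp1.contoso.com' |
    Select-Object ScopeId, Name, LeaseDuration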

What I am going to script about is what Josh Jones from link #1 above referred to as “the setup phase”. In this phase we are merely configuring the DNS zones to age DNS records according to our requirements. The guys over at cb5 do a fine job of explaining the various scenarios to change this via DNSCMD, via the wizard and all the “bugs” of the GUI wizards.

That may be fine for just a few zones, but when you have tens of DNS zones (most of them reverse DNS) the clicky business starts to sound less fun. Also working with DNSCMD might not be everyone’s cup of tea. Luckily I’m writing this in 2013, a few months after the release of Windows Server 2012 and the shiny new cmdlets it brings, and yes, there are DNS server ones.

So you will need a client running either Windows 8 with the Windows Server 2012 RSAT, or a Windows Server 2012 box (it doesn’t need to be a domain controller or DNS server, a member server is fine).
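
A quick way to confirm the DNS Server cmdlets are actually available on your admin box (just an optional sanity check):

# Lists the DnsServer module if the RSAT DNS tools are installed
Get-Module -ListAvailable DnsServer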

Get DNS Aging and Scavenging Settings

# Load the DnsServer module if it is not already loaded
If (-not (Get-Module DnsServer -ErrorAction SilentlyContinue)) {
    Import-Module DnsServer
}

# Report on existing server settings
$DnsServer = 'dc1.contoso.com'
$filename  = "C:\temp\AD\$($DnsServer)_Before_AgScavConfig_$(Get-Date -UFormat "%Y%m%d-%H%M%S").csv"
$zones = Get-DnsServerZone -ComputerName $DnsServer
$zones | ForEach-Object { Get-DnsServerZoneAging -ComputerName $DnsServer -Name $_.ZoneName } |
    Export-Csv -NoTypeInformation $filename

There’s nothing too fancy about this part. We get all the zones we need using Get-DnsServerZone, then we pass each zone name to Get-DnsServerZoneAging. The output returns the following information:

  • ZoneName: name of the DNS zone
  • ScavengeServers: servers where this zone will be scavenged
  • AgingEnabled: flag indicating whether records are aged or not
  • AvailForScavengeTime: time when the zone becomes eligible for scavenging of stale records
  • NoRefreshInterval: interval during which the timestamp attribute cannot be refreshed on a DNS record
  • RefreshInterval: interval during which the timestamp attribute can be refreshed on a DNS record

If no one ever configured Scavenging on the servers, the output should be pretty much blank.
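
While you are at it, you can also peek at the server-wide scavenging defaults with the same module; this is just an optional extra check, using the $DnsServer variable from the snippet above:

# Server-level scavenging settings (ScavengingState, ScavengingInterval, last scavenge time, etc.)
Get-DnsServerScavenging -ComputerName $DnsServer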

Configure Aging of DNS records for all zones

The following snippet configures aging on all the zones we care about:

If (-not (Get-Module DnsServer -ErrorAction SilentlyContinue)) {
    Import-Module DnsServer
}

# Set new values
$DnsServer = 'dc1.contoso.com'
$DNSIP     = [System.Net.Dns]::GetHostAddresses($DnsServer).IPAddressToString
$NoRefresh = "3.00:00:00"
$Refresh   = "5.00:00:00"

# Keep only the primary, AD-integrated zones (this skips TrustAnchors and the
# auto-created 0/127/255.in-addr.arpa zones, which are not AD integrated)
$zones = Get-DnsServerZone -ComputerName $DnsServer |
    Where-Object { $_.ZoneType -eq 'Primary' -and $_.ZoneName -ne 'TrustAnchors' -and $_.IsDsIntegrated -eq $true }

$zones | ForEach-Object {
    Set-DnsServerZoneAging -ComputerName $DnsServer -Name $_.ZoneName -Aging $true `
        -NoRefreshInterval $NoRefresh -RefreshInterval $Refresh -ScavengeServers $DNSIP -PassThru
}

Learning Points

The $zones variable now contains a filtered list of zones: the primary zones, minus “TrustAnchors” and minus the zones that are not AD integrated (the auto-created 0.in-addr.arpa, 127.in-addr.arpa and 255.in-addr.arpa zones).

Why do we do this? Well, in our case we only run primary and stub zones, which explains the “Primary” filter. We have no use for the “TrustAnchors” zone (more info on Trust Anchors here). Lastly, the filter removes the zones that are not AD integrated (we will never be able to get an IP from those zones, since they only cover network, loopback or broadcast addresses).

Note: If you fail to filter out the “0, 127 and 255” zones, the last command will spit out an error like the one below. I looked up the Win32 9611 error code in the Windows error code list and it means “Invalid Zone Type”. So filter them out, OK?!

Set-DnsServerZoneAging : Failed to set property ScavengeServers for zone 255.in-addr.arpa on server dc1.contoso.com.

At line:1 char:14
+ $zones | % { Set-DnsServerZoneAging -computerName $dnsServer -Name $_.ZoneName - ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 + CategoryInfo : InvalidArgument: (ScavengeServers:root/Microsoft/...ServerZoneAging) [Set-DnsServerZoneAging], CimException
 + FullyQualifiedErrorId : WIN32 9611,Set-DnsServerZoneAging

You should also be careful that the cmdlet expects the Refresh/No-Refresh intervals in a specific format (a [TimeSpan] in d.hh:mm:ss form), and the ScavengeServers parameter needs an IP address, not a hostname.
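
For illustration, here is one way to build those values explicitly; the intervals are TimeSpans in d.hh:mm:ss form and the server name gets resolved to an IP first (same example values as in the snippet above):

# 3-day no-refresh and 5-day refresh intervals, expressed as TimeSpan objects
$NoRefresh = [TimeSpan]::Parse('3.00:00:00')
$Refresh   = [TimeSpan]::Parse('5.00:00:00')
# -ScavengeServers wants an IP address, so resolve the host name first
$DNSIP = [System.Net.Dns]::GetHostAddresses('dc1.contoso.com').IPAddressToString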

The -PassThru switch displays some output on the console, since by default the cmdlet doesn’t generate any output.

The last cmdlet (Set-DnsServerZoneAging) has relatively little documentation floating around the web, and I actually found documentation for some non-existent parameters that got me all excited, something like a “SetAllZones”, but the actual parameter doesn’t exist as of this writing (February 2013). So I had to use a foreach loop to configure each zone.

Wow, initially I wanted this to be a short post, but apparently it added up to something not so short. I hope it is useful and helps with your DNS aging configuration. If there are other, simpler or better ways to accomplish this, I would like to hear about them, just leave a note in the comments.

How to remove a KMS Server from your infrastructure

These days I took a swing at some clean-up I had to do in our KMS servers list. In any large environment you are bound to find configurations you either did not put in place (there is usually more than one person managing it) or that were put in place for testing and never removed. I’m mainly referring to KMS servers that may have once been used to activate Windows licenses, or that people attempted to set up that way (but failed for one or more reasons). You might have this problem in your environment too, and not know about it.

Usually any “rogue” or unauthorized KMS server also publishes its KMS service in DNS. This means that when a client tries to activate, it will pick one of the servers offering the _VLMCS (license activation) service in the _tcp node of the DNS suffixes it has configured, or of its own domain name. By default all KMS hosts publish their service record with equal priority and weight, so with only a few KMS hosts there is a high chance a client gets sent to the wrong/rogue KMS. If the client picks the correct KMS host, all is well with the world; if not, it gets an error and you get an unneeded support call because users can’t activate their Windows.

To fix this you should first find the rogue KMS hosts. Since the information is published in your DNS, this nslookup query should reveal your servers:

nslookup -q=srv _vlmcs._tcp.contoso.com
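
If you prefer PowerShell over nslookup, a roughly equivalent query (assuming the Windows 8 / Server 2012 DnsClient module) would be:

# Same SRV lookup, done with Resolve-DnsName; keep only the SRV answers
Resolve-DnsName -Name '_vlmcs._tcp.contoso.com' -Type SRV |
    Where-Object { $_.Type -eq 'SRV' } |
    Select-Object Name, NameTarget, Priority, Weight, Port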

Run the query for all your subdomains’ FQDNs to list all servers. A sample nslookup output would be:

Server: dc1.contoso.com
Address: 192.100.5.10

_vlmcs._tcp.contoso.com SRV service location:
 priority = 0
 weight = 0
 port = 1688
 svr hostname = KMS01.contoso.com
_vlmcs._tcp.contoso.com SRV service location:
 priority = 0
 weight = 0
 port = 1688
 svr hostname = John-Desktop.contoso.com
KMS01.contoso.com internet address = 192.41.5.4
John-Desktop.contoso.com internet address = 192.20.50.20

As you can see, we have two KMS host entries: one seems valid, the other looks like someone attempted to activate his PC the wrong way and ended up publishing KMS service records in DNS. Here’s how to remove this, for good. Some of the steps are taken from the TechNet documentation, some are from the social.technet forums.

  • Log in / RDP / PSEXEC to the affected host (John-Desktop) and uninstall the KMS host product key. To do this, run the following from an elevated command prompt:
cscript %windir%\system32\slmgr.vbs /upk
  • Install the default KMS client setup key, found here:
cscript %windir%\system32\slmgr.vbs /ipk [KMS client Setup Key]
  • Activate the computer as a KMS client using the command below. In our case it would go to the KMS01.contoso.com host:
cscript %windir%\system32\slmgr.vbs /ato
  • Now you should stop this record from being published in DNS. You guessed it, just because you uninstalled the KMS host key and put in the client key doesn’t mean the machine stopped advertising KMS in DNS. If you are running Windows 2008 R2, slmgr.vbs has a switch which does this for you:
cscript %windir%\system32\slmgr.vbs /cdns

Important note: If you are running Windows 2008 (not Windows 2008 R2), there is no /cdns switch. Also, you cannot run slmgr.vbs with that switch from a 2008 R2 box against the 2008 machine; it will say something like this:


Microsoft (R) Windows Script Host Version 5.8
Copyright (C) Microsoft Corporation. All rights reserved.

The remote machine does not support this version of SLMgr.vbs

The registry change below is also a good “failsafe” in case the /cdns switch didn’t work on Windows 2008 R2. Changing this registry key worked for me; other people suggested other fixes (here) along the same lines, but I didn’t test them. You need to run this command from an elevated command prompt:

reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\SL" /v DisableDnsPublishing /t REG_DWORD /d 1
  • Stop and Start the Software Licensing Service:
net stop SLSVC

net start SLSVC

Update: If running Windows 2008 R2, you should instead restart the Software Protection service:

net stop "Software Protection"

net start "Software Protection"
  • Remove the _vlmcs KMS service record for John-Desktop from the _tcp node of the contoso.com zone. You can do this via the dnsmgmt.msc console, or with PowerShell as sketched below.
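
For reference, here is a rough PowerShell alternative for that last step, assuming the Server 2012 DnsServer module and the same zone/record names as in the example output above:

# Find the stale _vlmcs SRV record pointing at John-Desktop and delete it
Get-DnsServerResourceRecord -ComputerName 'dc1.contoso.com' -ZoneName 'contoso.com' -Name '_vlmcs._tcp' -RRType Srv |
    Where-Object { $_.RecordData.DomainName -like 'John-Desktop*' } |
    Remove-DnsServerResourceRecord -ComputerName 'dc1.contoso.com' -ZoneName 'contoso.com' -Force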

That’s about it. I hope someone finds this one useful. Any comments are welcome.

Active Directory Domain Controller Backups – Part 3

Time for the last part of the Active Directory backup series. The last two posts (#1 and #2) were about defining what needs to be backed up and the scripts/commands used to back it up. This time we will discuss some administrative matters, involving PowerShell and some Group Policy Preferences (GPP) settings. So you have all the parts that make the thing “backup”; now how do you put them to work, regularly, automatically, and with as little maintenance as possible on your side? This is how I chose to do it, from a high level:

  • I created a central location to store the scripts, then deployed a GPO with GPP settings that copies the scripts over from that central location.
  • To run the backup scripts, I used scheduled tasks created on all remote systems with a PowerShell script.

I admit it is not the most elegant, zero-touch approach, but I had to make a compromise. Some of you might think: “Hey, why didn’t you use GPP to deploy the scheduled task? That would have been easier, no reason for this hassle with a script that creates the scheduled tasks.”

The reason I chose to do it via a script is that the actual scripts need network access (they save data to a file share), so using the SYSTEM account to run the batch jobs is out of the question, as it has no network access; I need a domain account (member of Backup Operators) with access to the share. This means I have to be careful when passing credentials to configure the scheduled tasks. As it turns out, when you pass credentials in a GPP they are merely obscured, not encrypted (more info here and here), so there is no way I am exposing Backup Operators credentials over the SYSVOL share; it is not secure. So I could either create the backup tasks manually (2 × number of scripts × number of domains) or create a script to do it for me. They say “efficiency is the most intelligent form of laziness”, so I wrote a script.

With this out of the way, let’s handle each task, first off….

Distributing The Scripts

Create a folder in a central location where the scripts and files will be located. In case you don’t know what I’m talking about, it’s the scripts from this post.

Create a GPO object, link it to the Domain Controllers OU, and configure its GPP settings to copy the folder you set up at step 1 locally onto the DCs, like in the picture below (I’m showing just one file, make sure your GPP has all three files included). The GPO also changes the script execution policy so you can run scripts as batch jobs.
[Screenshot: GPP_Settings]

I’ve applied this GPO to the Domain Controllers OU but restricted it to apply only to a certain security group in AD (and yes, you guessed it, I put the DCs I want to back up in that group).

Creating the Scheduled Jobs

I found Ryan Dennis’s blog here, where he gives a sample of how to create a scheduled task (which he in turn took from another smart man, over here). I took his sample script and tweaked it a little; I needed it to be a bit more generic and able to accept credentials as parameters. Then I created another script that calls Create-ScheduledTask.ps1 to connect to each DC and create the scheduled tasks. Needless to say, you need to be a Domain/Enterprise Admin to run these scripts.

# Read the list of backup tasks and work out which domains we need credentials for
$BackupTasks = Import-Csv -UseCulture "C:\Temp\AD\BCP\BackupSource.csv"
$Domains = $BackupTasks | Group-Object -Property Domain

# Ask once per domain for the backup account credentials
$DomainCreds = @()
foreach ($domain in $Domains) {
    $Creds = Get-Credential -Credential "$($domain.Name)\dom_backup"
    $row = "" | Select-Object Domain, UserName, Password
    $row.Domain   = $domain.Name
    $row.UserName = $Creds.UserName
    $row.Password = $Creds.Password
    $DomainCreds += $row
}

# Create the scheduled tasks on each host, using the matching domain credentials
foreach ($BackupTask in $BackupTasks) {
    $curCred = $DomainCreds | Where-Object { $_.Domain -eq $BackupTask.Domain }
    $SchedTaskCreds = New-Object System.Management.Automation.PSCredential($curCred.UserName, $curCred.Password)
    $ScriptFullPath = $BackupTask.ScriptFolder + "\" + $BackupTask.ScriptName
    .\Create-ScheduledTask.ps1 -HostName $BackupTask.HostName -Description $BackupTask.TaskName -ScriptPath $ScriptFullPath -SchedTaskCreds $SchedTaskCreds
}

As far as learning points go, I first determine which domains I need credentials for, then ask the user interactively to type the account and password for each domain. This saves a lot of password prompts when the scheduled tasks are created.

The two scripts I mentioned are included in this rar file: SchedTask_Creation

This is mostly it. The scheduling script is an initial version; it would be more elegant if it simply pulled the host names from the AD group and then built the script file names and paths for each host from the .csv file.
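
A rough sketch of that refinement, assuming the ActiveDirectory module and a hypothetical group name (“DC-Backup-Servers” is not part of the original setup, just an illustration):

# Pull the computer names straight from the AD group used for GPO filtering
Import-Module ActiveDirectory
$BackupDCs = Get-ADGroupMember -Identity 'DC-Backup-Servers' |
    Where-Object { $_.objectClass -eq 'computer' } |
    Select-Object -ExpandProperty Name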

A further refinement and hardening of this process would be to sign the scripts using a Windows CA and only allow execution of signed scripts on the DCs.
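
As a sketch of what that could look like, assuming a code-signing certificate issued by the internal CA is already in the current user’s certificate store and an illustrative local script path:

# Sign every backup script with the code-signing certificate, then require signed scripts
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Get-ChildItem 'C:\Scripts\AD-Backup\*.ps1' |
    ForEach-Object { Set-AuthenticodeSignature -FilePath $_.FullName -Certificate $cert }
# Enforce it on the DCs (for example via the same GPO)
Set-ExecutionPolicy AllSigned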