Thursday, August 25, 2011

Duplicating an IIS application in Azure with PowerShell

I have a need to add an additional application pool into IIS in my Azure Web Role.

The requirements from the developer are:

  1. I copy an existing application folder under IIS to a new folder.
  2. I turn this copy of the folder into an application.
  3. I add this application to the same application pool of the application I am copying.

I knew what I needed to do.  It can be done straight through the IIS administration console and Explorer: find the physical location of the IIS site, copy the folder, then in IIS right-click the folder and convert it to an application, selecting the existing application pool.  Sounds easy enough.  But scripting it was not that straightforward.  Or was it?

First, this is an Azure Web Role – no “Default Web Site” happening here.  Second, I have an installer that is creating the site I am duplicating.  Third, I have an entire folder of IIS files to copy.

Begin by adding the Azure Service Runtime snap-in and importing the IIS PowerShell module.  Then get information about the role instance where this script is running (we need that later).

Import-Module WebAdministration

Add-PSSnapin Microsoft.WindowsAzure.ServiceRuntime

$role = Get-RoleInstance -Current

I have variables that I am passing in that will be virtual directory paths (note the forward slash).

$auth_virtual_path = "/MyApp/Authentication"
$fed_virtual_path = "/MyApp/FederatedIdentity"

Get the site object through the provider.  I get it this way because I need the PSPath attribute later on.

$site = Get-ChildItem IIS:\Sites | where {$_.Name -match $role.Id}

I mentioned there is no default web site; instead, the site is named with the Role Instance Id.

I now work out where IIS created the physical location of the default web site and placed my virtual paths.  This is Azure – no assumptions as to where things might land.

$authSitePath = (($site.PhysicalPath) + $auth_virtual_path) -replace "/", "\"
"The physical path of " + $auth_virtual_path + " is:  " + $authSitePath
$fedSitePath = (($site.PhysicalPath) + $fed_virtual_path) -replace "/", "\"
"The physical path of " + $fed_virtual_path + " will be:  " + $fedSitePath

Copy the original folder and contents to the new folder.

Copy-Item $authSitePath -Destination $fedSitePath -Recurse

Get the existing web application because I need to assign my copy to the same application pool.

$authApp = Get-WebApplication | where {$_.Path -match $auth_virtual_path}

Create the string for the new PSPath location.

$fedAuthPsPath = (($site.PSPath) + $fed_virtual_path) -replace "/", "\"

Convert the folder that I copied into an application and add it to the existing application pool.

ConvertTo-WebApplication -ApplicationPool $authApp.applicationPool -PSPath $fedAuthPsPath
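Put together, the whole duplication script looks roughly like this (a sketch assembled from the steps above; the virtual paths and the `$role.Id` site match are specific to my environment):

```powershell
# Duplicate an IIS application in an Azure Web Role (sketch; paths are examples)
Import-Module WebAdministration
Add-PSSnapin Microsoft.WindowsAzure.ServiceRuntime

$auth_virtual_path = "/MyApp/Authentication"      # existing application
$fed_virtual_path  = "/MyApp/FederatedIdentity"   # copy to create

# The site is named with the Role Instance Id - no "Default Web Site" in Azure
$role = Get-RoleInstance -Current
$site = Get-ChildItem IIS:\Sites | where {$_.Name -match $role.Id}

# Resolve the physical paths and copy the folder with its contents
$authSitePath = (($site.PhysicalPath) + $auth_virtual_path) -replace "/", "\"
$fedSitePath  = (($site.PhysicalPath) + $fed_virtual_path)  -replace "/", "\"
Copy-Item $authSitePath -Destination $fedSitePath -Recurse

# Convert the copy into an application in the same application pool
$authApp = Get-WebApplication | where {$_.Path -match $auth_virtual_path}
$fedAuthPsPath = (($site.PSPath) + $fed_virtual_path) -replace "/", "\"
ConvertTo-WebApplication -ApplicationPool $authApp.applicationPool -PSPath $fedAuthPsPath
```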

That is it. 

Tuesday, August 23, 2011

NetJoinDomain failed with error code 8557

This was an interesting little one that happened in my Azure Service while using Azure Connect to join my Role instances to my on-premises domain controller.  Let me lay out the scenario:

Trying to apply some best practice to my environment, I am using a regular domain user account in my Role configuration for Azure Connect (why would you ever embed a domain administrator account in a static configuration file?!).


DomJoiner is simply a regular user account; by default, regular users can join machines to a domain.

Everything was working along perfectly fine until yesterday.  I applied the Roles to my Virtual Network group in the Azure Portal and nothing happened.  My machines did not reboot (domain join), they did not appear in the domain, nothing.

Finally I ran across a specific Azure Connect log file, “integrator.log”, found at %programfiles%\Windows Azure Connect\Endpoint\Logs.

Within this log I could see the configuration being received, Azure Connect linking up and my error:

RRAS interface connected

DNS server configured on RRAS interface

NetJoinDomain failed with error code 8557. Target domain......

Oh, an error code – let’s go trawling.  Search was letting me down.  All the error references were for Windows 2000 Server Active Directory and I am using Server 2008 R2.  Also, no references tying the error to Azure.  I can’t imagine I am the only one who has seen this.

Finally, the details of two articles had the knowledge:  KBs 251335 and 314462.  By default (out of the box), a regular user can join only 10 computers to a domain.

I opened ADSIEDIT.msc, selected the properties of the correct naming context, then cleared the ms-DS-MachineAccountQuota attribute.
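The same change can be scripted instead of clicking through ADSIEDIT.  A hedged sketch using the ActiveDirectory module (assuming RSAT is available on the box; the quota value of 100 is just an example – you could also use -Clear to remove the attribute as I did):

```powershell
# Raise the machine-account quota on the domain naming context.
# Requires the ActiveDirectory module (RSAT); run with domain admin rights.
Import-Module ActiveDirectory

$domainDN = (Get-ADDomain).DistinguishedName

# The default value is 10; 100 here is an arbitrary example value.
Set-ADObject -Identity $domainDN -Replace @{ "ms-DS-MachineAccountQuota" = 100 }
```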


This all happened because I am following the prudent practice of using a regular user account (not a domain administrator) to join my Azure Role instances to my domain with Connect.  But most developers I know would only be using a Domain Administrator account and may never see this issue.

Friday, August 19, 2011

Importing a Certificate Revocation List with PowerShell

This was an interesting one and a follow-up to my post about importing a Certificate (.cer) with PowerShell.

I now ran into a situation where I have an application that strictly enforces certificate use through the .NET libraries.  The disconnect came into play because the application was checking the Certificate Revocation List of the certificate that I issued from my private Certificate Authority.

Mind you, if you use a public Certificate Authority you will most likely never see problems, as long as your machine can get to the internet to check the revocation list.  In my case my Certificate Authority server is not on the internet, but my application is (yes, flying high in Azure).

The second hitch came because PowerShell does not have a method to deal with certificate revocation lists within the certificate handling object ( System.Security.Cryptography.X509Certificates ).

Thank goodness that my target system is an Azure Web Role with IIS installed, as that gave me a tool: certutil.exe.

In my Azure Web Role I have a folder with all my scripts and additional binaries for use with Start Up Tasks:


In my PowerShell script I test for the path where the script is running from:

$exPath = Split-Path -parent $MyInvocation.MyCommand.Definition

I then look in this same path (the folder in the snip above) for the .cer and .crl (here focusing just on the .crl):

"Looking for included *.crl.."
$crlFile = get-childitem $exPath | where {$_.Extension -match "crl"}

If I found one, I try to import it into the same store where I imported the root CA certificate of my private Certificate Authority:

if ($crlFile -ne $NULL) {
    "Discovered a .crl as part of the Service, installing it in the LocalMachine\Root certificate store.."
    certutil -addstore Root $crlFile.FullName
}

Put it all together and it looks like this:



###  Import a .crl to the LocalMachine\Root certificate store
<# This allows the addition of a .crl from a private or Enterprise (non-public) Certificate Authority.
   The .crl is simply included with the other scripts in the Role project in Visual Studio #>

$exPath = Split-Path -parent $MyInvocation.MyCommand.Definition

"Looking for included *.crl.."
$crlFile = Get-ChildItem $exPath | where {$_.Extension -match "crl"}
if ($crlFile -ne $NULL) {
    "Discovered a .crl as part of the Service, installing it in the LocalMachine\Root certificate store.."
    certutil -addstore Root $crlFile.FullName
}
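If you want to confirm that revocation checking now works end to end, certutil can also verify a certificate chain, including the revocation check, against what is in the store (a hedged example – MyCert.cer is a placeholder for whatever certificate you are validating):

```powershell
# Verify the certificate chain, including revocation checking against the imported CRL.
# "MyCert.cer" is a placeholder - substitute the certificate you are validating.
certutil -verify "MyCert.cer"
```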

Wednesday, August 17, 2011

Getting remote PowerShell output back to your local machine

PowerShell remoting has been around for some time now, but I have not had a reason to use it.  I finally found one: I needed some information from within a file on a remote system.

This is where WMI let me down.  It could give me information about the file(s) I needed, but I needed to read and parse the content of the file and I further needed to process that at the client machine that was calling the remote machine.

Searching on this really did very little for me; in the end the solution was relatively simple: assign my session output to a variable.

So…  PSRemoting.  First of all you need a PSCredential.  I am hitting remote systems, so I need to build a credential object using a username and password.  My password begins as plain text.

$securePass = ConvertTo-SecureString $password -AsPlainText -force
$creds = New-Object System.Management.Automation.PSCredential($userName,$securepass)

I then open a PS session.

$s = new-pssession -ComputerName $computername -Credential $creds

Now, I can simply invoke commands using this session.  Like this.

invoke-command -Session $s -ScriptBlock {
    $file = get-wmiobject CIM_DataFile -filter 'Extension="config" and FileName="MyFile"'

    # CIM_DataFile exposes the full path in its Name property
    $fileContents = get-content -path ([string]$file.Name)
}

On the screen I have exactly what I want, and I have captured it to $fileContents.  The problem is that I cannot use $fileContents – it does not exist within the local scope; it exists only within the remote (PSSession) scope.

My C# background makes me think of global versus local variables, but that isn’t quite right.  It is closer to separate security contexts – the two scopes are distinct, yet I want to pass data between them.  And I don’t just want to pass a variable in; that direction is straightforward.

Come to find out, I needed to change where I capture $fileContents.

$fileContents = invoke-command -Session $s -ScriptBlock {
    $file = get-wmiobject CIM_DataFile -filter 'Extension="config" and FileName="MyFile"'

    # Returning the content (rather than assigning it remotely) sends it back to the caller
    get-content -path ([string]$file.Name)
}

Now, I have a local object of $fileContents that I can manipulate.

Just remember to close your remote PS session when you are done, especially if you are going to make multiple calls.
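Closing the session is one line (using the $s session variable from above):

```powershell
# Tear down the remote session when finished to free the connection on the server
Remove-PSSession -Session $s
```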