Signing existing OS Configuration Discovery scripts

Very simply, this is an example of how to take the existing configuration item discovery scripts present in a given configuration baseline and sign each of them. This can be useful if you are importing scripts from the SCAP extensions, for example. Depending on your configuration item source, you may have several hundred scripts to sign.

Continue reading


Exporting Office 365 Licenses to CSV

Recently, I was asked by a colleague to review a script that is publicly available in the TechNet Gallery.

The author mentions in the comments, “If you find a way to improve this code, please share it.”  As a result, I thought I’d post my object-based version.

Regarding the “English” names for the license, you can update the $sku hashtable to change the name of the license in the exported CSV.

As far as overall design, I start with a hashtable of fixed property names ($licenseHash).  This is the template for our object.  Then, I loop through the licenses and services available in the tenant and add these as keys to the hashtable.  Once I have discovered all the licenses and services, I retrieve all the users in the tenant.  For each user, I take a copy of $licenseHash and cast it to a [pscustomobject] ($licenseduser).  This allows me to treat each of the licenses as properties of the object.  After mapping the licenses to matching properties, I use Export-CSV to append a row to the output file.
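Stripped of the MSOnline calls, the template-copy pattern described above can be sketched like this (the column names, SKU names, and output path here are illustrative, not the real tenant data):

```powershell
# Template hashtable: one key per column we want in the final CSV
$template = @{ 'UPN' = ''; 'IsLicensed' = $false }

# Discovering a new license SKU adds a column for every future row
$template.Add('ENTERPRISEPACK', '')

# Per user: copy the template so rows don't share state, then cast the copy
$row = [pscustomobject]$template.Clone()
$row.UPN = 'user@contoso.com'
$row.ENTERPRISEPACK = $true

# Append the row to the CSV (temp path used purely for illustration)
$csv = Join-Path ([IO.Path]::GetTempPath()) 'LicenseDemo.csv'
$row | Export-Csv -Path $csv -NoTypeInformation -Append
```

The Clone() call is the important part: casting the same hashtable repeatedly would give every row a reference to the same underlying data.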

Without further ado, here is the script!

Updated Dec 28, 2017 to account for the service status, as per JBros’ comment

<#
This script will create a comma separated file with a line per user and the following columns:
Display Name, Domain, UPN, Is Licensed?, all the SKUs in tenant, all the services,
Errors, ImmutableId and BlockCredential.
Based on previous script by Marcus Tarquinio
After starting the script it will ask for the credentials to connect to Office 365
CSV specified by $csvpath
Version: 1.0
Author: Matthew DeBoer
Creation Date: September 27 2017
Purpose/Change: Initial script development
#>
Import-Module MSOnline
# CSV output path
$csvpath = 'C:\temp\OfficeLicenseCounts.csv'
#Translate SKUs to English
$Sku = @{
"DESKLESSPACK" = "Office 365 (Plan K1)"
"DESKLESSWOFFPACK" = "Office 365 (Plan K2)"
"LITEPACK" = "Office 365 (Plan P1)"
"EXCHANGESTANDARD" = "Office 365 Exchange Online Only"
"STANDARDPACK" = "Enterprise Plan E1"
"STANDARDWOFFPACK" = "Office 365 (Plan E2)"
"ENTERPRISEPACK" = "Enterprise Plan E3"
"ENTERPRISEPACKLRG" = "Enterprise Plan E3"
"ENTERPRISEWITHSCAL" = "Enterprise Plan E4"
"STANDARDPACK_STUDENT" = "Office 365 (Plan A1) for Students"
"STANDARDWOFFPACKPACK_STUDENT" = "Office 365 (Plan A2) for Students"
"ENTERPRISEPACK_STUDENT" = "Office 365 (Plan A3) for Students"
"ENTERPRISEWITHSCAL_STUDENT" = "Office 365 (Plan A4) for Students"
"STANDARDPACK_FACULTY" = "Office 365 (Plan A1) for Faculty"
"STANDARDWOFFPACKPACK_FACULTY" = "Office 365 (Plan A2) for Faculty"
"ENTERPRISEPACK_FACULTY" = "Office 365 (Plan A3) for Faculty"
"ENTERPRISEWITHSCAL_FACULTY" = "Office 365 (Plan A4) for Faculty"
"ENTERPRISEPACK_B_PILOT" = "Office 365 (Enterprise Preview)"
"STANDARD_B_PILOT" = "Office 365 (Small Business Preview)"
"VISIOCLIENT" = "Visio Pro Online"
"POWER_BI_ADDON" = "Office 365 Power BI Addon"
"POWER_BI_INDIVIDUAL_USE" = "Power BI Individual User"
"POWER_BI_STANDALONE" = "Power BI Stand Alone"
"POWER_BI_STANDARD" = "Power-BI standard"
"PROJECTCLIENT" = "Project Professional"
"PROJECTONLINE_PLAN_1" = "Project Online"
"PROJECTONLINE_PLAN_2" = "Project Online and PRO"
"EMS" = "Enterprise Mobility Suite"
"RIGHTSMANAGEMENT_ADHOC" = "Windows Azure Rights Management"
"MCOMEETADV" = "PSTN conferencing"
"SHAREPOINTSTORAGE" = "SharePoint storage"
"PLANNERSTANDALONE" = "Planner Standalone"
"BI_AZURE_P1" = "Power BI Reporting and Analytics"
"INTUNE_A" = "Windows Intune Plan A"
}
# Connect to Office 365 (need modules installed)
write-verbose "Connecting to Office 365..."
$credential = Get-Credential
Connect-MsolService -Credential $credential
# Get a list of all licences that exist within the tenant
write-verbose "Getting the licenses available in tenant"
$licensetype = Get-MsolAccountSku | Where {$_.ConsumedUnits -ge 1}
# License Object. This forms the property names of the user objects we populate later
$licensehash = @{
    'DisplayName'     = ''
    'Domain'          = ''
    'UPN'             = ''
    'IsLicensed'      = ''
    'Errors'          = ''
    'ImmutableID'     = ''
    'BlockCredential' = ''
}
#Get all account SKUs in tenant
$AccountSkus = Get-MsolAccountSku
#Loop through each license in tenant and get the sku
foreach ($license in $licensetype){
    if($license.SkuPartNumber -in $sku.keys){
        $licensename = $sku.($license.SkuPartNumber)
    }else{
        $licensename = $license.skupartnumber
    }
    if($licensename -notin $licensehash.keys){
        $licensehash.Add($licensename,'')
    }

    # Get a list of all the services in the tenant
    $services = ($AccountSkus | where {$_.AccountSkuId -eq $license.AccountSkuId}).ServiceStatus.serviceplan.servicename
    ForEach($service in $services){
        if($service -in $sku.keys){
            $servicename = $sku.($service)
        }else{
            $servicename = $service
        }
        if($servicename -notin $licensehash.keys){
            $licensehash.Add($servicename,'')
        }
    }
}

# Get a list of all the users in the tenant
write-verbose "Getting all users in the Office 365 tenant..."
$users = Get-MsolUser -All
# Loop through all users found in the tenant
foreach ($user in $users){
    $displayname = $user.displayname -Replace ",",""
    # Take a copy of the template hashtable and cast it to a custom object
    $licenseduser = [pscustomobject]$licensehash.Clone()
    $licenseduser.Displayname = $displayname
    $licenseduser.Domain = $user.UserPrincipalName.Split("@")[1]
    $licenseduser.UPN = $user.userprincipalname
    $licenseduser.ImmutableID = $user.immutableid
    $licenseduser.Errors = $user.errors
    $licenseduser.blockcredential = $user.blockcredential
    $licenseduser.IsLicensed = $user.IsLicensed
    if ($user.isLicensed){
        ForEach($userlicense in $user.licenses){
            if($userlicense.AccountSkuID.ToString() -in $licensetype.AccountSKUid){
                $usersku = (($userlicense.accountskuid.tostring()) -split ':')[1]
                if($usersku -in $sku.keys){
                    $usersku = $sku.($usersku)
                }
                if($usersku -in $licensehash.keys){
                    $licensedUser.$usersku = $true
                }
                $UserLicenseConfiguredServices = $userlicense.ServiceStatus | Where{$_.provisioningstatus}
                ForEach($service in $UserLicenseConfiguredServices){
                    if($service.ServicePlan.ServiceName -in $sku.keys){
                        $servicename = $sku.($service.ServicePlan.ServiceName)
                    }else{
                        $servicename = $service.ServicePlan.ServiceName
                    }
                    if($servicename -in $licensehash.keys){
                        $licensedUser.$servicename = $service.ProvisioningStatus
                    }
                }
            }
        }
    }

    $licenseduser | Export-csv -path $csvpath -Force -Append -notypeinformation
}



Azure Pack Automation (SMA) Get-SMAJobOutput fails

I recently ran into an issue where regardless of the method used, I was unable to get any job output from Service Management Automation.

Get-SMAJobOutput -JobID '5c773933-5a9b-4021-a793-768b2efd0165' -WebServiceEndPoint $WebServiceEndpoint -Stream Output

simply returned:

Get-SmaJobOutput : The job 5c773933-5a9b-4021-a793-768b2efd0165 cannot be found: System.Data.Services.Client.DataServiceClientException: <?xml version="1.0" 
encoding="utf-8" standalone="yes"?><error xmlns=""><code></code><message xml:lang="en-US">An error 
occurred while executing the command definition. See the inner exception for details.</message></error>
   at System.Data.Services.Client.BaseAsyncResult.EndExecute[T](Object source, String method, IAsyncResult asyncResult)
   at System.Data.Services.Client.QueryResult.EndExecuteQuery[TElement](Object source, String method, IAsyncResult asyncResult)
At line:1 char:1
+ Get-SmaJobOutput -Id $id -WebServiceEndpoint $webserviceendpoint
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-SmaJobOutput], InvalidOperationException
    + FullyQualifiedErrorId : System.InvalidOperationException,Microsoft.SystemCenter.ServiceManagementAutomation.GetSmaJobOutput

Note that this is not the known issue where the output fails with the same error when the wrong casing is used for the -Stream value.

Additionally, getting the job output from the Azure Pack admin site simply returned a red exclamation mark.

Next, I adapted a REST-based function to get the output from the REST API. The original script was modified by Christopher Keyaert:

Function Get-SMARESTJobOutput{
    param(
        [string]$WebServiceEndpoint,
        [string]$ID,
        [string]$Stream = 'Output',
        [pscredential]$Credentials
    )
    # Ignore SSL Self-Signed certificate
add-type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    public class IgnoreSelfSignedCertificate : ICertificatePolicy {
        public IgnoreSelfSignedCertificate() {}
        public bool CheckValidationResult(
            ServicePoint sPoint, X509Certificate cert,
            WebRequest wRequest, int certProb) {
            return true;
        }
    }
"@ -ErrorAction SilentlyContinue
    [System.Net.ServicePointManager]::CertificatePolicy = new-object IgnoreSelfSignedCertificate

    $SMAServer = "${WebServiceEndpoint}:9090"
    $VerbosePreference = "Continue"
    #$VerbosePreference = "SilentlyContinue"

    # Get the job status (property path per the SMA OData response)
    $URI =  "$SMAServer/00000000-0000-0000-0000-000000000000/Jobs(guid'" + $ID + "')"
    $Response = if($Credentials){Invoke-RestMethod -Uri $URI -Method Get -Credential $Credentials}else{Invoke-RestMethod -Uri $URI -Method Get}
    $JobStatus = $Response.entry.properties.JobStatus
    Write-Verbose "Job Status = $JobStatus"

    # Get the requested stream items
    $URI =   "$SMAServer/00000000-0000-0000-0000-000000000000/JobStreams/GetStreamItems?jobId='" + $ID + "'&streamType='$Stream'"
    Try{
        $Result = if($Credentials){Invoke-RestMethod -Uri $URI -Method Get -Credential $Credentials}else{Invoke-RestMethod -Uri $URI -Method Get}
    }Catch [exception]{
        $Exception = $_
    }
    $StreamText = $Result.content.properties.StreamText
    Write-Verbose "StreamText = $StreamText"
    $outputname = "Stream${Stream}"
    New-Object -TypeName PSCustomObject -Property @{'ID'=$ID;'Status'=$JobStatus;$outputname=$StreamText}
}

This returned the same error as Get-SMAJobOutput ("An error occurred while executing the command definition. See the inner exception for details").

The more I researched this, the more it looked like a performance issue querying the database.
After a bit of digging, I found that the [Stream].[JobStreams] table in the SMA database had ~17,000,000 rows. Using the following command, I dropped the job history retention down to 30 days (we had previously set it to 90):

Set-SmaAdminConfiguration -PurgeJobsOlderThanCountDays 30 -WebServiceEndpoint $webserviceendpoint

This took 8 hours to run and dropped the table down to 9,000,000 rows. The issue remained, so I dropped retention further to 25 days and also disabled debug and progress logging on all runbooks. This brought the table down to 7,500,000 rows. Now the issue occurred only ~75% of the time. Progress!

Since I had now verified that this was a SQL performance issue, I turned to the Activity Monitor in SQL Server Management Studio.
Sure enough, the JobStreams table did not have an index usable for the query issued by the Get-SMAJobOutput PowerShell command. Adding the following index (the index name is arbitrary) dropped the query time to near-instantaneous:

CREATE NONCLUSTERED INDEX [IX_JobStreams_TenantId_JobId]
ON [Stream].[JobStreams] ([TenantId],[JobId])

The issue was now resolved! Please note that this fix may not be supported by Microsoft. Please use at your own discretion.

Modifying SharePoint Calendar via Powershell

First off, you’ll need the SharePoint Client Components for your version of SharePoint; for myself, I needed the 2013 version.

In order to be able to reference the Sharepoint Client objects, you’ll need to load the assemblies:

# Import Sharepoint client Assemblies
$null = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.sharepoint.client")
$null = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.sharepoint.client.runtime")

Next, we’ll create a client context object, then load the site, the site credentials, and the site lists (since calendars are really just SharePoint lists with a defined display):

# Connect to Site and get site lists
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext $siteurl
$web = $ctx.Web
$CredCache = New-Object System.Net.CredentialCache
$credcache.Add((New-Object System.Uri $siteURL), "NTLM", [System.Net.CredentialCache]::DefaultNetworkCredentials)
$ctx.credentials = $credcache
$ctx.Load($web.Lists)
$ctx.ExecuteQuery()

There are a couple things to note here.
1. The CredCache portion may not be necessary for your SharePoint installation. For myself, it was necessary in order to force NTLM authentication. Otherwise, see the links below for other authentication methods.
2. Note that these don’t behave like typical .NET objects. In order to populate the properties of each object, you must use the client context object to load the properties, then send that request to the server with ExecuteQuery(). Exactly which properties are available, or which properties need special load commands, is still somewhat of a mystery to me.

Next we’ll get the Sharepoint calendar we are interested in and get all the items in the list.

# Get Patching calendar and items
$cal = $web.lists.getbytitle('Name of your SharePoint calendar here')
$ctx.Load($cal)
$ctx.ExecuteQuery()
$query = [Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery(1000)
$items = [Microsoft.SharePoint.Client.ListItemCollection]$cal.GetItems($query)
$ctx.Load($items)
$ctx.ExecuteQuery()

Notice that in order to get the items in the calendar, we need to use the Load and ExecuteQuery methods twice (once for each action).

Next, I wanted to delete all calendar entries newer than today or older than 30 days.

$Today = Get-Date
Foreach($i in ($items | %{$_.FieldValues.ID})){
    $listitem = $items.GetById($i)
    $ctx.Load($listitem)
    $ctx.ExecuteQuery()
    $datediff = ($Today - [datetime]$listitem.FieldValues.EventDate).totaldays
    if(($dateDiff -gt 30) -or ($datediff -lt 0)){
        $listitem.DeleteObject()
        $ctx.ExecuteQuery()
    }
}

Since the previous code block already loaded the $items object, we can read the FieldValues.ID property of each entry. We iterate through each ID and check whether we want to delete that entry. Note, however, that if we did ForEach($item in $items), $items would no longer be valid once we deleted the first item, because the collection changes underneath the enumerator. This is why we iterate through an array of IDs rather than the objects themselves. Also worth mentioning: all of the fields of the list item are available in $listitem.FieldValues, but only after you call the Load($listitem) and ExecuteQuery() methods. After this is complete, we can do a date comparison and delete the list item from the calendar.
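The delete-while-iterating pitfall is general, not SharePoint-specific. A minimal sketch of the snapshot-the-IDs approach, using a plain .NET list (the values are illustrative):

```powershell
# Deleting from a collection you are iterating invalidates the enumerator,
# so snapshot the IDs first and delete by ID
$items = [System.Collections.Generic.List[int]]@(101, 102, 103, 104)

foreach ($id in @($items)) {       # @() copies the IDs into a static array
    if ($id % 2 -eq 0) {
        $null = $items.Remove($id) # safe: we iterate the copy, not $items
    }
}

$items                             # 101 and 103 remain
```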

Next, we’ll add a new list item

$listCreationInformation = New-object Microsoft.SharePoint.Client.ListItemCreationInformation
$listitem = $cal.AddItem($listCreationInformation)
$listitem.ParseAndSetFieldValue('Title', "Title of your new event")
$listitem.ParseAndSetFieldValue('Description', "Description of your new event")
$listitem.ParseAndSetFieldValue('EventDate', $startTime)
$listitem.ParseAndSetFieldValue('EndDate', $endTime)
$listitem.Update()
$ctx.ExecuteQuery()

If you are looking for field names to use when setting your values, it can be useful to query an existing item in the list:

$items = [Microsoft.SharePoint.Client.ListItemCollection]$cal.GetItems($query)
$ctx.Load($items)
$ctx.ExecuteQuery()
$items[0].FieldValues

If you are looking for more information on editing SharePoint with PowerShell, I highly recommend the resources below. Each of these provided a portion of the understanding laid out above; unfortunately, they do not cover iterating through entries in a list or the NTLM authentication mechanism that my environment required.

SAPIEN Powershell Studio – Scaling form contents when resizing

Update: June Blender kindly reached out to the experts on this and provided the following explanation.

The Anchor property specifies that the given edge of the control should maintain its position relative to the same edge of its parent object.

This means that an anchor set to

  • None will float proportionally to the parent object
  • Left will maintain the number of pixels between the left side of the object and the left side of the parent object
  • Right will maintain the number of pixels between the right side of the object and the right side of the parent object
  • Top  will maintain the number of pixels between the upper edge of the object and the upper edge of the parent object
  • Bottom  will maintain the number of pixels between the lower edge of the object and the lower edge of the parent object
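The rules above translate directly to each control's Anchor property in script. A minimal, Windows-only WinForms sketch (the form and control here are illustrative, not from the Studio project):

```powershell
Add-Type -AssemblyName System.Windows.Forms

$form    = New-Object System.Windows.Forms.Form
$textbox = New-Object System.Windows.Forms.TextBox

# Pin all four edges: the textbox now stretches with the form, keeping its
# pixel distance to each parent edge constant
$textbox.Anchor = [System.Windows.Forms.AnchorStyles]'Top,Bottom,Left,Right'
$form.Controls.Add($textbox)

# Anchor = 'None' would instead float the control proportionally as the
# parent resizes
```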

Continue reading

MS16-072 Causing GPO Application Problems

Three days after Patch Tuesday, this has been a fairly well reported issue.
There have been some other blog posts about identifying troublesome GPOs.

The issue is that, after MS16-072, either the Domain Computers group or the Authenticated Users group must be able to read (not necessarily apply) each Group Policy object. Any group policy that is missing this read permission will not apply, even if the user or computer has the GPOApply permission delegated from another group.

Depending on the extent of your group policy environment, previously published scripts may not scale well; I had upwards of 100 group policies to identify and fix. Since I’d rather identify the GPO objects that need fixing and then add the permissions with a script, this is more useful as a function with object output. That allows filtering with standard commands such as Where-Object, exporting with Export-CSV, or reusing the output in a fix-up script.

Here is my modified version:

Function Test-GPOAuthenticatedUsers{
    #Load GPO module
    Import-Module GroupPolicy

    #Get all GPOs in current domain
    $GPOs = Get-GPO -All

    #Check we have GPOs
    if ($GPOs) {
        #Loop through GPOs
        foreach ($GPO in $GPOs) {
            #Get Authenticated Users and Domain Computers permissions
            $AuthUser = Get-GPPermissions -Guid $GPO.Id -TargetName 'Authenticated Users' -TargetType Group -ErrorAction SilentlyContinue
            $DomComp = Get-GPPermissions -Guid $GPO.Id -TargetName 'Domain Computers' -TargetType Group -ErrorAction SilentlyContinue

            #Check Authenticated Users and Domain Computers permissions
            if ((!$AuthUser) -and (!$DomComp)) {
                $Message = 'Missing both Domain Computers and Authenticated Users Permissions'
                $Status = 'Error'
            }elseif (!$AuthUser) {
                #Check if Domain Computers have read
                if ($DomComp.Permission -notin @('GPORead','GPOApply')) {
                    $Message = 'Missing Authenticated Users permission; Domain Computers permissions found not matching read/apply'
                    $Status = 'Error'
                }else{
                    $Message = 'Missing Authenticated Users permission, but found Read or Apply Domain Computers permissions'
                    $Status = 'Warning'
                }
            }else{
                #Check if Authenticated Users has read
                if ($AuthUser.Permission -notin @('GPORead','GPOApply')) {
                    $Message = 'Authenticated Users permissions found not matching read/apply'
                    $Status = 'Error'
                }else{
                    $Message = 'Read or Apply Authenticated Users permissions found'
                    $Status = 'Good'
                }
            }

            #Emit one object per GPO
            [pscustomobject]@{
                'Name'    = $GPO.DisplayName
                'Id'      = $GPO.Id
                'Status'  = $Status
                'Message' = $Message
            }
        }
    }
}

Usage is quite simple:


Test-GPOAuthenticatedUsers | Where{$_.status -eq 'Error'}

#Export to CSV
Test-GPOAuthenticatedUsers | Export-CSV -path 'C:\temp\Test-GPOAuthenticatedUsers.csv' -NoTypeInformation

This is all fine and nice for reporting purposes… but let’s actually fix something:

Function Fix-GPOAuthenticatedUsers{
    param(
        #GPO Guid(s) to fix
        [guid[]]$GPOID,
        #Which AD Group to use
        [ValidateSet('Domain Computers','Authenticated Users')]
        [string]$Group = 'Authenticated Users'
    )
    #Load GPO module
    Import-Module GroupPolicy
    ForEach($GUID in $GPOID){
        Write-Verbose "Processing GPO $GUID"
        #Get the GPO
        if(!(Get-GPO -Guid $GUID)){
            Write-Error "Unable to find GPO matching $GUID"
            Continue
        }

        #Set the permissions
        $null = Set-GPPermissions -Guid $GUID -PermissionLevel GpoRead -TargetName $Group -TargetType Group -ErrorAction Stop

        #Test GPO perms
        $Perm = Get-GPPermissions -Guid $GUID -TargetName $Group -TargetType Group -ErrorAction SilentlyContinue
        if ($Perm.Permission -notin @('GPORead','GPOApply')) {
            $Message = "$Group read/apply permissions not found after setting"
            Write-Error $Message
        }
        Write-Verbose "Completed Processing GPO $GUID"
    }
}

Usage again is quite simple: specify one or more GPO Guids and whether you’d like to use Authenticated Users or Domain Computers, and voila!

$BadGPOs = Test-GPOAuthenticatedUsers | Where{$_.status -eq 'Error'}
Fix-GPOAuthenticatedUsers -GPOID $BadGPOs.Id -Group 'Authenticated Users'


ConfigMgr Client Fails to Install: Unable to Compile UpdatesAgent.mof

We’ve had a couple of computers in the past that were unable to re-install the Configuration Manager client due to the error:
“Unable to compile UpdatesAgent.mof”

This error can have a couple of different causes.

As such, here are a couple of steps you can try:

1. Reinstall the Windows Update agent.
2. Uninstall any existing ConfigMgr client, stop the ccmsetup service and delete c:\windows\ccm, c:\windows\ccmsetup and c:\windows\ccmcache folders
3. Run the following commands to delete the ConfigMgr namespaces completely from WMI:

Gwmi -Query "Select * from __Namespace Where Name='sms'" -Namespace "root\cimv2" | Remove-WmiObject
Gwmi -Query "Select * from __Namespace Where Name='ccm'" -Namespace "root" | Remove-WmiObject
Gwmi -Query "Select * from __Namespace Where Name='smsdm'" -Namespace "root" | Remove-WmiObject

Since step 3 is quite drastic, try steps 1 and 2 first. If you do attempt step 3, complete steps 2 and 3 together. After this, the ConfigMgr client should install successfully.

Hopefully this helps!

Recursively Discover WMI Namespaces

Sometimes when building custom functions based on WMI it can be helpful to discover all of the WMI namespaces registered on a machine. Here is a function to do just that!

Function Find-WMINamespaces{
    param(
        [string]$Computer = $env:COMPUTERNAME,
        [string]$StartingNameSpace = 'root',
        [boolean]$Recurse = $true
    )
    $childNamespaces = gwmi -Namespace $StartingNameSpace -Class "__Namespace" -ComputerName $Computer | select -ExpandProperty Name
    ForEach($child in $childNamespaces){
        #Output the namespace, then descend into it
        "${StartingNameSpace}\${child}"
        if($Recurse){
            Find-WMINamespaces -Computer $Computer -StartingNameSpace "${StartingNameSpace}\${child}" -Recurse $true
        }
    }
}

Content Library Explorer – The Legacy Package Does Not Have Exactly One Content

I recently ran into an issue, where my Primary site server was running low on disk space. This turned into a general spring cleaning of the ConfigMgr environment. As part of the cleanup process, I wanted to check the distribution points for old or stale packages.

Microsoft has provided a toolkit for cleanup operations such as this.

Part of this toolkit is the Content Library Explorer. However, after aiming this at my distribution point, I was confronted with the error in this post’s title: “The legacy package does not have exactly one content.”

Not exactly an insightful message. I did, however, find a useful thread regarding this issue.

Using the provided script, I happily identified 3 packages that were causing issues. I simply removed the extra old folders and redistributed the packages. The extra folders were now gone, but the error message remained.

After doing some more digging with Procmon, I identified the verification steps the Content Library Explorer appears to make, as well as three different problems that can lead to the above error message:

1. More than 1 data folder exists for a given package in the datalib subfolder of the SCCMContentLib folder. (This is addressed by the script in the link above)
2. An ini file exists in the pkglib subfolder of the SCCMContentLib folder, but the associated ini file in the datalib folder is missing.
3. There are multiple content versions listed in the ini file located in the pkglib folder.

I have written the following function to test the SCCMContentLib folder for these problems. Problems found by this function can be fixed by removing any extra folders for the given package from the DataLib folder, removing the distribution point from the package, waiting for the files to disappear, and then redistributing the package to the distribution point.

Function Test-DPLegacyContent{
    param(
        [string]$DPFolderPath
    )
    #Calculate child folders
    $pkgdir = Join-Path -Path $DPFolderPath -ChildPath 'pkglib'
    $datadir = Join-Path -Path $DPFolderPath -ChildPath 'datalib'
    $childdatafolders = Get-ChildItem -Directory $datadir

    ForEach($file in (Get-ChildItem -File $pkgdir)){
        $filecontent = Get-Content $file.FullName
        $expectedcontent = $filecontent[1].replace('=','')
        if($expectedcontent -match $file.basename){
            #legacy package
            $packageID = $file.basename
            #Check for missing INI files
            if(!(Test-Path (Join-Path $datadir -ChildPath "${expectedcontent}.ini"))){
                [pscustomobject]@{'PackageID'=$PackageID; 'Error'="Ini file missing in datalib for $packageID"}
            }

            #Check for mismatch in folder count
            [array]$matchingFolders = [array]($childdatafolders | Where{$_.Name -match $packageID})
            $foldercount = $matchingfolders.count
            if($foldercount -ne 1){
                [pscustomobject]@{'PackageID'=$PackageID; 'Error'="$foldercount folders found"}
            }
        }
        #Check for multiple content versions in pkg ini
        if(($filecontent[2].replace('=','')) -match $file.basename){
            [pscustomobject]@{'PackageID'=$file.BaseName; 'Error'="Multiple package versions found in pkglib ini"}
        }
    }
}

Test-DPLegacyContent -dpfolderpath '\\DPServer\d$\SCCMContentLib\'

To use this function, simply change the DPFolderPath parameter to the path of your SCCMContentLib folder.

Launching a Java JNLP file with an Old version of JRE

Back in February I wrote a post about using an Oracle Deployment Rule Set to control the version of Java that executes a given jar file on a given website. Since that time, you may have discovered that applying this to a JNLP file is tricky at best. The problem is that Internet Explorer downloads the JNLP file to your Temporary Internet Files before executing it. As a result, the location it launches from is not the online URL, but the local JNLP file. It may be technically possible to find the certificate hash of each jar file referenced in the JNLP file and add these to your DeploymentRuleSet.jar file, but I have not tested this.

As with most Java applets, problems begin to occur when a version of the JRE is installed that is higher than the version the applet is designed for. At first, what I tried to do was run Procmon on a computer with just the required version of Java installed.

Continue reading

Automating Download of Google Chrome

If you manage multiple computers, you will want to ensure that Google Chrome is always up to date. This is important both because Chrome contains Flash built in by default and to patch any security vulnerabilities in Chrome itself. Historically, you had two options to accomplish this:
1. Allow all your computers to auto-update. This works, but every computer downloads approximately 50 MB each time Chrome updates. If you are on a slow connection, or you have lots of computers, this will be an issue.
2. Manually download the Google Chrome for Business MSI file and deploy it via Microsoft System Center Configuration Manager or group policy.

You may also have a situation where neither of these is possible or desirable. May I present a third option:
3. Have a scheduled task (or perhaps an Orchestrator or SMA runbook) check the latest available version and download it if a newer version is available. Use another scheduled task on each computer to install the update from your central download cache if it is newer than the installed version.
Continue reading

SysJam Powershell RightClick Tool – Part 7 – Querying 64bit Registry Strings with 32bit Powershell

Suppose you are writing a script that requires a 32-bit PowerShell session. How can you query 64-bit registry keys from it?

If you are connecting to a remote computer, your most obvious and best option is Enter-PSSession or New-PSSession to open a native PowerShell session on the remote computer.

You also have other options for querying registry strings, but we need to be careful.

Option 1: Get-Item
Let’s first consider using Get-ItemProperty directly. Here is a very simple function with no error control (or warranty):

Function Get-RegStringValueByGetItem{
    param(
        [string]$KeyPath,
        [string]$StringName
    )
    $objRegKey = Get-ItemProperty $KeyPath -Name $StringName
    Return $objRegKey.($StringName)
}

When running from a 64bit powershell session, this returns as expected:

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files\Common Files

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

Let’s try this from a 32-bit PowerShell session:

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

So, you can see that when running from a 32bit session we get redirected to the Wow6432Node without warning.
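Since the redirection is silent, it can help to first confirm the bitness of the session you are running in before trusting what the registry returns. A quick check:

```powershell
# Detect whether the current PowerShell process is 32- or 64-bit
if ([Environment]::Is64BitProcess) {
    '64-bit process: registry paths behave as written'
} else {
    '32-bit process: HKLM\Software reads are redirected to Wow6432Node'
}

# [IntPtr]::Size is 8 in a 64-bit process and 4 in a 32-bit one
[IntPtr]::Size
```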

Option 2: Microsoft.Win32.RegistryKey
We know that PowerShell has access to the .NET classes, so let’s try it through the Microsoft.Win32.RegistryKey class. You can read more about this class on MSDN.

Function Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS{
    param(
        [string]$KeyPath,
        [string]$StringName,
        [ValidateSet('HKLM','HKCR','HKCU','HKU')]
        [string]$KeyRoot
    )
    Switch($KeyRoot){
        'HKLM' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'LocalMachine'}
        'HKCR' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'ClassesRoot'}
        'HKCU' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'CurrentUser'}
        'HKU'  {$strKeyRoot = [Microsoft.Win32.RegistryHive]'Users'}
    }

    $objReg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($strKeyRoot, 'localhost')
    $objSubKey = $objReg.OpenSubKey($KeyPath)
    $strValue = $objSubKey.GetValue($StringName)

    Return ($strValue)
}

When running from a 64bit powershell session, this returns as expected:

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Let’s try this from a 32-bit PowerShell session:

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

So, you can see that once again when running from a 32bit session we get redirected to the Wow6432Node without warning.

Option 3: WbemScripting.SWbemNamedValueSet
This last option may take a little effort to wrap your head around.

Function Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM{
    param(
        [string]$KeyPath,
        [string]$StringName,
        [ValidateSet('HKLM','HKCR','HKCU','HKU')]
        [string]$KeyRoot
    )
    Switch($KeyRoot){
        'HKLM' {$strKeyRoot = '&h80000002'}
        'HKCR' {$strKeyRoot = '&h80000000'}
        'HKCU' {$strKeyRoot = '&h80000001'}
        'HKU'  {$strKeyRoot = '&h80000003'}
    }

    #Use the wbem scripting com object to enumerate the 64 bit standard registry provider
    $objNamedValueSet = New-Object -COM 'WbemScripting.SWbemNamedValueSet'
    $objNamedValueSet.Add('__ProviderArchitecture', 64) | Out-Null
    $objLocator = New-Object -COM 'Wbemscripting.SWbemLocator'
    $objServices = $objLocator.ConnectServer('localhost', 'root\default', '', '', '', '', '', $objNamedValueSet)
    $objStdRegProv = $objServices.Get('StdRegProv')
    $Inparams = ($objStdRegProv.Methods_ | where { $_.Name -eq 'GetStringValue' }).InParameters.SpawnInstance_()

    # Add the input parameters
    ($Inparams.Properties_ | where { $_.Name -eq 'Hdefkey' }).Value = $strKeyRoot
    ($Inparams.Properties_ | where { $_.Name -eq 'Ssubkeyname' }).Value = $KeyPath
    ($Inparams.Properties_ | where { $_.Name -eq 'Svaluename' }).Value = $StringName

    #Execute the method
    $Outparams = $objStdRegProv.ExecMethod_('GetStringValue', $Inparams, '', $objNamedValueSet)
    Return ($Outparams.Properties_ | where { $_.Name -eq 'sValue' }).Value
}


You can read more about this COM object, and the underlying StdRegProv class, on MSDN.

Essentially, what we are doing is using the WBEM scripting COM object to reference the 64-bit registry provider.

Let’s see how this works in a 64-bit session:

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

And from a 32bit session:

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files


You can see that we needed an intermediate layer (the WBEM provider) to query the 64-bit registry from a 32-bit shell. This is also the method I used in the SysJam Powershell RightClick tool to get the execution history from an SCCM client, and it is usable with any of the methods defined by the StdRegProv class.

SysJam Powershell RightClick Tool – Part 6 – Getting Running ConfigMgr Jobs with Powershell

One of the key functionality improvements I wanted to include in the SysJam Powershell RightClick tool was realtime monitoring of running jobs. Few tools show running jobs without requiring you to click some type of refresh button.

Part 1 of providing this functionality is using PowerShell jobs combined with a timer object to poll data on a refresh cycle and update the form. I have covered this previously.

Part 2 is querying WMI for the status of each running job… and translating this to English.

This post will concentrate on Part 2.

Continue reading

SysJam Powershell RightClick Tool – Part 5 – By-Passing User Logon Requirement for a Program

Most of the time when deploying software, I’ll set the program to run only with the user logged off. This avoids situations where the user has an older version of the application open when they receive the advertisement. For testing, however, this can be a pain… which is why I included the “ByPass User Logon Requirement (Temporary)” button in the SysJam Powershell RightClick tool. This button temporarily sets the requirement to “None” within WMI; the next time the system does a machine policy refresh, the setting is overwritten.
Continue reading

Modifying Permissions on an Active Directory OU with Powershell

There are a few other blogs about this topic, but even so, it is not an easy topic to wrap your head around.

To start you off, there is one article you’ll want to read first. (It is a must read!)
Continue reading