Random Clients fail to download content (0x80070003)



******Update July 27 2017:  I have confirmed that the below issue is addressed via: KB4035759.  Thanks to all who helped! ******



This post outlines an issue that I am seeing in Configuration Manager starting in 1702. I currently have a call open with MS and will update the post once the issue is resolved. I am currently aware of 2 environments with the issue and am posting this in case it affects others as well. If you have this issue in your environment, please shoot me a message with any cases you have open so Microsoft can see the commonalities in our environments.

Steps to Reproduce:
1. The issue occurs on newly built computers (possibly simply because they are installing the most software). We have reproduced the issue on computers built with our current gold image, with last December’s gold image, and with a new task sequence and a VLSC ISO. Failure rates are around 20%, so a minimum of 5 computers should be built. For our tests we built 15 computers in total, all in the same subnet.
2. Mass add computers to software deployment collections

Continue reading


Silent Scripted PNP Driver Installation

Occasionally, you may find the need to push a new driver to computers, perhaps because a driver is causing BSOD issues. Since DotNet does not have a direct way to do this, you are usually left depending on the driver publisher to include a silent installation method, which in reality rarely happens. You definitely don’t want to run around and manually install the drivers, and tools like Configuration Manager don’t support post-OS-deployment driver installation.
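Where a vendor silent installer is missing, one hedged approach is to stage the driver with pnputil.exe, which ships with Windows (the .inf path below is a placeholder):

```powershell
# Stage and install a driver package with pnputil (Windows 10 / Server 2016+ syntax).
# The path is a placeholder; point it at your extracted driver's .inf file.
$inf = 'C:\Drivers\MyNic\netdrv.inf'
pnputil.exe /add-driver $inf /install
```

On older operating systems the equivalent invocation is `pnputil -i -a <path to inf>`.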

Continue reading

Azure Pack Automation (SMA) Get-SMAJobOutput fails

I recently ran into an issue where regardless of the method used, I was unable to get any job output from Service Management Automation.

Get-SMAJobOutput -JobID '5c773933-5a9b-4021-a793-768b2efd0165' -WebServiceEndPoint $WebServiceEndpoint -Stream Output

simply returned:

Get-SmaJobOutput : The job 5c773933-5a9b-4021-a793-768b2efd0165 cannot be found: System.Data.Services.Client.DataServiceClientException: <?xml version="1.0" 
encoding="utf-8" standalone="yes"?><error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"><code></code><message xml:lang="en-US">An error 
occurred while executing the command definition. See the inner exception for details.</message></error>
   at System.Data.Services.Client.BaseAsyncResult.EndExecute[T](Object source, String method, IAsyncResult asyncResult)
   at System.Data.Services.Client.QueryResult.EndExecuteQuery[TElement](Object source, String method, IAsyncResult asyncResult)
At line:1 char:1
+ Get-SmaJobOutput -Id $id -WebServiceEndpoint $webserviceendpoint
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-SmaJobOutput], InvalidOperationException
    + FullyQualifiedErrorId : System.InvalidOperationException,Microsoft.SystemCenter.ServiceManagementAutomation.GetSmaJobOutput

Note that this is not the known issue where the output fails with the same error when using the wrong case for the -Stream value.

Additionally, getting the job output from the Azure Pack admin site simply returned a red exclamation mark.

Next, I adapted a REST-based function (originally posted here: http://www.laurierhodes.info/?q=node/105) to get the output directly from the REST API.

# Original script from http://www.laurierhodes.info/?q=node/105
# Modified by Christopher Keyaert (christopher.keyaert@inovativ.be)
 

Function Get-SMARESTJobOutput{
    Param(
        [Parameter(Mandatory=$true)]
        [string]$ID,
        [Parameter(Mandatory=$true)]
        [ValidateSet('Any','Progress','Output','Warning','Error','Debug','Verbose')]
        [string]$Stream,
        [string]$WebServiceEndPoint,
        [System.Management.Automation.PSCredential]$Credential
    )
    # Ignore SSL self-signed certificates
    Try{
        [System.Net.ServicePointManager]::CertificatePolicy = New-Object IgnoreSelfSignedCertificate
    }Catch{
        Add-Type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;

    public class IgnoreSelfSignedCertificate : ICertificatePolicy {
        public IgnoreSelfSignedCertificate() {}
        public bool CheckValidationResult(
            ServicePoint sPoint, X509Certificate cert,
            WebRequest wRequest, int certProb) {
            return true;
        }
    }
"@ -ErrorAction SilentlyContinue
        [System.Net.ServicePointManager]::CertificatePolicy = New-Object IgnoreSelfSignedCertificate
    }

    $SMAServer = "${WebServiceEndPoint}:9090"
    $VerbosePreference = 'Continue'
    #$VerbosePreference = 'SilentlyContinue'

    # Get the job status
    $URI = "$SMAServer/00000000-0000-0000-0000-000000000000/Jobs(guid'" + $ID + "')"
    $Response = if($Credential){Invoke-RestMethod -Uri $URI -Method Get -Credential $Credential}else{Invoke-RestMethod -Uri $URI -Method Get}
    $JobStatus = $Response.entry.properties.JobStatus

    Write-Verbose "Job Status = $JobStatus"

    # Get the requested stream items
    $URI = "$SMAServer/00000000-0000-0000-0000-000000000000/JobStreams/GetStreamItems?jobId='" + $ID + "'&streamType='$Stream'"
    Try{
        $Result = if($Credential){Invoke-RestMethod -Uri $URI -Method Get -Credential $Credential}else{Invoke-RestMethod -Uri $URI -Method Get}
    }Catch [Exception]{
        $Exception = $_
    }

    Write-Verbose "StreamText = $($Result.content.properties.StreamText.InnerText)"

    $OutputName = "Stream${Stream}"
    [pscustomobject]@{'ID'=$ID;'Status'=$JobStatus;$OutputName=$Result.content.properties.StreamText.InnerText}
}

This returned the same error as Get-SMAJobOutput ("An error occurred while executing the command definition. See the inner exception for details.")

The more I researched this, the more it smelled like a performance issue querying the database.
After doing a bit of digging, I found that my [SMA].[Stream].[JobStreams] table was ~17,000,000 rows. Using the following command, I dropped the job history down to 30 days (we had previously set it to 90):

Set-SmaAdminConfiguration -PurgeJobsOlderThanCountDays 30 -WebServiceEndpoint $webserviceendpoint

This took 8 hours to run and dropped the table down to 9,000,000 rows. The issue still remained, so I dropped it further to 25 days and also disabled debugging and process logging on all runbooks. This dropped the table down to 7,500,000 rows. Now the issue occurred ~75% of the time….Progress!
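For reference, the row counts above can be checked with a quick query. A sketch, assuming the SqlServer module is installed and with 'SMASQL01' standing in for your SMA database server:

```powershell
# Count the rows in the SMA job stream table (server name is a placeholder)
Invoke-Sqlcmd -ServerInstance 'SMASQL01' -Database 'SMA' `
    -Query 'SELECT COUNT(*) AS StreamRows FROM [Stream].[JobStreams]'
```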

Since I had now verified that it was a SQL performance issue, I turned to the Activity Monitor in SQL Server Management Studio.
Sure enough, the JobStreams table did not have an index usable for the query issued by the Get-SMAJobOutput PowerShell command. Adding the following index made the query near-instantaneous:

USE [SMA]
GO
CREATE NONCLUSTERED INDEX [NC_JobIDTenantID]
ON [Stream].[JobStreams] ([TenantId],[JobId])
INCLUDE([StreamTime],[StreamTypeName],[Stream],[StreamText],[RunbookVersionId])
GO

The issue was now resolved! Please note that this fix may not be supported by Microsoft; use it at your own discretion.

Modifying SharePoint Calendar via Powershell

First off, you’ll need the SharePoint Client Components for your appropriate version of SharePoint. For myself, I needed the 2013 version available here

In order to be able to reference the Sharepoint Client objects, you’ll need to load the assemblies:

#*************************************
# Import Sharepoint client Assemblies
#*************************************
$null = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.sharepoint.client")
$null = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.sharepoint.client.runtime")

Next, we’ll create a client context object and load the site, the site credentials, and the site lists (since calendars are really just SharePoint lists with a defined display).

#*************************************
# Connect to Site and get site lists
#*************************************
$ctx = New-object Microsoft.Sharepoint.client.clientcontext $siteurl
$ctx.load($ctx.site)
$web = $ctx.Web
$ctx.load($web)
$CredCache = New-object system.net.credentialcache
$credcache.Add((New-Object system.uri $siteURL), "NTLM", [System.Net.CredentialCache]::DefaultNetworkCredentials)
$ctx.credentials = $credcache
$ctx.load($web.lists)
$ctx.ExecuteQuery()

There are a couple things to note here.
1. The CredCache portion may not be necessary for your SharePoint installation. For myself, it was necessary for NTLM authentication. Otherwise, see the links below for other authentication methods.
2. Note that these aren’t acting like nice .NET objects. In order to populate the properties of each object, you must use the client context object to load the properties and then send that request to the server. What defines which properties are available, or which properties need special load commands, is still somewhat of a mystery to me.

Next we’ll get the Sharepoint calendar we are interested in and get all the items in the list.

#*************************************
# Get Patching calendar and items
#*************************************
$cal = $web.lists.getbytitle('Name of your Sharepoint calendar here')
$ctx.load($cal)
$ctx.ExecuteQuery()
$query = [Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery(1000)
$items = [Microsoft.SharePoint.Client.ListItemCollection]$cal.GetItems($query)
$ctx.load($items)
$ctx.ExecuteQuery()

Notice that in order to get the items in the calendar, we need to use the Load and ExecuteQuery methods twice (once for each action).

Next, I wanted to delete all calendar entries newer than today or older than 30 days:

$Today = Get-Date
Foreach($i in ($items | %{$_.FieldValues.ID})){
    $listitem = $items.GetById($i)
    $ctx.load($listitem)
    $ctx.ExecuteQuery()
    $datediff = ($Today - [datetime]$listitem.FieldValues.EventDate).totaldays 
    if(($dateDiff -gt 30) -or ($datediff -lt 0)){
        $listitem.DeleteObject()
        $cal.Update()
        $ctx.load($cal)
        $ctx.ExecuteQuery()
    }

}

Since the previous code block had already loaded the $items object, we can check the FieldValues.ID property of each item. We iterate through the ID of each calendar entry and check whether we want to delete it. Note, however, that if we did ForEach($item in $items), $items would no longer be valid once we deleted the first item, because the collection would have changed. This is why we iterate through an array of IDs rather than the objects themselves. Also worth mentioning: all of the fields of the list item are available in $listitem.FieldValues, but only after you call the Load($listitem) and ExecuteQuery() methods. After that we can do a date comparison and delete the list item from the calendar.
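The collection-modification pitfall is generic, not SharePoint-specific. A minimal illustration of the snapshot-the-IDs pattern:

```powershell
# Deleting from a collection while enumerating it directly throws an
# InvalidOperationException, so enumerate a snapshot instead:
# @($list) copies the current elements into a new array.
$list = [System.Collections.ArrayList]@(1,2,3,4,5)
foreach ($id in @($list)) {
    if ($id -le 2) { $list.Remove($id) }
}
$list.Count   # 3 items remain
```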

Next, we’ll add a new list item

$listCreationInformation = New-object Microsoft.SharePoint.Client.ListItemCreationInformation
$listitem = $cal.AddItem($listCreationInformation)
$listitem.ParseAndSetFieldValue('Title', "Title of your new event")
$listitem.ParseAndSetFieldValue('Description', "Description of your new event")
$listitem.ParseAndSetFieldValue('EventDate', $startTime)
$listitem.ParseAndSetFieldValue('EndDate', $endTime)
$listitem.update()
                
$ctx.load($listitem)
$ctx.ExecuteQuery()

If you are looking for field names to use when setting your values, it can be useful to query an item in the list:

$items = [Microsoft.SharePoint.Client.ListItemCollection]$cal.GetItems($query)
$ctx.load($items)
$ctx.ExecuteQuery()
$items[0].FieldValues

If you are looking for more information on editing SharePoint with PowerShell, I highly recommend the following resources. Each of these provided a portion of the understanding laid out above. Unfortunately, these links do not cover parsing through entries in the list or the NTLM authentication mechanism that my environment required:

https://www.itunity.com/article/connecting-spo-csom-api-powershell-1038
https://www.itunity.com/article/completing-basic-operations-sharepoint-csom-api-powershell-1278
http://www.hartsteve.com/2013/06/sharepoint-online-powershell/
https://msdn.microsoft.com/en-us/library/office/fp179912.aspx

SAPIEN Powershell Studio – Scaling form contents when resizing


Update: June Blender  kindly reached out to the experts on this and provided the following method.


The Anchor property defines that the specified edge of the object should maintain its position in relation to the same edge of its parent object.

This means that an anchor set to

  • None will float proportionally to the parent object
  • Left will maintain the number of pixels between the left side of the object and the left side of the parent object
  • Right will maintain the number of pixels between the right side of the object and the right side of the parent object
  • Top  will maintain the number of pixels between the upper edge of the object and the upper edge of the parent object
  • Bottom  will maintain the number of pixels between the lower edge of the object and the lower edge of the parent object

Continue reading

MS16-072 Causing GPO Application Problems

3 days after patch Tuesday, this has been a fairly well reported issue.
There have been some other blog posts about identifying troublesome GPOs
(see here: https://blogs.technet.microsoft.com/poshchap/2016/06/16/ms16-072-known-issue-use-powershell-to-check-gpos/)

The issue is that the Domain Computers group or the Authenticated Users group must now be able to read (not apply) Group Policy objects. This means that any group policy missing this Read permission will not apply, even if the user or computer has the GPOApply permission delegated from another group.

Depending on the extent of your Group Policy environment, the previous scripts (see the link above) may not work so well; I may have upwards of 100 group policies to identify and fix. Since I’d rather identify the GPOs that need fixing and then add the permissions with a script, it is more useful as a function with object output. This allows filtering the output with standard commands such as Where-Object, exporting with Export-CSV, or reusing the output in a fix-up script.

Here is my modified version:

Function Test-GPOAuthenticatedUsers{
    #Load GPO module
    Import-Module GroupPolicy

    #Get all GPOs in current domain
    $GPOs = Get-GPO -All

    #Check we have GPOs
    if ($GPOs) {
        #Loop through GPOs
        foreach ($GPO in $GPOs) {
            #Get Authenticated Users and Domain Computers permissions
            $AuthUser = Get-GPPermissions -Guid $GPO.Id -TargetName "Authenticated Users" -TargetType Group -ErrorAction SilentlyContinue
            $DomComp = Get-GPPermissions -Guid $GPO.Id -TargetName "Domain Computers" -TargetType Group -ErrorAction SilentlyContinue

            #Check Authenticated Users and Domain Computers permissions
            if ((!$AuthUser) -and (!$DomComp)) {
                $Message= 'Missing both Domain Computers and Authenticated Users Permissions'
                $Status = 'Error'
            }elseif(!($AuthUser)){
                #Check if Domain Computers have read
                if ($DomComp.Permission -notin @('GPORead','GPOApply')) {
                    $Message= 'Missing Authenticated Users Permission, and Domain Computers Permissions do not match read/apply'
                    $Status = 'Error'
                }else{
                    $Message= 'Missing Authenticated User Permission, but found Read or Apply Domain Computers Permissions'
                    $Status = 'Warning'
                }
            }else{ 
                #Check if Authenticated Users has read
                if ($AuthUser.Permission -notin @('GPORead','GPOApply')) {
                    $Message= 'Authenticated Users Permissions found not matching read/apply'
                    $Status = 'Error'
                }
                else{
                    $Message= 'Read or Apply Authenticated User Permissions found'
                    $Status = 'Good'
                }
            }
            [pscustomobject]@{'DisplayName'=$GPO.DisplayName;'ID'=$GPO.ID;'Message'=$Message;'Status'=$Status}
        } 
    } 
}

Usage is quite simple:

#Regular
Test-GPOAuthenticatedUsers

#Filtered
Test-GPOAuthenticatedUsers | Where{$_.status -eq 'Error'}

#Export to CSV
Test-GPOAuthenticatedUsers | Export-CSV -path 'C:\temp\Test-GPOAuthenticatedUsers.csv' -NoTypeInformation

This is all fine and nice for reporting purposes… but let’s actually fix something:

Function Fix-GPOAuthenticatedUsers{
    [CmdletBinding()]
    Param(
      #GUID(s) of the GPO(s) to fix
      [parameter(Mandatory=$True)] 
      [string[]]$GPOID,
      
      #Which AD Group to use
      [parameter(Mandatory=$True)] 
      [ValidateSet('Domain Computers','Authenticated Users')] 
      [string]$Group 
    )
    Begin{
        #Load GPO module
        Import-Module GroupPolicy
    }
    Process{
        ForEach($GUID in $GPOID){
            Write-Verbose "Processing GPO $GUID"
            #Get the GPO; skip this GUID if it cannot be found
            if(!(Get-GPO -Guid $GUID -ErrorAction SilentlyContinue)){
                Write-Error "Unable to find GPO matching $GUID"
                Continue
            }

            #Try Set the permissions
            $null = Set-GPPermissions -Guid $GUID -PermissionLevel GpoRead -TargetName $Group -TargetType Group -ErrorAction Stop

            #Test GPO perms against the group we just set
            $Perm = Get-GPPermissions -Guid $GUID -TargetName $Group -TargetType Group -ErrorAction SilentlyContinue
            if ($Perm.Permission -notin @('GPORead','GPOApply')) {
                $Message = "$Group read/apply permission not found after setting"
                Write-Error $Message
            }
            Write-Verbose "Completed Processing GPO $GUID"
        }
    }
    End{
    }
}

Usage again is quite simple: specify one or more GPO GUIDs and whether you’d like to use Authenticated Users or Domain Computers, and voila!

$BadGPOs = Test-GPOAuthenticatedUsers | Where{$_.status -eq 'Error'}
Fix-GPOAuthenticatedUsers -GPOID $BadGPOs.id -Group 'Authenticated Users'

Enjoy!

ConfigMgr Client Fails to Install: Unable to Compile UpdatesAgent.mof

We’ve had a couple of computers in the past being unable to re-install the Configuration Manager client due to the error:
“Unable to compile UpdatesAgent.mof”

This error can have a couple of different causes.

As such, here are a couple of steps you can try:

1. Reinstall the Windows Update agent. https://support.microsoft.com/en-ca/kb/949104
2. Uninstall any existing ConfigMgr client, stop the ccmsetup service and delete c:\windows\ccm, c:\windows\ccmsetup and c:\windows\ccmcache folders
3. Run the following commands to delete the ConfigMgr namespaces completely from WMI:

Gwmi -Query "Select * from __Namespace Where Name='sms'" -Namespace "root\cimv2" | Remove-WmiObject
Gwmi -Query "Select * from __Namespace Where Name='ccm'" -Namespace "root" | Remove-WmiObject
Gwmi -Query "Select * from __Namespace Where Name='smsdm'" -Namespace "root" | Remove-WmiObject

Since step 3 is quite drastic, try steps 1 and 2 first. If you do attempt step 3, complete steps 2 and 3 together. After this, the ConfigMgr client should install successfully.

Hopefully this helps!

Recursively Discover WMI Namespaces

Sometimes when building custom functions based on WMI it can be helpful to discover all of the WMI namespaces registered on a machine. Here is a function to do just that!

Function Find-WMINamespaces{
    Param(
        [string]$Computer,
        [string]$StartingNameSpace = 'root',
        [boolean]$recurse = $true
    )
    $childNamespaces = gwmi -namespace $StartingNameSpace -class "__Namespace" -computername $computer | select -expandproperty Name
    ForEach($child in $childNamespaces){
       [PSCustomObject]@{'Name'="${StartingNamespace}\${child}"}
       if($recurse){
            Find-WMINamespaces -Computer $Computer -StartingNamespace "${StartingNamespace}\${child}" -recurse $true
        }
    }
}
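Usage is straightforward; on a Windows machine this walks the namespace tree starting at root:

```powershell
# Enumerate every WMI namespace on the local machine, recursively
Find-WMINamespaces -Computer $env:COMPUTERNAME
```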

Content Library Explorer – The Legacy Package Does Not Have Exactly One Content

I recently ran into an issue, where my Primary site server was running low on disk space. This turned into a general spring cleaning of the ConfigMgr environment. As part of the cleanup process, I wanted to check the distribution points for old or stale packages.

Microsoft has provided a toolkit for cleanup operations such as this:
https://www.microsoft.com/en-us/download/details.aspx?id=50012

Part of this toolkit is the Content Library Explorer. However, after aiming this at my distribution point, I was confronted with the following error:

[Screenshot: Content Library Explorer error: "The legacy package does not have exactly one content"]

Not exactly an insightful message. I did however find a useful thread regarding this issue:
https://social.technet.microsoft.com/Forums/en-US/c7757792-3e5c-41c3-bb74-e57b0fe7258f/content-library-explorer-says-the-legacy-package-does-not-have-exactly-one-content?forum=configmanagergeneral

Using the provided script, I happily identified 3 packages that were causing issues. I simply removed the extra old folders and redistributed these. The extra folders were now gone, but the error message remained.

After doing some more digging with ProcMon, I identified the verification steps the Content Library Explorer appears to make, as well as 3 different possible problems that could lead to the above error message.

1. More than 1 data folder exists for a given package in the datalib subfolder of the SCCMContentLib folder. (This is addressed by the script in the link above)
2. There exists an ini file in the pkglib subfolder of the SCCMContentLib folder, but the associated ini file in the datalib folder is missing.
3. There are multiple content versions listed in the ini file located in the pkglib folder.

I have written the following function to test the SCCMContentLib folder for problems. Problems found by this script can be fixed by removing any extra folders for the given package from the DataLib folder, removing the distribution point from the package, waiting for the files to disappear, and redistributing the package to the distribution point.

Function Test-DPLegacyContent{
    Param($DPFolderPath)

    #Calculate child folders
    $pkgdir = Join-Path -Path $DPFolderPath -ChildPath 'pkglib'
    $datadir = Join-Path -Path $DPFolderPath -ChildPath 'datalib'
    $childdatafolders = Get-ChildItem -Directory $datadir

    ForEach($file in (Get-ChildItem -File $pkgdir)){
        #Get-Content already returns one string per line, so index the lines directly
        $filecontent = Get-Content $file.FullName
        $packageID = $file.BaseName
        $expectedcontent = $filecontent[1].Replace('=','')
        if($expectedcontent -match $packageID){
            #legacy package
            #Check for missing INI files
            if(!(Test-Path (Join-Path $datadir -ChildPath "${expectedcontent}.ini"))){
                [pscustomobject]@{'PackageID'=$packageID; 'Error'="Ini file missing in datalib for $packageID"}
            }

            #Check for mismatch in folder count
            [array]$matchingFolders = [array]($childdatafolders | Where-Object{$_.Name -match $packageID})
            $foldercount = $matchingFolders.Count
            if($foldercount -ne 1){
                [pscustomobject]@{'PackageID'=$packageID; 'Error'="$foldercount folders found"}
            }
        }

        #Check for multiple content versions in the pkglib ini
        if(($filecontent.Count -gt 2) -and ($filecontent[2].Replace('=','') -match $packageID)){
            [pscustomobject]@{'PackageID'=$packageID; 'Error'="Multiple package versions found in pkglib ini"}
        }
    }
}

Test-DPLegacyContent -dpfolderpath '\\DPServer\d$\SCCMContentLib\'

To use this script, simply change the dpfolderpath parameter to the path of your SCCMContentLib.

Launching a Java JNLP file with an Old version of JRE

Back in February I wrote a post about using an Oracle Deployment Rule Set to control which version of Java executes a given jar file on a given website. Since that time, you may have discovered that applying this to a jnlp file is tricky at best. The problem is that Internet Explorer downloads the jnlp file to your Temporary Internet Files before executing it. As a result, the location it launches from is not the online URL but the local jnlp file. It may be technically possible to find the certificate hash of each jar file referenced in the jnlp file and add these to your DeploymentRuleSet.jar, but I have not tested this.

As with most Java applets, problems begin to occur when a version of JRE is installed that is higher than the version the applet was designed for. What I tried first was running ProcMon on a computer with just the required version of Java installed.

Continue reading

Automating Download of Google Chrome

If you manage multiple computers, you will want to ensure that Google Chrome is always up to date. This is important because Chrome ships with Flash built in by default, and updates also patch any security vulnerabilities in Chrome. Historically, you had two options to accomplish this:
1. Allow all your computers to auto-update. This works, but every computer downloads approximately 50 MB for each update. If you are on a slow connection, or you have lots of computers, this will be an issue.
2. Manually download the Google Chrome for Business MSI file and deploy it via Microsoft System Center Configuration Manager or group policy.

You may also have a situation where neither of these is possible or desirable. May I present a 3rd option:
3. Have a scheduled task (or perhaps an Orchestrator or SMA runbook) check the latest available version and download it if a newer version is available. Use another scheduled task on each computer to install the update from your central download cache if it is newer than the installed version.
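The heart of option 3 is the version comparison. A minimal sketch; the download URL and cache path in the comments are placeholders, not real endpoints:

```powershell
# Compare two version strings numerically; casting to [version] avoids the
# string-compare trap where '9.0' would sort above '10.0'
Function Test-NewerVersion {
    Param(
        [Parameter(Mandatory=$true)][string]$InstalledVersion,
        [Parameter(Mandatory=$true)][string]$LatestVersion
    )
    [version]$LatestVersion -gt [version]$InstalledVersion
}

if (Test-NewerVersion -InstalledVersion '61.0.3163.100' -LatestVersion '62.0.3202.62') {
    # Download to the central cache here, e.g. (placeholder URL and path):
    # Invoke-WebRequest -Uri $msiUrl -OutFile '\\server\ChromeCache\googlechromestandaloneenterprise64.msi'
}
```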
Continue reading

SysJam Powershell RightClick Tool – Part 7 – Querying 64bit Registry Strings with 32bit Powershell

Suppose you are writing a script that requires a 32 bit powershell window. How can you query 64bit registry keys from this script?

If you are connecting to a remote computer your most obvious and best option is Enter-PSSession or New-PSSession to open a native powershell session on the remote computer.

You also have other options for querying registry strings, but we need to be careful.

Option 1: Get-Item
Let’s first consider using Get-Item directly. Here is a very simple function with no error control (or warranty):

Function Get-RegStringValueByGetItem{
    Param([string]$KeyPath,
            [string]$StringName)
    $objRegKey = Get-ItemProperty $KeyPath -Name $StringName
    Return $objRegKey.($stringName)
}

When running from a 64bit powershell session, this returns as expected:

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files\Common Files

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

Let’s try this from a 32bit powershell session:

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

So, you can see that when running from a 32bit session we get redirected to the Wow6432Node without warning.

Option 2: Microsoft.Win32.RegistryKey
We know that Powershell has access to the .NET classes, so let’s try it through the Microsoft.Win32.RegistryKey class.
You can read more about this class here: https://msdn.microsoft.com/en-us/library/Microsoft.Win32.RegistryKey(v=vs.110).aspx

Function Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS{
    Param([string]$KeyPath,
            [string]$StringName,
            [string]$keyRoot
            )
    Switch($keyRoot){
        'HKLM' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'LocalMachine'}
        'HKCR' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'ClassesRoot'}
        'HKCU' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'CurrentUser'}
        'HKU'  {$strKeyRoot = [Microsoft.Win32.RegistryHive]'Users'}
    }

    $objReg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($strKeyRoot, 'localhost')
    $strKeyPath = $keyPath -replace '\\','\\'
    $objSubKey = $objReg.OpenSubKey($strKeyPath)
    $strValue = $objSubKey.GetValue($StringName)

    Return ($strValue)
}

When running from a 64bit powershell session, this returns as expected:

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Let’s try this from a 32bit powershell session:

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

So, you can see that once again when running from a 32bit session we get redirected to the Wow6432Node without warning.
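As an aside not covered in the original post, .NET 4 and later (PowerShell 3+) expose a RegistryView enumeration that lets even a 32bit process open the 64bit view directly. A minimal sketch:

```powershell
# Open the 64-bit view of HKLM explicitly, even from a 32-bit process
$base = [Microsoft.Win32.RegistryKey]::OpenBaseKey(
    [Microsoft.Win32.RegistryHive]::LocalMachine,
    [Microsoft.Win32.RegistryView]::Registry64)
$key = $base.OpenSubKey('software\microsoft\Windows\CurrentVersion')
$key.GetValue('CommonFilesDir')
```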

Option 3: WbemScripting.SWbemNamedValueSet
This last option may take a little while to wrap your head around.

Function Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM{
    Param([string]$KeyPath,
            [string]$StringName,
            [string]$keyRoot)

    Switch($keyRoot){
        'HKLM' {$strKeyRoot = '&h80000002'}
        'HKCR' {$strKeyRoot = '&h80000000'}
        'HKCU' {$strKeyRoot = '&h80000001'}
        'HKU'  {$strKeyRoot = '&h80000003'}
    }

    #Use the wbem scripting com object to enumerate the 64 bit standard registry provider
    $objNamedValueSet = New-Object -COM 'WbemScripting.SWbemNamedValueSet'
    $objNamedValueSet.Add('__ProviderArchitecture', 64) | Out-Null
    $objLocator = New-Object -COM 'Wbemscripting.SWbemLocator'
    $objServices = $objLocator.ConnectServer('localhost', 'root\default', '', '', '', '', '', $objNamedValueSet)
    $objStdRegProv = $objServices.Get('StdRegProv')
    $Inparams = ($objStdRegProv.Methods_ | where { $_.name -eq 'GetStringValue' }).InParameters.SpawnInstance_()

    # Add the input parameters
    $regkey = $keyPath -replace '\\','\\'
    ($Inparams.Properties_ | where { $_.name -eq 'Hdefkey' }).Value = $strkeyroot
    ($Inparams.Properties_ | where { $_.name -eq 'Ssubkeyname' }).Value = $regkey
    ($Inparams.Properties_ | where { $_.name -eq 'Svaluename' }).Value = $StringName	

    #Execute the method
    $Outparams = $objStdRegProv.ExecMethod_('GetStringValue', $Inparams, '', $objNamedValueSet)
    Return ($Outparams.Properties_ | where { $_.name -eq 'sValue' }).Value

}

You can read more about this COM object here: https://msdn.microsoft.com/en-us/library/aa390788(v=vs.85).aspx and the underlying StdRegProv here: https://msdn.microsoft.com/en-us/library/aa393664%28v=vs.85%29.aspx?f=255&MSPPError=-2147217396

Essentially, what we are doing is using the WMI Scripting COM object to reference the 64bit registry provider.

Let’s see how this works in a 64 bit session:

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

And from a 32bit session:

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Finally!

You can see that we needed an intermediary (the WBEM provider) to query the 64 bit registry from a 32 bit shell. This is also the method I used to get the execution history from an SCCM client in the SysJam Powershell Rightclick tool, which you can download at https://github.com/mdedeboer/SysJamRightClickToolForConfigMgr, and it is usable with any of the methods defined by the StdRegProv class.

SysJam Powershell RightClick Tool – Part 6 – Getting Running ConfigMgr Jobs with Powershell

One of the key functionality improvements I wanted to include in the Sysjam Powershell RightClick tool was realtime running job monitoring. There are few tools that don’t require you to click some type of refresh button to see the running jobs.

Part 1 of providing this functionality is using PowerShell jobs combined with a timer object to poll data on a refresh cycle and update the form. I have covered this previously here.
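A stripped-down sketch of that pattern (here using Wait-Job for brevity; in the tool, a WinForms timer checks $job.State on each tick instead of blocking):

```powershell
# Kick off the slow query in a background job so the form stays responsive
$job = Start-Job -ScriptBlock {
    Start-Sleep -Milliseconds 200   # stand-in for a slow WMI query
    'job result'
}

# In the real tool, a timer's Tick handler would test $job.State -eq 'Completed';
# here we simply block until the job finishes and collect its output.
$null = Wait-Job -Job $job
$output = Receive-Job -Job $job
Remove-Job -Job $job
$output
```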

Part 2 of this is querying WMI for the status of each running job… and translating it to English.

This post will concentrate on Part 2.

Continue reading

SysJam Powershell RightClick Tool – Part 5 – By-Passing User Logon Requirement for a Program

Most of the time when deploying software, I’ll set the program to run only with the user logged off. This avoids situations where the user has an older version of the application open when they receive the advertisement. For testing, however, this can be a pain… which is why I included the “ByPass User Logon Requirement (Temporary)” button in the Sysjam Powershell RightClick tool. This button temporarily sets the requirement to “None” within WMI. The next time the system does a Machine Policy refresh, this setting gets overwritten.
Continue reading