Signing existing OS Configuration Discovery scripts

Very simply, this is an example of how to take the existing configuration item discovery scripts that are present in a given Configuration Baseline and sign each of them. This can be useful if you are importing scripts from the SCAP extensions, etc. Depending on your configuration item source, you may have several hundred scripts to sign.

Continue reading


Exporting Office 365 Licenses to CSV

Recently, I was asked by a colleague to review a script that is publicly available in the TechNet Gallery (https://gallery.technet.microsoft.com/scriptcenter/Office-365-licenses-and-ee8837a0).

The author mentions in the comments, “If you find a way to improve this code, please share it.” As a result, I thought I’d post my object-based version.

Regarding the “English” names for the license, you can update the $sku hashtable to change the name of the license in the exported CSV.

As far as overall design, I start with a hashtable of fixed property names ($licenseHash). This is the template for our object. Then, I loop through the licenses and services available in the tenant and add these as keys to the hashtable. Once I have discovered all the licenses and services, I discover all the users in the tenant. For each user, I take a copy of $licenseHash and cast it to a [pscustomobject] ($licenseduser). This allows me to treat each of the licenses as a property of the object. After mapping the licenses to matching properties, I use Export-Csv to append the user to the spreadsheet.
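As a quick illustration of that hashtable-to-object pattern (a minimal sketch with made-up property names, not part of the script below):

# Template hashtable with fixed property names (illustrative only)
$template = @{ 'DisplayName' = ''; 'UPN' = ''; 'Enterprise Plan E3' = '' }

# Clone the template, then cast it so the keys become real properties
$copy = $template.Clone()
$userRecord = [pscustomobject]$copy
$userRecord.DisplayName = 'Example User'
$userRecord.'Enterprise Plan E3' = $true

# Each record can then be appended to a CSV
$userRecord | Export-Csv -Path 'C:\temp\example.csv' -Append -NoTypeInformation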

Without further ado, here is the script!

Updated Dec 28 2017 to account for the service status as per JBros comment

<#
.DESCRIPTION
This script will create a comma separated file with a line per user and the following columns:
Display Name, Domain, UPN, Is Licensed?, all the SKUs in tenant, all the services,
Errors, ImmutableId and BlockCredential.
 
Based on previous script by Marcus Tarquinio
https://gallery.technet.microsoft.com/scriptcenter/Office-365-licenses-and-ee8837a0
 
.PARAMETER
After starting the script it will ask for the credentials to connect to Office 365
 
.OUTPUTS
CSV specified by $csvpath
 
.NOTES
Version: 1.0
Author: Matthew DeBoer
Creation Date: September 27 2017
Purpose/Change: Initial script development
#>
Import-Module MSOnline
 
# CSV output path
$csvpath = 'C:\temp\OfficeLicenseCounts.csv'
 
#Translate SKUs to English
$Sku = @{
"DESKLESSPACK" = "Office 365 (Plan K1)"
"DESKLESSWOFFPACK" = "Office 365 (Plan K2)"
"LITEPACK" = "Office 365 (Plan P1)"
"EXCHANGESTANDARD" = "Office 365 Exchange Online Only"
"STANDARDPACK" = "Enterprise Plan E1"
"STANDARDWOFFPACK" = "Office 365 (Plan E2)"
"ENTERPRISEPACK" = "Enterprise Plan E3"
"ENTERPRISEPACKLRG" = "Enterprise Plan E3"
"ENTERPRISEWITHSCAL" = "Enterprise Plan E4"
"STANDARDPACK_STUDENT" = "Office 365 (Plan A1) for Students"
"STANDARDWOFFPACKPACK_STUDENT" = "Office 365 (Plan A2) for Students"
"ENTERPRISEPACK_STUDENT" = "Office 365 (Plan A3) for Students"
"ENTERPRISEWITHSCAL_STUDENT" = "Office 365 (Plan A4) for Students"
"STANDARDPACK_FACULTY" = "Office 365 (Plan A1) for Faculty"
"STANDARDWOFFPACKPACK_FACULTY" = "Office 365 (Plan A2) for Faculty"
"ENTERPRISEPACK_FACULTY" = "Office 365 (Plan A3) for Faculty"
"ENTERPRISEWITHSCAL_FACULTY" = "Office 365 (Plan A4) for Faculty"
"ENTERPRISEPACK_B_PILOT" = "Office 365 (Enterprise Preview)"
"STANDARD_B_PILOT" = "Office 365 (Small Business Preview)"
"VISIOCLIENT" = "Visio Pro Online"
"POWER_BI_ADDON" = "Office 365 Power BI Addon"
"POWER_BI_INDIVIDUAL_USE" = "Power BI Individual User"
"POWER_BI_STANDALONE" = "Power BI Stand Alone"
"POWER_BI_STANDARD" = "Power-BI standard"
"PROJECTESSENTIALS" = "Project Lite"
"PROJECTCLIENT" = "Project Professional"
"PROJECTONLINE_PLAN_1" = "Project Online"
"PROJECTONLINE_PLAN_2" = "Project Online and PRO"
"ECAL_SERVICES" = "ECAL"
"EMS" = "Enterprise Mobility Suite"
"RIGHTSMANAGEMENT_ADHOC" = "Windows Azure Rights Management"
"MCOMEETADV" = "PSTN conferencing"
"SHAREPOINTSTORAGE" = "SharePoint storage"
"PLANNERSTANDALONE" = "Planner Standalone"
"CRMIUR" = "CMRIUR"
"BI_AZURE_P1" = "Power BI Reporting and Analytics"
"INTUNE_A" = "Windows Intune Plan A"
}
 
# Connect to Office 365 (need modules installed)
write-verbose "Connecting to Office 365..."
$credential = Get-Credential
Connect-MsolService -Credential $credential
 
# Get a list of all licences that exist within the tenant
write-verbose "Getting the licenses available in the tenant"
$licensetype = Get-MsolAccountSku | Where {$_.ConsumedUnits -ge 1}
 
# License Object. This forms the property names of the user objects we populate later
$licensehash = @{
    "DisplayName"='';
    "Domain"='';
    "UPN"='';
    "IsLicensed"='';
    "Errors"='';
    "ImmutableID"='';
    "BlockCredential"='';
}
 
#Get all account SKUs in tenant
$AccountSkus = Get-MsolAccountSku
 
#Loop through each license in tenant and get the sku
foreach ($license in $licensetype)
{
    if($license.SkuPartNumber -notin $licensehash.keys){
        if($license.SkuPartNumber -in $sku.keys){
            $licensename = $sku.($license.SkuPartNumber)
        }else{
            $licensename = $license.skupartnumber
        }
        $licensehash.Add($licensename,'')

 
        # Get a list of all the services in the tenant
        $services = ($AccountSkus | where {$_.AccountSkuId -eq $license.AccountSkuId}).ServiceStatus.serviceplan.servicename
        ForEach($service in $services){
            if($service -in $sku.keys){
                $servicename = $sku.($service)
            }else{
                $servicename = $service
            }
            if($servicename -notin $licensehash.keys){
                $licensehash.add($servicename,'')
            }
        }
    }
} 

# Get a list of all the users in the tenant
write-verbose "Getting all users in the Office 365 tenant..."
$users = Get-MsolUser -All
 
# Loop through all users found in the tenant
foreach ($user in $users)
{
    $displayname = $user.displayname -Replace ",",""
    $licenseduser = [pscustomobject]$licensehash
    $licenseduser.Displayname = $displayname
    $licenseduser.Domain = $user.UserPrincipalName.Split("@")[1]
    $licenseduser.UPN = $user.userprincipalname
    $licenseduser.ImmutableID = $user.immutableid
    $licenseduser.Errors = $user.errors
    $licenseduser.blockcredential = $user.blockcredential
    $licenseduser.IsLicensed = $user.IsLicensed
    if ($user.isLicensed){
        ForEach($userlicense in $user.licenses){
            if($userlicense.AccountSkuID.ToString() -in $licensetype.AccountSKUid){
                $usersku = (($userlicense.accountskuid.tostring()) -split ':')[1]
                if($usersku -in $sku.keys){
                    $usersku = $sku.($usersku)
                }
                if($usersku -in $licensehash.keys){
                    $licensedUser.$usersku = $true
                }
            }
            $UserLicenseConfiguredServices = $userlicense.ServiceStatus | Where{$_.provisioningstatus}
            ForEach($service in $UserLicenseConfiguredServices){
                if($service.ServicePlan.ServiceName -in $sku.keys){
                    $servicename = $sku.($service.ServicePlan.ServiceName)
                }else{
                    $servicename = $service.ServicePlan.ServiceName
                }
                if($servicename -in $licensehash.keys){
                    $licensedUser.$servicename = $service.ProvisioningStatus
                }
            }
        

        }
    }
$licenseduser | Export-csv -path $csvpath -Force -Append -notypeinformation
}
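Once the script completes, the CSV can be sliced with the usual cmdlets. For example, to count licensed users per domain (column names follow the hashtable defined above):

# Count licensed users per domain from the exported CSV
Import-Csv -Path 'C:\temp\OfficeLicenseCounts.csv' |
    Where-Object { $_.IsLicensed -eq 'True' } |
    Group-Object -Property Domain |
    Select-Object Name, Count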

 

 

Silent Scripted PNP Driver Installation

Occasionally, you may find the need to push a new driver to computers, perhaps because a driver is causing BSOD issues. Since .NET does not have a direct way to do this, you are usually left depending on the driver publisher to include a silent installation method, which in reality rarely happens. You definitely don’t want to run around and manually install the drivers, and tools like Configuration Manager don’t have support for deploying drivers after the OS has been deployed.

Continue reading

Azure Pack Automation (SMA) Get-SMAJobOutput fails

I recently ran into an issue where regardless of the method used, I was unable to get any job output from Service Management Automation.

Get-SMAJobOutput -JobID '5c773933-5a9b-4021-a793-768b2efd0165' -WebServiceEndPoint $WebServiceEndpoint -Stream Output

simply returned:

Get-SmaJobOutput : The job 5c773933-5a9b-4021-a793-768b2efd0165 cannot be found: System.Data.Services.Client.DataServiceClientException: <?xml version="1.0" 
encoding="utf-8" standalone="yes"?><error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"><code></code><message xml:lang="en-US">An error 
occurred while executing the command definition. See the inner exception for details.</message></error>
   at System.Data.Services.Client.BaseAsyncResult.EndExecute[T](Object source, String method, IAsyncResult asyncResult)
   at System.Data.Services.Client.QueryResult.EndExecuteQuery[TElement](Object source, String method, IAsyncResult asyncResult)
At line:1 char:1
+ Get-SmaJobOutput -Id $id -WebServiceEndpoint $webserviceendpoint
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-SmaJobOutput], InvalidOperationException
    + FullyQualifiedErrorId : System.InvalidOperationException,Microsoft.SystemCenter.ServiceManagementAutomation.GetSmaJobOutput

Note that this is not the issue where the output fails with the same error if using the wrong -Stream case.

Additionally, getting the job output from the Azure Pack admin site simply returned a red exclamation mark.

Next, I adapted a REST-based function to get the output from the REST API (originally posted here: http://www.laurierhodes.info/?q=node/105)

# Original script from http://www.laurierhodes.info/?q=node/105
# Modified by Christopher Keyaert (christopher.keyaert@inovativ.be)
 

 Function Get-SMARESTJobOutput{
    Param(
        [Parameter(Mandatory=$true)]
        [string]$ID,
        [Parameter(Mandatory=$true)]
        [validateset('Any','Progress','Output','Warning','Error','Debug','Verbose')]
        [String]$stream,
        [string]$webServiceEndPoint,
        [System.Management.Automation.PSCredential]$Credential
    )
# Ignore SSL Self-Signed certificate  
Try{
    [System.Net.ServicePointManager]::CertificatePolicy = new-object IgnoreSelfSignedCertificate
}Catch{
add-type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
 
        public class IgnoreSelfSignedCertificate : ICertificatePolicy {
        public IgnoreSelfSignedCertificate() {}
        public bool CheckValidationResult(
            ServicePoint sPoint, X509Certificate cert,
            WebRequest wRequest, int certProb) {
            return true;
        }
    }
"@ -ErrorAction SilentlyContinue
    [System.Net.ServicePointManager]::CertificatePolicy = new-object IgnoreSelfSignedCertificate
} 
    $SMAServer = "${WebServiceEndpoint}:9090"
    $VerbosePreference = "Continue"
    #$VerbosePreference = "SilentlyContinue"
 
    Write-Verbose ""

    $URI =  "$SMAServer/00000000-0000-0000-0000-000000000000/Jobs(guid'" + $ID  + "')"
    $Response = if($Credential){Invoke-RestMethod -Uri $URI -Method Get -Credential $Credential}else{Invoke-RestMethod -Uri $URI -Method Get}
    $JobStatus = $Response.entry.properties.JobStatus 
    
    Write-Verbose "Job Status = $JobStatus"
   
    $URI =   "$SMAServer/00000000-0000-0000-0000-000000000000/JobStreams/GetStreamItems?jobId='" + $ID +"'&streamType='$Stream' "
    Try{
        $Result = if($Credential){Invoke-RestMethod -Uri $URI -Method Get -Credential $Credential}else{Invoke-RestMethod -Uri $URI -Method Get}
    }Catch [exception]{
        $Exception = $_
    }
  
    Write-verbose "StreamText = $($Result.content.properties.StreamText.InnerText)"
 
    $outputname = "Stream${Stream}"
    New-Object -TypeName PSCustomObject -Property @{'ID'=$ID;'Status'=$JobStatus;$outputname=$Result.content.properties.StreamText.InnerText}
}
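Calling it looks something like this (the endpoint is a placeholder; -Credential is only needed if your current account cannot query SMA directly):

$WebServiceEndpoint = 'https://sma01.contoso.local'   # placeholder endpoint; the function appends :9090
Get-SMARESTJobOutput -ID '5c773933-5a9b-4021-a793-768b2efd0165' -Stream Output -WebServiceEndPoint $WebServiceEndpoint -Credential (Get-Credential)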

This returned the same error as Get-SMAJobOutput (“An error occurred while executing the command definition. See the inner exception for details”).

The more I researched this, the more it smelled like a performance issue querying the database.
After doing a bit of digging, I found that my [SMA].[Stream].[JobStreams] table was ~17,000,000 rows. Using the following command, I dropped the job history retention down to 30 days (we had previously set it to 90):

Set-SmaAdminConfiguration -PurgeJobsOlderThanCountDays 30 -WebServiceEndpoint $webserviceendpoint

This took 8 hours to run and dropped the table down to 9,000,000 rows. The issue still remained, so I dropped the retention further to 25 days and also disabled debug and progress logging on all runbooks. This dropped the table down to 7,500,000 rows. Now the issue only occurred ~75% of the time. Progress!

Since I had now verified that it was a SQL performance issue, I turned to the Activity Monitor in SQL Server Management Studio.
Sure enough, the JobStreams table did not have an index usable for the query issued by the Get-SMAJobOutput PowerShell cmdlet. Adding the following index dropped the query time to near-instantaneous:

USE [SMA]
GO
CREATE NONCLUSTERED INDEX [NC_JobIDTenantID]
ON [Stream].[JobStreams] ([TenantId],[JobId])
INCLUDE([StreamTime],[StreamTypeName],[Stream],[StreamText],[RunbookVersionId])
GO

The issue was now resolved! Please note that this fix may not be supported by Microsoft. Please use at your own discretion.

Modifying SharePoint Calendar via Powershell

First off, you’ll need the SharePoint Client Components for your version of SharePoint. For myself, I needed the 2013 version, available here.

In order to be able to reference the Sharepoint Client objects, you’ll need to load the assemblies:

#*************************************
# Import Sharepoint client Assemblies
#*************************************
$null = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.sharepoint.client")
$null = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.sharepoint.client.runtime")

Next, we’ll create a client context object and load the site, the site credentials, and the site lists (since calendars are really just SharePoint lists with a defined display):

#*************************************
# Connect to Site and get site lists
#*************************************
$ctx = New-object Microsoft.Sharepoint.client.clientcontext $siteurl
$ctx.load($ctx.site)
$web = $ctx.Web
$ctx.load($web)
$CredCache = New-object system.net.credentialcache
$credcache.Add((New-Object system.uri $siteURL), "NTLM", [System.Net.CredentialCache]::DefaultNetworkCredentials)
$ctx.credentials = $credcache
$ctx.load($web.lists)
$ctx.ExecuteQuery()

There are a couple of things to note here.
1. The CredCache portion may not be necessary for your SharePoint installation. For myself, it was required for NTLM authentication to succeed. Otherwise, see the links below for other authentication methods.
2. Note that this doesn’t behave like ordinary .NET objects. In order to populate the properties of each object, you must use the client context object to load the properties and then send that request to the server, as sketched below. What actually defines which properties are available, or which properties need special load commands, is still somewhat of a mystery to me.
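A minimal illustration of that load-then-execute round trip (using the web’s Title property purely as an example):

# Queue a request for the web object's properties, then send the batch to the server
$ctx.Load($web)
$ctx.ExecuteQuery()

# Only after ExecuteQuery() returns is the property populated
$web.Title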

Next we’ll get the Sharepoint calendar we are interested in and get all the items in the list.

#*************************************
# Get Patching calendar and items
#*************************************
$cal = $web.lists.getbytitle('Name of your Sharepoint calendar here')
$ctx.load($cal)
$ctx.ExecuteQuery()
$query = [Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery(1000)
$items = [Microsoft.SharePoint.Client.ListItemCollection]$cal.GetItems($query)
$ctx.load($items)
$ctx.ExecuteQuery()

Notice that in order to get the items in the calendar, we need to use the Load and ExecuteQuery methods twice (once for each action).

Next, I wanted to delete all calendar entries that are either newer than today (i.e., in the future) or older than 30 days:

$Today = Get-Date #Reference date for the age comparison below
Foreach($i in ($items | %{$_.FieldValues.ID})){
    $listitem = $items.GetById($i)
    $ctx.load($listitem)
    $ctx.ExecuteQuery()
    $datediff = ($Today - [datetime]$listitem.FieldValues.EventDate).totaldays 
    if(($dateDiff -gt 30) -or ($datediff -lt 0)){
        $listitem.DeleteObject()
        $cal.Update()
        $ctx.load($cal)
        $ctx.ExecuteQuery()
    }

}

Since the previous code block had already loaded the $items object, we can check the FieldValues.ID property of each item. We iterate through the ID of each calendar entry and check whether we want to delete it. Note, however, that if we did ForEach($item in $items), $items would no longer be valid once we deleted the first item, because the collection will have changed. This is why we iterate through an array of IDs rather than the objects themselves. Also worth mentioning: all of the fields of the list item are available in $listItem.FieldValues, but only after you call the Load($listitem) and ExecuteQuery() methods. After this is complete, we can do a date comparison and delete the list item from the calendar.

Next, we’ll add a new list item

$listCreationInformation = New-object Microsoft.SharePoint.Client.ListItemCreationInformation
$listitem = $cal.AddItem($listCreationInformation)
$listitem.ParseAndSetFieldValue('Title', "Title of your new event")
$listitem.ParseAndSetFieldValue('Description', "Description of your new event")
$listitem.ParseAndSetFieldValue('EventDate', $startTime)
$listitem.ParseAndSetFieldValue('EndDate', $endTime)
$listitem.update()
                
$ctx.load($listitem)
$ctx.ExecuteQuery()

If you are looking for field names to use when setting your values, it can be useful to query an item in the list:

$items = [Microsoft.SharePoint.Client.ListItemCollection]$cal.GetItems($query)
$ctx.load($items)
$ctx.ExecuteQuery()
$items[0].FieldValues

If you are looking for more information on editing SharePoint with PowerShell, I highly recommend the following resources. Each of these provided a portion of the understanding laid out above; unfortunately, none of them cover parsing through the entries in a list or the NTLM authentication mechanism that my environment required:

https://www.itunity.com/article/connecting-spo-csom-api-powershell-1038
https://www.itunity.com/article/completing-basic-operations-sharepoint-csom-api-powershell-1278
http://www.hartsteve.com/2013/06/sharepoint-online-powershell/
https://msdn.microsoft.com/en-us/library/office/fp179912.aspx

SAPIEN Powershell Studio – Scaling form contents when resizing


Update: June Blender kindly reached out to the experts on this and provided the following method.


The Anchor property defines that the specified edge of the object should maintain its position in relation to its parent object on that edge.

This means that an anchor set to

  • None will float proportionally to the parent object
  • Left will maintain the number of pixels between the left side of the object and the left side of the parent object
  • Right will maintain the number of pixels between the right side of the object and the right side of the parent object
  • Top will maintain the number of pixels between the upper edge of the object and the upper edge of the parent object
  • Bottom will maintain the number of pixels between the lower edge of the object and the lower edge of the parent object
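To make this concrete, here is a small sketch (plain Windows Forms rather than a PowerShell Studio export) of anchoring a textbox to all four edges so it resizes with the form:

Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Drawing

$form = New-Object System.Windows.Forms.Form
$textbox = New-Object System.Windows.Forms.TextBox
$textbox.Multiline = $true
$textbox.Location = New-Object System.Drawing.Point -ArgumentList 10, 10
$textbox.Size = New-Object System.Drawing.Size -ArgumentList 260, 140

# Anchor to all four edges: the gap to each edge is preserved, so the
# control grows and shrinks along with its parent form
$textbox.Anchor = [System.Windows.Forms.AnchorStyles]'Top,Bottom,Left,Right'

$form.Controls.Add($textbox)
$form.ShowDialog() | Out-Null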

Continue reading

MS16-072 Causing GPO Application Problems

Three days after Patch Tuesday, this has been a fairly well-reported issue. There have been some other blog posts about identifying troublesome GPOs (see here: https://blogs.technet.microsoft.com/poshchap/2016/06/16/ms16-072-known-issue-use-powershell-to-check-gpos/).

The issue is that, after MS16-072, Group Policy is retrieved using the computer’s security context, so either the Authenticated Users group or the Domain Computers group must have at least Read (not necessarily Apply) permission on each GPO. Any group policy that is missing this read permission will not apply, even if the user or computer has the GPOApply permission delegated from another group.
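As a quick spot check on a single GPO (the GPO name here is a placeholder), you can inspect the delegation directly:

Import-Module GroupPolicy

# Shows the permission level (if any) granted to Authenticated Users on one GPO
Get-GPPermissions -Name 'Example GPO' -TargetName 'Authenticated Users' -TargetType Group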

Depending on the extent of your Group Policy environment, the previous scripts (see the link above) may not scale well; I may have upwards of 100 group policies to identify and fix. Since I’d rather identify the GPOs that need fixing and then add the permissions with a script, it is more useful for this to be a function with object output. That allows filtering with standard commands such as Where-Object, exporting with Export-CSV, or reusing the output in a fix-up script.

Here is my modified version:

Function Test-GPOAuthenticatedUsers{
    #Load GPO module
    Import-Module GroupPolicy

    #Get all GPOs in current domain
    $GPOs = Get-GPO -All

    #Check we have GPOs
    if ($GPOs) {
        #Loop through GPOs
        foreach ($GPO in $GPOs) {
            #Get Authenticated Users and Domain Computers permissions
            $AuthUser = Get-GPPermissions -Guid $GPO.Id -TargetName 'Authenticated Users' -TargetType Group -ErrorAction SilentlyContinue
            $DomComp = Get-GPPermissions -Guid $GPO.Id -TargetName 'Domain Computers' -TargetType Group -ErrorAction SilentlyContinue

            #Check Authenticated Users and Domain Computers permissions
            if ((!$AuthUser) -and (!$DomComp)) {
                $Message= 'Missing both Domain Computers and Authenticated Users Permissions'
                $Status = 'Error'
            }elseif(!($AuthUser)){
                #Check if Domain Computers have read
                if ($DomComp.Permission -notin @('GPORead','GPOApply')) {
                    $Message= 'Missing Authenticated Users permission, and the Domain Computers permission does not match read/apply'
                    $Status = 'Error'
                }else{
                    $Message= 'Missing Authenticated User Permission, but found Read or Apply Domain Computers Permissions'
                    $Status = 'Warning'
                }
            }else{ 
                #Check if Authenticated Users has read
                if ($AuthUser.Permission -notin @('GPORead','GPOApply')) {
                    $Message= 'Authenticated Users Permissions found not matching read/apply'
                    $Status = 'Error'
                }
                else{
                    $Message= 'Read or Apply Authenticated User Permissions found'
                    $Status = 'Good'
                }
            }
            [pscustomobject]@{'DisplayName'=$GPO.DisplayName;'ID'=$GPO.ID;'Message'=$Message;'Status'=$Status}
        } 
    } 
}

Usage is quite simple:

#Regular
Test-GPOAuthenticatedUsers

#Filtered
Test-GPOAuthenticatedUsers | Where{$_.status -eq 'Error'}

#Export to CSV
Test-GPOAuthenticatedUsers | Export-CSV -path 'C:\temp\Test-GPOAuthenticatedUsers.csv' -NoTypeInformation

This is all fine and nice for reporting purposes…but let’s actually fix something:

Function Fix-GPOAuthenticatedUsers{
    [CmdletBinding()]
    Param(
      #GUID(s) of the GPO(s) to fix
      [parameter(Mandatory=$True)] 
      [string[]]$GPOID,
      
      #Which AD Group to use
      [parameter(Mandatory=$True)] 
      [ValidateSet('Domain Computers','Authenticated Users')] 
      [string]$Group 
    )
    Begin{
        #Load GPO module
        Import-Module GroupPolicy
    }
    Process{
        ForEach($GUID in $GPOID){
            Write-Verbose "Processing GPO $GUID"
            #Get the GPO
            if(!(Get-GPO -Id $GUID -ErrorAction SilentlyContinue)){
                Write-Error "Unable to find GPO matching $GUID"
            }

            #Try Set the permissions
            $null = Set-GPPermissions -Guid $GUID -PermissionLevel GpoRead -TargetName $Group -TargetType Group -ErrorAction Stop

            #Test GPO perms for the group we just set
            $GroupPerm = Get-GPPermissions -Guid $GUID -TargetName $Group -TargetType Group -ErrorAction SilentlyContinue
            if ($GroupPerm.Permission -notin @('GPORead','GPOApply')) {
                $Message = "$Group read/apply permission not found after setting"
                Write-Error $Message
            }
            Write-Verbose "Completed Processing GPO $GUID"
        }
    }
    End{
    }
}

Usage again is quite simple: specify one or more GPO GUIDs and whether you’d like to use Authenticated Users or Domain Computers, and voila!

$BadGPOs = Test-GPOAuthenticatedUsers | Where{$_.status -eq 'Error'}
Fix-GPOAuthenticatedUsers -GPOID $BadGPOs.id -Group 'Authenticated Users'

Enjoy!

ConfigMgr Client Fails to Install: Unable to Compile UpdatesAgent.mof

We’ve had a couple of computers in the past being unable to re-install the Configuration Manager client due to the error:
“Unable to compile UpdatesAgent.mof”

This error can have a couple of different causes.

As such, here are a couple of steps you can try:

1. Reinstall the Windows Update agent. https://support.microsoft.com/en-ca/kb/949104
2. Uninstall any existing ConfigMgr client, stop the ccmsetup service and delete c:\windows\ccm, c:\windows\ccmsetup and c:\windows\ccmcache folders
3. Run the following commands to delete the ConfigMgr namespaces completely from WMI:

Get-WmiObject -Query "Select * from __Namespace Where Name='sms'" -Namespace "root\cimv2" | Remove-WmiObject
Get-WmiObject -Query "Select * from __Namespace Where Name='ccm'" -Namespace "root" | Remove-WmiObject
Get-WmiObject -Query "Select * from __Namespace Where Name='smsdm'" -Namespace "root" | Remove-WmiObject

Since step 3 is quite drastic, try steps 1 and 2 first. If you do need step 3, complete steps 2 and 3 together. After this, the ConfigMgr client should install successfully.
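If you would rather script step 2, a rough sketch (run from an elevated prompt; the paths assume a default client installation) might look like this:

# Stop the ccmsetup service if it is present, then uninstall the existing client
Stop-Service -Name ccmsetup -ErrorAction SilentlyContinue
Start-Process -FilePath "$env:windir\ccmsetup\ccmsetup.exe" -ArgumentList '/uninstall' -Wait
# ccmsetup may continue working in the background; check ccmsetup.log before deleting folders

# Remove the leftover client folders
Remove-Item -Path "$env:windir\ccm", "$env:windir\ccmsetup", "$env:windir\ccmcache" -Recurse -Force -ErrorAction SilentlyContinue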

Hopefully this helps!

Recursively Discover WMI Namespaces

Sometimes when building custom functions based on WMI it can be helpful to discover all of the WMI namespaces registered on a machine. Here is a function to do just that!

Function Find-WMINamespaces{
    Param(
        [string]$Computer,
        [string]$StartingNameSpace = 'root',
        [boolean]$recurse = $true
    )
    $childNamespaces = gwmi -namespace $StartingNameSpace -class "__Namespace" -computername $computer | select -expandproperty Name
    ForEach($child in $childNamespaces){
       [PSCustomObject]@{'Name'="${StartingNamespace}\${child}"}
       if($recurse){
            Find-WMINamespaces -Computer $Computer -StartingNamespace "${StartingNamespace}\${child}" -recurse $true
        }
    }
}
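Usage is straightforward; for example, against the local machine:

# List every WMI namespace under root on the local computer
Find-WMINamespaces -Computer 'localhost' -StartingNameSpace 'root' -recurse $true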

Automating Download of Google Chrome

If you manage multiple computers, you will want to ensure that Google Chrome is always up to date. This is important because Chrome has Flash built in by default and because updates patch any security vulnerabilities in Chrome itself. Historically, you had two options to accomplish this:
1. Allow all your computers to auto-update. This works, but every computer downloads approximately 50 MB each time Chrome updates. If you are on a slow connection, or you have lots of computers, this will be an issue.
2. Manually download the Google Chrome for Business msi file and deploy it via Microsoft System Center Configuration manager or group policy

You may also have a situation where neither of these is possible or desirable. May I present a third option:
3. Have a scheduled task (or perhaps an Orchestrator or SMA runbook) check the latest available version and download it if there is a newer version available. Use another scheduled task on each computer to install the update from your central download cache if it is newer than the version installed.
Continue reading

SysJam Powershell RightClick Tool – Part 7 – Querying 64bit Registry Strings with 32bit Powershell

Suppose you are writing a script that requires a 32 bit powershell window. How can you query 64bit registry keys from this script?

If you are connecting to a remote computer your most obvious and best option is Enter-PSSession or New-PSSession to open a native powershell session on the remote computer.

You also have other options for querying registry strings, but we need to be careful.

Option 1: Get-Item
Let’s first consider using Get-ItemProperty directly. Here is a very simple function with no error control (or warranty):

Function Get-RegStringValueByGetItem{
    Param([string]$KeyPath,
            [string]$StringName)
    $objRegKey = Get-ItemProperty $KeyPath -Name $StringName
    Return $objRegKey.($stringName)
}

When running from a 64bit powershell session, this returns as expected:

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files\Common Files

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

Let’s try this from a 32-bit powershell session:

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

Get-RegStringValueByGetItem -KeyPath 'HKLM:\software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir'

C:\Program Files (x86)\Common Files

So, you can see that when running from a 32bit session we get redirected to the Wow6432Node without warning.

Option 2: Microsoft.Win32.RegistryKey
We know that PowerShell has access to the .NET classes, so let’s try it through the Microsoft.Win32.RegistryKey class.
You can read more about this class here: https://msdn.microsoft.com/en-us/library/Microsoft.Win32.RegistryKey(v=vs.110).aspx

Function Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS{
    Param([string]$KeyPath,
            [string]$StringName,
            [string]$keyRoot
            )
    Switch($keyRoot){
        'HKLM' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'LocalMachine'}
        'HKCR' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'ClassesRoot'}
        'HKCU' {$strKeyRoot = [Microsoft.Win32.RegistryHive]'CurrentUser'}
        'HKU'  {$strKeyRoot = [Microsoft.Win32.RegistryHive]'Users'}
    }

    $objReg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($strKeyRoot, 'localhost')
    $strKeyPath = $keyPath -replace '\\','\\'
    $objSubKey = $objReg.OpenSubKey($strKeyPath)
    $strValue = $objSubKey.GetValue($StringName)

    Return ($strValue)
}

When running from a 64bit powershell session, this returns as expected:

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Let’s try this from a 32-bit powershell session:

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Get-RegStringValueBYMicrosoft.Win32.RegistryKeyCLASS -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

So, you can see that once again when running from a 32bit session we get redirected to the Wow6432Node without warning.

Option 3: WbemScripting.SWbemNamedValueSet
This last option may take a little while to wrap your head around.

Function Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM{
    Param([string]$KeyPath,
            [string]$StringName,
            [string]$keyRoot)

    Switch($keyRoot){
        'HKLM' {$strKeyRoot = '&h80000002'}
        'HKCR' {$strKeyRoot = '&h80000000'}
        'HKCU' {$strKeyRoot = '&h80000001'}
        'HKU'  {$strKeyRoot = '&h80000003'}
    }

    #Use the wbem scripting com object to enumerate the 64 bit standard registry provider
    $objNamedValueSet = New-Object -COM 'WbemScripting.SWbemNamedValueSet'
    $objNamedValueSet.Add('__ProviderArchitecture', 64) | Out-Null
    $objLocator = New-Object -COM 'Wbemscripting.SWbemLocator'
    $objServices = $objLocator.ConnectServer('localhost', 'root\default', '', '', '', '', '', $objNamedValueSet)
    $objStdRegProv = $objServices.Get('StdRegProv')
    $Inparams = ($objStdRegProv.Methods_ | where { $_.name -eq 'GetStringValue' }).InParameters.SpawnInstance_()

    # Add the input parameters
    $regkey = $keyPath -replace '\\','\\'
    ($Inparams.Properties_ | where { $_.name -eq 'Hdefkey' }).Value = $strkeyroot
    ($Inparams.Properties_ | where { $_.name -eq 'Ssubkeyname' }).Value = $regkey
    ($Inparams.Properties_ | where { $_.name -eq 'Svaluename' }).Value = $StringName	

    #Execute the method
    $Outparams = $objStdRegProv.ExecMethod_('GetStringValue', $Inparams, '', $objNamedValueSet)
    Return ($Outparams.Properties_ | where { $_.name -eq 'sValue' }).Value

}

You can read more about this COM object here: https://msdn.microsoft.com/en-us/library/aa390788(v=vs.85).aspx and the underlying StdRegProv here: https://msdn.microsoft.com/en-us/library/aa393664%28v=vs.85%29.aspx?f=255&MSPPError=-2147217396

Essentially what we are doing is using the WMI scripting Com object to reference the 64bit Registry provider.

Let’s see how this works in a 64-bit session:

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

And from a 32bit session:

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files\Common Files

Get-RegStringValueBYWbemScripting.SWbemNamedValueSetCOM -KeyPath 'software\Wow6432Node\microsoft\Windows\CurrentVersion' -StringName 'CommonFilesDir' -keyRoot 'HKLM'

C:\Program Files (x86)\Common Files

Finally!

You can see that we needed to use an intermediate layer (the WBEM scripting COM object) to query the 64-bit registry from a 32-bit shell. This is also the method I used to get the execution history from the SCCM client in the SysJam Powershell RightClick tool, which you can download at https://github.com/mdedeboer/SysJamRightClickToolForConfigMgr. The same approach is usable with any of the methods defined by the StdRegProv class.

SysJam Powershell RightClick Tool – Part 6 – Getting Running ConfigMgr Jobs with Powershell

One of the key functionality improvements I wanted to include in the Sysjam Powershell RightClick tool was realtime running job monitoring. There are few tools that don’t require you to click some type of refresh button to see the running jobs.

Part 1 of providing this functionality is using powershell jobs combined with a timer object to poll data on a refresh cycle and update the form. I have covered this previously here.

Part 2 of this is querying WMI for the status of each running job…and translating this to English.

This post will concentrate on Part 2

Continue reading

Modifying Permissions on an Active Directory OU with Powershell

There are a few other blogs about this topic, but even so this is not an easy topic to wrap your head around.

To start you off, you’ll want to read:
https://social.technet.microsoft.com/Forums/Lync/en-US/df3bfd33-c070-4a9c-be98-c4da6e591a0a/forum-faq-using-powershell-to-assign-permissions-on-active-directory-objects?forum=winserverpowershell (This is a must read!)
and
http://blogs.technet.com/b/joec/archive/2013/04/25/active-directory-delegation-via-powershell.aspx
Continue reading

Using PS Math to check a flag value – Checking Lync RCC without Lync PS Module

I had a unique situation where I didn’t want to deploy the Lync 2013 powershell module to a group of people, but I wanted a way to quickly detect if remote call control (RCC) was enabled for a user. I wasn’t concerned with modifying the setting and simply wanted to read it.

I found that the RCC settings for Lync are stored as flags in the AD object. You can view this using ADSI Edit and looking at the property “msRTCSIP-OptionFlags”. Then, using this as a reference, we can decode whether RCC is configured.

Powershell [math] to the rescue!

In order to tackle this, I needed to know that behind the scenes the flag value is actually calculated using binary. This means that each flag is tied to a power of 2; the RCC flag we care about is 16 (2^4). So what we’ll do is find the highest power of two that is less than or equal to the stored flags value, subtract it, and do it again. We’ll stop as soon as we’re down to a total of less than 32 (one power of 2 higher than the flag we want). If this remaining number is less than 16, RCC is not enabled. If it is greater than or equal to 16, RCC is enabled.

First, we’ll need a function to find the highest power of two for a number:

Function Get-HighestPowerOf2{
    #Returns the highest x in 2^x -le $number
    Param(
        [double]$number
    )
    return ([math]::Floor([math]::log($number,2)))
}

Second, we’ll need a function that repeatedly subtracts the highest remaining power of two until we get down to a number less than 32, and then checks whether that remainder is greater than or equal to 16.

Function Get-FlagEnabled{
    Param(
        [double]$Flags,
        [double]$DesiredFlag
    )
    #Get one higher power of two than the flag we want.
    $intOneHigherFlag = [math]::pow(2, (Get-HighestPowerOf2 -number $DesiredFlag) + 1)

    #Strip off the higher powers of two until the remaining value is below one power of two above the flag we want
    While($Flags -ge $intOneHigherFlag){ #one power of two higher than 16 (2^5 = 32)
        $Flags = $Flags - ([Math]::pow(2,(Get-HighestPowerOf2 -number $Flags)))
    }

    #If the remaining flags are greater than or equal to our desired flag, then our flag is enabled otherwise it is disabled
    If($Flags -ge $DesiredFlag){
        Return $true
    }else{
        Return $false
    }
}

Then test!

#Get the AD user
$objADUser = Get-ADUser -Properties msRTCSIP-OptionFlags -filter {SamAccountName -eq "YourUserName"}

#Get the flags property
$intFlags = [Double]($objADUser | Select-Object -ExpandProperty "msRTCSIP-OptionFlags")

#Is RCC enabled?
Get-FlagEnabled -Flags $intFlags -DesiredFlag 16

Note that this should work for any field that is stored as a flag. You just need to know what flag value you want.
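As an aside, if you are comfortable with bitwise operators, the same check can be collapsed to a one-liner. This is just an alternative sketch, not what the functions above do:

# A flag is enabled when ANDing the value with the flag returns the flag itself
[bool]([int]$intFlags -band 16)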

SysJam Powershell RightClick Tool – Part 4 – Getting Users Logged onto a Remote Computer

This post is a combination of the Get-LoggedOnUser script by Jaap Brasser available on the technet gallery here and my post on powershell jobs available here. I am taking both of these concepts and wrapping it all inside my powershell right click tool for SCCM available at https://github.com/mdedeboer/SysJamRightClickToolForConfigMgr.

As you may know, there is no nice way to check a user’s session information with any out-of-the-box powershell cmdlet. Additionally, you could try to query this from the SCCM client via WMI, but you’ll be hard pressed to get that to include disconnected terminal sessions as well as console sessions. There is, however, a command-line tool available for this: quser, also reachable as “query user”, alongside qwinsta (“query session”) and its counterpart rwinsta (“reset session”).

If you were to type quser into your powershell console, however, it just spits out text, no nice objects, so something like “quser /server:127.0.0.1 | Select-Object -Property Username” would get you a whole lot of nothing. We’ll need to parse the output ourselves. Thankfully, Jaap Brasser has done much of the work for us! This script block will be run by a separate powershell job, so in order to make sure I get all the data back, I add each custom object holding a user’s session information to a hashtable that I can pass back from the job. I could just as easily have used an array; there is no specific reason for the hashtable here.

Lets take a look:

$sbGetLoggedOnUsers = {
	Param ($CompName)     #CompName predefined
	$HashTableCollection = @{ }  #Hashtable to keep all the user sessions in
	$i = 0
	quser /server:$CompName 2> "" | Select-Object -Skip 1 | ForEach-Object { #Get the user information (ignore any errors), skip the title line, for each line after the first line.....
		$CurrentLine = $_.Trim() -Replace '\s+', ' ' -split '\s'  #get rid of repeating whitespaces and split on whitespaces (gives an array of strings)
		$HashProps = @{      #Pre-define a hashtable of properties
			UserName = $CurrentLine[0] #first string in $Currentline
			ComputerName = $CompName
			SessionName = $null
			ID = $null
			State = $null
			IdleTime = $null
			LogonTime = $null
		}
		If ($CurrentLine[2] -eq 'Disc')  #If the session is disconnected quser gives a different layout than otherwise
		{
			$HashProps.SessionName = $null #since the session is disconnected it doesn't have a name
			$HashProps.ID = $CurrentLine[1] #second string in $Current line
			$HashProps.State = $CurrentLine[2] #third string in $Current line
			$HashProps.IdleTime = $CurrentLine[3] #forth string in $Current line
			$HashProps.LogonTime = $CurrentLine[4..6] -join ' ' #fifth to seventh strings in $current line
		}
		else
		{
			$HashProps.SessionName = $CurrentLine[1] #second string in $Current line
			$HashProps.ID = $CurrentLine[2] #third string in $Current line
			$HashProps.State = $CurrentLine[3] #forth string in $Current line
			$HashProps.IdleTime = $CurrentLine[4] #fifth string in $Current line
			$HashProps.LogonTime = $CurrentLine[5..7] -join ' ' #sixth to eighth strings in $current line
		}
		$UserObject = New-Object -TypeName PSCustomObject -Property $HashProps | Select-Object -Property UserName, ComputerName, SessionName, ID, State, IdleTime, LogonTime #Create a new powershell object holding all the properties defined in the hashtable
		if (!($HashTableCollection.ContainsKey($i)))
		{
			$HashTableCollection.Add($i, $UserObject) #Add the custom object to another hashtable
		}
		$i++ #Next user
	}
	return $HashTableCollection #Return hashtable of all user sessions
}

If you are following along at home, you’ll notice a few differences between my version and Jaap’s (besides the script block and the extra hashtable).
1. quser /server:$CompName 2> "" | Select-Object -Skip 1 | ForEach-Object {
The 2> "" silently discards any error output. This allows us to be sure of what we are parsing.
2. $HashProps = @{
UserName = $CurrentLine[0]
ComputerName = $CompName
SessionName = $null
ID = $null
State = $null
IdleTime = $null
LogonTime = $null
This pre-defines the hashtable’s keys before it has values, which makes the resulting objects more consistent.

After the script block is complete you essentially have something similar to this returned (where PSCustomObject is the custom object holding each user’s session information):

@{
   1=PSCustomObject
   2=PSCustomObject
}

The next function triggers the script block as a job and, when it completes, updates a pre-created DataGridView with the following columns: Username, ComputerName, SessionName, ID, State, IdleTime, LogonTime. These columns could be named anything, as long as they are in this order.

If you want the code for the Add-JobTracker function, see my post on this here

function fnLoggedOnUsers
{
	$statusBar1.Text = "Getting Logged On Users"
	if ($dGUsers.Rows.Count -ne 0)
	{
		$dGUsers.Rows.Clear()
	}
	$strComputer = $tbCompName.Text

	if ($strComputer -and $strComputer -ne "")
	{
		Add-JobTracker -Name (Get-Random) `
					   -JobScript $sbGetLoggedOnUsers `
					   -CompletedScript {
			Param ($Job)
			$HashTableUsers = @{ }
			$HashTableUsers = Receive-Job -Job $Job
			ForEach ($key in $HashTableUsers.keys)
			{
				$dGUsers.Rows.Add($HashTableUsers.Item($key).UserName, $HashTableUsers.Item($key).ComputerName, $HashTableUsers.Item($key).SessionName, $HashTableUsers.Item($key).ID, $HashTableUsers.Item($key).State, $HashTableUsers.Item($key).IdleTime, $HashTableUsers.Item($key).LogonTime)
			}
			$statusBar1.Text = ""
		}`
					   -UpdateScript {
		}`
					   -ArgumentList $strComputer
	}
}

So what is happening here? This is all covered in my previous post, but here is a high-level overview:
1. Update the status bar with the text “Getting Logged On Users”.
2. Clear the datagridview.
3. Get the computer name from the text box.
4. If the computer name is not empty, spawn a new powershell job that runs the script block we created above, with the computer name passed as a parameter.
5. A timer (defined in blog post part 2) checks each powershell job to see if it is finished.
6. When the job is finished, the -CompletedScript {} script block is executed in the original thread. This script block uses the output of the first script block as its input, so we can be confident that we are getting a hashtable of custom powershell objects. We then add each item from the hashtable to the datagridview.