Note – The best method of assigning DNS servers is through the DHCP server. If you are setting the IP using DHCP, always make sure you add/remove the additional DNS client servers from there. In my situation, there was no DHCP server, hence this detailed blog post.
Prerequisites
We are going to implement this configuration via Microsoft Intune using the Scripts feature:
The necessary Microsoft Intune permissions to create PowerShell scripts.
A device group available within Microsoft Entra containing all the devices you want to target with this change.
PowerShell Script for DNSClient (Additional DNS Servers)
Save the below script to the desktop; we will be uploading it to the Microsoft Intune portal as “AddDNSClient.ps1”.
Please enter the proper DNS server addresses within the script based on your environment and requirements. In the example below, the two existing DNS servers are 8.8.8.8 and 8.8.8.4, and we are adding two additional DNS servers, 9.9.9.9 and 9.9.9.4.
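The original AddDNSClient.ps1 is not reproduced in this extract, so here is only a minimal sketch of what such a script could look like, assuming the target adapters currently point at 8.8.8.8/8.8.8.4 and that the built-in DnsClient cmdlets are available:
# Hypothetical sketch - not the original AddDNSClient.ps1
# Appends 9.9.9.9 and 9.9.9.4 to every IPv4 interface that currently uses 8.8.8.8
$additionalServers = @("9.9.9.9", "9.9.9.4")
Get-DnsClientServerAddress -AddressFamily IPv4 |
    Where-Object { $_.ServerAddresses -contains "8.8.8.8" } |
    ForEach-Object {
        # Keep the existing servers, add the new ones, and avoid duplicates
        $newList = @($_.ServerAddresses + $additionalServers | Select-Object -Unique)
        Set-DnsClientServerAddress -InterfaceIndex $_.InterfaceIndex -ServerAddresses $newList
    }
Adjust the filter and the server list to match your own environment before uploading the script.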
Select Devices > Scripts > Add > Windows 10 and later.
In Basics, enter the following properties, and select Next:
Name: AddDNSClientServers
Description: Additional DNS Server 3 & 4
In Script settings, enter the following properties, and select Next:
Script location: Browse to the PowerShell script saved previously and upload it (AddDNSClient.ps1).
Run this script using the logged on credentials: Select No.
Enforce script signature check: Select No
Run script in 64-bit PowerShell host: Select Yes to run the script in a 64-bit PowerShell host on a 64-bit client architecture.
Select Assignments > Select groups to include. Add the AAD group “Win11-P-DG”
Wait for approx. 15-20 minutes and the policy will apply to the managed devices. (Machine Win11-Intune-15)
Managed Device
You can validate that the settings have been applied to the client by going to the path – C:\ProgramData\Microsoft\IntuneManagementExtension\Logs and opening the file IntuneManagementExtension.txt. I copied the policy ID – cf09649b-78b7-4d98-8bcc-b122c29e5527 from the Intune portal hyperlink and searched within the log file. We can see the policy has been applied successfully.
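If you prefer to do the same check from PowerShell, a quick search of the log folder for the policy ID works as well (adjust the file name/extension to whatever is present on your device):
# Search the Intune Management Extension logs for the script policy ID
Select-String -Path "C:\ProgramData\Microsoft\IntuneManagementExtension\Logs\*" -Pattern "cf09649b-78b7-4d98-8bcc-b122c29e5527"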
I hope you will find this helpful information for applying additional DNS servers via Intune – Scripts and PowerShell. Please let me know if I have missed any steps or details, and I will be happy to update the post.
If you’ve ever had to search for a particular setting, or a list of GPO settings, across a large number of Group Policy Objects (GPOs) within your domain, you know how tedious it can be to find specific settings across hundreds or thousands of GPOs. PowerShell comes to the rescue with a powerful script that can search for GPO settings across all your existing GPOs and generate an organized CSV output. In this blog post, we’ll walk you through the process and ensure you have all the prerequisites to get started.
Use case
You have approx. 50 to 60 GPO settings from the Center for Internet Security (CIS) benchmark policies document (CIS Microsoft Windows Desktop Benchmarks/CIS Microsoft Windows Server Benchmarks) that you want to search against your domain, to find out whether they are already configured within an existing GPO or not present in the environment at all. Instead of searching manually one by one, you may want to use the below PowerShell to get results like a champion.
Prerequisites
Before using the PowerShell script, ensure you have the following prerequisites in place:
Windows PowerShell version 5.0 and above
Active Directory Module for Windows PowerShell, plus the Group Policy (GroupPolicy) module that provides Get-GPO and Get-GPOReport (an install sketch follows the input-file example below)
Permissions: Ensure you have sufficient permissions to access and analyze GPO settings. Typically, you need to be a member of the Domain Administrators group or have equivalent privileges.
Execute the script from a member server that is part of the domain and has the necessary permissions.
Prepare the input file (inputgpo.txt), enter the GPO settings one per line, and save the file. In my situation, it’s present in C:\Temp:
Relax minimum password length limits
Allow Administrator account lockout
Generate security audits
Impersonate a client after authentication
Lock pages in memory
Replace a process level token
Accounts: Block Microsoft accounts
Interactive logon: Machine inactivity limit
Microsoft network server: Server SPN target name validation level
Network access: Remotely accessible registry paths
Network security: Configure encryption types allowed for Kerberos
Audit Security State Change
Do not allow password expiration time longer than required by policy
Password Settings: Password Complexity
Password Settings: Password Length
Password Settings: Password Age (Days)
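If the Group Policy module is not yet present on the member server, it can be added through RSAT. A minimal sketch, assuming a Windows 10/11 machine (on Windows Server, Install-WindowsFeature GPMC achieves the same):
# Add the Group Policy management tools (provides the GroupPolicy module with Get-GPO/Get-GPOReport); requires elevation
Add-WindowsCapability -Online -Name "Rsat.GroupPolicy.Management.Tools~~~~0.0.1.0"
# Verify the cmdlets are available
Get-Command Get-GPO, Get-GPOReport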
#Domain
$DomainName = "askaresh.com"
# Initialize matchlist
$matchlist = @()
# Collect all GPOs
$GPOs = Get-GPO -All -Domain $DomainName
# Read search strings from text file
# A list of GPOs settings you want to search
$SearchStrings = Get-Content -Path "C:\Temp\inputgpo.txt"
# Hunt through each GPO XML for each search string
foreach ($searchString in $SearchStrings) {
    $found = $false
    foreach ($gpo in $GPOs) {
        $GPOReport = Get-GPOReport -Guid $gpo.Id -ReportType Xml
        if ($GPOReport -match $searchString) {
            $match = New-Object PSObject -Property @{
                "SearchString" = $searchString
                "GPOName" = $gpo.DisplayName
            }
            $matchlist += $match
            $found = $true
        }
    }
    if (-not $found) {
        $match = New-Object PSObject -Property @{
            "SearchString" = $searchString
            "GPOName" = "No results found"
        }
        $matchlist += $match
    }
}
# Output results to CSV, Search results
# This step can take time depending on how many hundreds or thousands of policies are present in the environment
$matchlist | Export-Csv -Path "C:\Temp\gposearch.csv" -NoTypeInformation
Output (Results)
The output will look like the following within the CSV:
I hope you will find this helpful information for searching GPO settings across 100’s and 1000’s of GPOs within your domain. Please let me know if I have missed any steps or details, and I will be happy to update the post.
In this blog post, we will explore how to extract highlighted data from a PDF using Python. Before we go ahead, let’s understand the use case: you have the CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf, which is 1065 pages, and you are reviewing the policies against your environment and highlighting the PDF with specific color codes. For example, I use four colors for the following purposes:
Red Color – Missing Policies
Yellow Color – All existing policies
Pink – Policies not applicable
Green – Upgraded policies
Example of the highlighted text:
You don’t have to use the same color codes as I have, but you get the idea. Once you have done the heavy lifting of reviewing the document and are happy with the analysis, the next step is to extract the highlighted data into CSV format so that the teams can review and action it.
Pre-requisites
We will use the PyMuPDF & Pandas library to parse the PDF file and extract the highlighted text. Additionally, we will apply this technique to the CIS Windows Server 2022 Benchmark PDF as an example.
Before we begin, make sure you have installed the necessary dependencies. You can install PyMuPDF and Pandas using pip:
pip install PyMuPDF
pip install pandas
First, I created a small script to go through the PDF document and detect the colors. I had to do this because, although to my eyes the colors are red, yellow, etc., the actual RGB color codes are slightly different.
import fitz # PyMuPDF
# Open the PDF
doc = fitz.open('CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf')
# Set to store unique colors
unique_colors = set()
# Loop through every page
for i in range(len(doc)):
    page = doc[i]
    # Get the annotations (highlights are a type of annotation)
    annotations = page.annots()
    for annotation in annotations:
        if annotation.type[1] == 'Highlight':
            # Get the color of the highlight
            color = annotation.colors['stroke']  # Returns an RGB tuple
            unique_colors.add(color)
# Print all unique colors
for color in unique_colors:
    print(color)
You will get the following output after executing the script. Make sure you put the exact name of the PDF file in the script and, within the IDE of your choice, cd to the directory where the above script (CheckColor.py) resides.
Now that we have the color codes, it’s time to go ahead and extract the highlighted text. We iterate through each page of the PDF and check for any highlight annotations. If an annotation is found, we extract the content and accumulate it per highlight color, followed by an export to CSV.
Main Code
Replace "CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf" with the actual path to your PDF file.
import fitz # PyMuPDF
import pandas as pd
# Open the PDF
doc = fitz.open('CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf')
# Define the RGB values for your colors
PINK = (0.9686269760131836, 0.6000000238418579, 0.8196079730987549)
YELLOW = (1.0, 0.9411770105361938, 0.4000000059604645)
GREEN = (0.49019598960876465, 0.9411770105361938, 0.4000000059604645)
RED = (0.9215689897537231, 0.2862749993801117, 0.2862749993801117)
color_definitions = {"Pink": PINK, "Yellow": YELLOW, "Green": GREEN, "Red": RED}
# Create separate lists for each color
data_by_color = {"Pink": [], "Yellow": [], "Green": [], "Red": []}
# Loop through every page
for i in range(len(doc)):
    page = doc[i]
    annotations = page.annots()
    for annotation in annotations:
        if annotation.type[1] == 'Highlight':
            color = annotation.colors['stroke']  # Returns an RGB tuple
            if color in color_definitions.values():
                # Get the detailed structure of the page
                structure = page.get_text("dict")
                # Extract highlighted text line by line
                content = []
                for block in structure["blocks"]:
                    for line in block["lines"]:
                        for span in line["spans"]:
                            r = fitz.Rect(span["bbox"])
                            if r.intersects(annotation.rect):
                                content.append(span["text"])
                content = " ".join(content)
                # Append the content to the appropriate color list
                for color_name, color_rgb in color_definitions.items():
                    if color == color_rgb:
                        data_by_color[color_name].append(content)
# Convert each list to a DataFrame and write to a separate .csv file
for color_name, data in data_by_color.items():
    if data:
        df = pd.DataFrame(data, columns=["Text"])
        df.to_csv(f'highlighted_text_{color_name.lower()}.csv', index=False)
After running the script, the extracted highlighted text will be saved into multiple CSV files, as in the screenshot below:
You can now extract the highlighted text from the PDF using the above technique. Feel free to modify and adapt this code to suit your specific requirements. Extracting highlighted data from PDFs can be a powerful data analysis and research technique.
I hope you will find this helpful information for extracting data out from any PDF files. Please let me know if I have missed any steps or details, and I will be happy to update the post.
I have a blog post about creating a Windows 365 Cloud PC provisioning policy using PowerShell. In this blog post, I will demonstrate how to create the provisioning policy using PowerShell and the MS Graph API with the beta modules for Windows 365 Cloud PC – Frontline Workers.
Example – Each Windows 365 Frontline license can be shared by up to three employees. This means that if you have 30 employees, you only need to purchase 10 licenses to provision Cloud PCs for all 30 employees with access throughout the day. However, note that you are buying Frontline licenses based on concurrent active sessions; if you have more than 10 active workers in a shift, you must purchase licenses accordingly.
What happens when the licenses are exhausted?
In my demo tenant, I have two licenses for Frontline workers. When I try to log in to a third session (note that I am already logged into two active sessions and running them), I get the following message.
Connect to MS Graph API
Step 1 – Install the MS Graph Powershell Module
#Install Microsoft Graph Beta Module
PS C:\WINDOWS\system32> Install-Module Microsoft.Graph.Beta
Step 2 – Connect to the scopes and specify which API permissions you wish to authenticate with. If you are only doing read-only operations, I suggest you connect with “CloudPC.Read.All”; in our case, we are creating the policy, so we need to change the scope to “CloudPC.ReadWrite.All”.
#Read-only
PS C:\WINDOWS\system32> Connect-MgGraph -Scopes "CloudPC.Read.All" -NoWelcome
Welcome To Microsoft Graph!
OR
#Read-Write
PS C:\WINDOWS\system32> Connect-MgGraph -Scopes "CloudPC.ReadWrite.All" -NoWelcome
Welcome To Microsoft Graph!
Permissions for MS Graph API
Step 3 – Check the User account by running the following beta command.
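The exact command is not reproduced in this extract; as a simple stand-in, the following (non-beta) call confirms which account is connected and which scopes were granted:
# Confirm the signed-in account and the granted Graph scopes
Get-MgContext | Select-Object Account, Scopes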
If you are doing on-premises network integration (Azure Network Connection), then the following additional property and value are required. In my lab, I am leveraging the Microsoft-managed network, so this is not required.
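The policy-creation call itself is not included in this extract. As a rough sketch only – the endpoint is the beta virtualEndpoint provisioning-policies resource, but the body property names below (particularly provisioningType = "shared" for Frontline and onPremisesConnectionId for an Azure Network Connection) are my assumptions and should be verified against the Microsoft Graph documentation:
# Hypothetical sketch of creating a Frontline (shared) provisioning policy via Graph beta
$policyBody = @{
    displayName              = "FrontLine-Workers-Policy"   # placeholder name
    description              = "Windows 365 Frontline provisioning policy"
    provisioningType         = "shared"                     # assumed value for Frontline
    imageType                = "gallery"
    imageId                  = "<gallery image ID>"         # placeholder
    domainJoinConfigurations = @(
        @{
            type       = "azureADJoin"
            regionName = "australiaeast"
            # For an Azure Network Connection (on-premises integration), a property along the
            # lines of onPremisesConnectionId = "<ANC object ID>" would be used here - name assumed
        }
    )
} | ConvertTo-Json -Depth 5
Invoke-MgGraphRequest -Method POST `
    -Uri "https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/provisioningPolicies" `
    -Body $policyBody -ContentType "application/json"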
I hope you will find this helpful information for creating a frontline worker provisioning policy using PowerShell. Please let me know if I have missed any steps or details, and I will be happy to update the post.
Microsoft Security Response Center (MSRC) publishes a monthly consolidated report for all the Critical, Important, Moderate and Low security vulnerabilities affecting Microsoft products. The information posted there helps organizations manage security risks and keep their systems protected.
Looking at the overall release notes for all the affected products (30+ products) and filtering for the OS you are interested in can become overwhelming, e.g. when you are only interested in operating systems such as Windows 11 22H2 or Windows Server 2016/2019/2022. I’m not saying the other information is not essential, but imagine you are only responsible for the operating systems. In such situations, you can use the below script to get a monthly report of CVEs (Critical & Important) for a particular OS sent to you in an email.
Prerequisites
You will need the MsrcSecurityUpdates PowerShell module (it provides the Msrc* cmdlets used below). Run the command to install the module; the module can then be imported within the script.
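A likely install command, assuming the module in use is the one Microsoft publishes as MsrcSecurityUpdates (it is the module that ships Get-MsrcCvrfDocument):
# Install and import the MSRC security updates module
Install-Module -Name MsrcSecurityUpdates -Scope CurrentUser
Import-Module MsrcSecurityUpdates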
Following are the Operating System (OS) products you can fetch the information against. If you want details for any other operating system, copy its value and paste it into the variable $ClientOS_Type in my script. In my demonstration, I used “Windows 11 Version 22H2 for x64-based Systems”.
$ID = Get-MsrcCvrfDocument -ID $Month
$ID.ProductTree.FullProductName
ProductID Value
--------- -----
10049 Windows Server 2008 R2 for x64-based Systems Service Pack 1 (Server Core installation)
10051 Windows Server 2008 R2 for x64-based Systems Service Pack 1
10287 Windows Server 2008 for 32-bit Systems Service Pack 2 (Server Core installation)
10378 Windows Server 2012
10379 Windows Server 2012 (Server Core installation)
10407 Microsoft Outlook 2013 RT Service Pack 1
10483 Windows Server 2012 R2
10543 Windows Server 2012 R2 (Server Core installation)
10601 Microsoft Office 2013 Service Pack 1 (32-bit editions)
10602 Microsoft Office 2013 Service Pack 1 (64-bit editions)
10603 Microsoft Office 2013 RT Service Pack 1
10611 Microsoft Office Web Apps Server 2013 Service Pack 1
10612 Microsoft SharePoint Foundation 2013 Service Pack 1
10654 Microsoft Excel 2013 Service Pack 1 (32-bit editions)
10655 Microsoft Excel 2013 Service Pack 1 (64-bit editions)
10656 Microsoft Excel 2013 RT Service Pack 1
10729 Windows 10 for 32-bit Systems
10735 Windows 10 for x64-based Systems
10739 Microsoft Excel 2016 (32-bit edition)
10740 Microsoft Excel 2016 (64-bit edition)
10753 Microsoft Office 2016 (32-bit edition)
10754 Microsoft Office 2016 (64-bit edition)
10765 Microsoft Outlook 2016 (32-bit edition)
10766 Microsoft Outlook 2016 (64-bit edition)
10810 Microsoft Outlook 2013 Service Pack 1 (32-bit editions)
10811 Microsoft Outlook 2013 Service Pack 1 (64-bit editions)
10816 Windows Server 2016
10852 Windows 10 Version 1607 for 32-bit Systems
10853 Windows 10 Version 1607 for x64-based Systems
10855 Windows Server 2016 (Server Core installation)
10950 Microsoft SharePoint Enterprise Server 2016
11099 Microsoft SharePoint Enterprise Server 2013 Service Pack 1
11568 Windows 10 Version 1809 for 32-bit Systems
11569 Windows 10 Version 1809 for x64-based Systems
11570 Windows 10 Version 1809 for ARM64-based Systems
11571 Windows Server 2019
11572 Windows Server 2019 (Server Core installation)
11573 Microsoft Office 2019 for 32-bit editions
11574 Microsoft Office 2019 for 64-bit editions
11575 Microsoft Office 2019 for Mac
11585 Microsoft SharePoint Server 2019
11600 Microsoft Visual Studio 2017 version 15.9 (includes 15.0 - 15.8)
11605 Microsoft Office Online Server
11655 Microsoft Edge (Chromium-based)
11664 Microsoft Dynamics 365 (on-premises) version 9.0
11726 OneDrive for Android
11762 Microsoft 365 Apps for Enterprise for 32-bit Systems
11763 Microsoft 365 Apps for Enterprise for 64-bit Systems
11800 Windows 10 Version 20H2 for x64-based Systems
11801 Windows 10 Version 20H2 for 32-bit Systems
11802 Windows 10 Version 20H2 for ARM64-based Systems
11902 Microsoft Malware Protection Engine
11921 Microsoft Dynamics 365 (on-premises) version 9.1
11923 Windows Server 2022
11924 Windows Server 2022 (Server Core installation)
11926 Windows 11 version 21H2 for x64-based Systems
11927 Windows 11 version 21H2 for ARM64-based Systems
11929 Windows 10 Version 21H2 for 32-bit Systems
11930 Windows 10 Version 21H2 for ARM64-based Systems
11931 Windows 10 Version 21H2 for x64-based Systems
11935 Microsoft Visual Studio 2019 version 16.11 (includes 16.0 - 16.10)
11951 Microsoft Office LTSC for Mac 2021
11952 Microsoft Office LTSC 2021 for 64-bit editions
11953 Microsoft Office LTSC 2021 for 32-bit editions
11961 Microsoft SharePoint Server Subscription Edition
11969 Microsoft Visual Studio 2022 version 17.0
11987 Azure HDInsights
12051 Microsoft Visual Studio 2022 version 17.2
12085 Windows 11 Version 22H2 for ARM64-based Systems
12086 Windows 11 Version 22H2 for x64-based Systems
12097 Windows 10 Version 22H2 for x64-based Systems
12098 Windows 10 Version 22H2 for ARM64-based Systems
12099 Windows 10 Version 22H2 for 32-bit Systems
12129 Microsoft Visual Studio 2022 version 17.4
12137 CBL Mariner 1.0 x64
12138 CBL Mariner 1.0 ARM
12139 CBL Mariner 2.0 x64
12140 CBL Mariner 2.0 ARM
12142 Microsoft Edge (Chromium-based) Extended Stable
12155 Microsoft Office for Android
12156 Microsoft Office for Universal
12167 Microsoft Visual Studio 2022 version 17.5
12169 OneDrive for MacOS Installer
12170 OneDrive for iOS
12171 Azure Service Fabric 9.1 for Windows
12172 Azure Service Fabric 9.1 for Ubuntu
12173 Snipping Tool
12174 Snip & Sketch for Windows 10
9312 Windows Server 2008 for 32-bit Systems Service Pack 2
9318 Windows Server 2008 for x64-based Systems Service Pack 2
9344 Windows Server 2008 for x64-based Systems Service Pack 2 (Server Core installation)
Variable Region
Declare all the variables within this section. Let’s take a look at what we are declaring within the script:
MSRC publishes the monthly report ID in yyyy-MMM format (for example, 2023-Jul), which is what the Get-Date format string below produces
#Month Format for MSRC
$Month = Get-Date -Format 'yyyy-MMM'
The operating system we want to focus on, leaving out the rest. If you are interested in any other OS, change the value to one from the prerequisites list above, and the script will give you the Critical/Important vulnerabilities for that OS or product.
# Enter the Operating System you specifically want to focus on
$ClientOS_Type = "Windows 11 Version 22H2 for x64-based Systems"
Enter the details for sending the report email (subject line, To and From addresses, SMTP server, etc.):
#Email Details
$Recipients = "askaresh@askaresh.com", "someone@askaresh.com"
$Sender = "cve-report@askaresh.com"
$SMTP_Server = "smtp.askaresh.com"
$Subject = 'CVE List for Windows 11 22H2 on '+$Month
The remainder of the script builds the HTML report formatting (CSS, report title, logo and header) and sends the email.
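That part of the script is not reproduced here, so the following is only a condensed sketch of the reporting/email step. It assumes the MsrcSecurityUpdates cmdlet Get-MsrcCvrfAffectedSoftware and the property names FullProductName, CVE, Severity and Impact behave as in the module's samples, and it reuses the variables declared above:
# Condensed sketch of the report/email step (not the full original script)
Import-Module MsrcSecurityUpdates
$CvrfDoc = Get-MsrcCvrfDocument -ID $Month
# Flatten the CVRF document into per-product rows and keep only the chosen OS with Critical/Important severity
$Affected = Get-MsrcCvrfAffectedSoftware -Vulnerability $CvrfDoc.Vulnerability -ProductTree $CvrfDoc.ProductTree |
    Where-Object { $_.FullProductName -eq $ClientOS_Type -and $_.Severity -in @('Critical', 'Important') }
# Basic CSS header for the HTML table (placeholder styling)
$Css = "<style>table{border-collapse:collapse} th,td{border:1px solid #ccc;padding:4px} th{background:#0078d4;color:#fff}</style>"
$Body = $Affected |
    Select-Object CVE, FullProductName, Severity, Impact |
    ConvertTo-Html -Head $Css -Title $Subject -PreContent "<h2>$Subject</h2>" | Out-String
# Email the report using the variables declared above
Send-MailMessage -To $Recipients -From $Sender -SmtpServer $SMTP_Server -Subject $Subject -Body $Body -BodyAsHtml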
I hope you will find this helpful information to gather Microsoft Security vulnerability reports for a specific operating system using PowerShell. Please let me know if I have missed any steps or details, and I will be happy to update the post.
I have written various individual blog posts on creating, with PowerShell, all the configuration tasks for Windows 365 Cloud PC under the Microsoft Endpoint Manager (MEM) portal.
Based on public demand, I want to create a consolidated post with all the scripts and configuration items that can get you started with Windows 365 Cloud PC using PowerShell. (Of course, all the below features can also be configured using the UI; however, the guidance below is strictly using PowerShell.)
PowerShell links to my blog post
Following are the links to my blog posts for each individual task:
I promise you once you have done the hard work, you can get up and running in a few hours using all the above PowerShell scripts with Windows 365 Cloud PC.
GitHub Link
Here is the repo with all the scripts and more – askaresh/avdwin365mem (github.com). A big thanks to Andrew Taylor for collaborating on and updating the provisioning policy script with the SSO details that were released in late Nov 2022.
I hope you will find this helpful information for all things PowerShell w.r.t Windows 365 Cloud PC. I will update the post if I publish or update more information.
Do you want to deploy Azure Virtual Desktop – Host Pools quickly and want a starting point for a golden image? Look no further: in this blog post, I will show you how to create a golden image using PowerShell in no more than 10 minutes.
I will break down the code block into smaller chunks first to explain the critical bits, and in the end, I will post the entire code block that can be run all at once. In this way, explaining block by block becomes easier than pasting one single block.
Pre-requisites
Following are the pre-requisites before you begin
PowerShell 5.1 and above
Azure Subscription
Permissions within the Azure subscription for Azure Compute
Assumption
You have an existing Resource Group (RG)
You have an existing Azure Virtual Network (VNET)
You have an existing workload subnet within the VNET
Identify the VM Size you will be using for the golden image
We are going to use the Windows 11 22H2 Multi-session image – win11-22h2-avd – within this script
Variable Region
Declare all the variables within this section. Let’s take a look at what we are declaring within the script:
Existing Resource Group within the Azure Subscription (AZ104-RG)
A location where you are deploying this virtual machine (Australia East)
Name of the golden image virtual machine (VM03)
NIC Interface name for the virtual machine (VM03-nic)
RG of the VNET (in my case they are the same, AZ104-RG; they can be separate too, hence an independent variable)
Name of the existing subnet within the vNET (AZ104-VDI-Workload-L1)
Name of the existing VNET (AZ104-RG-vnet)
Mapping of the existing VNET
Mapping of the existing subnet
T-shirt size of the golden image we are deploying (Standard_D2s_v3)
Gallery details of the image
Published – MicrosoftWindowsDesktop
Offer – windows-11
SKU – win11-22h2-avd
Version – of course, latest
Get credentials – a local admin account is created on the golden image (an input box captures the username and password)
# Existing Resource Group to deploy the VM
$rgName = "AZ104-RG"
# Geo Location to deploy the VM
$location = "Australia East"
# Image template name
$vmName = "VM03"
# Network interface name for the VM
$nicName = "$vmName-nic"
# Resource Group for VNET
$vnetrgName = "AZ104-RG"
# Existing Subnet Name
$Existsubnetname = "AZ104-VDI-Workload-L1"
# Existing VNET Name
$Existvnetname = "AZ104-RG-vnet"
# Existing VNET where we are deploying this Virtual Machine
$vnet = Get-AzVirtualNetwork -Name $Existvnetname -ResourceGroupName $vnetrgName
# Existing Subnet within the VNET for the this virtual machine
$subnet = Get-AzVirtualNetworkSubnetConfig -Name $Existsubnetname -VirtualNetwork $vnet
# T-shirt size of the VM
$vmSize = "Standard_D2s_v3"
# Gallery Publisher of the Image - Microsoft
$publisher = "MicrosoftWindowsDesktop"
# Version of Windows 10/11
$offer = "windows-11"
# The SKUs ending with -avd are the multi-session images
$sku = "win11-22h2-avd"
# Choosing the latest version
$version = "latest"
# Setting up the Local Admin on the VM
$cred = Get-Credential `
-Message "Enter a username and password for the virtual machine."
Execution block
The execution code block is within this section. Let’s take a look at what we are executing within the script:
First, it creates the network interface for the virtual machine (VM03)
Next, under the variable $VM, all the virtual machine configurations:
T-shirt size of the virtual machine
Credentials for the local admin (username/password)
The network interface assignment along with the delete option (note: the delete option is essential, or else deleting the VM will not delete the network interface)
The gallery image, SKU and offer from the Microsoft Marketplace gallery
The OS disk assignment along with the delete option (note: the delete option is essential, or else deleting the VM will not delete the disk)
The configuration for “Trusted Launch” and the enabling of vTPM and Secure Boot
The final command to create the virtual machine with all the above configurations
# Create New network interface for the virtual machine
$NIC = New-AzNetworkInterface -Name $nicName -ResourceGroupName $vnetrgName -Location $location -Subnet $subnet
# Creation of the new virtual machine with delete option for Disk/NIC together
$vm = New-AzVMConfig -VMName $vmName -VMSize $vmSize
$vm = Set-AzVMOperatingSystem `
-VM $vm -Windows `
-ComputerName $vmName `
-Credential $cred `
-ProvisionVMAgent `
-EnableAutoUpdate
# Delete option for NIC
$vm = Add-AzVMNetworkInterface -VM $vm `
-Id $NIC.Id `
-DeleteOption "Delete"
$vm = Set-AzVMSourceImage -VM $vm `
-PublisherName $publisher `
-Offer $offer `
-Skus $sku `
-Version $version
# Delete option for Disk
$vm = Set-AzVMOSDisk -VM $vm `
-StorageAccountType "StandardSSD_LRS" `
-CreateOption "FromImage" `
-DeleteOption "Delete"
# The sauce around enabling Trusted Launch
$vm = Set-AzVmSecurityProfile -VM $vm `
-SecurityType "TrustedLaunch"
# The sauce around enabling TPM and Secure Boot
$vm = Set-AzVmUefi -VM $vm `
-EnableVtpm $true `
-EnableSecureBoot $true
New-AzVM -ResourceGroupName $rgName -Location $location -VM $vm
# Step 1: Import module
#Import-Module Az.Accounts
# Connect to the Azure Subcription
#Connect-AzAccount
# Get existing context
$currentAzContext = Get-AzContext
# Your subscription. This command gets your current subscription
$subscriptionID=$currentAzContext.Subscription.Id
# Command to get the Multi-session Image in Gallery
# Details from this command will help in filling out variables below on Gallery Image
# Get-AzVMImageSku -Location australiaeast -PublisherName MicrosoftWindowsDesktop -Offer windows-11
# Existing Resource Group to deploy the VM
$rgName = "AZ104-RG"
# Geo Location to deploy the VM
$location = "Australia East"
# Image template name
$vmName = "VM03"
# Network interface name for the VM
$nicName = "$vmName-nic"
# Resource Group for VNET
$vnetrgName = "AZ104-RG"
# Existing Subnet Name
$Existsubnetname = "AZ104-VDI-Workload-L1"
# Existing VNET Name
$Existvnetname = "AZ104-RG-vnet"
# Existing VNET where we are deploying this Virtual Machine
$vnet = Get-AzVirtualNetwork -Name $Existvnetname -ResourceGroupName $vnetrgName
# Existing Subnet within the VNET for the this virtual machine
$subnet = Get-AzVirtualNetworkSubnetConfig -Name $Existsubnetname -VirtualNetwork $vnet
# T-shirt size of the VM
$vmSize = "Standard_D2s_v3"
# Gallery Publisher of the Image - Microsoft
$publisher = "MicrosoftWindowsDesktop"
# Version of Windows 10/11
$offer = "windows-11"
# The SKUs ending with -avd are the multi-session images
$sku = "win11-22h2-avd"
# Choosing the latest version
$version = "latest"
# Setting up the Local Admin on the VM
$cred = Get-Credential `
-Message "Enter a username and password for the virtual machine."
# Create New network interface for the virtual machine
$NIC = New-AzNetworkInterface -Name $nicName -ResourceGroupName $vnetrgName -Location $location -Subnet $subnet
# Creation of the new virtual machine with delete option for Disk/NIC together
$vm = New-AzVMConfig -VMName $vmName -VMSize $vmSize
$vm = Set-AzVMOperatingSystem `
-VM $vm -Windows `
-ComputerName $vmName `
-Credential $cred `
-ProvisionVMAgent `
-EnableAutoUpdate
# Delete option for NIC
$vm = Add-AzVMNetworkInterface -VM $vm `
-Id $NIC.Id `
-DeleteOption "Delete"
$vm = Set-AzVMSourceImage -VM $vm `
-PublisherName $publisher `
-Offer $offer `
-Skus $sku `
-Version $version
# Delete option for Disk
$vm = Set-AzVMOSDisk -VM $vm `
-StorageAccountType "StandardSSD_LRS" `
-CreateOption "FromImage" `
-DeleteOption "Delete"
# The sauce around enabling Trusted Launch
$vm = Set-AzVmSecurityProfile -VM $vm `
-SecurityType "TrustedLaunch"
# The sauce around enabling TPM and Secure Boot
$vm = Set-AzVmUefi -VM $vm `
-EnableVtpm $true `
-EnableSecureBoot $true
New-AzVM -ResourceGroupName $rgName -Location $location -VM $vm
Note – It will give you a pop-up box for entering the username and password for the local account, and in under 10 mins you will see your virtual machine within the Azure portal
Next Steps on Golden Image
Now that the virtual machine is ready, the following are the next steps involved:
Using Azure Bastion console and installing all the required applications
Generalize (sysprep) and shut down the image (see the example command after this list)
Capture the image to the Azure Compute Galleries
Deploy within the Azure Virtual Desktop
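For the generalize step, the standard Sysprep invocation below is what I would run inside the golden-image VM (for example from the Bastion session) once the applications are installed; treat it as a reminder rather than part of the original script:
# Run inside the golden image VM - generalizes Windows and shuts the VM down, ready for capture
& "$env:SystemRoot\System32\Sysprep\sysprep.exe" /generalize /oobe /shutdown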
I hope you will find this helpful information for deploying a golden image within Azure – Virtual Machine to deploy the Azure Virtual Desktop – Host Pools. If you want to see a Powershell version of the host pool activities, leave me a comment below or on my socials. Please let me know if I have missed any steps or details, and I will be happy to update the post.
You have a large VMware App Volumes environment and have backed up your writable volumes using the capabilities provided in the App Volumes Manager. (You are doing the right thing!)
AV Manager – WV Backup Config
We decided to perform an audit of the writable volume backups within App Volumes Manager 2.18.10 and the vSAN datastore. You can export all the writable volumes to a CSV using the API; my script here will provide you with a complete overview for conducting your analysis. Now exclude your group entitlements from the list, leaving you with the total number of writable volumes within your environment. Ideally, you are after the same number of writable volume backups on the vSAN datastore. (Of course, if everything is going well in the backup world!)
In my case, we observed more than 300 missing writable volumes between the exported CSV and the vSAN datastore. Let the investigations begin – within the production.log we could see backups were happening, but the challenge of a large environment is that it is impossible to track all the backups occurring just by looking at the logs. (Feature request to VMware – a dedicated backup log that showcases the entire environment’s status.) We eventually ended up with a GSS case, and after a few months of back and forth and log exchanges, we finally got a working solution. This closed the mystery of the missing backups of the writable volumes.
Solution
Go to the SQL database of the App Volumes Manager. Select the DB and choose New Query.
AV Database – Microsoft SQL
Enter the following query and hit Execute. This will change the default writable volumes backup batch size (writables_backup_batch_size) from 5 to 25. Note that the batch size value was tweaked multiple times: we first went with 10, which drastically reduced the number of missing backups; however, a few were still missing and not getting backed up. The final number for our environment was 25, which got all the writable volumes backed up.
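The query itself is not reproduced in this extract. Purely as an illustration – the table and column names below are placeholders I am assuming, so use the exact statement provided by VMware Support – the change would look something like this, run from SSMS or via Invoke-Sqlcmd:
# Hypothetical illustration only - table/column names are assumptions, and the server/database
# names will match your own App Volumes Manager installation
Invoke-Sqlcmd -ServerInstance "SQL01\AVM" -Database "AppVolumes" `
    -Query "UPDATE dbo.settings SET value = '25' WHERE [key] = 'writables_backup_batch_size';"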
Disclaimer – This tweak was required for a large App Volumes environment. Please consult with VMware Support before making any changes to your setup or database. Just because it works for me doesn’t mean it will work for you; the value can differ based on the size of the environment.
I hope you will find this helpful information on your VMware App Volumes backup strategy. Please let me know if you have observed any issues like these and would like to share your story.
Often within the VMware App Volumes Manager (AVM), Writable Volumes will show up as Status – Orphaned. Let’s take a look at the following topics:
What are Orphaned Writable Volumes?
Script to delete them from the App Volumes Manager
What are Orphaned Writable Volumes?
App Volumes Manager is integrated with Microsoft Active Directory (AD) and is in continuous synchronization with it. Whenever an end-user account gets disabled in AD, the next App Volumes Manager sync activity will mark that user's writable volume with Writable Status = Orphaned.
Now, in an ideal world, these accounts have been disabled, so the volumes should be okay to delete, right? Maybe: if you don’t have a data-retention obligation, then you are ready to delete them. If you do, keep them as-is for compliance purposes.
Script to delete them from the App Volumes Manager
Before we talk about the script: the deletion is very straightforward within the App Volumes Manager. Select the volumes with Status = Orphaned and click the Delete button. However, when you have to do the same against multiple PODs it becomes challenging, and as always, if it’s not automated there is scope for human error.
Pre-requisites
You need the App Volumes Manager URL
You need the username and password of the App Volumes Manager
You need to enter y/Y to proceed further with the deletion
The script was tested on PowerShell v5.x with App Volumes Manager version 2.18.10 (the logic will be the same; however, the API calls for App Volumes 4.x will be different)
###########################################################################
# Get List of Writable Volumes from AppVolumes Manager for Status=Orphaned
# Delete the Orphaned Writable Volumes
# You need username and password for the App Volumes Manager
# Author - Aresh Sarkari (Twitter - @askaresh)
# Version - V5.0
###########################################################################
#App Volumes Manager Name or IP Address
$AVManager = "https://avm001.askaresh.local"
# Run at the start of each script to import the credentials
$RESTAPIUser = "domain\username"
$RESTAPIPassword = "enteryourpassword"
#Ignore cert errors
add-type @"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
[System.Net.ServicePointManager]::SecurityProtocol = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
#Login AV Manager Body
$body = @{
    username = "$RESTAPIUser"
    password = "$RESTAPIPassword"
}
#Login API call to the AV Manager
Invoke-RestMethod -SessionVariable DaLogin -Method Post -Uri "$AVManager/cv_api/sessions" -Body $body
#Get the list of Writable Volumes from the AV Manager
$output = Invoke-RestMethod -WebSession $DaLogin -Method Get -Uri "$AVManager/cv_api/writables" -ContentType "application/json"
#Selecting the WV with status Orphaned into a variable
$WVouput = $output.datastores.writable_volumes | Select-Object id, owner_name, owner_upn, title, status | Where-Object {[string]$_.status -match "Orphaned"}
#Output on the console (Validate carefully before proceeding ahead)
$WVouput | Format-Table | Out-String | % {Write-Host $_}
#Confirmation logic to proceed with the deletion
$confirmation = Read-Host -Prompt "Are you Sure You Want To Proceed with the deletion:"
if ($confirmation -match "[yY]" ) {
    # Proceed with the deletion
    # The WV deletion API call only expects IDs, so we filter out just the ids
    $WVOutputIDs = $WVouput.id
    # Looping to delete each Writable Volume via its ID
    foreach ($WVOutputIDss in $WVOutputIDs) {
        # Writable Volumes deletion parameters body
        $jsonbody = @{
            bg = "0"
            volumes = "$WVOutputIDss"
        } | ConvertTo-Json
        # API call to delete the Writable Volumes
        # We are using Invoke-WebRequest so we can read the content of the deletion (success) in one line
        $WVdeletecall = Invoke-WebRequest -WebSession $DaLogin -Method Post -Uri "$AVManager/cv_api/volumes/delete_writable" -Body $jsonbody -ContentType "application/json"
    }
    # Display the response details (status code, description and content)
    Write-Host $WVdeletecall.StatusCode
    Write-Host $WVdeletecall.StatusDescription
    Write-Host $WVdeletecall.Content
}
When you run the script, it will identify all the end-users with Status = Orphaned. If you like, you can copy and paste the output into an editor (Notepad++) to verify it.
Once you press y/Y, it will go ahead and delete the Orphaned writable volumes.
I hope you will find this script useful to bulk delete orphaned Writable Volumes in App Volumes Manager. A small request: if you further enhance the script or make it more creative, I hope you will share it back with me.
Horizon Reach is a potent tool, and Andrew Morgan has put in a lot of blood, sweat and tears to develop it. What surprises me is why this fling isn’t included in the Horizon product, but we haven’t gathered here to talk about the product management and roadmap aspects 😉
The Horizon Reach fling aggregates information from all the various Horizon PODs into its database. Typically, with Horizon API calls or the Horizon PowerShell modules, you have to run them against individual PODs to fetch information about each POD. The beauty of Horizon Reach is that it aggregates all the information, so we can write scripts/API calls to request information from there instead of writing Horizon POD-specific scripts.
Let’s take a look at the following information from the Horizon Reach fling:
What APIs are available with Horizon Reach?
What options are available to interact with the Horizon Reach API?
Script – Get a consolidated list of Horizon Farm details (Display the Name, Base Image details, Snapshot Version, Health and If provisioning is enabled)
Note: the above can also be fetched using the older Horizon PowerShell modules, but trust me, it’s pretty tricky to run a foreach loop for every object with the SOAP method.
Script – Get a consolidated list of Horizon Desktop Pools details (Display the Name, Base Image details, Snapshot Version, Health and If provisioning is enabled)
What APIs are available with Horizon Reach?
After you have installed the Horizon Reach fling, go to the following URL to check out all the available APIs. It’s the Swagger UI interface, which makes it simple to understand each call.
Scripts to get consolidated Horizon Farms/Desktops information
Pre-requisites:
You need the Horizon Reach Server URL
You need the password of the Horizon Reach Server
The script provides you with the details of all Horizon PODs in your setup.
The script was tested on PowerShell V5.x
#Horizon Reach Server Name or IP Address
$HZReachServer = "https://horizonreach.domain:9443"
#Ignore the self signed cert errors
add-type @"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
[System.Net.ServicePointManager]::SecurityProtocol = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
#API call to make the initial connection to the Horizon Reach Server##
$HZReachLogonAPIcall = "$HZReachServer`/api/Logon"
#The body payload that comprises the login API request
$body = @{
    username = "administrator"
    password = "enteryourpassword"
} | ConvertTo-Json
$HZReachlogin = Invoke-RestMethod -Method Post -uri $HZReachLogonAPIcall -Body $body -ContentType "application/json"
#Header along with the JWT token is now saved for future API calls
#You need to call this header in all subsequent calls as it has the token
$Headers = @{ Authorization = "Bearer $($HZReachlogin.jwt)" }
#API Call to fetch the consolidated (as many pods you have) Horizon Farm information##
$HZReachFarms = Invoke-RestMethod -Method Get -uri "$HZReachServer/api/Farms" -Headers $Headers -ContentType "application/json" -UseBasicParsing | Format-Table -Property displayname, baseimage, snapshot, enabled, health, isProvisioningEnabled
Write-Output $HZReachFarms
#API Call to fetch the consolidated (as many pods you have) Horizon desktop pool information##
$HZReachPools = Invoke-RestMethod -Method Get -uri "$HZReachServer/api/pools" -Headers $Headers -ContentType "application/json" -UseBasicParsing | Format-Table -Property displayname, baseimage, snapshot, enabled, healthDetail, isProvisioningEnabled
Write-Output $HZReachPools
The following information (display name, snapshot, base image, health, provisioning mode) is pulled using the above scripts. I was most interested in seeing the snapshot versions of each Farm/Pool along with the health and provisioning status. Feel free to extract whatever details you are looking for; there are plenty of other properties.
I hope you will find this script useful to fetch helpful information from Horizon Reach. A small request: if you further enhance the script or make it more creative, I hope you will share it back with me.