Recently, Apps on Demand was released: essentially, this is an Azure Marketplace offering from VMware (aka Broadcom). You can deploy an App Volumes Manager virtual machine that runs Windows Server 2022 with App Volumes Manager pre-installed, and additionally deploy the Azure file share and database configuration.
What is an AppStack
An AppStack in VMware App Volumes is a virtual container that contains a set of applications packaged together. It’s used in virtual desktop environments such as Azure Virtual Desktop (AVD) and Windows 365 to dynamically deliver these applications to users. AppStacks are read-only, can be attached to user sessions transparently, and appear as if the applications are natively installed. They offer efficient application management, allowing for easy updates and maintenance without impacting the underlying system. This approach simplifies application lifecycle management and enhances user experience in virtualized desktop scenarios.
Prerequisites
You’ll need the following things ready before you can roll out App Volumes:
Azure Subscription: Have an active Microsoft Azure subscription. Azure Documentation
Azure Permissions: Ensure permissions to create virtual machines for App Volumes Manager, storage accounts, and file shares on Azure.
Access to On-Prem Active Directory: Required for identity and access control.
Azure Virtual Desktop Access: For remote connectivity.
Azure AD Connect: Install Azure AD Connect on the Active Directory Domain Controller.
Note: Microsoft Entra ID (Azure Active Directory) is not supported.
App Volumes License File: Download and install the App Volumes license file. App Volumes in Azure follows a Bring Your Own License model. VMware App Volumes Product Download Page
SQL Server Database for Production: For a production environment, ensure the availability of an existing SQL Server database, as the default SQL Server Express database is not recommended for production use.
My party came crashing down on the requirement of an Active Directory (AD) domain controller. Across my entire Azure, Azure Virtual Desktop, Windows 365 Cloud PC and Microsoft Intune estate, I have made a deliberate choice to stay with modern management/authentication and follow Microsoft’s best-practice guidance. Though I am not able to complete the entire configuration due to AD, I decided to share in this blog post whatever configuration I managed to perform to deploy the AV Manager appliance within my Azure subscription.
Starting in the Azure Portal
Access the Marketplace: Begin by navigating to the Microsoft Azure portal and clicking on ‘Marketplace’.
Find VMware App Volumes: Search for the ‘VMware App Volumes’ offer and click on it.
Select App Volumes Version: In the VMware App Volumes Azure Application page, choose the desired version from the Plan drop-down box.
Click ‘Create’
Deploy the Virtual Machine Appliance
Details: Select your Subscription and Resource group. You can also create a new resource group if needed.
Instance Details: Choose the Region and enter the name for your virtual machine. Set up the credentials for the local administrator. (#TIP – Ensure you deploy the Appliance where the session host pools are located)
Public IP Address: A new public IP address is created by default, but you can opt for an existing one or none at all. (#TIP – This allows you to access the App Volumes Manager over the internet)
DNS Prefix: If using a public IP address, consider creating a DNS prefix for easier access. (#TIP – Pick a friendly name; I decided to leave it at the defaults)
By default, a new virtual network and subnet are created. However, in my situation, I want to leverage the existing VNET and Subnet.
Database Selection: Choose between a local SQL Server Express database (not recommended for production) or a remote SQL Server database. Enter the necessary database connection details. (#TIP – For production use, evaluate Azure SQL Database; as a managed SQL instance, it removes the overhead of managing and maintaining a SQL Server)
File Share Configuration: Decide if you want Azure Marketplace to automatically provision storage with file shares or use existing storage, and configure accordingly. (#TIP – Similarly, if you have Dell or Azure NetApp Files storage, you can leverage that instead of an Azure storage account)
Tags: Add any desired tags. (#TIP – Ensure you add the tags as it will help you later during Billing and Automation)
Review and Launch Deployment: Review your settings and click ‘Create’ to start the deployment process.
Navigate to App Volumes Manager Admin UI: This is where you’ll configure Active Directory and other settings.
Configure Storage: If you opted out of automatic storage provisioning, manually configure the storage in the App Volumes Manager admin UI.
When you navigate to Storage, the newly created SMB shares are listed for template and AppStack storage.
I will come back to this blog post once I learn more about the “No AD” mode as this will enable me to further integrate App Volumes into my Azure Virtual Desktop and Windows 365 Cloud PCs.
Resources deployed within the Azure Resource Group
The following resources are deployed within your Azure Resource Group for App Volumes:
Wishlist
I would like to see the following enhancements at a later date:
VMware aka Broadcom should evaluate a complete SaaS offering to take this to the next level, instead of deploying an appliance that requires a lot of configuration. Just give me “as a Service”, where the only responsibility is to create/update and manage the applications with Entra ID groups.
Microsoft Entra ID integration is a must
I hope you will find this helpful information for getting started with App Volumes for Windows 365 Cloud PC and Azure Virtual Desktop. Please let me know if I have missed any steps or details, and I will be happy to update the post.
Are you looking to keep a vigilant eye on your Windows 365 environment? Good news! You can now send Windows 365 audit events to Azure Log Analytics, Splunk, or any other SIEM system that supports it.
Understanding the Scope of Windows 365 Audit Logs
When it comes to monitoring your Cloud PC environment, Windows 365 audit logs are an indispensable resource. These logs provide a comprehensive chronicle of significant activities that result in modifications within your Cloud PC setup (https://intune.microsoft.com/). Here’s what gets captured:
Creation Events: Every time a Cloud PC is provisioned, it’s meticulously logged.
Update Events: Any alterations or configuration changes made to an existing Cloud PC are recorded.
Deletion Events: If a Cloud PC is decommissioned, this action is also captured in the logs.
Assignment Events: The process of assigning Cloud PCs to users doesn’t go unnoticed; it’s all in the logs.
Remote Actions: Activities such as remote sign-outs or restarts are tracked for administrative oversight.
These audit events encompass most actions executed via the Microsoft Graph API, ensuring that administrators have visibility into the operations that affect their Cloud PC infrastructure. It’s important to note that audit logging is an always-on feature for Windows 365 customers. This means that from the moment you start using Cloud PCs, every eligible action is automatically logged without any additional configuration.
Windows 365 and Azure Log Analytics
Windows 365 has made it easier than ever to integrate with Azure Log Analytics. With a few clicks in the Microsoft Intune admin center, you can create a diagnostic setting to send your logs directly to your Azure Log Analytics workspace.
Sign in to the Microsoft Intune admin center, select Reports > Diagnostic settings (under Azure monitor)> Add Diagnostic settings.
Under Logs, select Windows365AuditLogs.
Under Destination details, select Azure Log Analytics and choose the Subscription & Workspace.
Select Save.
Query the Azure Log Analytics
Once your logs are safely stored in Azure Log Analytics, retrieving them is a breeze. You can use Kusto Query Language (KQL) to extract and analyze the data. Here’s a basic example of how you might query the logs:
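A minimal sketch, assuming the Az.OperationalInsights module is installed and the diagnostic setting has populated the Windows365AuditLogs table; the workspace GUID is a placeholder, and the KQL simply pulls the most recent events (inspect the table schema in the portal for exact column names):
#A minimal sketch - requires the Az.Accounts and Az.OperationalInsights modules
Connect-AzAccount
#Placeholder - replace with your Log Analytics workspace (customer) ID
$workspaceId = "00000000-0000-0000-0000-000000000000"
#KQL: the 20 most recent Windows 365 audit events
$kql = "Windows365AuditLogs | sort by TimeGenerated desc | take 20"
$results = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $kql
$results.Results | Format-Table -AutoSize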
Leverage Graph API to retrieve Windows 365 audit events
Connect to MS Graph API
Step 1 – Install the MS Graph Powershell Module
#Install Microsoft Graph Beta Module
PS C:\WINDOWS\system32> Install-Module Microsoft.Graph.Beta
Step 2 – Connect and specify the scopes you wish to authenticate with. If you are only doing read-only operations, I suggest you connect with “CloudPC.Read.All”; retrieving audit events is read-only, so that scope is sufficient (use “CloudPC.ReadWrite.All” only if you also plan to make changes).
#Read-only
PS C:WINDOWSsystem32> Connect-MgGraph -Scopes "CloudPC.Read.All" -NoWelcome
Welcome To Microsoft Graph!
OR
#Read-Write
PS C:WINDOWSsystem32> Connect-MgGraph -Scopes "CloudPC.ReadWrite.All" -NoWelcome
Welcome To Microsoft Graph!
Permissions for MS Graph API
Step 3 – Check the connected user account and retrieve the audit events by running the following beta commands.
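A sketch of this step, assuming the Microsoft.Graph.Beta module set is installed; the audit events are exposed at the beta endpoint /deviceManagement/virtualEndpoint/auditEvents:
#Confirm which account and scopes you are connected with
PS C:\WINDOWS\system32> Get-MgContext
#Retrieve the ten most recent Windows 365 audit events
PS C:\WINDOWS\system32> Get-MgBetaDeviceManagementVirtualEndpointAuditEvent -Top 10 | Select-Object ActivityDateTime, DisplayName, Category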
Integrating Windows 365 with Azure Log Analytics is a smart move for any organization looking to bolster its security and compliance posture. With the added flexibility of forwarding to multiple endpoints, you’re well-equipped to handle whatever audit challenges come your way.
I hope you will find this helpful information for enabling and querying Windows 365 Audit Logs in Azure Log Analytics or using the Graph API with PowerShell. Please let me know if I have missed any steps or details, and I will be happy to update the post.
In today’s digital age, managing cloud resources efficiently is paramount, not just for operational efficacy but also for cost management. Enter Azure Virtual Desktop (AVD) Scaling Plans – Microsoft’s answer to dynamic and intelligent scaling of your virtual desktop infrastructure. No longer do organizations need to overprovision resources or let them sit idle; with AVD Scaling Plans, you get a responsive environment tailored to your usage patterns. In this blog post, we’ll create the scaling plans using Terraform.
In the previous blog post, we delved into the distinctions between the Personal Desktop (1:1 mapping), Pooled Desktop (1:Many mapping) and Remote App configurations, providing a comprehensive guide on their creation via Terraform. The series continues as we further explore how to create the AVD Scaling Plan for Pooled Host Pool.
Permissions within the Azure Subscription for using Terraform
Terraform – Authenticating via Service Principal & Client Secret
Before running any Terraform code, we will execute the following PowerShell (Run as administrator) and store the credentials as environment variables, as shown in the sketch after this list. If we do this via environment variables, we don’t have to keep the information within the providers.tf file. There are better ways to store these details, which I hope to showcase in a future blog post:
Azure Subscription ID – Azure Portal > Subscriptions > copy the ID
Client ID – From the above step you will have the details
Client Secret – From the above step you will have the details
Tenant ID – While creating the Enterprise App in Azure AD you will have the details
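A minimal sketch of the environment variables the AzureRM provider reads when authenticating with a service principal and client secret; the placeholder values are assumptions you replace with the IDs collected above (they persist only for the current PowerShell session):
#Set the AzureRM provider authentication details for this session
$env:ARM_SUBSCRIPTION_ID = "<azure-subscription-id>"
$env:ARM_CLIENT_ID = "<service-principal-application-id>"
$env:ARM_CLIENT_SECRET = "<service-principal-client-secret>"
$env:ARM_TENANT_ID = "<azure-ad-tenant-id>"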
Terraform Folder Structure
The following is the folder structure for the Terraform code:
Azure Virtual Desktop Scaling Plan – Create a directory in which the below Terraform code will be published (providers.tf, main.tf, variables.tf and output.tf)
Create a file named main.tf and insert the following code. Let me explain what all we are attempting to accomplish here:
Leverage an existing Resource Group
Leverage an existing Host Pool
Create a custom role, AVD AutoScale, and assign it to the Resource Group
This is a prerequisite for ensuring the scaling plan can increase and decrease the resources in your resource group.
Assign the role – AVD AutoScale to the service principal (AVD)
Create a scaling plan with a production-grade schedule
Associate the scaling plan with the host pool
# Generate a random UUID for role assignment
resource "random_uuid" "example" {}
# Fetch details of the existing Azure Resource Group
data "azurerm_resource_group" "example" {
name = var.resource_group_name
}
# Fetch details of the existing Azure Virtual Desktop Host Pool
data "azurerm_virtual_desktop_host_pool" "existing" {
name = var.existing_host_pool_name
resource_group_name = var.resource_group_name
}
# Define the Azure Role Definition for AVD AutoScale
resource "azurerm_role_definition" "example" {
name = "AVD-AutoScale"
scope = data.azurerm_resource_group.example.id
description = "AVD AutoScale Role"
# Define the permissions for this role
permissions {
actions = [
# List of required permissions.
"Microsoft.Insights/eventtypes/values/read",
"Microsoft.Compute/virtualMachines/deallocate/action",
"Microsoft.Compute/virtualMachines/restart/action",
"Microsoft.Compute/virtualMachines/powerOff/action",
"Microsoft.Compute/virtualMachines/start/action",
"Microsoft.Compute/virtualMachines/read",
"Microsoft.DesktopVirtualization/hostpools/read",
"Microsoft.DesktopVirtualization/hostpools/write",
"Microsoft.DesktopVirtualization/hostpools/sessionhosts/read",
"Microsoft.DesktopVirtualization/hostpools/sessionhosts/write",
"Microsoft.DesktopVirtualization/hostpools/sessionhosts/usersessions/delete",
"Microsoft.DesktopVirtualization/hostpools/sessionhosts/usersessions/read",
"Microsoft.DesktopVirtualization/hostpools/sessionhosts/usersessions/sendMessage/action",
"Microsoft.DesktopVirtualization/hostpools/sessionhosts/usersessions/read"
]
not_actions = []
}
assignable_scopes = [
data.azurerm_resource_group.example.id,
]
}
# Fetch the Azure AD Service Principal for Windows Virtual Desktop
data "azuread_service_principal" "example" {
display_name = "Azure Virtual Desktop"
}
# Assign the role to the service principal
resource "azurerm_role_assignment" "example" {
name = random_uuid.example.result
scope = data.azurerm_resource_group.example.id
role_definition_id = azurerm_role_definition.example.role_definition_resource_id
principal_id = data.azuread_service_principal.example.object_id
skip_service_principal_aad_check = true
}
# Define the Azure Virtual Desktop Scaling Plan
resource "azurerm_virtual_desktop_scaling_plan" "example" {
name = var.scaling_plan_name
location = var.location
resource_group_name = var.resource_group_name
friendly_name = var.friendly_name
description = var.scaling_plan_description
time_zone = var.timezone
tags = var.tags
dynamic "schedule" {
for_each = var.schedules
content {
name = schedule.value.name
days_of_week = schedule.value.days_of_week
ramp_up_start_time = schedule.value.ramp_up_start_time
ramp_up_load_balancing_algorithm = schedule.value.ramp_up_load_balancing_algorithm
ramp_up_minimum_hosts_percent = schedule.value.ramp_up_minimum_hosts_percent
ramp_up_capacity_threshold_percent = schedule.value.ramp_up_capacity_threshold_pct
peak_start_time = schedule.value.peak_start_time
peak_load_balancing_algorithm = schedule.value.peak_load_balancing_algorithm
ramp_down_start_time = schedule.value.ramp_down_start_time
ramp_down_load_balancing_algorithm = schedule.value.ramp_down_load_balancing_algorithm
ramp_down_minimum_hosts_percent = schedule.value.ramp_down_minimum_hosts_percent
ramp_down_force_logoff_users = schedule.value.ramp_down_force_logoff_users
ramp_down_wait_time_minutes = schedule.value.ramp_down_wait_time_minutes
ramp_down_notification_message = schedule.value.ramp_down_notification_message
ramp_down_capacity_threshold_percent = schedule.value.ramp_down_capacity_threshold_pct
ramp_down_stop_hosts_when = schedule.value.ramp_down_stop_hosts_when
off_peak_start_time = schedule.value.off_peak_start_time
off_peak_load_balancing_algorithm = schedule.value.off_peak_load_balancing_algorithm
}
}
# Associate the scaling plan with the host pool
host_pool {
hostpool_id = data.azurerm_virtual_desktop_host_pool.existing.id
scaling_plan_enabled = true
}
}
Configure AVD – ScalingPlans – variables.tf
Create a file named variables.tf and insert the following code. This is where we define existing or new variables:
# Define the resource group of the Azure Virtual Desktop Scaling Plan
variable "resource_group_name" {
description = "The name of the resource group."
type = string
default = "AE-DEV-AVD-01-PO-D-RG"
}
# Define the attributes of the Azure Virtual Desktop Scaling Plan
variable "scaling_plan_name" {
description = "The name of the Scaling plan to be created."
type = string
default = "AVD-RA-HP-01-SP-01"
}
# Define the description of the scaling plan
variable "scaling_plan_description" {
description = "The description of the Scaling plan to be created."
type = string
default = "AVD Host Pool Scaling plan"
}
# Define the timezone of the Azure Virtual Desktop Scaling Plan
variable "timezone" {
description = "Scaling plan autoscaling triggers and Start/Stop actions will execute in the time zone selected."
type = string
default = "AUS Eastern Standard Time"
}
# Define the friendly name of the Azure Virtual Desktop Scaling Plan
variable "friendly_name" {
description = "The friendly name of the Scaling plan to be created."
type = string
default = "AVD-RA-HP-SP-01"
}
# Define the host pool type(Pooled or Dedicated) of the Azure Virtual Desktop Scaling Plan
variable "host_pool_type" {
description = "The host pool type of the Scaling plan to be created."
type = string
default = "Pooled"
}
# Define the details of the scaling plan schedule
variable "schedules" {
description = "The schedules of the Scaling plan to be created."
type = list(object({
name = string
days_of_week = list(string)
ramp_up_start_time = string
ramp_up_load_balancing_algorithm = string
ramp_up_minimum_hosts_percent = number
ramp_up_capacity_threshold_pct = number
peak_start_time = string
peak_load_balancing_algorithm = string
ramp_down_start_time = string
ramp_down_load_balancing_algorithm = string
ramp_down_minimum_hosts_percent = number
ramp_down_capacity_threshold_pct = number
ramp_down_wait_time_minutes = number
ramp_down_stop_hosts_when = string
ramp_down_notification_message = string
off_peak_start_time = string
off_peak_load_balancing_algorithm = string
ramp_down_force_logoff_users = bool
}))
default = [
{
name = "weekdays_schedule"
days_of_week = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]
ramp_up_start_time = "08:00"
ramp_up_load_balancing_algorithm = "BreadthFirst"
ramp_up_minimum_hosts_percent = 20
ramp_up_capacity_threshold_pct = 60
peak_start_time = "09:00"
peak_load_balancing_algorithm = "DepthFirst"
ramp_down_start_time = "18:00"
ramp_down_load_balancing_algorithm = "DepthFirst"
ramp_down_minimum_hosts_percent = 10
ramp_down_capacity_threshold_pct = 90
ramp_down_wait_time_minutes = 30
ramp_down_stop_hosts_when = "ZeroActiveSessions"
ramp_down_notification_message = "You will be logged off in 30 min. Make sure to save your work."
off_peak_start_time = "20:00"
off_peak_load_balancing_algorithm = "DepthFirst"
ramp_down_force_logoff_users = false
}
]
}
# Define the location of the Azure Virtual Desktop Scaling Plan
variable "location" {
description = "The location where the resources will be deployed."
type = string
default = "australiaeast"
}
# Define the tags of the Azure Virtual Desktop Scaling Plan
variable "tags" {
description = "The tags to be assigned to the Scaling plan."
type = map(string)
default = {
"Billing" = "IT"
"Department" = "IT"
"Location" = "AUS-East"
}
}
# Define the name of the Azure Virtual Desktop Host Pool
variable "existing_host_pool_name" {
description = "The name of the existing Azure Virtual Desktop Host Pool."
type = string
default = "AE-DEV-AVD-01-PO-D-HP"
}
Configure AVD – ScalingPlans – output.tf
Create a file named output.tf and insert the following code. This will display what is being deployed in the console in the form of an output.
# Output the ID of the Azure Virtual Desktop Scaling Plan
output "scaling_plan_id" {
description = "The ID of the Virtual Desktop Scaling Plan."
value = azurerm_virtual_desktop_scaling_plan.example.id
}
Initialize Terraform – AVD – ScalingPlans
Run terraform init to initialize the Terraform deployment. This command downloads the Azure providers required to manage your Azure resources. (It pulls both the AzureRM and AzureAD providers)
terraform init -upgrade
Create Terraform Execution Plan – AVD – ScalingPlans
Run terraform plan to create an execution plan.
terraform plan -out scaleplan.tfplan
Apply Terraform Execution Plan – AVD – ScalingPlans
Run terraform apply to apply the execution plan to your cloud infrastructure.
terraform apply "scaleplan.tfplan"
Validate the Output in Azure Portal
Go to the Azure portal, Select Azure Virtual Desktop and Select Scaling Plans and validate all the details such as Host Pool Assignment and Schedule:
Clean-up the above resources (Optional)
If you want to delete all the above resources, you can use the following commands to destroy them. Run terraform plan and specify the destroy flag:
terraform plan -destroy -out scaleplan.destroy.tfplan
terraform apply "scaleplan.destroy.tfplan"
Quick Start Links
The intention here is to get you quickly started with Terraform on the Azure Virtual Desktop solution:
Create an autoscale scaling plan for Azure Virtual Desktop
I hope you will find this helpful information for getting started with Terraform to deploy the Azure Virtual Desktop – Scaling Plans. Please let me know if I have missed any steps or details, and I will be happy to update the post.
In the Aug 28, 2023 release for Windows 365 Enterprise, the reports for Connected Frontline Cloud PCs were announced. In today’s post, I will showcase how to access and make use of the new report available within Microsoft Intune.
What does the report offer?
The primary aim of the Connected Frontline Cloud PCs report is to provide clarity on concurrent connections based on each frontline Cloud PC. This is crucial for businesses and IT Admins to understand their usage patterns and ensure they have the correct number of licenses. By analyzing the maximum concurrent connections, we can determine if there’s a need to acquire more licenses. This ensures that end users have uninterrupted access to their Frontline Cloud PCs. You can read more about Frontline Cloud PC provisioning in my previous blog post – PowerShell – Frontline Workers – Create Windows 365 Cloud PC Provisioning Policy | AskAresh
Accessing the report – Connected Frontline Cloud PCs
To view the report in Microsoft Intune portal, you can follow these steps:
Ensure you have the necessary permissions to manage Windows 365. You will also need these two permissions to view the report: SharedUseLicenseUsageReport and SharedUseServicePlans.
Go to Devices > Overview > Cloud PC performance (preview) > You will find the report named Connected Frontline Cloud PCs > Select View report.
I have two licenses for Windows 365 Frontline Cloud PCs in my lab and four provisioned Cloud PCs. However, the maximum number of Cloud PCs that can be connected and worked on simultaneously is two.
Scenario 1 – One Frontline Cloud PC connected out of my total two licenses
There is no warning here when one frontline Cloud PC is connected.
Scenario 2 – Two Frontline Cloud PC connected out of my total two licenses
You can see a warning stating you have reached the concurrency limit. A third Frontline Cloud PC session will not be allowed.
Concurrent Frontline Cloud PC Connections
This is how the overall report looks when you click on the Cloud PC Size. The report aggregates data for the last 28 days and showcases:
The current number of connected Frontline Cloud PCs.
The maximum number of connected Frontline Cloud PCs for each day within the selected range (either 7 or 28 days).
The maximum concurrency limit.
Warnings when nearing or reaching the maximum concurrency limit.
It’s worth noting that this report is tailored for Windows 365 Frontline. If a business hasn’t purchased any Windows 365 Frontline licenses, the report will remain empty.
I hope you will find this helpful information for estimating and tracking the frontline worker concurrent usage via this report. Please let me know if I have missed any steps or details, and I will be happy to update the post.
In the July 2023 release for Azure Virtual Desktop, the Watermarking and Session capture protection features became generally available. Numerous blog posts already showcase how to enable the feature using Group Policy. In today’s post, I will showcase how to enable Watermarking and Session Capture protection using Microsoft Intune for Session Host Virtual machines that are Azure AD joined.
Prerequisites
You’ll need the following things ready before you can roll out watermarking/session capture protection:
Azure Virtual Desktop: You must have Azure Virtual Desktop deployed (Pooled or Personal Desktops) and set up in your Azure environment.
Microsoft Intune: You should have an active subscription to Microsoft Intune, which is a cloud-based service that enables device management and security. The role within the Intune portal required for creating and assigning the configuration profiles is the Policy and Profile manager built-in role-based access control (RBAC) role.
Azure Active Directory: Your Azure Virtual Desktop environment should be integrated with Azure Active Directory (AD) (the host pool’s RDP properties include targetisaadjoined:i:1). AAD security groups containing the AVD session hosts as members must be in place.
Azure AD Joined Devices: The session host virtual machines (VMs) you want to enable Watermarking and Session Capture protection for should be Azure AD joined. This means they must be connected to Azure AD and registered as members of your organization’s directory.
Windows 11 operating system for the client, along with the Azure Virtual Desktop Client or Remote Desktop Client versions 1.2.x and higher.
Configuration Profiles – Intune
To enable the Watermarking and Session Capture protection features in Azure Virtual Desktop using Microsoft Intune Configuration profiles and Azure AD joined devices, you can follow these steps:
Create a new configuration profile (Platform: Windows 10 and later; Profile type: Settings catalog). In the settings picker, browse to Administrative templates > Windows Components > Remote Desktop Services > Remote Desktop Session Host > Azure Virtual Desktop. You should see settings in the Azure Virtual Desktop subcategory available for you to configure, such as “Enable watermarking” and “Enable screen capture protection”.
Select the “Enable screen capture protection” setting, too, and leave the values as defaults. (Feel free to tweak them based on your requirements)
Assign the configuration to the AAD group that contains all the session host devices.
Reboot the session hosts after applying, or wait until the next maintenance cycle.
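Optionally, you can validate on a session host that the settings landed. A small sketch, assuming the Intune policy writes to the same Terminal Services policy key as the matching ADMX template:
#Run on a session host; the watermarking/screen capture protection values
#should appear under this key once the profile applies (key path assumed from the ADMX template)
Get-ItemProperty -Path "HKLM:\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"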
Client Validation
Connect to a remote session with a supported client (Azure Virtual Desktop Client or Remote Desktop Client versions 1.2.x), where you should see QR codes appear.
The QR code only works for Windows 11 Multi-session\Windows 11 Enterprise (pooled or personal desktops). RemoteApps will not show the QR code, as it’s not supported.
Screenshot protection – Any screenshot you try to take in the session will be completely blank. Below is an example: I tried to take a screenshot of a text file, and the capture came out completely blank.
Mobile Phone Photo
When you try to take a photo with a mobile phone, this is how it will look, and it will show the Connection ID. You can match this connection ID in Azure Virtual Desktop Insights.
Azure Virtual Desktop Insights
To find out the session information from the QR code by using Azure Virtual Desktop Insights:
Open a web browser and go to https://aka.ms/avdi to open Azure Virtual Desktop Insights. Sign-in using your Azure credentials when prompted.
Select the relevant subscription, resource group, host pool and time range, then select the Connection Diagnostics tab.
In the section Success rate of (re)establishing a connection (% of connections), there’s a list of all connections showing First attempt, Connection Id, User, and Attempts. You can look for the connection ID from the QR code in this list, or export to Excel.
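If your AVD diagnostics also flow into a Log Analytics workspace, the same lookup can be scripted. A sketch, assuming the WVDConnections table is populated and the QR code’s connection ID corresponds to its CorrelationId column:
#Requires the Az.OperationalInsights module; replace both placeholders
$workspaceId = "00000000-0000-0000-0000-000000000000"
$kql = 'WVDConnections | where CorrelationId == "<connection-id-from-qr-code>" | project TimeGenerated, UserName, ClientOS, State'
(Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $kql).Results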
I hope you will find this helpful information for getting started with Watermarking and Screenshot protection for the Azure Virtual Desktop – Session Host. Please let me know if I have missed any steps or details, and I will be happy to update the post.
Note – The best method of assigning the DNS Servers is through the DHCP server. If you are setting the IP using DHCP, always make sure you add/remove additional DNS Client Servers from there. In my situation, there was no DHCP server, hence the detailed blog post.
Prerequisites
We are going to implement this configuration via Microsoft Intune using Scripts:
The necessary Microsoft Intune permissions to create the PowerShell scripts.
A device group available within Microsoft Entra with all the devices you want to target this change.
PowerShell Script for DNSClient (Additional DNS Servers)
Save the below script, place it on the desktop, and we shall upload it to the Microsoft Intune portal – “AddDNSClient.ps1”
Please enter the proper DNS server addresses within the script based on your environment and requirements. In the example below, the existing two DNS servers are 8.8.8.8 and 8.8.8.4, and we are adding two additional DNS servers, 9.9.9.9 and 9.9.9.4.
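A minimal sketch of what AddDNSClient.ps1 could look like under these assumptions; it sets the full four-server list on every connected adapter, so adjust the filter and addresses for your environment:
#AddDNSClient.ps1 - set DNS servers 1-4 on all connected adapters
$dnsServers = @("8.8.8.8", "8.8.8.4", "9.9.9.9", "9.9.9.4")
Get-NetAdapter | Where-Object { $_.Status -eq "Up" } | ForEach-Object {
    Set-DnsClientServerAddress -InterfaceIndex $_.ifIndex -ServerAddresses $dnsServers
}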
Select Devices > Scripts > Add > Windows 10 and later.
In Basics, enter the following properties, and select Next:
Name: AddDNSClientServers
Description: Additional DNS Server 3 & 4
In Script settings, enter the following properties, and select Next:
Script location: Browse to the PowerShell script saved previously and upload it (AddDNSClient.ps1).
Run this script using the logged on credentials: Select No.
Enforce script signature check: Select No
Run script in 64-bit PowerShell host: Select Yes to run the script in a 64-bit PowerShell host on a 64-bit client architecture.
Select Assignments > Select groups to include. Add the AAD group “Win11-P-DG”
Wait for approx. 15-20 minutes and the policy will apply to the managed devices. (Machine Win11-Intune-15)
Managed Device
You can validate that the settings have been applied on the client by going to C:\ProgramData\Microsoft\IntuneManagementExtension\Logs and opening the file IntuneManagementExtension.txt. I copied the policy ID – cf09649b-78b7-4d98-8bcc-b122c29e5527 – from the Intune portal hyperlink and searched within the log file. We can see the policy has been applied successfully.
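A quick way to do the same search from PowerShell; the policy ID below is the one from my lab, so substitute your own:
#Search the Intune Management Extension logs for the policy ID
Select-String -Path "C:\ProgramData\Microsoft\IntuneManagementExtension\Logs\IntuneManagementExtension*" -Pattern "cf09649b-78b7-4d98-8bcc-b122c29e5527"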
I hope you will find this helpful information for applying additional DNS servers via Intune – Scripts and PowerShell. Please let me know if I have missed any steps or details, and I will be happy to update the post.
Over the years, I’ve discovered a list of exclusions that can help with the smooth functioning of VMware App Volumes – Writable Volumes. Though these exclusions are just suggestions, each environment is unique, so take them at your own risk. Testing them in your environment before implementing them in production is essential.
Path/Process/File Exclusion (Snapvol.cfg)
In this blog, I am not outlining the steps for adding the snapvol.cfg exclusions, as my ex-colleague Daniel Bakshi outlines them step by step in a VMware blog post. I hope you will find this information useful if you encounter intermittent black screen issues.
#Exclusions to resolve issues with BlueYonder
exclude_registry=\REGISTRY\MACHINE\SOFTWARE\Classes\ProFloor.Application
exclude_registry=\REGISTRY\MACHINE\SOFTWARE\Classes\ProSpace.Application
exclude_registry=\REGISTRY\MACHINE\SOFTWARE\Classes\WOW6432Node\CLSID\{22FBECF5-10A3-11D2-9194-204C4F4F5020}
exclude_registry=\REGISTRY\MACHINE\SOFTWARE\Classes\WOW6432Node\CLSID\{5E77716A-9680-4C0D-883E-74D49A2F4456}
#VMware DEM
exclude_registry=\REGISTRY\MACHINE\SOFTWARE\VMware, Inc.\VMware UEM
I hope you will find this helpful information for applying exclusions within the snapvol.cfg file. Please let me know if I have missed any steps or details, and I will be happy to update the post. I will gladly add more exclusions if you want to share them in the comments section.
Let’s say you have an entire Windows member server fleet of Windows Server 2016/2019/2022, Windows 11 Pro/Enterprise etc., using DNS Server 1 and Server 2 within their TCP/IP properties, and now you decide to add DNS server addresses 3 and 4 to the member servers to increase resiliency.
In this blog post, I will demonstrate how you can add the additional DNS servers using a Group Policy Object and PowerShell within your enterprise.
What doesn’t work?
Save yourself some time: the GPO Computer Configuration –> Administrative Templates –> Network –> DNS Client –> DNS Servers doesn’t work. The “Supported On” version doesn’t include Windows Server 2016\Windows 10 in the compatibility list. Even if you apply this GPO, it will land on the server within the registry, but there will be no visible change under the TCP/IP properties.
Prerequisites
We are going to implement this configuration via group policy object within the enterprise:
The necessary Active Directory permissions to create, apply and link the GPOs
Access to the SYSVOL folder to store the script
WMI filters to target the script/GPO to specific subnets (more details below)
PowerShell Script for DNSClient (Additional DNS Servers)
Save the below script and place it within the location – \\DOMAINNAME\SYSVOL\DOMAINNAME\scripts\SetDNSAddress.ps1
Please enter the proper DNS server addresses within the script based on your environment and requirements.
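A minimal sketch of what SetDNSAddress.ps1 could look like; rather than hardcoding the full list, this variant appends servers 3 and 4 (the addresses below are placeholders, my assumption) to whatever each IPv4 interface already has:
#SetDNSAddress.ps1 - append DNS servers 3 and 4 to the existing list
$additionalServers = @("10.0.0.3", "10.0.0.4") #replace with your DNS 3 and 4
Get-DnsClientServerAddress -AddressFamily IPv4 |
    Where-Object { $_.ServerAddresses.Count -gt 0 } |
    ForEach-Object {
        $combined = @($_.ServerAddresses + $additionalServers | Select-Object -Unique)
        Set-DnsClientServerAddress -InterfaceIndex $_.InterfaceIndex -ServerAddresses $combined
    }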
On a member server with administrative privileges, press Win + R to open the Run box. Type gpmc.msc and press Enter to open the Group Policy Management Console.
In the GPMC, expand the forest and domain trees on the left pane to locate the domain you want to create the GPO in.
Right-click on “Group Policy Objects” under the domain and select “New” to create a new GPO.
In the “New GPO” dialog box, provide a name for the GPO (e.g., “Additional DNS Servers”) and click “OK”.
Right-click on the newly created GPO and select “Edit” to open the Group Policy Management Editor.
Navigate to Computer Configuration > Preferences > Control Panel Settings > Scheduled Tasks
Right Click on Scheduled Tasks > Configure the task as Immediate Task.
Give it a name – SetDNSClient
Set the user account as SYSTEM. It will automatically convert into NT Authority\system.
Check “Run with highest privileges”.
In the Actions tab, create a new “Start a program” action.
Set the Program as: PowerShell.exe
Set the Add Arguments field to the following line, modifying the network share path to match your environment: -ExecutionPolicy Bypass -Command “& \\DOMAINNAME\SYSVOL\DOMAINNAME\scripts\SetDNSAddress.ps1”
Set the following in the Common tab – “Apply once and do not reapply”.
Bonus Tip – WMI Filters
You may want to target the GPO to a specific set of member servers whose IP range starts with a particular address. You can create a WMI filter such as the one below to target computers within that range. In this example, the GPO will apply to machines with an IP address starting with 192.168. or 192.169.
Select * FROM Win32_IP4RouteTable
WHERE (Mask='255.255.255.255'
AND (Destination Like '192.168.%' OR Destination Like '192.169.%'))
Intune (Configuration Profiles – Doesn’t Work)
As of writing this blog post, the Intune built-in setting\CSP shows similar behaviour to the DNS Server GPO: it doesn’t work.
CSP
Under both situations (CSP & ADMX templates), the report says the policy is applied successfully. However, there is no visible impact on the operating system’s TCP/IP properties. I am optimistic that the same result can be achieved in Intune using the Scripts method and PowerShell. Please let me know in the comments section if you got it working, or if you would like to see a blog post on using Intune Scripts to set the DNS client on member servers.
Reference Links
Following are the references and important links worth going through for more details:
I hope you will find this helpful information for applying additional DNS servers via GPO and PowerShell. I want to thank my friend Eqbal Hussian for his assistance and additional rounds of testing/validation. Please let me know if I have missed any steps or details, and I will be happy to update the post.
If you’ve ever had to search for a particular setting, or a list of GPO settings, across a large number of Group Policy Objects (GPOs) within your domain, you know how tedious it can be to find specific settings across hundreds or thousands of GPOs. PowerShell comes to the rescue with a powerful script that can search for GPO settings across all your existing GPOs and generate an organized CSV output. In this blog post, we’ll walk you through the process and ensure you have all the prerequisites to get started.
Use case
You have approx. 50 to 60 GPO settings from the Center for Internet Security (CIS) benchmark documents (CIS Microsoft Windows Desktop Benchmarks/CIS Microsoft Windows Server Benchmarks) which you may want to search for against your domain, to determine whether they are already configured within an existing GPO or not present in the environment at all. Instead of searching manually one by one, you can use the below PowerShell to get results like a champion.
Prerequisites
Before using the PowerShell script, ensure you have the following prerequisites in place:
Windows PowerShell version 5.0 and above
Active Directory Module for Windows PowerShell
Permissions: Ensure you have sufficient permissions to access and analyze GPO settings. Typically, you need to be a member of the Domain Administrators group or have equivalent privileges.
Execute the script from a member server that is part of the domain and has the necessary permissions.
Prepare the input file (inputgpo.txt): enter one GPO setting per line and save the file. In my situation, it’s present in C:\Temp:
Relax minimum password length limits
Allow Administrator account lockout
Generate security audits
Impersonate a client after authentication
Lock pages in memory
Replace a process level token
Accounts: Block Microsoft accounts
Interactive logon: Machine inactivity limit
Microsoft network server: Server SPN target name validation level
Network access: Remotely accessible registry paths
Network security: Configure encryption types allowed for Kerberos
Audit Security State Change
Do not allow password expiration time longer than required by policy
Password Settings: Password Complexity
Password Settings: Password Length
Password Settings: Password Age (Days)
#Domain
$DomainName = "askaresh.com"
# Initialize matchlist
$matchlist = @()
# Collect all GPOs
$GPOs = Get-GPO -All -Domain $DomainName
# Read search strings from text file
# A list of GPOs settings you want to search
$SearchStrings = Get-Content -Path "C:\Temp\inputgpo.txt"
# Hunt through each GPO XML for each search string
foreach ($searchString in $SearchStrings) {
$found = $false
foreach ($gpo in $GPOs) {
$GPOReport = Get-GPOReport -Guid $gpo.Id -ReportType Xml
if ($GPOReport -match $searchString) {
$match = New-Object PSObject -Property @{
"SearchString" = $searchString
"GPOName" = $gpo.DisplayName
}
$matchlist += $match
$found = $true
}
}
if (-not $found) {
$match = New-Object PSObject -Property @{
"SearchString" = $searchString
"GPOName" = "No results found"
}
$matchlist += $match
}
}
# Output results to CSV, Search results
# This step will take time depending on how many hundreds or thousands of policies are present in the environment
$matchlist | Export-Csv -Path "C:\Temp\gposearch.csv" -NoTypeInformation
Output (Results)
The output will look like the following within the CSV:
I hope you will find this helpful information for searching GPO settings across hundreds and thousands of GPOs within your domain. Please let me know if I have missed any steps or details, and I will be happy to update the post.
In this blog post, we will explore how to extract highlighted data from a PDF using Python. Before we go ahead, let’s understand the use case: you have the CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf, which is 1065 pages, and you are reviewing the policies against your environment and highlighting the PDF with specific color codes. For example, I use four colors for the following purposes:
Red Color – Missing Policies
Yellow Color – All existing policies
Pink – Policies not applicable
Green – Upgraded policies
Example of the highlighted text:
You don’t have to use the same color codes as I have, but you get the idea. Once you have done the heavy lifting of reviewing the document and are happy with the analysis, the next step is to extract the highlighted data into a CSV format so that the teams can review and action it.
Prerequisites
We will use the PyMuPDF & Pandas libraries to parse the PDF file and extract the highlighted text. We will apply this technique to the CIS Windows Server 2022 Benchmark PDF as an example.
Before we begin, make sure you have installed the necessary dependencies. Note that the fitz module you import is provided by the PyMuPDF package. You can install PyMuPDF and Pandas using pip:
pip install PyMuPDF
pip install pandas
First, I created a small script to go through the PDF and detect the highlight colors. I had to do this because, although to my eyes the colors are Red, Yellow, etc., the RGB color codes turn out to be slightly different.
import fitz # PyMuPDF
# Open the PDF
doc = fitz.open('CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf')
# Set to store unique colors
unique_colors = set()
# Loop through every page
for i in range(len(doc)):
page = doc[i]
# Get the annotations (highlights are a type of annotation)
annotations = page.annots()
for annotation in annotations:
if annotation.type[1] == 'Highlight':
# Get the color of the highlight
color = annotation.colors['stroke'] # Returns a RGB tuple
unique_colors.add(color)
# Print all unique colors
for color in unique_colors:
print(color)
You will get the following output after executing the script. Make sure you put the exact name of the PDF file and, within the IDE of your choice, cd to the directory where the above script (CheckColor.py) resides.
Now that we have the color codes, it’s time to go ahead and extract the highlighted text. We iterate through each page of the PDF and check for highlight annotations. If an annotation is found, we extract the content and accumulate it in the per-color lists, followed by an export to CSV.
Main Code
Replace "CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf" with the actual path to your PDF file.
import fitz # PyMuPDF
import pandas as pd
# Open the PDF
doc = fitz.open('CIS_Microsoft_Windows_Server_2022_Benchmark_v2.0.0.pdf')
# Define the RGB values for your colors
PINK = (0.9686269760131836, 0.6000000238418579, 0.8196079730987549)
YELLOW = (1.0, 0.9411770105361938, 0.4000000059604645)
GREEN = (0.49019598960876465, 0.9411770105361938, 0.4000000059604645)
RED = (0.9215689897537231, 0.2862749993801117, 0.2862749993801117)
color_definitions = {"Pink": PINK, "Yellow": YELLOW, "Green": GREEN, "Red": RED}
# Create separate lists for each color
data_by_color = {"Pink": [], "Yellow": [], "Green": [], "Red": []}
# Loop through every page
for i in range(len(doc)):
page = doc[i]
annotations = page.annots()
for annotation in annotations:
if annotation.type[1] == 'Highlight':
color = annotation.colors['stroke'] # Returns a RGB tuple
if color in color_definitions.values():
# Get the detailed structure of the page
structure = page.get_text("dict")
# Extract highlighted text line by line
content = []
for block in structure["blocks"]:
for line in block["lines"]:
for span in line["spans"]:
r = fitz.Rect(span["bbox"])
if r.intersects(annotation.rect):
content.append(span["text"])
content = " ".join(content)
# Append the content to the appropriate color list
for color_name, color_rgb in color_definitions.items():
if color == color_rgb:
data_by_color[color_name].append(content)
# Convert each list to a DataFrame and write to a separate .csv file
for color_name, data in data_by_color.items():
if data:
df = pd.DataFrame(data, columns=["Text"])
df.to_csv(f'highlighted_text_{color_name.lower()}.csv', index=False)
After running the script, the extracted highlighted text will be saved into multiple CSV files, as in the screenshot below:
You can now extract the highlighted text from the PDF using the above technique. Feel free to modify and adapt this code to suit your specific requirements. Extracting highlighted data from PDFs can be a powerful data analysis and research technique.
I hope you will find this helpful information for extracting data out from any PDF files. Please let me know if I have missed any steps or details, and I will be happy to update the post.