What
This article provides a comprehensive guide on PowerShell scripting best practices, focusing on code structure, output formatting, error handling, performance optimization, and security measures.
Why
Following best practices in PowerShell scripting ensures your scripts are readable, maintainable, secure, and performant. It reduces technical debt, enhances collaboration, and minimizes risks in production environments.
How
Tool and Controller Design
Decide Whether You’re Coding a ‘Tool’ or a ‘Controller’
- Tool: Reusable functions/modules.
- Controller: Automates a specific task, not designed for reuse.
Make Your Code Modular
- Use functions and script modules to maximize reusability.
Use Standard Naming Conventions
- Follow the Verb-Noun format using approved PowerShell verbs (run Get-Verb for the list).
Standardize Parameter Naming
- Use conventional names like $ComputerName instead of custom prefixes.
Output Raw Data from Tools
- Tools should output minimally processed data for flexibility.
Controllers Should Output Formatted Data
- Controllers can format data for user-friendly reports.
Example
function Get-DiskInfo {
    param ([string]$ComputerName)
    # Get-CimInstance supersedes the deprecated Get-WmiObject
    Get-CimInstance -ClassName Win32_LogicalDisk -ComputerName $ComputerName
}
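A controller counterpart might wrap this tool and format its raw output for people; a sketch (the computer name and column choices are illustrative assumptions):

```powershell
# Controller: task-specific, formats the tool's raw output for a report.
Get-DiskInfo -ComputerName "SERVER01" |
    Select-Object DeviceID,
        @{ Name = "SizeGB"; Expression = { [math]::Round($_.Size / 1GB, 1) } },
        @{ Name = "FreeGB"; Expression = { [math]::Round($_.FreeSpace / 1GB, 1) } } |
    Format-Table -AutoSize
```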
Avoid Reinventing the Wheel
Use built-in cmdlets like Test-Connection instead of custom ping functions.
# Preferred
Test-Connection $ComputerName -Quiet
Writing Parameter Blocks
Always Write Help
Include comment-based help with .SYNOPSIS, .DESCRIPTION, and at least one .EXAMPLE.
function Test-Help {
    <#
    .SYNOPSIS
    Demonstrates proper help documentation.
    .DESCRIPTION
    Shows the minimum comment-based help a function should carry.
    .EXAMPLE
    Test-Help -MandatoryParameter "Example"
    Runs the Test-Help function with a mandatory parameter.
    #>
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true)]
        [Alias("MP")]
        [String]$MandatoryParameter
    )
}
Use [CmdletBinding()]
Enables common parameters such as -Verbose, -Debug, and -ErrorAction.
Support -WhatIf and -Confirm
For state-changing commands, declare SupportsShouldProcess.
[CmdletBinding(SupportsShouldProcess, ConfirmImpact = "Medium")]
param ([switch]$Force)
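Inside the function body, gate the actual change on $PSCmdlet.ShouldProcess(); a minimal sketch (the function name and path are illustrative):

```powershell
function Remove-StaleFile {
    [CmdletBinding(SupportsShouldProcess, ConfirmImpact = "Medium")]
    param (
        [Parameter(Mandatory = $true)]
        [string]$Path
    )
    # ShouldProcess returns $false under -WhatIf, so the change is skipped
    # and PowerShell reports what would have happened instead.
    if ($PSCmdlet.ShouldProcess($Path, "Remove file")) {
        Remove-Item -Path $Path
    }
}

# Reports the action without performing it:
Remove-StaleFile -Path "C:\Temp\old.log" -WhatIf
```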
Strongly Type Parameters
Always define parameter types for validation and clarity.
param (
    [string]$Name,
    [int]$Count
)
Use [switch] Correctly
- Defaults to $false.
- Use boolean logic; avoid treating it as a three-state value.
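In practice a switch is declared without a default and tested directly as a boolean; a sketch (the function name is illustrative):

```powershell
function Invoke-Cleanup {
    [CmdletBinding()]
    param (
        [switch]$Recurse   # $false unless the caller passes -Recurse
    )
    if ($Recurse) {
        Write-Verbose "Cleaning recursively."
    }
    # To set a switch programmatically, use colon syntax: -Recurse:$true
}

Invoke-Cleanup -Recurse -Verbose
```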
Formatting Output
Avoid Write-Host Unless Necessary
Use Write-Verbose, Write-Debug, or Write-Output as appropriate.
Use Write-Progress for Progress Updates
Write-Progress -Activity "Processing" -Status "50% Complete" -PercentComplete 50
Use Format Files for Custom Objects
Define .format.ps1xml files instead of formatting inline.
Output Only One Type at a Time
Declare [OutputType()] and avoid mixing object types.
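Declaring the output type can be sketched as follows (the function name and properties are illustrative):

```powershell
function Get-Report {
    [CmdletBinding()]
    # Advertises the single type this function emits, for tooling and readers.
    [OutputType([pscustomobject])]
    param ()

    # Emit one object type only; no interleaved strings or formatted output.
    [pscustomobject]@{
        Name      = "Example"
        Generated = Get-Date
    }
}
```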
Error Handling Best Practices
Use -ErrorAction Stop with Cmdlets
Force terminating errors so they can be handled with try-catch.
try {
    Get-Item "C:\InvalidPath" -ErrorAction Stop
} catch {
    Write-Warning "Item not found."
}
Use $ErrorActionPreference for Non-Cmdlets
Temporarily set it to 'Stop' around risky operations.
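The usual pattern is save, set, and restore in a finally block so the changed preference cannot leak out of the risky section; a sketch (the command and path are illustrative stand-ins):

```powershell
# Save, set, and restore the preference around the risky section.
$oldPreference = $ErrorActionPreference
$ErrorActionPreference = "Stop"
try {
    # Some operation whose errors should terminate; Get-Content stands in here.
    Get-Content -Path "C:\Temp\missing.txt"
} catch {
    Write-Warning "Operation failed: $($_.Exception.Message)"
} finally {
    # Restore even if the operation threw.
    $ErrorActionPreference = $oldPreference
}
```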
Avoid Flags and $? for Error Handling
Use structured try-catch blocks instead.
Copy $Error[0] or $_ Immediately in catch
catch {
    $errorDetails = $_
    Write-Error "An error occurred: $($errorDetails.Exception.Message)"
}
Performance Optimization
PERF-01 Measure Performance When It Matters
Use Measure-Command to benchmark different approaches, especially with large datasets.
Measure-Command {
    foreach ($item in $data) { Process-Item $item }
}
PERF-02 Balance Performance and Readability
- For small datasets, prioritize readability.
- For large datasets, consider streaming and low-level .NET techniques if necessary.
Readable but less performant:
$content = Get-Content -Path file.txt
foreach ($line in $content) {
    Do-Something -Input $line
}
Streamlined for performance:
Get-Content -Path file.txt | ForEach-Object {
    Do-Something -Input $_
}
High-performance with .NET:
$sr = New-Object System.IO.StreamReader "file.txt"
try {
    while ($sr.Peek() -ge 0) {
        $line = $sr.ReadLine()
        Do-Something -Input $line
    }
} finally {
    $sr.Dispose()   # always release the file handle
}
PERF-03 Prefer Language Features Over Cmdlets for Speed
- Rough speed ordering: language constructs (foreach) > .NET methods > scripts > cmdlets/pipeline.
- Always measure first; don't optimize prematurely.
Security Best Practices
Always Use PSCredential for Credentials
Avoid plain-text passwords. Accept credentials as parameters using the [Credential()] attribute.
param (
    [System.Management.Automation.PSCredential]
    [System.Management.Automation.Credential()]
    $Credential
)
If you must pass a plain-text password to an API:
$Insecure.SetPassword($Credential.GetNetworkCredential().Password)
Use SecureString for Sensitive Data
Prompt securely and store encrypted values.
$Secure = Read-Host -Prompt "Enter Secure Data" -AsSecureString
Convert SecureString to plain text safely:
$BSTR = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($Secure)
$PlainText = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($BSTR)
[System.Runtime.InteropServices.Marshal]::ZeroFreeBSTR($BSTR)
Save Credentials Securely
Use Export-CliXml to store credentials. On Windows, the password is encrypted with DPAPI, so the file can be decrypted only by the same user on the same machine.
Get-Credential | Export-CliXml -Path C:\secure\cred.xml
$Credential = Import-CliXml -Path C:\secure\cred.xml
Save Encrypted Strings
ConvertFrom-SecureString -SecureString $Secure | Out-File -Path "${Env:AppData}\secure.bin"
$Secure = Get-Content -Path "${Env:AppData}\secure.bin" | ConvertTo-SecureString
Conclusion
By following these PowerShell best practices across design, documentation, output handling, error management, performance, and security, you can create robust, maintainable, and efficient scripts suitable for both small tasks and enterprise-level automation. Always strive to balance readability, performance, and security to deliver high-quality solutions.