How PowerShell Scripting Works

PowerShell scripts are sequences of commands that run automatically. Understanding how they execute, handle data, and manage flow is fundamental.

When you type commands in the PowerShell console, they execute immediately. In a script, you write multiple commands in a file, and they execute top-to-bottom when you run the file.

Key difference: In a script, you plan the entire workflow before execution. Commands wait for previous commands to finish.

Variables store information your script needs. Every variable starts with $.

Variables have types (integers, text, objects), but PowerShell figures this out automatically based on what you store.

$MachineName = hostname
$seconds = 60
$ResultFile = "MemoryResults.sql"

The variable lives in memory while your script runs. Once the script ends, it’s gone (unless you save it to a file).

When running a script, users can pass values as arguments:

.\MyScript.ps1 120 "MyOutputFile.txt"

Inside your script, access these with $args[0], $args[1], etc:

  • $args[0] = 120 (first argument)
  • $args[1] = "MyOutputFile.txt" (second argument)

You typically store these in variables and use the variables throughout your script.
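Putting these two ideas together, the top of a script might look like the following sketch (the variable names and default values are illustrative):

```powershell
# MyScript.ps1 - copy positional arguments into named variables
$seconds    = $args[0]   # first argument, e.g. 120
$ResultFile = $args[1]   # second argument, e.g. "MyOutputFile.txt"

# Fall back to defaults when an argument was not supplied
if ($null -eq $seconds)    { $seconds = 60 }
if ($null -eq $ResultFile) { $ResultFile = "MemoryResults.sql" }

Write-Host "Running for $seconds seconds, writing to $ResultFile"
```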

Your script needs to make choices (if this, then that). Control flow statements decide what runs and what doesn’t.

If statements run code only when a condition is true:

  • Is the file missing? Create it.
  • Did the user enter valid input? Continue. Otherwise, ask again.
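The first bullet might look like this in practice (a minimal sketch; the $ResultFile variable and the placeholder file content are assumptions):

```powershell
# Create the results file only when it is missing
if (-not (Test-Path -LiteralPath $ResultFile)) {
    Set-Content -Path $ResultFile -Value "-- results file created $(Get-Date)"
}
```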

Loops repeat code while a condition is true:

  • Keep checking memory until the time limit is reached.
  • Keep asking the user until they enter a valid choice.
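The first kind of loop can be sketched like this, assuming $seconds holds the time limit from earlier:

```powershell
# Keep checking free memory until the time limit is reached
$deadline = (Get-Date).AddSeconds($seconds)
while ((Get-Date) -lt $deadline) {
    $os = Get-CimInstance Win32_OperatingSystem
    Write-Host "Free memory: $($os.FreePhysicalMemory) KB"
    Start-Sleep -Seconds 5   # pause between checks
}
```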

Without control flow, scripts just execute line-by-line with no flexibility.

Your script shows results in different ways:

Write-Host outputs text to the console (for the user to see):

Write-Host "Free memory at 10:30: 8192 MB"

Add-Content appends data to a file:

Add-Content -Path $ResultFile -Value $NewLine

Set-Content overwrites a file:

Set-Content -Path $ResultFile -Value "DELETE FROM MemoryUsage WHERE ID > 0;"

File operations let you store results permanently. Console output lets you give feedback to the user.
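The two often work together: build one line of output, show it to the user, and append it to the file. A sketch (the SQL text is illustrative):

```powershell
# Build the line once, then send it to both destinations
$NewLine = "INSERT INTO MemoryUsage (CheckTime, FreeMB) VALUES ('$(Get-Date -Format HH:mm)', 8192);"
Write-Host $NewLine                             # feedback for the user
Add-Content -Path $ResultFile -Value $NewLine   # permanent record
```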

Control flow requires conditions. You compare values to decide what happens:

If ($answer -eq '1') { ... } # equals
If ($value -lt 10) { ... } # less than
If ($age -ge 18) { ... } # greater than or equal
If (Test-Path $ResultFile) { ... } # file exists?
If ($null -ne $var) { ... } # not null? ($null on the left is the safer idiom: it avoids surprises when $var is an array)

Conditions evaluate to true or false, which determines if the code block runs.

Functions group code that does one specific job. You write the function once, then call it multiple times.

function CheckInput($userAnswer) {
    If ($userAnswer -eq '1' -or $userAnswer -eq '2') {
        return $true
    }
    return $false
}

Functions reduce repetition and make scripts easier to maintain.
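Calling the function above could look like this sketch, assuming CheckInput is already defined:

```powershell
# Ask again until CheckInput accepts the answer
$answer = Read-Host "Enter 1 or 2"
while (-not (CheckInput $answer)) {
    $answer = Read-Host "Invalid choice. Please enter 1 or 2"
}
```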

Many PowerShell commands return objects (like data containers). Objects have properties (attributes) you access with a dot.

$FreeMemory = Get-CimInstance Win32_OperatingSystem | Select-Object FreePhysicalMemory

FreePhysicalMemory is a property. Access it with a dot: $FreeMemory.FreePhysicalMemory

Objects let you work with complex system data in a structured way.
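For example, you can read the property straight off the object and convert it to a friendlier unit (the KB-to-MB conversion is one way you might display it):

```powershell
$os = Get-CimInstance Win32_OperatingSystem
# FreePhysicalMemory is reported in kilobytes
$FreeMemoryMB = [math]::Round($os.FreePhysicalMemory / 1024)
Write-Host "Free memory: $FreeMemoryMB MB"
```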

Comments help you (and others) understand what code does. They start with #:

# This is a single-line comment
<# This is a multi-line comment
useful for longer explanations #>

Good comments explain why code exists, not just what it does.

Scripts encounter errors: files that don’t exist, invalid input, permission problems. Plan for these.

Check before acting:

If (Test-Path -LiteralPath $ResultFile) {
    # File exists, handle accordingly
}

Validate user input:

While (($answer -ne '1') -and ($answer -ne '2') -and ($answer -ne '9')) {
    $answer = Read-Host "Please enter 1, 2, or 9"
}

Scripts that handle errors gracefully feel professional and reliable.
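Beyond checking up front, PowerShell can also catch failures as they happen with try/catch (a technique not shown above; this sketch assumes $ResultFile and $NewLine already exist):

```powershell
try {
    # -ErrorAction Stop turns a non-terminating error into one the catch block sees
    Add-Content -Path $ResultFile -Value $NewLine -ErrorAction Stop
}
catch {
    Write-Host "Could not write to $ResultFile : $_"
}
```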

Your script executes line-by-line from top to bottom, except:

  • Control flow changes the order (if statements skip code, loops repeat code)
  • Functions are defined but don’t run until called
  • Loops circle back to earlier lines

Everything else happens in sequence. This predictability is why planning matters—you know exactly what happens when.

Variables consume memory while they exist. Large datasets (reading huge files) can slow your script. Design with efficiency in mind:

  • Reuse variables when possible
  • Process data in chunks instead of loading everything at once
  • Close file handles when done
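The second bullet can be sketched with the pipeline, which streams a file one line at a time instead of loading it whole (the file names are illustrative):

```powershell
# Streams line-by-line; memory use stays flat even for huge files
Get-Content -Path "big.log" | ForEach-Object {
    if ($_ -match "ERROR") {
        Add-Content -Path "errors.log" -Value $_
    }
}
```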

Most small scripts don’t worry about this, but it’s good to know.