Copying files from one server to another as a different user (two separate domains) using PowerShell

I’ve recently needed to copy a number of files from one client site to another; my issue is that they have separate Active Directory domains and there is no trust between them. Using PowerShell, we can save a user credential and then use it to map a network drive and perform our copy.

We will set up the credential to be stored in a text file. A handy feature of PowerShell is that the encrypted string is deliberately limited in scope: it can only be decrypted by the user who created it, on the same local machine (which is fine for our needs). The following cmdlet will prompt for a string to encrypt, which in this case is our password.

Read-Host -AsSecureString | ConvertFrom-SecureString | Out-File E:\Scripts\password.txt

Once done, we can build up our PowerShell script, which reads the file, maps a network drive using the secure credential and then copies across our files.

# Read the encrypted password and build the credential object
$password = Get-Content E:\Scripts\password.txt | ConvertTo-SecureString
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "domain\username",$password

# Map the remote share, clear out any old .bak files, copy the new ones across and clean up
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\10.15.2.5\Baseline$" -Credential $credentials
Remove-Item -Path Z:\* -Filter *.bak
Copy-Item -Path E:\Backup\* -Filter *.bak -Destination Z:\ -ErrorAction SilentlyContinue -ErrorVariable A
Remove-PSDrive Z
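
If you want the script to report anything that failed to copy, a minimal sketch using the $A error variable captured by Copy-Item above (the log path is just an example) could look like this;

# Write any copy errors out to a log file for review
if ($A) {
    $A | ForEach-Object { Add-Content -Path E:\Scripts\copy-errors.log -Value $_.Exception.Message }
}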

The script copies files, and is a quick and dirty way to get files from a server in one domain to another without any sort of trust.

Moving the SQL Server tempdb file location after SQL Server is installed

Working with one of my education customers, I recently had to perform some maintenance on their SQL database server: they were running low on disk space and had a free, unused virtual disk we could move their tempdb onto (it was meant to go there, but it wasn’t placed there during installation). So I had the task of moving it over.

The first step is to get an as-is picture of where tempdb currently lives and how many files it has; open up SQL Server Management Studio and run the following query.

-- Lists all current tempdb files and their paths
SELECT name, physical_name AS CurrentLocation 
FROM sys.master_files 
WHERE database_id = DB_ID(N'tempdb');
GO

Now that we have a listing, we need to build up a query to move the database files from one drive to another. It should look something like the query below, changing the paths and adding or removing tempdb files as needed (simply add another ALTER DATABASE statement for each file).

USE master;
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\tempdb.mdf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\templog.ldf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = temp2, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\tempdb_mssql_2.ndf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = temp3, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\tempdb_mssql_3.ndf');
GO

In the above example we have specified the F drive. The other thing to note is that if SQL Server runs under a service account, that account will need NTFS Modify permissions on the destination folder. Once the statements are executed under the master database context, the files will be re-created at the new location on the next SQL Server service restart.
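
To finish up, restart the SQL Server service and re-run the first query to confirm the new paths have taken effect. From an elevated PowerShell prompt, something like the below works (this assumes the default instance service name of MSSQLSERVER; adjust for a named instance);

# Restart the instance so tempdb is re-created at the new location
Restart-Service -Name 'MSSQLSERVER' -Force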

Deploy Firefox in the Enterprise with uBlock Origin, HTTPS Everywhere and Privacy Badger using Group Policy

So we’ve been deploying Firefox for quite a while, pretty much everywhere we can; however, only recently have we started standardising the deployments across the organisations that we manage. We’ve deployed internal Root CAs using the CCK2 method to improve our user experience with deep packet SSL inspection, but keeping configurations and extensions consistent was a challenge. Recently I learnt that Mozilla began developing Group Policy objects not long ago, which now allows for enterprise management straight into Firefox without the fuss. In this article I’ll go through setting up uBlock Origin, HTTPS Everywhere and Privacy Badger, which are our go-to extensions for end-user protection.

To deploy Firefox (or any Windows application), you generally want to use an MSI-based installer (for better control and management). Mozilla have been building them for a short while now via their Enterprise Deployment Support page for beta and standard releases. If you are after the Extended Support Release (ESR), visit the FrontMotion download page (they also offer a number of other services such as a custom packager). In a managed environment you’d deploy software either with Group Policy, with System Center Configuration Manager, or with some other form of MDM (a la Intune). The MSI should sit in a network share accessible by all machines; it would then be imported either into Policies > Software Settings > Software installation for Group Policy (then Right-Click, New > Package) or as an Application under the Software Library for Config Manager and pushed out. There are much better guides than what I can fit in here, so please Google if you’re unsure. In our case I used Config Manager, and since we’re upgrading I set up a supersedence rule.

Once Firefox is being deployed we need to get these extensions onto the machines, so the first part is getting the URLs of the extensions you wish to deploy. Visit the add-on store and start searching for what you’re after. For our example, as mentioned, we will be installing uBlock Origin, Privacy Badger and HTTPS Everywhere. When you are at the add-on page, right-click on the Add to Firefox button, select Copy Link Location and save that for later. Once you have your list of extensions it should look something similar to the below (I’ve removed the tracking string at the end).

https://addons.mozilla.org/firefox/downloads/file/1672871/ublock_origin-1.18.4-an+fx.xpi
https://addons.mozilla.org/firefox/downloads/file/1688114/privacy_badger-2019.2.19-an+fx.xpi
https://addons.mozilla.org/firefox/downloads/file/1669416/https_everywhere-2019.1.31-an+fx.xpi

Next is to get the Group Policy definitions from Mozilla and load them into your Active Directory Group Policy. I’d highly recommend you have a Group Policy Central Store set up, as it makes managing this stuff a whole lot easier. Download the latest version (or the one that matches your deployment of Firefox) from the Mozilla GitHub Releases page, unzip it and then copy those files across to the Group Policy Central Store or required location. Now the fun part.

Create a new Group Policy Object (in my case, Configure – Firefox), open it up and navigate to the following policy branch: Computer Configuration > Policies > Administrative Templates > Mozilla > Firefox > Extensions. Here we will be enabling Extensions to Install. Using the list we compiled earlier, enter the URLs one by one into the list.

The next step I’d recommend is to stop our end users from being able to remove these extensions/protections. To do this we need to get the extension IDs, so fire up Firefox and install the list of extensions we compiled earlier (if you didn’t already for testing). The easiest way I’ve found of getting the extension IDs is to use the in-built memory profiler of Firefox. In the address bar enter about:memory and, once it loads, click Measure under the Show memory reports group. Do a search for Extensions and you’ll get to a list of all currently running extensions. Now extract everything for the id key (in this case a GUID, but it can be text as well);

Extension(id={d634138d-c276-4fc8-924b-40a0ea21d284}, name="1Password X – Password Manager", baseURL=moz-extension://31872614-f67c-4cda-84e4-18c0515c8b48/

The above is an example of what you’ll find in the list (using 1Password). Below is what we’ll be entering into Prevent extensions from being disabled or removed based on our setup so far, with the last line belonging to Privacy Badger.

uBlock0@raymondhill.net
https-everywhere@eff.org
jid1-MnnxcxisBPnSXQ@jetpack

Now that we have the list of extension IDs, we want to enter these into the Prevent extensions from being disabled or removed Group Policy setting, located in the same branch as Extensions to Install. Again, enter them into the list one by one.
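
If you want to test these settings on a single machine without waiting on Active Directory, they can be seeded straight into the local registry. This is only a sketch, based on the registry layout Mozilla’s policy templates write to (one numbered string value per entry); run it elevated and adjust the values to your own list.

# Seed the Firefox extension policies locally for testing
$base = 'HKLM:\SOFTWARE\Policies\Mozilla\Firefox\Extensions'
New-Item -Path "$base\Install" -Force | Out-Null
New-ItemProperty -Path "$base\Install" -Name '1' -Value 'https://addons.mozilla.org/firefox/downloads/file/1672871/ublock_origin-1.18.4-an+fx.xpi' -Force | Out-Null
New-Item -Path "$base\Locked" -Force | Out-Null
New-ItemProperty -Path "$base\Locked" -Name '1' -Value 'uBlock0@raymondhill.net' -Force | Out-Null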

Once that is configured, apply the Group Policy Object to your test machines (preferably with Mozilla Firefox already installed), log in and do a gpupdate /force, with the end result being those extensions magically appearing.

Hope that helps.

Change Windows 10 Taskbar Icons: Deploying a custom taskbar for Windows 10

Over the summer holiday period, I was assisting a school with building out an SOE for the new year. One of the things we used to do with Windows 7 was tweak the taskbar to contain only the items we were after, instead of the default items of Internet Explorer, Windows Explorer and Windows Media Player. To do this we implemented a VBScript that would make the changes on the fly. Since that stopped working, I’ve always wanted to sit down and try to get it working again for Windows 10, and I’ll now share how I did.

You’ll want to begin by building out your taskbar layout: add in any icons that you want and, when you’re ready, we can use a command to extract it. Alternatively, you can use the sample further below to build your own manually; simply reference the location of the shortcuts you want to place, in the order you want them.
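
If you go the extraction route, the command Windows provides is Export-StartLayout; note that it captures the Start layout only, so the taskbar section still needs to be merged in by hand as per the sample below (the output path here is just an example);

# Export the current Start layout as a starting point for the XML
Export-StartLayout -Path C:\Temp\TaskbarLayout.xml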

<?xml version="1.0" encoding="utf-8"?>
<LayoutModificationTemplate
xmlns="http://schemas.microsoft.com/Start/2014/LayoutModification"
xmlns:defaultlayout="http://schemas.microsoft.com/Start/2014/FullDefaultLayout"
xmlns:start="http://schemas.microsoft.com/Start/2014/StartLayout"
xmlns:taskbar="http://schemas.microsoft.com/Start/2014/TaskbarLayout"
Version="1">
	<CustomTaskbarLayoutCollection PinListPlacement="Replace">
		<defaultlayout:TaskbarLayout>
			<taskbar:TaskbarPinList>
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Firefox.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%APPDATA%\Microsoft\Windows\Start Menu\Programs\System Tools\File Explorer.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Word.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Excel.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Outlook.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Microsoft SQL Server Tools 17\Microsoft SQL Server Management Studio 17.lnk" />
			</taskbar:TaskbarPinList>
		</defaultlayout:TaskbarLayout>
	</CustomTaskbarLayoutCollection>
</LayoutModificationTemplate>

Now that we have our layout template ready, it is simply a matter of importing it. We do so using the Import-StartLayout command. This can also be used to import a Start Menu tile layout; in this case, however, we are only importing the taskbar layout. Execute the command below (replacing the path with your own).

Import-StartLayout -LayoutPath \\MGS-DEPLOY-02\Scripts\TaskbarLayout.xml -MountPath C:\

Now you can execute this during an MDT task sequence or at any other time you want. Keep in mind that Import-StartLayout writes the layout into the default user profile, so it takes effect for users signing in to the machine for the first time afterwards.
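
As a sketch of the task-sequence approach (the share path comes from the example above; the local staging path is arbitrary);

# Stage the layout locally, then import it into the default profile
Copy-Item -Path '\\MGS-DEPLOY-02\Scripts\TaskbarLayout.xml' -Destination 'C:\Windows\Temp\TaskbarLayout.xml'
Import-StartLayout -LayoutPath 'C:\Windows\Temp\TaskbarLayout.xml' -MountPath 'C:\'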

For more info around this topic, visit the Microsoft Docs site. Hope that helps.

Using Stunnel to Allow Legacy Apps and Devices that do not support SSL POP3 or TLS SMTP to Connect to Office 365

I’ve been busy lately assisting with a number of Office 365 migrations. Every single one is different, and while many are straightforward, in some cases you will find applications or devices that don’t support the requirements for connecting to Office 365 using TLS or SSL, or that may not even work over standard ports such as 587. Working with one SMB recently, they had a critical line-of-business application that was written internally and can no longer be maintained by anyone in-house. They had identified a path forward; however, we still needed to keep the app running for around six months post-migration to Office 365.

This is where stunnel, a TLS proxy, comes in handy. Grab the latest version from the stunnel website and install it. This little TLS/SSL proxy tool allows us to listen on standard, unencrypted ports locally and forward that traffic over TLS. For our purposes we will install the service instance so that it is always running whenever the server reboots. Once installed we can start building our configuration file. I’ve outlined a simple one below;

#Basic Configuration for Microsoft Office 365 POP3 and SMTP 
output = stunnel-log.txt 
debug=4 
taskbar=yes
 
[POP3 Incoming] 
client = yes 
accept = 127.0.0.1:110
verifyChain = yes
CAfile = ca-certs.pem 
connect = outlook.office365.com:995 

[SMTP Outgoing] 
client = yes 
protocol = smtp 
accept = 127.0.0.1:25 
verifyChain = yes
CAfile = ca-certs.pem
connect = smtp.office365.com:587

This allows any application local on the same server as stunnel to connect to SMTP and POP3 on the standard ports, with stunnel pushing the connection on to Office 365. We’re also pushing everything to a log file. If you have issues with certificates, remove the verifyChain and CAfile lines, which will stop stunnel from attempting to verify the certificate we receive from Office 365. If you are looking at doing IMAP, or need to do more with stunnel, see the example configuration files for more.
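
Once the service is running, a quick sanity check from the same box (using PowerShell’s Test-NetConnection) will confirm the plain-text listeners are up;

# Check that stunnel is accepting connections on the standard ports
Test-NetConnection -ComputerName 127.0.0.1 -Port 110
Test-NetConnection -ComputerName 127.0.0.1 -Port 25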

Veeam File to Tape Job and how to back up files directly from a Network Share

A customer of ours has a large archive of files located on a NAS device (around 15 TB worth) that they want to simply push off to tape and then remove from the NAS. Network drives don’t show up in Veeam whilst creating backup jobs, so we needed a way to get this working. We use Veeam with a number of customers and understand quite well how it operates; whenever Veeam works with a local file system, it does so under the Local System context, which is where we need our mapped drive to live. But how?

PsExec from Sysinternals to the rescue. We can create a command prompt in the context of Local System and then map our network drive for it to appear in Veeam’s job wizard. Grab the latest version and place it in the System32 directory under the Windows folder. Now we open an administrative command prompt and enter;

psexec -s -i cmd.exe
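
If everything worked, the new prompt is running as the built-in system account, which whoami will confirm;

C:\Windows\system32>whoami
nt authority\system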

Now that we have a command prompt under the Local System context, our next step is simply to map the network drive using a NET USE command. We need to ensure we pass through some credentials, as Local System will not have access to the share; since we haven’t supplied a password on the command line, NET USE will prompt for it.

NET USE G: \\NAS\Archive /USER:Domain\User

Now we can go into Veeam and run the Create File to Tape wizard; during the Files and Folders selection we can now see our mapped network drive (in this case G:\, or Archive) and add folders from it. We then proceeded to back up the data we needed onto tape for archive.

Hope that helps.

How to Configure the Management IP of a Palo Alto Firewall through a console connection

So I’ve recently started experimenting with a Palo Alto VM firewall that we are about to trial. Unfortunately they don’t offer a Hyper-V virtual machine, so I’ve had to stick this onto our dev ESXi host.

After importing the .ovf, I edited the network adapters onto the right VLANs to get it going in a one-arm sniffer configuration, then proceeded to power it up. Once it had loaded, I entered the default username and password (which are admin/admin) and landed at the CLI of the device. From there I dropped into configuration mode and set the management IP, netmask, default gateway and DNS server;

configure
set deviceconfig system ip-address 192.168.1.99 netmask 255.255.255.0 default-gateway 192.168.1.1
set deviceconfig system dns-setting servers primary 172.16.1.5

I then entered commit for the PA to save the configuration I had just entered.

I performed a ping for good measure to ensure the unit could communicate with the outside world for updates with PAN and other services if required (ping host fortinet.com), and then logged into the web interface using the default credentials.

Delete Windows.old from an upgraded Windows Server install running Server Core

I was at a customer site where they had a single Hyper-V host (running the free Hyper-V Server edition) and had done an in-place upgrade. Microsoft generally recommends you always do fresh installations and migrate (except for Configuration Manager servers, where upgrading Windows versions is a supported configuration). They were starting to run low on disk space on the C drive, so I’ve outlined below the process for removing the windows.old directory. You can get anywhere from 6 GB to 15 GB back by removing the windows.old folder, which is where everything Windows-based is moved if you upgrade your Windows Server.

Download the Sysinternals Junction utility, which we will use to find and delete any directory symbolic links (or NTFS junctions) that may still exist in the directory structure. Expand the zip file, then create a PowerShell file with the following code and save it under C:\temp (which is where we will work from).

# Walk the junction report and delete each junction/symbolic link found
foreach ($line in [System.IO.File]::ReadLines("c:\temp\junctions.txt"))
{
    # Entries for junction points start with a backslash-prefixed path
    if ($line -match "^\\")
    {
        # Strip the type suffix so we're left with just the path
        $file = $line -replace "(: JUNCTION)|(: SYMBOLIC LINK)",""
        & c:\temp\junction64.exe -d "$file"
    }
}

The above code iterates through the junction list, which we extract with the below command (run it from c:\temp so the output lands next to the script). On a majority of systems this should actually come back empty, indicating that the Windows upgrade has gone smoothly.

junction64 -s C:\Windows.old > junctions.txt

We then execute the PowerShell file we saved earlier against the text file we just created with the Junction utility. Once that is done we can begin to clean up. Firstly, take ownership by issuing;

takeown /F c:\Windows.old\* /R /A /D Y

You may find that is all you need and can issue the rmdir; otherwise, run this additional command.

cacls c:\Windows.old\*.* /T /grant administrators:F

So after all that I was easily able to reclaim a whole bunch of disk space by issuing the following command.

rmdir /S /Q c:\Windows.old

If only Microsoft kept Disk Cleanup on Windows Server to make life easier.

Renaming a Hyper-V Failover Cluster

If you find yourself taking over a cluster with a name that is silly or doesn’t make sense, you can rename it without much issue. The main thing to watch out for is backup software that targets the cluster (such as Veeam or DPM); you just need to ensure it is reconfigured to use the new cluster name. Also, if you happen to have VMM managing the cluster, make sure you remove it from VMM before doing the rename and then add it back in.

To do this, simply open up Failover Cluster Manager, right-click on the cluster and click Properties; now you can enter a new name. Once done, it prompts you to restart each node in the cluster, and you should do that sooner rather than later to prevent issues.

Hope that helps.

How to Reset a Domain Controller’s Domain Admin password for a Virtual Machine running up in Azure

The Reset password utility for Virtual Machines has come in handy on the odd occasion when we never recorded, or have misplaced, the password for a VM running in Azure. The downside is that this tool does not support running against Domain Controllers (to reset the in-built Administrator account). So what happens when you have a domain controller that only has a single Domain Admin account and you’ve forgotten the password? In come Virtual Machine extensions to the rescue. Firstly, open up Notepad and enter a net user password-reset command like the one below, replacing the username and password with the ones you want. Save it as script.ps1

net user <Username> <Password>

Log into the Azure Portal and select the Virtual Machine you want to change the domain password for; under the main menu blade for that Virtual Machine, find Extensions and enter it. We now want to add a new extension, so click on the +Add button at the top, then in the Add Extension blade find and select Custom Script Extension and click Create.

This now allows us to upload the script.ps1 we created earlier, so browse to it and hit Upload. This will trigger the script to run in the Virtual Machine, and we’ll get notified when the extension has been created and the script has run.
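
If you’d rather skip the portal, the same thing can be done with the Az PowerShell module. This is only a sketch: the resource group, VM name, location, extension name and storage URL below are all placeholders for your own values, and script.ps1 needs to be reachable at the URL you supply.

# Run script.ps1 inside the VM via the Custom Script Extension
Set-AzVMCustomScriptExtension -ResourceGroupName 'MyResourceGroup' -VMName 'MyDC01' `
    -Location 'australiaeast' -Name 'ResetDomainAdmin' `
    -FileUri 'https://mystorage.blob.core.windows.net/scripts/script.ps1' -Run 'script.ps1'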