Fixing Maximum connections reached by Clearing Connected Sessions on an APC UPS

I was trying to log into an APC UPS with the correct login but still received an error: The maximum number of web connections has been reached, or simply Maximum connections reached. Knowing I had the right login credentials, and that no one else was logged in, I was a little perplexed. There is a straightforward fix, but it can be a little annoying.

Open up your favorite SSH client and connect via SSH to the APC or Schneider Electric network device. Log in using the configured username and password, or the defaults of apc and apc. Once logged into the console, we'll issue some commands to list the current sessions and then use the -d switch to "disconnect" a few. I'll point out that the last session listed (shown as Telnet) is usually you, so don't disconnect it.
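If you're on a recent Windows 10 or Server build, the built-in OpenSSH client is enough; the address below is a hypothetical one for your network management card, so swap in your own.

ssh apc@10.0.0.10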

session
session -d 153

The commands above are based on the image; simply change the session number to suit. Once you have cleared the stale sessions, you should be able to log in to the web interface without issues.

Automate Let's Encrypt Renewals using Certify The Web on Windows with Atlassian Jira behind an Apache Reverse Proxy

So, it's been a while since my last post. I've recently been pushing our machines into Azure as well as automating as much as possible. We've got an internal Jira instance that we use, and it is still running entirely on a VM with no fancy Azure PaaS features.

Certify The Web task breakdown

I have a Let's Encrypt SSL certificate managed using Certify The Web. I am running the free and awesome Community Edition and have added a number of tasks to deploy the certificate to the Apache reverse proxy (we run other apps on the box) as well as into the Java keystore (since we use the installer/bundled JRE that comes with Jira). Deploying to Apache is an in-built task and is easily added (as per the screenshot), but how about adding the certificate to the keystore of the Java Runtime Environment that is bundled with Jira? The answer is a quick batch file that first deletes the existing certificate alias (you cannot replace one in place) and then imports our new certificate, passing the store password and suppressing the Trust this certificate prompt.

I came up with the following quick and dirty batch file that will update the certificate in the JRE keystore (assuming all default paths and credentials). Simply save it to a path (e.g. C:\Scripts), add an Export Certificate deployment task, and then add a Run… deployment task pointing to the batch file below.

CD C:\Atlassian\JIRA\jre\bin
REM Remove the existing alias first - keytool cannot overwrite a certificate in place
keytool -delete -alias JiraLE -keystore ../lib/security/cacerts -noprompt -storepass changeit
REM Import the freshly exported certificate, suppressing the trust prompt (changeit is the default store password)
keytool -importcert -noprompt -trustcacerts -alias JiraLE -file jira-le.cer -keystore ../lib/security/cacerts -storepass changeit
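If you want to confirm the import worked, listing the alias back out of the keystore is a quick sanity check (a sketch assuming the same default paths and the default changeit password):

CD C:\Atlassian\JIRA\jre\bin
keytool -list -alias JiraLE -keystore ../lib/security/cacerts -storepass changeit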

And there you have it: no issues or errors while working with Jira every time your certificate renews.

Queries for troubleshooting the Database Mail (dbmail) function of Microsoft SQL Server

So, just a quick one today. I was recently working on a SQL Server, running through some Database Mail setup and testing (see Microsoft Docs) with one of our applications. I needed a way to see which e-mails were being sent out, as well as which weren't. The queries below will give you the info I was after. The first one shows any items that have run through Database Mail, and their details, for the last day (you can customise the WHERE clause to your needs). You will want to run them against the msdb database; do this by selecting it or issuing a USE msdb statement.

SELECT p.name, i.send_request_date, i.sent_date, i.recipients, i.subject, i.body
FROM sysmail_mailitems AS i INNER JOIN sysmail_profile AS p ON p.profile_id = i.profile_id
WHERE sent_date > DATEADD(DAY, -1, GETDATE())

Bear in mind that I'm using aliases to shorten the query a little (see this article). This next one shows failed items, as well as any error responses from the mail server.

SELECT i.subject, i.recipients, i.copy_recipients, i.blind_copy_recipients, i.last_mod_date, l.description
FROM sysmail_faileditems AS i LEFT OUTER JOIN sysmail_event_log AS l ON i.mailitem_id = l.mailitem_id
WHERE (i.last_mod_date > DATEADD(DAY, -1, GETDATE()))
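If you'd rather run these checks from PowerShell than Management Studio, a rough sketch using Invoke-Sqlcmd from the SqlServer module (assuming a local default instance) can summarise the last day's send status via the sysmail_allitems view:

# Assumes the SqlServer module is installed (Install-Module SqlServer)
Invoke-Sqlcmd -ServerInstance "localhost" -Database "msdb" -Query @"
SELECT sent_status, COUNT(*) AS items
FROM sysmail_allitems
WHERE send_request_date > DATEADD(DAY, -1, GETDATE())
GROUP BY sent_status;
"@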

Hope that helps.

Get the last Reboot or Shutdown reason and user from the Windows Event Log

Start by going into Event Viewer (Windows+R or the Start Menu and type eventvwr.msc). Navigate to the System log under Windows Logs, then use Filter Current Log so we only show events with certain attributes (such as Source or Event IDs).

In our case, we want to filter on Event Source: USER32, and for Event IDs we want to see only 1074. If you are after unexpected shutdowns, use 6008 instead. Once that's in (like the pic on the right), click OK and this will filter the event log based on our requirements. You can then scroll through and see what and who initiated each shutdown.
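If you'd rather pull the same events from PowerShell (handy on a remote box), a quick sketch with Get-WinEvent and a filter hashtable looks like this:

# Last 20 planned shutdown/restart events (ID 1074) from the System log, with who/what initiated them
Get-WinEvent -FilterHashtable @{ LogName = 'System'; ProviderName = 'User32'; Id = 1074 } -MaxEvents 20 |
    Format-Table TimeCreated, Message -Wrap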

Enabling BitLocker with Group Policy and backing up Existing BitLocker recovery keys to Active Directory

So, getting BitLocker enabled in an Active Directory environment is fairly painless and helps make your end-user devices more secure. I'll outline the steps you need to take to enable it, as well as to get the recovery keys stored in Active Directory. I'll also dive into replicating this setup on Azure AD/Intune in a future post.

The first thing is to create a new GPO (e.g. Configure – BitLocker), edit it, and navigate to Computer Configuration > Policies > Administrative Templates > Windows Components > BitLocker Drive Encryption. Enable the following options:

  • Choose drive encryption method and cipher strength (Windows 10 Version 1511 and later)
  • Choose drive encryption method and cipher strength (Server 2012, Win 8.1 etc…)
  • Choose how users can recover BitLocker protected drives
  • Store BitLocker recovery information in Active Directory Domain Services

Then go down one folder into Operating System Drives and enable the following:

  • Choose how BitLocker protected operating system drives can be recovered

Once you’ve set this all up, it should look something similar to the image below.

Now target the GPO at some machines. If they're running 1809 or later (from what I've discovered so far), you'll notice them start the BitLocker encryption process automatically. If not, you may need to check and ensure the TPM is enabled on the device (as we haven't allowed encrypting devices without a TPM in this case).

What happens if you have already enabled BitLocker but now want to store the recovery keys in Active Directory? With this GPO set, Windows is allowed to write the recovery key to AD; however, for machines that are already encrypted we need to use manage-bde, a command-line utility for configuring BitLocker.

REM List the protectors on C: so you can see the numerical recovery password ID
manage-bde -protectors -get c:
REM Pull the protector ID out of the output (skip=4 assumes the default output layout)
for /f "skip=4 tokens=2 delims=:" %%g in ('"manage-bde -protectors -get c:"') do set MyKey=%%g
echo %MyKey%
REM Back the recovery password up into Active Directory
manage-bde -protectors -adbackup c: -id%MyKey%

I saved that as a batch file and ran it on the machines that had already been encrypted prior to rolling out the GPO. Once run, it escrows the key into Active Directory.
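If you prefer PowerShell over batch, a rough equivalent using the built-in BitLocker module (Windows 8/Server 2012 and later) would look something like the below; treat it as a sketch rather than the exact script I used.

# Find the recovery password protector on C: and back it up to Active Directory
$protector = (Get-BitLockerVolume -MountPoint "C:").KeyProtector |
    Where-Object KeyProtectorType -eq "RecoveryPassword"
Backup-BitLockerKeyProtector -MountPoint "C:" -KeyProtectorId $protector.KeyProtectorId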

The last bit: so you can actually see the keys in the computer object's Properties or via the Search function in Active Directory Users and Computers, ensure that the BitLocker RSAT feature is enabled in Server Features and Roles.
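You can also pull the escrowed keys back out with PowerShell instead of the GUI; a sketch with the ActiveDirectory module is below (PC-NAME is a hypothetical computer name).

# Read the msFVE-RecoveryInformation objects stored under the computer account
$computer = Get-ADComputer "PC-NAME"
Get-ADObject -SearchBase $computer.DistinguishedName `
    -Filter 'objectClass -eq "msFVE-RecoveryInformation"' `
    -Properties 'msFVE-RecoveryPassword' |
    Select-Object Name, 'msFVE-RecoveryPassword'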

Windows 10 May 2019 or 1903 Software Update Management Changes for WSUS and Config Manager

We’ve started to deploy the latest release of Windows 10, and it’s interesting to note that Microsoft have released, with little fanfare, some changes to the way updates are deployed for the 1903 release.

Microsoft are now pushing updates through what is called the Unified Update Platform (see this RPC Mag article). Anyway, the main thing is there is now a new product category for WSUS and Config Manager that needs to be configured before your clients will begin to receive updates.

You’ll see there is now a Windows 10, version 1903 and later product – make sure that is ticked in your update management tool for updates to be synchronised. Once you have that ticked, for Config Manager you may need to tweak your Automatic Deployment Rules to include additional filters, depending on how you have them set up. Microsoft have also blogged about these changes here.
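On a WSUS server you can also tick the new category from PowerShell using the UpdateServices module; a minimal sketch (run on the WSUS box itself) would be:

# Enable the new product category, then kick off a synchronisation
Get-WsusProduct -TitleIncludes "Windows 10, version 1903 and later" | Set-WsusProduct
(Get-WsusServer).GetSubscription().StartSynchronization()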

Copying files from one server to another as a different user (two separate domains) using PowerShell

I’ve been working on a job that needed to copy a number of files from one client site to another; my issue is that they have separate Active Directory domains and there is no trust between them. Using PowerShell, we can save a user credential and then use it to map a network drive and perform our copy.

We will set up the credential to be stored in a text file. While this is a cool feature of PowerShell, it is quite limited in that the string can only be decrypted by the user who created it, on the same local machine (which is fine for our needs). The following command will prompt for a string to encrypt, which in this case is our password.

Read-Host -AsSecureString | ConvertFrom-SecureString | Out-File E:\Scripts\password.txt

Once that's done, we will build up our PowerShell script, which will read the file, map a network drive using the stored credentials, and then copy across our files.

# Rebuild the credential from the encrypted password file
$password = Get-Content E:\Scripts\password.txt | ConvertTo-SecureString
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "domain\username", $password

# Map the remote share, clear out the old backups and copy the new ones across
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\10.15.2.5\Baseline$" -Credential $credentials
Remove-Item -Path Z:\* -Filter *.bak
Copy-Item -Path E:\Backup\* -Filter *.bak -Destination Z:\ -ErrorAction SilentlyContinue -ErrorVariable A
Remove-PSDrive -Name Z

The script copies files, and is a quick and dirty way to get files from a server in one domain to another without any sort of trust.

Moving the SQL Server tempdb file location after SQL Server is installed

Working with one of my education customers, I recently had to perform some maintenance on their SQL database server: they were running low on disk space AND had a free, unused virtual disk we could throw their tempdb onto (it was meant to go there, but it wasn't placed there during installation). So I had the task of moving it over.

The first step is to get an as-is picture of where tempdb currently sits and how many files it is split across; open up SQL Server Management Studio and run the following query.

-- Lists all current tempdb files and their paths
SELECT name, physical_name AS CurrentLocation 
FROM sys.master_files 
WHERE database_id = DB_ID (N'tempdb');
GO

Now that we have a listing (similar to our screenshot above/left), we need to build up a query to move the database files from one drive to another. It should look something like the query below, changing paths and adding/removing tempdb files as needed (simply add another ALTER DATABASE statement for each file).

USE master;
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\tempdb.mdf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\templog.ldf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = temp2, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\tempdb_mssql_2.ndf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = temp3, FILENAME = 'F:\MSSQL14.MSSQLSERVER\MSSQL\DATA\tempdb_mssql_3.ndf');
GO

In the above example we have specified the F: drive. The other thing to note is that if SQL Server runs under a service account, that account will need NTFS Modify permissions on the destination folder. Once the statements are executed under the master database context, the files will be re-created in the new location on the next SQL Server service restart.
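The restart itself can be done from the Services console or with PowerShell; a quick sketch assuming the default instance name (plan for downtime, as this stops SQL Server briefly):

# Restart the default SQL Server instance so tempdb is re-created in the new location
Restart-Service -Name "MSSQLSERVER" -Force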

Deploy Firefox in the Enterprise with uBlock Origin, HTTPS Everywhere and Privacy Badger using Group Policy

So, we’ve been deploying Firefox for quite a while pretty much everywhere we can; however, only recently have we started standardising the deployments across the organisations that we manage. We’ve deployed internal root CAs using the CCK2 method to improve our user experience with deep packet SSL inspection, but setting up configurations and extensions, and keeping them consistent, was a challenge. Recently I learnt that Mozilla began developing Group Policy support not long ago, and that now allows for enterprise management of Firefox without the fuss. In this article I’ll go through setting up uBlock Origin, HTTPS Everywhere and Privacy Badger, which are our go-to extensions for end-user protection.

To deploy Firefox (or any Windows application), you generally want to use an MSI-based installer (for better control and management). Mozilla have now been building them for a short while via their Enterprise Deployment Support page for beta and standard releases. If you are after the Extended Support Release (ESR), visit the FrontMotion download page (they also offer a number of other services such as a custom packager). In a managed environment you’d use either Group Policy, System Center Configuration Manager or some other form of MDM (a la Intune) to deploy software. The MSI should be on a network share accessible by all machines; it would then be imported either into Policies > Software Settings > Software installation for Group Policy (then Right-Click, New > Package) or as an Application under the Software Library for Config Manager and pushed out. There are much better guides than what I can fit in here, so please Google if you’re unsure. In our case I used Config Manager, and since we’re upgrading I’ve set up a supersedence rule like the one below.

Once Firefox is being deployed, we need to get these extensions onto the machines, so the first part is getting the URLs of the extensions you wish to deploy. Visit the add-on store and start searching for what you’re after; for our example, as mentioned, we will be installing uBlock Origin, Privacy Badger and HTTPS Everywhere. When you are at the add-on page, right-click on the Add to Firefox button, select Copy Link Location and save that for later. Once you have your list of extensions it should look something similar to the below (I’ve removed the tracking string at the end).

https://addons.mozilla.org/firefox/downloads/file/1672871/ublock_origin-1.18.4-an+fx.xpi
https://addons.mozilla.org/firefox/downloads/file/1688114/privacy_badger-2019.2.19-an+fx.xpi
https://addons.mozilla.org/firefox/downloads/file/1669416/https_everywhere-2019.1.31-an+fx.xpi

Next is to get the Group Policy Definitions from Mozilla and load them onto your Active Directory Group Policy. I’d highly recommend you have a Group Policy Central Store setup as it makes managing this stuff a whole lot easier. Download the latest version or ones that match your deployment of Firefox from the Mozilla GitHub Releases page, unzip and then copy across those files to the Group Policy Central Store or required location. Now the fun part.

Create a new Group Policy Object, in my case Configure – Firefox, then open it up and navigate to the following policy branch: Computer Configuration > Policies > Administrative Templates > Mozilla > Firefox > Extensions. Here we will be enabling Extensions to Install. Using the list we compiled earlier, enter the URLs one by one into the list so it looks something similar to the below.

The next step I’d recommend is preventing our end users from being able to remove these extensions/protections. To do this we need the Extension IDs, so fire up Firefox and install the list of extensions we compiled earlier (if you didn’t already for testing). Now, the easiest way I’ve found of getting the Extension IDs is to use the in-built memory profiler of Firefox. In the address bar enter about:memory and, once it loads, click Measure under the Show memory reports group. Do a search for Extensions and you’ll get to a list of all currently running extensions. Now extract everything for the id key (in this case a GUID, but it can be text as well):

Extension(id={d634138d-c276-4fc8-924b-40a0ea21d284}, name="1Password X – Password Manager", baseURL=moz-extension://31872614-f67c-4cda-84e4-18c0515c8b48/

The above is an example of what you’ll find in the list (using 1Password). Below is what we’ll be entering into the Prevent Extensions from being disabled or removed based on our setup so far – with the last line belonging to Privacy Badger.

Now that we have the list of Extension IDs, we want to enter these into the Prevent extensions from being disabled or removed Group Policy setting located in the same branch as Extensions to Install. Again, enter them into the list one by one until you have something similar to the below.

Once that is configured, apply the Group Policy Object to your test machines (preferably with Mozilla Firefox installed), log in and do a gpupdate /force, with the end result being those extensions magically appearing as per the image below.
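If the extensions don't show up, it can be worth checking on the test machine that the policy values actually arrived; a quick sketch, assuming the ADMX templates write to the usual HKLM:\SOFTWARE\Policies\Mozilla\Firefox location:

# List the add-on URLs Group Policy has told Firefox to install
Get-ItemProperty "HKLM:\SOFTWARE\Policies\Mozilla\Firefox\Extensions\Install"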

Hope that helps.

Change Windows 10 Taskbar Icons Script: Deploying a custom taskbar for Windows 10

Over the summer holiday period, I was assisting a school with building out an SOE for the new year. One of the things we used to do with Windows 7 was tweak the taskbar to contain only the items we were after, instead of the default items of Internet Explorer, Windows Explorer and Windows Media Player. To do this we implemented a VBScript that would make the changes on the fly. Since that stopped working, I've always wanted to sit down and try to get it working again for Windows 10, and I'll now share how I did it.

You’ll want to begin by building out your taskbar layout: add in any icons that you want and, when you’re ready, we can use a command to extract it. Alternatively, you can use the sample below to build your own manually; simply reference the location of the shortcuts you want to pin, in the order you want them.
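If you do want to start from an existing machine, Export-StartLayout will dump the current layout XML as a starting point; a sketch is below, noting that on Windows 10 it exports the Start layout, so the taskbar element shown in the sample still needs to be added by hand.

# Export the current layout to XML as a base to edit (path is just an example)
Export-StartLayout -Path "C:\Scripts\TaskbarLayout.xml"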

<?xml version="1.0" encoding="utf-8"?>
<LayoutModificationTemplate
xmlns="http://schemas.microsoft.com/Start/2014/LayoutModification"
xmlns:defaultlayout="http://schemas.microsoft.com/Start/2014/FullDefaultLayout"
xmlns:start="http://schemas.microsoft.com/Start/2014/StartLayout"
xmlns:taskbar="http://schemas.microsoft.com/Start/2014/TaskbarLayout"
Version="1">
	<CustomTaskbarLayoutCollection PinListPlacement="Replace">
		<defaultlayout:TaskbarLayout>
			<taskbar:TaskbarPinList>
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Firefox.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%APPDATA%\Microsoft\Windows\Start Menu\Programs\System Tools\File Explorer.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Word.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Excel.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Outlook.lnk" />
				<taskbar:DesktopApp DesktopApplicationLinkPath="%ALLUSERSPROFILE%\Microsoft\Windows\Start Menu\Programs\Microsoft SQL Server Tools 17\Microsoft SQL Server Management Studio 17.lnk" />
			</taskbar:TaskbarPinList>
		</defaultlayout:TaskbarLayout>
	</CustomTaskbarLayoutCollection>
</LayoutModificationTemplate>

Now that we have our Layout template ready, it is simply a matter of importing it. We do so by using the Import-StartLayout command. This can also be used to import a Start Menu tile layout however in this case we are only importing the Taskbar layout. Execute the command below (replacing the path with your own).

Import-StartLayout -LayoutPath \\MGS-DEPLOY-02\Scripts\TaskbarLayout.xml -MountPath C:\

Now you can either execute this on a user login, during an MDT Task Sequence or at any other time you want.

For more info around this topic, visit the Microsoft Docs site. Hope that helps.