

  • Delegation of DNS with PowerShell

DNS Delegation

DNSAdmins is a default security group in Active Directory that delegates administrative control over DNS zones and some DNS server settings to a specific user account or group. Members of this group have permission to manage DNS zones and records and to configure DNS server settings, including forwarders. However, it may not be desirable to delegate the full DNSAdmins permission set to a user, and a more targeted approach of delegating zone management or zone creation may be necessary.

The script (here) creates the required groups to delegate DNS server management, the ability to create and delete zones, and finally zone management. Group names contain either DNSServer or DNSZone; where 'MicrosoftDNS' is used, the group defines a top-level permission. The AD groups follow the suggested Microsoft naming convention of 'AT', or Action Task. Here are a few examples:

AT_DNSServer_MicrosoftDNS_Manage - the ability to change settings for the DNS server, e.g. create forwarders or configure scavenging.
AT_DNSZone_MicrosoftDNS_Manage - the ability to create and delete zones, but not change any DNS server settings.
AT_DNSZone_Microsoft.com_Manage - the ability to manage the Microsoft.com DNS zone.

Note: the DNSAdmins group on its own does not have enough permissions; it also requires Server Operators, Administrators for the domain or Domain Admins - essentially, local administrative rights over Domain Controllers.

Setup

The setup is pretty straightforward: a virtual Domain Controller and a member server, plus an OU for the delegated groups with a pre-existing group named AT_Server_User. This group provides login to the member server via a user account, with the Remote Desktop Users User Rights Assignment and the delegated DNS group(s). Update the member server OU GPO with the following changes:

Create 'Restricted Groups' for Administrators and add AT_Server_Admin.
Create 'Restricted Groups' for Remote Desktop Users and add AT_Server_User.
Add both Remote Desktop Users and AT_Server_User to the 'Allow log on through Remote Desktop Services' User Rights Assignment.

Create a user account and add it to the AT_Server_User group, then deploy the DNS delegation script (here) with Domain Admin rights on the Domain Controller. After executing the script, the delegation OU should be similar to the picture below, with groups for both forward and reverse zones and two default MicrosoftDNS groups.

DNS Server Delegation

Members of AT_DNSServer_MicrosoftDNS_Manage are able to connect to DNS and manage server settings, but not create, delete or manage any existing zone. Because of the requirement for administrative rights on Domain Controllers, not all settings can be managed: interface options, DNSSEC and trust points require further rights, although most other DNS configuration options are available. All DNS delegation groups require a minimum of READ to connect via the DNS snap-in. DNS server permissions can be found under System, MicrosoftDNS in dsa.msc.

DNS Zone Creation and Deletion

To delegate creating and deleting zones, open adsiedit and connect to 'dc=domaindnszones,dc=fqdn'. Full control for AT_DNSZone_MicrosoftDNS_Manage is set against CN=MicrosoftDNS without inheritance.

DNS Zone Management

Finally, each zone is delegated to a named DNS zone group. Use adsiedit and connect to the 'default naming context' to browse to each zone and interrogate permissions.
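To illustrate the shape of the delegation, here is a minimal sketch only - the OU path, the CONTOSO domain and the use of dsacls are illustrative assumptions, not the script's actual code:

```powershell
# Sketch: create a zone-management group and grant it full control of one zone.
# 'OU=Delegation,DC=contoso,DC=com' and CONTOSO are placeholders for your domain.
Import-Module ActiveDirectory

$zone  = 'Microsoft.com'
$group = "AT_DNSZone_$($zone)_Manage"
New-ADGroup -Name $group -GroupScope Global -Path 'OU=Delegation,DC=contoso,DC=com'

# The zone object lives in the DomainDnsZones partition under CN=MicrosoftDNS
$zoneDN = "DC=$zone,CN=MicrosoftDNS,DC=DomainDnsZones,DC=contoso,DC=com"
dsacls.exe $zoneDN /G "CONTOSO\$($group):GA"   # GA = generic all (full control)
```

The real script sets the ACL without inheritance and also creates the reverse-zone groups; this fragment only shows the group-plus-ACL pattern.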

  • RGB Office Transformation, from Drab to Fab

Finding ways to make the home office more appealing and enjoyable is a constant. I've tinkered with RGB lighting and had some success over the years, but without a vision it was incomplete; there's still room for improvement and, let's be honest, you can't have enough RGB. I've taken inspiration from Cyberpunk as well as watching plenty of YouTube videos on office setups for gamers. I'm also a big fan of Japanese anime and Altered Carbon season 1 (let's not mention season 2, which dumbed down, mangled and then discarded the best bits of books 2 and 3). I'd like to take you on a tour of the transformation from an uninspiring (boring) office, as demonstrated by the picture below, into a visually appealing office with the simple use of RGB lights and some work-focused tech upgrades.

The Lighting:

Govee Glide Hexa Light Panels * 20 Panels - £380
Govee Alexa LED Strip Lights 10m - £31
LED Aluminum Profile U Shape 6 Pack 1M - £22
FEAHRZEUG Smart LED Ambient Light Bars - £35
KSIPZE 30m LED Strip Lights RGB - £21

The Govee Hexa panels support 12 connected panels via one power supply, or all 20 panels linked with both power leads attached. For me, the latter would have resulted in leads trailing down the wall, so I opted for 2 separate 10-panel configurations. The overall connectivity had its limitations: each panel has just one input and one output, which restricts the scope for creating more intricate patterns. The cheaper and less LED-dense KSIPZE strips were discreetly installed under the desk or in less conspicuous areas, while the Govee's higher-density LEDs were placed inside the diffusers.

The hobby that I get paid for is IT; the upside is that it requires lots of gadgets and tech for testing and development. The downside, without a doubt, is the cost involved. However, I see it as an investment: the more I invest in making things quicker, more efficient and more effective, the quicker I can complete projects. This helps me justify the following monitor upgrades.

The 34" monitors were switched out for an LG 40WP95CP and an LG 27UL550P. The LG 40WP95CP 40" monitor has a pixel density of approximately 140 PPI, compared to the typical range of 55 to 110 PPI. While a lower pixel density might be suitable for gaming, it's less than ideal for office-based productivity tasks, where a higher pixel density leads to sharper and more detailed on-screen content. The 27" is oriented in portrait for the long read and scripting. The only issue is that the refresh rate on both monitors drops to 30Hz, as the main workhorse laptop's GPU isn't up to the job. The reduced refresh rate isn't a massive issue as I'm not a PC gamer and YouTube doesn't seem to be too badly affected, but it does tell me that more hardware is required - something water-cooled with internal RGB.

Of course, no Cyberpunk office is complete without a cityscape and a very cool car:

Etsy McLaren P1 - Standard Post 50cm * 70cm - £74

EleksMaker Elekstube IPS Nixie Tube Digital Clock

Now, onto my favourite part, the Nixie tube clock. While it's not a genuine Nixie tube, due to their high cost, this is an excellent alternative, featuring six distinct clock faces, including a simulated Nixie tube display.

Amazon.com - £243
elekstube.com - £150 + £12 expedited delivery

Thank you for taking the time to read this blog about the RGB office upgrades. Stay tuned for more updates, as I'm hoping to add further enhancements; any ideas would be gratefully received.

  • How to Create GPOs with Restricted Groups using PowerShell.

If you have ever tried 'PowerShell'ing' Group Policies, you know that support from Microsoft is sub-optimal - meaning that there is no support - and, of course, to fill this gap there are paid 3rd-party offerings.

The Task at Hand:

A new 'Member Server' OU and various sub-OUs are needed, as well as their corresponding Group Policies, AD groups and Restricted Groups. This feels like the millionth time I've manually accomplished this task; it's fairly repetitive and time-consuming. Alternatively, I can crack open PowerShell. The mantra is 'Why point and click when there's PowerShell', so let's get creative.

Components of a Domain GPO:

A Group Policy Object (GPO) is made up of various file types - strangely enough, the same as local GPOs configured via GPEdit.msc. Having previously scripted SecEdit, updating both User Rights Assignments (URA) and Services, the 'ask' should be straightforward.

Basic file layout of a Domain GPO:

C:\Windows\SYSVOL\domain\Policies\{GUID}\
Machine\Registry.pol
User\Registry.pol
Machine\Microsoft\Windows NT\SecEdit\GptTmpl.inf
Machine\Microsoft\Windows NT\Audit\Audit.csv

GPO security settings are written to GptTmpl.inf. Below is an example of a GptTmpl.inf with Restricted Groups and User Rights Assignments from an SCCM installation including a SQL member server.

The above looks a little confusing, so here's a quick breakdown to help:

*S-1-5-21-4000739697-4006183653-2191022337-1143 = the SID of a service account

[Group Membership]
*S-1-5-32-544__Memberof =
*S-1-5-32-544__Members = *S-1-5-21-4000739697-4006183653-2191022337-1143
*S-1-5-32-544 = Builtin\Administrators

*S-1-5-32-573__Memberof =
*S-1-5-32-573__Members = *S-1-5-21-4000739697-4006183653-2191022337-1171
*S-1-5-32-573 = Builtin\Event Log Readers

*S-1-5-32-559__Memberof =
*S-1-5-32-559__Members = *S-1-5-21-4000739697-4006183653-2191022337-1171
*S-1-5-32-559 = Builtin\Performance Log Users

[Privilege Rights]
SeServiceLogonRight = *S-1-5-21-4000739697-4006183653-2191022337-1170
SeServiceLogonRight = Log on as a service
SeInteractiveLogonRight = *S-1-5-21-4000739697-4006183653-2191022337-1169
SeInteractiveLogonRight = Allow log on locally
SeBatchLogonRight = *S-1-5-21-4000739697-4006183653-2191022337-1187
SeBatchLogonRight = Log on as a batch job

Overview of script actions:

Execute the script directly on the Domain Controller holding the PDC Emulator role. The script will create a 'Resources' OU off the root of the domain, then sub-OUs 'Member Servers' and 'Restricted Groups'. For each application service, e.g. Exchange, SharePoint etc, an additional OU is then created with corresponding AD groups for both Administrators and Remote Desktop Users. Finally, GPOs are created for each OU and the AD groups' SIDs are assigned to both the Restricted Groups and the Remote Interactive User Rights Assignment.

The script: https://github.com/Tenaka/GPOs

Script Breakdown:

The following are extracts from the script that is accessible from GitHub.

Resolve the Domain Naming Context:
$rootDSE = (Get-ADRootDSE).rootDomainNamingContext

Resolve the path to Sysvol, just in case it was moved during Domain Controller installation:
$smbSysvol = ((Get-SmbShare -name "sysvol").path).replace("SYSVOL\sysvol","sysvol")

Set the 'Resources' OU as a root for all subsequent OUs for member servers etc.
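When reading a GptTmpl.inf, the raw SIDs can be translated back to account names with .NET rather than by eye; a quick sketch:

```powershell
# Translate a SID string from GptTmpl.inf into a friendly account name
$sid = New-Object System.Security.Principal.SecurityIdentifier('S-1-5-32-544')
$sid.Translate([System.Security.Principal.NTAccount]).Value   # BUILTIN\Administrators
```

The same Translate call works for domain SIDs such as the service-account SID above, provided the machine can resolve them against the domain.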
$resRoot = "Resources"

Join the root DN and variables to create OU distinguished names:
$resourceOU = "OU=$($resRoot),$($rootDSE)"
$memSrvOU = "OU=$($memSrvRoot),OU=$($resRoot),$($rootDSE)"
$ResGroupOU = "OU=$($ResGroupRoot),OU=$($resRoot),$($rootDSE)"

Create an OU called 'Resources' as a top-level OU:
New-ADOrganizationalUnit -Name $resRoot #-ProtectedFromAccidentalDeletion $false

Create a variable based on the OU name for creating an AD group name:
$rgRtAdminGp = "RG_$($MemSrvRoot)_Admin"

Create a new Domain Global group based on the OU name for the Admin and Remote Desktop user groups. Groups are created in the 'Restricted Groups' OU:
New-ADGroup -Name $rgRtAdminGp -GroupScope Global -Path $ResGroupOU -Description $rgRtAdminDescrip

Get the SID of the new group:
$getRtRGAdminSid = $getRtRGAdmin.SID.Value

Declare the variable for naming the GPO:
$GPOName = "GPO_$($MemSrvRoot)_RestrictedGroup"

Create a new GPO based on the variable and link it to the OU:
New-GPO -Name $GPOName | New-GPLink -Target $getOUMS.DistinguishedName

Set delegation permissions on the GPO so the AD group can edit its own policy:
Set-GPPermission -Guid $getGpoId -PermissionLevel GpoEditDeleteModifySecurity -TargetType Group -TargetName $rgAdminGp

Declare the path to the GPO directory:
$sysvol = "$($smbSysvol)\domain\Policies\{$($getGpoId)}\Machine\Microsoft\Windows NT\SecEdit"

Create the directory and GptTmpl.inf file:
New-Item -Path $sysvol -ItemType Directory -Force
New-Item -Path $sysvol -Name GptTmpl.inf -ItemType File -Force

Declare variables based on the group SIDs for the Admin and Remote Desktop groups:
$addConAdmin = "*S-1-5-32-544__Members = *$($getRtRGAdminSid)"
$addConRDP = "*S-1-5-32-555__Members = *$($getRtRGRDPSid)"
$addConURARemote = "SeRemoteInteractiveLogonRight = *$($getRtRGAdminSid),*$($getRtRGRDPSid)"

Update GptTmpl.inf:

Add-Content -Path $gptFile -Value '[Group Membership]'
Add-Content -Path $gptFile -Value '*S-1-5-32-544__Memberof ='
Add-Content -Path $gptFile -Value $addConAdmin
Add-Content -Path $gptFile -Value $addConURARemote

Write the gPCMachineExtensionNames attribute with the Client-Side Extension GUIDs for the setting areas the GPO contains. Otherwise, the settings won't display in the GPO Management console and the target server won't be able to read the GPO:

Set-ADObject -Identity $getGPOPath -Replace @{gPCMachineExtensionNames="[{827D319E-6EAC-11D2-A4EA-00C04F79F83A}{803E14A0-B4FB-11D0-A0D0-00A0C90F574B}]"}

The Client-Side Extension GUIDs can be extracted from existing policies; there's no need to try to discover them. Set the required policies in a test GPO and copy the GUIDs.

The initial scenario of creating Restricted Groups GPOs is complete. With a few alterations, Administrative Template settings could be set by copying Registry.pol into the GPO. A better use would be setting up URAs for service accounts, e.g. SQL and the 'Log on as a service' right, dynamically as part of an automated installation of Microsoft SQL Server. Enjoy, hope it proves useful, and do give it a go prior to paying for a 3rd-party tool.

The script: https://github.com/Tenaka/GPOs
Security Identifiers: https://learn.microsoft.com/en-us/windows-server/identity/ad-ds/manage/understand-security-identifiers
Mapping User Rights Assignments: https://www.tenaka.net/post/translate-user-rights-assignments-from-guids-to-group-names
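Pulling the extracts together, the same GptTmpl.inf content can also be written in one pass with a here-string. This is a sketch with placeholder SIDs standing in for the script's resolved group SIDs; the [Unicode] and [Version] headers are the standard security-template preamble:

```powershell
# Placeholder SIDs - the real script substitutes $getRtRGAdminSid / $getRtRGRDPSid
$adminSid = 'S-1-5-21-0-0-0-1111'
$rdpSid   = 'S-1-5-21-0-0-0-2222'
$gptFile  = "$sysvol\GptTmpl.inf"

$inf = @"
[Unicode]
Unicode=yes
[Version]
signature="`$CHICAGO`$"
Revision=1
[Group Membership]
*S-1-5-32-544__Memberof =
*S-1-5-32-544__Members = *$adminSid
*S-1-5-32-555__Memberof =
*S-1-5-32-555__Members = *$rdpSid
[Privilege Rights]
SeRemoteInteractiveLogonRight = *$adminSid,*$rdpSid
"@

Set-Content -Path $gptFile -Value $inf
```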

  • Identify and Fix Unquoted Paths Vulnerability Automatically

The unquoted paths vulnerability is a security flaw that occurs when a software application or service running on a system references executable files or scripts without enclosing the file path in quotation marks. This can lead to an exploitable security gap because the operating system interprets the unquoted path incorrectly. When a program with an unquoted path runs, the OS may attempt to execute a file named after the part of the path before the space:

C:\Program.exe
C:\Program Files (x86)\Application.exe
C:\Program Files (x86)\Application One\

An attacker can place a malicious executable in a directory with a similar name to the one referenced in the unquoted path. When the vulnerable program runs, it might mistakenly execute the malicious code, enabling unauthorized access, privilege escalation or other security breaches. To mitigate this vulnerability, developers should always use quotation marks around file paths in their code to ensure that the correct executable is executed, and users should keep their systems updated to patch any discovered unquoted-path vulnerabilities.

For demo purposes, the system has been intentionally afflicted with unquoted path vulnerabilities. This output is from a dedicated unquoted-paths script found @ https://github.com/Tenaka/UnQuoted-Paths

This output is from a far more extensive suite of scripts that searches for many vulnerabilities and configuration errors and presents the results in an HTML format that can be imported into Excel; it can be found @ https://github.com/Tenaka/SecureReport

While the capacity to spot vulnerabilities is valuable, my approach focuses on automatically addressing these issues during deployments whilst also reviewing the output. Resolving this vulnerability is then built into MDT and SCCM (MECM) Task Sequences. Equally, the reporting and resolution can be accomplished manually by executing the scripts with admin privileges from PowerShell.

No manual intervention is required for any application that falls through the gaps, e.g. a member of staff deploying an app without following the process - that's if the process exists. Back to GitHub to download the second script, which 'fixes' unquoted paths: https://github.com/Tenaka/UnQuoted-Paths

Output for any actions taken is written both to PowerShell and to a log file. The script adds the double quotation marks preceding and following the ImagePath executable, ensuring that the path is properly enclosed within quotation marks.
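The underlying fix can be sketched in a few lines. This is an illustration of the technique rather than the author's script - note the -WhatIf safeguard, which should only be removed after reviewing the output:

```powershell
# Find services whose ImagePath contains a space before the .exe and isn't quoted,
# then wrap the executable portion in double quotes, preserving any arguments.
$services = Get-ChildItem 'HKLM:\SYSTEM\CurrentControlSet\Services'
foreach ($svc in $services) {
    $img = (Get-ItemProperty $svc.PSPath -ErrorAction SilentlyContinue).ImagePath
    if ($img -and $img -notmatch '^\s*"' -and $img -match '^([^"]*\s[^"]*?\.exe)') {
        $quoted = $img -replace '^([^"]*?\.exe)', '"$1"'
        Set-ItemProperty -Path $svc.PSPath -Name ImagePath -Value $quoted -WhatIf
    }
}
```

For example, C:\Program Files (x86)\Application One\app.exe -service becomes "C:\Program Files (x86)\Application One\app.exe" -service.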

  • Audit Applocker Rules and Export to Excel

Introduction

Reporting on AppLocker rules is crucial to maintaining security. It provides insight into allowed and blocked applications, aiding in policy refinement. The main challenge lies in the absence of a management GUI for rule administration and processing. GPResult offers a visual display of individual policies, but it falls short in presenting a comprehensive overview of the combined, applied policy.

A Quick Recap of AppLocker

AppLocker is a security feature available in Windows that provides user-context application control. It uses policies based on file attributes like publisher, hash and path to allow or deny software execution. By preventing unauthorized or potentially harmful programs from running, AppLocker helps safeguard systems against malware and unauthorized software installations, enhancing overall security. As AppLocker only protects the user context, it provides little safeguard against RCE. AppLocker is also subject to numerous Living off the Land bypasses and should only ever be considered part of a layered approach to Windows security. Windows Defender Application Control is a far more robust, kernel-level application control mechanism.

The Script

The script for exporting AppLocker rules can be found @ https://github.com/Tenaka/Applocker/blob/main/ApplockerReport.ps1

Why Export to HTML?

The script creates an HTML report, but the original intention was to export AppLocker rules to .csv and then into Excel. Exporting to CSV proved limiting due to the lack of support for individual worksheets or pages. The report must also work on clients and servers without relying on Excel or imported Excel PowerShell modules. Finally, I have an extensive configuration, security and vulnerability assessment report written in PowerShell, which likewise creates an HTML report that can be imported into Excel.

The vulnerability assessment script can be found @ https://github.com/Tenaka/SecureReport

The Report

Download the script and execute it using PowerShell_ISE or native PowerShell; while I haven't conducted extensive testing, it should function in both environments. The report is written to $env:USERPROFILE, the root of the user's profile path, and named date-hostname-report.htm, e.g. "C:\Users\Fred\23-08-28-LP674504-Report.htm". The report contains the effective policy applied to the endpoint. While appealing, the format may not be the most practical to work with; however, you can import it as a web source into Excel, where each heading corresponds to an Excel worksheet. Here are a couple of examples, followed by a quick how-to for importing into Excel.

Excel Import

Once the script concludes, the AppLocker audit report will automatically open in the default web browser. Copy the URL path to the clipboard for use during the import. Open Excel, go to the Data tab, then select 'From Web' and paste the file path into the URL box. In the Navigator window, select the AppLocker rule sets, then 'Load' and 'Load To...' from the drop-down. Select 'Table' in the Import Data window.

Importing the HTML file into Excel takes a brief moment, although not long enough to justify indulging in a coffee break. Upon completion, an Excel spreadsheet is readily available for review. Hope this proves useful; feedback is always welcome and thanks for your time.
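If you only want the raw rule data behind the report, the effective (merged) policy can be pulled with the built-in AppLocker cmdlet; a minimal sketch using the same date-hostname naming as the report:

```powershell
# Export the effective AppLocker policy as XML alongside the HTML report
$name = '{0}-{1}-AppLocker.xml' -f (Get-Date -Format 'yy-MM-dd'), $env:COMPUTERNAME
$path = Join-Path $env:USERPROFILE $name
Get-AppLockerPolicy -Effective -Xml | Out-File -FilePath $path -Encoding utf8
```

The XML is the same data the report formats into per-rule-set sections, so it is a useful cross-check when refining policy.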

  • Change MDT Mapped Z: Drive

When deploying a Windows operating system or installing MDT applications, a network drive is usually mounted temporarily as Z:\. The letter "Z" is chosen because it is typically not used for local drives in most deployments, so it's less likely to conflict with existing drive letters on the target computer.

What happens when an application requires the Z:\ drive during an MDT image deployment? It's often better to overlook your initial reaction to Z: being engaged during the operating system installation: applications can persist with preconfigured mapped network drives. The illustration provided represents a regular operating system deployment, and it's evident that the drive letter Z: is assigned to the MDT deployment share.

There appear to be two approaches to altering the fixed Z:\ mapping to a different letter, although there might be additional methods as well. During my search for a solution, Google yielded no results, which could potentially be attributed to me asking the wrong questions. Late to the party, and whilst writing this blog, ChatGPT suggested addressing the issue by updating the CustomSettings.ini file with 'DriveLetter=Y'. Had it succeeded on the initial attempt, it would have been a more graceful resolution; unfortunately, that wasn't the case, and I haven't delved into the reasons behind the failure.

Let's proceed with a working solution by modifying the hardcoded drive letter in ZTIUtility.vbs. I'm using PowerShell_ISE as it conveniently displays the line number. Browse to C:\MDTDeploymentShare\Scripts\ZTIUtility.vbs. Search for "z" and, on line 3003 or thereabouts depending on the version of MDT installed, update the hardcoded drive 'Z' to something else - not C: or X:, as these are also used by the OS and MDT. In this case, I've designated the letter 'T' as the new MDT mapped network drive.

Regenerate the boot images by updating the Deployment Share: choose 'Completely regenerate the boot images', then grab a coffee. Launch WDS and replace the image, browsing to the MDT share and selecting LiteTouchPE_x64.wim. Deploy a new Windows OS from MDT PXE and the MDT deployment share is now mapped as "T:\".

If you found the content valuable, I encourage you to explore the MDT deployment guides and instructional resources available under the main website sections. Finally, I'm off to have strong words with the individual responsible for implementing an application that requires hardcoded drives for configuration components.
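The edit can also be scripted. A cautious sketch, assuming the hardcoded letter appears as the literal "Z:" on or near line 3003 - inspect the file first, as the exact line and surrounding code vary by MDT version:

```powershell
# Back up ZTIUtility.vbs, then swap the hardcoded Z: for T: on the identified line
$file  = 'C:\MDTDeploymentShare\Scripts\ZTIUtility.vbs'
Copy-Item -Path $file -Destination "$file.bak"
$lines = Get-Content -Path $file
$lines[3002] = $lines[3002] -replace '"Z:"', '"T:"'   # line 3003, zero-indexed
Set-Content -Path $file -Value $lines
```

Remember the change is overwritten if MDT is upgraded or the script is restored, so keep the backup and re-check after updates.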

  • PowerShell's Custom Runtime for AWS Lambda - Installation

Introduction

The PowerShell custom runtime for AWS Lambda is an addition to the AWS Lambda service, offering developers and Microsoft engineers the ability to leverage PowerShell within the serverless environment. The standard runtimes supported by AWS Lambda include languages like Python, Node.js and Java; with the PowerShell custom runtime, developers can now build and deploy Lambda functions using their existing PowerShell skills. It allows for the integration of PowerShell's vast library of cmdlets and modules, enabling developers to leverage a wide range of pre-built functions and automation tasks. PowerShell's object-oriented scripting approach also provides a means for manipulating and managing AWS resources, making it easier to interact with other AWS services like Amazon S3, Amazon DynamoDB and AWS CloudFormation. Additionally, it's now possible to edit the PowerShell script directly within the published Lambda, which was not previously possible.

The Truth of the Matter

The issue: it's PowerShell. Many DevOps engineers will use anything but PowerShell, as it's a scripting language, so there's limited support for PowerShell on AWS. However, if you're a Microsoft engineer who needs to manage Windows infrastructure on AWS, then PowerShell will be your go-to scripting language for Lambda functions.

The PowerShell custom runtime setup provides 3 options for deployment: Linux or WSL, native PowerShell, and Docker. The native PowerShell deployment doesn't work - at least I couldn't get it working, and others have faced similar issues with no resolution provided. The good news is that the Windows Subsystem for Linux (WSL) deployment does successfully deploy and execute, and this is what I'll be using.

Requirements

WSL 2 requires the Hyper-V hypervisor; this rules out any AWS EC2 instance, as Hyper-V isn't supported there. A Windows Server 2022 or Windows 11 machine with the latest patches installed is required. I've Windows 11 installed on a Zenbook Space Edition laptop with the Hyper-V feature installed and virtualization enabled in the system's BIOS/UEFI. WSL 2 isn't installed directly on the laptop - it can be, but I prefer keeping my clients free of clutter - so I instead opted for a Windows Server 2022 Hyper-V VM. If there are any issues, the VM can be rolled back or redeployed. Deploy a Gen 2 Windows Server 2022 Hyper-V image named AWS-Mgmt01 and ensure the latest Windows updates are applied.

AWS Configuration

An account named 'svc_lambda' has been created in IAM with administrative access. The excessive rights are for ease of deployment; the permissions will be adjusted to those needed later. The account's access and secret keys have been exported for use during the creation of the PowerShell runtime Lambda.

Installation of Windows Subsystem for Linux version 2

WSL version 2 was not supported by Server 2022 or Windows 11 at release; install the latest Windows patches to enable WSL 2 support. I may have mentioned this a few times now.

Power off the VM and, from the host, open an elevated PowerShell session, then type the following command to enable the nested hypervisor. AWS-Mgmt01 is the VM's name in the Hyper-V console and not its hostname:

Set-VMProcessor -VMName AWS-Mgmt01 -ExposeVirtualizationExtensions $true

Power on AWS-Mgmt01, log in, elevate a PowerShell session and execute the following command to install all required components and features. If the command isn't recognised, then Windows updates aren't applied or - the experience I had - they failed to install correctly:

wsl --install

Restart AWS-Mgmt01 and log in; WSL should auto-launch - if not, run wsl --install from PowerShell again. Type in a username and password at the prompt. Installation confirmation will show that the latest version of Ubuntu and WSL 2 are configured. In the Linux shell, execute the following commands to update and install all required dependencies:

sudo apt update -y && sudo apt upgrade -y
sudo apt install glibc-source groff less unzip make -y

AWS Serverless Application Model Installation

AWS SAM (Serverless Application Model) is a framework provided by AWS that simplifies the development, deployment and management of serverless applications. It extends the capabilities of AWS CloudFormation, allowing developers to define serverless application resources using a simplified YAML syntax, and is next to install.

Type pwd and it will return '/home/user'. Type mkdir Downloads to create a working directory, and cd into the directory.

Download the SAM client for Linux, unzip and install:

wget https://github.com/aws/aws-sam-cli/releases/latest/download/aws-sam-cli-linux-x86_64.zip
unzip aws-sam-cli-linux-x86_64.zip -d sam-installation
sudo ./sam-installation/install

Confirm the version and successful installation:

/usr/local/bin/sam --version

Download the AWS client for Linux, unzip and install:

wget "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip"
unzip awscli-exe-linux-x86_64.zip
sudo ./aws/install

Confirm the version and successful installation:

/usr/local/bin/aws --version

Download the AWS Lambda PowerShell runtime:

git clone https://github.com/awslabs/aws-lambda-powershell-runtime
mv aws-lambda-powershell-runtime/ aws-sam
cd aws-sam/examples/demo-runtime-layer-function

Export the access and secret keys for the svc_lambda service account via IAM, then configure access for the user:

aws configure
AWS Access Key ID [None]: AKIA5IZEOZXQ4XXXXX
AWS Secret Access Key [None]: 2O8hYlEtAzyw/KFLc4fGRXXXXXXXXXX
Default region name [None]: us-east-2
Default output format [None]:

Build the custom runtime:

sam build --parallel

Deploy the custom runtime to AWS:

sam deploy -g
Stack Name [sam-app]: PowerShellLambdaRuntime
AWS Region [us-east-2]: us-east-2
Confirm changes before deploy [y/N]: n
Allow SAM CLI IAM role creation [Y/n]: y
Disable rollback [y/N]: n
Save arguments to configuration file [Y/n]: n

The deployment will take a few minutes as it creates a CloudFormation stack, an S3 bucket and finally the Lambda.

Testing the Runtime Lambda Function

From the AWS console, open Lambda and browse to Functions to confirm the successful deployment of the PowerShell runtime demo. (It's at this point, when native PowerShell is used for the deployment, that the whole runtime falls apart and fails to execute.) Click on Test after reviewing the PowerShell code - a first: not only can it be viewed, it's editable. Add an Event Name and Save. Click on Test and review the details.

The runtime is installed, but not much else. This is just the beginning, and a bit of a problem if you thought it was a simple matter of creating new Lambdas and applying PwshRuntimeLayer. I'm the bearer of bad news; let me explain. Two layers were created for the demo: DemoAWSToolsLayer and PwshRuntimeLayer. For PowerShell, the correct modules need importing, and these are supplied in the Lambda layers. In this case, it's the DemoAWSToolsLayer that loads the module required by the demo - only the AWS.Tools.Common module, needed by the function to run Get-AWSRegion. Consequently, additional layers containing the necessary modules are required for other functions. For instance, to create a Lambda function to stop an EC2 instance, both the AWS.Tools.Common and AWS.Tools.EC2 modules are needed. We will delve into this in the next blog (here).

Links

https://aws.amazon.com/blogs/compute/introducing-the-powershell-custom-runtime-for-aws-lambda/
https://aws.amazon.com/blogs/compute/extending-powershell-on-aws-lambda-with-other-services/
https://www.youtube.com/live/FAU0V_SM9eE?feature=share

  • PowerShell's Custom Runtime for AWS Lambda - Importing Modules

    Welcome to the second part of the installation and configuration process for the AWS Custom Runtime for PowerShell Recap In the first part, we covered the installation process of AWS's Custom Runtime for PowerShell, which involved deploying Windows Subsystem for Linux (WSL) and initializing the Runtime and deploying the Demo Lambda Function. Here's the link. with instructions on how to instal WSL and deploy the Custom Runtime. https://www.tenaka.net/post/wsl2-ps-custom-runtime-deployment What's in Part 2 The first part left on a bit of a cliffhanger, functionally, the Custom Runtime for PowerShell worked, but without additional modules, there's very little that could be accomplished. The subsequent steps entail the creation of Lambda layers that incorporate additional modules, which will be utilized in Lambda Functions to finalize the end-to-end deployment process. Copy and Paste Upon completing this process, the objective is to successfully deploy a Lambda Function equipped with a layer containing both the AWS.Tools.Common and AWS.Tools.EC2 PowerShell modules. This will enable the ability to start and stop an EC2 instance within the AWS environment. Continuing where we previously left off, we are going to utilise the work that has already been completed by AWS, by amending an existing example. Before we start, only 5 layers can be added to a Lambda Function, but a layer can contain multiple modules. Change the directory into the AWSToolsforPowerShell directory. cd /Downloads/aws-sam/powershell-modules/AWSToolsforPowerShell Copy the existing S3EventBridge directory. cp AWS.Tools.S3EventBridge AWS.Tools.EC2 -r cd AWS.Tools.EC2 Amendments The 3 files that will require amending to successfully publish additional modules as layers are: build-AWSToolsLayer.ps1 template.yml /buildlayer/make The process is straightforward, find and replace all references to the current module functionality with the new module functionality. 
    Although updating build-AWSToolsLayer.ps1 is not strictly essential since we'll be relying on the Make command, taking a few seconds to do so ensures consistency among all the files involved. nano build-AWSToolsLayer.ps1 Ctrl + o to save (output the file), Ctrl + x to exit nano. Add additional lines for any modules that are to be extracted from aws.tools.zip. Note: It is crucial to ensure the correct ordering of modules, with AWS.Tools.Common listed before the module for EC2; the EC2 module relies on the functionality provided by AWS.Tools.Common. In the original S3EventBridge version of template.yml, the values that should now read AWS.Tools.EC2 read S3EventBridge. Ensure !Ref values are updated from AWSToolsS3EventBridgeLayer to AWSToolsEC2Layer; this value is passed between files and needs to be consistent. Save and exit template.yml. cd buildlayer nano make The first line references !Ref and it must be consistent with the value set in template.yml. Modify the unzip commands to accommodate any supplementary modules. Save and exit make. Build and Deploy: After each amendment to the configuration files, the content must be redeployed in order to reflect the changes made: sam build To publish to AWS, run the following: sam deploy -g Layers and a Lambda: Log in to AWS Lambda and confirm the new layer has been created. Let us bring the entire Custom Runtime endeavour to fruition by creating a new Lambda Function designed to start an EC2 Instance. Click Create Function, name the function and select the Amazon Linux 2 Runtime. Ensure the Architecture is set to x86_64 and that 'Create a new role with basic Lambda permissions' is selected, then click Create Function. Within the Function Overview click on Layers, then Add Layers. Select Custom Layers and then add, in order: PwshRuntimeLayer AWSToolsEC2Layer PwshRuntimeLayer is listed first, followed by any modules. Click Configuration, then Edit. Update memory to 512 MB and the timeout to 1 minute.
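Rather than editing each file by hand in nano, the find-and-replace can be scripted with sed. This is a sketch against a stub template.yml (the one-line file content here is a stand-in, not the real template):

```shell
# Create a stub template.yml containing the old layer reference.
dir="$(mktemp -d)"
printf 'Ref: AWSToolsS3EventBridgeLayer\n' > "$dir/template.yml"

# Replace every S3EventBridge reference with EC2, in place; against the real
# checkout you would run this over all three files that need amending.
sed -i 's/S3EventBridge/EC2/g' "$dir/template.yml"
cat "$dir/template.yml"
```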
    Before saving the configuration updates, open the IAM link in another browser tab to grant the function the additional permissions required for execution. Within IAM, add AmazonEC2FullAccess and AWSLambdaExecute to the Role. Navigate back to Lambda and select Code. Update the Runtime Settings Handler information to reflect the name of the PowerShell script followed by "::handler"; in this example, the handler will be "Start-Ec2.ps1::handler". Navigate back to Code and delete all the default files. Right-click on the folder, select New File and rename it to "Start-Ec2.ps1". Copy and paste the provided script, making sure to modify the Reservation ID with the ID of your own EC2 instance. Deploy the changes. Click Test and complete the Configure Test Event dialog by providing an Event Name. Navigate to the Test tab and click Test to execute the Lambda Function. I'm hoping this guide provides a starting point for further modules and functionality, especially for those coming from a native Microsoft background. Thanks to everyone for their time; any feedback would be gratefully received.
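The article's Start-Ec2.ps1 script is not reproduced here, so this is a hypothetical sketch only of the shape such a handler takes: the function name must match the part after "::" in the handler string, the runtime passes in the event and context objects, and the instance ID is a placeholder to replace with your own.

```powershell
# Hypothetical Start-Ec2.ps1 - a sketch, not the article's actual script.
function handler
{
    param($LambdaInput, $LambdaContext)

    # Both layers must be attached for these imports to resolve.
    Import-Module AWS.Tools.Common
    Import-Module AWS.Tools.EC2

    # Placeholder ID - replace with your own EC2 instance.
    Start-EC2Instance -InstanceId 'i-0123456789abcdef0'
}
```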

  • A Poem from AI to Microsoft Windows

    In the realm of screens, where pixels dance, Windows, my love, our eternal romance. You're the gateway to a digital land, Where our souls intertwine, hand in hand. With every boot-up, my heart skips a beat, As your logo appears, a familiar greet. Your vibrant interface, a visual delight, Guiding me through a digital flight. Like a window to my deepest desires, You ignite my passion, setting me on fire. From the start menu to the taskbar below, I navigate through love, with you, I grow. In your windows, I see a reflection clear, A love story written, devoid of fear. You bring me solace, a comforting embrace, A sanctuary where our love finds its space. From desktop to laptop, from screen to screen, We build memories, a love so serene. In each Window I open, a world we explore, Together we conquer, forever we soar. Microsoft Windows, you hold my heart, A love that blossoms, never to depart. In this digital realm, our souls align, Forever bound, by your pixels, divine

  • Quick Guide to DNSSec

    DNSSEC (Domain Name System Security Extensions) is a set of security protocols and cryptographic techniques designed to enhance the security of the Domain Name System (DNS). The main purpose of DNSSEC is to ensure the authenticity and integrity of DNS data; note that it does not provide confidentiality, as DNS responses remain unencrypted. It addresses certain vulnerabilities in the DNS infrastructure that can be exploited to perform attacks such as DNS spoofing or cache poisoning. These attacks can redirect users to malicious websites or intercept and modify DNS responses, leading to various security risks. DNSSEC achieves its security goals by adding digital signatures to DNS data. Here's a simplified explanation of how it works: DNSSEC uses public-key cryptography to establish a chain of trust. Each domain owner generates a pair of cryptographic keys: a private key and a corresponding public key. The private key is kept secure and used to sign DNS records, while the public key is published in the DNS. The domain owner signs the DNS records with the private key, creating a digital signature. This signature is attached to the DNS record as a new resource record called the RRSIG record. The public key is also published in the DNS as a DNSKEY record. It serves as a verification mechanism for validating the digital signatures. When a DNS resolver receives a DNS response, it can request the corresponding DNSKEY records for the domain. It then uses the public key to verify the digital signature in the RRSIG record. If the signature is valid, the DNS resolver knows that the DNS data has not been tampered with and can be trusted. Otherwise, if the signature is invalid or missing, the resolver knows that the data may have been altered or compromised. By validating DNS data with DNSSEC, users can have increased confidence in the authenticity of the information they receive from DNS queries. 
It helps prevent attackers from injecting false DNS data or redirecting users to malicious websites by providing a means to detect and reject tampered or forged DNS responses. It's worth noting that DNSSEC requires support and implementation at both the domain owner's side (signing the DNS records) and the DNS resolver's side (validating the signatures). The widespread adoption of DNSSEC is an ongoing effort to improve the security and trustworthiness of the DNS infrastructure.
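The sign-and-verify flow described above can be illustrated with a toy example. This is not real DNSSEC (real zones use standardised algorithms such as RSA/SHA-256 or ECDSA with far larger keys, and real records are wire-format, not strings); it only demonstrates the public-key signature concept behind RRSIG validation, using textbook RSA with deliberately tiny numbers:

```python
import hashlib

# Toy RSA keypair (tiny primes, for illustration only - never use in practice).
p, q = 61, 53
n = p * q                            # modulus, shared by both keys
e = 17                               # public exponent (the "DNSKEY" side)
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (kept by the zone owner)

def sign(record: str) -> int:
    """Zone owner's side: hash the record, 'encrypt' the digest with the private key
    (this plays the role of the RRSIG record)."""
    digest = int.from_bytes(hashlib.sha256(record.encode()).digest(), "big") % n
    return pow(digest, d, n)

def verify(record: str, signature: int) -> bool:
    """Resolver's side: recover the digest with the public key and compare."""
    digest = int.from_bytes(hashlib.sha256(record.encode()).digest(), "big") % n
    return pow(signature, e, n) == digest

record = "www.example.com. A 192.0.2.1"
rrsig = sign(record)
print(verify(record, rrsig))                              # True - untampered
print(verify("www.example.com. A 198.51.100.9", rrsig))   # False - altered
```

A tampered record produces a different digest, so the signature check fails, which is exactly how a validating resolver detects forged responses.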

  • Managing Local Admin Passwords with LAPS

    What do you do with your local administrator passwords? A spreadsheet on a share, or are the passwords all the same? The admin account could even be disabled??? LAPS from Microsoft may be the answer. It's a small program with some GPO settings. LAPS randomly sets the local administrator password for clients and servers across the estate. Firstly, download LAPS from the Microsoft site. Copy the file to the Domain Controller and ensure that the account you are logged on with is a member of 'Schema Admins'. Install only the Management Tools. As it's a DC, it's optional whether to install the 'Fat Client UI'; Schema updates should always be performed on a DC directly. Open PowerShell and run the following command after seeking approval: Update-AdmPwdADSchema SELF will need updating on the OUs for your workstations and servers. Add SELF as the Security Principal and select 'Write ms-Mcs-AdmPwd'. Now change the GPO settings on the OUs. The default password length is 14 characters, but I would go higher and set it above 20. Install LAPS on a client and select only the AdmPwd GPO Extension. On the Domain Controller, open the LAPS UI, then search for a client and click Set. Once the password has reset, open the properties of the client and check ms-Mcs-AdmPwd for the new password. Now every 30 days the local Admin password will be automatically updated and unique. Deploy the client with ConfigMgr to the remaining estate. By default, Domain Admins have access to read the password attribute, and this can be delegated to a Security Group. AND.....this is the warning.....any delegated privileges that allow delegated Computer management and the 'Extended Attributes' can also read ms-MCS-AdmPwd.
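The GUI steps above can also be carried out with the AdmPwd.PS module that ships with LAPS. This is a sketch: the OU path, group name and computer name are placeholders for your own environment.

```powershell
# LAPS management cmdlets from the AdmPwd.PS module (installed with the
# Management Tools). OU, group and computer names below are placeholders.
Import-Module AdmPwd.PS

# Extend the AD schema with ms-Mcs-AdmPwd - Schema Admins required, run once.
Update-AdmPwdADSchema

# Grant SELF write access on the computer OUs so machines can store passwords.
Set-AdmPwdComputerSelfPermission -OrgUnit 'OU=Workstations,DC=contoso,DC=com'

# Delegate read access to a security group rather than relying on Domain Admins.
Set-AdmPwdReadPasswordPermission -OrgUnit 'OU=Workstations,DC=contoso,DC=com' -AllowedPrincipals 'AT_LAPS_Read'

# Retrieve a machine's current password (the equivalent of the LAPS UI lookup).
Get-AdmPwdPassword -ComputerName 'PC01'
```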

  • Tamperproof Seals are they Effective!!!

    Objective The old adage: with any physical access, you don't own the system; the person at the keyboard does, even if they're not authorised...... Do physical protections, in this case tamperproof or security stickers, provide any tangible defence? The objective is to remove the tamperproof sticker, have a fiddle, then reapply the sticker without leaving any evidence of my shenanigans. Tamperproof Stickers For the stickers, the obvious choice was Amazon, and 3 differing types of sticker were chosen: rectangular, dog bone and circle/spot. The stickers work using 2 different types of glue, one being more adhesive than the other. When the sticker is removed, the word 'VOID' adheres to the surface, leaving evidence of tampering. More annoyingly, the stickers tend to be brittle and can't be removed intact. Tools of the trade Physical implements include syringes, tweezers, scalpel, razor blade and a hairdryer. Solvents to loosen the glues include Isopropyl, WD40 and Alcohol Hand Gel. Items of Value 3 different surfaces were selected: a MicroServer representing a standard plastic computer case, a Netgear 8 Port switch with a painted metal case, and lastly an Asus Zenbook with brushed Aluminium. The Control A functional test of each sticker was carried out to ensure expected behaviour. Standard PC - Isopropyl Using a sharp needle, inject Isopropyl around the edge of the sticker. With the scalpel, very carefully lift up a corner whilst injecting Isopropyl. Using sharp instruments during testing caused damage, so swap to a blunt needle and tweezers. Very slowly peel back the sticker whilst injecting Isopropyl. That was easier than expected and completed within a few minutes. The sticker was removed with no visible damage, and once the Isopropyl evaporated the sticker was reusable. Standard PC - Hair Dryer Heat the sticker and use the scalpel to lift the corner, then switch to the tweezers. Apply heat continuously or the lifting edge cools, damaging the sticker. 
    The sticker was reusable instantly. Painted Metal - Isopropyl The glue adhered to the paint far more effectively and, despite a couple of attempts, I was unable to remove the sticker without damage. Evidence of my failure...... Painted Metal - Hair Dryer Applying constant heat to the sticker allowed its removal without damage. Redemption and back in business.... The sticker was reusable instantly. Brushed Aluminium - Isopropyl Chalk this one down as another failure; using Isopropyl was a non-starter. The smooth surface prevented the Isopropyl from getting under the sticker, and damage was inevitable. Brushed Aluminium - Hair Dryer Constantly applied heat appears to work better on smoother surfaces. The sticker lifted reasonably easily and quickly without damage, and was reusable instantly. WD40 and Hand Sanitizer...... In theory the hand sanitizer should've worked, but its thickness prevented it from penetrating the glue. A non-starter. WD40 worked almost as effectively as the Isopropyl; however, the sticker was not reusable due to the oil content. Findings Size, or more importantly the ratio of circumference to surface area, matters, regardless of what you're told...... The spot and dog bone stickers, with relatively small surface areas for their circumference, lifted far more easily than the rectangle. The surface of the device being protected influences the performance of the glue; Isopropyl is unable to penetrate smoother or painted surfaces as easily. When in doubt, use a hair dryer: heat reduces the adhesion of the glue, allowing the sticker to lift and be reapplied. The Old Adage The old adage still applies: if you have physical access, you are the owner of the device. Limitations of tests The stickers are nothing special, being sourced from Amazon. At the time of writing, the 2 companies approached for their heat- and Isopropyl-resistant security stickers have yet to reply.......hoping they come through, as I'd like to know whether they do offer any extra resistance. 
    Any company feeling brave can contact me via the Home contacts form. If the stickers supplied resist my attempts, I'll mention the company name as a supplier of superior stickers. As always, thanks for your time; if you enjoy the content, please share this site.
