my personal blog about System Center

Dump AD user password hashes on-the-fly to a file of chosen format

Categories: Active Directory, AD

So, you’ve achieved Domain Admin permissions during a security assessment (penetration test) and want to crack all of those nice password hashes from Active Directory, or you might have to perform a password audit, but you just hate exporting NTDS.DIT + SYSTEM and extracting the database afterwards…?

Instead, you can now do a live, in-memory, on-the-fly, Mimikatz-DCSync-style synchronization of all those user password NT hashes in PowerShell and write them to a pwdump-style format of your own choice, all ready for having lots of cracking fun!
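To give an idea of what that looks like, here is a minimal sketch using the DSInternals cmdlets (the server name, naming context and output file are placeholders; verify the cmdlet and view names against your version of the module):

Import-Module DSInternals

# Replicate all accounts from a DC over the directory replication protocol (DCSync-style)
$accounts = Get-ADReplAccount -All -Server 'DC01.contoso.com' -NamingContext 'DC=contoso,DC=com'

# Write the NT hashes in a cracker-friendly format, e.g. Hashcat
$accounts | Format-Custom -View HashcatNT | Out-File -Encoding ascii '.\nt-hashes.txt'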

Check out:

A few things to consider:

  1. Michael Grafnetter, who developed the DSInternals module, hasn’t released the source code yet. Therefore, you will have to trust his code (blindly) at the moment. However, Michael has told me that he will release the code later this year when he has had time to clean it up a bit. Thanks to Michael for his hard work and help.
  2. Be sure to have permissions to extract (and crack?) hashes from Active Directory :-)

BTW, have you seen this related tool and post: Crack and detect weak passwords in Active Directory on-the-fly?

/Jakob H. Heidelberg

Crack and detect weak passwords in Active Directory on-the-fly

Categories: Active Directory

A serious problem with Active Directory (AD) and the built-in password policies is that, although password complexity is required, attackers (including penetration testers) can easily find weak user passwords during an engagement that IT administrators or security officers have no out-of-the-box means to discover. There’s no visibility into how strong or weak user passwords really are.

Simple and very common passwords, such as “Summer2015”, “October2015”, “Password123”, “[company name] + [year]”, “[well-known shared password in the company]”, etc., all meet the regular requirements for password length and complexity, but in practice they are extremely weak and probably among the first guesses an attacker will try out.

Brute-force and NTDS.DIT attacks

A so-called “brute-force” attack can be performed in two different ways. The most well-known method is an attack on one given user account, where the attacker tries out a whole lot of different password combinations. In most environments this will lead to the user account being locked after a few guesses, and the attack ends.

A better version of the “brute-force” attack is to try one weak and widely used password, for example “Summer2015”, against all user accounts in the environment (also called “password spraying”). This method will most often lead to a successful login without any account being locked – especially in environments where users are not properly trained in choosing strong passwords.

Previously, obtaining insight into password usage and strength in an AD environment has been done by extracting data from the NTDS.DIT file of a Domain Controller, which is a rather tedious and manual process.

With a new PowerShell module, DSInternals, it is now possible to analyze passwords “on-the-fly” in a live environment, assuming you have (acquired) the proper rights (equivalent of ‘Domain Admin’ or ‘Domain Controller’). If you’ve ever looked at the DCSync functionality recently built into Mimikatz, this PS module offers the same capability.

Get-bADpasswords to the rescue

I have developed a simple PowerShell script, Get-bADpasswords, which utilizes some of the functionality in the new PS module. My intention is to enable IT administrators and security officers to discover weak (or bad) user passwords active in AD – hopefully before attackers do it.

The drawing below illustrates the concept of the script.

Concept of Get-bADpasswords

A Domain Controller is contacted and asked to hand over the user names and password hash values (NT hashes) of all active users (under a given naming context).

The script reads weak or unacceptable (non-compliant) passwords from one or more text files (word lists) and then hashes them (NT hash) so that they can be compared with the output from AD.
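The core of the comparison looks roughly like this (a simplified sketch, not the actual Get-bADpasswords source; the server name and word-list path are placeholders):

Import-Module DSInternals

# NT-hash every word-list entry so it can be compared against AD
$weakHashes = Get-Content '.\wordlist.txt' | ForEach-Object {
    ConvertTo-NTHash -Password (ConvertTo-SecureString $_ -AsPlainText -Force)
}

# Replicate the live hashes and flag accounts whose NT hash matches a word-list entry
Get-ADReplAccount -All -Server 'DC01.contoso.com' |
    Where-Object { $_.NTHash -and $weakHashes -contains ([BitConverter]::ToString($_.NTHash) -replace '-') } |
    Select-Object -ExpandProperty SamAccountName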

Here is an example of the contents of such a word list, which should be adjusted to each organization, language and so on.


Word list with weak passwords

When the script is executed with “-Verbose”, it prints the current status to the console.

Get-bADpasswords -Verbose

The script can write user names for users who have weak passwords to a CSV file.

Get-bADpasswords CSV output

The script can also write a log of the current status, including detailed (verbose) information.

Get-bADpasswords log file

Note that my script assumes the DSInternals module is properly installed on the executing machine.

DSInternals module folder
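A quick way to verify that before running the script:

# List the module if PowerShell can see it
Get-Module -ListAvailable -Name DSInternals

# Or simply fail fast when loading it
Import-Module DSInternals -ErrorAction Stop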

A few things to consider:

  1. Michael Grafnetter, who developed the DSInternals module, hasn’t released the source code yet. Therefore, you will have to trust his code (blindly) at the moment. However, Michael has told me that he will release the code later this year when he has had time to clean it up a bit. Thanks to Michael for his hard work and help.
  2. It is probably a good idea to get an approval of HR and/or the legal department when running this regularly. There might be objections to administrators or security officers potentially gaining insight into user passwords (although we will only detect the weak ones).
  3. This script works “after the fact”, after users have actually created a weak password for their AD account. In Windows you can create custom Password Filters, which could prevent users from setting weak passwords in the first place, but that is quite another matter.

My PowerShell script can be downloaded here: Get-bADpasswords.

In the hope of more password-guessing-robust Active Directory environments out there!

/Jakob H. Heidelberg

Deduplication and Compression vs Encrypted VMs

Categories: Bitlocker, Compression, Deduplication, Hyper-V, Windows Server 2016

So with Windows Server 2016 we now have the ability to enable a virtual TPM inside the VM to help protect data from threats ranging from a rogue SAN snapshot to a stolen backup tape.

D:\New folder>ddpeval.exe “D:\UNSECURE”
Data Deduplication Savings Evaluation Tool
Copyright (c) 2013 Microsoft Corporation.  All Rights Reserved.

Evaluated folder: D:\UNSECURE
Evaluated folder size: 17,38 GB
Files in evaluated folder: 6

Processed files: 6
Processed files size: 17,38 GB
Optimized files size: 4,52 GB
Space savings: 12,87 GB
Space savings percent: 74

Optimized files size (no compression): 7,93 GB
Space savings (no compression): 9,46 GB
Space savings percent (no compression): 54

Default VMs: 54% deduplication savings (without compression) with 2 default-installed guests. Sure, this number will skew when data is added, but it gives a small example.

D:\New folder>ddpeval.exe “D:\SECURE”
Data Deduplication Savings Evaluation Tool
Copyright (c) 2013 Microsoft Corporation.  All Rights Reserved.

Evaluated folder: D:\SECURE
Evaluated folder size: 20,41 GB
Files in evaluated folder: 6

Processed files: 6
Processed files size: 20,41 GB
Optimized files size: 19,36 GB
Space savings: 1,06 GB
Space savings percent: 5

Optimized files size (no compression): 19,46 GB
Space savings (no compression): 981,13 MB
Space savings percent (no compression): 4

Files excluded by policy: 0
Files excluded by error: 0

The same 2 VMs, now with in-guest BitLocker: almost all of the effect from deduplication is gone, so secured VMs will hurt storage costs if you rely on array-based compression and/or deduplication.

Sure, not all VMs will be encrypted, but seeing this from a hoster’s perspective I can see all VMs being encrypted.
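For reference, here is roughly how deduplication is enabled on a volume once the evaluation looks worthwhile (a sketch; the drive letter is a placeholder, and the HyperV usage type is the Server 2016 option for volumes holding VMs):

# Install the feature and enable dedup on the VM volume
Install-WindowsFeature FS-Data-Deduplication
Enable-DedupVolume -Volume 'D:' -UsageType HyperV

# Kick off an optimization job instead of waiting for the schedule
Start-DedupJob -Volume 'D:' -Type Optimization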

Shielded VMs, a new era for secured VMs

Categories: Hyper-V, Windows Server 2016

With the preview of Windows Server 2016, we have a new feature that can help improve security.

With Shielded VMs we can add a virtual TPM module to each VM and use that to encrypt the content of the virtual machine.

The step-by-step guide to adding this is provided by Microsoft here.

It’s supported for VMs as well as VMs managed by Windows Azure Pack.

For deployment, there is support for either a dedicated AD forest or hardware with TPM 2.0, but servers that support the latter are, as of this writing, hard to find (a Surface works for testing), so this has been tested in our playground using a dedicated AD forest.

After the initial setup of the dedicated forest and installation of the Host Guardian Service, we need to add protection to the VMs.

So here we have a dedicated forest that holds the Host Guardian Service servers and has a one-way trust to the forest where our Hyper-V hosts and VMs are located. This enables us to secure the VMs in the hosted environment and prevents a Hyper-V administrator from accessing data within a VM; the same goes for Backup Operators.

For compliance, and in environments where encryption is a requirement, this is a very big step toward ensuring security across the hypervisor.
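For context, the $Owner and $Guardian variables used in the commands below can be created along these lines (a lab-only sketch; the guardian names are placeholders, and in a real deployment the guardian is imported from the HGS metadata rather than generated locally):

PS C:\> # Create a local owner guardian with self-signed certificates (lab only)
PS C:\> $Owner = New-HgsGuardian -Name 'LabOwner' -GenerateCertificates
PS C:\> $Guardian = Get-HgsGuardian -Name 'LabGuardian'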

PS C:\> # -AllowUntrustedRoot is used only because this lab has no dedicated PKI
PS C:\> $KP = New-HgsKeyProtector -Owner $Owner -Guardian $Guardian -AllowUntrustedRoot
PS C:\>
PS C:\> Set-VMTPM -VMName SECURE01 -Enabled $true -KeyProtector $KP.RawData
PS C:\> Set-VMTPM -VMName SECURE02 -Enabled $true -KeyProtector $KP.RawData


This adds TPM information to the virtual machine and enforces Secure Boot; this works with Gen 2 VMs only.


This is our VM before we enable vTPM


And this is our VM after vTPM has been enabled.

If we then encrypt our VM with BitLocker and try to open a “stolen” copy of the VM…


We can’t :)

This was just a small teaser to show one area of the Hyper-V 2016 security enhancements. I will dive a little deeper into securing the environment using Shielded VMs later.

Reblogged from Tao Yang: spend your money wisely

Categories: Uncategorized

The following blog post is a copy from Tao Yang’s site. He does amazing work and publishes it so everyone can enjoy it; personally, I couldn’t live without his work.

You can agree with the post or not; personally, I think publishing others’ work almost 1:1 and charging for it is bad taste, but that’s the double-edged sword of MIT licensing and publishing your stuff for others.

There’s nothing wrong with charging for your time/MPs/scripts; just create them yourself or pay others to make them.

Spend Your Money Wisely

As what I’d like to consider myself – a seasoned System Center specialist – I have benefitted from many awesome resources from the community during my career in System Center. These resources consist of blogs, whitepapers, training videos, management packs and various tools and utilities. Although some of them are not free (and in my opinion, they are not free for a good reason), a large percentage of the resources I value the most are free of charge.

This is what I like the most about the System Center community. Over the last few years, I have got to know many unselfish people and organisations in the System Center space who have made their valuable work completely free and open source for the broader community. Due to what I am going to talk about in this post, I am not going to mention any names (unless I absolutely have to). But if anyone is interested to know my opinion, I’m happy to write a separate post introducing what I believe are valuable resources.

First of all, I’m just going to put it out there: I am not upset, this is not going to be a rant, and I’m trying to stay positive.

I started working on System Center around 2007-2008 (ConfigMgr and OpsMgr at that time). I started working on OpsMgr because my then colleague and now fellow SCCDM MVP (like I mentioned, not going to mention names) had left the company we were working for, and I had to pick up the MOM 2005 to OpsMgr 2007 project he left behind. The very first task for me was to figure out a way to pass the server’s NetBIOS name to the help desk ticketing system, and I managed to achieve this by creating a PowerShell script and utilising the command notification channel to execute the script when alerts were raised. I then used the same concept and developed a PowerShell script to be used in the command notification to send content-rich notification emails, covering a lot of information not available from the native email notification channel.
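Conceptually, such a command-notification script boils down to something like this (a bare-bones illustration using the later OperationsManager module; mail addresses and server names are placeholders, and the real Enhanced Notification script does far more):

# The channel passes the alert id as a parameter, e.g. '$Data/Context/DataItem/AlertId$'
param($AlertId)

Import-Module OperationsManager
$alert = Get-SCOMAlert -Id $AlertId

# Build a content-rich body from alert properties the native channel does not expose
$body = "Alert: $($alert.Name)`nSource: $($alert.MonitoringObjectDisplayName)`nDescription: $($alert.Description)"

Send-MailMessage -To 'ops@contoso.com' -From 'scom@contoso.com' `
    -Subject "SCOM Alert: $($alert.Name)" -Body $body -SmtpServer 'smtp.contoso.com'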

When I started blogging 5 years ago, this script was one of the very first posts I published here. I named the solution “Enhanced SCOM Alert Notification Emails”. Since it was published, it has received a lot of positive feedback and recommendations. I have since published the updated version (2.0) here:

After version 2.0 was published, a fellow member of the System Center community, Mr. Tyson Paul, contacted me and told me he had updated my script. I was really happy to see my work carried on by other members of the community, and since then Tyson has made several updates to this script and published it on his blog (for free, of course):

Version 2.1:

Version 2.2:

This morning, I received an email from a person I have never heard of. This person told me his organisation has developed a commercial solution called “Enhanced Notification Service for SCOM” and that I can request an NFR by filling out a form on his website. As the name suggests (and I had a look at the website), it does exactly what my and Tyson’s scripts do – sending HTML-based notification emails that include content-rich information, including associated knowledge articles.

Well, to be fair, on their website they did mention a limitation of running command notifications: an AsyncProcessLimit of 5. But there is a way to increase this limit, and if your environment is still hitting it after you’ve increased it, I believe you have a more serious issue to fix (i.e. an alert storm) rather than enjoying reading those “sexy” notification emails. Anyway, I don’t want to get into a technical argument here; that’s not the intention of this post.
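For reference, the commonly documented way to raise that limit is a registry value (the path below is the one documented for OpsMgr 2012; verify it for your version, and restart the agent service afterwards):

$key = 'HKLM:\SOFTWARE\Microsoft\Microsoft Operations Manager\3.0\Modules\Global\Command Executer'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name 'AsyncProcessLimit' -Value 10 -Type DWord

# Restart the agent (HealthService) to pick up the change
Restart-Service HealthService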

So, do I think someone took the idea and work from Tyson and myself? It is pretty obvious; make your own judgement. Am I upset? Not really. If I wanted to make a profit from this solution, I wouldn’t have published it on my blog in the first place. And believe me, there are many solutions and proofs-of-concept I have developed in the past that I sincerely hope some software vendor will pick up and develop into a commercial solution for the community; I simply don’t have the time and resources to do all of this myself (i.e. my recently published post on managing ConfigMgr log files using OMS would make a good commercial solution).

In the past, I have also seen people take scripts I published on my blog, replace my name with theirs in the comment section and publish them on social media without mentioning me whatsoever. I knew it was my script because the other comments in the script were identical to my initial version. When I saw it, I decided not to let this kind of behaviour get under my skin, and I believe the best way to handle it is to let it go. So I am not upset after reading this email today. Instead, I laughed! Hey, if this organisation can make people pay $2 per OpsMgr agent per year (which means a fully loaded OpsMgr management group of 15,000 agents would cost $30k per year for “sexy” notification emails), all I’m going to say is:

However, I do want to advise the broader System Center community: Please spend your money wisely!

There is only so much honey in the pot. You all have a budget. This is what economists would call opportunity cost. If you have a certain need or requirement and you can satisfy it using free solutions, you can spend your budget on something with a higher price-performance ratio. If you think there’s a gap between the free and the paid solution, please ask yourself these questions:

  • Do these gaps really cost me this much?
  • Are there any ways to overcome the gaps?
  • Have I reached out to the SMEs and confirmed that this is a reasonable price?
  • How much would it cost me to develop an in-house solution?

Lastly, I receive many emails from people in the community asking me for advice and providing feedback on the tools I have published. I try my best to answer all of them (apologies if I have missed any). So if you are in doubt in the future and would like to know my opinion, please feel free to contact me. And I am certain that not only myself but other SMEs and activists in the System Center community would love to help a fellow community member.