Securing Born in the Cloud Businesses

Everyone’s had this experience recently: organisations they partner with are becoming (justifiably) more stringent about their security. It creates some thorny problems though:

  • How do we improve security without bludgeoning our business to death?
  • How do we improve data protection without making our staff rage quit?
  • How do we align our initiatives with broader security standards?

Born in the Cloud

When we’re talking about a Born in the Cloud (BITC) business, we’re talking about this sort of company:

  • Not much in the way of legacy systems.
  • Mostly SaaS based tools.
  • A boat load of BYOD.
  • Loosey Goosey office security 🙂

Larger organisations like working with businesses like these. They’re small, agile and generally full of rock-star grade experts in their field. But large organisations are also terrified of working with these sorts of companies. The locked-down, SOE-based workday they’re used to, which provides them with a measure of confidence, isn’t present in these BITC businesses. The large org wants all the warm fuzzy security but wants to keep the innovation and glint in their partner’s eye.

Security Standards

In Europe this is a lot more mature than it is in Australia. There are two different standards that get bandied about:

Essential 8

Here, there are a set of guidelines that the Australian Signals Directorate has adopted and provides as advice, called the Essential 8 Maturity Model. It covers several areas, and each one has four levels of maturity an organisation can reach (0-3). It was originally envisaged as a straightforward, practical approach to data security but has been “beefed up” to be a lot more complex over time.

ISO:27001

Another standard is ISO 27001. This is a heavyweight standard to attain and can take 6-18 months depending on your complexity, maturity and size.

It covers a range of different technology and policy “controls” that should be applied. You can self-assert your compliance and then have that audited externally.

Essential 8 Level 3 (the highest) is a sort of subset of the work you’d need to complete to get to ISO 27001. Essential 8 is used in Australian Federal and State Government, and ISO 27001 is a global standard.

What do I need to do?

We at jtwo have been on the journey of achieving both and we have some general advice on how to get going.

We aren’t security consultants and our professional indemnity doesn’t allow us to be, so take this advice with a grain of salt. That should keep our insurers happy 🙂

So, with that out of the way: it’s a big beast, but here are some pointers on how to get started. We use Office 365 with E5 licensing, so a lot of the tools we need to build this stuff out are already there and we already pay for them.

Take it Seriously

You can’t fake this stuff. You have to embrace the idea of security in your bones or you won’t get anywhere. You have to look at the tools, processes and behaviours you use through a security lens. Once you’ve embraced the idea of security, it all starts to look a bit more achievable.

Build Registers

In each of these security standards there are sets of lists and registers you need to keep. They involve asset registers (physical and information based) and there are lots of them. This is particularly the case with ISO 27001.

We use Office 365, so we built each of these registers as SharePoint lists. They’re easy to use and they can be used in reporting too.
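
If you’d rather script the setup than click through it, here’s a minimal sketch using the PnP.PowerShell module. The site URL, list name and columns are hypothetical placeholders; your chosen standard will dictate the real register design.

# Connect to the SharePoint site that will host the registers (placeholder URL)
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/security" -Interactive

# Create an asset register as a plain SharePoint list
New-PnPList -Title "Asset Register" -Template GenericList

# Add a few illustrative columns
Add-PnPField -List "Asset Register" -DisplayName "Owner" -InternalName "AssetOwner" -Type User
Add-PnPField -List "Asset Register" -DisplayName "Classification" -InternalName "Classification" -Type Choice -Choices "Public","Internal","Confidential"
Add-PnPField -List "Asset Register" -DisplayName "Last Reviewed" -InternalName "LastReviewed" -Type DateTime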

Embrace a SOE

Everyone hates them; they suck. They make it hard for you to be flexible and innovative, and developers hate them especially. But you should consider them part of your new world order. We use E5 licensing for Microsoft 365 and as part of this we get Intune and Defender. Rolling these out together can help you tick lots of boxes and actually be secure to boot.
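
Intune is driven by the Graph API under the hood, so you can sketch a device compliance baseline in code too. Here’s a minimal example with the Microsoft Graph PowerShell SDK; the policy name and settings are illustrative assumptions, not our actual baseline.

Connect-MgGraph -Scopes "DeviceManagementConfiguration.ReadWrite.All"

# A hypothetical Windows compliance baseline (adjust to your own risk appetite)
$policy = @{
    "@odata.type"     = "#microsoft.graph.windows10CompliancePolicy"
    displayName       = "BITC Baseline"   # illustrative name
    bitLockerEnabled  = $true             # require disk encryption
    secureBootEnabled = $true
    passwordRequired  = $true
    # Graph requires at least one scheduled action when creating a policy
    scheduledActionsForRule = @(
        @{
            ruleName = "PasswordRequired"
            scheduledActionConfigurations = @(
                @{ actionType = "block"; gracePeriodHours = 0 }
            )
        }
    )
}

New-MgDeviceManagementDeviceCompliancePolicy -BodyParameter $policy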

MFA Everywhere, All at Once

You probably already do this; in fact, if you don’t, then do it as soon as you’ve read this. We use Office 365 and all the identities are in Azure AD. We’ve turned on MFA using Microsoft Authenticator and it does a lot of the heavy lifting.
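
If you want MFA enforced tenant-wide rather than per user, Conditional Access is the usual tool. Here’s a minimal sketch with the Microsoft Graph PowerShell SDK; the display name is hypothetical, and it starts in report-only mode so you can gauge the impact before enforcing it.

Connect-MgGraph -Scopes "Policy.ReadWrite.ConditionalAccess"

$policy = @{
    displayName = "Require MFA for all users"          # illustrative name
    state       = "enabledForReportingButNotEnforced"  # flip to "enabled" when happy
    conditions  = @{
        users        = @{ includeUsers = @("All") }
        applications = @{ includeApplications = @("All") }
    }
    grantControls = @{
        operator        = "OR"
        builtInControls = @("mfa")
    }
}

New-MgIdentityConditionalAccessPolicy -BodyParameter $policy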

Policies, Policies, Policies

You’ll need to write and maintain lots of policies. These are generally short (thankfully) but they need to be reviewed periodically and you need to record attestations that people have read, understood and agreed to the policies.

We write our policies as Word documents, and we built a PowerApp that lets people read and agree to them. The records for this go into our SharePoint lists for record keeping.
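
Under the covers, an attestation is just a row in a list. As a rough sketch of the scripted equivalent (PnP.PowerShell again; the list and column names here are hypothetical placeholders):

# Record that a user has read and agreed to a policy (placeholder list and columns)
Add-PnPListItem -List "Policy Attestations" -Values @{
    "Title"         = "Acceptable Use Policy"
    "Attestor"      = "someone@yourtenant.com"   # placeholder user
    "AgreedOn"      = (Get-Date)
    "PolicyVersion" = "1.3"
}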

Enforcement

You need to enforce the use of policies, practices and tools. Consider making security compliance part of your staff meetings. Reward people for good behaviour and following policies. Gently (at first) nudge people towards good behaviour if they’re lagging behind.

Office365 and Purview are your friend

While many of the compliance activities you’ll need to do are policy and people based, there’s a lot of technology stuff too. As a BITC business you have a lot of this at your fingertips. We use Microsoft 365, and Purview is part of the E5 licensing we have. It’s got a bunch of great technology you can use to improve your security, and it arranges it all as a set of scores, so you get the dopamine rush when you move the score up too. If you use M365 and have E5 you should definitely explore this. It will help greatly.
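
If you want those numbers outside the portal, the closely related Microsoft Secure Score is queryable via the Graph API (Purview’s Compliance Manager score is mostly a portal experience). A minimal sketch with the Microsoft Graph PowerShell SDK:

Connect-MgGraph -Scopes "SecurityEvents.Read.All"

# Pull the most recent Secure Score snapshot
$score = Get-MgSecuritySecureScore -Top 1
"Secure Score: {0} / {1}" -f $score.CurrentScore, $score.MaxScore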

Data Classification

This is a big one. Data classification is generally difficult, but the Purview classification tools are able to use ML to do the classification work for you. Here’s what our Teams, email and other communication profile looks like…

We should probably tone down the fruity language.

This is also what our data looks like from the perspective of sensitive information.

You can see that we use what might be considered sensitive information in the content of our comms. This will vary from org to org, but you don’t have to do anything to get this; it works out of the box.

Standards Mapping

Another interesting capability is the standards mapping. You can choose a standard like E8L3 or ISO 27001 and apply that template to the controls you have in O365. This will give you a (probably massive) checklist of changes you need to make to meet those standards.

Microsoft also have their own standards for security which are applied to your controls. Here’s an example of how it provides a gauge on your security compliance:

Moving this score up will move you along with various standards at the same time.


AWS obtain PROTECTED level certification for Australian Region

Earlier this week Amazon Web Services made a statement indicating that the battle of the tier-one public cloud providers is still heating up. Yesterday Matthew Graham (AWS Head of Security Assurance for Australia and New Zealand) announced that the Australian Cyber Security Centre (ACSC) had awarded PROTECTED certification to AWS for 42 of their cloud services.

It appears to be a tactical move, executed hot on the trail of Microsoft announcing their PROTECTED-accredited Azure Central regions in the back half of last year, and it clearly demonstrates that AWS aren’t prepared to reduce the boil to a gentle simmer any time soon.

Graham announced “You will find AWS on the ACSC’s Certified Cloud Services List (CCSL) at PROTECTED for AWS services, including Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3), AWS Lambda, AWS Key Management Service (AWS KMS), and Amazon GuardDuty.”

He continued: “We worked with the ACSC to develop a solution that meets Australian government security requirements while also offering a breadth of services so you can run highly sensitive workloads on AWS at scale. These certified AWS services are available within our existing AWS Asia-Pacific (Sydney) Region and cover service categories such as compute, storage, network, database, security, analytics, application integration, management and governance.”

Finally, delivering a seemingly well-orchestrated jab: “Importantly, all certified services are available at current public prices, which ensures that you are able to use them without paying a premium for security.”

It is no secret that the blue team currently charges a premium for entry into their PROTECTED-level facility (upon completion of a lengthy eligibility assessment process) due to the finite amount of capacity available.

Both vendors state that consumers must configure services in line with the guidance in the respective ACSC certification report and consumer guidelines. This highlights that additional security controls must be implemented to ensure workloads are secured head to toe whilst storing PROTECTED-level data. Ergo, certification is not implicit by nature of consuming accredited services.

AWS have released the IRAP assessment reports under NDA within their AWS Artifact repository. For more information, review the official press release here.


Backup Palo Alto VM Series Config with Azure Automation

If you have implemented a VM-Series firewall in Azure, AWS or on-premises but don’t have a Panorama server for your configuration backups, here is a solution for getting the firewall configuration into Azure Blob Storage. The same thing could be done with Lambda and S3 using Python and the boto3 library.

Why Do This?

If there are multiple administrators of the firewall and configuration changes are happening frequently, you may want a daily/hourly backup of the configuration to restore in the event that a recent commit has caused unwanted disruption to your network.

Azure Automation is a great place to start. We will have to interact with the API interface of the firewall to ask for a copy of the XML. Generally speaking, we don’t want to expose the API interface to the internet, nor is it easy to allow the Azure Automation public IPs, so in this case a Hybrid Worker (a VM inside your trusted network) can execute the code against the internal trusted interface that has the API listening.

Depending on your version of PowerShell and Invoke-WebRequest, you may not be able to ignore a certificate error coming from the API interface, which is why I’m updating the system .NET class for X509 certificate policies to trust all certificates.

The steps are pretty simple:

  1. Create a directory on the file system (I’m using the Azure VM with temporary D drive local storage)
  2. Request the XML from the URL
  3. Login to Azure with service credentials
  4. Map to the cold storage account I’m putting the files in
  5. Copy the file

# Trust all certificates so the firewall's self-signed management cert doesn't block the request
add-type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    public class TrustAllCertsPolicy : ICertificatePolicy {
        public bool CheckValidationResult(
            ServicePoint srvPoint, X509Certificate certificate,
            WebRequest request, int certificateProblem) {
            return true;
        }
    }
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
# Allow TLS 1.0 through 1.2; SSL3 is insecure and shouldn't be offered
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls, [Net.SecurityProtocolType]::Tls11, [Net.SecurityProtocolType]::Tls12
$todaydate = Get-Date -Format yy-MM-dd
$File = "PaloConfig-"+$todaydate+".xml"  
$FilePath = "D:\Palo\"+$File 
#Create Directory 
New-Item -ItemType directory -Path D:\Palo -Force 
#Download Config
Invoke-WebRequest -Uri "https://PaloIPAddress/api/?type=export&category=configuration&key=<ONETIMEKEY>=" -OutFile $FilePath 
#Login with service principal account 
$TenantId = 'AzureTenantID' 
$ApplicationId = 'ServiceID' 
$Thumbprint = (Get-ChildItem cert:\LocalMachine\My\ | Where-Object {$_.Subject -match "CN=AzureHybridWorker" }).Thumbprint 
Connect-AzureRMAccount -ServicePrincipal -TenantId $TenantId -ApplicationId $ApplicationId -CertificateThumbprint $Thumbprint 
#Get key to storage account 
$acctKey = (Get-AzureRmStorageAccountKey -Name StorageAccountName -ResourceGroupName ResourceGroupName).Value[0] 
#Map to the backup BLOB context 
$storageContext = New-AzureStorageContext -StorageAccountName StorageAccountName -StorageAccountKey $acctKey 
#Copy the file to the storage account 
Get-ChildItem -LiteralPath $FilePath | Set-AzureStorageBlobContent -Container "paloconfigbackup" -BlobType "Block" -Context $storageContext -Verbose 

If you are not currently using a Hybrid Worker in your subscription, create one from the below link:

https://docs.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker

Paste the code into an Azure PowerShell Runbook and create a recurring schedule.

You’ll have backups saved in cold storage for as long as you’d like to retain the data. Creating a storage lifecycle management policy can help you clean up old backups automatically.
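
As a rough sketch of that cleanup, here’s a lifecycle rule that expires old backups. Note it uses the newer Az.Storage module rather than the AzureRM cmdlets above, and the 90-day cutoff, rule name and prefix are assumptions to adapt.

# Delete backup blobs 90 days after last modification (adjust to your retention needs)
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction Delete -DaysAfterModificationGreaterThan 90
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch "paloconfigbackup"
$rule   = New-AzStorageAccountManagementPolicyRule -Name "expire-palo-backups" -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName ResourceGroupName -StorageAccountName StorageAccountName -Rule $rule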


Azure MFA with Palo Alto Client VPN

Client VPNs have come a long way in recent years and are still a necessity for organisations protecting backend services that cannot be published to the public internet securely. The nirvana is having data presented by web applications that use SAML authentication against any good identity provider that supports MFA. This world doesn’t yet exist in its entirety, and you won’t be the first or last to say that you have some old application that runs as a fat client on desktop machines with non-standard protocols. Client VPNs will remain relevant until all these applications adopt modern web authentication frameworks.

Azure Active Directory is a great cloud-based identity and authentication provider with lots of built-in functionality to explore in the security space. If you’re using Office 365, then you already have one, and more bells and whistles can be turned on to include features like Multi-Factor Authentication.

Most good firewall products will have a client VPN that supports RADIUS as a second factor of authentication, but sometimes that’s where the documentation finishes.

How do I get my firewall VPN authentication to talk to Azure MFA if all I have available is RADIUS?

This article talks explicitly about the Palo Alto Global Protect client and VM-Series firewall, but there’s no reason you couldn’t apply the same architecture to any other firewall VPN that supports RADIUS.

Palo Alto Configuration

The following authentication settings need to be configured on the Palo Alto firewall. I won’t bore you with every step, because if you’re administering a firewall I’ll assume a certain level of knowledge. The authentication flow will be:

  1. LDAP authentication with username and password (connected to Active Directory)
  2. RADIUS authentication using the NPS Azure MFA Extension

LDAP Authentication

Configure LDAP as per normal; nothing special to note here. This will be the first factor of authentication in the VPN login sequence.

Radius Server Profile

  1. Go to Device > Server Profiles > RADIUS and Add a profile.
  2. Enter a Profile Name to identify the server profile.
  3. Enter a Timeout interval in seconds after which an authentication request times out (the default is 3; I’ve changed this to 30 seconds to allow for cloud connectivity and then messages to client devices).
  4. Select the Authentication Protocol (PAP) that the firewall uses to authenticate to the RADIUS server.
  5. Add a new RADIUS server and enter the IP, Secret and Port (1812).

Radius Authentication Profile

  1. Select Device > Authentication Profile and Add a profile.
  2. Set the Type to RADIUS.
  3. Select the Server Profile you configured.
  4. Select Retrieve user group from RADIUS to collect user group information from VSAs defined on the RADIUS server. I’d recommend an ‘All Staff’ approach, as this won’t be the restrictive policy that allows VPN access.

Network Policy Server Configuration

Follow Microsoft’s guide to deploying the NPS:

https://docs.microsoft.com/en-us/azure/active-directory/authentication/howto-mfa-nps-extension

The incoming request condition for your policies can be the NAS Identifier, which will be the name of your authentication profile in the Palo Alto. In the example I named the profile ‘Radius Authentication’, so that is what will be presented on incoming connections from the Palo to NPS.

NPS Nas Identifier

Make sure you have PAP selected for your authentication method. The rest of the settings can be left at their defaults. (CHAP/MS-CHAP, while more secure, was problematic in my deployment, so PAP was used.) Since the traffic flows from the internal trusted interface of the firewall to a port on a trusted network segment, encryption is not the major security control for this request.

NPS Request Authentication Settings

After you install the Azure NPS Extension, make sure you reboot. This extension, as great as it is, isn’t heavily customisable, which is why I strongly suggest it run on a separate RADIUS server. Think of this NPS server as the MFA RADIUS server, as the extension will intercept all requests regardless of policy. You don’t want this extension on an existing RADIUS server that may be used for Wi-Fi authentication using certificates (EAP) for domain-joined workstations, etc.

MFA using Azure Authenticator App
MFA using Azure One Time Password (OTP)

Test the solution

Before you test end to end, a simple test of just the RADIUS configuration for MFA can be done from the firewall CLI. Log in via SSH and test the profile.

test authentication authentication-profile "Radius Authentication" username test@cloudstep.io password  

If it fails, do a quick sanity check on your test user:

  1. Synced to Azure Active Directory.
  2. Assigned an MFA license (P1 etc).
  3. Enrolled in MFA (https://aka.ms/MFASetup).
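
If you’d rather check that last point programmatically, Azure AD exposes each user’s registration state through the Graph reports API. A quick sketch with the Microsoft Graph PowerShell SDK (using the same test account as above):

Connect-MgGraph -Scopes "AuditLog.Read.All"

# Is the test user actually registered for (and capable of) MFA?
Get-MgReportAuthenticationMethodUserRegistrationDetail -Filter "userPrincipalName eq 'test@cloudstep.io'" |
    Select-Object UserPrincipalName, IsMfaRegistered, IsMfaCapable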

If you find that you’re not successfully completing the above CLI test in the Palo SSH session, here are some places to look for troubleshooting.

Follow the Radius Authentication Flow

Live monitor

Do a quick visual traffic check in the GUI to make sure the RADIUS authentication is leaving for the correct destination:

Monitor > Logs > Traffic > Enter this search criteria ( port.dst eq 1812 )

Do a packet capture

Go to Monitor > Packet Capture and add a filter; just choose the interface that the Palo should talk to the RADIUS server on. Leave everything else blank and it should look like this afterwards.

VM Series Capture Filter

Under the capture configuration, we need to choose a ‘stage’ and give the file a name; I’ve left packet count and byte count blank.

VM Series Capture Stage

What we want to find is the source IP of the Palo (.4) with destination port 1812. If it’s not here then it’s going out a different interface. We can modify the default interface for a service in the Palo, if incorrect, using service routes.

Wireshark Capture UDP Request

Here we can see the RADIUS protocol inside the UDP packet. If we aren’t seeing the request make it to the RADIUS server, we may not be allowing UDP 1812 on the ACL between the firewall appliance and the RADIUS server.

Wireshark Capture Radius Request

If we see the Access-Request on its way to the RADIUS server, we can do a few checks inside the Windows Server OS to validate the NPS configuration.

Check the radius server

Under Event Viewer > Custom Views > Server Roles > Network Policy and Access Services, review the log entries for each authentication request. If the request is being denied by the NPS server, make sure it’s being handled by the correct policy. A failed authentication request will show you which policy determined it was a failure; if it isn’t matching your NPS connection request and network policy rules, review the NAS Identifier the request is sending in the authentication packet.

Review the MFA extension logs via Event Viewer > Applications and Services > Microsoft > AzureMFA.

If all is successful, we should see a return UDP packet in the capture with the response ‘Access-Accept’ from the RADIUS server (.10) to the Palo (.4).

Wireshark Capture Radius Response

In the above image you can see the value ‘Time from request: 11.78794200 seconds’, which is why I recommend a longer timeout duration on the RADIUS Server Profile in the Palo Alto configuration. I did it in 11 seconds, but your end users will probably be less efficient than I am.

MFA Options

You can complete the MFA via the Authenticator application on your mobile device with an ‘Approve/Deny’ choice in the notification area, or, if you’re using an SMS code (OTP), the Global Protect client will prompt for it after a successful username and password; the prompt is nicely named by default.

Global Protect OTP Request