# Exploiting Weak S3 Bucket Policies

CTF Source: [Pwned Labs](https://pwnedlabs.io/labs/exploit-weak-bucket-policies-for-privileged-access)

## Overview

In this walkthrough, we’re tasked with accessing sensitive data and demonstrating the extent of the impact, using an IP address and some AWS access keys discovered on a previous engagement.

## Pre-Requisites

Install awscli: `brew install awscli` (mac) `apt install awscli` (linux)

Install nmap: `brew install nmap` (mac) `apt install nmap` (linux)

Install gobuster: `brew install gobuster` (mac) `apt install gobuster` (linux)

Install hashcat: `brew install hashcat` (mac) `apt install hashcat` (linux)

Install John the Ripper: `brew install john` (mac) `apt install john` (linux)

## Walkthrough

{% embed url="https://youtu.be/G3YZNNh6vZ0" %}

To start, we’re given an IP address of `13.43.144.61` and some AWS access keys for the user `test`.

### Nmap Enumeration

After configuring our AWS access keys (`aws configure`), let’s run some nmap scans and see what we’re dealing with.

{% code overflow="wrap" %}

```sh
nmap -Pn 13.43.144.61

Host is up (0.12s latency).
Not shown: 925 filtered tcp ports (no-response), 74 closed tcp ports (conn-refused)
PORT     STATE SERVICE
3000/tcp open  ppp

Nmap done: 1 IP address (1 host up) scanned in 635.16 seconds
```

{% endcode %}

{% code overflow="wrap" %}

```sh
nmap -Pn -p3000 -sC -sV 13.43.144.61

PORT     STATE SERVICE VERSION
3000/tcp open  http    Node.js Express framework
|_http-title: Huge Logistics > Home
|_http-cors: HEAD GET POST PUT DELETE PATCH

Service detection performed. Please report any incorrect results at https://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 17.42 seconds
```

{% endcode %}

Alright, so it looks like we’re dealing with a website on port `3000`.

### Gobuster Enumeration

Before we view the web page in the browser, let’s kick off a gobuster scan.

Gobuster will iterate over our wordlist and attempt to find other directories that may exist. The `-b 404` flag excludes responses with a `404` status code, hiding paths that don’t exist.

{% code overflow="wrap" %}

```sh
gobuster fuzz -u http://13.43.144.61:3000/FUZZ -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -b 404
```

{% endcode %}

Let’s let this run and come back to it later.

### Website Enumeration

We can view the site in our browser but it looks like a pretty standard website.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FEGreI2owyXHvUX9WMBES%2Fimage.png?alt=media&#x26;token=bdc068ca-2c7c-4799-af01-05efc283ef71" alt=""><figcaption><p>website homepage</p></figcaption></figure>

Looking at the source code though and we find an S3 bucket endpoint!

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FremebnV0YB8gptdiP9EV%2Fimage.png?alt=media&#x26;token=b598b984-52de-4e12-bd6a-c42e25bde3c7" alt=""><figcaption><p>s3 bucket name in website source code</p></figcaption></figure>
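
Grepping the page source for `amazonaws.com` hostnames is a quick way to surface endpoints like this. A minimal sketch, run against a sample line (the markup below is hypothetical; in practice you’d pipe `curl -s http://13.43.144.61:3000/` into the same grep):

```shell
# Hypothetical markup resembling the bucket reference found in the page source
html='<link href="https://hugelogistics-data.s3.amazonaws.com/background.png">'

# Extract any S3-style hostnames from it
printf '%s\n' "$html" | grep -oE '[a-zA-Z0-9-]+\.s3[a-z0-9.-]*\.amazonaws\.com'
```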

### S3 Bucket Enumeration

It doesn’t appear we can see the data from the browser.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2Fsgd1b1s2zEbWdQqe2myW%2Fimage.png?alt=media&#x26;token=02462e35-92f5-419e-9d10-58df53610549" alt=""><figcaption><p>s3 file access denied error in browser</p></figcaption></figure>

Let’s try the AWS CLI instead. Both of these commands fail.

```sh
aws s3 ls s3://hugelogistics-data 
aws s3 ls s3://hugelogistics-data --no-sign-request
```

But the command to retrieve the bucket policy works!

```sh
aws s3api get-bucket-policy --bucket hugelogistics-data | jq -r .Policy | jq .

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForAuthenticatedUsersForObject",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": [
        "s3:GetObject",
        "s3:GetObjectAcl"
      ],
      "Resource": [
        "arn:aws:s3:::hugelogistics-data/backup.xlsx",
        "arn:aws:s3:::hugelogistics-data/background.png"
      ]
    },
    {
      "Sid": "AllowGetBucketPolicy",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "s3:GetBucketPolicy",
      "Resource": "arn:aws:s3:::hugelogistics-data"
    }
  ]
}
```

Check that out! Let’s copy that `backup.xlsx` file locally, since we have the permission (`s3:GetObject`).

```bash
aws s3 cp s3://hugelogistics-data/backup.xlsx .

download: s3://hugelogistics-data/backup.xlsx to ./backup.xlsx 
```

If we try to open the file, we’ll see it’s password protected.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FmNgU7pHKRIqtf0zwvjCU%2Fimage.png?alt=media&#x26;token=99a149d6-eb81-455c-89e7-7da12fc10d1b" alt=""><figcaption><p>password protected file</p></figcaption></figure>

### Cracking the Password with Hashcat

Let’s attempt to crack the password on this `xlsx` file.

Since I’m on Kali Linux, I’m going to run a tool called `office2john` because this is a Microsoft Office file.

You can find the [full toolset here](https://github.com/openwall/john/tree/bleeding-jumbo).

```bash
office2john ./backup.xlsx > hash_backup.xlsx.txt       
```

{% code overflow="wrap" %}

```bash
cat hash_backup.xlsx.txt

backup.xlsx:$office$*2013*100000*256*16*5e8372cf384ae36827c769ef177230fc*c7367d060cc4cab8d01d887a992fbe2b*a997b2bfbbf996e1b76b1d4f070dc9214db97c19411eb1fe0ef9f5ff49b01904
```

{% endcode %}

Now we have a hash of the file, and it looks like it’s an Office 2013 file.

Next, we’ll use a tool called `hashcat` to attempt to crack the file password.

First, we need to determine the hash type to use. We can do this with `hashcat --help | grep Office`.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FPa64wMYHVbVTrFD2AxLn%2Fimage.png?alt=media&#x26;token=727467fc-9bf8-441c-9a57-fdfb97cca831" alt=""><figcaption><p>hashcat hash type for microsoft office 2013 file</p></figcaption></figure>

So our hash type is `9600`.

Now let’s verify the hash is in the correct format.

We can refer to [hashcat’s documentation](https://hashcat.net/wiki/doku.php?id=example_hashes) and look for an example of an Office 2013 hash.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FTwqdKHgEJroWaNSv9GaD%2Fimage.png?alt=media&#x26;token=65fffb06-f0d5-46b5-b39f-d8970eb27a65" alt=""><figcaption><p>hashcat hash format for microsoft office 2013 file</p></figcaption></figure>

If we compare this to the hash generated by `office2john`, we see we need to modify our hash and remove the leading `backup.xlsx:` prefix.
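
One way to strip that prefix (recreating the hash file from the `office2john` output above so the snippet is self-contained):

```shell
# Hash exactly as office2john produced it, filename prefix included
printf '%s\n' 'backup.xlsx:$office$*2013*100000*256*16*5e8372cf384ae36827c769ef177230fc*c7367d060cc4cab8d01d887a992fbe2b*a997b2bfbbf996e1b76b1d4f070dc9214db97c19411eb1fe0ef9f5ff49b01904' > hash_backup.xlsx.txt

# Drop everything up to and including the first ':' so only the $office$... hash remains
sed -i 's/^[^:]*://' hash_backup.xlsx.txt

cat hash_backup.xlsx.txt
```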

Let’s run the cracking with a popular wordlist, `rockyou`.

```bash
hashcat -a 0 -m 9600 hash_backup.xlsx.txt rockyou.txt
```

We cracked the password!

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FjWWpwXii1CKOyYToNYAZ%2Fimage.png?alt=media&#x26;token=4528f044-dd0e-4f4c-b551-56496385d458" alt=""><figcaption></figcaption></figure>

Side note: if you rerun this command, it won’t display the password.

This is because the hash was already cracked, and the result is stored in the potfile at `~/.local/share/hashcat/hashcat.potfile`.
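
You can re-print cracked results with `hashcat -m 9600 hash_backup.xlsx.txt rockyou.txt --show`. Under the hood, the potfile is just plain text with one `hash:password` line per crack. A sketch of that format, using the hash from above with a placeholder password (the real one is redacted here):

```shell
# Illustrative potfile entry: "<hash>:<recovered password>" (placeholder password)
printf '%s\n' '$office$*2013*100000*256*16*5e8372cf384ae36827c769ef177230fc*c7367d060cc4cab8d01d887a992fbe2b*a997b2bfbbf996e1b76b1d4f070dc9214db97c19411eb1fe0ef9f5ff49b01904:NotTheRealPassword' > demo.potfile

# The recovered password is everything after the last ':'
awk -F: '{print $NF}' demo.potfile
```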

Now we can open the `xlsx` file, and inside we find a ton of credentials for several systems!

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FX9HIgrGZsOFa25eAOq9e%2Fimage.png?alt=media&#x26;token=91ede89d-ede5-4a01-a3ab-595b45f240c0" alt=""><figcaption><p>credentials for systems</p></figcaption></figure>

### Gaining Access to the Website

Remember the `gobuster` scan we ran earlier? Let’s check on how that’s going.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2Foe5X4Jwp1BajQ2fQKXg8%2Fimage.png?alt=media&#x26;token=1ad6ffe3-d658-4926-ad13-634e1bd62889" alt=""><figcaption><p>gobuster output of found directories</p></figcaption></figure>

Looks like we successfully found some more directories!

If we navigate to `http://13.43.144.61:3000/crm`, we’re met with a login page.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2Ff2QlYnZJuTbQBLKp1YBD%2Fimage.png?alt=media&#x26;token=9f5fa6a2-5b9c-4a3b-b193-a7e5ec68f614" alt=""><figcaption><p>login page for crm system</p></figcaption></figure>

Let’s try the credentials from our file for the `WebCRM` system to see if they work.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FydoLf80D3PtntHV9EN2c%2Fimage.png?alt=media&#x26;token=38c776e5-737d-4623-8551-64bc25461a82" alt=""><figcaption><p>logged in to crm system as admin</p></figcaption></figure>

We’re in!

### Exfiltrating Data and Finding the Flag!

Poking around, we find some invoice data. We’ll export that, since there isn’t much else to do.

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FSvD7soVEkKnQdUdcqxoT%2Fimage.png?alt=media&#x26;token=537ef229-1922-4af9-9b08-69840fea2403" alt=""><figcaption><p>exporting pii data</p></figcaption></figure>

Opening this up, we find the flag!

<figure><img src="https://2721275171-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F8yu8YbDfwd1VqEdUxGyA%2Fuploads%2FN1JA3THYI0upn8w9yQqO%2Fimage.png?alt=media&#x26;token=8706e2be-0983-47ae-a322-38ad78751071" alt=""><figcaption><p>finding the flag in pii data</p></figcaption></figure>

## Wrap Up

In this scenario, we were tasked with accessing sensitive data using AWS credentials discovered in a previous engagement. After enumerating our access to an S3 bucket, we found a password-protected file. Cracking it revealed credentials to several systems, which we then used to gain access to the CRM platform and the sensitive data inside. \
Here are some recommended actions administrators can take to prevent this from happening.

1. Tighten the S3 Bucket Policy
   * “Everyone” (`"Principal": {"AWS": "*"}`) could perform the following actions:
     * `s3:GetBucketPolicy`
       * Not needed by everyone; restrict to admins
     * `s3:GetObject`
       * Okay for a website, but non-website data was also stored in this bucket (see #2)
     * `s3:GetObjectAcl`
       * Not needed by everyone; restrict to admins
2. Eliminate S3 bucket multi-use
   * Using an S3 bucket for multiple purposes can lead to unintended consequences like information disclosure due to a lax or complicated bucket policy.
   * Since this bucket serves assets for a publicly accessible website, only store files relevant to the website.
3. Ensure employees are trained and have access to a credential manager
   * In this case, a spreadsheet was found containing login credentials to several systems.
   * Securely store credentials in solutions such as Bitwarden, 1Password, AWS Secrets Manager, HashiCorp Vault, or similar.
4. Secure login to CRM system
   * Protect access to the CRM system by removing (or restricting) public access, restricting login behind SSO, and requiring MFA.
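
For the first recommendation, a tightened bucket policy might look something like this (a sketch, not a drop-in fix): it keeps anonymous `s3:GetObject` only for the website asset and drops the public `s3:GetBucketPolicy` and `s3:GetObjectAcl` grants entirely, leaving those to admin IAM policies instead.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForWebsiteAssetsOnly",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::hugelogistics-data/background.png"
    }
  ]
}
```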
