
Cloud Assets

Finding assets in AWS, GCP, and Azure is a crucial part of modern reconnaissance. Cloud misconfigurations are a leading cause of data breaches, which makes these assets prime targets.

1. Introduction to Cloud Asset Discovery

Unlike on-prem infrastructure, cloud assets are ephemeral, numerous, and often exposed through complex configurations. The goal is to find these assets, especially publicly accessible storage, databases, and services.

Key Cloud Services to Target

| Cloud Provider | Storage Service | Compute Service | Database Service | Other Interesting Services |
| --- | --- | --- | --- | --- |
| AWS | S3 (Simple Storage Service) | EC2 (Elastic Compute Cloud) | RDS (Relational Database Service) | Lambda, ECR, SQS |
| GCP | Cloud Storage | Compute Engine | Cloud SQL | Cloud Functions, Artifact Registry |
| Azure | Blob Storage | Virtual Machines | SQL Database | Functions, Container Registry |

2. Core Methodology

The process of finding cloud assets involves a combination of DNS enumeration, permutation scanning, and using specialized tools.

  1. Identify Cloud Usage: Determine which cloud providers the target uses. This can be done by analyzing DNS records (e.g., MX records pointing to Google, CNAMEs to s3.amazonaws.com), IP address ownership, and job postings (see the dig sketch after this list).
  2. Enumerate Storage Buckets/Blobs: This is often the most fruitful area. Use permutations of the company name, product names, and common keywords to guess the names of storage containers.
  3. Scan IP Ranges: Identify the IP ranges used by the target's cloud accounts and scan them for open ports and services.
  4. Analyze Web Content: Scour JavaScript files and web pages for references to cloud storage URLs or API endpoints.
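
For step 1, a minimal DNS check with dig (example.com and assets.example.com are placeholders for the target):

# MX records often reveal the mail provider (Google Workspace, Microsoft 365)
dig +short MX example.com

# A CNAME into a provider endpoint confirms a cloud-hosted asset
dig +short CNAME assets.example.com
# e.g. examplebucket.s3.amazonaws.com.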

3. Finding AWS S3 Buckets

S3 buckets are a common source of data leaks. Bucket names are globally unique, which makes them easier to discover if you can guess the naming pattern.

Naming Conventions

Companies often use predictable patterns for bucket names:

  • <companyname>-assets
  • <companyname>-backups
  • <companyname>-dev
  • <companyname>-media
  • assets.<companyname>.com (if configured with a CNAME)
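
A minimal bash sketch for turning these patterns into a candidate list (keywords.txt and the suffix set are assumptions; tailor both to the target):

# Expand each keyword with common suffixes, in both orders
while read -r name; do
  echo "${name}"
  for suffix in assets backups dev prod staging media; do
    echo "${name}-${suffix}"
    echo "${suffix}-${name}"
  done
done < keywords.txt | sort -u > buckets.txt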

Tools for S3 Discovery

s3scanner A tool to find open S3 buckets and list their contents.

# Create a list of potential bucket names
# (e.g., example, example-dev, example-prod, assets-example)

# Scan the list of names (exact flag names vary between
# s3scanner versions; check s3scanner --help)
s3scanner --buckets-file buckets.txt

# Dump the contents of a found, public bucket
s3scanner --bucket my-public-bucket --dump

lazys3 A Ruby script that brute-forces bucket names from a company name and a built-in wordlist of common permutations.

ruby lazys3.rb example

Manual Check via AWS CLI You can check if a bucket exists and is listable using the AWS CLI.

# The --no-sign-request flag attempts an anonymous request
aws s3 ls s3://bucket-name-to-test --no-sign-request

# If successful, it will list the bucket's contents.
# An "AccessDenied" error may still mean the bucket exists but is not public.
# A "NoSuchBucket" error means the bucket does not exist.

4. Finding GCP and Azure Storage

The principles are similar to S3, but the tools and naming conventions differ.

GCP Cloud Storage

GCP bucket names are also globally unique. They are accessible via storage.googleapis.com/<bucket-name>.
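
The same anonymous status-code probe works here (bucket-name-to-test is a placeholder):

# 404 -> bucket does not exist; 403 -> exists but private; 200 -> listable
curl -s -o /dev/null -w '%{http_code}\n' https://storage.googleapis.com/bucket-name-to-test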

GCPBucketBrute A script to brute-force GCP bucket names.

# -k is the keyword to permute; -u runs the scan unauthenticated
python3 gcpbucketbrute.py -k example -u

Azure Blob Storage

Azure storage account names must be globally unique and are used as subdomains of blob.core.windows.net.
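
Because account names are DNS labels, a lookup is a cheap existence check before any HTTP probing (companyname and container-guess below are placeholders):

# The storage account exists only if the subdomain resolves
nslookup companyname.blob.core.windows.net

# If it resolves, probe for publicly listable containers
curl -s 'https://companyname.blob.core.windows.net/container-guess?restype=container&comp=list'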

MicroBurst A collection of PowerShell scripts for Azure security auditing, including storage enumeration.

# From within PowerShell
Invoke-EnumerateAzureBlobs -Base companyname

5. General Cloud Reconnaissance Tools

cloud_enum A multi-cloud tool that can enumerate public resources for AWS, GCP, and Azure. It's one of the most comprehensive tools available.

# Enumerate based on a keyword (company name)
cloud_enum -k exampleinc

# Keywords can also be domains; -m supplies a custom mutations file
cloud_enum -k example.com -m permutations.txt

6. Advanced Techniques

Certificate Transparency Logs

Subdomains tied to cloud services (e.g., hostnames resolving into s3.eu-west-1.amazonaws.com) sometimes appear in Certificate Transparency logs. This can reveal both the existence and the region of S3 buckets.

See the Subdomain Enumeration cheatsheet for tools that can query CT logs.
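
One hedged sketch using crt.sh's JSON endpoint (an unofficial interface that can be slow or rate-limited; assumes jq and dig are available):

# Pull all certificate names logged for the target domain
curl -s 'https://crt.sh/?q=%25.example.com&output=json' | jq -r '.[].name_value' | sort -u > ct-names.txt

# Resolve each name and flag CNAMEs into cloud providers
while read -r host; do
  cname=$(dig +short CNAME "$host")
  echo "$cname" | grep -qE 'amazonaws\.com|googleapis\.com|windows\.net' && echo "$host -> $cname"
done < ct-names.txt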

JavaScript File Analysis

Web applications often contain hardcoded URLs to cloud storage assets.

# Use gau to get URLs, then grep for cloud patterns
gau --subs example.com | grep -E 's3\.amazonaws\.com|storage\.googleapis\.com|blob\.core\.windows\.net'

Reverse IP Lookup

If you find an IP address belonging to a cloud provider, a reverse IP lookup can sometimes reveal other domains or services hosted on the same infrastructure, potentially belonging to the same company.
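
A minimal sketch with dig (the address is an arbitrary AWS example; not every cloud IP has a PTR record):

# PTR records for cloud IPs often encode the provider and region
dig +short -x 52.19.1.1
# e.g. ec2-52-19-1-1.eu-west-1.compute.amazonaws.com.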

7. Notes and Pitfalls

  • False Positives: Permutation scanning will generate many non-existent bucket names. It's a numbers game.
  • Permissions vs. Existence: A bucket might exist but not be public. An "Access Denied" error is still a finding, as it confirms the existence of the asset.
  • Throttling: Cloud providers may throttle or block excessive, unauthenticated requests. Use tools with built-in delay/retry mechanisms.
  • Default vs. Custom Domains: Assets can be served from default cloud URLs (e.g., s3.amazonaws.com) or from custom domains (e.g., assets.example.com). You must search for both.

8. Quick Reference Table

| Task | Tool / Method | Example Command / Note |
| --- | --- | --- |
| Multi-cloud enumeration | cloud_enum | cloud_enum -k companyname |
| Find S3 buckets | s3scanner | s3scanner --buckets-file list.txt |
| Find GCP buckets | GCPBucketBrute | python3 gcpbucketbrute.py -k companyname -u |
| Find Azure blobs | MicroBurst | Invoke-EnumerateAzureBlobs -Base companyname |
| Check single S3 bucket | AWS CLI | aws s3 ls s3://bucket-name --no-sign-request |
| Find cloud URLs in JS files | gau + grep | gau example.com \| grep 's3.amazonaws.com' |
| Identify cloud provider IPs | whois / ASN info | Look for "AMAZON-02", "GOOGLE-CLOUD", "MSFT" in ASN results. |
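
For the last row, one concrete option is the Team Cymru whois service (the address is an arbitrary example):

# Returns AS number, prefix, and AS name in one line
whois -h whois.cymru.com " -v 52.19.1.1"
# The AS Name column (e.g. AMAZON-02) identifies the provider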