Find information before we attack

We need to gather more information before attacking! There's usually a ton of useful info hidden in a site. Always check every sent and returned header when analyzing a web app!
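A quick way to see both directions at once is a sketch like the one below: `curl -v` prefixes sent headers with `>` and returned headers with `<` (the example URL is the same placeholder target used throughout these notes).

```shell
# Print both sent ('>') and returned ('<') headers for a request.
# curl -v writes its trace to stderr, hence the 2>&1.
show_headers() {
  curl -sv "$1" -o /dev/null 2>&1 | grep -E '^[<>] '
}
# Example: show_headers https://m4lwhere.org
```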

curl -is -X OPTIONS m4lwhere.org    # List the HTTP methods the server claims to support (check the Allow header)
curl -s --head m4lwhere.org | grep -i server    # Check what the Server header reveals about the web server
curl -d "param1=value&param2=value" https://m4lwhere.org/resource.cgi    # Send POST parameters with curl

# Below: Test all available HTTP methods for a site
for i in GET HEAD POST PUT DELETE TRACE OPTIONS; do echo "====Trying $i method===="; curl -X $i https://m4lwhere.org --head; done


Spidering a website makes offline analysis much easier. Programs like wget and cewl are great for the command line; Burp and ZAP can automate spidering from the GUI.

Spider the site once as an authenticated user, then attempt to reach the same pages without authentication. Determine if an insecure direct object reference (IDOR) exists!
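A minimal sketch of that check, assuming a urls.txt built from the authenticated spider (the file name is a placeholder):

```shell
URLS="urls.txt"                  # one URL per line, e.g. from the wget mirror

check_unauth() {                 # flag pages still reachable without a session
  local code
  code=$(curl -sk -o /dev/null -w "%{http_code}" "$1")
  if [ "$code" = "200" ]; then
    echo "POSSIBLE-IDOR $1"
  fi
}

if [ -f "$URLS" ]; then
  while read -r url; do
    check_unauth "$url"
  done < "$URLS"
fi
```

Any page that still returns 200 with no cookie attached deserves a manual look; redirects to the login page (301/302) are the expected behavior.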

wget -r -P /tmp --no-check-certificate https://m4lwhere.org    # Manual spidering of site using wget, saves to local disk
wget -e robots=off    # Ignore robots.txt and spider disallowed paths too; by default wget obeys robots.txt and skips them
export https_proxy=    # Sets the proxy to a Burp instance running, useful to spider all info into Burp as well

cewl https://m4lwhere.org    # Gather a unique list of all words on a page, spiders to linked pages
cewl -d 3 -m 5 -w words.txt https://m4lwhere.org    # Depth of 3 pages, words min 5 chars long, output to file words.txt
cewl -d 5 -m 3 -w wordlist --with-numbers https://m4lwhere.org    # Depth of 5, min 3 char words, includes words with numbers in them!


ffuf is an exceptionally fast tool for enumerating content on a host.

# Enumerate files on website
ffuf -w /usr/share/wordlists/seclists/Discovery/Web-Content/directory-list-2.3-medium.txt -u http://horizontall.htb/FUZZ -e .html,.php,.txt

# Enumerate subdomains
ffuf -w subdomains.txt -u http://website.com/ -H "Host: FUZZ.website.com" -mc 200

Replace normal values with exploits or garbage data to identify vulnerabilities. FUZZ EVERYTHING: headers, parameters, payloads. Compare each response against a baseline request and look for changes in status code, byte count, or content; a short script (e.g. with Python's re library) helps here. Check SecLists [https://github.com/danielmiessler/seclists] for fuzzing wordlists and payloads.
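A rough sketch of that baseline comparison in shell (the URL, parameter name, and payload file are assumptions; any response whose size differs from the baseline deserves a closer look):

```shell
URL="https://m4lwhere.org/resource.cgi"   # target endpoint (placeholder)
PAYLOADS="payloads.txt"                   # e.g. a SecLists fuzzing file

body_len() {                              # response size for one payload value
  curl -sk -d "param1=$1" "$URL" | wc -c
}

flag_anomaly() {                          # report sizes differing from baseline
  if [ "$2" -ne "$1" ]; then
    echo "DIFFERS ($2 bytes): $3"
  fi
}

if [ -f "$PAYLOADS" ]; then
  BASE=$(body_len "value")                # baseline with a benign value
  while read -r p; do
    flag_anomaly "$BASE" "$(body_len "$p")" "$p"
  done < "$PAYLOADS"
fi
```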

wfuzz -z file,/usr/share/wordlists/wfuzz/general/big.txt --hc 404 http://obscurity.htb:8080/FUZZ/SuperSecureServer.py

Vhost Enumeration

With virtual host (vhost) enumeration, we search for additional websites which may be served from the same host.

ffuf -H "Host: FUZZ.goblins.local" -H "User-Agent: Vhost Finder" -c -w /usr/share/seclists/Discovery/DNS/combined_subdomains.txt -u

Username harvesting searches for valid users of a webapp. Use login forms to check whether the response differs between a good username with a bad password and a bad username with a bad password. Side-channel attacks may also reveal valid usernames: compare the response time for a known good username against a bad one. A bad username may be rejected instantly, whereas a good username triggers a password hash computation and responds a few milliseconds slower.
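A timing probe could look like this sketch (the URL, field names, and usernames.txt are assumptions; take several samples per user in practice, since network jitter can swamp a few milliseconds):

```shell
URL="https://m4lwhere.org/login"   # login endpoint (placeholder)
USERS="usernames.txt"              # candidate usernames, one per line

time_login() {                     # total request time in seconds for one user
  curl -sk -o /dev/null -w "%{time_total}" -d "user=$1&pass=wrongpass" "$URL"
}

if [ -f "$USERS" ]; then
  while read -r u; do
    printf '%s %s\n' "$(time_login "$u")" "$u"
  done < "$USERS" | sort -rn | head   # slowest responses first: likely valid users
fi
```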

Identify Components

Tools such as the Wappalyzer plugin and Shodan make this very easy!

Apache, IIS, NGINX, Python?

Components are identified by port scans, default web pages, and fingerprinting tools, and may reveal configuration information.
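For example, a quick header-based fingerprint (the header names below are common giveaways but not exhaustive, and the target is a placeholder):

```shell
fingerprint() {   # pull the most revealing headers from a HEAD request
  curl -skI "https://$1/" | grep -iE '^(server|x-powered-by|x-aspnet-version):'
}
# Example: fingerprint m4lwhere.org
# Confirm from the port side with a service scan: nmap -sV -p 80,443 m4lwhere.org
```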


