# Enumeration

We need to gather more info before attacking! There's usually a ton of useful info hidden in a site. Always check all request and response headers when analyzing a web app!

```bash
curl -is -X OPTIONS m4lwhere.org    # List the HTTP methods the server claims to support (Allow header)
curl -s --head m4lwhere.org | grep -i server    # Grab the Server header banner
curl -d "param1=value&param2=value" https://m4lwhere.org/resource.cgi    # Send parameters with curl

# Below: Test all available HTTP methods for a site
for i in GET HEAD POST PUT DELETE TRACE OPTIONS; do echo "====Trying $i method===="; curl -X $i https://m4lwhere.org --head; done
```

## Spidering

Spidering a website makes offline analysis much easier. Programs like `wget` and `cewl` are great for the command line; Burp and ZAP can automate spidering from the GUI.

Spider the site once as an *authenticated user*, and then attempt to reach the same pages *without authentication*. Determine if insecure direct object reference exists!

```bash
wget -r -P /tmp --no-check-certificate https://m4lwhere.org    # Manual spidering of site using wget, saves to local disk
wget -r -e robots=off https://m4lwhere.org    # Ignore robots.txt and spider the paths it tries to exclude; by default wget honors it
export https_proxy=https://127.0.0.1:8080    # Sets the proxy to a Burp instance running, useful to spider all info into Burp as well

cewl https://m4lwhere.org    # Gather a unique list of all words on a page, spiders to linked pages
cewl -d 3 -m 5 -w words.txt https://m4lwhere.org    # Depth of 3 pages, words min 5 chars long, output to file words.txt
cewl -d 5 -m 3 -w wordlist --with-numbers https://m4lwhere.org    # Depth of 5, min 3 char words, includes words with numbers in them!
```
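The authenticated-vs-unauthenticated check above can be scripted: replay each spidered URL with and without a session cookie and compare status codes. A minimal sketch, assuming you feed it the URL list from an authenticated spider run; the cookie name/value is hypothetical:

```bash
check_idor() {            # usage: check_idor <url-list-file> <cookie-string>
  local urls="$1" cookie="$2" auth anon url
  while read -r url; do
    auth=$(curl -s -o /dev/null -w '%{http_code}' -b "$cookie" "$url")
    anon=$(curl -s -o /dev/null -w '%{http_code}' "$url")
    # Same 200 with and without the session cookie hints at missing access control
    [ "$auth" = "200" ] && [ "$anon" = "200" ] && echo "CHECK: $url"
  done < "$urls"
}
# check_idor spidered_urls.txt "PHPSESSID=deadbeef"    # hypothetical cookie
```

Any URL it flags deserves a manual look; identical content anonymously is a strong IDOR/broken-access-control signal.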

## Fuzzing

`ffuf` is an exceptionally fast tool for enumerating a host.

```bash
# Enumerate files on website
ffuf -w /usr/share/wordlists/seclists/Discovery/Web-Content/directory-list-2.3-medium.txt -u http://horizontall.htb/FUZZ -e .html,.php,.txt

# Enumerate subdomains
ffuf -w subdomains.txt -u http://website.com/ -H "Host: FUZZ.website.com" -mc 200
```

Replace normal values with exploits or garbage data to identify vulnerabilities. FUZZ EVERYTHING: headers, parameters, payloads. Watch for deviations from baseline requests, such as different byte counts or content; scripting the comparison (for example with Python's `re` library) helps. Check [SecLists](https://github.com/danielmiessler/seclists) for fuzzing sources and payloads.

```bash
wfuzz -z file,/usr/share/wordlists/wfuzz/general/big.txt --hc 404 http://obscurity.htb:8080/FUZZ/SuperSecureServer.py
```
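Detecting "different bytes or content" against a baseline can be automated in the shell as well. A rough sketch, assuming a hypothetical `fetch` wrapper around your target URL (response length is a cheap proxy for changed server behavior):

```bash
fetch() { curl -s "https://m4lwhere.org/resource.cgi?param1=$1"; }   # hypothetical endpoint

fuzz_diff() {             # usage: fuzz_diff <baseline-value> <payload-file>
  local baseline n p
  baseline=$(fetch "$1" | wc -c)
  while read -r p; do
    n=$(fetch "$p" | wc -c)
    # A different response size means the payload changed something server-side
    [ "$n" -ne "$baseline" ] && echo "DIFF ($n vs $baseline bytes): $p"
  done < "$2"
}
# fuzz_diff "normalvalue" /usr/share/seclists/Fuzzing/special-chars.txt
```

Length alone misses same-size differences, so hashing the body (`md5sum`) is a stricter variant of the same idea.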

### Vhost Enumeration

With Virtual Hosts, we are searching for additional web servers which may be present on this host.

```bash
ffuf -H "Host: FUZZ.goblins.local" -H "User-Agent: Vhost Finder" -c -w /usr/share/seclists/Discovery/DNS/combined_subdomains.txt -u http://10.0.0.1
```

### Username Harvesting

Username harvesting searches for valid users of a webapp. Use login forms to check for differences between `good username/bad password` and `bad username/bad password` responses. Side-channel attacks may also reveal valid usernames: compare response timing for a known-good username vs a bad one. A bad username may be rejected instantly, whereas a good username's password may be hashed by the system, returning a few milliseconds slower.
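The timing side channel can be measured with curl's `%{time_total}` write-out variable. A minimal sketch; the `/login` endpoint and form field names are assumptions to adapt to the target:

```bash
time_login() {            # usage: time_login <username>
  # Prints total request time in seconds; valid users that trigger a hash
  # computation tend to respond measurably slower
  curl -s -o /dev/null -w '%{time_total}\n' \
    -d "username=$1&password=wrongpass" https://m4lwhere.org/login
}
# for u in admin m4lwhere nosuchuser; do printf '%s: ' "$u"; time_login "$u"; done
```

Run each candidate several times and compare averages; network jitter can easily swamp a few milliseconds on a single request.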

* [ ] SQL Injection
* [ ] XSS
* [ ] Password Spraying
* [ ] Directory Traversal
* [ ] LFI
* [ ] RFI

## Identify Components

Tools such as Wappalyzer and Shodan make this very easy!

{% tabs %}
{% tab title="Web Server" %}
Apache, IIS, NGINX, Python?

Identified by port scans, default web pages, and fingerprinting tools. May display configuration information.
{% endtab %}

{% tab title="Application Frameworks" %}
Spring, ASP.NET, Django, Symfony

Identified by default pages, vuln scans, config files, admin pages, and fingerprinting tools
{% endtab %}

{% tab title="CMS" %}
WordPress, Drupal, Joomla, SharePoint

Default web pages, config files
{% endtab %}

{% tab title="Databases" %}
MySQL, Microsoft SQL, Oracle, Postgres, MongoDB

Usually found in detailed application errors, may leak information about the backend database
{% endtab %}

{% tab title="Other Software" %}
SSH, RDP, FTP, SMB

Other vulnerable or interesting applications available on other ports of the server
{% endtab %}
{% endtabs %}
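A quick manual fingerprint covering the first few tabs above can be done with curl alone: pull the banner headers and any CMS `generator` meta tag. A sketch, not a replacement for Wappalyzer:

```bash
fingerprint() {           # usage: fingerprint <url>
  curl -sI "$1" | grep -iE '^(server|x-powered-by):'                  # web server / framework banner
  curl -s "$1"  | grep -ioE '<meta name="generator" content="[^"]*"'  # CMS version tag, if present
}
# fingerprint https://m4lwhere.org
```

Hardened servers strip or fake these values, so treat a missing banner as "unknown", not "nothing there".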

#### Checklist

* [ ] Is the site on 80, 443, or some different port?
* [ ] Are there any vhosts?
  * [ ] Check HTTPS cert for other potential servers
* [ ] Check `robots.txt` for exclusions
* [ ] Read the HTML source for comments or hidden pages
* [ ] Try separate request methods when interacting
  * [ ] GET instead of POST for an interaction
* [ ] Brute force directories with gobuster
  * [ ] If 403, try bruteforcing PAST those directories
* [ ] Find parameters, test in order
  * [ ] Command Injection
  * [ ] SQLi
  * [ ] noSQLi
  * [ ] XXE
* [ ] Fuzz EVERYTHING!
  * [ ] Headers
  * [ ] Cookies
  * [ ] POST parameters
  * [ ] GET parameters
  * [ ] PUT payloads
  * [ ] ALL INPUTS
* [ ] Check `Accept:` headers to see if new data types are served
  * [ ] Client side for SENDING DATA
  * [ ] Client side for RECEIVING DATA
* [ ] Check for differences in good username/badpass and bad username/badpass
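The `Accept:` header check from the list above can be looped in the shell: request the same resource as several content types and eyeball what comes back. A sketch with a hypothetical endpoint:

```bash
accept_probe() {          # usage: accept_probe <url>
  for type in application/json application/xml text/plain; do
    echo "==== $type ===="
    # First 200 bytes of each representation is enough to spot a different serializer
    curl -s -H "Accept: $type" "$1" | head -c 200; echo
  done
}
# accept_probe https://m4lwhere.org/api/resource    # hypothetical endpoint
```

An API that suddenly serves XML may route through different (and differently filtered) code paths than its JSON handler.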

## References

{% embed url="https://owasp.org/www-project-web-security-testing-guide/stable/4-Web_Application_Security_Testing/01-Information_Gathering/README.html" %}
Information Gathering - OWASP
{% endembed %}
