Subdomain enumeration technique to discover critical vulnerabilities…

Md. Mahim Bin Firoj
2 min read · Jan 25, 2025


Hi, this is Md. Mahim Bin Firoj Avi, commonly known as Blackfly in the cybersecurity space. Today I will share some techniques that I follow while enumerating subdomains. Let's get started…

Step 1:

The first step when enumerating any target is finding as many subdomains as possible. The tools I use are crt.sh, Sublist3r, subfinder, and amass.

Go to the crt.sh website and enter your domain there; it will return subdomains found in certificate transparency logs.
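
crt.sh also has a JSON endpoint you can query from the shell instead of the browser. A minimal sketch using curl and jq (%25 is a URL-encoded % wildcard; the output filename crtsh.txt is my choice):

# pull hostnames from certificate transparency logs, strip wildcard prefixes, dedupe
curl -s 'https://crt.sh/?q=%25.example.com&output=json' | jq -r '.[].name_value' | sed 's/^\*\.//' | sort -u > crtsh.txt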

https://github.com/aboul3la/Sublist3r
apt install sublist3r -y
sublist3r -d example.com -t 100 > sublist3r.txt (-t sets the thread count; 100 or 300 gives faster results)
apt install subfinder -y
subfinder -up (to update it to its latest version)

subfinder -silent -d example.com (the -silent option hides the banner and extra info on the shell)
or
subfinder -silent -d example.com > subfinder.txt
amass enum -passive -d owasp.org -o amass_passive_owasp.txt
amass enum -active -d owasp.org -o amass_active_owasp.txt (use separate output files so the active run does not overwrite the passive results)

Now use ChatGPT to extract only the subdomains from the above files and save them as subamass.txt, or use the grep sketch below.
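
If you'd rather not paste output through ChatGPT, a grep one-liner can pull the hostnames out locally. A rough sketch, assuming the amass output files above and that any hostname ending in .owasp.org is wanted:

# -h drops filename prefixes, -o prints only the matching hostnames
grep -hoE '([a-zA-Z0-9_-]+\.)+owasp\.org' amass_passive_owasp.txt amass_active_owasp.txt | sort -u > subamass.txt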

Step 2:

Now the second step is to merge all the subdomains from crt.sh, Sublist3r, subfinder, and amass, remove duplicates, sort them uniquely, and save the result as subdomainsall.txt.

ChatGPT can help you with this, or you can do it locally as shown below.
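
A one-liner does the same merge locally. This sketch assumes each Step 1 file holds one subdomain per line (clean any tool banners out of sublist3r.txt first; crtsh.txt and subamass.txt are the files from the sketches above):

cat crtsh.txt sublist3r.txt subfinder.txt subamass.txt | sort -u > subdomainsall.txt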

Step 3:

Now we need to use a probing tool called httpx (from ProjectDiscovery) to find out which of the subdomains in subdomainsall.txt are live.

httpx is written in Go, so install the Go toolchain first. Follow the commands below:

apt install golang-go -y
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest

cd ~/go/bin (go install drops the binary into ~/go/bin by default)
./httpx -l /root/subdomainsall.txt -o /root/livesuball.txt
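
Instead of cd-ing into go/bin every time, you can add Go's bin directory to your PATH. A sketch assuming the default GOPATH of $HOME/go and a bash shell:

echo 'export PATH="$PATH:$HOME/go/bin"' >> ~/.bashrc
source ~/.bashrc
httpx -version (httpx now runs from any directory)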

Step 4:

Now we need to crawl the subdomains using the katana or waybackurls tool.

go install github.com/tomnomnom/waybackurls@latest

CGO_ENABLED=1 go install github.com/projectdiscovery/katana/cmd/katana@latest

./katana -u https://example.com > /root/urlskatana.txt

or

echo example.com | ./waybackurls > /root/urlswayback.txt (waybackurls has no -u flag; it reads bare domains from stdin or as an argument)

cat /root/livesuball.txt | ./waybackurls > /root/waybackurl_allsub.txt
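
katana can consume the same list of live hosts through its -list flag if you want crawl coverage across every subdomain too. A sketch; the output filename is my choice:

./katana -list /root/livesuball.txt > /root/urlskatana_allsub.txt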

Note: waybackurls usually returns far more URLs than katana, because katana actively crawls the live site while waybackurls pulls historical URLs from the Wayback Machine archive. We ran both tools against the same domain and got very different counts:

cat /root/urlskatana.txt | wc -l
64

cat /root/urlswayback.txt | wc -l
1500

Now we can run httpx again against those crawled URLs to see how many are still active.
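
For example, probing the wayback output (the filename livewayback.txt is my choice):

./httpx -l /root/waybackurl_allsub.txt -o /root/livewayback.txt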

Step 5:

Vulnerability scanning with nuclei and other tools.

nuclei -update-templates
nuclei -u https://www.domain.com (scan a single URL)

nuclei -l /root/livesuball.txt -o nuclei_reports.txt -rl 50 (-rl sets the rate limit in requests per second)

Custom templates:
nuclei -l /root/livesuball.txt -t /opt/Blackfly/custom_templates/ -o nuclei_reports.txt
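
On a large target list it can also help to run the high-impact templates first. A sketch using nuclei's -severity filter; the output filename is my choice:

nuclei -l /root/livesuball.txt -severity critical,high -rl 50 -o nuclei_critical_high.txt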

You should also use Burp Suite, OWASP ZAP, and Nessus for scanning…

Thanks, I hope you like this write-up. Please subscribe below.

LinkedIn:

https://www.linkedin.com/in/md-mahimbin-firoj-7b8a5a113/

YouTube:

https://www.youtube.com/@mahimfiroj1802/videos
