
🔎 Extensive Recon Guide For Bug Hunting

WHAT IS RECONNAISSANCE?

Reconnaissance, also known as recon, is one of the most important phases of penetration testing.
Recon helps you increase the attack surface and may surface more vulnerabilities, but the ultimate goal is to dig deep into the target.

🔹 Recon = Increase in Attack surface = More vulnerabilities
🔹 Recon = Finding untouched endpoints = Fewer duplicates
🔹 Recon = Sharpening your axe before attack


1. SUBDOMAIN ENUMERATION

Subdomain enumeration is the process of finding subdomains for one or more domains.

Tools used ⇒

🔹Visual Recon⇒

🔹Command Line⇒

Oneliners for Subdomain Enumeration ⇒

$ amass enum -passive -norecursive -noalts -df domains.txt -o subs-list.txt
$ dnsx -silent -d $domain -w ~/wordlist.txt -o ~/dnsbrute.txt
$ cat domain.txt | dnsgen - | massdns -r ~/resolvers.txt -o S -w alive.txt
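Output from several enumeration tools is easiest to work with once merged and deduplicated. A minimal sketch with demo data; `amass.txt` and `subfinder.txt` are assumed placeholder names for whatever files your tools wrote:

```shell
# Demo input standing in for real tool output files
printf 'www.example.com\nAPI.example.com\n' > amass.txt
printf 'api.example.com\ndev.example.com\n' > subfinder.txt

# Merge, lowercase, and deduplicate before probing with httpx
cat amass.txt subfinder.txt | tr '[:upper:]' '[:lower:]' | sort -u > all-subs.txt
cat all-subs.txt
```

Lowercasing first matters: `sort -u` treats `API.example.com` and `api.example.com` as different lines otherwise.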

2. FILTERING THE SUBDOMAINS WITH HTTPX

$ httpx -l domain.txt -timeout 13 -o domain-probe.txt

PORT SCANNING ⇒

$ naabu -list sub-list.txt -top-ports 1000 -exclude-ports 80,443,21,22,25 -o ports.txt
$ naabu -list sub-list.txt -p - -exclude-ports 80,443,21,22,25 -o ports.txt
$ cat domain-subs.txt | aquatone -ports xlarge -scan-timeout 300 -out aquatone-out
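naabu writes one `host:port` pair per line, so to feed the hosts with open ports back into httpx you can strip the port and deduplicate. A sketch with demo data in place of a real `ports.txt`:

```shell
# Demo input: naabu-style host:port output
printf 'dev.example.com:8080\ndev.example.com:8443\nwww.example.com:8080\n' > ports.txt

# Strip ports, keep unique hosts for further probing
cut -d: -f1 ports.txt | sort -u > hosts-with-open-ports.txt
cat hosts-with-open-ports.txt
```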

SUBDOMAIN OF SUBDOMAIN ENUMERATION ⇒
“This is one of the rare things people search for.”
Tools used:

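Besides dedicated tools, you can surface subdomains of subdomains from a list you already have by counting dot-separated labels. A sketch assuming a two-label apex domain like example.com (adjust `NF > 3` for longer apexes):

```shell
# Demo input: mix of first- and second-level subdomains
printf 'www.example.com\napi.dev.example.com\nstage.api.example.com\n' > all-subs.txt

# Keep hosts with more than one label before the registered domain
awk -F. 'NF > 3' all-subs.txt > nested-subs.txt
cat nested-subs.txt
```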

3. SCREENSHOT TOOLS

These tools take screenshots of websites so you can review them offline at any time.

Tools used ⇒


4. BROADENING YOUR SCOPE

More targets lead to more options, which ultimately lead to more opportunities.


5. WHAT TO DO AFTER ENUMERATION? | Collecting URLs


6. TIPS AND TRICKS

1. After collecting URLs, fetch each one with curl and grep the responses for the following strings:

  • drive.google

  • docs.google

  • /spreadsheets/d/

  • /document/d/

    $ cat domains.txt | katana -silent | while read url; do cu=$(curl -s "$url" | grep -E '(drive\.google|docs\.google|/spreadsheets/d/|/document/d/)'); echo -e " ==> $url\n $cu"; done
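The grep pattern in that one-liner can be sanity-checked offline against a saved response body before running it across a whole target list:

```shell
# Demo response body containing a leaked Google Docs link
printf '<a href="https://docs.google.com/spreadsheets/d/abc123/edit">sheet</a>\n' > response.html

# Same pattern as the one-liner above, run against the saved response
grep -oE '(drive\.google|docs\.google|/spreadsheets/d/|/document/d/)' response.html | sort -u > matches.txt
cat matches.txt
```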
    

2. Using paramspider, gxss to detect Cross-site Scripting (XSS)

  • Commands:-

    $ cat params | qsreplace yogi | dalfox pipe --mining-dom --deep-domxss --mining-dict --remote-payloads=portswigger,payloadbox --remote-wordlists=burp,assetnote -o xssoutput.txt
    
    $ cat alive.txt | waybackurls | gf xss | uro | httpx -silent | qsreplace '"><svg onload=confirm(1)>' | airixss -payload "confirm(1)" | tee xssBug3.txt
    

3. Using SQLiDetector to search for SQL injection (SQLi)


7. SHODAN FOR RECON

Shodan: https://www.shodan.io/

Shodan Dorks ⇒

ssl:"target[.]com" 200 http.title:"dashboard" - unauthenticated dashboards
org:"target.com" x-jenkins 200 - unauthenticated Jenkins servers
ssl:"target.com" 200 proftpd port:21 - ProFTPD servers on port 21
http.html:zabbix - Zabbix Main & Admin portals; check CVE-2022-24255 (authentication bypass)
org:"target.com" http.title:"phpmyadmin" - phpMyAdmin panels
ssl:"target.com" http.title:"BIG-IP" - F5 BIG-IP; check CVE-2020-5902

Example Writeup:- How I found XSS by searching in Shodan - Writeup


8. CENSYS FOR RECON

Censys: https://www.censys.io/
Example Writeup:- Lets fuck waf using Origin IP: My approach on censys By Deepak Dhiman


9. FUZZING FOR SENSITIVE FILES & DIRECTORIES

$ for i in $(cat host.txt); do ffuf -u $i/FUZZ -w wordlist.txt -mc 200,302,401 -se; done
  • Tip: Fuzz for "/wp-content/debug.log" || It sometimes contains SQL errors that can be chained into further attacks.

10. FINDING SOURCE/BACKUP FILES

Subdomain Name: y0gi.hacklido.com

y0gi.hacklido.com/y0gi.zip - hacklido.zip, admin.zip, backup.zip
y0gi.hacklido.com/y0gi/y0gi.zip - hacklido.zip, admin.zip, backup.zip
y0gi.hacklido.com/hacklido/y0gi.zip - hacklido.zip, admin.zip, backup.zip
y0gi.hacklido.com/admin/y0gi.zip - hacklido.zip, admin.zip, backup.zip

Tool Link: https://github.com/musana/fuzzuli
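A hedged sketch of the naming idea fuzzuli automates (not its actual implementation): candidate archive names are derived from the hostname's own labels plus common words like `admin` and `backup`:

```shell
# Build candidate backup-file URLs from a hostname's labels
host="y0gi.hacklido.com"
labels=$(echo "$host" | tr '.' '\n' | sed '$d')   # y0gi, hacklido (drop the TLD)
for w in $labels admin backup; do
  echo "https://$host/$w.zip"
done > backup-candidates.txt
cat backup-candidates.txt
```

Feed the resulting list to httpx or ffuf; repeat with other extensions (.tar.gz, .sql, .bak) and the nested paths shown above.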


11. GOOGLE & GITHUB DORKING

Trivial Tricks:

  • Find Sensitive Data in Cloud storage through Google Dork:

site:http://s3.amazonaws.com “target[.]com”
site:http://blob.core.windows.net “target[.]com”
site:http://googleapis.com “target[.]com”
site:http://drive.google.com “target[.]com”

  • Github Leaks for AWS, Jira, Okta, etc:

org:"target" pwd/pass/passwd/password
“target.atlassian” pwd/pass/passwd/password
“target.okta” pwd/pass/passwd/password
“Jira.target” pwd/pass/passwd/password

  • Also search in Google groups, Gitlabs.

12. JAVASCRIPT[JS] FILES RECON

🔹 Grab all URLs from waybackurls or gau.

  • Collect all JavaScript files (".js")

  • Filter JS files by content type:

    $ cat urls.txt | httpx -content-type | grep 'application/javascript'
    
  • Run a Nuclei scan over them:

    $ nuclei -l js-urls.txt -t ~/nuclei-templates/exposures/
    

🔹 Js Recon Tip:

  • Collect all endpoints from JS files and build a wordlist from them.
  • Craft a POST request with any parameter.
  • Use that request to fuzz for sensitive directories.
  • Tools: JSFScan.sh, JSMiner (Burp extension), TruffleHog
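The endpoint-collection step above can be sketched with grep alone; `app.js` here is demo data standing in for a downloaded application bundle:

```shell
# Demo JS file; normally this is a bundle you downloaded from the target
printf 'fetch("/api/v1/users");\nconst u="/internal/debug";\n' > app.js

# Pull quoted relative paths out of the JS to seed a fuzzing wordlist
grep -oE '"/[a-zA-Z0-9_/.-]+"' app.js | tr -d '"' | sort -u > js-endpoints.txt
cat js-endpoints.txt
```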

13. SOME AUTOMATION FRAMEWORKS

🔹Sudomy: https://github.com/Screetsec/Sudomy
🔹Reconftw: https://github.com/six2dez/reconftw


Final Thoughts

🔹Verify your data

  • Some exposed data is intentional; there is no bug in that case.

🔹Reported > Invalid

  • Don't get angry over invalid reports; you may lose rapport with a good program.

🔹Yes, they do accept third-party issues

  • Your crafted exploits are gold. Demonstrate as much impact as you can.

🔹Be humble with the program

  • The money isn't going anywhere. Don't message the team constantly.