Building a custom automation script is a rite of passage for any serious bug bounty hunter. It allows you to combine the best tools into a single, high-speed machine that works exactly how you want it to.
In this post, I will break down the bash script I use to map out an organization's entire digital footprint. The script takes three inputs: a file of root domains (wildcards), an output directory for the results, and, optionally, a file of CIDR ranges (IP infrastructure).
You can find the script on my GitHub page: github.com/Gopi-eng2202

The Workflow: Wide to Narrow Recon
My script follows a strategic flow: gather everything passively, expand into infrastructure, merge the data, and then probe for life.
1. Initialization and Setup
The script begins by checking if you provided the necessary arguments.
if [[ -z "$1" || -z "$2" || ! -f "$1" ]]; then
echo "Usage: $0 <domains-file> <output-directory>"
exit 1
fi
OUTPUT_DIR="/home/gopi2202/$2"
- $1: Your list of root domains (e.g., tesla.com).
- $2: The name of the folder where your results will be stored.
- $3 (Optional): A file containing CIDR ranges (e.g., 153.46.96.0/20).
- I run the script in the terminal like below:
recon wildcards.txt /home/path/to/save/the/outputs cidr.txt
- I have made my recon script into an executable and made it global, so I can call it from any directory. It's actually quite easy to turn any bash script into an executable, and it's a game changer; a minimal sketch is below.
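If you haven't done this before, here is a minimal sketch, assuming the script is saved as recon.sh (the filename and install path are just examples, not from the original script):

# Make the script executable
chmod +x recon.sh
# Copy it into a directory that is already on your PATH, dropping the extension
sudo cp recon.sh /usr/local/bin/recon
# Now it can be called from any directory
recon wildcards.txt /home/path/to/save/the/outputs cidr.txt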
2. Multi-Threaded Passive Discovery
I run four heavy-hitters simultaneously. The & at the end of each line tells Linux to run them in the background so they all work at once.
- Assetfinder: A lightning-fast tool that finds subdomains from various web sources.
- Findomain: Known for its incredible speed and accuracy.
- Subfinder: The industry standard for scraping dozens of passive sources.
- BBOT: A "heavy" framework that maps relationships between assets.
The wait command ensures the script doesn't move forward until these four tasks are finished.
echo "Finding subdomains using assetfinder"
cat "$1" | assetfinder --subs-only | tee "$OUTPUT_DIR/assetfinder.txt" &
echo "Finding subdomains using findomain"
findomain -f "$1" -u "$OUTPUT_DIR/findomain.txt" &
echo "Finding subdomains using subfinder"
subfinder -dL "$1" -all -o "$OUTPUT_DIR/subfinder.txt" &
echo "Finding subdomains using bbot"
echo | bbot -t "$1" -p subdomain-enum -rf passive --output-dir "$OUTPUT_DIR" --name bbot &
# Wait for all background jobs to finish
wait
3. Refining BBOT and GitHub Scouring
BBOT produces a lot of data, so I use grep and awk to extract just the DNS names.
echo "refining output file of bbot"
grep '\[DNS_NAME\]' "/home/*****/$2/bbot/output.txt" | awk -F'\t' '{print $2}' | tee "$OUTPUT_DIR/bbot.txt"
Note: BBOT works differently from other subdomain enumeration tools, so please read its documentation if you get stuck. Also, don't forget to add API keys to these tools in their configs.
Then, I move to GitHub-Subdomains.
echo "finding subdomains from github"
while IFS= read -r domain; do
[ -z "$domain" ] && continue
echo " → Running GitHub enum for: $domain"
github-subdomains -d "$domain" \
-t <your github token> \
-o "$OUTPUT_DIR/${domain}_git.txt"
done < "$1"This scours public GitHub repositories for leaked subdomains that aren't indexed by search engines.
4. Infrastructure and Cloud Recon
This is the secret sauce that finds forgotten "Shadow IT."
CIDR Expansion (The "Reverse DNS" Duo)
# Check if a third argument ($3) is provided and is a valid file
if [[ -n "$3" && -f "$3" ]]; then
echo "Processing CIDR ranges from $3..."
cat "$3" | xargs -I{} prips {} | hakip2host | tee -a "$OUTPUT_DIR/cidrdomains.txt"
echo "Extracting third word from cidrdomains.txt..."
awk '{print $3}' "$OUTPUT_DIR/cidrdomains.txt" | tee -a "$OUTPUT_DIR/cidrsubs.txt"
fi
- Prips: Expands a CIDR range into a list of every individual IP.
- Hakip2host: Checks those IPs for domain names (via reverse DNS and SSL certificates); see the sketch after this list.
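To make the awk '{print $3}' step above clearer, here is a rough illustration of what the duo produces for a tiny range. The three-column layout shown (record type, IP, hostname) is what hakip2host typically prints, but treat the exact format as an assumption and check it against your own run:

# Expand a small documentation range (192.0.2.0/30 is just an example, not from the script)
echo "192.0.2.0/30" | xargs -I{} prips {} | hakip2host
# Example output lines (illustrative only):
# [DNS-PTR] 192.0.2.1 host1.example.com
# [SSL-SAN] 192.0.2.2 www.example.com
# awk '{print $3}' then keeps only the third column, i.e. the hostname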
Cloud Scraping
- From my previous post, I gather all the cloud data and put it into a single file called clfinal.txt.
- The script greps through a massive dataset (clfinal.txt) of SNI (Server Name Indication) logs. This finds domains that are hidden behind shared cloud IPs like AWS or Azure.
# Create an empty file for storing results
> "$OUTPUT_DIR/cloud.txt"
# Loop through each domain in the input file ($1) and process it
while IFS= read -r domain; do
grep -F "$domain" /home/****/clouddata/clfinal.txt | awk -F '[][]' '{print $2}' | sed ':a;N;$!ba;s/\n/#/g' | grep "$domain" | sort -fu | cut -d ',' -f1 | sort -u | tr ' ' '\n' |sed 's/^[*.]//g'|sed 's/^\.//g'|tr '#' '\n' | tee -a "$OUTPUT_DIR/cloud.txt"
done < "$1"5. The "Clean Up" Phase
Now that I have thousands of results from different tools, I need a single, clean list.
cat "$OUTPUT_DIR/cloud.txt" "$OUTPUT_DIR"/assetfinder.txt ... > "$OUTPUT_DIR/subdomains_raw.txt"
sort -u "$OUTPUT_DIR/subdomains_raw.txt" > "$OUTPUT_DIR/subdomains.txt"
sed -i 's/^\*\.//g' "$OUTPUT_DIR/subdomains.txt"
- sort -u: Alphabetizes the list and deletes every duplicate.
- sed: Removes wildcard prefixes like *. to ensure every line is a clean, testable hostname.
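For reference, the ellipsis in the cat command above stands in for the rest of the per-tool output files. Based on the filenames created earlier in this post, the full merge would look roughly like this (a sketch only; the exact list depends on which steps you actually run):

# Merge every per-tool output into one raw list
# (file list reconstructed from the outputs shown earlier in this post; adjust to match your script)
cat "$OUTPUT_DIR/cloud.txt" \
    "$OUTPUT_DIR/assetfinder.txt" \
    "$OUTPUT_DIR/findomain.txt" \
    "$OUTPUT_DIR/subfinder.txt" \
    "$OUTPUT_DIR/bbot.txt" \
    "$OUTPUT_DIR/cidrsubs.txt" \
    "$OUTPUT_DIR/"*_git.txt > "$OUTPUT_DIR/subdomains_raw.txt"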
6. Probing for Life and Port Scanning
The final step is verifying which of these domains are actually active.
HTTPX: The "Hit List" Generator
cat "$OUTPUT_DIR/subdomains.txt" | httpx -title -td -sc -cl -ct -web-server -o "$OUTPUT_DIR/httpx.txt"This probes every subdomain for a web server. It records the Status Code (is it 200 OK or 403 Forbidden?), the Title (look for "Admin" or "Staging"), and the Tech Stack.
Naabu: The Port Scanner
naabu -list "$OUTPUT_DIR/subdomains.txt" --exclude-ports 80,443 --passive
Finally, I use Naabu to check if there are any other services running on non-standard ports (like databases or SSH), skipping common web ports to stay efficient.
Summary
By running this one script, I go from a single domain name to a fully detailed spreadsheet of live assets, hidden cloud infrastructure, and leaked internal portals. This automation is what allowed me to find the $1,300 data leak on a forgotten QA environment.
In my next post, I will discuss how you can make this script run automatically every day on a VPS and get new subdomains sent to your Slack/Gmail. See you guys soon.
Thanks,
Gopi