I ran tools, collected subdomains, saved endpoints, and filled folders with notes. On paper, it looked like progress.
In reality, I wasn't finding anything.
Weeks passed. Then months. My recon data kept growing, but my bug reports stayed at zero.
The problem wasn't a lack of effort or tools. The problem was that I treated recon as the goal, not the starting point.
This article explains how I learned to turn raw recon notes into real, testable bug bounty findings, using the same tools most hunters already have.
What Recon Looked Like at the Beginning
My early recon process looked like this:
subfinder -d example.com -silent > subs.txt
httpx -l subs.txt -silent > live.txt

Then I'd run:

katana -u https://example.com -jc -o urls.txt

Sometimes:

waybackurls example.com > old_urls.txt

At the end, I had:
- Hundreds of subdomains
- Thousands of URLs
- Dozens of parameters
And no idea what to test first.
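Looking back, even a tiny triage step would have helped: grouping the collected URLs by path prefix to see where the application's surface actually concentrates. A minimal sketch (the sample URLs are illustrative, not from a real target):

```python
from collections import Counter
from urllib.parse import urlparse

def top_path_prefixes(urls, depth=2, n=5):
    """Count URLs by their first path segments to spot hot areas."""
    counts = Counter()
    for u in urls:
        parts = [p for p in urlparse(u).path.split("/") if p]
        counts["/" + "/".join(parts[:depth])] += 1
    return counts.most_common(n)

# Illustrative sample of a collected URL list
urls = [
    "https://example.com/api/v1/profile",
    "https://example.com/api/v1/update",
    "https://example.com/blog/post-1",
]
print(top_path_prefixes(urls))
```

A counter like this won't find bugs, but it turns "thousands of URLs" into two or three areas worth a first look.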
I was collecting data, not understanding the application.
The First Shift: Recon Is Input, Not Output
The first mindset change that helped me:
Recon is not the result. Recon is the starting point for questions.
Instead of storing things like:
/api/v1/user/update
/admin
Authorization: Bearer <JWT>

I started asking:
- Who is allowed to call this endpoint?
- What happens if I remove this parameter?
- What happens if I change the user ID?
- Does the backend trust the frontend too much?
That's when recon stopped being boring — and started becoming useful.
Step 1: Focus on One Subdomain, Not Everything

After running:
subfinder -d example.com -silent | httpx -silent

I noticed this subdomain:

api.example.com

Instead of scanning the whole domain, I focused only on this.

katana -u https://api.example.com -jc -o api_endpoints.txt

From the output, I manually picked a few endpoints that looked interesting:

/api/v1/profile
/api/v1/update
/api/v1/upload
Immediately, I had a hypothesis:
"If this is API-driven, authorization is likely handled by tokens. That's where I should test."
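The "manually picked a few endpoints" step can also be sketched in code. This is a rough filter, not a standard technique; the keyword list is my own shortlist, and the sample lines stand in for real katana output:

```python
# Keywords I consider worth manual testing; purely a personal shortlist.
INTERESTING = ("update", "upload", "profile", "admin", "delete", "token")

def pick_endpoints(lines):
    """Keep only URLs containing a keyword worth manual testing."""
    return sorted({l.strip() for l in lines
                   if any(k in l.lower() for k in INTERESTING)})

# Sample lines as katana might emit them (paths taken from this article)
sample = [
    "https://api.example.com/api/v1/profile",
    "https://api.example.com/api/v1/update",
    "https://api.example.com/static/logo.png",
]
print(pick_endpoints(sample))
```

The point isn't the filter itself; it's shrinking the list until every remaining endpoint gets a real question asked about it.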
Step 2: Watch Real Traffic Instead of Guessing
I logged in as a normal user and opened Browser DevTools → Network.
I used a feature (profile update) and captured the request:
POST /api/v1/profile/update
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json
{
"user_id": 1021,
"email": "test@example.com"
}This single request revealed:
- User ID is numeric
- ID is sent from the frontend
- Backend accepts it directly
That's not a bug yet.
That's a testing opportunity.
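Spotting these opportunities in a captured body can be made systematic: flag any field that looks like a client-supplied identifier. A sketch (the heuristic is mine, not a rule):

```python
import json

def flag_identifiers(body_json):
    """Flag fields that look like client-supplied identifiers."""
    suspicious = []
    for key, value in json.loads(body_json).items():
        if key.endswith("_id") or key == "id" or isinstance(value, int):
            suspicious.append(key)
    return suspicious

# The request body captured in the article
captured = '{"user_id": 1021, "email": "test@example.com"}'
print(flag_identifiers(captured))
```

Each flagged field is one candidate experiment: what happens when only that value changes?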
Step 3: Test Small Changes, Not Everything at Once
I sent the request to Burp Repeater.
First test: change only the user ID.
{
"user_id": 1020,
"email": "attacker@example.com"
}Response:
{
"status": "success"
}I refreshed the page.
Another user's email had changed.
That was my first real IDOR, and it didn't come from advanced recon — it came from actually testing what I observed.
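The "change only one value" discipline is easy to script when you want to repeat the experiment against several IDs. This is a sketch of building the mutated body, not my actual workflow; in practice the sending happens in Burp Repeater:

```python
import json

def mutate_body(body_json, field, new_value):
    """Clone a captured request body and change exactly one field."""
    body = json.loads(body_json)
    body[field] = new_value
    return json.dumps(body)

# The captured request body from the article
captured = '{"user_id": 1021, "email": "test@example.com"}'
print(mutate_body(captured, "user_id", 1020))
```

Changing one field at a time matters: if two values change and the response differs, you can't tell which change the server reacted to.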
Step 4: Why This Worked (And My Earlier Recon Didn't)
Earlier, I would have:
- Collected /api/v1/profile/update
- Saved it in a text file
- Moved on
This time, I:
- Focused on one feature
- Looked at real traffic
- Changed only one value
- Verified impact
No automation. No fancy scanner. Just intent.
Step 5: How I Test Common Recon Findings (Simple Way)
If I Find a File Upload Feature
I upload a normal file first.
Then I test:
- File without extension
- File with a double extension (image.jpg.php)
- Changing the Content-Type in Burp
- Uploading after logging out
I'm checking:
"Is the backend validating this — or trusting the browser?"
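The double-extension test exists because of a specific server-side mistake: a validator that looks for an allowed extension anywhere in the filename instead of only at the end. A toy illustration of the flawed check versus a stricter one (not code from any real target):

```python
ALLOWED = (".jpg", ".png", ".gif")

def naive_check(filename):
    """Flawed: passes if an allowed extension appears ANYWHERE."""
    return any(ext in filename.lower() for ext in ALLOWED)

def strict_check(filename):
    """Better: only the final extension counts."""
    return filename.lower().endswith(ALLOWED)

print(naive_check("image.jpg.php"))   # True  -> would slip through
print(strict_check("image.jpg.php"))  # False -> rejected
```

Even the strict check is only part of a safe upload pipeline (content inspection and storage outside the web root matter too), but it shows what the recon-driven test is probing for.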
If I Find an Admin URL
Recon note:
/admin

I log in as a normal user and test:
- Direct access to the URL
- Replaying admin requests in Burp
- Changing role or user ID values
- Reusing the request after logout
I'm checking:
"Is access control enforced on the backend?"
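When replaying the same admin request under different sessions, I find it useful to record outcomes consistently. A sketch of how I'd interpret the status codes (the thresholds are my own rule of thumb, and the replaying itself happens in Burp):

```python
def classify(admin_status, user_status):
    """Interpret a replayed admin request under a normal-user session."""
    if user_status in (401, 403):
        return "access control enforced"
    if user_status == admin_status == 200:
        return "possible broken access control"
    return "inconclusive"

print(classify(admin_status=200, user_status=200))
print(classify(admin_status=200, user_status=403))
```

A matching 200 alone isn't proof; the response body has to show the admin data too, since some apps return 200 with an error page.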
If I See Authorization Tokens or JWT
Recon note:
Authorization: Bearer <token>

I capture one request and test:
- Reuse token after logout
- Use the token on another endpoint
- Remove the Authorization header
- Replace the token with another user's token
I'm checking:
"Does the server verify identity every time?"
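Before running those tests, it helps to see what's inside the token. A JWT's payload is just base64url-encoded JSON, so you can read the claims without any secret. This is inspection only; it does not verify the signature, and the token below is hand-built for the example:

```python
import base64
import json

def jwt_claims(token):
    """Decode a JWT payload (base64url JSON). No signature check."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)   # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Hand-built example token: header.payload.signature
claims = {"sub": "1021", "role": "user"}
payload = base64.urlsafe_b64encode(
    json.dumps(claims).encode()).decode().rstrip("=")
token = "eyJhbGciOiJIUzI1NiJ9." + payload + ".sig"
print(jwt_claims(token))
```

Seeing a claim like "role": "user" immediately suggests the next experiment: does the server trust that claim, or does it check the signature and its own database?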
Why This Approach Works
Earlier, I tried to test everything.
Now, I test one thing properly.
Most valid bugs come from:
- Focusing on a single feature
- Making small changes
- Observing real behavior
Recon tools help you find where to look. Testing helps you find what's broken.
The Mindset That Finally Worked for Me
I stopped trying to look like a hacker.
I started thinking like a tester.
Every recon finding became a question. Every question became a small experiment.
Final Thought
Most bug hunters don't fail because they don't know tools.
They fail because they never slow down enough to challenge what they found.
If your recon folder is growing but your reports aren't, the issue isn't skill.
It's focus.
Turn recon into questions. Turn questions into tests. That's where real bugs live.