This isn't just a story about a bug; it's a story about the "Subdomain Graveyard": that dusty, forgotten corner of the internet where billion-dollar companies leave their keys under the mat.
If you're a bug hunter, you know the feeling of hitting a wall. I had spent forty-eight hours pounding on the main production environment of a global logistics titan. We're talking about a company that moves millions of tons of freight across oceans and continents. Their main web app was beautiful, React-based, and sanitized to within an inch of its life. No XSS, no IDORs, nothing.
But here's the secret: The bigger the company, the more they forget.
Phase 1: The Reconnaissance Rabbit Hole
I stopped looking at the "fortress" and started looking at the "shacks" built around it. I ran a standard subdomain enumeration and found over 400 entries. Most were dead, but one caught my eye:
dev-internal-legacy.shipping-giant.com
"Legacy" is a bug hunter's favorite word. It usually means old code, forgotten patches, and developers who have long since left the company. I fired up a directory brute-forcer, letting a wordlist hammer the server with thousands of requests for common sensitive files like .git, .phpinfo, and config.php.
Every single request came back with a 404 Not Found.
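The brute-force step is simple enough to sketch. This is a minimal illustration, not the actual tool I used; the hostname comes from the write-up, the wordlist entries are illustrative, and the live loop is left commented out:

```python
import urllib.request
import urllib.error

TARGET = "https://dev-internal-legacy.shipping-giant.com"  # host from the write-up
WORDLIST = [".git/HEAD", "phpinfo.php", "config.php", ".env"]  # illustrative entries

def candidate_urls(base, words):
    """Join the base URL with every wordlist entry."""
    return [base.rstrip("/") + "/" + w.lstrip("/") for w in words]

def probe(url, timeout=5):
    """Return the HTTP status for one URL (urllib surfaces 404s as errors)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print(candidate_urls(TARGET, WORDLIST)[0])
# → https://dev-internal-legacy.shipping-giant.com/.git/HEAD

# Live run (against this target, every single path answered 404):
# for url in candidate_urls(TARGET, WORDLIST):
#     print(probe(url), url)
```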
Phase 2: The 404 That Wasn't
Most hunters see a 404 and move to the next target. But I noticed something in the HTTP response headers that felt off. The Server header didn't say Nginx or Apache. It said AmazonS3.
When an S3 bucket is configured as a website and you request a file that doesn't exist, it gives you a 404. But that doesn't mean the bucket is empty. It just means that specific file isn't there.
I decided to go "up" a level. Instead of looking for a file, I tried to list the root of the bucket itself. I stripped the URL back to its base and checked for common misconfigurations.
Suddenly, my screen filled with XML.
The bucket was set to Public Read. I wasn't looking at a 404 page anymore; I was looking at the file tree for their entire legacy development environment.
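That XML is a standard S3 ListBucketResult document: one <Key> element per object in the bucket. A short sketch of pulling it apart, using a trimmed sample body (the bucket name and keys below are illustrative, matching the folders described later):

```python
import xml.etree.ElementTree as ET

# Namespace S3 uses for bucket listings
S3_NS = "{http://s3.amazonaws.com/doc/2006-03-01/}"

def list_keys(xml_body):
    """Extract every object key from a ListBucketResult XML document."""
    root = ET.fromstring(xml_body)
    return [el.text for el in root.iter(S3_NS + "Key")]

# Trimmed example of what a public listing looks like:
SAMPLE = """<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>dev-internal-legacy</Name>
  <Contents><Key>scripts/app.js</Key></Contents>
  <Contents><Key>backups_2022/.env</Key></Contents>
</ListBucketResult>"""

print(list_keys(SAMPLE))  # → ['scripts/app.js', 'backups_2022/.env']

# Against a live public bucket, the same parse applies:
# import urllib.request
# xml_body = urllib.request.urlopen("https://<bucket-host>/").read()
```

If the bucket were properly locked down, that root request would return an AccessDenied error instead of a listing.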
Phase 3: The .env File in the Haystack
I started sifting through the folders.
- /logs/: empty.
- /scripts/: just some old JavaScript.
- /backups_2022/: bingo.
Inside the backups folder was a file simply named .env. For the uninitiated, .env files are where developers store "secrets"—passwords, database credentials, and API keys—so they don't have to hardcode them.
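The format is dead simple: one KEY=value pair per line, with optional comments. A minimal parser sketch (the sample values are placeholders, not the real file):

```python
def parse_env(text):
    """Parse KEY=value lines from a .env file, skipping comments and blanks."""
    secrets = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        secrets[key.strip()] = value.strip().strip("\"'")
    return secrets

SAMPLE = "# legacy dev secrets\nDB_PASSWORD=hunter2\nMaps_API_KEY=AIza-REDACTED\n"
print(parse_env(SAMPLE))
# → {'DB_PASSWORD': 'hunter2', 'Maps_API_KEY': 'AIza-REDACTED'}
```

The whole point of .env files is that they stay out of version control and off public servers, which is exactly what didn't happen here.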
I downloaded it, opened it in VS Code, and my jaw hit the desk.
Maps_API_KEY=AIzaSyA...[Redacted]
Phase 4: From Data Leak to Physical Security Threat
An API key might not sound like much. But this wasn't just any key. I ran a quick check using a tool to see what permissions the key had. It was unrestricted.
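The check itself is straightforward: fire a request at a Google Maps endpoint from a machine the key was never meant to authorize, and read the status field. An unrestricted key answers "OK"; one locked to specific IPs or referrers answers "REQUEST_DENIED". A hedged sketch against the public Geocoding API (the key and address are placeholders):

```python
import json
import urllib.parse

GEOCODE = "https://maps.googleapis.com/maps/api/geocode/json"

def build_url(key, address="1600 Amphitheatre Parkway"):
    """Build a Geocoding API request URL for the given key."""
    return GEOCODE + "?" + urllib.parse.urlencode({"address": address, "key": key})

def is_unrestricted(response_body):
    """True if the API accepted the key from this (unauthorized) origin."""
    return json.loads(response_body).get("status") == "OK"

print(is_unrestricted('{"status": "REQUEST_DENIED"}'))  # → False: key is locked down

# Live check (needs network and a real key):
# import urllib.request
# body = urllib.request.urlopen(build_url("AIza...")).read()
# print(is_unrestricted(body))
```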
This logistics company used Google Maps to track their entire fleet. Because the key had no IP or HTTP referrer restrictions, I could use it myself. I plugged the key into a simple script to see what data I could pull.
Within seconds, I was looking at a live JSON feed of:
- Truck IDs and Driver Names
- GPS Coordinates (Latitude/Longitude)
- Cargo Descriptions (including "High Value" tags)
- Destination Timestamps
I could literally see where every single truck was in North America in real time. A leaked API key had turned a digital oversight into a physical hijacking risk.
The Report and The Payday
I didn't wait. I wrote up the report, emphasizing the "Physical Safety" and "Privacy" impact. I didn't just say "I found a key"; I showed them a map (with blurred coordinates) of their own fleet.
The Timeline:
- 11:00 PM: Report submitted via their Bug Bounty platform.
- 11:20 PM: Status changed to "Triaged."
- 01:00 AM: The S3 bucket was taken offline.
- Next Morning: Awarded $5,000.
𝒯𝒶𝓃𝓋𝒾 ♡