I was shopping for a t-shirt. My proxy was quietly running. One buried JS file later — I had write access to their entire Order Management System.
TL;DR Casually browsing a massive e-commerce platform. Burp Suite flagged an S3 bucket URL buried deep inside a minified JavaScript chunk. Navigating to the bucket threw a
403 Access Denied. Most people stop there. I threw a PUT request at it using an anonymous AWS CLI command. The bucket swallowed the file whole and served it back to the public internet. Time to confirm: ~30 minutes.
01 — Context: The Silent Watcher
It was supposed to be a quiet evening of window shopping. I was browsing a massive, well-known e-commerce brand — the kind of place where you add five things to your cart and then stare at the total before closing the tab.
But I'm a pentester. Which means even when I'm "just shopping," my proxy is usually quietly humming in the background, logging traffic. I wasn't looking for a bug; I was looking for a t-shirt. I found both.
When your background tools suddenly flag a hardcoded AWS S3 bucket URL in the frontend source code, you don't just ignore it. It's the digital equivalent of seeing a door slightly ajar in a bank vault.
02 — Discovery: The JavaScript Goldmine
Modern web applications love to bundle everything into massive, minified JavaScript files. Developers often assume that because the code looks like absolute gibberish, secrets are safe inside it.
I popped open the JS chunk powering the core Order Management System (OMS) and ran a quick search. Right there, sitting in plain text, were the environment URLs: Production, Staging, and UAT.
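That quick search is easy to script. A minimal sketch, assuming the bundle has been saved locally — the file name and sample content here are illustrative stand-ins:

```shell
# Hypothetical minified bundle saved locally; the content is a stand-in.
printf '%s' 'const OMS_CONFIG={"prod_bucket":"https://redacted-oms-prod.s3.ap-south-1.amazonaws.com"}' > chunk.js

# Extract any hardcoded S3 endpoints from the bundle.
grep -oE 'https://[A-Za-z0-9.-]+\.s3[A-Za-z0-9.-]*\.amazonaws\.com' chunk.js | sort -u
```

The same grep works just as well piped from curl against a live bundle URL, which is roughly what a proxy plugin automates for you.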
// A sanitized recreation of the JS chunk
const OMS_CONFIG = {
"prod_bucket": "https://redacted-oms-prod.s3.ap-south-1.amazonaws.com",
"stage_bucket": "https://redacted-oms-stage.s3.ap-south-1.amazonaws.com",
"uat_bucket": "https://redacted-oms-uat.s3.ap-south-1.amazonaws.com"
}

Naturally, I copied the Production bucket URL and pasted it into my browser.
<Error><Code>AccessDenied</Code><Message>Access Denied</Message></Error>
Ah. A 403. The universal web server response for "Go away."

This is where a lot of automated scanners and junior hunters stop. The bucket isn't listing its contents to the public, so it must be secure, right?
Wrong.
03 — Exploitation: The 403 Illusion
AWS S3 permissions are granular. A bucket policy might strictly forbid s3:GetObject (reading files) or s3:ListBucket (seeing what's inside), which results in that 403 error when you visit the root URL.
But what about s3:PutObject? What if they restricted read access, but left the upload door wide open to the public? In security terms: The bucket had locked the front door, but nobody had checked the mail slot.
Let's ask the bucket a different question. I created a simple text file:
echo "Vulnerability verification by Tester - No malicious intent" > poc_test.txt

Then, I fired up the AWS CLI. The --no-sign-request flag is the crucial part here—it tells AWS, "I don't have any credentials, I'm just a random guy on the internet, please let me upload this."
$ aws s3 cp poc_test.txt s3://redacted-oms-prod/poc_test.txt --no-sign-request
upload: ./poc_test.txt to s3://redacted-oms-prod/poc_test.txt

It worked. No errors. No rejections. The production bucket accepted an anonymous file upload from an unauthenticated user. I repeated the process for Staging and UAT. Same result across the board.
I navigated to the file path in my browser: https://redacted-oms-prod.s3.ap-south-1.amazonaws.com/poc_test.txt
There it was. Plain text, served on their official infrastructure.
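For completeness: you don't even need the AWS CLI for this check, because an unsigned PUT is just an HTTP request with no AWS signature attached. A sketch with curl, using placeholder bucket and region names:

```shell
# Placeholder names; substitute the real target only on an authorized engagement.
BUCKET="redacted-oms-prod"
REGION="ap-south-1"

echo "Vulnerability verification - no malicious intent" > poc_test.txt

# An unsigned PUT. An HTTP 200 means the bucket accepts anonymous uploads;
# a 403 means s3:PutObject is actually locked down.
curl -s -o /dev/null -w '%{http_code}\n' --max-time 10 -X PUT \
  --data-binary @poc_test.txt \
  "https://$BUCKET.s3.$REGION.amazonaws.com/poc_test.txt" || true
```

Handy when you want to test a target from a box that doesn't have the AWS CLI installed.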

04 — The Impact: Why This Matters
It's easy to look at a .txt file upload and shrug. But this is the core Order Management System infrastructure. If a malicious actor found this instead of me, the possibilities get very dark:
- Malware Hosting: They could host phishing pages or malware on the company's trusted AWS subdomain, instantly bypassing security filters.
- Asset Overwriting: If the bucket hosted active CSS or JS files, an attacker could overwrite them, injecting malicious code directly into the administrative dashboards.
- Denial of Wallet: An attacker could script a loop to upload terabytes of garbage data anonymously, leaving the company with a financial incident instead of a security one.
05 — Root Cause & The Fix
This wasn't a sophisticated exploit. It was two simple oversights that compounded into something serious.
- Infrastructure Leak: Internal bucket identifiers do not belong in client-side JavaScript. Minified or not, these are internal details that should never leave the server layer.
- The Wildcard Policy: Somewhere in the storage policy, a wildcard principal had been granted write permission.
// The "anyone in the universe can write" policy
{
"Effect": "Allow",
"Principal": "*",
"Action": "s3:PutObject",
"Resource": "arn:aws:s3:::redacted-oms-prod/*"
}

The Fix: Enable the "Block Public Access" settings at the bucket level. It acts as a hard override, regardless of what individual policy statements say.
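As a sketch of that remediation (the bucket name is a placeholder, and the aws call is commented out because it requires admin credentials), all four Block Public Access toggles get switched on:

```shell
# Block Public Access configuration: all four protections enabled.
cat > public-access-block.json <<'EOF'
{
  "BlockPublicAcls": true,
  "IgnorePublicAcls": true,
  "BlockPublicPolicy": true,
  "RestrictPublicBuckets": true
}
EOF

# Apply it to the bucket (needs admin credentials; shown for illustration):
# aws s3api put-public-access-block --bucket redacted-oms-prod \
#   --public-access-block-configuration file://public-access-block.json
```

With BlockPublicPolicy and RestrictPublicBuckets set, even a wildcard-principal statement like the one above stops granting anonymous access.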
06 — The Real Takeaway
Security isn't just about complex zero-days. Often, it's about questioning assumptions. I packaged up the screenshots, wrote a polite email to their security team, and immediately stopped testing.
The team was responsive and professional; they patched the gaping holes in their infrastructure within hours and awarded a bug bounty for the responsible disclosure. Which, naturally, I spent right back at their store to grab those clothes I was looking at.
The next time you see an Access Denied error on a cloud asset, don't just walk away. Knock on the door a little differently. You might be surprised when it simply swings open.

Disclosure: All testing was performed on authorized systems. To protect the brand, all PII, bucket names, and specific URLs have been redacted or replaced with placeholders. Responsible disclosure was followed and the vulnerability has been fully remediated.