Introduction
The challenge description sets the theme and hints that the target application may contain vulnerabilities or hidden content. The narrative distinguishes between what is visible and what is "hidden," suggesting undisclosed paths, files, or interfaces that require exploration.
The high-level goal is to find and retrieve a specific flag that is concealed somewhere in the web application.
Room: https://tryhackme.com/room/lafb2026e9
Step 1 — Initial Reconnaissance
The first step in any web challenge is accessing the target in a browser or via a simple HTTP request. The target URL was:
http://10.67.165.101:5000
Visiting this URL presented a static homepage with a welcoming message and no visible forms or interactive elements. With no obvious interaction points on the homepage, further exploration was needed.
Step 2 — Directory Enumeration
Since the homepage was static and contained no obvious attack surface, directory enumeration was the next logical step. Directory enumeration tools help uncover hidden endpoints, files, and directories that are not linked from the homepage.
We ran the following command using gobuster:
gobuster dir -u http://10.67.165.101:5000 -w /usr/share/wordlists/dirb/common.txt

Output:
/console (Status: 400)
/robots.txt (Status: 200)
The response showed that robots.txt existed and was accessible, which warranted further inspection.
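Conceptually, a brute-forcer like gobuster just iterates a wordlist against the base URL and reports responses whose status code is not a plain 404. A minimal Python sketch of that loop's logic (the actual HTTP probing is omitted, and the helper names are illustrative, not gobuster's):

```python
from urllib.parse import urljoin

def candidate_urls(base, words):
    # Build the URLs a brute-forcer would probe, one per wordlist entry.
    return [urljoin(base, w) for w in words]

def interesting(status):
    # gobuster-style filter: anything that is not a plain 404 is worth a look.
    return status != 404

base = "http://10.67.165.101:5000/"
for url in candidate_urls(base, ["console", "robots.txt", "administrator"]):
    print(url)  # each of these would be requested and filtered by status code
```

Both hits above pass the filter: /robots.txt answered 200, and even the 400 on /console is "interesting" because it is not a 404.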
Step 3 — Inspecting robots.txt
The robots.txt file provides instructions for search engine crawlers but in CTFs and real-world applications it can inadvertently reveal sensitive paths. The file was retrieved with:
curl http://10.67.165.101:5000/robots.txt

Contents:
User-agent: *
Disallow: /cupids_secret_vault/*
# cupid_arrow_2026!!!
The Disallow directive indicated a restricted directory named cupids_secret_vault. More importantly, the comment # cupid_arrow_2026!!! contained a string that did not fit the pattern of a typical comment, making it suspicious and worth keeping in mind as a potential password or key.
URL: http://10.67.165.101:5000/cupids_secret_vault/

The presence of a hidden directory and a suspicious string suggested the next step: explore the vault directory.
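A credential-looking comment like this is easy to miss when skimming. The extraction can be sketched in a few lines of Python; this is a simplified line-by-line scan, not a full robots.txt parser:

```python
def parse_robots(text):
    # Collect Disallow paths and comment lines from a robots.txt body.
    disallowed, comments = [], []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#"):
            comments.append(line.lstrip("# "))
        elif line.lower().startswith("disallow:"):
            disallowed.append(line.split(":", 1)[1].strip())
    return disallowed, comments

body = """User-agent: *
Disallow: /cupids_secret_vault/*
# cupid_arrow_2026!!!"""

paths, notes = parse_robots(body)
# paths  -> ['/cupids_secret_vault/*']
# notes  -> ['cupid_arrow_2026!!!']
```

Anything that survives into `notes` but looks nothing like a normal crawler directive deserves a second look.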
Step 4 — Exploring Hidden Directory
We used gobuster to enumerate the contents of the hidden vault directory:
gobuster dir -u http://10.67.165.101:5000/cupids_secret_vault/ -w /usr/share/wordlists/dirb/common.txt

Output:
/administrator (Status: 200)
This revealed an administrator page, likely protected by a login and a probable gateway to the flag.
Step 5 — Administrator Login
Accessing:
http://10.67.165.101:5000/cupids_secret_vault/administrator

displayed a login page asking for a username and password.
At this point, we recalled the suspicious string in the robots.txt comment: cupid_arrow_2026!!!. Given its format and placement, it was reasonable to assume that this string might be the password.
We attempted to log in using:
Username: admin
Password: cupid_arrow_2026!!!

This combination was accepted, granting access and revealing the hidden flag.
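Assuming the login form submits a standard POST with username and password fields (the field names here are hypothetical; the real ones would come from the login page's HTML), the attempt could also be replayed outside the browser. This sketch only builds the form body:

```python
from urllib.parse import urlencode

# Hypothetical field names -- check the login form's HTML for the real ones.
creds = {"username": "admin", "password": "cupid_arrow_2026!!!"}
body = urlencode(creds)  # "!" is percent-encoded as %21

# The body could then be sent with, for example:
#   curl -d "$body" http://10.67.165.101:5000/cupids_secret_vault/administrator
print(body)
```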

Final Flag
After successfully logging in, the flag was:

THM{l0v3_is_in_th3_r0b0ts_txt}

Conclusion
This challenge demonstrates the value of basic web enumeration techniques, a methodical approach to hidden directories, and the importance of inspecting files like robots.txt for non-obvious clues.
The key steps in solving this challenge were:
- Accessing the homepage and identifying a lack of input points
- Using directory enumeration to discover hidden resources
- Inspecting robots.txt, which contained both a location hint and potential credentials
- Enumerating inside the restricted directory to find an admin login
- Using the credential string from the robots.txt comment to authenticate
The challenge reinforces that seemingly innocuous files and comments can disclose critical information when examined carefully.