I applied to 142 jobs last year.

I got 4 interviews.

Every rejection email sounded polite:

"We were impressed by your background, but…"

But we all know the truth.

I wasn't rejected by a human. I was rejected by an algorithm.

An Applicant Tracking System (ATS) scanned my resume, failed to find enough keyword matches from the Job Description, and quietly buried my application before a recruiter ever saw it.

That's when I realized I had only two realistic options:

  • Spend 2 painful hours rewriting my resume for every single job, or
  • Build a bot to beat their bot.

So I chose the second option.

By the end of one weekend, I had built a local AI resume-tailoring agent that:

  • Takes my master resume
  • Scrapes a job posting URL
  • Matches keywords perfectly
  • Rewrites bullets for relevance
  • Generates a beautiful, submission-ready PDF

All in about 30 seconds per job.

Result? My callback rate jumped from ~3% → 28%.

This article shows you exactly how I built it — code, prompts, and all.

You can run this on your own laptop today.

⚠️ Important Ethical Note (Read This)

This system does NOT lie. It does NOT fabricate experience. It only reframes your real experience using the employer's own language.

You are still responsible for everything you submit.

This simply prevents the ATS from auto-rejecting you before a human ever looks.

The "Broke Engineer" Stack (Free, Private, Fast)

I wanted this pipeline to be:

  • ✅ Free
  • ✅ Private (no uploading resumes to random websites)
  • ✅ Local
  • ✅ Fast

Here's what I used:

  • The Brain: Llama-3 via Ollama (local & private). You can swap in GPT-4o if you want speed over privacy.
  • The Eyes: BeautifulSoup (to read job descriptions)
  • The Hands:
      • PyMuPDF → reads your resume
      • Markdown-PDF → generates a polished resume PDF
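Assuming Python 3 and Ollama are already installed, setup is a handful of commands. (These are the package names as published on PyPI; the model tag is Ollama's default `llama3`.)

```shell
# Install the Python pieces of the stack
pip install requests beautifulsoup4 pymupdf markdown-pdf ollama

# Pull the local model (the Ollama server must be running)
ollama pull llama3
```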

Step 1 — The "Spy" (Scraping the Job Description)

We don't want to manually copy-paste job descriptions.

We want to feed the agent a URL and let it do the dirty work.

import requests
from bs4 import BeautifulSoup

def get_job_description(url):
    """
    Scrapes job description text from a given URL.
    Works best on company career pages (Lever, Greenhouse, etc).
    """
    try:
        # A browser-like User-Agent avoids trivial bot blocks; a timeout prevents hangs
        headers = {"User-Agent": "Mozilla/5.0"}
        response = requests.get(url, headers=headers, timeout=10)
        response.raise_for_status()
        soup = BeautifulSoup(response.content, 'html.parser')

        # Remove noise
        for tag in soup(["script", "style", "nav", "footer"]):
            tag.decompose()

        text = soup.get_text(separator=' ')
        return ' '.join(text.split())

    except Exception as e:
        return f"Error scraping job: {str(e)}"

# Test
url = "https://boards.greenhouse.io/company/jobs/12345"
jd_text = get_job_description(url)
print(f"Scraped {len(jd_text)} characters.")

Now we have the exact language the ATS will scan for.
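If you want to eyeball what the ATS will latch onto, a quick frequency count over the scraped text gives a rough keyword list. This is a minimal sketch, not part of the pipeline itself; the stop-word list and `top_n` cutoff are my own arbitrary choices:

```python
import re
from collections import Counter

# Tiny hand-picked stop-word list; expand as needed
STOP_WORDS = {
    "the", "and", "a", "to", "of", "in", "for", "with", "on", "is",
    "you", "we", "our", "your", "or", "an", "as", "at", "be", "will",
}

def top_keywords(text, top_n=15):
    """Rough keyword frequency count over a job description."""
    # Lowercase, then grab word-like tokens (keeps c++ / c# intact)
    words = re.findall(r"[a-z]+(?:\+\+|#)?", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS and len(w) > 2)
    return counts.most_common(top_n)

jd = "Build LLM pipelines. Experience with Python, Python APIs, and SQL required."
print(top_keywords(jd, top_n=3))  # 'python' tops the list with a count of 2
```

It's crude, but it tells you instantly which terms dominate the JD before you even run the model.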

Step 2 — The "Reader" (Parsing Your Master Resume)

You need one file:

A Master Resume containing everything you've ever done.

Projects. Metrics. Internships. Freelance work. Research. Everything.

The agent's job is to select and prioritize — not invent.

import fitz  # PyMuPDF

def extract_text_from_pdf(pdf_path):
    """Pulls raw text from every page of the PDF."""
    doc = fitz.open(pdf_path)
    text = ""
    for page in doc:
        text += page.get_text()
    doc.close()
    return text

my_resume_text = extract_text_from_pdf("Master_Resume.pdf")

Step 3 — The "Brain" (The Tailoring Engine)

This is where the transformation happens.

Most people fail because they use weak prompts like:

"Rewrite my resume for this job."

That's useless.

We force the model to act like a merciless ATS optimizer.

import ollama

def tailor_resume(resume_text, job_description):
    prompt = f"""
You are an expert Resume Writer and ATS Optimization Specialist.
JOB DESCRIPTION:
{job_description}
MY MASTER RESUME:
{resume_text}
TASK:
Rewrite my resume to perfectly match the Job Description.
RULES:
1. Use the EXACT keywords from the JD in skills and summary sections.
2. Rephrase bullet points to match the responsibilities and metrics they care about.
3. Do NOT lie or invent new experience.
4. Only use real experience from my resume.
5. Output in clean, well-formatted MARKDOWN.
"""
    
    response = ollama.chat(
        model='llama3',
        messages=[{'role': 'user', 'content': prompt}]
    )
    
    return response['message']['content']

tailored_markdown = tailor_resume(my_resume_text, jd_text)
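Because the model can silently drop terms, I like to sanity-check the output before rendering it. A minimal sketch (the function name and the hand-picked `required_terms` list are mine, not something the pipeline extracts automatically):

```python
def missing_keywords(tailored_text, required_terms):
    """Return the JD keywords that did NOT make it into the tailored resume."""
    lowered = tailored_text.lower()
    return [term for term in required_terms if term.lower() not in lowered]

# Example: check a few must-have terms from the JD by hand
gaps = missing_keywords("Built Python ETL pipelines with Airflow.",
                        ["Python", "Airflow", "Kubernetes"])
print(gaps)  # ['Kubernetes']
```

If anything comes back in `gaps`, re-run the tailoring step or add the term yourself.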

Step 4 — The "Artist" (Generating a Beautiful PDF)

Text alone isn't enough.

You need a clean, professional PDF that looks like it came from LaTeX or Figma.

from markdown_pdf import MarkdownPdf, Section

def save_as_pdf(markdown_content, output_filename="Tailored_Resume.pdf"):
    pdf = MarkdownPdf(toc_level=0)
    
    css_style = """
    body { font-family: 'Helvetica', sans-serif; font-size: 12px; line-height: 1.4; color: #333; }
    h1 { font-size: 24px; border-bottom: 2px solid #333; margin-bottom: 10px; text-transform: uppercase; }
    h2 { font-size: 16px; border-bottom: 1px solid #ccc; margin-top: 20px; text-transform: uppercase; color: #555; }
    ul { padding-left: 20px; }
    li { margin-bottom: 5px; }
    strong { color: #000; }
    """
    
    pdf.add_section(Section(markdown_content), user_css=css_style)
    pdf.save(output_filename)

save_as_pdf(tailored_markdown)

Now you get a submission-ready PDF automatically.

One Command. One Perfect Resume.

My full process today:

python apply.py "https://jobs.company.com/role"

30 seconds later, I have:

  • ✅ ATS-aligned keywords
  • ✅ Relevant project prioritization
  • ✅ JD-specific metrics
  • ✅ Clean professional PDF

No burnout. No rewriting. No emotions attached.

Before vs After (Real Impact)

Before

• Built ML models for classification

After

• Built production-ready LLM classification pipelines improving F1-score by 18% across multilingual datasets

Same experience. Radically better positioning.

Why This Changed Everything for Me

The biggest lie we're told about job hunting is:

"Quality over quantity."

That's only half true.

Reality: You need quality AND quantity.

And no human can manually craft 100+ high-quality applications without burning out.

But an agent can.

Once I automated this, everything changed:

  • I stopped taking rejections personally
  • I stopped wasting hours per resume
  • I started treating job hunting like what it actually is: A data filtering problem

And for the first time in years, I felt in control again.

Go Build This. Seriously.

Steal the code. Customize the prompt. Make it yours.

And if this helped you:

👉 Follow me. I publish practical AI systems you can actually use to:

  • Get hired
  • Build products
  • Automate income
  • And stay sane doing it

Your future employer is already using automation. It's time you did too.