What is a Prompt Engineer? The tech job everyone wants (but no one can define)

💡 What even is Prompt Engineering?

In the simplest terms: Prompt Engineering is the art of getting an AI to do what you want.

You’re basically learning how to talk to LLMs (large language models, like ChatGPT, Claude, or Gemini) in a way that gets them to generate exactly the outputs you need. Think of it like programming, but instead of code, you’re crafting sentences.

So… why are companies suddenly hiring prompt engineers like it’s the gold rush?

Because LLMs don’t magically work right out of the box. If you want high-quality, on-brand, safe, and accurate AI outputs, someone has to tell the machine exactly what to do.

Companies use LLMs for everything from customer support and content generation to product recommendations and internal tools. Without properly engineered prompts, these models can underperform, go off-script, or just plain confuse users. Prompt engineers help businesses scale their AI usage while minimizing risk and maximizing ROI. In other words: if you want your AI to sound less like a confused 3-year-old and more like a productive team member, you need a prompt engineer.

🧠 What does a Prompt Engineer actually do?

Daily duties might include:

  • Writing and refining prompts for different use cases: 

    • Example: Designing prompts to generate product descriptions for an e-commerce site across various categories.

  • Testing prompts across different models: 

    • Example: Evaluating prompt performance across ChatGPT, Claude, and other LLMs to determine consistency and accuracy.

  • Evaluating output quality and adjusting prompts accordingly: 

    • Example: Reviewing generated content for clarity and factual correctness, then adjusting instructions to improve results.

  • Building prompt libraries and documentation: 

    • Example: Creating a shared database of approved prompts categorized by task, such as summarization, rewriting, or Q&A.

  • Collaborating with product and dev teams: 

    • Example: Working with developers to integrate prompts into a customer support chatbot within a web platform.

  • Training models to respond more accurately: 

    • Example: Curating data and feedback to help improve the model's output behavior in specific domains.

  • Developing prompt chaining strategies: 

    • Example: Creating multi-step prompt flows where the output of one step feeds into the next to complete a complex task.

  • Creating synthetic data for machine learning training: 

    • Example: Using prompts to generate example data that can be used to train or fine-tune language models.
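Prompt chaining, for instance, can be sketched in a few lines of Python. This is a minimal illustration, not a production pattern: `call_model` is a hypothetical stand-in for a real LLM API call, and the step templates are made up for the example.

```python
# Minimal prompt-chaining sketch: each step's output becomes part of
# the next step's prompt. `call_model` is a placeholder -- in practice
# it would send the prompt to an LLM API and return the completion.
def call_model(prompt: str) -> str:
    # Placeholder: a real implementation would call an LLM here.
    return f"[model output for: {prompt}]"

def run_chain(steps, initial_input: str) -> str:
    """Run a list of prompt templates, feeding each output forward."""
    result = initial_input
    for template in steps:
        prompt = template.format(input=result)
        result = call_model(prompt)
    return result

steps = [
    "Summarize the following text:\n{input}",
    "Rewrite this summary as three bullet points:\n{input}",
]
print(run_chain(steps, "Long product review goes here..."))
```

The key design idea is that each step does one small, checkable job, which is usually easier to debug than a single giant prompt.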

📋 What do companies look for when hiring a Prompt Engineer?

Let’s decode the buzzword salad you’ll find in most job listings and translate it into actual skills you should aim to have:

  • Experience with LLMs (Large Language Models): You should be comfortable using tools like ChatGPT, Claude, or Gemini, and understand how prompts affect the quality, accuracy, and tone of the outputs. Knowing the difference between models (e.g., GPT-3.5 vs. GPT-4, or open-source vs. proprietary) helps you select the right tool for different tasks.

  • Strong writing & communication skills: Prompt engineering is rooted in language, so the ability to clearly and concisely express complex tasks is essential. You’ll need to write instructions that are unambiguous, testable, and scalable across use cases—from summarization to code generation.

  • Familiarity with AI tools & frameworks: Many roles ask for experience using platforms like OpenAI API (for accessing models programmatically), Hugging Face Transformers (for working with open-source models), or LangChain (for building AI agents and pipelines). You don’t have to be a full-stack engineer, but you should understand how these tools are used in production.

  • Basic programming knowledge: Python is the most common language in the AI/ML ecosystem. You should be able to work with basic scripts, use libraries like requests or pandas, and interact with APIs. This also enables you to prototype prompt workflows or integrate with backend systems.
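As a concrete (and hedged) illustration of that kind of scripting, here is a sketch of calling OpenAI's Chat Completions endpoint with the `requests` library. The endpoint and payload shape follow OpenAI's public API; the model name and the `OPENAI_API_KEY` environment variable are assumptions you'd adjust for your own setup.

```python
# Sketch of a programmatic LLM call using `requests`.
# Assumes OPENAI_API_KEY is set in your environment.
import os
import requests

def build_payload(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Assemble a Chat Completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,  # lower = more deterministic output
    }

def ask(prompt: str) -> str:
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json=build_payload(prompt),
        timeout=30,
    )
    resp.raise_for_status()  # surface HTTP errors instead of silently failing
    return resp.json()["choices"][0]["message"]["content"]
```

Even this much, plus a loop over a spreadsheet of test prompts, is enough to prototype and evaluate prompt workflows.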

  • Understanding of model behavior: This means having a working knowledge of how and why LLMs behave the way they do—things like token limits, context windows, temperature settings, and common failure modes (e.g., hallucinations, bias, repetition). This helps you design prompts that work reliably under constraints.
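One way this knowledge shows up in practice is budgeting a prompt against a context window. The sketch below uses the rough "about 4 characters per token" heuristic rather than a real tokenizer (a library like tiktoken would be more accurate), so treat the numbers as estimates, not guarantees.

```python
# Rough context-window budgeting: estimate tokens with the common
# ~4 characters-per-token heuristic and truncate the input document
# so that prompt + expected reply fit inside the model's window.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # crude approximation, not a tokenizer

def fit_to_context(document: str, instructions: str,
                   context_window: int = 8192,
                   reply_budget: int = 1024) -> str:
    """Truncate `document` so the full prompt leaves room for the reply."""
    budget = context_window - reply_budget - estimate_tokens(instructions)
    if estimate_tokens(document) > budget:
        document = document[: budget * 4]
    return instructions + "\n\n" + document
```

Naive truncation like this can cut a document mid-sentence; real pipelines usually chunk or summarize instead, but the budgeting arithmetic is the same.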

Bonus points for:

  • UX knowledge

  • QA/testing experience

  • Prompt tuning experience (but this usually means fine-tuning in Python + OpenAI API)

🎓 Where do I get these skills if I'm currently just vibing?

Welcome to the internet’s favorite pastime: taking free/cheap courses. Here are some top picks:

🚀 Free & paid courses

DeepLearning.AI – ChatGPT Prompt Engineering for Developers (Free, on Coursera)

Learn Prompting – Beginner to advanced, free

OpenAI Cookbook + Documentation – Technical but gold

Fast.ai – Practical Deep Learning for Coders – For more technical folks

LangChain Documentation – Not fun, but necessary if you want to go pro

📚 Certifications (if you’re into that)

Purdue Prompt Engineering Certification – Looks great on LinkedIn

Hugging Face Courses – From intro to advanced

LinkedIn Learning – Prompt Engineering Basics – Bite-sized and accessible

🕵️ Where do I actually find prompt engineering jobs?

Let’s go beyond Indeed and LinkedIn spam:

Wellfound (AngelList) – Startup goldmine

Y Combinator Jobs – Mostly early-stage, chaotic good energy

PromptBase – Sell prompt templates and browse jobs

Hugging Face Jobs Board – Nerdy and wonderful

Discord & X – Follow indie AI builders and LLM researchers. Seriously. Jobs get posted in threads and DMs all the time.

RemoteOK, Otta, AI Jobs List – Filter by prompt engineer or keyword hunt to your heart’s content

🧠 Final thoughts from your favorite prompt ninja

Prompt engineering is weird. And new. And a little unhinged. But it’s also very real, very lucrative, and very accessible to people who are smart, creative, and willing to get weird with an AI.

It might’ve started as a niche side hustle or a novelty skill, but companies are now realizing it’s the glue between great AI and actual business value. As models get more powerful—and more deeply embedded into products, workflows, and decision-making—the need for people who know how to steer them well is only going to grow.

Prompt engineering is evolving too. We’re moving from “try this weird phrasing and see what happens” to systematic methodologies, reusable frameworks, and even automation pipelines. It may eventually blur into adjacent roles like AI UX design, model tuning, or tool orchestration—but the skill of understanding how to speak to models effectively? That’s not going away.

So start experimenting, document what you’re doing, build a portfolio, and tell people what you’re learning out loud. It’s less about the degree and more about the receipts.

You’re not late. You’re just early enough to still look cool. And honestly? That’s the best time to get in.

Lisa Kilker

I explore the ever-evolving world of AI with a mix of curiosity, creativity, and a touch of caffeine. Whether it’s breaking down complex AI concepts, diving into chatbot tech, or just geeking out over the latest advancements, I’m here to help make AI fun, approachable, and actually useful.

https://www.linkedin.com/in/lisakilker/