Growth.Talent
Guest Profile | ai | content | team-building

Justin Kistner on Treating AI Like a Junior Employee, Not a Senior Expert

The founder of CopyClub.ai built an $18K MRR content subscription by using AI as a team member, not magic—and his biggest insight is that most growth leaders prompt like they're talking to senior colleagues when they should be talking to interns.

Apr 11, 2026 | 8 min read | By Growth.Talent

The Unspecified Noun Problem: Why Your AI Outputs Are Mediocre

Justin Kistner has a linguistics term for why most growth professionals get terrible results from ChatGPT: unspecified nouns. When someone asks an AI to "create a content calendar for me" or "define our North Star metric," they're committing the cardinal sin of prompting—assuming the tool has context it doesn't possess.

"If you're used to talking to senior colleagues who are highly competent, you're probably bad at prompting," Kistner explains. The mental model shift he advocates is radical: treat AI like a team member fresh out of college, not a seasoned expert. Senior colleagues bring years of company knowledge, industry nuance, and implicit understanding to every request. ChatGPT and Claude bring amnesia.

You have to actually act like this is somebody fresh outta college. Not a senior colleague, and then what would you tell them?

— Justin Kistner

Kistner developed what he calls the "prompt evaluation prompt" that scans for unspecified nouns—terms like "my project" or even "North Star metric," which carries wildly different definitions depending on whether you're channeling Andrew Chen's viral loop philosophy or Amplitude's product analytics framework. The fix isn't just adding more words. It's adding specificity about whose interpretation matters. Sean Ellis, co-host of the Breakout Growth Podcast, adopted this immediately: instead of asking ChatGPT to analyze product-market fit survey results generically, he prompts it to "do it the Sean Ellis way"—filtering only for users who'd be "very disappointed" without the product before identifying the primary benefit.
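The idea behind such a pre-flight check can be sketched in code. This is a hypothetical illustration only, not Kistner's actual prompt: a naive scan that flags context-free phrases before a prompt ever reaches the model. The term list here is invented for the sketch.

```python
import re

# Invented examples of "unspecified nouns": phrases that assume
# context the model does not have.
VAGUE_TERMS = [
    r"\bmy project\b",
    r"\bour (product|company|brand)\b",
    r"\bnorth star metric\b",
    r"\ba content calendar\b",
]

def flag_unspecified_nouns(prompt: str) -> list[str]:
    """Return the vague phrases found in a prompt, lowercased."""
    hits = []
    for pattern in VAGUE_TERMS:
        match = re.search(pattern, prompt, flags=re.IGNORECASE)
        if match:
            hits.append(match.group(0).lower())
    return hits

print(flag_unspecified_nouns("Define our North Star metric for my project"))
# → ['my project', 'north star metric']
```

In practice Kistner runs this kind of evaluation as a prompt to the model itself, which can catch vague references no keyword list would anticipate; the sketch just makes the concept concrete.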

This isn't pedantry. It's the difference between AI regurgitating the average of all knowledge on a topic and AI channeling the specific expertise your growth challenge demands. Kistner's clients at CopyClub.ai and his subscription content service CopySub learned this the hard way when early AI-generated articles all started with "in a world" and invited readers to "delve into the tapestry." The tool was capable. The instructions were vague.

The Context Library: RAM for AI's Perpetual Amnesia

Kistner's breakthrough came from frustration. When he pivoted from his previous startup UpFocus (which faced capital market headwinds in June 2022) to building domain authority through content marketing, ChatGPT, then running GPT-3.5, had just launched. The output was magical—and terrible. Every article was generic. Every idea surface-level. Then he tried something different: instead of asking AI to write articles, he fed it research, company positioning, ideal customer profiles, and pain points before requesting ideas.

This evolved into what he now calls the "context library"—a collection of Markdown files or Google Docs containing everything AI needs to act like it knows your business. For CopySub customers, onboarding meant capturing: company description, product offerings, ideal customer profile, competitor landscape, and buyer pain points. All saved. All reusable. All prompt-ready.
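A context library like this is simple to operationalize. The sketch below, a minimal assumption-laden version (folder layout and filenames are invented), concatenates every Markdown file in a folder so the whole library can be loaded ahead of each task prompt, reloading the model's "RAM" every session.

```python
from pathlib import Path

def load_context_library(folder: str) -> str:
    """Concatenate every Markdown file in the context library,
    each under a heading taken from its filename."""
    sections = []
    for md_file in sorted(Path(folder).glob("*.md")):
        sections.append(f"## {md_file.stem}\n{md_file.read_text().strip()}")
    return "\n\n".join(sections)

def build_prompt(folder: str, task: str) -> str:
    """Prepend the full context library to a task, so the model starts
    every session knowing who you are and who your customer is."""
    return f"{load_context_library(folder)}\n\n---\nTask: {task}"
```

The same files work whether they live in a Google Doc, a local folder, or a Cursor workspace; the point is that context is saved once and reused on every prompt.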

AI is super smart, but it always has amnesia. So you have to load up into its RAM again. Who are we? What are we doing? Who's my customer?

— Justin Kistner

One of Kistner's clients creates trivia-style learning games. They struggled for months to get AI-generated questions that matched their pedagogical standards. The problem wasn't AI capability—it was lack of context about their specific "school of thought" on what makes good versus bad questions. Once Kistner helped them document that philosophy in their context library, they could @mention those files in tools like Cursor or Windsurf (integrated development environments he repurposed for text workflows) and generate on-brand content at scale.

For growth teams, this approach transforms workflows. Analyzing product-market fit surveys? Upload the CSV, attach your context library file defining how your company interprets "primary benefit," and ask for job-to-be-done statements. Critiquing landing page copy? Feed AI your survey insights and prompt it to evaluate homepage messaging alignment. Kistner even built a free tool that automates processing Sean Ellis's product-market fit survey—the number one complaint from his UpFocus days was the manual tagging burden.
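The "Sean Ellis way" filter described above is mechanical enough to sketch. Assuming a survey export with columns named `disappointment` and `primary_benefit` (the column names are an assumption, not the format of Kistner's tool), it keeps only "very disappointed" respondents before tallying the benefits they cite:

```python
import csv
from collections import Counter
from io import StringIO

def sean_ellis_filter(csv_text: str) -> Counter:
    """Tally the primary benefit cited by respondents who would be
    'very disappointed' without the product, per the Sean Ellis method."""
    reader = csv.DictReader(StringIO(csv_text))
    return Counter(
        row["primary_benefit"].strip()
        for row in reader
        if row["disappointment"].strip().lower() == "very disappointed"
    )

# Toy survey data for illustration.
survey = """disappointment,primary_benefit
Very disappointed,Saves me time
Somewhat disappointed,Nice dashboards
Very disappointed,Saves me time
Not disappointed,Cheap
"""
print(sean_ellis_filter(survey).most_common(1))
# → [('Saves me time', 2)]
```

Handing the filtered rows, rather than the raw CSV, to the model is what keeps its "primary benefit" analysis anchored to high-intent users instead of the whole sample.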

From Individual Contributor to Manager of AI Contributors

When Kistner reflects on UpFocus, his previous venture, he keeps circling back to one regret: "Man, if I'd have started that now with AI, I could do it all over so differently." The shift isn't just tooling. It's identity. Growth professionals trained as individual contributors now need to operate as managers directing a team of AI employees.

This mental leap explains why CopySub scaled to $18K MRR within months. Kistner didn't replace human writers. He built a "heavy human-in-the-loop process to solve for the quality problems of AI" while "leveraging AI to solve for the scale problems of just human only." Clients kept asking him to teach the system, which led to CopyClub.ai—a community offering mastermind-style "quests" where members build AI agents and learn prompting best practices.

It really is about you have a ton more team members now and you have to have a mental shift from being the individual contributor to being the manager of the individual contributors.

— Justin Kistner

The management analogy holds. Good managers don't do the work—they direct it, provide context, and evaluate output. Kistner's developmental writing technique for content creation mirrors this: research the topic, create the outline, write in chunks, edit iteratively. AI handles execution. Humans handle judgment. One CopySub customer wanted AI to "find important trends to help grow our business" by dumping in all their data. Kistner's response: senior colleagues understand business nuance intuitively. Junior employees need explicit direction. AI is the junior employee.

The skill isn't obsolete—it's leveraged. Sean Ellis uses ChatGPT over 20 hours per week, bouncing ideas, tapping into books he's read, consulting virtual experts. But he's still the one who knows which expert to channel, which framework applies, and whether the output is actionable or noise. Ethan Garr, his co-host, described a conversation with Kistner that left him "rage typing" Sean about new use cases. The revelation wasn't that AI replaces expertise. It's that expertise becomes exponentially more valuable when you know how to deploy it through AI.

The Onboarding Sequence: From ChatGPT UI to Cursor

Kistner's advice for growth teams entering the AI workflow is sequential, not all-at-once. Start with basic prompting in the ChatGPT interface. Practice being specific. Use frameworks like STAR (situation, task, action, result) from interviewing to ensure clarity and context. Watch how specificity improves output quality.
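The STAR framing can be applied to prompting almost literally. A minimal sketch (the field contents below are invented examples, not Kistner's templates) forces each of the four elements to be stated explicitly, leaving no unspecified nouns:

```python
def star_prompt(situation: str, task: str, action: str, result: str) -> str:
    """Assemble a prompt using the STAR interview framework so the model
    receives explicit context instead of unspecified nouns."""
    return (
        f"Situation: {situation}\n"
        f"Task: {task}\n"
        f"Action: {action}\n"
        f"Expected result: {result}"
    )

print(star_prompt(
    situation="B2B SaaS with 40% trial-to-paid drop-off at onboarding",
    task="Rewrite the onboarding email sequence",
    action="Draft three subject-line variants per email",
    result="Copy the team can A/B test this sprint",
))
```

Filling in four labeled fields is harder to do vaguely than writing one free-form request, which is the point.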

Next, create a single Google Doc as your context library. Consolidate company positioning, customer profiles, product descriptions. Attach it every time you prompt. Feel the pain of that attachment process. Let the friction build.

Then graduate to tools like Cursor, Zed, or Windsurf—IDE environments designed for developers but perfect for text-based AI workflows. Kistner's setup: file structure on the left with a folder labeled "context library" containing Markdown files (about, market, customer profile, competitors). AI chat on the right. Working document in the middle. @-mention files or drag them into prompts. This setup eliminates the clumsy attachment flow of the ChatGPT UI.
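What the @-mention does mechanically is inline a file's contents at the point of reference. A rough sketch of that behavior, assuming `@name` resolves to `name.md` in the library folder (the syntax and folder convention are assumptions mimicking editors like Cursor, not their implementation):

```python
import re
from pathlib import Path

def expand_mentions(prompt: str, library: str) -> str:
    """Replace each @name token with the contents of library/name.md,
    leaving the token untouched when no such file exists."""
    def inline(match: re.Match) -> str:
        path = Path(library) / f"{match.group(1)}.md"
        return path.read_text().strip() if path.exists() else match.group(0)
    return re.sub(r"@([\w-]+)", inline, prompt)
```

For example, `expand_mentions("Compare us to @competitors", "context_library")` would splice the competitor landscape file straight into the prompt.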

Just start with one doc and then just get used to attaching that every time you're doing it. Then once you feel the pain of that, then take a look at something like Cursor.

— Justin Kistner

For growth professionals without engineering backgrounds, the progression matters. Kistner doesn't recommend diving straight into Cursor. Build the muscle memory of good prompting first. Understand why context transforms output. Then adopt tools that make context management seamless. The technical barrier is low—opening a folder, saving Markdown files—but the conceptual leap is high. You're no longer asking AI questions. You're managing a team that needs onboarding, documentation, and performance evaluation.

Ellis's experience analyzing product-market fit surveys captures this evolution. First attempt: upload CSV, ask basic questions. Next iteration: prompt ChatGPT to answer "the Sean Ellis way." Advanced use: attach context library defining customer segments, filter for high-intent users, generate job statements, ideate landing page copy variations, critique existing messaging. Each layer adds specificity. Each prompt compounds previous insights. The AI doesn't get smarter—the manager does.

The $18K MRR Validation and What Growth Teams Miss

CopySub's rapid growth to $18K MRR validated a thesis many growth teams still miss: AI isn't a replacement strategy, it's an augmentation strategy. Kistner launched the subscription content service after reading about Design Joy, a subscription-based design service. He spent two days building the site. The differentiator wasn't automation—it was the hybrid model. AI handled scale. Humans handled quality assurance.

Clients didn't just buy content. They bought the methodology. When Kistner explained his human-in-the-loop process and context library system, the response was consistent: "Can you just teach me how to do what you're doing?" That demand created CopyClub.ai, where growth professionals learn to build AI agents, optimize prompts, and select tools for specific outcomes.

It's not a replacement for skill. Like you still need to have that skill, but it is a like augmentation of your team.

— Justin Kistner

The mistake Kistner sees repeatedly: treating AI as magic instead of labor. Early ChatGPT users marveled at speed, then dismissed quality. Kistner's approach inverts that. Leverage developmental writing techniques—research, outline, draft, edit—but distribute tasks between human judgment and AI execution. The same logic applies across growth functions. Experimentation design, landing page optimization, survey analysis, roadmap prioritization—all benefit from AI that has been properly onboarded with context.

Ellis's observation resonates: every time he gets a helpful result from a prompt, he shares it with Garr. They text each other prompting strategies. That peer learning loop, Kistner argues, is what separates growth teams that unlock AI productivity from those that bounce off it. The conversation isn't about tools. It's about mental models. How do you talk to someone who has no context? What documentation do they need? How do you evaluate their work?

For Kistner, the regret about UpFocus isn't just operational. It's strategic. With AI, he could have moved faster, tested more hypotheses, scaled content without raising capital in a frozen market. The lesson for growth leaders: the teams building context libraries and treating AI as managed contributors won't just move faster. They'll compound advantages while competitors are still asking ChatGPT to "enhance" their strategy.
