Discovery Call Prompts That Don't Sound Like ChatGPT Wrote Them
Sales discovery is the highest-leverage conversation you have. AI-generated discovery sounds robotic and kills trust. Here are six prompts that produce questions a senior consultant would actually ask.
Discovery calls are where bad sales lose to good sales. The questions you ask determine whether a prospect feels seen or processed. AI can help. AI can also ruin it.
These six prompts produce discovery questions that don't sound like a template. Stop generating "what are your biggest challenges" and start asking what a senior advisor would.
The setup
Each of these prompts assumes you've done basic homework before the call. LinkedIn. Company news. Their website. 5 minutes of prep.
The AI's job is to turn that homework into the right next question, not to do the homework for you.
1. The opener (after pleasantries)
```
Generate an opening discovery question for a sales call.

Prospect: {NAME, COMPANY, ROLE}
What I noticed about their business in 5 minutes of research: {NOTES}
What they said in the initial inquiry: {THEIR_REQUEST}
My theory of their real underlying problem: {MY_HYPOTHESIS}

Produce one specific opening question that:
- References something specific from my research (not generic)
- Tests my hypothesis without revealing it
- Invites a story, not a yes/no
- Sounds like a curious peer, not a salesperson

Banned: "tell me about your business", "what's keeping you up at night", "walk me through your day", anything that could be asked of any prospect.
```
The banned list is the whole point. Generic openers die on contact.
2. The follow-up to "we tried X and it didn't work"
```
Generate a follow-up question for a prospect who said a prior approach failed.

What they tried: {APPROACH}
What they said about why it failed: {THEIR_STATED_REASON}
My read on what actually failed: {MY_READ}

Produce one question that surfaces:
- The actual failure mode (not their face-saving version)
- Whether the underlying problem still exists
- Their appetite to try again with a different approach

Tone: respectful curiosity, not judgment. Avoid implying their prior choice was dumb.
```
Most prospects have already tried something. Their stated reason for failure is rarely the real reason. The follow-up is where you find out.
3. The dig into a specific number
```
Generate a follow-up question when a prospect mentions a number.

The number they mentioned: {NUMBER} in {CONTEXT}
What I think the real story is behind that number: {HYPOTHESIS}

Produce one question that:
- Tests whether the number is normal for their industry or unusually high/low
- Surfaces what they've done about it
- Asks what would change if the number got 20% better

No follow-up that just rephrases their number back. Push for the story.
```
Numbers in discovery are doors. The follow-up walks through them.
4. The "who else cares about this"
```
Generate a stakeholder-mapping question.

The pain point we're discussing: {PAIN}
Who I'm talking to: {ROLE}
Who I suspect would also care about this: {OTHER_STAKEHOLDERS}

Produce one question that:
- Surfaces other people who feel this pain
- Doesn't make the prospect feel I'm trying to go around them
- Reveals who has budget authority vs. who has influence

Examples of good shape:
- "Who else in the company would feel it most if this changed?"
- "If we built this, who'd be in the room when you showed it for the first time?"

Generate one new question in this voice. Not a copy of the examples.
```
The examples-don't-copy framing keeps the AI from producing the same question every time.
5. The budget question without asking about budget
```
Generate a question that surfaces budget without asking directly.

What we've discussed so far: {SUMMARY}
The likely range I'd charge: {MY_RANGE}
Their company size: {SIZE}

Produce one question that:
- Tests whether their thinking about value is in the right ballpark
- Surfaces how they'd justify the spend internally
- Reveals who approves spending in this range

Examples of good shape:
- "How does spending typically get approved for this kind of work at your company?"
- "What's the size of investment you've made on similar initiatives?"

Don't ask "what's your budget" or any direct variant. Generate a new question in this indirect style.
```
Direct budget questions kill momentum. Indirect ones get better answers.
6. The closing question (whether to advance or kill)
```
Generate a closing question that lets me kill or advance the deal cleanly.

The call so far: {SUMMARY}
What I've learned: {KEY_INSIGHTS}
My read on fit: {FIT_ASSESSMENT}

Produce one question that:
- Forces a real answer about whether to keep going
- Doesn't pressure them into a yes
- Makes it easy for them to say "no, not a fit" if that's the truth

Examples of good shape:
- "Based on this conversation, would it be useful to put together a specific proposal, or does it seem like the timing isn't right?"
- "I'd want to make sure we'd actually deliver on this — should we set up a deeper working session before talking proposals?"

Generate a new question that does this work.
```
You want the "no" as fast as the "yes." This question gets both.
The meta-pattern
Notice what every prompt has:
- Specific inputs (not generic)
- "Banned" or "no" lists (forces away from defaults)
- "Examples of good shape" + "don't copy" framing
- A voice constraint (curious peer, not salesperson)
The AI's instinct is to produce average questions. The prompt's job is to push it past average.
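If you save these prompts as reusable templates, the meta-pattern can be made mechanical. Here's a minimal sketch of a template assembler that takes the four parts (specific inputs, criteria, a banned list, a voice constraint) and builds the prompt text. All names and example values are hypothetical, not from the prompts above.

```python
# Minimal sketch of the meta-pattern: specific inputs, criteria,
# a banned list, and a voice constraint assembled into one prompt.
# Function and field names are illustrative.

def build_discovery_prompt(task, inputs, criteria, banned, voice):
    """Assemble a discovery-question prompt from the four meta-pattern parts."""
    lines = [task, ""]
    for label, value in inputs.items():
        lines.append(f"{label}: {value}")
    lines.append("")
    lines.append("Produce one question that:")
    lines.extend(f"- {c}" for c in criteria)
    lines.append("")
    lines.append("Banned: " + ", ".join(f'"{b}"' for b in banned))
    lines.append(f"Voice: {voice}")
    return "\n".join(lines)

prompt = build_discovery_prompt(
    task="Generate an opening discovery question for a sales call.",
    inputs={
        "Prospect": "Jane Doe, Acme, VP Ops",
        "Research notes": "churn doubled after a pricing change",
    },
    criteria=[
        "References something specific from my research",
        "Invites a story, not a yes/no",
    ],
    banned=["tell me about your business", "what's keeping you up at night"],
    voice="curious peer, not salesperson",
)
```

The point of the structure is that the banned list and voice constraint are never optional fields you can forget; they're baked into every prompt you assemble.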
What I do with these
I run them before every important call. I generate 3-4 question candidates per prompt and pick the one that fits the moment.
I don't pre-script the whole call. I script the openers and the closers. The middle is conversation.
The pre-call prep with these prompts takes 8-10 minutes. The call quality improvement is measurable.
What changes for non-sales contexts
You can swap "sales call" for "client review meeting", "board prep", "interview", "investor pitch follow-up". The pattern transfers.
The constants:
- Specific prep inputs (do the homework)
- Banned generic phrases (force specificity)
- Voice constraint (be a peer, not a role)
- One question at a time (don't generate a script; generate the next move)
What I'd never skip
The "what I'm not going to ask" list inside each prompt. That's the discipline. AI defaults to generic; the negative constraints push it toward specific.
If you only adopt one habit from this post: build a negative list for every prompt you save. The negatives matter more than the positives.
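The negative list is also easy to enforce mechanically when you generate several candidates per prompt: screen each one against the banned phrases before you even read it. A minimal sketch, with hypothetical names and example phrases:

```python
# Screen generated question candidates against a prompt's negative list.
# Helper name and example candidates are illustrative, not from the post.

def violates_negative_list(candidate, banned_phrases):
    """Return True if the candidate contains any banned phrase (case-insensitive)."""
    text = candidate.lower()
    return any(phrase.lower() in text for phrase in banned_phrases)

banned = ["tell me about your business", "what's keeping you up at night"]

candidates = [
    "What's keeping you up at night these days?",
    "You mentioned churn doubled after the pricing change -- what happened?",
]
kept = [c for c in candidates if not violates_negative_list(c, banned)]
```

Here the first candidate trips the banned list and gets dropped before you waste attention on it; only the research-specific question survives.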
Want the full guide? Check out our deep-dive page for more context, FAQs, and resources.