In-Depth Explanation
B2B Prompt Categories by Buyer Stage
Stage 1: Awareness Prompts
Buyers at the awareness stage are discovering problems and learning that solutions exist:
- "What is [category]?"
- "How does [category] work?"
- "Why do I need [category]?"
- "What problems does [category] solve?"
- "[Category] for [industry]"
These prompts require educational content that builds foundational understanding without pushing specific solutions. AI models cite educational resources, guides, and overviews that provide clear, comprehensive explanations.
Stage 2: Discovery Prompts
Buyers are exploring options and identifying potential solutions:
- "What are the best [category] tools?"
- "Top [category] software for [use case]"
- "Most popular [category] platforms"
- "[Category] tools under [budget]"
- "Free [category] software"
These prompts demand comparison and recommendation content that positions your software among top options. AI models cite comparison pages, "best of" lists, and review aggregations.
Stage 3: Evaluation Prompts
Buyers are comparing specific options and narrowing choices:
- "[Your Software] vs [Competitor]"
- "Which is better: [Software A] or [Software B]?"
- "[Software] alternatives"
- "[Category] features comparison"
- "[Software] pricing vs [Competitor]"
These prompts require detailed comparison content that provides objective analysis. AI models cite head-to-head comparison pages, feature breakdowns, and competitive analysis.
Stage 4: Decision Prompts
Buyers are ready to commit and need implementation details:
- "[Software] pricing"
- "[Software] setup guide"
- "[Software] integrations"
- "[Software] customer support"
- "[Software] for [specific company size]"
These prompts require practical, actionable information. AI models cite pricing pages, implementation guides, integration documentation, and use case pages.
Stage 5: Post-Purchase Prompts
Buyers are implementing and optimizing their choice:
- "How to use [Software feature]"
- "[Software] best practices"
- "[Software] tips and tricks"
- "[Software] troubleshooting"
- "[Software] advanced features"
These prompts demand comprehensive documentation and educational content. AI models cite help centers, knowledge bases, and tutorial content.
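The five stages above can be approximated programmatically when tracking which prompts your content surfaces for. Here is a minimal sketch that buckets a prompt into a buyer stage using keyword heuristics; the patterns are illustrative assumptions drawn from the examples above, not an official taxonomy, and a production system would need a richer classifier.

```python
import re

# Hypothetical keyword heuristics per buyer stage, checked from most
# specific to least specific so "Acme vs Initech pricing" lands in
# evaluation rather than decision.
STAGE_PATTERNS = [
    ("evaluation",    re.compile(r"\bvs\b|alternatives|which is better", re.I)),
    ("post-purchase", re.compile(r"how to use|best practices|troubleshoot|tips", re.I)),
    ("decision",      re.compile(r"pricing|setup|integrations?|customer support", re.I)),
    ("discovery",     re.compile(r"\bbest\b|\btop\b|most popular|\bfree\b", re.I)),
    ("awareness",     re.compile(r"what is|how does|why do i need|problems", re.I)),
]

def classify_stage(prompt: str) -> str:
    """Return the first matching buyer stage for a prompt, else 'unknown'."""
    for stage, pattern in STAGE_PATTERNS:
        if pattern.search(prompt):
            return stage
    return "unknown"
```

Tagging a prompt log this way lets you see which stages your citations cluster in, and which stages lack supporting content.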
Prompt Patterns AI Values
1. Question Prompts
Direct questions seeking information:
- "What is [feature]?"
- "How does [feature] work?"
- "When should I use [feature]?"
- "Why choose [software]?"
AI models cite content that provides direct, answer-first responses.
2. Comparison Prompts
Requests for comparisons between options:
- "[Software A] vs [Software B]"
- "Best [category] for [use case]"
- "Top 5 [category] tools"
- "[Category] alternatives to [Competitor]"
AI models cite comparison tables, feature lists, and objective analysis.
3. List Prompts
Requests for lists of options:
- "Best [category] tools"
- "Top [category] software"
- "[Category] under [price]"
- "[Category] for [industry]"
AI models cite curated lists, rankings, and comprehensive recommendations.
4. Recommendation Prompts
Requests for personalized recommendations:
- "Recommend [category] for my [industry] business"
- "What [category] should I choose for [company size]?"
- "Which [category] integrates with [platform]?"
AI models cite recommendation engines, use case pages, and integration guides.
5. How-To Prompts
Requests for implementation guidance:
- "How to set up [software]"
- "How to use [feature]"
- "How to integrate [software] with [platform]"
- "How to migrate from [competitor]"
AI models cite step-by-step guides, tutorials, and documentation.
6. Pricing Prompts
Requests for cost information:
- "[Software] pricing"
- "[Software] cost vs [competitor]"
- "Is [software] worth the price?"
- "[Category] under [budget]"
AI models cite pricing pages, cost analysis, and value comparisons.
7. Review Prompts
Requests for user experiences:
- "[Software] reviews"
- "Is [software] any good?"
- "[Software] pros and cons"
- "What users say about [software]"
AI models cite review platforms, customer testimonials, and case studies.
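The bracketed templates throughout this section can be expanded into a concrete prompt set for monitoring. The sketch below assumes the square-bracket placeholders are rewritten as Python format fields; the category, industry, and competitor values are placeholder examples, not recommendations.

```python
from itertools import product

# A few of the templates above, with [placeholders] as format fields.
templates = [
    "What is {category}?",
    "Best {category} tools for {industry}",
    "{category} alternatives to {competitor}",
]

# Example substitution values (hypothetical).
values = {
    "category": ["CRM software", "email marketing software"],
    "industry": ["healthcare", "e-commerce"],
    "competitor": ["Acme CRM"],
}

def expand(template: str) -> list[str]:
    """Fill every combination of matching placeholder values into a template."""
    keys = [k for k in values if "{" + k + "}" in template]
    combos = product(*(values[k] for k in keys))
    return [template.format(**dict(zip(keys, combo))) for combo in combos]

prompts = [p for t in templates for p in expand(t)]
```

Running the expanded prompts against AI assistants on a schedule shows which patterns cite your pages and which cite competitors.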