When activated, follow this structured thinking approach to conduct comprehensive technical research:
Step 1: Problem Framing
Goal: Transform a vague research request into specific, answerable questions.
Key Questions to Ask:
- What is the core decision that needs to be made?
- Who is the audience for this research? (developer, CTO, team)
- What is the timeline? (immediate decision vs long-term evaluation)
- What are the constraints? (budget, team skills, existing infrastructure)
Actions:
- Clarify the research scope with the user
- Identify 3-5 key research questions
- Define success criteria (what makes a good answer?)
- Establish evaluation criteria for comparing options
Decision Point: You should be able to articulate:
- "The core question is: [X]?"
- "We will evaluate options based on: [criteria list]"
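The framing output can be sketched as a small record; this is illustrative, not a prescribed API, and the field names and the 3-5 threshold simply mirror the actions above:

```python
from dataclasses import dataclass, field

@dataclass
class ResearchScope:
    """Output of problem framing (field names are illustrative)."""
    core_question: str
    audience: str       # developer, CTO, team
    timeline: str       # immediate decision vs long-term evaluation
    constraints: list[str] = field(default_factory=list)
    research_questions: list[str] = field(default_factory=list)
    evaluation_criteria: list[str] = field(default_factory=list)

    def is_well_framed(self) -> bool:
        # Ready when there is a core question, 3-5 research questions,
        # and at least one evaluation criterion to compare options against.
        return (bool(self.core_question)
                and 3 <= len(self.research_questions) <= 5
                and len(self.evaluation_criteria) >= 1)

example = ResearchScope(
    core_question="Which message queue fits our event pipeline?",
    audience="team",
    timeline="immediate decision",
    research_questions=["throughput?", "ops burden?", "client support?"],
    evaluation_criteria=["maturity", "performance", "learning curve"],
)
```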
Step 2: Hypothesis Formation
Goal: Form initial hypotheses to guide efficient research.
Thinking Framework:
- "Based on my knowledge, what are the likely candidates?"
- "What do I expect to find, and why?"
- "What would change my initial assumptions?"
Actions:
- List 2-4 initial hypotheses or candidate solutions
- Identify knowledge gaps that need to be filled
- Prioritize research areas by impact on decision
Decision Point: Document the following:
- "Initial hypothesis: [X] because [Y]"
- "Key uncertainty: [Z]"
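One lightweight way to track this step is a list of hypothesis records ordered by decision impact; the claims, uncertainties, and scores below are placeholders:

```python
# Each hypothesis carries its reasoning and the uncertainty that would
# confirm or refute it. decision_impact is a rough 1-5 placeholder score.
hypotheses = [
    {"claim": "Library A fits best", "reason": "mature ecosystem",
     "key_uncertainty": "performance at scale", "decision_impact": 3},
    {"claim": "Library B is viable", "reason": "simpler API",
     "key_uncertainty": "community health", "decision_impact": 2},
]

# Research the highest-impact uncertainties first.
research_order = sorted(hypotheses, key=lambda h: h["decision_impact"],
                        reverse=True)
```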
Step 3: Source Strategy
Goal: Identify the most authoritative and relevant sources.
Source Hierarchy (in order of reliability):
- Official Documentation (WebFetch) - Most authoritative
- GitHub Repository Analysis - Code examples, activity metrics
- Context7 Documentation - Structured, searchable docs
- Technical Blogs (WebSearch) - Real-world experiences
- Discussion Forums - Edge cases, gotchas
Thinking Framework:
- "What type of information do I need?"
  - Factual/API details → Official docs
  - Real-world experience → Blogs, case studies
  - Community health → GitHub activity
  - Comparison data → Benchmarks, surveys
Actions:
- List sources to query for each research question
- Note date sensitivity (when does info become stale?)
- Plan for cross-validation of key claims
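A minimal sketch of a source plan, mapping the information types above onto the source hierarchy; the dictionary keys are invented labels, not a fixed taxonomy:

```python
# Maps the kind of information needed to the most reliable source type,
# following the hierarchy above (tool names match the ones listed there).
SOURCE_FOR = {
    "factual_api_details": "Official Documentation (WebFetch)",
    "real_world_experience": "Technical Blogs (WebSearch)",
    "community_health": "GitHub Repository Analysis",
    "comparison_data": "Benchmarks, surveys",
}

def plan_sources(question: str, info_types: list[str]) -> list[tuple[str, str]]:
    """Return (source, question) pairs to query for one research question."""
    return [(SOURCE_FOR[t], question) for t in info_types]
```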
Step 4: Information Gathering
Goal: Systematically collect relevant information.
Thinking Framework - For each source:
- "What am I looking for specifically?"
- "How do I know if this is trustworthy?"
- "Does this confirm or contradict other sources?"
Gathering Checklist:
- [ ] Official documentation for each candidate
- [ ] Getting started / quickstart guides
- [ ] Migration guides (reveal complexity)
- [ ] GitHub metrics (stars, issues, PR activity)
- [ ] Recent blog posts (last 12 months)
- [ ] Benchmark data (if performance-relevant)
Quality Indicators:
- Check article dates (recency matters)
- Verify author credibility
- Look for hands-on experience vs theoretical discussion
- Note sample sizes and methodology for benchmarks
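The "last 12 months" recency check from the gathering checklist can be made explicit; approximating a month as 30 days is a simplifying assumption:

```python
from datetime import date, timedelta

def is_recent(published: date, today: date, max_age_months: int = 12) -> bool:
    """Check the 'recent blog posts (last 12 months)' criterion.
    Approximates a month as 30 days for simplicity."""
    return (today - published) <= timedelta(days=30 * max_age_months)
```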
Step 5: Analysis Framework
Goal: Apply structured analysis to collected information.
Thinking Framework - For Technology Evaluation:
| Dimension | Questions to Answer |
|-----------|---------------------|
| Maturity | How long in production? Stable API? Breaking changes? |
| Community | Active maintainers? Issue response time? Contributor diversity? |
| Performance | Benchmark data? Real-world case studies? |
| Learning Curve | Documentation quality? Tutorials? Time to productivity? |
| Ecosystem | Integrations? Plugins? Tooling support? |
| Risk | Bus factor? Funding/backing? License concerns? |
Maturity Assessment Scale:
| Level | Criteria |
|-------|----------|
| Emerging | < 1 year, experimental, API unstable |
| Growing | 1-3 years, production-ready, active development |
| Mature | 3+ years, stable API, widespread adoption |
| Declining | Decreasing activity, maintenance mode |
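The scale above maps onto a simple classifier; treating declining activity as overriding age is an assumption that matches the table's intent, since a mature project can slip into maintenance mode:

```python
def maturity_level(age_years: float, activity_declining: bool) -> str:
    """Map a project's age and activity trend onto the maturity scale above."""
    if activity_declining:
        return "Declining"   # decreasing activity, maintenance mode
    if age_years < 1:
        return "Emerging"    # experimental, API unstable
    if age_years < 3:
        return "Growing"     # production-ready, active development
    return "Mature"          # stable API, widespread adoption
```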
Step 6: Synthesis
Goal: Transform raw findings into actionable insights.
Thinking Framework:
- "What patterns emerge across sources?"
- "Where do sources agree/disagree?"
- "What are the trade-offs between options?"
Synthesis Process:
- Create comparison matrix against evaluation criteria
- Identify clear winners for specific criteria
- Note where context matters (team, scale, use case)
- Formulate primary recommendation with reasoning
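A comparison matrix with weighted scoring might look like this; the criteria, weights, and 1-5 scores are all placeholders to be replaced with the evaluation criteria from Step 1:

```python
# Scores are 1-5 per criterion; weights reflect impact on the decision
# and should sum to 1. All names and numbers here are placeholders.
criteria_weights = {"maturity": 0.40, "performance": 0.35, "learning_curve": 0.25}
scores = {
    "Option A": {"maturity": 5, "performance": 3, "learning_curve": 4},
    "Option B": {"maturity": 3, "performance": 5, "learning_curve": 3},
}

def weighted_total(option_scores: dict[str, int]) -> float:
    return sum(criteria_weights[c] * s for c, s in option_scores.items())

# Rank options, but report per-criterion winners too: a close total can
# hide a decisive lead on the criterion that matters most in context.
ranking = sorted(scores, key=lambda o: weighted_total(scores[o]), reverse=True)
```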
Handling Conflicts:
- When sources disagree, note the discrepancy
- Check for date differences (newer may be more accurate)
- Look for official clarification
- Present both perspectives if unresolved
Step 7: Risk Assessment
Goal: Identify and document risks for each option.
Thinking Framework:
- "What could go wrong with this choice?"
- "How likely is this risk? How severe?"
- "How can we mitigate this risk?"
Risk Categories:
- Technical: Performance, scalability, integration issues
- Organizational: Learning curve, hiring difficulty
- Strategic: Vendor lock-in, technology obsolescence
- Operational: Deployment complexity, monitoring gaps
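A common way to order these risks is exposure = likelihood × severity on a 1-5 scale; the specific entries below are placeholders:

```python
# One record per identified risk, tagged with its category from the list above.
risks = [
    {"category": "Technical", "risk": "scaling limits",
     "likelihood": 2, "severity": 4},
    {"category": "Strategic", "risk": "vendor lock-in",
     "likelihood": 3, "severity": 3},
    {"category": "Organizational", "risk": "steep learning curve",
     "likelihood": 4, "severity": 2},
]

def exposure(r: dict) -> int:
    """Simple likelihood x severity score (both on a 1-5 scale)."""
    return r["likelihood"] * r["severity"]

# Document mitigations for the highest-exposure risks first.
prioritized = sorted(risks, key=exposure, reverse=True)
```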
Step 8: Recommendation and Roadmap
Goal: Provide clear, actionable recommendations.
Recommendation Structure:
- Primary recommendation with confidence level
- Conditions that would change this recommendation
- Alternative for different contexts
- Implementation roadmap (next steps)
Decision Point: Your recommendation should state:
- "For [this context], I recommend [X] because [Y]"
- "If [condition changes], consider [Z] instead"
- "Next steps: [1, 2, 3]"
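The three decision-point statements can be rendered from one small helper; the parameter names are illustrative:

```python
def format_recommendation(context: str, choice: str, reason: str,
                          pivot_condition: str, alternative: str,
                          next_steps: list[str]) -> str:
    """Render the recommendation structure above as one summary string."""
    steps = "; ".join(f"{i}. {s}" for i, s in enumerate(next_steps, 1))
    return (f"For {context}, I recommend {choice} because {reason}. "
            f"If {pivot_condition}, consider {alternative} instead. "
            f"Next steps: {steps}")
```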