backend-dev
Skill from srbhr/resume-matcher
Backend development skill from Resume Matcher, an open-source tool that uses AI to match resumes with job descriptions.
Same repository: srbhr/resume-matcher (14 items)
Installation
npx vibeindex add srbhr/resume-matcher --skill backend-dev
npx skills add srbhr/resume-matcher --skill backend-dev
Installs to: ~/.claude/skills/backend-dev/SKILL.md
More from this repository (10)
AI-powered tool that helps create tailored resumes for each job application with suggestions, working locally with Ollama or connecting to LLM providers via API.
Builds robust Python APIs with FastAPI, implementing production-ready patterns for async database integration, JWT authentication, and comprehensive validation.
Generates reusable Tailwind CSS design patterns and utility classes for consistent styling across the Resume Matcher application's user interface components.
Resume Matcher is an AI-powered tool that creates tailored resumes for each job application, working locally with Ollama or connecting to your preferred LLM provider via API to generate suggestions.
Helps developers quickly orient and navigate complex codebases by providing structured guidance on project architecture, file locations, and key documentation paths.
AI-powered tool that helps create tailored resumes for each job application with suggestions, working locally with Ollama or connecting to LLM providers via API.
Provides performance optimization guidelines for React and Next.js apps, focusing on local/offline and Docker deployments with Vercel best practices.
A skill from Resume Matcher, an AI-powered tool that creates tailored resumes for each job application with suggestions, working locally with Ollama or via external LLM provider APIs.
A UI review skill from Resume Matcher, an AI-powered resume analysis platform, providing interface evaluation and design feedback capabilities.
Skill for Next.js performance optimization within the Resume Matcher project, an AI-powered tool that creates tailored resumes for each job application using local Ollama or LLM provider APIs.