🎯

kalshi-markets

🎯Skill

from disler/beyond-mcp

VibeIndex|
What it does

kalshi-markets skill from disler/beyond-mcp


Installation

uv run kalshi status
uv run kalshi events
uv run kalshi events --json
uv run kalshi events --json --limit 100
uv run status.py


📖 Extracted from docs: disler/beyond-mcp

Last Updated: Nov 9, 2025

Skill Details

SKILL.md

Overview

# Beyond MCP

> It's time to push beyond MCP Servers... Right?

>

> Let's break down the real engineering trade-offs between MCP, CLI, File System Scripts, and Skills-based approaches for building reusable toolsets for your AI Agents.

>

> Watch the full video breakdown here: [Beyond MCP](https://youtu.be/OIKTsVjTVJE)

Purpose of this Repo

  • MCP Servers are the standard way to build reusable toolsets for your AI Agents. But they are not the only way.
  • MCP Servers come with a massive cost - instant context loss.
  • When you have a single, or a few MCP Servers, this is not a big deal. But as you scale to many agents, many tools, and many contexts - this cost quickly becomes a bottleneck.
  • So what are the alternatives that big players are using to build powerful, reusable, context preserving toolsets for their AI Agents?

_Here we explore 4 concrete approaches in this repo, all implementing access to Kalshi prediction market data._

The 4 Approaches

![The 4 Approaches Revealed](images/NodeGraphLR-revealed.gif)

`apps/1_mcp_server/` - MCP Server

![MCP Server Architecture](images/NodeGraphLR-mcp-server.gif)

`apps/2_cli/` - CLI

![CLI Architecture](images/NodeGraphLR-cli.gif)

`apps/3_file_system_scripts/` - File System Scripts

![File System Scripts Architecture](images/NodeGraphLR-scripts.gif)

`apps/4_skill/` - Skill

![Agent Skill Architecture](images/NodeGraphLR-skills.gif)

Quick Start

1. MCP Server

```bash
cp .mcp.testing .mcp.json
claude --mcp-config .mcp.json

prompt: "kalshi: get exchange status"
```
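
For reference, Claude Code reads MCP server definitions from `.mcp.json` under an `mcpServers` key; the sketch below writes a minimal, hypothetical entry for this repo's server. The real configuration lives in `.mcp.testing`, and the launch command used here is an assumption.

```python
# write_mcp_config.py - hypothetical sketch; .mcp.testing holds the real config
import json
from pathlib import Path

config = {
    "mcpServers": {
        "kalshi": {
            # Assumed launch command for the FastMCP server in apps/1_mcp_server/
            "command": "uv",
            "args": ["run", "--directory", "apps/1_mcp_server", "server.py"],
        }
    }
}

Path(".mcp.json").write_text(json.dumps(config, indent=2) + "\n")
print(Path(".mcp.json").read_text())
```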

2. CLI

```bash
# by agent
claude

prompt: "/prime_kalshi_cli_tools"
prompt: "kalshi: Get exchange status"
prompt: "kalshi: List events"
prompt: "kalshi: List events in JSON"
prompt: "kalshi: List events in JSON, limit 100"

# or by hand
cd apps/2_cli
uv sync
uv run kalshi status
uv run kalshi events
uv run kalshi events --json
uv run kalshi events --json --limit 100
```

3. File System Scripts

```bash
# by agent
claude

prompt: "/prime_file_system_scripts"
prompt: "kalshi: Get exchange status"
prompt: "kalshi: List events"
...

# or by hand
cd apps/3_file_system_scripts/scripts
uv run status.py
uv run *.py
```
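
A plausible shape for one of these scripts, assuming the repo uses uv's inline script metadata (PEP 723) so each file declares its own dependencies and runs standalone; the endpoint and output format here are assumptions, not the repo's actual code.

```python
# status.py - hypothetical single-file script runnable with `uv run status.py`
# /// script
# requires-python = ">=3.12"
# dependencies = ["httpx"]
# ///
import httpx

# Assumed public Kalshi endpoint for exchange status
STATUS_URL = "https://api.elections.kalshi.com/trade-api/v2/exchange/status"


def main() -> None:
    resp = httpx.get(STATUS_URL, timeout=10.0)
    resp.raise_for_status()
    # Print the raw response; the real script may format this for the agent
    print(resp.json())


if __name__ == "__main__":
    main()
```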

4. Skill

```bash
cd apps/4_skill/
claude

prompt: "kalshi markets: Get exchange status"
prompt: "kalshi markets: search for events about 'best ai'" # Note: this triggers the cache build on first run, which takes several minutes
...
```

The 4 Approaches In Detail

  • apps/1_mcp_server/ - MCP Server
  • apps/2_cli/ - CLI
  • apps/3_file_system_scripts/ - File System Scripts
  • apps/4_skill/ - Skill

1. MCP Server (`apps/1_mcp_server/`)

Classic Model Context Protocol implementation

  • ✅ Standardized integration - Works with any MCP-compatible client
  • ✅ Tool discovery - Auto-exposes 15 tools to LLMs
  • ✅ Clean abstractions - MCP protocol handles complexity
  • ❌ Instant context loss - Every tool call loses conversational context
  • ❌ Wrapper overhead - Delegates to CLI via subprocess

Architecture:

```

Claude/LLM → MCP Protocol → MCP Server → subprocess → CLI → Kalshi API

```

Key files:

  • server.py - FastMCP server with 15 tool definitions
  • Wraps CLI commands in MCP tool interface
  • Each tool call is stateless

When to use: Building tools for multiple LLM clients, need standardized protocol, context loss is acceptable.
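
As a rough illustration of the wrapper pattern described above (not the repo's actual `server.py`), a FastMCP tool that shells out to the CLI could look like the sketch below; the tool name, working directory, and CLI invocation are assumptions.

```python
# Hypothetical sketch of one MCP tool delegating to the CLI via subprocess
import subprocess

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("kalshi")


@mcp.tool()
def get_exchange_status() -> str:
    """Return Kalshi exchange status by delegating to the CLI app."""
    result = subprocess.run(
        ["uv", "run", "kalshi", "status"],
        cwd="apps/2_cli",  # assumed location of the CLI project
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout


if __name__ == "__main__":
    mcp.run()  # stdio transport; each tool call is stateless, as noted above
```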

---

2. CLI (`apps/2_cli/`)

Direct HTTP API access via command-line interface

  • ✅ Single source of truth - Direct API calls, no wrappers
  • ✅ Dual output modes - Human-readable or pure JSON
  • ✅ Smart caching - Pandas-based search with 6-hour TTL
  • ✅ Minimal overhead - Direct httpx calls, no SDK
  • ✅ Improved context - The agent reads ~half as much context as with the MCP Server

Architecture:

```

Claude → subprocess → CLI (13 commands) → Direct HTTP → Kalshi API

```
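
To make "direct HTTP, dual output" concrete, here is a hedged sketch of one command; it assumes a Typer-based CLI and the public Kalshi events endpoint, neither of which is confirmed by this excerpt.

```python
# Hypothetical sketch of a dual-output `events` command (not the repo's cli.py)
import json

import httpx
import typer

app = typer.Typer()
BASE_URL = "https://api.elections.kalshi.com/trade-api/v2"  # assumed base URL


@app.command()
def events(
    json_out: bool = typer.Option(False, "--json", help="Emit raw JSON"),
    limit: int = typer.Option(20, "--limit", help="Max events to fetch"),
) -> None:
    """List Kalshi events: human-readable by default, raw JSON with --json."""
    resp = httpx.get(f"{BASE_URL}/events", params={"limit": limit}, timeout=10.0)
    resp.raise_for_status()
    data = resp.json()
    if json_out:
        print(json.dumps(data, indent=2))
    else:
        for event in data.get("events", []):
            print(f"{event.get('event_ticker', '?'):<30} {event.get('title', '')}")


if __name__ == "__main__":
    app()
```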

Key files:

  • kalshi_cli/cli.py - All 13 commands (552 lines)
  • kalshi_cli/modules/client.py - HTTP