HyperStore MCP turns the world's first AI apps directory into a native tool that Claude, ChatGPT, Cursor, Windsurf, Cline, Zed and Gemini can use directly — no plugins, no scraping, no API keys.
Quick local install: uvx hyperstore-mcp
The Model Context Protocol is an open standard from Anthropic that lets any AI assistant call external tools the same way a developer calls an API — but in plain conversation. Think of it as USB-C for AI: one protocol, every client, every data source.
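On the wire, MCP is JSON-RPC 2.0 carried over one of those transports. As a rough sketch (the method name comes from the MCP spec; the envelope is what a client would POST to the server's /mcp endpoint), asking a server to enumerate its tools looks like:

```python
import json

# JSON-RPC 2.0 envelope an MCP client sends to enumerate a server's tools.
# "tools/list" is the standard MCP method name; "id" correlates the
# response with this request.
list_tools_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

print(json.dumps(list_tools_request))
```

A real client performs the MCP initialize handshake first; this only shows the shape of a single request.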
Your AI doesn't browse the web or guess. It calls our directory directly and gets structured, up-to-date answers.
Read endpoints are public. Install, connect, ask. Nothing to register, nothing to bill.
Streamable HTTP, SSE and stdio transports — every major MCP client is supported out of the box.
Eight first-class tools your assistant can call on your behalf.
Full-text search across every app — name, description, features.
Natural-language semantic search: "a writing assistant cheaper than Jasper".
Full details for a single app: pricing, features, pros/cons, social links.
Paginated listing with filters (pricing model, has API, open source, rating).
Every curated category with live app counts.
All apps in a category, sorted by popularity or rating.
Editorially curated picks — trending, new, staff favorites.
The full homepage in one shot — featured apps, top categories, hero copy.
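Once a client has discovered a tool's name via tools/list, invoking any of the eight tools above is a single tools/call request. A minimal sketch — the tool name search_apps and its query argument are illustrative assumptions, not the server's documented schema:

```python
import json

call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",        # standard MCP method
    "params": {
        "name": "search_apps",     # hypothetical tool name, for illustration
        "arguments": {
            "query": "a writing assistant cheaper than Jasper",
        },
    },
}

body = json.dumps(call_request)
print(body)
```

The server's response carries the structured directory data back in the same JSON-RPC envelope, which is how your assistant gets live answers instead of guesses.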
Pick your client. Pick your transport. Copy. Done.
Claude Desktop: Open Settings → Developer → Edit Config, then add:
{
"mcpServers": {
"hyperstore": {
"url": "https://mcp.store.hypergpt.ai/mcp"
}
}
}
Claude Code: One command from your terminal:
claude mcp add --transport http \
hyperstore https://mcp.store.hypergpt.ai/mcp
ChatGPT: In Settings → Connectors → Add a custom connector, paste:
https://mcp.store.hypergpt.ai/mcp
Cursor: Add to ~/.cursor/mcp.json:
{
"mcpServers": {
"hyperstore": {
"url": "https://mcp.store.hypergpt.ai/mcp"
}
}
}
Windsurf: Open Cascade → MCP Servers → Add Custom:
{
"mcpServers": {
"hyperstore": {
"serverUrl": "https://mcp.store.hypergpt.ai/mcp"
}
}
}
Zed: In your settings, under context_servers:
"context_servers": {
"hyperstore": {
"source": "custom",
"url": "https://mcp.store.hypergpt.ai/mcp"
}
}
Cline: Open the MCP panel → Configure MCP Servers:
{
"mcpServers": {
"hyperstore": {
"transportType": "streamableHttp",
"url": "https://mcp.store.hypergpt.ai/mcp"
}
}
}
Gemini CLI: From your terminal:
gemini mcp add --transport http \
hyperstore https://mcp.store.hypergpt.ai/mcp
Local stdio: Add this entry to any MCP-compatible client's config (Claude Desktop, Cursor, Windsurf, Cline, Zed, etc.):
{
"mcpServers": {
"hyperstore": {
"command": "uvx",
"args": ["hyperstore-mcp"]
}
}
}
Requires uv. The package runs locally and proxies to store.hypergpt.ai — no data leaves your machine other than the directory queries themselves.
We publish to every major MCP registry so your client can discover us automatically.
No. Read tools (search, list, get) are open and unauthenticated. We rate-limit per IP.
Yes — the source is MIT-licensed on GitHub. Clone, run with uvx, point your client at it.
The public read API powering MCP is free and stays free. Write operations (listing your app) follow the normal submission flow.
We publish a standard discovery document at /.well-known/mcp.json. Clients that support auto-discovery will find us with just the domain.
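Auto-discovery means a client can derive the document's location from the bare domain alone. A minimal sketch using only the standard library (URL resolution only, no network call):

```python
from urllib.parse import urljoin

domain = "https://store.hypergpt.ai"
# Well-known path from this page; a client resolves it against the domain.
discovery_url = urljoin(domain, "/.well-known/mcp.json")
print(discovery_url)  # https://store.hypergpt.ai/.well-known/mcp.json
```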
Install in under 30 seconds. No signup, no key, no friction.