List available DeepSeek models via MPP-gated endpoint, settled on Tempo L2.
What it does
This endpoint returns the list of available DeepSeek models accessible through the Locus MPP (Micropayment Protocol) gateway. It is part of a broader DeepSeek service that also exposes chat completion and fill-in-the-middle (FIM) code completion endpoints. The list-models call costs a fixed 3000 base units of pathUSD (0.003 USD) per request, settled via the Tempo method on Tempo L2.
The parent service provides access to DeepSeek-V3 (aliased as deepseek-chat) for fast chat and code generation, and DeepSeek-R1 (aliased as deepseek-reasoner) for chain-of-thought reasoning. All endpoints follow an OpenAI-compatible API format. The chat endpoint costs approximately $0.004–$0.025 per call depending on model and token count, while the FIM endpoint costs roughly $0.003–$0.005.
Note that the probe did not capture a 402 challenge on HEAD or GET for this endpoint; since the OpenAPI spec declares it as a POST route, the absence of a 402 on HEAD/GET is expected, and the endpoint is likely live when called with POST. No request body schema is documented for list-models, suggesting it accepts an empty POST. Documentation is available at the DeepSeek API reference and the Locus MPP skill file.
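The model aliases above lend themselves to simple routing logic before calling the chat endpoint. A minimal sketch, assuming an OpenAI-compatible chat payload as described; the routing rule itself is illustrative, not part of the service:

```python
# Route between the two documented DeepSeek aliases and build an
# OpenAI-compatible chat payload. The alias names come from the service
# description; when to prefer each model is an illustrative heuristic.

def pick_model(needs_reasoning: bool) -> str:
    """deepseek-reasoner (R1) for chain-of-thought tasks, deepseek-chat (V3) otherwise."""
    return "deepseek-reasoner" if needs_reasoning else "deepseek-chat"

def build_chat_payload(prompt: str, needs_reasoning: bool = False) -> dict:
    """Assemble the minimal OpenAI-style chat body for the /deepseek/chat route."""
    return {
        "model": pick_model(needs_reasoning),
        "messages": [{"role": "user", "content": prompt}],
    }
```

An agent could call list-models first and only route to an alias that actually appears in the returned list.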
Capabilities
Use cases
- Discovering which DeepSeek models are available before making a chat or FIM request
- Programmatically checking model availability for routing logic in an AI agent
- Building a model selector UI that dynamically lists supported DeepSeek models
Fit
Best for
- Agents that need to enumerate available models before calling chat or FIM endpoints
- Workflows that dynamically select between DeepSeek-V3 and DeepSeek-R1
- Developers integrating DeepSeek via pay-per-call micropayments
Not for
- Generating text or code: use the /deepseek/chat or /deepseek/fim endpoints instead
- Users who need free model discovery: this endpoint charges $0.003 per call
Quick start
curl -X POST https://deepseek.mpp.paywithlocus.com/deepseek/list-models \
  -H "Content-Type: application/json"
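The same call from Python, split so the request construction and response parsing are testable without hitting the network. This is a sketch: the response schema is undocumented, so the parser assumes an OpenAI-style {"data": [...]} list, which is only a guess:

```python
import json
import urllib.request

LIST_MODELS_URL = "https://deepseek.mpp.paywithlocus.com/deepseek/list-models"

def build_list_models_request() -> urllib.request.Request:
    # Empty JSON body: the spec documents no request schema for list-models.
    return urllib.request.Request(
        LIST_MODELS_URL,
        data=b"{}",
        method="POST",
        headers={"Content-Type": "application/json"},
    )

def parse_models(raw: bytes) -> list:
    # Assumed OpenAI-style shape {"data": [{"id": ...}, ...]}; unverified.
    body = json.loads(raw)
    return [m.get("id") for m in body.get("data", [])]

# To actually call it (and settle the 402 micropayment challenge, if any):
#   with urllib.request.urlopen(build_list_models_request()) as resp:
#       models = parse_models(resp.read())
```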
Quality
The OpenAPI spec provides clear path definitions, pricing, and service context, but the probe did not capture a 402 challenge for this specific endpoint (it was probed with HEAD/GET while the spec declares POST). No request or response schema is documented for list-models. Crawled pages returned only 404 JSON errors. The endpoint is likely live but unconfirmed via probe.
Warnings
- Probe did not return 402 on HEAD or GET; the OpenAPI spec declares this as a POST endpoint, so liveness is unconfirmed by probe
- No request body or response schema documented for list-models
- No example response available; actual model list contents are unknown
- Currency address 0x20c000000000000000000000b9537d11c60e8b50 assumed to be pathUSD with 6 decimals based on Tempo L2 convention; if decimals differ, the $0.003 price would be incorrect
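The decimals caveat above is easy to make concrete. A minimal conversion sketch, assuming the 6-decimal pathUSD convention the spec implies (the constant is the assumption being flagged):

```python
# Convert a micropayment quote from token base units to USD.
# PATHUSD_DECIMALS = 6 is the unverified assumption from the warning above:
# if the token actually uses a different precision, the quoted price changes.
PATHUSD_DECIMALS = 6

def base_units_to_usd(base_units: int, decimals: int = PATHUSD_DECIMALS) -> float:
    """3000 base units -> $0.003 under the 6-decimal assumption."""
    return base_units / 10 ** decimals
```

With 18 decimals instead, the same 3000 base units would be worth a negligible fraction of a cent, which is why the assumption matters.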
Citations
- The list-models endpoint costs 3000 base units settled via the Tempo method (https://deepseek.mpp.paywithlocus.com)
- DeepSeek-V3 for fast chat and code, DeepSeek-R1 for deep chain-of-thought reasoning, OpenAI-compatible API format (https://deepseek.mpp.paywithlocus.com)
- Chat endpoint costs approximately $0.004–$0.025 per call; FIM endpoint approximately $0.003–$0.005 (https://deepseek.mpp.paywithlocus.com)
- API reference available at api-docs.deepseek.com (https://api-docs.deepseek.com)
- Locus MPP documentation at beta.paywithlocus.com/mpp/deepseek.md (https://beta.paywithlocus.com/mpp/deepseek.md)