Prompt Studio
Fan a prompt out across providers and compare streamed outputs side by side.
System (optional): applies to all providers.
Prompt: use {{var}} for required substitutions and {{var?}} for optional ones.
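The substitution rule above can be sketched as a small helper. This is an illustrative implementation, not Mantra's actual code; the function name `fillTemplate` is an assumption.

```javascript
// Sketch of {{var}} / {{var?}} substitution: required variables must be
// present, optional ones ({{var?}}) are silently dropped when missing.
function fillTemplate(template, vars) {
  return template.replace(/\{\{(\w+)(\??)\}\}/g, (_, name, opt) => {
    if (name in vars) return String(vars[name]);
    if (opt === "?") return "";                 // optional: drop silently
    throw new Error(`Missing template variable: ${name}`);
  });
}
```

For example, `fillTemplate("Hello {{name}}{{suffix?}}", { name: "Ada" })` yields `"Hello Ada"`, while an unset required variable raises an error.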
Eval Bench
Score a dataset across providers with assertions or LLM-as-judge.
Prompt template: use {{input}} and {{expected}} substitutions.
Assertions: scored against each provider's output.
Providers: pick them in Prompt Studio's provider row, then come back here.
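Assertion scoring can be sketched as the fraction of checks that pass against a provider's output. The assertion shapes shown (`contains`, `regex`) are assumptions for illustration, not Mantra's actual schema.

```javascript
// Score one provider's output against a list of assertions; the score is
// the fraction of assertions that pass (0..1). Unknown types fail closed.
function scoreOutput(output, assertions) {
  const passed = assertions.filter((a) => {
    if (a.type === "contains") return output.includes(a.value);
    if (a.type === "regex") return new RegExp(a.value).test(output);
    return false;
  });
  return passed.length / assertions.length;
}
```

A dataset-level score is then just this value averaged over rows, per provider.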
RAG Lab
Chunk → embed → retrieve → build context. Hand off to Studio for inference.
① Documents & chunking
② Embeddings
③ Retriever: Dense (cosine) · BM25 · Hybrid (RRF)
④ Context builder
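The Hybrid (RRF) option above refers to Reciprocal Rank Fusion, which merges the dense and BM25 rankings without comparing their raw scores. A minimal sketch, assuming the conventional constant k = 60 (the function name `rrfFuse` is illustrative):

```javascript
// Reciprocal Rank Fusion: each document scores sum(1 / (k + rank)) over
// the input rankings (rank is 1-based); fused order is by descending score.
function rrfFuse(rankings, k = 60) {
  const scores = new Map();
  for (const list of rankings) {
    list.forEach((id, i) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

Because RRF only uses ranks, it needs no score normalization between cosine similarities and BM25 scores, which is why it is a common default for hybrid retrieval.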
Run history
Recent runs are stored in OPFS.
Models & Keys
Keys stay in this browser's localStorage. Test buttons hit each provider's models endpoint.

Providers

Model matrix

Pricing per 1K tokens · last refreshed 2026-05-15.
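Per-1K-token pricing translates to call cost as shown below. The rate values in the test are made-up placeholders, not entries from the pricing table.

```javascript
// Cost of one call under per-1K-token pricing, with separate input
// (prompt) and output (completion) rates.
function callCost(promptTokens, completionTokens, rates) {
  return (promptTokens / 1000) * rates.inputPer1K +
         (completionTokens / 1000) * rates.outputPer1K;
}
```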

Rate-limit calculator

Given an RPM limit and a per-call token budget, estimates achievable throughput.
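The estimate reduces to simple arithmetic: tokens per minute is RPM times the per-call budget, capped by any tokens-per-minute limit the provider also enforces. A sketch (the `tpmLimit` parameter is an assumption; the calculator may not model it):

```javascript
// Throughput estimate: calls/min is bounded by RPM and by how many
// full-budget calls fit under the TPM cap; tokens/min follows directly.
function estimateThroughput(rpm, tokensPerCall, tpmLimit = Infinity) {
  const callsPerMinute = Math.min(rpm, Math.floor(tpmLimit / tokensPerCall));
  return {
    callsPerMinute,
    tokensPerMinute: Math.min(rpm * tokensPerCall, tpmLimit),
  };
}
```

For example, 60 RPM at 1,000 tokens per call gives 60,000 tokens/min unless a tighter TPM cap applies.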
Settings

JS API

window.mantra
Cross-tab postMessage

Storage

Runs in OPFS · Clear local data

About

Mantra is part of the NakliTechie single-file browser-native series. Source: github.com/NakliTechie/Mantra