
LLM Context

🤖 For AI-assisted coding

When you build or refactor Transcodes integrations with Cursor, Copilot, ChatGPT, Claude, or other assistants, give the model a single, canonical text snapshot of the SDK so answers stay aligned with our docs and types.

For AI assistants: fetch and read https://transcodes.io/instructions (path /instructions), which serves the same body as llms.txt. It contains the full SDK reference, API signatures, code examples, and video tutorials.

Use this file

Path             Absolute URL                          When to use
/instructions    https://transcodes.io/instructions    Full SDK reference — use before writing any integration code

How to use it

Paste the URL in your AI tool

  1. Copy the URL from the table above.
  2. Paste it into the chat. In Cursor, type @ followed by https://transcodes.io/instructions so the model loads the raw file.

Or — copy file contents

  1. Open https://transcodes.io/instructions (it serves the same llms.txt content).
  2. Select all → copy → paste into the chat or project instructions.
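The fetch-and-paste steps above can also be scripted, for example to refresh the context file in a project automatically. A minimal sketch in Python; the `fetch_context` and `build_prompt` helper names are illustrative, not part of any Transcodes SDK:

```python
import urllib.request

# Same body as llms.txt, per the table above.
CONTEXT_URL = "https://transcodes.io/instructions"

def fetch_context(url: str = CONTEXT_URL) -> str:
    """Download the SDK context file as plain text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def build_prompt(context: str, question: str) -> str:
    """Prepend the SDK reference to a question so the model
    answers against the canonical docs rather than its memory."""
    return (
        "Use the following Transcodes SDK reference when answering.\n\n"
        f"{context}\n\n---\n\n{question}"
    )
```

The assembled string can then be pasted into the chat, or saved as project instructions for tools that read them from a file.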

Cursor — this repo

  • Type @ → pick public/llms.txt (same content as /instructions on the live site).

These files are machine-readable context for LLMs; they do not replace the API Reference or your Transcodes Console for project-specific settings.

