Why LLMs Need QR Code Generation
"Make me a QR code for my website." It is one of the most common practical requests people make to ChatGPT, Claude, and other large language models. And until recently, every LLM had to give the same disappointing answer: sorry, I can't generate images like that.
The problem is straightforward. LLMs are text-generation systems. They can write code, explain concepts, and produce structured data, but they cannot render a bitmap image of a QR code from scratch. A QR code is a precise binary matrix governed by the ISO/IEC 18004 standard, with exact module placement, error correction codewords, and masking patterns. Even if an LLM could describe every black and white module in a 177x177 grid, it would have no way to deliver that as a scannable image in a chat interface.
This is where QR Gen's API changes everything. By providing a simple HTTP GET endpoint that returns a QR code image, QR Gen gives any LLM the ability to "generate" QR codes. The LLM doesn't need to understand Reed-Solomon error correction or Galois field arithmetic. It just needs to construct a URL. And constructing URLs is something every LLM is exceptionally good at.
The implications are significant. Millions of people interact with AI assistants daily, and a meaningful fraction of those conversations involve requests that could benefit from a QR code: sharing WiFi credentials, linking to a website, encoding contact information, generating payment links, or creating event check-in codes. With QR Gen's API, every one of those requests can now be fulfilled instantly. Let's walk through the different integration approaches, from the simplest to the most sophisticated.
The API Approach: One GET Request
QR Gen exposes a single GET endpoint that accepts the data to encode as a query parameter and returns a PNG image. There is no API key, no authentication, no rate limiting for reasonable usage. The simplest possible call looks like this:
curl "https://qrgenapp.com/api/qr?data=https://example.com" -o qr.png

That's it. The response is a PNG image of a QR code encoding the URL https://example.com. You can customize the output with additional parameters:
curl "https://qrgenapp.com/api/qr?data=https://example.com&size=400&fg=1a1a2e&bg=ffffff" -o qr.png

The size parameter controls the image dimensions in pixels. The fg and bg parameters set foreground and background colors as hex values (without the # prefix). The API also supports format=svg for vector output, which is useful for print applications.
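The same request can be assembled in application code. Here is a small sketch (TypeScript, standard URL API only; the endpoint and parameter names are exactly those from the curl examples above) that builds a request URL with percent-encoding handled automatically:

```typescript
// Build a QR Gen request URL. URLSearchParams percent-encodes the
// data value, which matters for payloads containing ':', ';', or '&'.
function buildQrUrl(
  data: string,
  opts: { size?: number; fg?: string; bg?: string; format?: "png" | "svg" } = {}
): string {
  const params = new URLSearchParams({ data });
  if (opts.size !== undefined) params.set("size", String(opts.size));
  if (opts.fg !== undefined) params.set("fg", opts.fg);
  if (opts.bg !== undefined) params.set("bg", opts.bg);
  if (opts.format !== undefined) params.set("format", opts.format);
  return `https://qrgenapp.com/api/qr?${params.toString()}`;
}
```

Calling buildQrUrl("https://example.com", { size: 400, fg: "1a1a2e", bg: "ffffff" }) produces the same URL as the second curl example, with the data value safely encoded.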
Here's the key insight that makes this work for LLMs: because the endpoint is a simple GET request that returns an image, the URL itself can be embedded directly in markdown as an image. Any LLM that can output markdown can "generate" a QR code by writing:

![QR code](https://qrgenapp.com/api/qr?data=https://example.com)
When a chat interface renders this markdown, the user sees an actual QR code image inline in the conversation. The LLM never touched a pixel. It wrote a URL, and the browser did the rest. This approach works in ChatGPT, Claude, Perplexity, and any other interface that renders markdown images. See the full API reference for all available parameters.
Function Calling and Tool Use
The markdown embed trick is elegant, but function calling (also called tool use) provides a more structured approach. Both OpenAI and Anthropic support function calling, where the model can invoke predefined tools during a conversation. You define a tool schema, and when the model determines it needs to generate a QR code, it emits a structured function call instead of free-form text.
Here's the tool schema you would register for QR code generation:
{
  "name": "generate_qr_code",
  "description": "Generate a QR code image for the given data. Returns an image URL. Supports URLs, plain text, WiFi credentials (WIFI:T:WPA;S:ssid;P:password;;), vCards, and any other string.",
  "parameters": {
    "type": "object",
    "properties": {
      "data": {
        "type": "string",
        "description": "The data to encode in the QR code"
      },
      "size": {
        "type": "integer",
        "description": "Image size in pixels (default: 300)",
        "default": 300
      },
      "fg": {
        "type": "string",
        "description": "Foreground color as hex without #, e.g. '000000'",
        "default": "000000"
      },
      "bg": {
        "type": "string",
        "description": "Background color as hex without #, e.g. 'ffffff'",
        "default": "ffffff"
      }
    },
    "required": ["data"]
  }
}

Your backend handler receives the function call, constructs the QR Gen API URL, and returns it to the model. Here's how a typical conversation flows:
User: Create a QR code for my WiFi network MyNetwork with password abc123
LLM: [calls generate_qr_code with data="WIFI:T:WPA;S:MyNetwork;P:abc123;;"]
Tool response: https://qrgenapp.com/api/qr?data=WIFI%3AT%3AWPA%3BS%3AMyNetwork%3BP%3Aabc123%3B%3B&size=400
LLM: Here's your WiFi QR code. When scanned, it will automatically connect the device to "MyNetwork" using WPA security.
The model knows to use the WIFI: URI scheme because that's part of its training data. The function calling layer handles the API interaction cleanly, and the user gets a working QR code in seconds. The model can also handle vCard encoding for contact cards, geo: URIs for locations, MATMSG: for pre-composed emails, and plain URLs. For more details on integrating with OpenAI models, see our OpenAI integration guide.
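Wiring this into a chat loop is mostly bookkeeping. Below is a hedged sketch of the handler side in TypeScript: the ToolCall shape mirrors the function-call format used by the OpenAI chat completions API (name plus a JSON string of arguments), and the argument names and defaults follow the schema above.

```typescript
// Minimal shape of a function tool call as delivered by the model:
// the arguments field arrives as a JSON string, not a parsed object.
interface ToolCall {
  function: { name: string; arguments: string };
}

// Dispatch a generate_qr_code call: parse the JSON arguments, apply
// the schema defaults, and return the QR Gen image URL as the result.
function handleToolCall(call: ToolCall): string {
  if (call.function.name !== "generate_qr_code") {
    throw new Error(`Unknown tool: ${call.function.name}`);
  }
  const args = JSON.parse(call.function.arguments) as {
    data: string;
    size?: number;
    fg?: string;
    bg?: string;
  };
  const params = new URLSearchParams({
    data: args.data,
    size: String(args.size ?? 300),
    fg: args.fg ?? "000000",
    bg: args.bg ?? "ffffff",
  });
  return `https://qrgenapp.com/api/qr?${params.toString()}`;
}
```

The returned URL goes back to the model as the tool response, and the model folds it into its natural-language reply, exactly as in the WiFi conversation above.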
MCP: Model Context Protocol
Anthropic's Model Context Protocol (MCP) takes tool use a step further by standardizing how AI models discover and interact with external services. Instead of hardcoding tool schemas in your application, an MCP server advertises its capabilities, and MCP-compatible clients like Claude Desktop, Claude Code, Cursor, and Windsurf can connect to it dynamically.
An MCP server for QR Gen would expose a generate_qr tool that any connected client can use. Here's a minimal MCP server implementation in TypeScript:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "qrgen", version: "1.0.0" });

server.tool(
  "generate_qr",
  "Generate a QR code image URL for the given data",
  {
    data: z.string().describe("Data to encode in the QR code"),
    size: z.number().optional().default(300).describe("Image size in pixels"),
  },
  async ({ data, size }) => {
    const params = new URLSearchParams({ data, size: String(size) });
    const url = `https://qrgenapp.com/api/qr?${params}`;
    return {
      content: [{ type: "text", text: url }],
    };
  }
);

// Serve over stdio so desktop clients can launch the server as a subprocess.
await server.connect(new StdioServerTransport());

Once this server is running, any MCP client can add it to their configuration. In Claude Desktop, you'd add it to claude_desktop_config.json. In Cursor or Windsurf, you'd add it to the project's MCP settings. The AI model then has access to QR code generation in every conversation without any additional setup. For the complete MCP integration walkthrough, see our full LLM integration guide.
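Registering the server is a one-time config change. Here is a sketch of what the claude_desktop_config.json entry might look like, assuming the server above was compiled to build/index.js (the path is hypothetical; adjust it to wherever your build output lives):

```json
{
  "mcpServers": {
    "qrgen": {
      "command": "node",
      "args": ["/path/to/qrgen-server/build/index.js"]
    }
  }
}
```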
Claude Code: The Bash Tool Approach
Claude Code is Anthropic's command-line coding agent, and it has a particularly direct path to QR code generation. Because Claude Code can execute shell commands via its Bash tool, it can call the QR Gen API using curl without any additional configuration:
curl "https://qrgenapp.com/api/qr?data=https://myapp.com/download&size=500" -o qr-code.png

This is not a hypothetical integration. Claude Code can do this right now, natively, with no plugins or MCP servers required. When a developer working in Claude Code says "generate a QR code for our app's download link," the agent constructs the curl command, executes it, and saves the PNG to the project directory. The developer gets a production-ready QR code asset without leaving their terminal.
Claude Code can also generate QR codes as part of larger workflows. For example, a developer might ask: "Add a QR code to our README that links to the documentation site." Claude Code would generate the image via curl, save it to the repo, and update the README markdown to reference it, all in a single conversation turn. It can also use the format=svg parameter when vector output is needed for web assets or print materials.
For more advanced Claude Code workflows, including batch generation and CI integration, see our Claude Code guide.
Custom GPTs: QR Gen as an Action
OpenAI's Custom GPTs allow you to create specialized ChatGPT variants with custom instructions and external API integrations called "actions." Adding QR Gen as an action turns any Custom GPT into a QR code generator.
The setup requires an OpenAPI specification that describes the QR Gen endpoint. Here's the minimal spec:
{
  "openapi": "3.1.0",
  "info": {
    "title": "QR Gen API",
    "version": "1.0.0"
  },
  "servers": [
    { "url": "https://qrgenapp.com" }
  ],
  "paths": {
    "/api/qr": {
      "get": {
        "operationId": "generateQrCode",
        "summary": "Generate a QR code image",
        "parameters": [
          {
            "name": "data",
            "in": "query",
            "required": true,
            "schema": { "type": "string" },
            "description": "The data to encode"
          },
          {
            "name": "size",
            "in": "query",
            "schema": { "type": "integer", "default": 300 },
            "description": "Image size in pixels"
          },
          {
            "name": "fg",
            "in": "query",
            "schema": { "type": "string", "default": "000000" },
            "description": "Foreground hex color"
          },
          {
            "name": "bg",
            "in": "query",
            "schema": { "type": "string", "default": "ffffff" },
            "description": "Background hex color"
          }
        ],
        "responses": {
          "200": {
            "description": "QR code PNG image",
            "content": {
              "image/png": {
                "schema": { "type": "string", "format": "binary" }
              }
            }
          }
        }
      }
    }
  }
}

In the Custom GPT builder, paste this spec under "Actions," give the GPT instructions like "When the user asks for a QR code, use the generateQrCode action," and publish. Users of your Custom GPT can then say "make a QR code for my portfolio at janesmith.dev" and get a scannable image directly in the chat. No API key is needed on the user's side since the action is configured at the GPT level.
Perplexity, Copilot, and Other AI Assistants
Not every AI assistant supports function calling or custom actions. But the markdown image embed approach works across a surprisingly wide range of platforms. The requirement is simple: the AI must be able to output markdown, and the interface must render markdown images by fetching external URLs.
Perplexity renders markdown images in its answers. If Perplexity's model outputs ![QR code](https://qrgenapp.com/api/qr?data=https://example.com), the user sees a QR code. The same applies to Microsoft Copilot in Bing, which renders markdown in its responses. GitHub Copilot Chat in VS Code also renders markdown, meaning developers can ask Copilot to generate QR codes for use in documentation or README files.
The pattern extends to any AI-powered tool that renders markdown. Notion AI, Slack-integrated bots, Discord bots, documentation generators, and custom chatbots built on LLM APIs can all leverage this technique. If your application renders markdown and has access to the internet, it can display QR codes from QR Gen's API.
Even in contexts where markdown images aren't rendered, the URL itself is still useful. An LLM can output the raw URL, and the user can paste it into a browser to download the QR code. It's one extra step, but it still solves the fundamental problem of generating QR codes from a conversation.
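For platforms where the only integration surface is markdown output, the entire "integration" reduces to string formatting. A small sketch (TypeScript; encodeURIComponent handles the percent-encoding, and the alt text is arbitrary):

```typescript
// Produce a markdown image embed that renders as a QR code in any
// interface that fetches external images.
function qrMarkdown(data: string, alt = "QR code"): string {
  const encoded = encodeURIComponent(data);
  return `![${alt}](https://qrgenapp.com/api/qr?data=${encoded})`;
}
```

A chatbot can call this helper whenever it detects a QR code request and emit the result directly into its reply; no tool-calling infrastructure is required.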
Real-World Use Cases
The combination of LLMs and QR Gen's API opens up use cases that weren't practical before:
- Developer tooling. A developer asks their AI coding assistant to add a QR code to a project's README that links to the live demo. The assistant generates the image via the API, commits it to the repo, and updates the markdown. What previously required opening a separate tool, downloading an image, and manually editing the README is now a single conversational request.
- Customer support bots. A support chatbot for a SaaS product can generate QR codes on the fly: "Here's a QR code that links directly to your account settings page" or "Scan this to download our mobile app." The bot constructs the URL with the customer-specific data and embeds it in the conversation.
- Documentation generation. Teams using AI to generate or update documentation can automatically include QR codes for every external link. A script can process a docs folder, identify URLs, and generate corresponding QR codes via the API. This is especially valuable for printed documentation where clickable links don't exist.
- Event management. An AI assistant helping plan an event can generate unique QR codes for tickets, check-in links, WiFi access, feedback forms, and venue maps, all within a single planning conversation. Each QR code is just an API call with different data.
- Education. Teachers using AI to create worksheets or handouts can include QR codes that link to supplementary videos, interactive exercises, or answer keys. The AI generates the content and the QR codes together, producing a complete handout in one pass.
- Retail and restaurant operations. A small business owner chatting with an AI assistant can say "create a QR code for my menu at myrestaurant.com/menu" and immediately get a print-ready image. No design skills, no separate tools, no learning curve.
Getting Started
The fastest way to start is the markdown embed. If you're building a chatbot or AI-powered application, have your LLM output this pattern when users request QR codes:

![QR code](https://qrgenapp.com/api/qr?data=URL_ENCODED_DATA)
For more control, implement function calling with the tool schema shown above. For Claude-based applications, consider adding QR Gen as an MCP server. And if you're building a Custom GPT, paste the OpenAPI spec into your action configuration.
All of these approaches use the same underlying API, which is free, requires no authentication, and supports any data format that QR codes can encode: URLs, plain text, WiFi credentials, vCards, email addresses, phone numbers, geographic coordinates, and more.
For the complete integration guide covering all LLM platforms, authentication options, error handling, and advanced parameters, see the LLM integration documentation.