# Upstash Redis Documentation

Upstash Redis is a highly available, scalable, Redis-compatible database designed for serverless and edge computing environments. It provides a 99.99% uptime guarantee with auto-scaling, ultra-low latency worldwide through multi-region replication, and durable, persistent storage. The platform uniquely offers both traditional TCP-based Redis protocol access and HTTP/REST API access, making it ideal for environments where TCP connections are unavailable or problematic, such as Cloudflare Workers, Vercel Edge Functions, AWS Lambda, and other serverless platforms.

The service operates on a serverless pricing model where you pay per request rather than provisioning capacity, with a generous free tier (256 MB of data, 500K commands/month). Global databases distribute read replicas across multiple regions, delivering <1 ms read latency from the same region and <50 ms within the same continent. All databases include automatic backups and TLS encryption by default, plus optional enterprise features such as SOC-2 compliance, encryption at rest, and RBAC.

## REST API - Basic Commands

Execute Redis commands via HTTP/REST without TCP connections.
```bash
# Set a key-value pair
curl https://us1-merry-cat-32748.upstash.io/set/foo/bar \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"
# Response: {"result":"OK"}

# Get a value
curl https://us1-merry-cat-32748.upstash.io/get/foo \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"
# Response: {"result":"bar"}

# Set with expiration (100 seconds)
curl https://us1-merry-cat-32748.upstash.io/set/key/value/EX/100 \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"

# Get multiple keys
curl https://us1-merry-cat-32748.upstash.io/mget/key1/key2/key3 \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"
# Response: {"result":["value1","value2","value3"]}

# Hash operations
curl https://us1-merry-cat-32748.upstash.io/hget/employee:23381/salary \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"

# Sorted set operations
curl https://us1-merry-cat-32748.upstash.io/zadd/teams/100/team-x/90/team-y \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"
```

## REST API - JSON and Binary Data

Post JSON or binary data as the request body for complex values.
```bash
# Post JSON data
curl -X POST -d '{"user":"alice","role":"admin","created":"2025-10-10"}' \
  https://us1-merry-cat-32748.upstash.io/set/user:1001 \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"
# Response: {"result":"OK"}

# Set with expiration via query parameter
curl -X POST -d '{"session":"active","ip":"192.168.1.1"}' \
  "https://us1-merry-cat-32748.upstash.io/set/session:xyz?EX=3600" \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"

# Send the entire command as a JSON array
curl -X POST -d '["SET", "config:app", "{\"theme\":\"dark\",\"lang\":\"en\"}", "EX", 86400]' \
  https://us1-merry-cat-32748.upstash.io \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"

# Get with base64 encoding for binary data
curl https://us1-merry-cat-32748.upstash.io/GET/binarykey \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934" \
  -H "Upstash-Encoding: base64"
# Response: {"result":"YmFy"}
```

## REST API - Pipeline Commands

Execute multiple commands in a single HTTP request for better throughput.

```bash
# Pipeline multiple commands (non-atomic)
curl -X POST https://us1-merry-cat-32748.upstash.io/pipeline \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934" \
  -d '[
    ["SET", "user:count", "0"],
    ["INCR", "user:count"],
    ["SET", "user:1", "alice"],
    ["SET", "user:2", "bob"],
    ["GET", "user:count"],
    ["ZADD", "leaderboard", 100, "alice", 85, "bob"]
  ]'
# Response: [
#   {"result":"OK"},
#   {"result":1},
#   {"result":"OK"},
#   {"result":"OK"},
#   {"result":"1"},
#   {"result":2}
# ]

# Commands execute in order but are not atomic:
# other clients can interleave operations
```

## REST API - Transactions

Execute atomic transactions via the multi-exec endpoint.
```bash
# Atomic transaction (all or nothing)
curl -X POST https://us1-merry-cat-32748.upstash.io/multi-exec \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934" \
  -d '[
    ["SET", "balance:alice", "1000"],
    ["DECRBY", "balance:alice", "100"],
    ["INCRBY", "balance:bob", "100"],
    ["GET", "balance:alice"],
    ["GET", "balance:bob"]
  ]'
# Response: [
#   {"result":"OK"},
#   {"result":900},
#   {"result":100},
#   {"result":"900"},
#   {"result":"100"}
# ]

# All commands execute atomically; no other commands interleave
# during execution. If a command fails at runtime, the remaining
# commands still execute (standard Redis MULTI/EXEC behavior).
```

## REST API - Pub/Sub and Monitoring

Subscribe to channels and monitor database activity using Server-Sent Events.

```bash
# Subscribe to a channel (keeps the connection open)
curl -X POST https://us1-merry-cat-32748.upstash.io/subscribe/chat \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934" \
  -H "Accept: text/event-stream"
# Incoming events:
# data: subscribe,chat,1
# data: message,chat,hello
# data: message,chat,how are you?

# Publish to a channel
curl -X POST https://us1-merry-cat-32748.upstash.io/publish/chat/hello \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934"

# Monitor all commands (debugging)
curl -X POST https://us1-merry-cat-32748.upstash.io/monitor \
  -H "Authorization: Bearer 2553feg6a2d9842h2a0gcdb5f8efe9934" \
  -H "Accept: text/event-stream"
# Output:
# data: "OK"
# data: 1721284005.242090 [0 0.0.0.0:0] "GET" "k"
# data: 1721284008.663811 [0 0.0.0.0:0] "SET" "k" "v"
```

## TypeScript SDK - Basic Usage

Connectionless HTTP-based Redis client for TypeScript and JavaScript.
```typescript
import { Redis } from "@upstash/redis";

// Initialize with credentials
const redis = new Redis({
  url: "https://us1-merry-cat-32748.upstash.io",
  token: "2553feg6a2d9842h2a0gcdb5f8efe9934",
});

// Or load from environment variables instead:
// const redis = Redis.fromEnv(); // UPSTASH_REDIS_REST_URL, UPSTASH_REDIS_REST_TOKEN

// String operations
await redis.set("key", "value");
await redis.set("session:abc", "active", { ex: 3600 }); // 1 hour expiration
const data = await redis.get("key");
console.log(data); // "value"

// Sorted sets
await redis.zadd("scores", { score: 100, member: "team1" });
await redis.zadd("scores", { score: 85, member: "team2" });
const topScores = await redis.zrange("scores", 0, 100);
console.log(topScores); // ["team2", "team1"] (ascending by score)

// Lists
await redis.lpush("queue", "job1");
await redis.lpush("queue", "job2");
const jobs = await redis.lrange("queue", 0, 10);
console.log(jobs); // ["job2", "job1"]

// Hashes
await redis.hset("user:1001", { name: "alice", age: "30", city: "NYC" });
const name = await redis.hget("user:1001", "name");
console.log(name); // "alice"

// Sets
await redis.sadd("tags", "javascript", "typescript", "node");
const tag = await redis.spop("tags", 1);
console.log(tag); // e.g. ["javascript"] (random member)
```

## TypeScript SDK - Cloudflare Workers

Use Redis from edge environments without TCP connections.
```typescript
// worker.ts - Cloudflare Worker example
import { Redis } from "@upstash/redis/cloudflare";

interface Env {
  UPSTASH_REDIS_REST_URL: string;
  UPSTASH_REDIS_REST_TOKEN: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const redis = new Redis({
      url: env.UPSTASH_REDIS_REST_URL,
      token: env.UPSTASH_REDIS_REST_TOKEN,
    });

    // Rate limiting check
    const ip = request.headers.get("CF-Connecting-IP") || "unknown";
    const count = await redis.incr(`rate:${ip}`);
    if (count === 1) {
      await redis.expire(`rate:${ip}`, 60); // 60 second window
    }
    if (count > 10) {
      return new Response("Rate limit exceeded", { status: 429 });
    }

    // Cache response (the SDK serializes/deserializes JSON automatically)
    const cacheKey = `cache:${new URL(request.url).pathname}`;
    let cached = await redis.get<Record<string, unknown>>(cacheKey);
    if (!cached) {
      cached = { data: "expensive computation result", timestamp: Date.now() };
      await redis.set(cacheKey, cached, { ex: 300 });
    }

    return new Response(JSON.stringify(cached), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```

## Python SDK - Basic Usage

Connectionless HTTP-based Redis client for Python with sync and async support.
```python
from upstash_redis import Redis

# Initialize with credentials
redis = Redis(
    url="https://us1-merry-cat-32748.upstash.io",
    token="2553feg6a2d9842h2a0gcdb5f8efe9934"
)

# Or load from environment variables instead:
# redis = Redis.from_env()  # UPSTASH_REDIS_REST_URL, UPSTASH_REDIS_REST_TOKEN

# String operations
redis.set("key", "value")
redis.set("counter", 0)
print(redis.get("key"))  # "value"

# Increment operations
redis.incr("counter")
redis.incr("counter")
print(redis.get("counter"))  # "2"

# Hash operations
redis.hset("product:1001", values={"name": "laptop", "price": "999", "stock": "50"})
price = redis.hget("product:1001", "price")
print(price)  # "999"

# Lists
redis.lpush("notifications", "msg1")
redis.lpush("notifications", "msg2")
messages = redis.lrange("notifications", 0, 10)
print(messages)  # ["msg2", "msg1"]

# Custom commands not implemented in the SDK
result = redis.execute(["XLEN", "stream:events"])
print(result)
```

## Python SDK - Async and Pipelines

Use the async Redis client and batch commands for better performance.
```python
import asyncio

from upstash_redis import Redis
from upstash_redis.asyncio import Redis as AsyncRedis

# Async client for concurrent operations
async_redis = AsyncRedis.from_env()

async def main():
    # Async operations
    await async_redis.set("user:1", "alice")
    await async_redis.set("user:2", "bob")

    # Concurrent operations
    results = await asyncio.gather(
        async_redis.get("user:1"),
        async_redis.get("user:2"),
        async_redis.incr("visit:count")
    )
    print(results)  # ["alice", "bob", 1]

asyncio.run(main())

# Pipelines - batch multiple commands (sync client)
redis = Redis.from_env()

pipeline = redis.pipeline()
pipeline.set("key1", "value1")
pipeline.incr("counter")
pipeline.get("counter")
pipeline.zadd("scores", {"player1": 100})  # mapping of member -> score
results = pipeline.exec()
print(results)  # [True, 1, "1", 1]

# Transactions - atomic execution
transaction = redis.multi()
transaction.set("balance:alice", 1000)
transaction.decrby("balance:alice", 100)
transaction.incrby("balance:bob", 100)
results = transaction.exec()
print(results)  # [True, 900, 100]

# Chain commands
pipeline = redis.pipeline()
result = pipeline.set("a", 1).incr("a").get("a").exec()
print(result)  # [True, 2, "2"]
```

## Rate Limiting - Basic Usage

Implement rate limiting with multiple algorithm options.
```typescript
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

// Create a rate limiter with the sliding window algorithm
const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(10, "10 s"), // 10 requests per 10 seconds
  analytics: true,
  prefix: "@upstash/ratelimit",
});

// Check the rate limit per user / API key / IP
async function handleRequest(userId: string) {
  const { success, limit, remaining, reset } = await ratelimit.limit(userId);

  if (!success) {
    return {
      error: "Rate limit exceeded",
      retryAfter: Math.floor((reset - Date.now()) / 1000),
      remaining: 0,
    };
  }

  // Process the request
  const result = await expensiveOperation();
  return {
    data: result,
    rateLimit: {
      limit,
      remaining,
      reset: new Date(reset).toISOString(),
    },
  };
}

// Different algorithms for different use cases
const fixedWindow = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.fixedWindow(5, "30 s"), // 5 requests per 30 second window
});

const tokenBucket = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.tokenBucket(10, "1 m", 20), // 10 tokens refilled/min, max 20 tokens
});
```

## Rate Limiting - Advanced Features

Multi-region setup, custom rates, and analytics for production use.
```typescript
import { MultiRegionRatelimit, Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

// Multi-region configuration for low latency
const multiRegionLimit = new MultiRegionRatelimit({
  redis: [
    new Redis({ url: US_URL, token: US_TOKEN }),
    new Redis({ url: EU_URL, token: EU_TOKEN }),
    new Redis({ url: ASIA_URL, token: ASIA_TOKEN }),
  ],
  limiter: MultiRegionRatelimit.slidingWindow(100, "1 m"),
  analytics: true,
});

// Different limits for different user tiers
const tierLimits: Record<string, Ratelimit> = {
  free: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(10, "1 m"),
  }),
  pro: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(100, "1 m"),
  }),
  enterprise: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(1000, "1 m"),
  }),
};

async function rateLimitByTier(userId: string, tier: string) {
  const limiter = tierLimits[tier] || tierLimits.free;
  return await limiter.limit(userId);
}

// Custom token consumption based on request size
const uploadLimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.tokenBucket(1000, "1 m", 2000),
});

async function handleUpload(file: File, userId: string) {
  const tokens = Math.ceil(file.size / 1024); // 1 token per KB
  const { success, pending } = await uploadLimit.limit(userId, { rate: tokens });

  if (!success) {
    return { error: "Rate limit exceeded" };
  }

  // Wait for background tasks (analytics, multi-region sync)
  await pending;
  return { success: true };
}

// Timeout fallback for Redis unavailability
const resilientLimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(10, "10 s"),
  timeout: 1000, // Allow the request if Redis doesn't respond within 1s
  analytics: true,
});
```

## Next.js API Route with Redis

Complete Next.js integration with caching and rate limiting.
```typescript
// app/api/products/route.ts
import { Redis } from "@upstash/redis";
import { Ratelimit } from "@upstash/ratelimit";
import { NextRequest, NextResponse } from "next/server";

const redis = Redis.fromEnv();
const ratelimit = new Ratelimit({
  redis,
  limiter: Ratelimit.slidingWindow(10, "10 s"),
});

export async function GET(request: NextRequest) {
  // Rate limiting by IP
  const ip = request.ip || "anonymous";
  const { success, pending } = await ratelimit.limit(ip);

  if (!success) {
    return NextResponse.json({ error: "Too many requests" }, { status: 429 });
  }

  // Check the cache (the SDK serializes/deserializes JSON automatically)
  const cacheKey = "products:all";
  let products = await redis.get(cacheKey);

  if (!products) {
    // Fetch from the database
    products = await db.query("SELECT * FROM products");
    // Cache for 5 minutes
    await redis.set(cacheKey, products, { ex: 300 });
  }

  // Wait for analytics and multi-region sync
  await pending;

  return NextResponse.json({ products });
}

export async function POST(request: NextRequest) {
  const ip = request.ip || "anonymous";
  const { success } = await ratelimit.limit(ip);

  if (!success) {
    return NextResponse.json({ error: "Too many requests" }, { status: 429 });
  }

  const body = await request.json();

  // Save to the database
  const product = await db.insert("products", body);

  // Invalidate the cache
  await redis.del("products:all");

  // Update a counter
  await redis.incr("stats:products:created");

  return NextResponse.json({ product }, { status: 201 });
}
```

## Vercel Edge Middleware

Global rate limiting and authentication at the edge.
```typescript
// middleware.ts
import { Redis } from "@upstash/redis";
import { Ratelimit } from "@upstash/ratelimit";
import { NextFetchEvent, NextRequest, NextResponse } from "next/server";

const redis = new Redis({
  url: process.env.UPSTASH_REDIS_REST_URL!,
  token: process.env.UPSTASH_REDIS_REST_TOKEN!,
});

const ratelimit = new Ratelimit({
  redis,
  limiter: Ratelimit.slidingWindow(20, "10 s"),
  analytics: true,
});

export async function middleware(request: NextRequest, event: NextFetchEvent) {
  const ip = request.ip || "anonymous";
  const { success, pending, limit, remaining, reset } = await ratelimit.limit(ip);

  const response = success
    ? NextResponse.next()
    : NextResponse.json({ error: "Too Many Requests" }, { status: 429 });

  // Add rate limit headers
  response.headers.set("X-RateLimit-Limit", limit.toString());
  response.headers.set("X-RateLimit-Remaining", remaining.toString());
  response.headers.set("X-RateLimit-Reset", new Date(reset).toISOString());

  // Check authentication
  const token = request.headers.get("authorization")?.replace("Bearer ", "");
  if (token) {
    const session = await redis.get<{ userId: string }>(`session:${token}`);
    if (session) {
      response.headers.set("X-User-Id", session.userId);
    }
  }

  // Wait for background operations without blocking the response
  event.waitUntil(pending);

  return response;
}

export const config = {
  matcher: "/api/:path*",
};
```

## Use Cases and Integration Patterns

Upstash Redis excels in serverless and edge computing scenarios where traditional TCP-based Redis connections are problematic or impossible. Primary use cases include: session caching for Next.js applications with server-side rendering, where the REST API eliminates connection-pooling issues in serverless functions; API rate limiting across distributed edge networks using the @upstash/ratelimit library; real-time leaderboards for gaming applications using sorted sets; job queues built on Redis lists for background processing; and feature flags stored in hashes for instant configuration updates.
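The job-queue pattern mentioned above can be sketched with LPUSH (producer) and RPOP (consumer). This is a minimal sketch, not an Upstash API: the `jobs:email` key, the `Job` shape, and the `ListClient` type are illustrative assumptions — in practice you would pass a `Redis` instance from `@upstash/redis`, whose `lpush`/`rpop` methods match this surface.

```typescript
// Minimal client surface for illustration; in practice use `Redis` from "@upstash/redis"
type ListClient = {
  lpush: (key: string, ...values: unknown[]) => Promise<number>;
  rpop: <T>(key: string) => Promise<T | null>;
};

// Hypothetical job record (not part of any Upstash API)
interface Job {
  id: string;
  type: string;
  payload: Record<string, unknown>;
  enqueuedAt: number;
}

// Pure helper: build a job record
function makeJob(id: string, type: string, payload: Record<string, unknown>): Job {
  return { id, type, payload, enqueuedAt: Date.now() };
}

// Producer: push onto the head of the list (assumed key "jobs:email")
async function enqueue(redis: ListClient, job: Job): Promise<void> {
  await redis.lpush("jobs:email", job); // the TS SDK serializes objects to JSON
}

// Consumer: pop from the tail so jobs are processed FIFO
async function dequeue(redis: ListClient): Promise<Job | null> {
  return await redis.rpop<Job>("jobs:email");
}
```

LPUSH + RPOP gives first-in, first-out ordering across any number of serverless consumers; a failed pop simply returns `null` when the queue is empty, so workers can poll on a schedule.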
The global database feature enables ultra-low-latency reads worldwide by routing requests to the nearest read replica while maintaining consistency through asynchronous replication from the primary region.

Integration patterns leverage the connectionless HTTP model for maximum compatibility: Vercel Edge Middleware uses the Cloudflare-compatible import for global rate limiting before requests reach your application; AWS Lambda functions avoid connection timeouts by using the REST SDK instead of TCP clients; Cloudflare Workers access Redis data without WebSocket or TCP support using the specialized Cloudflare import; and Next.js API routes cache expensive database queries in Redis with automatic invalidation strategies.

The pay-per-request pricing model ($0.20 per 100K requests) aligns well with serverless usage patterns, while the free tier (500K commands/month, 256 MB storage) supports development and small production workloads. For write-heavy workloads, regional databases offer better performance, while read-heavy global applications benefit from multi-region replication with <1 ms same-region latency.
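The feature-flag use case mentioned earlier maps naturally onto a single hash: one HSET updates a flag everywhere instantly, and each request does one HGET. A minimal sketch under stated assumptions — the `feature:flags` key, the `"1"`/`"0"` value convention, and the `HashClient` type are all illustrative, not part of the Upstash API:

```typescript
// Minimal client surface for illustration; in practice use `Redis` from "@upstash/redis"
type HashClient = {
  hset: (key: string, fields: Record<string, string>) => Promise<number>;
  hget: (key: string, field: string) => Promise<string | null>;
};

// Pure helper: interpret a stored flag value (assumed "1"/"true" = enabled)
function parseFlag(value: string | null, fallback = false): boolean {
  if (value === null) return fallback; // flag not set yet
  return value === "1" || value.toLowerCase() === "true";
}

// Toggle a flag for all clients at once (assumed hash key "feature:flags")
async function setFlag(redis: HashClient, flag: string, on: boolean): Promise<void> {
  await redis.hset("feature:flags", { [flag]: on ? "1" : "0" });
}

// Check a flag on each request
async function isEnabled(redis: HashClient, flag: string): Promise<boolean> {
  return parseFlag(await redis.hget("feature:flags", flag));
}
```

Because every flag lives in one hash, a dashboard can read the full configuration with a single HGETALL, and unknown flags fall back to a safe default instead of throwing.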