# Remix 3
Remix 3 is a comprehensive collection of composable, web-standard packages for building modern web applications and APIs. Built on the foundation of Web APIs, Remix provides portable abstractions that work seamlessly across Node.js, Bun, Deno, and Cloudflare Workers. The project follows model-first development principles, optimizing both source code and documentation for LLMs, while maintaining a runtime-first design with no dependency on static analysis.
Each package in the Remix ecosystem is independently usable and follows the single responsibility principle. The packages leverage server-side web APIs like Web Streams, Uint8Array, Web Crypto, Blob, and File instead of runtime-specific APIs, ensuring code is not just reusable but future-proof. From HTTP routing and proxying to file storage and multipart parsing, Remix provides the building blocks for creating robust, portable web applications.
## APIs and Key Functions
### fetch-proxy - HTTP Proxy with Fetch API
Creates HTTP proxies using the familiar fetch() API with automatic cookie rewriting and header forwarding.
```typescript
import { createFetchProxy } from '@remix-run/fetch-proxy'
// Basic proxy forwarding requests to backend
let proxy = createFetchProxy('https://api.backend.com')
let response = await proxy(new Request('https://myapp.com/api/users'))
// Proxy with cookie domain/path rewriting
let cookieProxy = createFetchProxy('https://remix.run:3000', {
  rewriteCookieDomain: true,
  rewriteCookiePath: true
})
// Add X-Forwarded headers for origin tracking
let forwardingProxy = createFetchProxy('https://api.example.com', {
  xForwardedHeaders: true // Adds X-Forwarded-Proto and X-Forwarded-Host
})
// Custom fetch implementation
let customProxy = createFetchProxy('https://backend.com', {
  fetch: customFetch
})
```
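At its core, proxying with fetch() means swapping the incoming request's origin for the target's and forwarding the result. The sketch below is a hand-rolled illustration of just that rewriting step, not the package's actual implementation (which also handles bodies, cookie rewriting, and forwarded headers); `rewriteToTarget` is a hypothetical name.

```typescript
// Swap the request's origin for the target's, keeping path and query
function rewriteToTarget(incoming: Request, target: string): Request {
  let incomingUrl = new URL(incoming.url)
  let targetUrl = new URL(target)
  targetUrl.pathname = incomingUrl.pathname
  targetUrl.search = incomingUrl.search
  return new Request(targetUrl.href, {
    method: incoming.method,
    headers: incoming.headers,
  })
}

let rewritten = rewriteToTarget(
  new Request('https://myapp.com/api/users?page=2'),
  'https://api.backend.com'
)
console.log(rewritten.url) // 'https://api.backend.com/api/users?page=2'
```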
### fetch-router - Composable Web Router
Minimal, composable router built on the Fetch API and route-pattern for building APIs and server-rendered applications.
```typescript
import { createRouter, createRoutes, html, json, redirect } from '@remix-run/fetch-router'
// Basic routing with HTTP methods
let router = createRouter()
router.get('/', () => html('<h1>Home</h1>'))
router.post('/api/users', async ({ request }) => {
  let user = await createUser(await request.json())
  return json(user, { status: 201 })
})
// Type-safe route mapping with URL generation
let routes = createRoutes({
  home: '/',
  blog: {
    index: '/blog',
    show: '/blog/:slug',
    edit: '/blog/:slug/edit',
  },
  api: {
    users: {
      index: { method: 'GET', pattern: '/api/users' },
      show: { method: 'GET', pattern: '/api/users/:id' },
      create: { method: 'POST', pattern: '/api/users' },
      update: { method: 'PUT', pattern: '/api/users/:id' },
      destroy: { method: 'DELETE', pattern: '/api/users/:id' },
    },
  },
})
// Generate URLs with type safety
routes.blog.show.href({ slug: 'hello-world' }) // '/blog/hello-world'
// Map routes to handlers
router.map(routes.api.users, {
  index() { return json(users) },
  show({ params }) { return json(getUser(params.id)) },
  async create({ request }) { return json(await createUser(await request.json()), { status: 201 }) },
  async update({ params, request }) { return json(await updateUser(params.id, await request.json())) },
  destroy({ params }) { return new Response(null, { status: 204 }) },
})
// Middleware for logging, auth, etc.
router.use(async (context, next) => {
  console.log(`${context.request.method} ${context.url.pathname}`)
  let response = await next()
  response.headers.set('X-Powered-By', 'Remix')
  return response
})
// Mount sub-routers for composition
let apiRouter = createRouter()
apiRouter.get('/users', () => json(users))
apiRouter.get('/posts', () => json(posts))
let mainRouter = createRouter()
mainRouter.mount('/api', apiRouter)
// Access route parameters and headers
router.get(routes.blog.show, ({ params, headers }) => {
  let post = getPost(params.slug)
  if (headers.accept.accepts('application/json')) {
    return json(post)
  }
  return html(`<article>${post.content}</article>`)
})
```
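Conceptually, a router is a lookup from method plus path to a handler that returns a Response. A minimal hand-rolled sketch of that core idea using only web standards (fetch-router layers patterns, params, middleware, and mounting on top; `dispatch` and the route table are hypothetical):

```typescript
type Handler = (request: Request) => Response | Promise<Response>

// Route table keyed by "METHOD pathname"
let table = new Map<string, Handler>()
table.set('GET /', () => new Response('Home'))
table.set('GET /health', () => Response.json({ ok: true }))

async function dispatch(request: Request): Promise<Response> {
  let url = new URL(request.url)
  let handler = table.get(`${request.method} ${url.pathname}`)
  return handler ? handler(request) : new Response('Not Found', { status: 404 })
}

let found = await dispatch(new Request('https://example.com/health'))
console.log(found.status) // 200
```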
### file-storage - Key/Value File Storage
Key/value interface for storing File objects with metadata preservation and streaming support.
```typescript
import { LocalFileStorage } from '@remix-run/file-storage/local'
// Initialize storage
let storage = new LocalFileStorage('./uploads')
// Store files with metadata preservation
let avatar = new File(['binary data'], 'avatar.jpg', {
  type: 'image/jpeg',
  lastModified: Date.now()
})
await storage.set('user-123-avatar', avatar)
// Retrieve file with original metadata intact
let file = await storage.get('user-123-avatar')
console.log(file.name) // 'avatar.jpg'
console.log(file.type) // 'image/jpeg'
console.log(file.size) // file size in bytes
// Check file existence
if (await storage.has('user-123-avatar')) {
  await storage.remove('user-123-avatar')
}
// List files with pagination and prefix filtering
let result = await storage.list({
  prefix: 'user-123/',
  includeMetadata: true,
  limit: 50
})
for (let metadata of result.files) {
  console.log(metadata.key, metadata.size, metadata.type)
}
// Handle pagination
if (result.cursor) {
  let nextPage = await storage.list({
    cursor: result.cursor,
    prefix: 'user-123/'
  })
}
// Put and return file in one operation
let storedFile = await storage.put('doc-456', document)
return new Response(storedFile.stream())
```
### form-data-parser - Streaming Multipart Parser
Enhanced request.formData() wrapper enabling efficient, streaming file uploads without memory exhaustion.
```typescript
import { parseFormData, MaxFileSizeExceededError } from '@remix-run/form-data-parser'
import { LocalFileStorage } from '@remix-run/file-storage/local'
// Basic file upload handling
async function handleUpload(request: Request) {
  let formData = await parseFormData(request, async (upload) => {
    if (upload.fieldName === 'avatar') {
      let path = `/uploads/${upload.name}`
      await fs.writeFile(path, upload.bytes)
      return path // Return reference instead of file data
    }
  })
  let avatarPath = formData.get('avatar')
  let username = formData.get('username')
  return Response.json({ username, avatarPath })
}
// Enforce file size and count limits
const oneMb = 1024 * 1024
try {
  let formData = await parseFormData(request, {
    maxFiles: 5,
    maxFileSize: 10 * oneMb,
    maxHeaderSize: 8192
  }, async (upload) => {
    return await processUpload(upload)
  })
} catch (error) {
  if (error instanceof MaxFileSizeExceededError) {
    return new Response('File too large', { status: 413 })
  }
  throw error
}
// Integration with file-storage
let fileStorage = new LocalFileStorage('/var/uploads')
async function uploadHandler(fileUpload) {
  if (fileUpload.fieldName === 'document') {
    let key = `user-${userId}-${fileUpload.name}`
    await fileStorage.set(key, fileUpload)
    return fileStorage.get(key) // Return lazy File reference
  }
}
let formData = await parseFormData(request, uploadHandler)
// Handle multiple file fields
let multiFieldData = await parseFormData(request, async (upload) => {
  if (upload.fieldName === 'avatar') {
    return await saveAvatar(upload)
  } else if (upload.fieldName === 'documents') {
    return await saveDocument(upload)
  }
})
```
### headers - Type-Safe HTTP Headers
Enhanced Headers class with type-safe parsing and manipulation of complex HTTP header values.
```typescript
import Headers from '@remix-run/headers'
// Content negotiation with Accept headers
let headers = new Headers()
headers.accept = 'text/html, application/json;q=0.9, text/*;q=0.8'
headers.accept.mediaTypes // ['text/html', 'application/json', 'text/*']
headers.accept.accepts('application/json') // true
headers.accept.getPreferred(['text/html', 'application/json']) // 'text/html'
// Content-Type parsing and manipulation
headers.contentType = 'application/json; charset=utf-8'
headers.contentType.mediaType // 'application/json'
headers.contentType.charset // 'utf-8'
headers.contentType.charset = 'iso-8859-1'
headers.get('Content-Type') // 'application/json; charset=iso-8859-1'
// Multipart boundary extraction
headers.contentType = 'multipart/form-data; boundary=----WebKitFormBoundary'
headers.contentType.boundary // '----WebKitFormBoundary'
// Cookie parsing and manipulation
headers.cookie = 'session_id=abc123; theme=dark; lang=en'
headers.cookie.get('session_id') // 'abc123'
headers.cookie.set('theme', 'light')
headers.cookie.delete('lang')
headers.cookie.has('session_id') // true
// Set-Cookie with options
headers.setCookie = []
headers.setCookie.push({
  name: 'session',
  value: 'xyz789',
  path: '/',
  httpOnly: true,
  secure: true,
  sameSite: 'Strict',
  maxAge: 3600
})
// Content-Disposition for file downloads
headers.contentDisposition = 'attachment; filename="report.pdf"'
headers.contentDisposition.type // 'attachment'
headers.contentDisposition.filename // 'report.pdf'
// Accept-Language preferences
headers.acceptLanguage = 'en-US, en;q=0.9, fr;q=0.8'
headers.acceptLanguage.accepts('en') // true
headers.acceptLanguage.getPreferred(['en', 'fr', 'de']) // 'en'
// Cache-Control directives
headers.cacheControl = 'public, max-age=3600, must-revalidate'
// Object initialization
let initialHeaders = new Headers({
  contentType: {
    mediaType: 'text/html',
    charset: 'utf-8',
  },
  setCookie: [
    { name: 'session', value: 'abc', path: '/', httpOnly: true },
    { name: 'theme', value: 'dark', maxAge: 86400 },
  ],
})
// If-None-Match for ETags
headers.ifNoneMatch = '"abc123", "def456"'
headers.ifNoneMatch.includes('"abc123"') // true
```
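The Accept getter above automates q-value parsing that is tedious with the standard Headers class. A hand-rolled sketch of that underlying logic, simplified for illustration (real content negotiation also weighs wildcards and specificity; `parseAccept` is a hypothetical helper):

```typescript
// Sort media types in an Accept header by descending q-value
function parseAccept(value: string): string[] {
  return value
    .split(',')
    .map((part) => {
      let [mediaType, ...params] = part.trim().split(';')
      let q = 1
      for (let param of params) {
        let [name, val] = param.trim().split('=')
        if (name === 'q') q = Number(val)
      }
      return { mediaType: mediaType.trim(), q }
    })
    .sort((a, b) => b.q - a.q) // stable sort preserves original order for ties
    .map((entry) => entry.mediaType)
}

let ranked = parseAccept('text/*;q=0.8, text/html, application/json;q=0.9')
console.log(ranked) // ['text/html', 'application/json', 'text/*']
```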
### lazy-file - Lazy-Loading File Implementation
Lazy, streaming Blob/File implementation that defers reading contents until needed.
```typescript
import { LazyFile, type LazyContent } from '@remix-run/lazy-file'
import { openFile, writeFile } from '@remix-run/lazy-file/fs'
// Open file without reading data
let file = openFile('./large-video.mp4')
console.log(file.name) // 'large-video.mp4'
console.log(file.size) // 1048576000 (1GB)
// Data is only read when accessed
let text = await file.text()
let buffer = await file.arrayBuffer()
let bytes = await file.bytes()
// Stream file for HTTP responses
let report = openFile('./report.pdf')
return new Response(report.stream(), {
  headers: {
    'Content-Type': 'application/pdf',
    'Content-Disposition': `attachment; filename="${report.name}"`
  }
})
// Slice for HTTP Range requests
let videoFile = openFile('./video.mp4')
let rangeStart = 1024 * 1024 // Skip first 1MB
let chunk = videoFile.slice(rangeStart, rangeStart + 1024 * 1024)
return new Response(chunk.stream(), {
  status: 206,
  headers: {
    'Content-Type': 'video/mp4',
    'Content-Range': `bytes ${rangeStart}-${rangeStart + 1024 * 1024 - 1}/${videoFile.size}`
  }
})
// Copy files efficiently
let source = openFile('./source.jpg')
await writeFile('./destination.jpg', source)
// Custom lazy content implementation
let content: LazyContent = {
  byteLength: 1000,
  stream(start = 0, end = 1000) {
    return new ReadableStream({
      start(controller) {
        let data = generateData(start, end)
        controller.enqueue(new TextEncoder().encode(data))
        controller.close()
      }
    })
  }
}
let generated = new LazyFile(content, 'generated.txt', { type: 'text/plain' })
// Integration with file storage
import { LocalFileStorage } from '@remix-run/file-storage/local'
let storage = new LocalFileStorage('./files')
let lazyFile = openFile('./document.pdf')
await storage.set('doc-key', lazyFile) // Streams to storage
```
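The slice example above pairs naturally with parsing the incoming Range header. A simplified sketch that handles only single `bytes=start-end` ranges (no suffix or multi-range forms), producing the exclusive end index that slice() expects; `parseRange` is a hypothetical helper:

```typescript
// Parse "bytes=start-end" into a { start, end } pair for file.slice()
function parseRange(header: string, size: number): { start: number; end: number } | null {
  let match = /^bytes=(\d+)-(\d*)$/.exec(header)
  if (!match) return null
  let start = Number(match[1])
  let end = match[2] ? Number(match[2]) + 1 : size // exclusive end for slice()
  return start < end && end <= size ? { start, end } : null
}

console.log(parseRange('bytes=0-1023', 4096)) // { start: 0, end: 1024 }
console.log(parseRange('bytes=1024-', 4096)) // { start: 1024, end: 4096 }
console.log(parseRange('bytes=9999-', 4096)) // null (out of bounds)
```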
### multipart-parser - Fast Multipart Parser
High-performance, streaming parser for multipart messages, perfect for handling file uploads.
```typescript
import {
  parseMultipartRequest,
  isMultipartRequest,
  MaxFileSizeExceededError
} from '@remix-run/multipart-parser'
// Parse multipart request
async function handleUpload(request: Request) {
  if (!isMultipartRequest(request)) {
    return new Response('Not multipart', { status: 400 })
  }
  for await (let part of parseMultipartRequest(request)) {
    if (part.isFile) {
      console.log(`File: ${part.filename}`)
      console.log(`Field: ${part.name}`)
      console.log(`Type: ${part.mediaType}`)
      console.log(`Size: ${part.size} bytes`)
      await fs.writeFile(`/uploads/${part.filename}`, part.bytes)
    } else {
      console.log(`Field ${part.name}: ${part.text}`)
    }
  }
  return new Response('Upload complete')
}
// Enforce file size limits
const oneMb = 1024 * 1024
try {
  for await (let part of parseMultipartRequest(request, {
    maxFileSize: 10 * oneMb,
    maxHeaderSize: 8192
  })) {
    if (part.isFile) {
      await processFile(part)
    }
  }
} catch (error) {
  if (error instanceof MaxFileSizeExceededError) {
    return new Response('File exceeds 10MB limit', { status: 413 })
  }
  throw error
}
// Stream large files
for await (let part of parseMultipartRequest(request)) {
  if (part.isFile && part.filename.endsWith('.mp4')) {
    let stream = part.stream()
    await uploadToS3(stream, part.filename)
  }
}
// Node.js http.IncomingMessage support
import * as http from 'node:http'
import { parseMultipartRequest as parseNodeMultipartRequest } from '@remix-run/multipart-parser/node'
let server = http.createServer(async (req, res) => {
  for await (let part of parseNodeMultipartRequest(req)) {
    if (part.isFile) {
      let path = `/tmp/${part.filename}`
      fs.writeFileSync(path, part.bytes)
    }
  }
  res.end('OK')
})
// Low-level streaming API
import { parseMultipartStream, getMultipartBoundary } from '@remix-run/multipart-parser'
let boundary = getMultipartBoundary(request)
for await (let part of parseMultipartStream(request.body, { boundary })) {
  console.log(part.name, part.size)
}
```
### node-fetch-server - Node.js Fetch Server
Build portable Node.js HTTP servers using the web-standard fetch() API.
```typescript
import * as fs from 'node:fs'
import * as http from 'node:http'
import * as https from 'node:https'
import { createRequestListener, type FetchHandler } from '@remix-run/node-fetch-server'
// Basic HTTP server with Fetch API
let handler: FetchHandler = async (request) => {
  let url = new URL(request.url)
  if (url.pathname === '/') {
    return new Response('Home Page')
  }
  if (url.pathname === '/api/users' && request.method === 'GET') {
    return Response.json(await getUsers())
  }
  if (url.pathname.startsWith('/api/users/') && request.method === 'GET') {
    let id = url.pathname.split('/').pop()
    return Response.json(await getUser(id))
  }
  return new Response('Not Found', { status: 404 })
}
let server = http.createServer(createRequestListener(handler))
server.listen(3000, () => console.log('Server running on port 3000'))
// Access client IP address and port
let clientInfoHandler: FetchHandler = async (request, client) => {
  console.log(`Request from ${client.address}:${client.port}`)
  if (isRateLimited(client.address)) {
    return new Response('Too Many Requests', { status: 429 })
  }
  return Response.json({
    ip: client.address,
    port: client.port
  })
}
// Custom hostname
let hostServer = http.createServer(
  createRequestListener(handler, {
    host: process.env.HOST || 'localhost'
  })
)
// HTTPS server
let options = {
  key: fs.readFileSync('key.pem'),
  cert: fs.readFileSync('cert.pem')
}
let httpsServer = https.createServer(
  options,
  createRequestListener(handler)
)
httpsServer.listen(443)
// Streaming responses
let streamingHandler: FetchHandler = async (request) => {
  if (request.url.endsWith('/stream')) {
    let stream = new ReadableStream({
      async start(controller) {
        for (let i = 0; i < 100; i++) {
          controller.enqueue(
            new TextEncoder().encode(`data: ${JSON.stringify({ count: i })}\n\n`)
          )
          await new Promise(r => setTimeout(r, 100))
        }
        controller.close()
      }
    })
    return new Response(stream, {
      headers: {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache'
      }
    })
  }
  return new Response('Not Found', { status: 404 })
}
// Low-level API for custom processing
import { createRequest, sendResponse } from '@remix-run/node-fetch-server'
let timingServer = http.createServer(async (req, res) => {
  let startTime = Date.now()
  let request = createRequest(req, res)
  let response = await handler(request)
  let duration = Date.now() - startTime
  response.headers.set('X-Response-Time', `${duration}ms`)
  await sendResponse(res, response)
})
```
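What createRequestListener does can be pictured as a small adapter: build a web Request from the Node request, call the fetch handler, and write the Response back. The miniature below is a hand-rolled sketch of that idea (request bodies are omitted for brevity; the real package streams them and handles many more edge cases), exercised against a real server on an ephemeral port.

```typescript
import * as http from 'node:http'

// Adapter sketch: Node request in, fetch handler, Node response out
function toListener(handle: (request: Request) => Response | Promise<Response>) {
  return async (req: http.IncomingMessage, res: http.ServerResponse) => {
    let headers = new Headers()
    for (let [name, value] of Object.entries(req.headers)) {
      if (typeof value === 'string') headers.set(name, value)
    }
    let request = new Request(`http://${req.headers.host}${req.url}`, {
      method: req.method,
      headers,
    })
    let response = await handle(request)
    res.writeHead(response.status, Object.fromEntries(response.headers))
    res.end(Buffer.from(await response.arrayBuffer()))
  }
}

let echoServer = http.createServer(toListener(async () => new Response('hello')))
await new Promise<void>((resolve) => echoServer.listen(0, () => resolve()))
let { port } = echoServer.address() as { port: number }

let reply = await fetch(`http://localhost:${port}/`)
let text = await reply.text()
console.log(text) // 'hello'

echoServer.closeAllConnections()
echoServer.close()
```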
### route-pattern - Flexible URL Pattern Matching
Powerful URL pattern matching library with type-safe URL parsing and generation.
```typescript
import { RoutePattern, createHrefBuilder } from '@remix-run/route-pattern'
// Basic pattern matching
let pattern = new RoutePattern('/blog/:slug')
let match = pattern.match('https://remix.run/blog/remixing-shopify')
// { params: { slug: 'remixing-shopify' }, url: URL }
pattern.href({ slug: 'hello-world' }) // '/blog/hello-world'
// Multiple parameters in one segment
let datePattern = new RoutePattern('/blog/:year-:month-:day/:slug(.html)')
datePattern.match('https://example.com/blog/2024-01-15/intro.html')
// { params: { year: '2024', month: '01', day: '15', slug: 'intro' } }
datePattern.href({ year: '2024', month: '12', day: '25', slug: 'christmas' })
// '/blog/2024-12-25/christmas.html'
// Optional segments with parentheses
let apiPattern = new RoutePattern('/api(/v:major(.:minor))/users/:id(.:format)')
apiPattern.match('/api/users/123')
// { params: { major: undefined, minor: undefined, id: '123', format: undefined } }
apiPattern.match('/api/v2.1/users/123.json')
// { params: { major: '2', minor: '1', id: '123', format: 'json' } }
apiPattern.href({ major: '2', minor: '1', id: '123', format: 'json' })
// '/api/v2.1/users/123.json'
apiPattern.href({ id: '123' }) // '/api/users/123'
// Full URL patterns with host/protocol
let hostPattern = new RoutePattern('https://:subdomain.example.com/users/:id')
hostPattern.match('https://api.example.com/users/456')
// { params: { subdomain: 'api', id: '456' } }
hostPattern.href({ subdomain: 'admin', id: '789' })
// 'https://admin.example.com/users/789'
// Wildcards for path segments
let assetPattern = new RoutePattern('/assets/*path.:ext')
assetPattern.match('/assets/images/logo.png')
// { params: { path: 'images/logo', ext: 'png' } }
let cdnPattern = new RoutePattern('https://cdn.example.com/assets/*path')
cdnPattern.match('https://cdn.example.com/assets/css/themes/dark.css')
// { params: { path: 'css/themes/dark.css' } }
// Search parameters
let searchPattern = new RoutePattern('/search?q=&category=')
searchPattern.match('/search?q=routing&category=web') // matches
searchPattern.match('/search?q=routing') // null (missing category)
// Case-insensitive matching
let casePattern = new RoutePattern('/Users/:id', { ignoreCase: true })
casePattern.match('/users/123') // matches!
casePattern.match('/USERS/456') // matches!
// Type-safe href builder
type ValidPatterns =
  | '/api/v:version/products/:id.json'
  | '/assets/*path.:ext'
  | '/shop/:category/:product'
let href = createHrefBuilder<ValidPatterns>()
href('/api/v:version/products/:id.json', {
  version: '2',
  id: 'laptop-123'
}) // '/api/v2/products/laptop-123.json'
href('/shop/:category/:product',
  { category: 'electronics', product: 'phone' },
  { color: 'black', size: 'large' }
) // '/shop/electronics/phone?color=black&size=large'
// Pattern joining for composition
let base = new RoutePattern('https://api.shopify.com')
let versioned = base.join('/v:version')
let full = versioned.join('/products/:id')
full.source // 'https://api.shopify.com/v:version/products/:id'
// Test without extracting params
let adminPattern = new RoutePattern('/admin/*')
if (adminPattern.test(request.url)) {
  return requireAuth(request)
}
```
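A `:param` segment is conceptually a named capture that matches up to the next `/`. The sketch below shows that compilation idea with a plain RegExp (route-pattern's actual grammar also covers optionals, wildcards, search params, and full URLs; `compileParams` is a hypothetical helper):

```typescript
// Compile ':name' segments into named capture groups
function compileParams(pattern: string): RegExp {
  let source = pattern.replace(/:(\w+)/g, (_, name) => `(?<${name}>[^/]+)`)
  return new RegExp(`^${source}$`)
}

let blogRegex = compileParams('/blog/:year/:slug')
let blogMatch = blogRegex.exec('/blog/2024/hello-world')
console.log(blogMatch?.groups) // year: '2024', slug: 'hello-world'
console.log(blogRegex.test('/blog/2024')) // false (missing slug segment)
```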
### tar-parser - Streaming Tar Archive Parser
Fast, efficient tar archive parser with streaming support for any JavaScript environment.
```typescript
import { parseTar } from '@remix-run/tar-parser'
// Parse tar archive from HTTP response
let response = await fetch('https://github.com/user/repo/archive/main.tar.gz')
await parseTar(
  response.body.pipeThrough(new DecompressionStream('gzip')),
  (entry) => {
    console.log(entry.name) // 'package.json'
    console.log(entry.size) // 1234
    console.log(entry.mtime) // Date object
    console.log(entry.type) // 'file'
    console.log(entry.mode) // 0o644
  }
)
// Extract specific files
await parseTar(response.body, async (entry) => {
  if (entry.name === 'package.json') {
    let pkg = JSON.parse(entry.text)
    console.log(pkg.name, pkg.version)
  }
  if (entry.name.startsWith('src/') && entry.name.endsWith('.ts')) {
    await fs.writeFile(entry.name, entry.bytes)
  }
  if (entry.name.endsWith('.jpg')) {
    let imageBlob = new Blob([entry.bytes], { type: 'image/jpeg' })
    await uploadImage(imageBlob)
  }
})
// Stream large files to avoid memory issues
await parseTar(archiveStream, async (entry) => {
  if (entry.size > 100 * 1024 * 1024) { // Files larger than 100MB
    let stream = entry.stream()
    await uploadToStorage(entry.name, stream)
  } else {
    await fs.writeFile(entry.name, entry.bytes)
  }
})
// Custom filename encoding
await parseTar(
  archiveStream,
  { filenameEncoding: 'latin1' },
  (entry) => {
    console.log(entry.name) // Decoded with latin1
  }
)
// Low-level parser for incremental processing
import { TarParser } from '@remix-run/tar-parser'
let parser = new TarParser({ filenameEncoding: 'utf-8' })
for await (let chunk of readArchiveChunks()) {
  for (let entry of parser.write(chunk)) {
    console.log(entry.name, entry.size)
    // Access file data
    let text = entry.text
    let bytes = entry.bytes
    let buffer = entry.arrayBuffer
  }
}
parser.finish()
// Process npm package tarballs
let tarball = await fetch('https://registry.npmjs.org/react/-/react-18.2.0.tgz')
let files = new Map()
await parseTar(
  tarball.body.pipeThrough(new DecompressionStream('gzip')),
  (entry) => {
    if (entry.name.startsWith('package/')) {
      let path = entry.name.slice('package/'.length)
      files.set(path, entry.text)
    }
  }
)
console.log(files.get('package.json'))
console.log(files.get('README.md'))
```
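The pipeThrough(new DecompressionStream('gzip')) step used above works on any web ReadableStream. A quick self-contained round-trip with CompressionStream demonstrates that pattern without fetching a real archive:

```typescript
// Compress with gzip, then decompress the resulting stream back to text
let original = new Blob(['hello tar-parser'])
let gzipped = original.stream().pipeThrough(new CompressionStream('gzip'))
let restored = gzipped.pipeThrough(new DecompressionStream('gzip'))
let roundTripped = await new Response(restored).text()
console.log(roundTripped) // 'hello tar-parser'
```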
## Summary and Integration
Remix 3 packages are designed for seamless composition into full-stack web applications. The typical integration pattern combines node-fetch-server for the HTTP layer, fetch-router for request routing, and specialized packages for specific functionality. For file uploads, multipart-parser or form-data-parser handle incoming data, which flows into file-storage for persistence and lazy-file for efficient retrieval. The headers package provides type-safe header manipulation throughout the request/response cycle, while route-pattern enables sophisticated URL routing with type-safe parameter extraction.
Common integration flows include building API gateways with fetch-proxy behind fetch-router, implementing file upload systems with form-data-parser streaming to file-storage, and creating portable servers with node-fetch-server that work identically across Node.js, Bun, and other runtimes. The tar-parser enables package management and deployment workflows, while lazy-file ensures memory-efficient file serving. All packages share the same commitment to web standards, making them interoperable and portable across JavaScript runtimes without modification. The result is a cohesive ecosystem where each package solves a specific problem while composing naturally with others to build complete applications.