# Model Viewer Web Component

## Introduction

Model Viewer is a web component library developed by Google that enables easy rendering of interactive 3D models directly in web browsers, with optional augmented reality (AR) support across multiple platforms. The project provides a standards-based approach to 3D content delivery, leveraging modern web technologies including WebGL, WebXR, and platform-specific AR features like ARCore Scene Viewer and ARKit Quick Look. Built on top of Three.js, it abstracts away the complexity of 3D rendering while providing developers with powerful customization options through a simple HTML element interface.

The library follows a modular mixin-based architecture where features like animation, camera controls, environment lighting, AR capabilities, and scene graph manipulation are composed together to create the final `<model-viewer>` custom element. This design enables progressive enhancement and allows developers to use only the features they need. The component supports glTF 2.0 models, the standard format for 3D content on the web, and provides extensive APIs for runtime manipulation of materials, textures, animations, and camera positioning.

With support for all major evergreen browsers and comprehensive fallback strategies, Model Viewer strives to provide a seamless development experience while maintaining high rendering quality and performance.

## Core APIs and Functions

### Basic Model Viewer Element

A web component for displaying interactive 3D models with automatic setup and rendering (the asset paths below are placeholders).

```html
<!-- Register the custom element (one common CDN option) -->
<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.min.js"></script>

<!-- Minimal usage; src and alt values are placeholders -->
<model-viewer
  src="assets/Astronaut.glb"
  alt="A 3D model of an astronaut"
  camera-controls
  auto-rotate>
</model-viewer>
```

### Animation Control API

Control and manipulate 3D model animations programmatically.

```javascript
// Get a reference to the model-viewer element
const viewer = document.querySelector('model-viewer');

// Wait for the model to load
viewer.addEventListener('load', () => {
  // Get available animations
  const animations = viewer.availableAnimations;
  console.log('Available animations:', animations); // ['Walk', 'Run', 'Jump']

  // Play a specific animation
  viewer.animationName = 'Walk';
  viewer.play({ repetitions: 5, pingpong: false });

  // Observe playback
  viewer.addEventListener('play', () => console.log('Animation started'));
  viewer.addEventListener('finished', () => {
    console.log('Animation completed');
    viewer.animationName = 'Idle';
    viewer.play({ repetitions: Infinity });
  });

  // Pause/resume
  setTimeout(() => {
    viewer.pause();
    console.log('Animation paused at:', viewer.currentTime, 'seconds');
  }, 2000);

  // Adjust playback speed
  viewer.timeScale = 0.5; // Half speed

  // Scrub to a specific time
  viewer.currentTime = 1.5; // Jump to 1.5 seconds

  // Append an additional animation layer
  viewer.appendAnimation('Hand_Wave', {
    weight: 0.5,
    timeScale: 1.2,
    fade: 0.3,
    repetitions: 2
  });

  // Detach an animation layer
  viewer.detachAnimation('Hand_Wave', { fade: 0.5 });
});
```

### Augmented Reality (AR) Integration

Enable AR experiences across multiple platforms with a unified API (asset paths are placeholders).

```html
<model-viewer
  src="assets/Chair.glb"
  ios-src="assets/Chair.usdz"
  alt="A 3D model of a chair"
  ar
  ar-modes="webxr scene-viewer quick-look"
  ar-scale="auto"
  ar-placement="floor"
  camera-controls>
  <!-- Optional custom AR entry button -->
  <button slot="ar-button">View in your space</button>
</model-viewer>
```

### Camera Control and Positioning

Control camera position, field of view, and interaction constraints, declaratively through attributes or programmatically through the element's API.

```html
<model-viewer
  src="assets/Robot.glb"
  alt="A 3D model of a robot"
  camera-controls
  camera-orbit="45deg 55deg 2.5m"
  camera-target="0m 1m 0m"
  field-of-view="30deg"
  min-camera-orbit="auto 0deg auto"
  max-camera-orbit="auto 180deg 5m"
  min-field-of-view="10deg"
  max-field-of-view="45deg">
</model-viewer>
```
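The camera state set by these attributes can also be read and driven from script. The sketch below is a minimal example using the element's documented camera API; the orbit, target, and field-of-view values are illustrative:

```javascript
const viewer = document.querySelector('model-viewer');

viewer.addEventListener('load', () => {
  // Read the current orbit as { theta, phi, radius } (radians and meters)
  const orbit = viewer.getCameraOrbit();
  console.log(`theta=${orbit.theta}, phi=${orbit.phi}, radius=${orbit.radius}`);

  // Set a new goal orbit, target, and field of view;
  // the camera animates smoothly toward the goal
  viewer.cameraOrbit = '0deg 75deg 1.5m';
  viewer.cameraTarget = '0m 1m 0m';
  viewer.fieldOfView = '25deg';

  // Skip the animation and snap directly to the goal state
  viewer.jumpCameraToGoal();
});
```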
### Material and Texture Manipulation

Runtime modification of materials, colors, and textures through the Scene Graph API.

```javascript
const viewer = document.querySelector('model-viewer');

viewer.addEventListener('load', async () => {
  // Access the model's materials
  const model = viewer.model;
  const materials = model.materials;
  console.log(`Model has ${materials.length} materials`);

  // Get a material by name
  const bodyMaterial = model.getMaterialByName('Body');
  if (bodyMaterial) {
    console.log('Found material:', bodyMaterial.name);

    // Modify PBR properties
    const pbr = bodyMaterial.pbrMetallicRoughness;

    // Change base color (RGBA array or CSS color string)
    pbr.setBaseColorFactor([1.0, 0.0, 0.0, 1.0]); // Red
    // or: pbr.setBaseColorFactor('#ff0000');

    // Adjust metalness and roughness
    pbr.setMetallicFactor(0.8);  // More metallic (0-1)
    pbr.setRoughnessFactor(0.2); // More shiny (0-1)

    // Modify emissive properties
    bodyMaterial.setEmissiveFactor([0.5, 0.2, 0.0]); // Orange glow
    bodyMaterial.setEmissiveStrength(2.0);

    // Advanced PBR properties
    bodyMaterial.setClearcoatFactor(0.5);
    bodyMaterial.setClearcoatRoughnessFactor(0.1);
    bodyMaterial.setIor(1.5);                // Index of refraction
    bodyMaterial.setTransmissionFactor(0.9); // For glass-like materials
    bodyMaterial.setSheenColorFactor([1.0, 1.0, 1.0]);

    // Alpha mode and cutoff
    bodyMaterial.setAlphaMode('BLEND'); // 'OPAQUE', 'MASK', or 'BLEND'
    bodyMaterial.setAlphaCutoff(0.5);
    bodyMaterial.setDoubleSided(true);
  }

  // Iterate over all materials
  materials.forEach((material, index) => {
    console.log(`Material ${index}: ${material.name}`);

    // Check texture information
    const baseColorTexture = material.pbrMetallicRoughness.baseColorTexture;
    if (baseColorTexture && baseColorTexture.texture) {
      const texture = baseColorTexture.texture;
      console.log(`  Texture: ${texture.name}`);
      console.log(`  Image source: ${texture.source.uri}`);

      // Modify sampler properties
      const sampler = texture.sampler;
      sampler.setWrapS(10497); // REPEAT
      sampler.setWrapT(10497);
      sampler.setRotation(Math.PI / 4);     // 45 degree rotation
      sampler.setScale({ u: 2.0, v: 2.0 }); // Double texture scale
      sampler.setOffset({ u: 0.1, v: 0.1 });
    }

    // Create a thumbnail from the texture
    if (material.pbrMetallicRoughness.baseColorTexture?.texture?.source) {
      material.pbrMetallicRoughness.baseColorTexture.texture.source
        .createThumbnail(256, 256)
        .then(url => {
          console.log('Thumbnail URL:', url);
          // Use the URL for a preview, then clean up
          URL.revokeObjectURL(url);
        });
    }
  });

  // Material variants support
  model.createVariant('Summer');
  model.createMaterialInstanceForVariant(0, 'Body_Summer', 'Summer', true);
  const summerMaterial = model.getMaterialByName('Body_Summer');
  if (summerMaterial) {
    summerMaterial.pbrMetallicRoughness.setBaseColorFactor('#ffff00');
  }
});
```

### Environment and Lighting Configuration

Control scene lighting, shadows, exposure, and HDR environments (the HDR paths below are placeholders).

```html
<model-viewer
  src="assets/Helmet.glb"
  alt="A 3D model of a helmet"
  camera-controls
  environment-image="assets/studio.hdr"
  skybox-image="assets/studio.hdr"
  exposure="1.2"
  shadow-intensity="1"
  shadow-softness="0.8"
  tone-mapping="aces">
</model-viewer>
```
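The attributes above reflect to camelCase properties on the element, so lighting can also be adjusted at runtime. A short sketch under that assumption; the HDR paths are placeholders:

```javascript
const viewer = document.querySelector('model-viewer');

// Swap the lighting environment and tune exposure and shadows at runtime
function applyStudioLighting() {
  viewer.environmentImage = 'assets/studio.hdr'; // placeholder path
  viewer.skyboxImage = 'assets/studio.hdr';      // placeholder path
  viewer.exposure = 1.2;
  viewer.shadowIntensity = 1.0;
  viewer.shadowSoftness = 0.8;
}

// Revert to the default generated lighting environment
function applyDefaultLighting() {
  viewer.environmentImage = null;
  viewer.skyboxImage = null;
  viewer.exposure = 1.0;
}

applyStudioLighting();
```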
### Export and Screenshot Functionality

Capture rendered frames as images or data URLs for sharing and thumbnails.

```javascript
const viewer = document.querySelector('model-viewer');

viewer.addEventListener('load', async () => {
  // Wait one frame so the first render has completed
  await new Promise(resolve => requestAnimationFrame(resolve));

  // Export as a data URL
  const dataUrl = viewer.toDataURL('image/png', 0.95);
  console.log('Data URL:', dataUrl);

  // Display in an image element
  const img = document.createElement('img');
  img.src = dataUrl;
  document.body.appendChild(img);

  // Export as a Blob for file download
  try {
    const blob = await viewer.toBlob({
      mimeType: 'image/jpeg',
      qualityArgument: 0.9,
      idealAspect: true // Use the ideal aspect ratio, cropping if needed
    });

    // Create a download link
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = 'model-screenshot.jpg';
    a.click();
    URL.revokeObjectURL(url);

    console.log('Screenshot size:', blob.size, 'bytes');
  } catch (error) {
    console.error('Failed to create blob:', error);
  }

  // Capture at a specific camera angle
  viewer.cameraOrbit = '45deg 60deg 2m';
  viewer.jumpCameraToGoal();
  await new Promise(resolve => requestAnimationFrame(resolve));

  const thumbnailBlob = await viewer.toBlob({
    mimeType: 'image/png',
    idealAspect: false
  });
  console.log('Thumbnail created');
});
```

### Advanced Scene Graph Manipulation

Access and modify the 3D scene hierarchy and node properties.

```javascript
const viewer = document.querySelector('model-viewer');

viewer.addEventListener('load', () => {
  const model = viewer.model;

  // Material count from the public scene-graph API
  console.log('Materials:', model.materials.length);

  // Access scene information through the internal scene API.
  // Note: $scene is an internal symbol from model-viewer's
  // implementation; it is unsupported and may change between releases.
  const scene = viewer[$scene];

  // Get model dimensions
  const size = scene.size;
  console.log(`Model size: ${size.x}m x ${size.y}m x ${size.z}m`);

  // Bounding sphere
  const boundingSphere = scene.boundingSphere;
  console.log(`Bounding sphere radius: ${boundingSphere.radius}m`);

  // Ideal camera distance
  const idealDistance = scene.idealCameraDistance();
  console.log(`Ideal camera distance: ${idealDistance}m`);

  // Animation information
  const animationNames = scene.animationNames;
  const duration = scene.duration;
  console.log(`Animations: ${animationNames.join(', ')}`);
  console.log(`Total duration: ${duration}s`);

  // Dynamic target (the point the camera looks at)
  const target = scene.getDynamicTarget();
  console.log(`Camera target: ${target.x}, ${target.y}, ${target.z}`);
});
```

### NPM Package Integration

Install and use model-viewer as an ES module in build systems.

```bash
# Install dependencies
npm install three
npm install @google/model-viewer
```

```javascript
// Importing the package registers the <model-viewer> custom element;
// the named export also serves as the TypeScript type.
import { ModelViewerElement } from '@google/model-viewer';

const viewer = document.querySelector('model-viewer') as ModelViewerElement | null;

viewer?.addEventListener('load', () => {
  // Type-safe API access
  const orbit = viewer.getCameraOrbit(); // { theta, phi, radius }
  const materials = viewer.model!.materials;
  console.log(`Model loaded with ${materials.length} materials`);
});

// Configure the model cache size globally
ModelViewerElement.modelCacheSize = 10; // Cache up to 10 models

// Configure the minimum render scale (quality vs. performance)
ModelViewerElement.minimumRenderScale = 0.5; // 0.25 to 1.0
```
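When the component is loaded lazily by a bundler, it can help to wait until the custom element definition has been registered before touching element APIs. A small sketch using the standard custom elements registry:

```javascript
// Resolves once the <model-viewer> definition has been registered
// and existing elements have been upgraded
customElements.whenDefined('model-viewer').then(() => {
  const viewer = document.querySelector('model-viewer');

  if (viewer.loaded) {
    // Model was already fetched and parsed
    console.log('Model already loaded');
  } else {
    viewer.addEventListener('load', () => console.log('Model loaded'), { once: true });
  }
});
```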
### Interactive Prompt and Accessibility

Control user interaction hints and accessibility features (asset paths and a11y strings below are placeholders).

```html
<model-viewer
  src="assets/Astronaut.glb"
  alt="A 3D model of an astronaut that can be rotated with drag gestures"
  camera-controls
  interaction-prompt="auto"
  interaction-prompt-style="wiggle"
  interaction-prompt-threshold="2000"
  a11y='{"front": "Front of astronaut", "back": "Back of astronaut"}'>
  <!-- interaction-prompt="none" disables the hint entirely;
       the a11y attribute (recent releases) customizes screen-reader
       announcements as the camera orbits -->
</model-viewer>
```

### Post-Processing Effects Integration

Register custom post-processing effect composers for advanced rendering.

```javascript
import '@google/model-viewer';
import { Vector2 } from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/examples/jsm/postprocessing/UnrealBloomPass.js';

const viewer = document.querySelector('model-viewer');

viewer.addEventListener('load', () => {
  // Create a custom effect composer implementing the callbacks
  // model-viewer invokes on a registered composer
  const effectComposer = {
    composer: null,

    setRenderer(renderer) {
      this.renderer = renderer;
      this.composer = new EffectComposer(renderer);
    },

    setMainScene(scene) {
      this.scene = scene;
      const renderPass = new RenderPass(scene, scene.getCamera());
      this.composer.addPass(renderPass);

      // Add a bloom effect
      const bloomPass = new UnrealBloomPass(
        new Vector2(window.innerWidth, window.innerHeight),
        1.5,  // strength
        0.4,  // radius
        0.85  // threshold
      );
      this.composer.addPass(bloomPass);
    },

    setMainCamera(camera) {
      this.camera = camera;
    },

    setSize(width, height) {
      this.composer.setSize(width, height);
    },

    beforeRender(time, delta) {
      // Called before each render
    },

    render(deltaTime) {
      this.composer.render(deltaTime);
    }
  };

  // Register the effect composer
  viewer.registerEffectComposer(effectComposer);

  // Later, to remove the effects:
  // viewer.unregisterEffectComposer();
});
```

## Use Cases and Integration Patterns

Model Viewer excels in e-commerce applications where product visualization is critical, enabling customers to examine items from all angles with realistic lighting and materials before purchase. Online retailers integrate it to showcase furniture, electronics, jewelry, and apparel with AR preview capabilities that let customers visualize products in their actual space using their smartphone cameras. The library's material editing API allows for real-time product customization, where users can change colors, textures, and finishes to match their preferences, with all modifications rendered instantly in high quality.

Educational platforms leverage Model Viewer for interactive 3D content in subjects like anatomy, architecture, engineering, and archaeology, where students can explore detailed models at their own pace with annotations and interactive hotspots providing contextual information.

The integration pattern typically involves either CDN inclusion for rapid prototyping or NPM package installation for production applications with build systems like Webpack or Vite. Developers can progressively enhance the experience by starting with basic model display and incrementally adding features like camera controls, animations, AR support, and material editing based on specific requirements. The event-driven architecture makes it straightforward to coordinate Model Viewer with other UI components: listening for load, error, and interaction events to update application state, trigger analytics, or synchronize with product databases, as sketched at the end of this section.

For advanced use cases, the Scene Graph API provides low-level access to Three.js primitives, enabling developers to implement custom rendering pipelines, post-processing effects, or dynamic content generation while maintaining the convenience of the web component interface. Cross-platform AR support through WebXR, Scene Viewer, and Quick Look ensures consistent experiences across Android, iOS, and desktop browsers without requiring separate codebases or native app development.
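As a concrete illustration of the event-driven pattern described above, the sketch below wires several documented model-viewer events into application state. The UI helper functions are hypothetical placeholders:

```javascript
const viewer = document.querySelector('model-viewer');

// Loading progress: detail.totalProgress runs from 0 to 1
viewer.addEventListener('progress', (event) => {
  updateProgressBar(event.detail.totalProgress); // hypothetical UI helper
});

// Model ready: enable dependent UI
viewer.addEventListener('load', () => {
  enableCustomizationControls(); // hypothetical UI helper
});

// Load failure: fall back to a static image
viewer.addEventListener('error', () => {
  showFallbackImage(); // hypothetical UI helper
});

// AR session lifecycle, e.g. 'session-started', 'object-placed', 'failed'
viewer.addEventListener('ar-status', (event) => {
  console.log('AR status:', event.detail.status);
});

// Camera moved by the user ('user-interaction') or by script ('none')
viewer.addEventListener('camera-change', (event) => {
  console.log('Camera changed via', event.detail.source);
});
```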