Unity Graphics
https://github.com/unity-technologies/graphics

The Scriptable Render Pipeline (SRP) is a Unity feature that provides artists and developers with ...

Tokens: 479,500 · Snippets: 4,074 · Trust Score: 9.3 · Updated: 5 months ago
Context Summary (auto-generated)
# Unity Graphics Repository

## Introduction

The Unity Graphics repository contains the Scriptable Render Pipeline (SRP) framework and its associated rendering packages, designed to provide artists and developers with modern, high-fidelity graphics capabilities in Unity. This open-source repository houses the core infrastructure for Unity's rendering systems, including the Universal Render Pipeline (URP) for cross-platform rendering, the High Definition Render Pipeline (HDRP) for high-end graphics on compute-shader-compatible platforms, Shader Graph for visual shader authoring, Visual Effect Graph for GPU-accelerated visual effects, and the Post Processing Stack for image effects. The packages are distributed as core packages within the Unity Editor and are actively developed by Unity Technologies, with regular mirroring from their private development repository.

The repository follows a branch structure aligned with Unity versions: the master branch maps to Unity Alpha releases, version-specific staging branches (e.g., 2021.1/staging) correspond to beta and released Unity versions, and release branches (e.g., 10.x.x/release) support Unity 2020.x and earlier. Each package can be cloned into a Unity project either as a local package or directly within the project's Packages folder, enabling developers to modify and customize the rendering pipeline source code for their specific needs. The framework is designed with extensibility in mind, allowing custom effects and features to integrate seamlessly with the built-in rendering features through a plugin-based architecture.

## APIs and Key Functions

### PostProcessVolume - Global and Local Post-Processing Control

The `PostProcessVolume` component manages post-processing effects within Unity scenes, supporting both global scene-wide effects and local bounded regions with customizable blending.
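The local-volume blending just described can be sketched outside Unity. The snippet below is an illustrative model only — it assumes a simple linear falloff from the collider surface out to the blend distance, which approximates (but is not guaranteed to match) the engine's actual interpolation curve:

```python
def volume_weight(distance_to_collider: float, blend_distance: float) -> float:
    """Illustrative falloff for a local volume: full weight inside the
    collider, fading to zero at blend_distance (linear sketch; the actual
    Post Processing Stack curve may differ)."""
    if distance_to_collider <= 0.0:
        return 1.0  # volume trigger is inside the volume bounds
    if blend_distance <= 0.0 or distance_to_collider >= blend_distance:
        return 0.0  # no blending region, or outside it entirely
    return 1.0 - distance_to_collider / blend_distance

# A camera halfway into a 5-unit blend region gets half the effect weight
print(volume_weight(2.5, 5.0))  # 0.5
```

The resulting weight is what gets multiplied with the volume's own `weight` property before effects from overlapping volumes are combined by priority.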
It uses Unity's volume framework to blend between different effect configurations based on camera position and priority.

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class VolumeController : MonoBehaviour
{
    void Start()
    {
        // Create a global post-processing volume
        GameObject volumeObject = new GameObject("Global Volume");
        PostProcessVolume volume = volumeObject.AddComponent<PostProcessVolume>();

        // Configure as global volume
        volume.isGlobal = true;
        volume.weight = 1f;
        volume.priority = 100f;

        // Create and assign a new profile
        PostProcessProfile profile = ScriptableObject.CreateInstance<PostProcessProfile>();
        volume.sharedProfile = profile;

        // Add bloom effect to profile
        Bloom bloom = profile.AddSettings<Bloom>();
        bloom.enabled.Override(true);
        bloom.intensity.Override(5f);
        bloom.threshold.Override(1f);
        bloom.softKnee.Override(0.5f);
        bloom.diffusion.Override(7f);
        bloom.color.Override(Color.white);

        Debug.Log("Global volume created with bloom effect");
    }

    void CreateLocalVolume()
    {
        // Create a local volume with boundaries
        GameObject localVolumeObject = new GameObject("Local Volume");
        PostProcessVolume localVolume = localVolumeObject.AddComponent<PostProcessVolume>();

        // Configure as local volume
        localVolume.isGlobal = false;
        localVolume.blendDistance = 5f; // Start blending 5 units from the collider
        localVolume.weight = 1f;
        localVolume.priority = 50f;

        // Add box collider to define the volume bounds
        BoxCollider collider = localVolumeObject.AddComponent<BoxCollider>();
        collider.isTrigger = true;
        collider.size = new Vector3(10f, 10f, 10f);

        // Create a profile with a vignette effect for the local area
        PostProcessProfile localProfile = ScriptableObject.CreateInstance<PostProcessProfile>();
        localVolume.sharedProfile = localProfile;

        Vignette vignette = localProfile.AddSettings<Vignette>();
        vignette.enabled.Override(true);
        vignette.intensity.Override(0.45f);
        vignette.smoothness.Override(0.4f);
        vignette.color.Override(new Color(0.1f, 0.1f, 0.2f));
    }
}
```

### PostProcessLayer - Camera-Based Post-Processing Rendering

The `PostProcessLayer` component is attached to cameras to enable post-processing rendering with configurable anti-aliasing methods and volume layer filtering. It controls which volumes affect the camera and manages the rendering pipeline for post-processing effects.

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class CameraPostProcessingSetup : MonoBehaviour
{
    public PostProcessResources postProcessResources;

    void SetupCameraPostProcessing()
    {
        Camera camera = GetComponent<Camera>();

        // Add PostProcessLayer component to the camera
        PostProcessLayer layer = camera.gameObject.AddComponent<PostProcessLayer>();

        // Initialize with resources (required when adding at runtime)
        if (postProcessResources != null)
        {
            layer.Init(postProcessResources);
        }

        // Configure volume trigger (uses the camera transform by default)
        layer.volumeTrigger = transform;

        // Set which layers contain volumes (use a dedicated layer for performance)
        layer.volumeLayer = LayerMask.GetMask("PostProcessing");

        // Enable anti-aliasing
        layer.antialiasingMode = PostProcessLayer.Antialiasing.TemporalAntialiasing;
        layer.temporalAntialiasing.jitterSpread = 0.75f;
        layer.temporalAntialiasing.sharpness = 0.25f;
        layer.temporalAntialiasing.stationaryBlending = 0.95f;
        layer.temporalAntialiasing.motionBlending = 0.85f;

        // Enable NaN filtering to prevent artifacts
        layer.stopNaNPropagation = true;

        // Set to true to skip the final blit and render directly to the
        // camera target (a mobile optimization); left disabled here
        layer.finalBlitToCameraTarget = false;

        Debug.Log($"PostProcessLayer configured with {layer.antialiasingMode} anti-aliasing");
    }

    void SwitchAntiAliasing(PostProcessLayer layer, PostProcessLayer.Antialiasing mode)
    {
        layer.antialiasingMode = mode;

        switch (mode)
        {
            case PostProcessLayer.Antialiasing.FastApproximateAntialiasing:
                layer.fastApproximateAntialiasing.fastMode = false;
                layer.fastApproximateAntialiasing.keepAlpha = false;
                break;
            case PostProcessLayer.Antialiasing.SubpixelMorphologicalAntialiasing:
                layer.subpixelMorphologicalAntialiasing.quality = SubpixelMorphologicalAntialiasing.Quality.High;
                break;
        }
    }
}
```

### Bloom Effect - Configurable Light Bloom and Lens Dirt

The `Bloom` effect creates a glow around bright areas in the scene with configurable threshold, intensity, and diffusion parameters. It supports lens dirt textures and anamorphic distortion for cinematic effects.

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class BloomEffectController : MonoBehaviour
{
    private PostProcessVolume volume;
    private Bloom bloom;

    void Start()
    {
        // Get or create a volume
        volume = GetComponent<PostProcessVolume>();
        if (volume == null)
        {
            volume = gameObject.AddComponent<PostProcessVolume>();
            volume.isGlobal = true;
        }

        // Create a profile if needed
        if (volume.profile == null)
        {
            volume.profile = ScriptableObject.CreateInstance<PostProcessProfile>();
        }

        // Add or get the bloom effect
        if (!volume.profile.TryGetSettings(out bloom))
        {
            bloom = volume.profile.AddSettings<Bloom>();
        }

        // Configure bloom parameters
        bloom.enabled.Override(true);
        bloom.intensity.Override(3.5f);      // Brightness of bloom
        bloom.threshold.Override(1.2f);      // Brightness threshold
        bloom.softKnee.Override(0.7f);       // Soft transition at threshold
        bloom.diffusion.Override(7f);        // Size of bloom (1-10)
        bloom.anamorphicRatio.Override(0f);  // Horizontal/vertical distortion
        bloom.color.Override(new Color(1f, 0.95f, 0.9f)); // Warm tint

        // Add lens dirt effect
        Texture2D dirtTexture = Resources.Load<Texture2D>("LensDirt");
        if (dirtTexture != null)
        {
            bloom.dirtTexture.Override(dirtTexture);
            bloom.dirtIntensity.Override(2.5f);
        }

        // Fast mode improves mobile performance (disabled here)
        bloom.fastMode.Override(false);
    }

    void AnimateBloomPulse()
    {
        // Pulse bloom intensity based on time
        float pulse = Mathf.PingPong(Time.time * 0.5f, 1f);
        bloom.intensity.value = 2f + pulse * 3f;
    }

    void ConfigureAnamorphicBloom(float ratio)
    {
        // Negative values = vertical distortion, positive = horizontal
        bloom.anamorphicRatio.Override(ratio);
        bloom.diffusion.Override(8f); // Higher diffusion works better with anamorphic
    }
}
```

### ColorGrading - HDR/LDR Color Correction and Tonemapping

The `ColorGrading` effect provides comprehensive color correction capabilities including tonemapping, white balance, exposure, contrast, color filters, and channel mixing for both HDR and LDR workflows.

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class ColorGradingController : MonoBehaviour
{
    private PostProcessVolume volume;
    private ColorGrading colorGrading;

    void SetupHDRColorGrading()
    {
        volume = gameObject.AddComponent<PostProcessVolume>();
        volume.isGlobal = true;
        volume.profile = ScriptableObject.CreateInstance<PostProcessProfile>();

        colorGrading = volume.profile.AddSettings<ColorGrading>();
        colorGrading.enabled.Override(true);

        // Set HDR grading mode
        colorGrading.gradingMode.Override(GradingMode.HighDefinitionRange);

        // Configure tonemapping
        colorGrading.tonemapper.Override(Tonemapper.ACES);

        // Adjust exposure and color temperature
        colorGrading.postExposure.Override(0.5f);
        colorGrading.temperature.Override(10f); // Warmer
        colorGrading.tint.Override(-5f);        // Less magenta

        // Adjust contrast and saturation
        colorGrading.contrast.Override(15f);
        colorGrading.saturation.Override(10f);

        // Color filter overlay
        colorGrading.colorFilter.Override(new Color(1f, 0.95f, 0.9f, 1f));

        // Lift, Gamma, Gain adjustments (shadows, midtones, highlights)
        colorGrading.lift.Override(new Vector4(1f, 1f, 1f, 0f));
        colorGrading.gamma.Override(new Vector4(1f, 1f, 1f, 0f));
        colorGrading.gain.Override(new Vector4(1.05f, 1f, 0.95f, 0f)); // Slight warm gain

        Debug.Log("HDR color grading configured with ACES tonemapping");
    }

    void SetupLDRColorGrading()
    {
        volume = gameObject.AddComponent<PostProcessVolume>();
        volume.profile = ScriptableObject.CreateInstance<PostProcessProfile>();

        colorGrading = volume.profile.AddSettings<ColorGrading>();
        colorGrading.enabled.Override(true);

        // Set LDR grading mode (for mobile/low-end platforms)
        colorGrading.gradingMode.Override(GradingMode.LowDefinitionRange);

        // LDR adjustments
        colorGrading.brightness.Override(5f);
        colorGrading.contrast.Override(10f);
        colorGrading.saturation.Override(5f);
        colorGrading.colorFilter.Override(Color.white);
    }

    void SetupCustomTonemapping()
    {
        colorGrading.tonemapper.Override(Tonemapper.Custom);

        // Custom tonemapper curve parameters
        colorGrading.toneCurveToeStrength.Override(0.5f);
        colorGrading.toneCurveToeLength.Override(0.5f);
        colorGrading.toneCurveShoulderStrength.Override(0.5f);
        colorGrading.toneCurveShoulderLength.Override(0.5f);
        colorGrading.toneCurveShoulderAngle.Override(0.5f);
    }

    void LoadExternalLUT(Texture3D lutTexture)
    {
        // Use an external LUT authored in software like DaVinci Resolve
        colorGrading.gradingMode.Override(GradingMode.External);
        colorGrading.externalLut.Override(lutTexture);
    }
}
```

### QuickVolume - Runtime Post-Processing Volume Creation

The `QuickVolume` method creates temporary post-processing volumes at runtime for dynamic effects, time-based events, or gameplay-triggered visual changes, with automatic cleanup support.
```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class RuntimeVolumeEffects : MonoBehaviour
{
    private PostProcessVolume pulseVolume;
    private Vignette vignette;

    void CreatePulsatingVignette()
    {
        // Create a vignette effect instance
        vignette = ScriptableObject.CreateInstance<Vignette>();
        vignette.enabled.Override(true);
        vignette.intensity.Override(0.5f);
        vignette.smoothness.Override(0.4f);
        vignette.roundness.Override(1f);
        vignette.rounded.Override(false);
        vignette.color.Override(Color.black);

        // Create a quick volume with priority 100
        pulseVolume = PostProcessManager.instance.QuickVolume(
            gameObject.layer,
            100f,
            vignette
        );

        Debug.Log("Pulsating vignette volume created");
    }

    void Update()
    {
        if (vignette != null)
        {
            // Animate vignette intensity with a sine wave
            float intensity = Mathf.Abs(Mathf.Sin(Time.time * 2f)) * 0.5f;
            vignette.intensity.value = intensity;
        }
    }

    void OnDestroy()
    {
        // Clean up to prevent memory leaks
        if (pulseVolume != null)
        {
            RuntimeUtilities.DestroyVolume(pulseVolume, true, true);
        }
    }

    void CreateDamageEffect()
    {
        // Create a red vignette for damage indication
        Vignette damageVignette = ScriptableObject.CreateInstance<Vignette>();
        damageVignette.enabled.Override(true);
        damageVignette.intensity.Override(0.8f);
        damageVignette.smoothness.Override(0.3f);
        damageVignette.color.Override(new Color(1f, 0f, 0f, 1f));

        ChromaticAberration chromatic = ScriptableObject.CreateInstance<ChromaticAberration>();
        chromatic.enabled.Override(true);
        chromatic.intensity.Override(0.5f);

        // Create a volume with multiple effects
        PostProcessVolume damageVolume = PostProcessManager.instance.QuickVolume(
            gameObject.layer,
            200f,
            damageVignette,
            chromatic
        );

        // Fade out the damage effect over 2 seconds
        StartCoroutine(FadeOutVolume(damageVolume, 2f));
    }

    System.Collections.IEnumerator FadeOutVolume(PostProcessVolume volume, float duration)
    {
        float elapsed = 0f;
        while (elapsed < duration)
        {
            elapsed += Time.deltaTime;
            volume.weight = 1f - (elapsed / duration);
            yield return null;
        }
        RuntimeUtilities.DestroyVolume(volume, true, true);
    }
}
```

### Custom Post-Processing Effect - Extensibility Framework

Unity's post-processing stack supports custom effects through a two-class system: a settings class for parameters and a renderer class for rendering logic, both integrating seamlessly with volume blending.

```csharp
using System;
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Settings class - stores effect parameters
[Serializable]
[PostProcess(typeof(GrayscaleRenderer), PostProcessEvent.AfterStack, "Custom/Grayscale")]
public sealed class Grayscale : PostProcessEffectSettings
{
    [Range(0f, 1f), Tooltip("Grayscale effect intensity.")]
    public FloatParameter blend = new FloatParameter { value = 0.5f };

    [Range(0f, 1f), Tooltip("Red channel weight.")]
    public FloatParameter redWeight = new FloatParameter { value = 0.2126f };

    [Range(0f, 1f), Tooltip("Green channel weight.")]
    public FloatParameter greenWeight = new FloatParameter { value = 0.7152f };

    [Range(0f, 1f), Tooltip("Blue channel weight.")]
    public FloatParameter blueWeight = new FloatParameter { value = 0.0722f };

    // Override to add custom enable/disable logic
    public override bool IsEnabledAndSupported(PostProcessRenderContext context)
    {
        return enabled.value && blend.value > 0f;
    }
}

// Renderer class - implements rendering logic
public sealed class GrayscaleRenderer : PostProcessEffectRenderer<Grayscale>
{
    public override void Init()
    {
        // Called when the renderer is created - initialize resources here
        Debug.Log("Grayscale renderer initialized");
    }

    public override void Render(PostProcessRenderContext context)
    {
        // Get the property sheet for the shader
        var sheet = context.propertySheets.Get(Shader.Find("Hidden/Custom/Grayscale"));

        // Set shader parameters
        sheet.properties.SetFloat("_Blend", settings.blend);
        sheet.properties.SetVector("_Weights", new Vector3(
            settings.redWeight,
            settings.greenWeight,
            settings.blueWeight
        ));

        // Render a fullscreen triangle
        context.command.BlitFullscreenTriangle(
            context.source,
            context.destination,
            sheet,
            0 // Pass index
        );
    }

    public override void Release()
    {
        // Called when the renderer is destroyed - clean up here
        Debug.Log("Grayscale renderer released");
    }
}

// Corresponding HLSL shader (Hidden/Custom/Grayscale)
/*
Shader "Hidden/Custom/Grayscale"
{
    HLSLINCLUDE
        #include "Packages/com.unity.postprocessing/PostProcessing/Shaders/StdLib.hlsl"

        TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);
        float _Blend;
        float3 _Weights;

        float4 Frag(VaryingsDefault i) : SV_Target
        {
            float4 color = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.texcoord);
            float luminance = dot(color.rgb, _Weights);
            color.rgb = lerp(color.rgb, luminance.xxx, _Blend);
            return color;
        }
    ENDHLSL

    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            HLSLPROGRAM
                #pragma vertex VertDefault
                #pragma fragment Frag
            ENDHLSL
        }
    }
}
*/
```

### Profile Editing - Dynamic Effect Management

Post-processing profiles can be modified at runtime either through shared profiles (affecting all volumes) or instanced profiles (affecting only specific volumes), with utility methods for adding, removing, and querying effects.
```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class ProfileEditor : MonoBehaviour
{
    public PostProcessVolume volume;

    void EditSharedProfile()
    {
        // Modify the shared profile - affects ALL volumes using this profile
        // Changes persist after exiting play mode
        PostProcessProfile sharedProfile = volume.sharedProfile;

        // Try to get an existing bloom effect
        if (sharedProfile.TryGetSettings(out Bloom bloom))
        {
            bloom.intensity.Override(5f);
            Debug.Log("Modified existing bloom in shared profile");
        }
        else
        {
            // Add a new bloom effect if it doesn't exist
            bloom = sharedProfile.AddSettings<Bloom>();
            bloom.enabled.Override(true);
            bloom.intensity.Override(5f);
            Debug.Log("Added new bloom to shared profile");
        }
    }

    void EditInstancedProfile()
    {
        // Request the instanced profile - affects ONLY this volume
        // Changes reset when exiting play mode
        // Must be destroyed manually when done
        PostProcessProfile instancedProfile = volume.profile;

        // Add color grading to the instanced profile
        if (!instancedProfile.TryGetSettings(out ColorGrading grading))
        {
            grading = instancedProfile.AddSettings<ColorGrading>();
        }

        grading.enabled.Override(true);
        grading.temperature.Override(10f);
        grading.saturation.Override(20f);

        Debug.Log("Modified instanced profile for this volume only");
    }

    void ManageEffects()
    {
        PostProcessProfile profile = volume.profile;

        // Add multiple effects
        try
        {
            Vignette vignette = profile.AddSettings<Vignette>();
            vignette.enabled.Override(true);
            vignette.intensity.Override(0.4f);

            AmbientOcclusion ao = profile.AddSettings<AmbientOcclusion>();
            ao.enabled.Override(true);
            ao.intensity.Override(1f);

            Debug.Log("Added vignette and ambient occlusion");
        }
        catch (System.Exception e)
        {
            Debug.LogWarning($"Effect already exists: {e.Message}");
        }

        // Remove an effect
        if (profile.TryGetSettings(out Bloom bloom))
        {
            profile.RemoveSettings<Bloom>();
            Debug.Log("Removed bloom effect");
        }

        // Check whether an effect exists
        bool hasGrading = profile.HasSettings<ColorGrading>();
        Debug.Log($"Profile has color grading: {hasGrading}");
    }

    void CreateDynamicProfile()
    {
        // Create a profile from scratch
        PostProcessProfile newProfile = ScriptableObject.CreateInstance<PostProcessProfile>();

        // Add and configure effects
        Bloom bloom = newProfile.AddSettings<Bloom>();
        bloom.enabled.Override(true);
        bloom.intensity.Override(3f);
        bloom.threshold.Override(1f);

        Vignette vignette = newProfile.AddSettings<Vignette>();
        vignette.enabled.Override(true);
        vignette.intensity.Override(0.35f);

        // Assign to the volume
        volume.profile = newProfile;
        Debug.Log("Created and assigned new profile");
    }

    void OnDestroy()
    {
        // Clean up instanced profiles to prevent memory leaks
        if (volume.HasInstantiatedProfile())
        {
            RuntimeUtilities.DestroyProfile(volume.profile, true);
        }
    }
}
```

## Summary and Integration

The Unity Graphics repository serves as the foundation for modern rendering in Unity, providing a comprehensive suite of packages for creating high-quality visuals across diverse platforms. The Scriptable Render Pipeline architecture enables customization at every level, from modifying the built-in render pipelines to creating entirely new rendering solutions. The Post Processing Stack v2 offers production-ready image effects with intuitive volume-based blending, while Shader Graph and Visual Effect Graph provide visual authoring tools that eliminate the need for manual shader coding. These packages integrate seamlessly through Unity's package management system and are designed to work together, with URP targeting broad platform compatibility and HDRP focusing on cutting-edge graphics for high-end hardware. The repository's open-source nature and extensible architecture make it an essential resource for Unity developers seeking to push the boundaries of real-time rendering.
Integration patterns typically involve cloning the repository into a Unity project's Assets or Packages folder, checking out the appropriate version tag that matches the Unity Editor version, and then either using the pre-built pipelines as-is or extending them with custom effects and features. The post-processing system integrates through the PostProcessLayer component on cameras and PostProcessVolume components in the scene, with effects organized into profiles that can be shared across multiple volumes or instanced for per-volume customization. Custom rendering features can be injected at predefined points in the render pipeline, and the framework's use of ScriptableObjects and reflection enables new effects to work automatically with volume blending and serialization. For teams requiring maximum control, the repository provides complete source access to all rendering systems, allowing deep customization while maintaining compatibility with Unity's evolving graphics features and platform support.
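As a concrete sketch of the local-package workflow described above: after cloning the repository next to a Unity project and checking out the branch matching the Editor version, a package can be referenced by relative path from the project's `Packages/manifest.json`. The package names below follow Unity's naming conventions, but the exact folder layout and names are assumptions — verify them against the cloned repository before use:

```json
{
  "dependencies": {
    "com.unity.render-pipelines.core": "file:../Graphics/com.unity.render-pipelines.core",
    "com.unity.render-pipelines.universal": "file:../Graphics/com.unity.render-pipelines.universal"
  }
}
```

With `file:` references in place, edits to the cloned pipeline source are picked up by the Editor like any other local package.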