### Installing Media Blocks SDK .NET via NuGet - Bash
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/GettingStarted/index.md
This command installs the VisioForge.DotNet.MediaBlocks NuGet package into your .NET project, providing all necessary SDK components for multimedia processing.
```Bash
dotnet add package VisioForge.DotNet.MediaBlocks
```
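Equivalently, the package can be referenced directly in the project file. This is a hypothetical csproj fragment; the wildcard version is a placeholder — pin a specific version for reproducible builds:

```xml
<!-- Hypothetical .csproj fragment; replace "*" with a pinned version -->
<ItemGroup>
  <PackageReference Include="VisioForge.DotNet.MediaBlocks" Version="*" />
</ItemGroup>
```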
--------------------------------
### Setting Up Basic Audio Rendering Pipeline - C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/AudioRendering/index.md
This example demonstrates a fundamental audio rendering pipeline setup. It initializes a `MediaBlocksPipeline`, creates a `VirtualAudioSourceBlock` to generate audio, and connects its output to an `AudioRendererBlock` for playback, finally starting the pipeline asynchronously.
```csharp
var pipeline = new MediaBlocksPipeline();
var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings());
// Create audio renderer with default settings
var audioRenderer = new AudioRendererBlock();
pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input);
await pipeline.StartAsync();
```
--------------------------------
### Implementing IP Camera Preview in C# WinForms
Source: https://github.com/visioforge/help/blob/main/dotnet/videocapture/video-tutorials/ip-camera-preview.md
This complete C# WinForms example demonstrates how to initialize the VisioForge VideoCaptureCore, set up an IP camera source, configure audio settings, and control the video preview. It shows the basic structure for integrating real-time IP camera feeds into a .NET application, including starting and stopping the capture asynchronously. Dependencies include VisioForge.Core.VideoCapture and related types.
```C#
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;
using VisioForge.Core.VideoCapture;
using VisioForge.Core.Types;
using VisioForge.Core.Types.Output;
using VisioForge.Core.Types.VideoCapture;
namespace ip_camera_preview
{
    public partial class Form1 : Form
    {
        private VideoCaptureCore videoCapture1;
        public Form1()
        {
            InitializeComponent();
        }
        private void Form1_Load(object sender, EventArgs e)
        {
            videoCapture1 = new VideoCaptureCore(VideoView1 as IVideoView);
        }
        private async void btStart_Click(object sender, EventArgs e)
        {
            // Several engines are available. We'll use LAV as the most compatible.
            // For low-latency RTSP playback, use the RTSP Low Latency engine.
            videoCapture1.IP_Camera_Source = new IPCameraSourceSettings()
            {
                URL = new Uri("http://192.168.233.129:8000/camera/mjpeg"),
                Type = IPSourceEngine.Auto_LAV
            };
            videoCapture1.Audio_PlayAudio = videoCapture1.Audio_RecordAudio = false;
            videoCapture1.Mode = VideoCaptureMode.IPPreview;
            await videoCapture1.StartAsync();
        }
        private async void btStop_Click(object sender, EventArgs e)
        {
            await videoCapture1.StopAsync();
        }
    }
}
```
--------------------------------
### Installing .NET Workload for Linux Development (CLI)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Command-line instruction to install the necessary .NET workload for Linux development, ensuring all required components are available for building Linux applications.
```CLI
dotnet workload install linux
```
--------------------------------
### Complete SRT Streaming Setup with Encoder Fallback in VisioForge C#
Source: https://github.com/visioforge/help/blob/main/dotnet/general/network-streaming/srt.md
This comprehensive example demonstrates a full SRT streaming setup. It initializes an `SRTOutput` with a destination URI, configures video encoding with a hardware acceleration fallback (NVENC to OpenH264), sets bitrates for both options, adds the output to a `videoCapture` engine, and finally starts the streaming process. This snippet provides a complete working example for initiating an SRT stream.
```C#
// Create and configure SRT output
var srtOutput = new SRTOutput("srt://streaming-server:1234");
// Configure video encoding - try hardware acceleration with fallback
if (NVENCH264EncoderSettings.IsAvailable())
{
    var nvencSettings = new NVENCH264EncoderSettings();
    nvencSettings.Bitrate = 4000000; // 4 Mbps
    srtOutput.Video = nvencSettings;
}
else
{
    var softwareSettings = new OpenH264EncoderSettings();
    softwareSettings.Bitrate = 2000000; // 2 Mbps for software encoding
    srtOutput.Video = softwareSettings;
}
// Add to capture engine
videoCapture.Outputs_Add(srtOutput, true);
// Start streaming
videoCapture.Start();
```
--------------------------------
### Setting Up a Minimal VisioForge MediaBlocks Pipeline in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/GettingStarted/index.md
This snippet demonstrates the basic steps to create and manage a media processing pipeline using the VisioForge Media Blocks SDK. It initializes a pipeline, adds a virtual video source and a video renderer, connects them, starts the pipeline for processing, and then stops and disposes of it.
```csharp
using VisioForge.Core.MediaBlocks;
var pipeline = new MediaBlocksPipeline();
var source = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());
var renderer = new VideoRendererBlock(pipeline, videoViewControl);
pipeline.AddBlock(source);
pipeline.AddBlock(renderer);
pipeline.Connect(source.Output, renderer.Input);
await pipeline.StartAsync();
// ...
await pipeline.StopAsync();
pipeline.Dispose();
```
--------------------------------
### Initializing macOS Audio Capture Pipeline with VisioForge MediaBlocks (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This C# example illustrates how to configure an audio capture pipeline on macOS using MediaBlocksPipeline. It enumerates and selects the first available audio device, sets up OSXAudioSourceSettings, creates an OSXAudioSourceBlock and an AudioRendererBlock, connects them, and starts the pipeline for audio playback.
```C#
// create pipeline
var pipeline = new MediaBlocksPipeline();
// select the first available audio device
var devices = await DeviceEnumerator.Shared.AudioSourcesAsync();
var device = devices.Length > 0 ? devices[0] : null;
OSXAudioSourceSettings audioSourceSettings = null;
if (device != null)
{
    var formatItem = device.Formats[0];
    if (formatItem != null)
    {
        audioSourceSettings = new OSXAudioSourceSettings(device.DeviceID, formatItem);
    }
}
// create macOS audio source block
var audioSource = new OSXAudioSourceBlock(audioSourceSettings);
// create audio renderer block
var audioRenderer = new AudioRendererBlock();
// connect blocks
pipeline.Connect(audioSource.Output, audioRenderer.Input);
// start pipeline
await pipeline.StartAsync();
```
--------------------------------
### Starting Video Capture - Delphi
Source: https://github.com/visioforge/help/blob/main/delphi/videocapture/video-capture-wmv.md
This snippet initiates the video capture process. After all configurations are set, calling the `Start` method begins recording video according to the specified settings.
```pascal
VideoCapture1.Start;
```
--------------------------------
### Initializing VisioForge SDK X-Engines (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Demonstrates how to initialize the VisioForge SDK's cross-platform X-engines at application startup. This step is crucial before using any X-engine components and provides both synchronous and asynchronous initialization options.
```C#
// Initialize at application startup
VisioForge.Core.VisioForgeX.InitSDK();
// Or use the async version
await VisioForge.Core.VisioForgeX.InitSDKAsync();
```
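The SDK also exposes shutdown counterparts to be called at application exit, after all pipelines and engines have been stopped and disposed. A minimal sketch, assuming the `DestroySDK`/`DestroySDKAsync` methods that mirror the initialization API:

```csharp
// Call at application shutdown, after all SDK objects are disposed
VisioForge.Core.VisioForgeX.DestroySDK();
// Or use the async version
await VisioForge.Core.VisioForgeX.DestroySDKAsync();
```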
--------------------------------
### Installing VisioForge .NET Core NuGet Package (CMD)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
This command installs the core VisioForge .NET SDK NuGet package using the NuGet Package Manager Console. It provides fundamental multimedia capabilities for cross-platform development and CI/CD pipelines, serving as the base dependency for other UI-specific packages.
```CMD
Install-Package VisioForge.DotNet.Core
```
--------------------------------
### Installing GStreamer libcamera for Raspberry Pi - Bash
Source: https://github.com/visioforge/help/blob/main/dotnet/deployment-x/Ubuntu.md
This command installs the `gstreamer1.0-libcamera` package, which is specifically required for GStreamer functionality on Raspberry Pi devices. It provides integration with the `libcamera` framework for camera access.
```Bash
sudo apt install gstreamer1.0-libcamera
```
--------------------------------
### Complete Info.plist Configuration Example for iOS Permissions (XML)
Source: https://github.com/visioforge/help/blob/main/dotnet/deployment-x/iOS.md
This comprehensive XML snippet provides a full Info.plist file example, including standard iOS configuration keys like device family, supported orientations, and required device capabilities. Crucially, it demonstrates the declaration of NSCameraUsageDescription, NSMicrophoneUsageDescription, and NSPhotoLibraryUsageDescription for proper permission prompts.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>LSRequiresIPhoneOS</key>
    <true/>
    <key>UIDeviceFamily</key>
    <array>
        <integer>1</integer>
        <integer>2</integer>
    </array>
    <key>UIRequiredDeviceCapabilities</key>
    <array>
        <string>arm64</string>
    </array>
    <key>UISupportedInterfaceOrientations</key>
    <array>
        <string>UIInterfaceOrientationPortrait</string>
        <string>UIInterfaceOrientationLandscapeLeft</string>
        <string>UIInterfaceOrientationLandscapeRight</string>
    </array>
    <key>UISupportedInterfaceOrientations~ipad</key>
    <array>
        <string>UIInterfaceOrientationPortrait</string>
        <string>UIInterfaceOrientationPortraitUpsideDown</string>
        <string>UIInterfaceOrientationLandscapeLeft</string>
        <string>UIInterfaceOrientationLandscapeRight</string>
    </array>
    <key>XSAppIconAssets</key>
    <string>Assets.xcassets/appicon.appiconset</string>
    <key>NSCameraUsageDescription</key>
    <string>Camera access is required for video recording</string>
    <key>NSMicrophoneUsageDescription</key>
    <string>Microphone access is required for audio recording</string>
    <key>NSPhotoLibraryUsageDescription</key>
    <string>Photo library access is required to save videos</string>
</dict>
</plist>
```
--------------------------------
### Basic Intel QuickSync MJPEG Encoder Setup in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/general/video-encoders/mjpeg.md
This example demonstrates the basic setup for the Intel QuickSync MJPEG encoder. It shows how to initialize the encoder, set a custom quality value, verify hardware support, and integrate the encoder block into a VisioForge pipeline, with a fallback to the CPU encoder if QuickSync is not available.
```csharp
// Import required namespaces
using VisioForge.Core.Types.Output;
// Create QuickSync MJPEG encoder with default settings
var qsvEncoder = new QSVMJPEGEncoderSettings();
// Verify hardware support
if (QSVMJPEGEncoderSettings.IsAvailable())
{
    // Set custom quality value
    qsvEncoder.Quality = 90; // Higher quality setting
    // Create and add encoder block
    var encoderBlock = qsvEncoder.CreateBlock();
    pipeline.AddBlock(encoderBlock);
    // Continue pipeline setup
}
else
{
    // Fall back to CPU-based encoder
    Console.WriteLine("QuickSync hardware not detected. Falling back to CPU encoder.");
    var cpuEncoder = new MJPEGEncoderSettings();
    pipeline.AddBlock(cpuEncoder.CreateBlock());
}
```
--------------------------------
### Initializing VisioForge Media Blocks SDK - C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/GettingStarted/index.md
Initializes the VisioForge Media Blocks SDK at application startup. This step is crucial for preparing the SDK environment and ensuring all components are ready for use.
```C#
using VisioForge.Core;
// Initialize the SDK at application startup
VisioForgeX.InitSDK();
```
--------------------------------
### Installing Core GStreamer Packages - Bash
Source: https://github.com/visioforge/help/blob/main/dotnet/deployment-x/Ubuntu.md
This command installs a comprehensive set of GStreamer plugins and libraries essential for multimedia processing with VisioForge SDKs on Ubuntu. It includes base, good, and bad plugins, along with ALSA, OpenGL, PulseAudio, FFmpeg integration, and GStreamer Editing Services.
```Bash
sudo apt install gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-pulseaudio gstreamer1.0-libav libges-1.0-0
```
--------------------------------
### Installing Xcode Command Line Tools (Bash)
Source: https://github.com/visioforge/help/blob/main/dotnet/deployment-x/macOS.md
This snippet provides the command to install Xcode Command Line Tools, which are essential prerequisites for macOS development, including compiling and linking native components. Running this command in the Terminal initiates the installation process for these tools.
```bash
xcode-select --install
```
--------------------------------
### Starting and Stopping Media Blocks Pipeline - C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/GettingStarted/index.md
Asynchronously starts the media processing pipeline to begin data flow and processing, and later stops it to cease operations. This manages the active state of the pipeline.
```C#
// Start the pipeline asynchronously
await pipeline.StartAsync();
// ... later, stop processing
await pipeline.StopAsync();
```
--------------------------------
### Creating Media Blocks Pipeline and Blocks - C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/GettingStarted/index.md
Demonstrates how to instantiate a new MediaBlocksPipeline and create specific media blocks like VirtualVideoSourceBlock and VideoRendererBlock. Blocks are then added to the pipeline for processing.
```C#
using VisioForge.Core.MediaBlocks;
// Create a new pipeline instance
var pipeline = new MediaBlocksPipeline();
// Example: Create a virtual video source and a video renderer
var virtualSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // VideoView1 is your UI control
// Add blocks to the pipeline
pipeline.AddBlock(virtualSource);
pipeline.AddBlock(videoRenderer);
```
--------------------------------
### Complete Example: Configuring Hardware-Accelerated AVI Output (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/general/output-formats/avi.md
This comprehensive example demonstrates a full configuration for AVI output, including initializing `AVIOutput` with a filename, setting a hardware-accelerated NVIDIA H.264 video encoder, and configuring an AAC audio encoder. This snippet provides a practical illustration of combining various settings for high-quality AVI file generation.
```C#
var aviOutput = new AVIOutput("high_quality_output.avi");
aviOutput.Video = new NVENCH264EncoderSettings();
aviOutput.Audio = new VOAACEncoderSettings();
```
--------------------------------
### Starting Video Capture - VB6
Source: https://github.com/visioforge/help/blob/main/delphi/videocapture/video-capture-wmv.md
This snippet executes the `Start` method of the `VideoCapture1` object, which commences the video capture operation using the current component settings.
```vb
VideoCapture1.Start
```
--------------------------------
### Configuring Linux Target Framework (XML)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Sets the standard target framework for Linux applications, which typically uses the base framework without a platform-specific suffix.
```XML
net8.0
```
--------------------------------
### Installing VisioForge .NET UI-Specific NuGet Packages (CMD)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
These commands install UI-specific VisioForge .NET SDK NuGet packages for different target platforms using the NuGet Package Manager Console. Developers should choose the relevant package (MAUI, WinUI, or Avalonia) based on their application's UI framework requirements to integrate multimedia controls.
```CMD
Install-Package VisioForge.DotNet.Core.UI.MAUI
Install-Package VisioForge.DotNet.Core.UI.WinUI
Install-Package VisioForge.DotNet.Core.UI.Avalonia
```
--------------------------------
### Verifying GStreamer Installation Version - Bash
Source: https://github.com/visioforge/help/blob/main/dotnet/deployment-x/Ubuntu.md
This command checks and displays the installed version of GStreamer. It is used to confirm that GStreamer is correctly installed and meets the minimum version requirements (1.22.0+) for VisioForge SDKs.
```Bash
gst-inspect-1.0 --version
```
--------------------------------
### Initializing VisioForge .NET SDK X-Engine (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/init.md
This snippet demonstrates how to initialize the VisioForge .NET SDK for X-engines. This method must be called before using any SDK classes to ensure proper functionality. It sets up the necessary components for cross-platform operations.
```csharp
VisioForge.Core.VisioForgeX.InitSDK();
```
--------------------------------
### Demultiplexing Media File with UniversalDemuxBlock C# Example
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Demuxers/index.md
Provides a detailed C# example demonstrating the use of UniversalDemuxBlock within a MediaBlocksPipeline. It illustrates the steps to obtain MediaFileInfo, configure demuxer settings, instantiate the demux block, and connect its video, audio, and subtitle outputs to respective renderer blocks. The example emphasizes the need for a data source block and proper handling of stream outputs.
```csharp
var pipeline = new MediaBlocksPipeline();
// 1. Obtain MediaFileInfo for your media file
var mediaInfoReader = new MediaInfoReader(Context); // Assuming Context is your logging context
MediaFileInfo mediaInfo = await mediaInfoReader.GetInfoAsync("path/to/your/video.mkv");
if (mediaInfo == null)
{
    Console.WriteLine("Failed to get media info.");
    return;
}
// 2. Choose or create Demuxer Settings
// Example: Auto-detect demuxer type
IUniversalDemuxSettings demuxSettings = new UniversalDemuxSettings();
// Or, specify a type, e.g., for an MKV file:
// IUniversalDemuxSettings demuxSettings = new MKVDemuxSettings();
// Or, for MPEG-TS with specific program:
// var mpegTsSettings = new MPEGTSDemuxSettings { ProgramNumber = 1 };
// IUniversalDemuxSettings demuxSettings = mpegTsSettings;
// 3. Create UniversalDemuxBlock
var universalDemuxBlock = new UniversalDemuxBlock(
    demuxSettings,
    mediaInfo,
    renderVideo: true,    // Process video streams
    renderAudio: true,    // Process audio streams
    renderSubtitle: true  // Process subtitle streams
);
// 4. Connect a data source that provides the raw file stream to UniversalDemuxBlock's input.
// This step is crucial and depends on how you get the file data.
// For instance, using a FileSource configured to output raw data, or a StreamSourceBlock.
// Example with a hypothetical RawFileSourceBlock (not a standard block, for illustration):
// var rawFileSource = new RawFileSourceBlock("path/to/your/video.mkv");
// pipeline.Connect(rawFileSource.Output, universalDemuxBlock.Input);
// 5. Connect outputs
// Video outputs (MediaBlockPad[])
var videoOutputs = universalDemuxBlock.VideoOutputs;
if (videoOutputs.Length > 0)
{
    // Example: connect the first video stream
    var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1
    pipeline.Connect(videoOutputs[0], videoRenderer.Input);
}
// Audio outputs (MediaBlockPad[])
var audioOutputs = universalDemuxBlock.AudioOutputs;
if (audioOutputs.Length > 0)
{
    // Example: connect the first audio stream
    var audioRenderer = new AudioRendererBlock();
    pipeline.Connect(audioOutputs[0], audioRenderer.Input);
}
// Subtitle outputs (MediaBlockPad[])
var subtitleOutputs = universalDemuxBlock.SubtitleOutputs;
if (subtitleOutputs.Length > 0)
{
    // Example: connect the first subtitle stream to a conceptual handler
    // var subtitleHandler = new MySubtitleHandlerBlock();
    // pipeline.Connect(subtitleOutputs[0], subtitleHandler.Input);
}
// Metadata output (if renderMetadata was true and metadata stream exists)
var metadataOutputs = universalDemuxBlock.MetadataOutputs;
if (metadataOutputs.Length > 0 && metadataOutputs[0] != null)
{
    // Handle metadata stream
}
// Start pipeline after all connections are made
// await pipeline.StartAsync();
```
--------------------------------
### Initializing UniversalSourceBlock and Media Pipeline (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This C# code demonstrates how to set up a basic media playback pipeline using UniversalSourceBlock to load a video file, connect its video output to a VideoRendererBlock, and its audio output to an AudioRendererBlock, then start the pipeline.
```C#
var pipeline = new MediaBlocksPipeline();
var fileSource = new UniversalSourceBlock();
fileSource.Filename = "test.mp4";
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(fileSource.VideoOutput, videoRenderer.Input);
var audioRenderer = new AudioRendererBlock();
pipeline.Connect(fileSource.AudioOutput, audioRenderer.Input);
await pipeline.StartAsync();
```
--------------------------------
### Initializing VisioForge SDK on MainPage Loaded (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/videocapture/maui/camera-recording-maui.md
Handles the `MainPage_Loaded` event to initialize the VisioForge SDK. It requests necessary permissions (camera, microphone, photo), enumerates available video and audio devices, sets up event handlers, and optionally starts a preview on mobile platforms. This is the primary entry point for SDK setup.
```C#
private async void MainPage_Loaded(object sender, EventArgs e)
{
    // Ask for permissions
#if __ANDROID__ || __MACOS__ || __MACCATALYST__ || __IOS__
    await RequestCameraPermissionAsync();
    await RequestMicPermissionAsync();
#endif
#if __IOS__ && !__MACCATALYST__
    RequestPhotoPermission();
#endif
    // Get IVideoView interface
    IVideoView vv = videoView.GetVideoView();
    // Create core object with IVideoView interface
    _core = new VideoCaptureCoreX(vv);
    // Add event handlers
    _core.OnError += Core_OnError;
    // Enumerate cameras
    _cameras = await DeviceEnumerator.Shared.VideoSourcesAsync();
    if (_cameras.Length > 0)
    {
        btCamera.Text = _cameras[0].DisplayName;
    }
    // Enumerate microphones and other audio sources
    _mics = await DeviceEnumerator.Shared.AudioSourcesAsync(null);
    if (_mics.Length > 0)
    {
        btMic.Text = _mics[0].DisplayName;
    }
    // Enumerate audio outputs
    _speakers = await DeviceEnumerator.Shared.AudioOutputsAsync(null);
    if (_speakers.Length > 0)
    {
        btSpeakers.Text = _speakers[0].DisplayName;
    }
    // Add Destroying event handler
    Window.Destroying += Window_Destroying;
#if __ANDROID__ || (__IOS__ && !__MACCATALYST__)
    // Select second camera if available for mobile platforms
    if (_cameras.Length > 1)
    {
        btCamera.Text = _cameras[1].DisplayName;
        _cameraSelectedIndex = 1;
    }
    // Start preview
    btStartCapture.IsEnabled = true;
    await StartPreview();
#endif
}
```
--------------------------------
### Installing .NET Android Workload (CMD)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
This command installs the necessary Android development workload using the .NET CLI. It is a prerequisite for building and running .NET applications targeting Android, ensuring that all required tools and components for Android development are available in the environment.
```CMD
dotnet workload install android
```
--------------------------------
### Navigating to VisioForge Installation Directory (Command Prompt)
Source: https://github.com/visioforge/help/blob/main/delphi/mediaplayer/install/index.md
This command changes the current directory in an Administrator Command Prompt to the default installation path of the VisioForge Media Framework's redistributable files. This is a prerequisite step before manually registering the OCX control.
```cmd
cd "C:\Program Files (x86)\VisioForge\Media Framework\Redist\AnyCPU"
```
--------------------------------
### Starting Video Capture - C++ MFC
Source: https://github.com/visioforge/help/blob/main/delphi/videocapture/video-capture-wmv.md
This snippet calls the `Start` method on the `m_videoCapture` object to begin the video capture. This command activates the recording process based on the previously configured settings.
```cpp
m_videoCapture.Start();
```
--------------------------------
### Initializing and Starting VNC Stream with C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This C# snippet demonstrates how to set up a media pipeline to capture and display a VNC stream. It configures VNCSourceSettings with host, port, and optional password, then creates a VNCSourceBlock and connects its output to a VideoRendererBlock for display. Finally, the MediaBlocksPipeline is started asynchronously to begin streaming.
```C#
var pipeline = new MediaBlocksPipeline();
// Configure VNC source settings
var vncSettings = new VNCSourceSettings
{
    Host = "your-vnc-server-ip", // or use Uri
    Port = 5900, // Standard VNC port
    Password = "your-password", // if any
    // Width = 1920,  // Optional: desired width
    // Height = 1080, // Optional: desired height
};
var vncSource = new VNCSourceBlock(vncSettings);
// Create video renderer
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control
// Connect blocks
pipeline.Connect(vncSource.Output, videoRenderer.Input);
// Start pipeline
await pipeline.StartAsync();
```
--------------------------------
### Applying Output Settings and Starting Encoding Process in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/general/output-formats/ffmpeg-exe.md
This snippet finalizes the encoding setup and initiates the process. It applies the ffmpegOutput format settings, sets the operation mode (e.g., VideoCaptureMode.VideoCapture), defines the output filename, and then asynchronously starts the core processing using core.StartAsync().
```csharp
// Apply format settings
core.Output_Format = ffmpegOutput;
// Set operation mode
core.Mode = VideoCaptureMode.VideoCapture; // For Video Capture SDK
// core.Mode = VideoEditMode.Convert; // For Video Edit SDK
// Set output path
core.Output_Filename = "output.mp4";
// Begin processing
await core.StartAsync();
```
--------------------------------
### Configuring macOS Target Framework for Native Applications (XML)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Specifies the target framework for macOS native applications, utilizing the `net8.0-macos` framework to ensure compatibility and proper compilation.
```XML
net8.0-macos
```
--------------------------------
### Complete Camera Viewer Implementation with Media Blocks SDK in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/GettingStarted/camera.md
This comprehensive example demonstrates initializing, configuring, and managing a camera viewer using the Media Blocks SDK. It includes device enumeration, format selection, block creation, pipeline building, starting, stopping, and taking snapshots, showcasing a full application flow.
```C#
using System;
using System.Linq;
using System.Threading.Tasks;
using VisioForge.Core.MediaBlocks;
using VisioForge.Core.MediaBlocks.Sources;
using VisioForge.Core.MediaBlocks.VideoRendering;
using VisioForge.Core.Types.X.Sources;
public class CameraViewerExample
{
    private MediaBlocksPipeline _pipeline;
    private SystemVideoSourceBlock _videoSource;
    private VideoRendererBlock _videoRenderer;
    public async Task InitializeAsync(IVideoView videoView)
    {
        // Create pipeline
        _pipeline = new MediaBlocksPipeline();
        _pipeline.OnError += (s, e) => Console.WriteLine(e.Message);
        // Enumerate devices
        var devices = await DeviceEnumerator.Shared.VideoSourcesAsync();
        if (devices.Length == 0)
        {
            throw new Exception("No camera devices found");
        }
        // Select device and format
        var device = devices[0];
        var format = device.GetHDOrAnyVideoFormatAndFrameRate(out var frameRate);
        // Create settings
        var settings = new VideoCaptureDeviceSourceSettings(device);
        if (format != null)
        {
            settings.Format = format.ToFormat();
            if (frameRate != null && !frameRate.IsEmpty)
            {
                settings.Format.FrameRate = frameRate;
            }
        }
        // Create blocks
        _videoSource = new SystemVideoSourceBlock(settings);
        _videoRenderer = new VideoRendererBlock(_pipeline, videoView);
        // Build pipeline
        _pipeline.AddBlock(_videoSource);
        _pipeline.AddBlock(_videoRenderer);
        _pipeline.Connect(_videoSource.Output, _videoRenderer.Input);
        // Start pipeline
        await _pipeline.StartAsync();
    }
    public async Task StopAsync()
    {
        if (_pipeline != null)
        {
            await _pipeline.StopAsync();
            _pipeline.Dispose();
        }
    }
    public async Task<bool> TakeSnapshotAsync(string filename)
    {
        return await _videoRenderer.Snapshot_SaveAsync(filename,
            SkiaSharp.SKEncodedImageFormat.Jpeg, 90);
    }
}
```
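A hypothetical call sequence for the class above from UI code (assuming `videoView` is an already-created video view control implementing `IVideoView`):

```csharp
// Sketch only: error handling omitted; the variable names are illustrative
var viewer = new CameraViewerExample();
await viewer.InitializeAsync(videoView);        // starts the live preview
await viewer.TakeSnapshotAsync("snapshot.jpg"); // saves a JPEG of the current frame
await viewer.StopAsync();                       // stops and disposes the pipeline
```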
--------------------------------
### Initializing Spinnaker Camera Source in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This snippet demonstrates how to initialize and connect a Spinnaker camera source to a video renderer within a `MediaBlocksPipeline`. It enumerates available Spinnaker cameras, selects the first one, configures its settings (name, ROI, frame rate), creates a `SpinnakerSourceBlock`, connects it to a `VideoRendererBlock`, and starts the pipeline. This functionality requires the Spinnaker SDK to be installed.
```csharp
var pipeline = new MediaBlocksPipeline();
var sources = await DeviceEnumerator.Shared.SpinnakerSourcesAsync();
var sourceSettings = new SpinnakerSourceSettings(sources[0].Name, new VisioForge.Core.Types.Rect(0, 0, 1280, 720), new VideoFrameRate(10));
var source = new SpinnakerSourceBlock(sourceSettings);
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(source.Output, videoRenderer.Input);
await pipeline.StartAsync();
```
--------------------------------
### Configuring and Rendering Push Audio Source in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This C# code demonstrates how to initialize a `PushSourceBlock` for audio, configure it with `PushAudioSourceSettings` (specifying live status, sample rate, channels, and format), connect its output to an `AudioRendererBlock`, and start the pipeline. It also illustrates how to push raw audio sample data using `PushFrame`.
```C#
var pipeline = new MediaBlocksPipeline();
// Configure push audio source
var audioPushSettings = new PushAudioSourceSettings(
    isLive: true,
    sampleRate: 44100,
    channels: 2,
    format: AudioFormatX.S16LE);
var audioPushSource = new PushSourceBlock(audioPushSettings);
// Example: Render the pushed audio
var audioRenderer = new AudioRendererBlock();
pipeline.Connect(audioPushSource.Output, audioRenderer.Input);
// Start pipeline
await pipeline.StartAsync();
// In a separate thread or task, push audio samples:
// byte[] audioData = ... ; // Your raw PCM S16LE audio data
// audioPushSource.PushFrame(audioData);
// Call PushFrame repeatedly for new audio data.
```
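To make the producer side concrete, the following sketch fills a buffer with 100 ms of a 440 Hz sine tone in the S16LE stereo layout configured above and pushes it via the `PushFrame` call shown in the snippet. The buffer math is plain C#; the exact `PushFrame` signature is taken from the commented usage above:

```csharp
// Generate 100 ms of a 440 Hz tone as interleaved stereo S16LE PCM
const int sampleRate = 44100, channels = 2, toneHz = 440;
const short amplitude = 16000;
int samples = sampleRate / 10; // 100 ms
var buffer = new byte[samples * channels * sizeof(short)];
for (int i = 0; i < samples; i++)
{
    short sample = (short)(amplitude * Math.Sin(2 * Math.PI * toneHz * i / sampleRate));
    for (int ch = 0; ch < channels; ch++)
    {
        int offset = (i * channels + ch) * sizeof(short);
        buffer[offset] = (byte)(sample & 0xFF);            // little-endian low byte
        buffer[offset + 1] = (byte)((sample >> 8) & 0xFF); // high byte
    }
}
// Push to the running pipeline; repeat with fresh buffers for continuous audio
audioPushSource.PushFrame(buffer);
```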
--------------------------------
### Applying AVI Output Settings and Starting Video Capture in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/general/output-formats/avi.md
This snippet demonstrates the final steps to initiate video capture with the configured AVI output. It assigns the `aviOutput` object to `VideoCapture1.Output_Format`, sets the capture mode to `VideoCapture`, specifies the output filename, and then asynchronously starts the capture process.
```C#
// Set output format
VideoCapture1.Output_Format = aviOutput;
// Set capture mode
VideoCapture1.Mode = VideoCaptureMode.VideoCapture;
// Set output file path
VideoCapture1.Output_Filename = "output.avi";
// Start capture
await VideoCapture1.StartAsync();
```
--------------------------------
### Basic MPEG-TS Output Configuration with VideoCaptureCore (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/general/output-formats/mpegts.md
This example illustrates the basic setup of `VideoCaptureCore` to produce an MPEG-TS output. It shows how to create an `MPEGTSOutput` instance, configure its video encoder (H.264, bitrate, rate control) and audio encoder (AAC, bitrate, version), and assign it to the core's output format.
```C#
// Create VideoCaptureCore instance
var core = new VideoCaptureCore();
// Set output filename
core.Output_Filename = "output.ts";
// Create MPEG-TS output
var mpegtsOutput = new MPEGTSOutput();
// Configure video settings
mpegtsOutput.Video.Codec = MFVideoEncoder.MS_H264;
mpegtsOutput.Video.AvgBitrate = 2000; // 2 Mbps
mpegtsOutput.Video.RateControl = MFCommonRateControlMode.CBR;
// Configure audio settings
mpegtsOutput.Audio.Bitrate = 128; // 128 kbps
mpegtsOutput.Audio.Version = AACVersion.MPEG4;
core.Output_Format = mpegtsOutput;
```
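With these settings the combined elementary-stream bitrate is 2000 + 128 = 2128 kbps, which gives a rough feel for output size; MPEG-TS muxing adds a few percent of container overhead on top. A quick back-of-the-envelope calculation:

```csharp
using System;

// Rough file-size estimate for the bitrates configured above (mux overhead ignored)
int videoKbps = 2000;
int audioKbps = 128;
double mbPerMinute = (videoKbps + audioKbps) * 60 / 8.0 / 1000.0;
Console.WriteLine($"~{mbPerMinute:F1} MB per minute of recording"); // ~16.0 MB per minute
```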
--------------------------------
### Configuring macOS Target Framework for .NET MAUI Applications (XML)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Defines the target framework for .NET MAUI applications when specifically targeting macOS, using the `net8.0-maccatalyst` framework for Mac Catalyst compatibility.
```XML
<TargetFramework>net8.0-maccatalyst</TargetFramework>
```
--------------------------------
### Integrating VideoView in Windows Forms (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Shows how to add and use the `VideoView` control within a Windows Forms application to display media content. This snippet includes the necessary `using` directive and the instantiation of the control.
```C#
// Add reference to VisioForge.DotNet.Core
using VisioForge.Core.UI.WinForms;
// In your form
videoView = new VideoView();
this.Controls.Add(videoView);
```
--------------------------------
### Capturing and Rendering System Audio Pipeline - C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/AudioRendering/index.md
This snippet provides a practical example of capturing system audio and rendering it. It sets up a `MediaBlocksPipeline`, uses a `SystemAudioSourceBlock` to capture audio, configures an `AudioRendererBlock` with a specific volume, connects the source to the renderer, and manages the pipeline's lifecycle: starting it, letting audio play for ten seconds, and stopping it.
```csharp
var pipeline = new MediaBlocksPipeline();
// Capture system audio
var systemAudioSource = new SystemAudioSourceBlock();
// Configure the audio renderer
var audioRenderer = new AudioRendererBlock();
audioRenderer.Volume = 0.8f; // 80% volume
// Connect blocks
pipeline.Connect(systemAudioSource.Output, audioRenderer.Input);
// Start processing
await pipeline.StartAsync();
// Allow audio to play for 10 seconds
await Task.Delay(TimeSpan.FromSeconds(10));
// Stop the pipeline
await pipeline.StopAsync();
```
--------------------------------
### Starting Audio Capture Process - C++ MFC
Source: https://github.com/visioforge/help/blob/main/delphi/videocapture/audio-capture-wav.md
This C++ MFC snippet starts the audio capture process by invoking the Start() method on m_VideoCapture. This command triggers the initialization of audio hardware and begins capturing audio based on the previously configured settings.
```cpp
// Begin audio capture process
m_VideoCapture.Start();
```
--------------------------------
### Integrating VideoView in WPF Applications (XAML)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Demonstrates how to declare the `VideoView` control in WPF XAML for displaying media content. This requires adding a reference to the `VisioForge.DotNet.Core.UI.WPF` package.
```XAML
<!-- Illustrative declaration; the namespace mapping is an assumption and may differ
     between SDK versions. On the Window element, add:
     xmlns:wpf="clr-namespace:VisioForge.Core.UI.WPF;assembly=VisioForge.Core.UI.WPF" -->
<wpf:VideoView x:Name="videoView" />
```
--------------------------------
### Handling Media Blocks Pipeline Events - C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/GettingStarted/index.md
Subscribes to key events of the MediaBlocksPipeline such as OnError, OnStart, and OnStop to implement custom logic for error handling, pipeline state changes, and monitoring.
```C#
pipeline.OnError += (sender, args) =>
{
    Console.WriteLine($"Pipeline error: {args.Message}");
    // Implement your error handling logic here
};

pipeline.OnStart += (sender, args) =>
{
    Console.WriteLine("Pipeline started");
};

pipeline.OnStop += (sender, args) =>
{
    Console.WriteLine("Pipeline stopped");
};
```
--------------------------------
### Creating Audio Sink Pipeline with Bridge Audio Source (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Bridge/index.md
This C# snippet illustrates the setup of a sink media pipeline using `MediaBlocksPipeline`. It creates a `BridgeAudioSourceBlock` to receive audio from a source pipeline and an `AudioRendererBlock` to play the received audio. The blocks are connected, and the pipeline is started, enabling the rendering of bridged audio.
```C#
// create sink pipeline
var sinkPipeline = new MediaBlocksPipeline();
// create bridge audio source and audio renderer
var bridgeAudioSource = new BridgeAudioSourceBlock(new BridgeAudioSourceSettings());
var audioRenderer = new AudioRendererBlock();
// connect source and sink
sinkPipeline.Connect(bridgeAudioSource.Output, audioRenderer.Input);
// start pipeline
await sinkPipeline.StartAsync();
```
--------------------------------
### Starting Audio Capture Process - VB6
Source: https://github.com/visioforge/help/blob/main/delphi/videocapture/audio-capture-wav.md
This VB6 snippet initiates the audio capture process by calling the Start method on VideoCapture1. This command activates the audio hardware and commences capturing audio to the specified output file using the applied settings.
```vb
' Begin audio capture process
VideoCapture1.Start
```
--------------------------------
### Starting Audio Capture Process - Delphi
Source: https://github.com/visioforge/help/blob/main/delphi/videocapture/audio-capture-wav.md
This Delphi snippet initiates the audio capture process by calling the Start method on VideoCapture1. This action initializes the audio hardware, applies all configured settings and codecs, and begins capturing audio to the designated output file.
```pascal
// Begin audio capture process
VideoCapture1.Start;
```
--------------------------------
### Configuring and Rendering Push Video Source in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This C# code demonstrates how to initialize a `PushSourceBlock` for video, configure it with `PushVideoSourceSettings` (specifying width, height, frame rate, and format), connect its output to a `VideoRendererBlock`, and start the pipeline. It also illustrates how to push raw video frame data using `PushFrame`.
```C#
var pipeline = new MediaBlocksPipeline();
// Configure push video source
var videoPushSettings = new PushVideoSourceSettings(
    width: 640,
    height: 480,
    frameRate: new VideoFrameRate(30),
    format: VideoFormatX.RGB);
// videoPushSettings.IsLive = true; // Default
var videoPushSource = new PushSourceBlock(videoPushSettings);
// Example: Render the pushed video
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoPushSource.Output, videoRenderer.Input);
// Start pipeline
await pipeline.StartAsync();
// In a separate thread or task, push video frames:
// byte[] frameData = ... ; // Your raw RGB frame data (640 * 480 * 3 bytes)
// videoPushSource.PushFrame(frameData);
// Call PushFrame repeatedly for each new video frame.
```
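For the `VideoFormatX.RGB` format configured above, each pushed frame is a tightly packed `width * height * 3`-byte buffer (640 × 480 × 3 = 921,600 bytes). A BCL-only sketch that builds a valid solid-color frame of the right size (the `SolidRgbFrame` helper is illustrative, not part of the SDK):

```csharp
using System;

// Build a solid-color 24-bit RGB frame: 3 bytes (R, G, B) per pixel, row-major
static byte[] SolidRgbFrame(int width, int height, byte r, byte g, byte b)
{
    var frame = new byte[width * height * 3];
    for (int i = 0; i < frame.Length; i += 3)
    {
        frame[i] = r;
        frame[i + 1] = g;
        frame[i + 2] = b;
    }
    return frame;
}

// A red 640x480 frame, sized to match the settings above
byte[] frame = SolidRgbFrame(640, 480, 255, 0, 0);
Console.WriteLine(frame.Length); // 921600 = 640 * 480 * 3
```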
--------------------------------
### Initializing and Connecting Media Blocks Pipeline in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This C# code snippet demonstrates how to set up a `MediaBlocksPipeline` and integrate `VirtualAudioSourceBlock` and `VirtualVideoSourceBlock` with their respective renderers. It shows the instantiation of source and renderer blocks, followed by connecting their outputs to inputs within the pipeline, and finally starting the pipeline asynchronously.
```csharp
var pipeline = new MediaBlocksPipeline();
var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings());
var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoSourceBlock.Output, videoRenderer.Input);
var audioRenderer = new AudioRendererBlock();
pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input);
await pipeline.StartAsync();
```
--------------------------------
### Creating Video Sink Pipeline with Bridge Video Source (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Bridge/index.md
This C# snippet illustrates the setup of a sink media pipeline for video using `MediaBlocksPipeline`. It creates a `BridgeVideoSourceBlock` to receive video from a source pipeline and a `VideoRendererBlock` to display the received video. The blocks are connected, and the pipeline is started, enabling the rendering of bridged video.
```C#
// create sink pipeline
var sinkPipeline = new MediaBlocksPipeline();
// create bridge video source and video renderer
var bridgeVideoSource = new BridgeVideoSourceBlock(new BridgeVideoSourceSettings());
var videoRenderer = new VideoRendererBlock(sinkPipeline, VideoView1);
// connect source and sink
sinkPipeline.Connect(bridgeVideoSource.Output, videoRenderer.Input);
// start pipeline
await sinkPipeline.StartAsync();
```
--------------------------------
### Cleaning Up VisioForge SDK Resources (C#)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Illustrates how to properly release resources used by the VisioForge SDK's X-engines when the application exits. This includes both synchronous and asynchronous cleanup methods to prevent memory leaks and ensure stable behavior.
```C#
// Clean up at application exit
VisioForge.Core.VisioForgeX.DestroySDK();
// Or use the async version
await VisioForge.Core.VisioForgeX.DestroySDKAsync();
```
--------------------------------
### Integrating VideoView in .NET MAUI Applications (XAML)
Source: https://github.com/visioforge/help/blob/main/dotnet/install/index.md
Illustrates how to declare the `VideoView` control in .NET MAUI XAML for displaying media content. This requires adding a reference to the `VisioForge.DotNet.Core.UI.MAUI` package.
```XAML
<!-- Illustrative declaration; the namespace mapping is an assumption and may differ
     between SDK versions. On the ContentPage element, add:
     xmlns:vf="clr-namespace:VisioForge.Core.UI.MAUI;assembly=VisioForge.Core.UI.MAUI" -->
<vf:VideoView x:Name="videoView" />
```
--------------------------------
### Configuring and Using CVMotionCellsBlock in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/OpenCV/index.md
This C# example demonstrates how to initialize and configure the `CVMotionCellsBlock` for motion detection within a `MediaBlocksPipeline`. It sets up motion detection parameters like grid size, sensitivity, and threshold, and subscribes to the `MotionDetected` event to log motion start and end, then connects and starts the pipeline.
```csharp
var pipeline = new MediaBlocksPipeline();
// Assuming SystemVideoSourceBlock is already created and configured as 'videoSource'
var motionCellsSettings = new CVMotionCellsSettings
{
    GridSize = new VisioForge.Core.Types.Size(8, 6), // Example: 8x6 grid; default is new Size(10, 10)
    Sensitivity = 0.75, // Example value; C# default is 0.5
    Threshold = 0.05, // Fraction of moved cells; C# default is 0.01
    Display = true, // Default is true
    CellsColor = SKColors.Aqua, // Example color; default is SKColors.Red
    PostNoMotion = TimeSpan.FromSeconds(5) // Post no_motion after 5 s of inactivity; default is TimeSpan.Zero
};
var motionCellsBlock = new CVMotionCellsBlock(motionCellsSettings);
motionCellsBlock.MotionDetected += (s, e) =>
{
    if (e.IsMotion)
    {
        Console.WriteLine($"Motion DETECTED at {e.CurrentTime}. Cells: {e.Cells}. Started: {e.StartedTime}");
    }
    else
    {
        Console.WriteLine($"Motion FINISHED or NO MOTION at {e.CurrentTime}. Finished: {e.FinishedTime}");
    }
};
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1
// Connect blocks
pipeline.Connect(videoSource.Output, motionCellsBlock.Input0);
pipeline.Connect(motionCellsBlock.Output, videoRenderer.Input);
// Start pipeline
await pipeline.StartAsync();
```
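One way to read the `Threshold` setting: it is the fraction of grid cells that must report movement before motion is considered detected. For the 8 × 6 grid above, that cutoff works out as follows (interpreting the fraction as rounding up to whole cells; verify against the SDK's actual semantics):

```csharp
using System;

// Minimum number of simultaneously moving cells implied by the settings above
int totalCells = 8 * 6;  // GridSize = 8 x 6
double threshold = 0.05; // fraction of moved cells
int minMovingCells = (int)Math.Ceiling(totalCells * threshold);
Console.WriteLine(minMovingCells); // 3
```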
--------------------------------
### Configuring Allied Vision Camera Source in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Sources/index.md
This code illustrates how to integrate an Allied Vision camera into a media pipeline. It enumerates the available cameras, selects the first one, reads its information, creates `AlliedVisionSourceSettings` (optionally with a specific ROI), configures exposure and gain, then initializes an `AlliedVisionSourceBlock`, connects it to a `VideoRendererBlock`, and starts the pipeline. This functionality requires the Allied Vision Vimba SDK to be installed.
```csharp
var pipeline = new MediaBlocksPipeline();
// Enumerate Allied Vision cameras
var alliedVisionCameras = await DeviceEnumerator.Shared.AlliedVisionSourcesAsync();
if (alliedVisionCameras.Count == 0)
{
    Console.WriteLine("No Allied Vision cameras found.");
    return;
}
var cameraInfo = alliedVisionCameras[0]; // Select the first camera
// Create Allied Vision source settings.
// Width, height, x, and y are optional; set them only to capture a specific ROI.
// If null, the source may fall back to the default/full sensor resolution.
cameraInfo.ReadInfo(); // Ensure camera info such as Width/Height is populated
var alliedVisionSettings = new AlliedVisionSourceSettings(
    cameraInfo,
    width: cameraInfo.Width,  // or a specific ROI width
    height: cameraInfo.Height // or a specific ROI height
);
// Optionally configure other settings
alliedVisionSettings.ExposureAuto = VmbSrcExposureAutoModes.Continuous;
alliedVisionSettings.Gain = 10; // Example gain value
var alliedVisionSource = new AlliedVisionSourceBlock(alliedVisionSettings);
// Create video renderer
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control
// Connect blocks
pipeline.Connect(alliedVisionSource.Output, videoRenderer.Input);
// Start pipeline
await pipeline.StartAsync();
```
--------------------------------
### Listing Available Camera Devices on Linux
Source: https://github.com/visioforge/help/blob/main/dotnet/deployment-x/Ubuntu.md
This command uses `v4l2-ctl` to list all available video capture devices (cameras) on a Linux system. It helps in identifying connected cameras and their device paths.
```bash
v4l2-ctl --list-devices
```
--------------------------------
### Initializing MP3Output Instance in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/general/audio-encoders/mp3.md
This snippet demonstrates the basic initialization of an `MP3Output` instance. To create a new `MP3Output` object, you must provide the desired output filename as a constructor argument.
```csharp
// Initialize MP3 output with destination filename
var mp3Output = new MP3Output("output.mp3");
```
--------------------------------
### Creating and Connecting WMV Output Block in C#
Source: https://github.com/visioforge/help/blob/main/dotnet/mediablocks/Outputs/index.md
Demonstrates how to initialize a `MediaBlocksPipeline`, create virtual video and audio sources, and integrate a `WMVOutputBlock`. It shows both default and custom settings for the WMV output, and how to connect the sources to the output block's input pads before starting the pipeline.
```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();
// create video source (example: virtual source)
var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());
// create audio source (example: virtual source)
var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings());
// create WMV output block with default settings
var wmvOutput = new WMVOutputBlock("output.wmv");
// Or, with custom settings:
// var asfSinkSettings = new ASFSinkSettings("output.wmv");
// var wmvEncSettings = WMVEncoderBlock.GetDefaultSettings();
// wmvEncSettings.Bitrate = 3000000; // Example: 3 Mbps
// var wmaEncSettings = WMAEncoderBlock.GetDefaultSettings();
// wmaEncSettings.Bitrate = 160000; // Example: 160 Kbps
// var wmvOutput = new WMVOutputBlock(asfSinkSettings, wmvEncSettings, wmaEncSettings);
// Create inputs for the WMV output block
var videoInputPad = wmvOutput.CreateNewInput(MediaBlockPadMediaType.Video);
var audioInputPad = wmvOutput.CreateNewInput(MediaBlockPadMediaType.Audio);
// connect video path
pipeline.Connect(videoSource.Output, videoInputPad);
// connect audio path
pipeline.Connect(audioSource.Output, audioInputPad);
// start pipeline
await pipeline.StartAsync();
// ... later, to stop ...
// await pipeline.StopAsync();
```