
Pipeline

Simplygon pipelines combine common processing patterns into easy-to-use, predefined API entry points, so most normal workflows can be covered without setting up each individual processing object manually. Pipelines also support distributed execution: the pipeline definition can be serialized to a file and executed on a remote node using the pipeline batch executable.

  • Reduction pipeline Geometries are reduced using the reduction processor and optionally new materials are baked using material casters.

  • Quad reduction pipeline Geometries made out of quads are reduced using the quad processor and optionally new materials are baked using material casters.

  • Remeshing pipeline Geometries are replaced using the remeshing processor and optionally new materials are baked using material casters.

  • Aggregation pipeline Geometries are combined using the aggregation processor and optionally new materials are baked using material casters.

  • Billboard cloud pipeline Geometries are replaced with a cloud of billboards fitting the exterior hull. New materials are baked using material casters.

  • Billboard cloud vegetation pipeline Geometries are replaced with a cloud of billboards keeping the visual perception of depth and volume. New materials are baked using material casters.

  • Flipbook pipeline Geometries are replaced with a single quad that always rotates around the up-vector to face the viewer. New materials are baked using material casters from all specified directions.

  • Impostor from single view pipeline Geometries are replaced with a single quad seen from a specified view direction. New materials are baked using material casters from that direction.

The processors and material casters are configured through the settings they expose.

Pipeline settings

Pipelines are configured using the settings exposed by the processors and casters in the corresponding pipeline. The settings are either accessed directly through the API objects, or indirectly through paths.

Settings objects

Settings can be manipulated using the API object directly:

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();

// Access the settings object for the reduction processor
spReductionSettings reductionSettings = reductionPipeline->GetReductionSettings();
// Set the value for a setting in the settings object
reductionSettings->SetUseHighQualityNormalCalculation( true );
// Get the value for a setting in the settings object
if( reductionSettings->GetUseHighQualityNormalCalculation() )
{
    // ...
}
csharp
spReductionPipeline reductionPipeline = sg.CreateReductionPipeline();

// Access the settings object for the reduction processor
spReductionSettings reductionSettings = reductionPipeline.GetReductionSettings();
// Set the value for a setting in the settings object
reductionSettings.SetUseHighQualityNormalCalculation( true );
// Get the value for a setting in the settings object
if( reductionSettings.GetUseHighQualityNormalCalculation() )
{
    // ...
}
python
reductionPipeline = sg.CreateReductionPipeline()

#  Access the settings object for the reduction processor
reductionSettings = reductionPipeline.GetReductionSettings()

#  Set the value for a setting in the settings object
reductionSettings.SetUseHighQualityNormalCalculation( True )

#  Get the value for a setting in the settings object
if reductionSettings.GetUseHighQualityNormalCalculation():
    #  ...
    pass

Settings paths

Paths can be used to set and get parameters directly on the pipeline object. Paths have the form {processor}/{settings_object}/{setting_name}, where

  • {processor} is the name of the processor (RemeshingProcessor, ReductionProcessor, AggregationProcessor)
  • {settings_object} is the name of the settings object to manipulate (such as Reduction, Repair or MappingImage, matching the Get{settings_object}Settings() call on the processor)
  • {setting_name} is the actual setting name.

The parameter type must match the setting type exposed in the settings object, so a Set{setting_name}( unsigned int ) function on the settings object taking an unsigned integer must be accessed with a SetUIntParameter( "{processor}/{settings_object}/{setting_name}", ... ) call. Using a Set/Get function of the wrong type is an error. Example:

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();

// Set the value for a setting using the path
reductionPipeline->SetBoolParameter( "ReductionProcessor/Reduction/UseHighQualityNormalCalculation", true );
// Get the value for a setting using the path
if( reductionPipeline->GetBoolParameter( "ReductionProcessor/Reduction/UseHighQualityNormalCalculation" ) )
{
    // ...
}
csharp
spReductionPipeline reductionPipeline = sg.CreateReductionPipeline();

// Set the value for a setting using the path
reductionPipeline.SetBoolParameter( "ReductionProcessor/Reduction/UseHighQualityNormalCalculation", true );
// Get the value for a setting using the path
if( reductionPipeline.GetBoolParameter( "ReductionProcessor/Reduction/UseHighQualityNormalCalculation" ) )
{
    // ...
}
python
reductionPipeline = sg.CreateReductionPipeline()
# Set the value for a setting using the path
reductionPipeline.SetBoolParameter( "ReductionProcessor/Reduction/UseHighQualityNormalCalculation", True )
# Get the value for a setting using the path
if reductionPipeline.GetBoolParameter( "ReductionProcessor/Reduction/UseHighQualityNormalCalculation" ):
    #  ...
    pass
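
For non-boolean settings the same pattern applies with the matching typed accessor. A minimal sketch, assuming the reduction settings expose a triangle count target as an unsigned integer setting and that a matching GetUIntParameter exists (the exact setting name may differ between releases; check the settings reference):

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();

// Set an unsigned integer setting through the UInt accessor (setting name is illustrative)
reductionPipeline->SetUIntParameter( "ReductionProcessor/Reduction/ReductionTargetTriangleCount", 500 );
// Read it back with the matching typed Get function
unsigned int triangleCount = reductionPipeline->GetUIntParameter( "ReductionProcessor/Reduction/ReductionTargetTriangleCount" );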

The path access is mostly intended for use in languages and environments where string manipulation is the canonical way of working. In native C++ we recommend using the settings API objects directly.

Generic pipeline settings

All pipelines share a generic settings object for settings common to every pipeline. It is accessed through GetPipelineSettings() on the pipeline object, or with the path Pipeline/{setting_name} in the path API.
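
For example, the texture output path mentioned in the input and output section below lives in the generic pipeline settings. A minimal sketch, assuming the settings object type is spPipelineSettings, that the setting follows the usual Set{setting_name} naming, and that a string-typed path accessor (SetStringParameter) is available:

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();

// Access the shared settings object
spPipelineSettings pipelineSettings = reductionPipeline->GetPipelineSettings();
pipelineSettings->SetTextureOutputPath( "output/textures" );

// The same setting through the path API
reductionPipeline->SetStringParameter( "Pipeline/TextureOutputPath", "output/textures" );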

Global settings

The global settings for the Simplygon API can also be accessed through the path API, using the path Global/{setting_name}.
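
The access pattern mirrors the pipeline paths; only the Global/ prefix differs. A minimal sketch with a hypothetical setting name (substitute a setting from the global settings reference):

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();

// "ExampleGlobalSetting" is a hypothetical placeholder, not a documented setting name
bool globalValue = reductionPipeline->GetBoolParameter( "Global/ExampleGlobalSetting" );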

Input and output

Pipelines take scenes as input and return processed scenes as output. Input and output can either be scene API objects, or paths to files that can be read as scenes. Currently, the supported file formats are limited to the internal scene file format (.sg) and Wavefront OBJ files. When material casting is used, the resulting output textures are saved as PNG files in the directory set by TextureOutputPath in the generic pipeline settings object. A relative path is treated as relative to the current working directory.

Example using scene API objects

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();
spScene scene = <your code to create/setup scene>;
// The scene passed in is modified by the pipeline and geometries and materials are modified and/or replaced
reductionPipeline->RunScene( scene, EPipelineRunMode::RunInThisProcess );
csharp
spReductionPipeline reductionPipeline = sg.CreateReductionPipeline();
spScene scene = <your code to create/setup scene>;
// The scene passed in is modified by the pipeline and geometries and materials are modified and/or replaced
reductionPipeline.RunScene( scene, EPipelineRunMode.RunInThisProcess );
python
reductionPipeline = sg.CreateReductionPipeline()
scene = <your code to create/setup scene>
# The scene passed in is modified by the pipeline and geometries and materials are modified and/or replaced
reductionPipeline.RunScene( scene, EPipelineRunMode_RunInThisProcess )

Example using file paths

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();
// The scene file passed in is NOT modified by the pipeline, the modified scene is written to the output path
reductionPipeline->RunSceneFromFile( "path/to/input/scene.obj", "path/to/output/scene.obj", EPipelineRunMode::RunInThisProcess );
// Internal scene file format
reductionPipeline->RunSceneFromFile( "path/to/input/scene.sg", "path/to/output/scene.sg", EPipelineRunMode::RunInThisProcess );
csharp
spReductionPipeline reductionPipeline = sg.CreateReductionPipeline();
// The scene file passed in is NOT modified by the pipeline, the modified scene is written to the output path
reductionPipeline.RunSceneFromFile( "path/to/input/scene.obj", "path/to/output/scene.obj", EPipelineRunMode.RunInThisProcess );
// Internal scene file format
reductionPipeline.RunSceneFromFile( "path/to/input/scene.sg", "path/to/output/scene.sg", EPipelineRunMode.RunInThisProcess );
python
reductionPipeline = sg.CreateReductionPipeline()
# The scene file passed in is NOT modified by the pipeline, the modified scene is written to the output path
reductionPipeline.RunSceneFromFile( "path/to/input/scene.obj", "path/to/output/scene.obj", EPipelineRunMode_RunInThisProcess )
# Internal scene file format
reductionPipeline.RunSceneFromFile( "path/to/input/scene.sg", "path/to/output/scene.sg", EPipelineRunMode_RunInThisProcess )

Material casters

Pipelines allow material casting by attaching any number of material casters to the pipeline. The material casters execute after the processor in the pipeline and bake the configured texture channels. Materials on the scene geometry are replaced by new baked materials using the cast textures. Any number of output materials can be baked, and the material index given to AddMaterialCaster determines which output material the caster will bake its texture into.

Material casting requires the pipeline to generate a mapping image. This is enabled with SetGenerateMappingImage( true ) on the pipeline mapping image settings, as shown in the sketch below.
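
A minimal sketch of enabling the mapping image before adding casters, assuming the pipeline exposes the mapping image settings as GetMappingImageSettings() returning an spMappingImageSettings object (following the Get{settings_object}Settings() convention) and that the corresponding path setting is named GenerateMappingImage:

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();

// Enable mapping image generation so material casters can bake textures
spMappingImageSettings mappingImageSettings = reductionPipeline->GetMappingImageSettings();
mappingImageSettings->SetGenerateMappingImage( true );

// Equivalent path access
reductionPipeline->SetBoolParameter( "ReductionProcessor/MappingImage/GenerateMappingImage", true );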

Example:

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();
spColorCaster diffuseCaster = sg->CreateColorCaster();
reductionPipeline->AddMaterialCaster( diffuseCaster, 0 );
csharp
spReductionPipeline reductionPipeline = sg.CreateReductionPipeline();
spColorCaster diffuseCaster = sg.CreateColorCaster();
reductionPipeline.AddMaterialCaster( diffuseCaster, 0 );
python
reductionPipeline = sg.CreateReductionPipeline()
diffuseCaster = sg.CreateColorCaster()
reductionPipeline.AddMaterialCaster( diffuseCaster, 0 )

There is also a string based interface to add material casters to a pipeline:

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();
spMaterialCaster materialCaster = reductionPipeline->AddMaterialCasterByType( "Color", 0 );
csharp
spReductionPipeline reductionPipeline = sg.CreateReductionPipeline();
spMaterialCaster materialCaster = reductionPipeline.AddMaterialCasterByType( "Color", 0 );
python
reductionPipeline = sg.CreateReductionPipeline()
materialCaster = reductionPipeline.AddMaterialCasterByType( "Color", 0 )

The caster type is the base type name without the leading interface identifier and the Caster suffix, i.e. IColorCaster becomes Color and INormalCaster becomes Normal.

Material caster settings

You can access the caster settings through the caster object API, for example IColorCaster::GetColorCasterSettings().

To access the caster settings through the string-based path API, use a zero-based index in the path to specify which caster to access, for example "MaterialCaster/0/MaterialChannel" to specify the material output channel name of the first caster.
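
A minimal sketch of both access styles, assuming the material channel setting is named MaterialChannel as in the path above and that a string-typed path accessor (SetStringParameter) is available; the channel name "Diffuse" is just an example value:

cpp
spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();
spColorCaster diffuseCaster = sg->CreateColorCaster();
reductionPipeline->AddMaterialCaster( diffuseCaster, 0 );

// Through the caster settings object
diffuseCaster->GetColorCasterSettings()->SetMaterialChannel( "Diffuse" );

// Through the path API, using the zero-based caster index
reductionPipeline->SetStringParameter( "MaterialCaster/0/MaterialChannel", "Diffuse" );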

Event handling

The pipelines support the progress event, SG_EVENT_PROGRESS, and aggregate progress events from the processors and casters into a single continuous range. Note that event handling is only supported by the C++ API; in C#, use a separate thread and polling.

Example - Observe progress

cpp
class ProgressObserver : public Simplygon::Observer
{
    public:
    bool OnProgress( Simplygon::spObject subject, Simplygon::real progressPercent ) override
    {
        printf( "Progress: %f\n", progressPercent );
        // return false to abort the processing
        return true;
    }
} progressObserver;

void RegisterProgressObserver( spReductionPipeline pipeline )
{
    // Register the observer with the pipeline.
    pipeline->AddObserver( &progressObserver );
}
csharp

using System;
using System.IO;
using System.Threading.Tasks;

public class ProgressObserver : Simplygon.Observer
{
    public override bool OnProgress( Simplygon.spObject subject, float progressPercent )
    {
        Console.WriteLine( $"Progress: {progressPercent}" );
        // return false to abort the processing
        return true;
    }
}

public class Program
{
    static private ProgressObserver progressObserver = new ProgressObserver();

    static void RegisterProgressObserver( Simplygon.spReductionPipeline pipeline )
    {
        // Register the observer with the pipeline.
        pipeline.AddObserver( progressObserver );
    }

    // Rest of program
}
python
class ProgressObserver( Simplygon.Observer ):
    def OnProgress( self, subject: Simplygon.spObject, progressPercent: float ):
        print( "Progress: %f" %(progressPercent) )
        # return False to abort the processing
        return True

progressObserver = ProgressObserver()

def RegisterProgressObserver( pipeline: Simplygon.IReductionPipeline ):
    # Add the custom observer to the reduction pipeline. 
    pipeline.AddObserver( progressObserver )

Cascading

Pipelines support cascading, where the output from one pipeline is piped as input to another pipeline. This can be used to automate the creation of LOD chains, for example performing cascaded reductions for a number of LOD levels and finally creating a proxy mesh through remeshing as the final LOD level. All pipelines in a cascade are processed when you call RunScene/RunSceneFromFile on the top-level pipeline.

Adding a cascaded pipeline

To add a cascaded pipeline, use the AddCascadedPipeline method. This adds the given pipeline as a cascaded child of the pipeline on which the method was called. You can add multiple cascaded pipelines to a pipeline object if you wish to create a tree-like cascade.

cpp
spReductionPipeline reductionPipelineLOD1 = sg->CreateReductionPipeline();
spReductionPipeline reductionPipelineLOD2 = sg->CreateReductionPipeline();
spRemeshingPipeline remeshingPipelineLOD3 = sg->CreateRemeshingPipeline();

reductionPipelineLOD1->AddCascadedPipeline( reductionPipelineLOD2 );
reductionPipelineLOD2->AddCascadedPipeline( remeshingPipelineLOD3 );

// This will process all three pipelines
// The scene passed in is modified by the first pipeline
// The second and third pipeline will copy the scene before modifying it
reductionPipelineLOD1->RunScene( scene, EPipelineRunMode::RunInThisProcess );
csharp
spReductionPipeline reductionPipelineLOD1 = sg.CreateReductionPipeline();
spReductionPipeline reductionPipelineLOD2 = sg.CreateReductionPipeline();
spRemeshingPipeline remeshingPipelineLOD3 = sg.CreateRemeshingPipeline();

reductionPipelineLOD1.AddCascadedPipeline( reductionPipelineLOD2 );
reductionPipelineLOD2.AddCascadedPipeline( remeshingPipelineLOD3 );

// This will process all three pipelines
// The scene passed in is modified by the first pipeline
// The second and third pipeline will copy the scene before modifying it
reductionPipelineLOD1.RunScene( scene, EPipelineRunMode.RunInThisProcess );
python
reductionPipelineLOD1 = sg.CreateReductionPipeline()
reductionPipelineLOD2 = sg.CreateReductionPipeline()
remeshingPipelineLOD3 = sg.CreateRemeshingPipeline()

reductionPipelineLOD1.AddCascadedPipeline( reductionPipelineLOD2 )
reductionPipelineLOD2.AddCascadedPipeline( remeshingPipelineLOD3 )

# This will process all three pipelines
# The scene passed in is modified by the first pipeline
# The second and third pipeline will copy the scene before modifying it
reductionPipelineLOD1.RunScene( scene, EPipelineRunMode_RunInThisProcess )

Accessing the cascaded processed scenes

The resulting scene from each cascaded pipeline can be accessed by calling GetProcessedScene on the corresponding pipeline, or by calling GetCascadedSceneForIndex on the parent pipeline's processed scene. All cascaded scenes are automatically exported when using RunSceneFromFile with an output file name set.

IMPORTANT NOTE

When using cascaded pipelines with RunSceneFromFile and an output file format that does not support cascaded scenes, only the scene from the top-level pipeline is exported to the output file. To use cascaded pipelines with RunSceneFromFile, either use a Simplygon scene (.sg) as the output format and read the scene back with a SceneImporter, or do not write an output file (keeping the scene in memory) and use GetProcessedScene on the pipeline. In either case, use GetCascadedSceneForIndex on the top-level processed scene to access the resulting cascaded sub-scenes. Some file formats have limited support for cascaded scenes; see the limitations section in the scene importer and exporter documentation.

cpp
spReductionPipeline reductionPipelineLOD1 = sg->CreateReductionPipeline();
spReductionPipeline reductionPipelineLOD2 = sg->CreateReductionPipeline();
spRemeshingPipeline remeshingPipelineLOD3 = sg->CreateRemeshingPipeline();

reductionPipelineLOD1->AddCascadedPipeline( reductionPipelineLOD2 );
reductionPipelineLOD2->AddCascadedPipeline( remeshingPipelineLOD3 );

// This will process all three pipelines
// The scene passed in is modified by the first pipeline
// The second and third pipeline will copy the scene before modifying it
reductionPipelineLOD1->RunScene( scene, EPipelineRunMode::RunInThisProcess );

// Each pipeline has its own output scene with the processed data
spScene sceneLOD1 = reductionPipelineLOD1->GetProcessedScene();
spScene sceneLOD2 = reductionPipelineLOD2->GetProcessedScene();
spScene sceneLOD3 = remeshingPipelineLOD3->GetProcessedScene();

// The scenes can also be accessed from the top level scene
sceneLOD2 = sceneLOD1->GetCascadedSceneForIndex( 0 );
sceneLOD3 = sceneLOD2->GetCascadedSceneForIndex( 0 );

// Process using files
reductionPipelineLOD1->RunSceneFromFile( "path/to/input.fbx", "path/to/output.sg", EPipelineRunMode::RunInThisProcess );

// Use scene importer to access root scene, then GetCascadedSceneForIndex for cascaded results
spSceneImporter importer = sg->CreateSceneImporter();
importer->SetImportFilePath( "path/to/output.sg" );
importer->RunImport();
sceneLOD1 = importer->GetScene();
sceneLOD2 = sceneLOD1->GetCascadedSceneForIndex( 0 );
sceneLOD3 = sceneLOD2->GetCascadedSceneForIndex( 0 );

// Process using file input and in-memory output
reductionPipelineLOD1->RunSceneFromFile( "path/to/input.fbx", nullptr, EPipelineRunMode::RunInThisProcess );

// Access root scene, then GetCascadedSceneForIndex for cascaded results
sceneLOD1 = reductionPipelineLOD1->GetProcessedScene();
sceneLOD2 = sceneLOD1->GetCascadedSceneForIndex( 0 );
sceneLOD3 = sceneLOD2->GetCascadedSceneForIndex( 0 );
csharp
spReductionPipeline reductionPipelineLOD1 = sg.CreateReductionPipeline();
spReductionPipeline reductionPipelineLOD2 = sg.CreateReductionPipeline();
spRemeshingPipeline remeshingPipelineLOD3 = sg.CreateRemeshingPipeline();

reductionPipelineLOD1.AddCascadedPipeline( reductionPipelineLOD2 );
reductionPipelineLOD2.AddCascadedPipeline( remeshingPipelineLOD3 );

// This will process all three pipelines
// The scene passed in is modified by the first pipeline
// The second and third pipeline will copy the scene before modifying it
reductionPipelineLOD1.RunScene( scene, EPipelineRunMode.RunInThisProcess );

// Each pipeline has its own output scene with the processed data
spScene sceneLOD1 = reductionPipelineLOD1.GetProcessedScene();
spScene sceneLOD2 = reductionPipelineLOD2.GetProcessedScene();
spScene sceneLOD3 = remeshingPipelineLOD3.GetProcessedScene();

// The scenes can also be accessed from the top level scene
sceneLOD2 = sceneLOD1.GetCascadedSceneForIndex( 0 );
sceneLOD3 = sceneLOD2.GetCascadedSceneForIndex( 0 );

// Process using files
reductionPipelineLOD1.RunSceneFromFile( "path/to/input.fbx", "path/to/output.sg" );

// Use scene importer to access root scene, then GetCascadedSceneForIndex for cascaded results
spSceneImporter importer = sg.CreateSceneImporter();
importer.SetImportFilePath( "path/to/output.sg" );
importer.RunImport();
sceneLOD1 = importer.GetScene();
sceneLOD2 = sceneLOD1.GetCascadedSceneForIndex( 0 );
sceneLOD3 = sceneLOD2.GetCascadedSceneForIndex( 0 );

// Process using file input and in-memory output
reductionPipelineLOD1.RunSceneFromFile( "path/to/input.fbx", nullptr, EPipelineRunMode.RunInThisProcess );

// Access root scene, then GetCascadedSceneForIndex for cascaded results
sceneLOD1 = reductionPipelineLOD1.GetProcessedScene();
sceneLOD2 = sceneLOD1.GetCascadedSceneForIndex( 0 );
sceneLOD3 = sceneLOD2.GetCascadedSceneForIndex( 0 );
python
reductionPipelineLOD1 = sg.CreateReductionPipeline()
reductionPipelineLOD2 = sg.CreateReductionPipeline()
remeshingPipelineLOD3 = sg.CreateRemeshingPipeline()

reductionPipelineLOD1.AddCascadedPipeline( reductionPipelineLOD2 )
reductionPipelineLOD2.AddCascadedPipeline( remeshingPipelineLOD3 )

# This will process all three pipelines
# The scene passed in is modified by the first pipeline
# The second and third pipeline will copy the scene before modifying it
reductionPipelineLOD1.RunScene( scene, EPipelineRunMode_RunInThisProcess )

# Each pipeline has its own output scene with the processed data
sceneLOD1 = reductionPipelineLOD1.GetProcessedScene()
sceneLOD2 = reductionPipelineLOD2.GetProcessedScene()
sceneLOD3 = remeshingPipelineLOD3.GetProcessedScene()

# The scenes can also be accessed from the top level scene
sceneLOD2 = sceneLOD1.GetCascadedSceneForIndex( 0 )
sceneLOD3 = sceneLOD2.GetCascadedSceneForIndex( 0 )

# Process using files
reductionPipelineLOD1.RunSceneFromFile( "path/to/input.fbx", "path/to/output.sg", EPipelineRunMode_RunInThisProcess )

# Use scene importer to access root scene, then GetCascadedSceneForIndex for cascaded results
importer = sg.CreateSceneImporter()
importer.SetImportFilePath( "path/to/output.sg" )
importer.RunImport()
sceneLOD1 = importer.GetScene()
sceneLOD2 = sceneLOD1.GetCascadedSceneForIndex( 0 )
sceneLOD3 = sceneLOD2.GetCascadedSceneForIndex( 0 )

# Process using file input and in-memory output
reductionPipelineLOD1.RunSceneFromFile( "path/to/input.fbx", "", EPipelineRunMode_RunInThisProcess )

# Access root scene, then GetCascadedSceneForIndex for cascaded results
sceneLOD1 = reductionPipelineLOD1.GetProcessedScene()
sceneLOD2 = sceneLOD1.GetCascadedSceneForIndex( 0 )
sceneLOD3 = sceneLOD2.GetCascadedSceneForIndex( 0 )

Batching

Pipelines can be added to a batch for parallel execution, mostly intended for distribution using either the built-in Simplygon Grid, FASTBuild or Incredibuild. A pipeline batch is a queue of any number of pipelines, each operating on a scene. Pipelines and scenes can be reused any number of times in the same batch, and can be given either as in-memory objects or as file paths.

Pipelines and scenes are serialized when added to the batch with a call to the Queue function. Any modifications to the scene and/or pipeline object after the Queue call returns will NOT be reflected in the batch execution.

cpp
spPipelineBatch batch = sg->CreatePipelineBatch();

spReductionPipeline reductionPipeline = sg->CreateReductionPipeline();
spRemeshingPipeline remeshingPipeline = sg->CreateRemeshingPipeline();

/* Code to setup pipelines */

spScene firstScene = <code to load or setup scene>;
spScene secondScene = <code to load or setup scene>;

batch->Queue( reductionPipeline, firstScene );
batch->Queue( remeshingPipeline, firstScene );
batch->Queue( reductionPipeline, secondScene );
batch->Queue( reductionPipeline, secondScene );

// Note that modifying for example the reductionPipeline or firstScene here
// after the Queue call will have no effect on the batch execution

batch->QueueFile( "path/to/pipeline.json", "path/to/scene.fbx" );
batch->QueueFile( "some/other/pipeline.json", "some/other/scene.sg" );

batch->Run( ESimplygonRunMode::RunInThisProcess );
csharp
spPipelineBatch batch = sg.CreatePipelineBatch();

spReductionPipeline reductionPipeline = sg.CreateReductionPipeline();
spRemeshingPipeline remeshingPipeline = sg.CreateRemeshingPipeline();

/* Code to setup pipelines */

spScene firstScene = <code to load or setup scene>;
spScene secondScene = <code to load or setup scene>;

batch.Queue( reductionPipeline, firstScene );
batch.Queue( remeshingPipeline, firstScene );
batch.Queue( reductionPipeline, secondScene );
batch.Queue( reductionPipeline, secondScene );

// Note that modifying for example the reductionPipeline or firstScene here
// after the Queue call will have no effect on the batch execution

batch.QueueFile( "path/to/pipeline.json", "path/to/scene.fbx" );
batch.QueueFile( "some/other/pipeline.json", "some/other/scene.sg" );

batch.Run( ESimplygonRunMode.RunInThisProcess );
python
batch = sg.CreatePipelineBatch()

reductionPipeline = sg.CreateReductionPipeline()
remeshingPipeline = sg.CreateRemeshingPipeline()

# Code to setup pipelines

firstScene = <code to load or setup scene>
secondScene = <code to load or setup scene>

batch.Queue( reductionPipeline, firstScene )
batch.Queue( remeshingPipeline, firstScene )
batch.Queue( reductionPipeline, secondScene )
batch.Queue( reductionPipeline, secondScene )

# Note that modifying for example the reductionPipeline or firstScene here
# after the Queue call will have no effect on the batch execution

batch.QueueFile( "path/to/pipeline.json", "path/to/scene.fbx" )
batch.QueueFile( "some/other/pipeline.json", "some/other/scene.sg" )

batch.Run( ESimplygonRunMode_RunInThisProcess )

Batch processing

Pipelines and pipeline batches can be serialized for batch processing and/or distribution by use of the batch executable tool.

IMPORTANT NOTE

Serialized pipeline files are transient and should never be stored or used on other machines; they must only be used to transfer settings information between processes on the same machine. They are not guaranteed to be compatible between different releases of Simplygon.

Serializing a pipeline and batches

Pipelines and pipeline batches are serialized using the IPipelineSerializer interface.

cpp
spPipelineSerializer serializer = sg->CreatePipelineSerializer();

// Save a pipeline
spPipeline pipeline = <your code to configure a pipeline>;
serializer->SavePipelineToFile( "path/to/pipeline.json", pipeline );

// Load a pipeline
pipeline = serializer->LoadPipelineFromFile( "path/to/pipeline.json" );

// Save a pipeline batch
spPipelineBatch batch = <your code to configure a pipeline batch>;
serializer->SavePipelineBatchToFile( "path/to/pipelinebatch.json", batch );

// Load a pipeline batch
batch = serializer->LoadPipelineBatchFromFile( "path/to/pipelinebatch.json" );
csharp
spPipelineSerializer serializer = sg.CreatePipelineSerializer();

// Save a pipeline
spPipeline pipeline = <your code to configure a pipeline>;
serializer.SavePipelineToFile( "path/to/pipeline.json", pipeline);

// Load a pipeline
pipeline = serializer.LoadPipelineFromFile( "path/to/pipeline.json" );

// Save a pipeline batch
spPipelineBatch batch = <your code to configure a pipeline batch>;
serializer.SavePipelineBatchToFile( "path/to/pipelinebatch.json", batch );

// Load a pipeline batch
batch = serializer.LoadPipelineBatchFromFile( "path/to/pipelinebatch.json" );
python
serializer = sg.CreatePipelineSerializer()

# Save a pipeline
pipeline = <your code to configure a pipeline>
serializer.SavePipelineToFile( "path/to/pipeline.json", pipeline )

# Load a pipeline
pipeline = serializer.LoadPipelineFromFile( "path/to/pipeline.json" )

# Save a pipeline batch
batch = <your code to configure a pipeline batch>
serializer.SavePipelineBatchToFile( "path/to/pipelinebatch.json", batch )

# Load a pipeline batch
batch = serializer.LoadPipelineBatchFromFile( "path/to/pipelinebatch.json" )

Executing a pipeline using batch tool

Run the batch tool executable with three arguments specifying the serialized pipeline file, the input scene file and the output scene file. Created textures are output according to the texture output path set in the pipeline settings.

SimplygonBatch.exe <path/to/pipeline.json> <path/to/input.scene> <path/to/output.scene>

The batch tool uses the scene importer and exporter to read and write files, see the importer and exporter documentation for a list of file formats supported.

If you use cascaded pipelines, you must output the resulting data as a Simplygon scene (.sg) to fully capture all cascaded scene results. Other scene exporters might only output the top-level scene from the root pipeline or emulate multiple scenes using top-level nodes. Once you have loaded a Simplygon scene (.sg) produced by a cascaded pipeline, use GetCascadedSceneCount and GetCascadedSceneForIndex to access the corresponding cascaded scene results, as in the sketch below.
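
A minimal sketch of walking all cascaded results after loading the batch tool output, reusing the scene importer calls shown earlier and assuming GetCascadedSceneCount returns an unsigned integer:

cpp
spSceneImporter importer = sg->CreateSceneImporter();
importer->SetImportFilePath( "path/to/output.sg" );
importer->RunImport();

// The root scene is the result of the top level pipeline
spScene rootScene = importer->GetScene();

// Each cascaded child holds the result of the corresponding cascaded pipeline
for( unsigned int i = 0; i < rootScene->GetCascadedSceneCount(); ++i )
{
    spScene cascadedScene = rootScene->GetCascadedSceneForIndex( i );
    // ... cascaded scenes can themselves have cascaded children for deeper LOD chains
}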

Executing a pipeline batch using batch tool

Run the batch tool executable with a single argument specifying the serialized pipeline batch file.

SimplygonBatch.exe <path/to/pipelinebatch.json>

Progress reporting

If you want to parse the progress of the pipeline, you can pass -Progress as an extra first parameter to the batch tool executable. This suppresses any other output from the pipeline execution and instead prints progress on stdout as integer percentages in the [0,100] range, separated by newlines. The calling process can then read the tool's stdout and parse the output as integers, one per line.

SimplygonBatch.exe -Progress <path/to/pipeline.json> <path/to/input.obj> <path/to/output.obj>
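
A minimal sketch of consuming the progress stream from a calling process, assuming a Windows host where _popen is available; the file paths in the command line are illustrative:

cpp
#include <cstdio>
#include <cstdlib>

int main()
{
    // Launch the batch tool and capture its stdout
    FILE* pipe = _popen( "SimplygonBatch.exe -Progress pipeline.json input.obj output.obj", "r" );
    if( pipe == nullptr )
        return 1;

    // Each line is an integer percentage in the [0,100] range
    char line[ 64 ];
    while( fgets( line, sizeof( line ), pipe ) != nullptr )
    {
        int progressPercent = atoi( line );
        printf( "Pipeline progress: %d%%\n", progressPercent );
    }

    // _pclose returns the exit status of the batch tool
    return _pclose( pipe );
}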

Examples