# Pipeline
Simplygon pipelines combine common processing patterns into easy-to-use, predefined API entry points, so most normal workflows can be covered without setting up each individual processing object manually. Pipelines also support distributed execution: the pipeline definition can be serialized to a file and executed on a remote node using the pipeline batch executable.
- Reduction pipeline: Geometries are reduced using the reduction processor, and optionally new materials are baked using material casters.
- Quad reduction pipeline: Geometries made out of quads are reduced using the quad processor, and optionally new materials are baked using material casters.
- Remeshing pipeline: Geometries are replaced using the remeshing processor, and optionally new materials are baked using material casters.
- Aggregation pipeline: Geometries are combined using the aggregation processor, and optionally new materials are baked using material casters.
- Billboard cloud pipeline: Geometries are replaced with a cloud of billboards fitting the exterior hull. New materials are baked using material casters.
- Billboard cloud vegetation pipeline: Geometries are replaced with a cloud of billboards keeping the visual perception of depth and volume. New materials are baked using material casters.
- Flipbook pipeline: Geometries are replaced with a single quad that is always oriented towards the viewer, rotating around the up-vector. New materials are baked using material casters from all specified directions.
- Impostor from single view pipeline: Geometries are replaced with a single quad as seen from a specified view direction. New materials are baked using material casters from that direction.
The processors and material casters are configured using the settings they expose, as described below.
# Pipeline settings
Pipelines are configured using the settings exposed by the processors and casters in the corresponding pipeline. The settings are either accessed directly through the API objects, or indirectly through paths.
# Settings objects
Settings can be manipulated using the API object directly:
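For example, a minimal C++ sketch, assuming an initialized ISimplygon instance named sg; the reduction setting used here (ReductionTargetTriangleRatio) follows the API naming pattern but is an assumption:

```cpp
// Create a reduction pipeline and configure it directly through its settings objects.
Simplygon::spReductionPipeline sgReductionPipeline = sg->CreateReductionPipeline();

// Each settings object is fetched with a Get{settings_object}Settings() call and
// manipulated with typed Set{setting_name}() functions.
Simplygon::spReductionSettings sgReductionSettings = sgReductionPipeline->GetReductionSettings();
sgReductionSettings->SetReductionTargetTriangleRatio( 0.5f );
```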
# Settings paths
Paths can be used to set and get parameters directly on the pipeline object. Paths take the form {processor}/{settings_object}/{setting_name}, where:
- {processor} is the name of the processor (RemeshingProcessor, ReductionProcessor, AggregationProcessor).
- {settings_object} is the name of the settings object to manipulate (such as Reduction, Repair or MappingImage, matching the API call Get{settings_object}Settings() on the processor).
- {setting_name} is the actual setting name.
The type must match the setting variable type as exposed in the settings object, so a Set{setting_name}( unsigned int ) function on the settings object taking an unsigned integer must be accessed with a SetUIntParameter( "{processor}/{settings_object}/{setting_name}" ) call. Using a Set/Get function of the wrong type is an error. Example:
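(A minimal sketch; the setting name ReductionTargetTriangleCount and the GetUIntParameter getter are assumptions used for illustration.)

```cpp
// Set an unsigned integer setting through its path, following the
// {processor}/{settings_object}/{setting_name} form described above.
sgReductionPipeline->SetUIntParameter( "ReductionProcessor/Reduction/ReductionTargetTriangleCount", 5000 );

// Reading the value back uses the matching typed getter (assumed to be GetUIntParameter).
unsigned int triangleCount = sgReductionPipeline->GetUIntParameter( "ReductionProcessor/Reduction/ReductionTargetTriangleCount" );
```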
Path access is mostly intended for use in languages and environments where string manipulation is the canonical way of working. In native C++ we recommend using the settings API objects directly.
# Generic pipeline settings
All pipelines have a generic settings object for settings shared across all pipeline types. It is accessed through GetPipelineSettings() on the pipeline object, or with the path Pipeline/{setting_name} in the path API.
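For example, a sketch using the TextureOutputPath setting described later in this document; the SetStringParameter accessor for string valued settings is an assumption:

```cpp
// Shared pipeline settings through the settings object...
sgReductionPipeline->GetPipelineSettings()->SetTextureOutputPath( "output/textures" );

// ...or the same setting through the path API using the Pipeline prefix.
sgReductionPipeline->SetStringParameter( "Pipeline/TextureOutputPath", "output/textures" );
```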
# Global settings
The global settings for the Simplygon API can also be accessed through the path API, using the path Global/{setting_name}.
# Input and output
Pipelines take scenes as input and return processed scenes as output. Input and output can be either scene API objects or paths to files that can be read as scenes. The supported file formats are currently limited to the internal scene file format and Wavefront OBJ files. In the case of material casting, the resulting output textures are saved as PNG files in the directory set in TextureOutputPath in the generic pipeline settings object. If the path is relative, it is treated as relative to the current working directory.
# Example using scene API objects
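A minimal sketch, assuming an initialized ISimplygon instance named sg and a scene already loaded into sgScene (for example through a scene importer); the EPipelineRunMode value is an assumption:

```cpp
// Configure and run the pipeline on an in-memory scene object.
Simplygon::spReductionPipeline sgReductionPipeline = sg->CreateReductionPipeline();
sgReductionPipeline->GetReductionSettings()->SetReductionTargetTriangleRatio( 0.5f );
sgReductionPipeline->RunScene( sgScene, Simplygon::EPipelineRunMode::RunInThisProcess );

// The processed result is returned as a scene API object.
Simplygon::spScene sgProcessedScene = sgReductionPipeline->GetProcessedScene();
```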
# Example using file paths
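A corresponding sketch using file paths, reusing the sgReductionPipeline from the previous example; the file names are placeholders:

```cpp
// Run the pipeline directly on files. Baked textures are written to the
// TextureOutputPath set in the generic pipeline settings.
sgReductionPipeline->GetPipelineSettings()->SetTextureOutputPath( "output/textures" );
sgReductionPipeline->RunSceneFromFile( "input.obj", "output.obj", Simplygon::EPipelineRunMode::RunInThisProcess );
```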
# Material casters
Pipelines allow material casting by attaching any number of material casters to the pipeline. The material casters execute after the processor in the pipeline and bake the configured texture channels. Materials on the scene geometry are replaced by the new baked materials using the cast textures. Any number of output materials can be baked, and the material index given to AddMaterialCaster determines which output material the caster bakes its texture into.
The pipeline must be configured to use mapping images if you want material casting. This is done with SetGenerateMappingImage(true)
on the pipeline mapping image settings.
Example:
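(A minimal sketch; the caster creation calls and the Diffuse/Normals channel names are assumptions used for illustration.)

```cpp
// Enable mapping image generation on the pipeline, required for material casting.
sgReductionPipeline->GetMappingImageSettings()->SetGenerateMappingImage( true );

// Attach a color caster and a normal caster, both baking into output material 0.
Simplygon::spColorCaster sgDiffuseCaster = sg->CreateColorCaster();
sgDiffuseCaster->GetColorCasterSettings()->SetMaterialChannel( "Diffuse" );
sgReductionPipeline->AddMaterialCaster( sgDiffuseCaster, 0 );

Simplygon::spNormalCaster sgNormalCaster = sg->CreateNormalCaster();
sgNormalCaster->GetNormalCasterSettings()->SetMaterialChannel( "Normals" );
sgReductionPipeline->AddMaterialCaster( sgNormalCaster, 0 );
```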
There is also a string based interface to add material casters to a pipeline:
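(A sketch; the method name AddMaterialCasterByType is an assumption and should be verified against the API reference.)

```cpp
// Add casters by type name instead of creating the caster objects explicitly.
sgReductionPipeline->AddMaterialCasterByType( "Color", 0 );
sgReductionPipeline->AddMaterialCasterByType( "Normal", 0 );
```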
The caster type is the base type name without the leading interface identifier and the Caster suffix, e.g. IColorCaster becomes Color and INormalCaster becomes Normal.
# Material caster settings
You can access the caster settings through the caster object API, for example IColorCaster::GetColorCasterSettings().
To access the caster settings through the string based path API you use a zero based index in the path to specify which caster to access, for example "MaterialCaster/0/MaterialChannel" to specify the name of the first caster material output channel.
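For example, reusing the color caster from the earlier example; the SetStringParameter accessor is an assumption:

```cpp
// Through the caster object API...
sgDiffuseCaster->GetColorCasterSettings()->SetMaterialChannel( "Diffuse" );

// ...or through the path API, addressing the first attached caster by its zero based index.
sgReductionPipeline->SetStringParameter( "MaterialCaster/0/MaterialChannel", "Diffuse" );
```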
# Event handling
Pipelines support the progress event, SG_EVENT_PROGRESS, and aggregate the progress events from the processors and casters into a single continuous range. Note that event handling is only supported by the C++ API; in C#, use a separate thread and polling instead.
# Example - Observe progress
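A minimal C++ sketch; the Observer base class and the OnProgress signature follow the pattern of the C++ API but should be checked against the API reference for your release:

```cpp
#include <cstdio>

// Observer that prints pipeline progress to stdout.
class ProgressObserver : public Simplygon::Observer
{
public:
	bool OnProgress( Simplygon::spObject subject, Simplygon::real progressPercent ) override
	{
		printf( "Progress: %d%%\n", (int)progressPercent );
		return true; // returning true lets the processing continue
	}
};

ProgressObserver progressObserver;

// Attach the observer before running; progress from all processors and casters
// in the pipeline is reported as one continuous 0-100 range.
sgReductionPipeline->AddObserver( &progressObserver );
sgReductionPipeline->RunScene( sgScene, Simplygon::EPipelineRunMode::RunInThisProcess );
```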
# Cascading
Pipelines support cascading, where the output from one pipeline is piped as input to another pipeline. This can be used to automate the creation of LOD chains, for example performing cascaded reduction for a number of LOD levels and finally creating a proxy mesh through remeshing as the final LOD level. All pipelines in a cascade are processed when you call RunScene/RunSceneFromFile on the top-level pipeline.
# Adding a cascaded pipeline
To add a cascaded pipeline use the AddCascadedPipeline method. This will add the given pipeline as a cascaded child to the pipeline on which the method was called. You can add multiple cascaded pipelines to a pipeline object if you wish to create a tree-like cascade.
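For example, a sketch of a small LOD chain; the pipeline creation calls and run mode are assumptions:

```cpp
// Build a chain: LOD1 (reduction) -> LOD2 (reduction) -> proxy (remeshing).
Simplygon::spReductionPipeline sgLod1 = sg->CreateReductionPipeline();
Simplygon::spReductionPipeline sgLod2 = sg->CreateReductionPipeline();
Simplygon::spRemeshingPipeline sgProxy = sg->CreateRemeshingPipeline();

sgLod1->AddCascadedPipeline( sgLod2 );
sgLod2->AddCascadedPipeline( sgProxy );

// Running the top-level pipeline processes the whole cascade.
sgLod1->RunScene( sgScene, Simplygon::EPipelineRunMode::RunInThisProcess );
```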
# Accessing the cascaded processed scenes
The resulting scene from each cascaded pipeline can be accessed by calling GetProcessedScene on the corresponding pipeline, or GetCascadedSceneForIndex on the processed scene of the parent pipeline. All cascaded scenes are automatically exported when using the RunSceneFromFile function with an output file name set.
IMPORTANT NOTE
When using cascaded pipelines with RunSceneFromFile, if the output file format does not support cascaded scenes, only the scene from the top-level pipeline is exported to the output file. If you want to use cascaded pipelines with RunSceneFromFile, you must either use a Simplygon scene (.sg) as the output format and read the scene back with a SceneImporter, or write no output file (keeping the scene in memory) and use GetProcessedScene on the pipeline, then GetCascadedSceneForIndex on the top-level processed scene to access the resulting cascaded sub-scenes. Some file formats might have limited support for cascaded scenes; see the limitations section in the scene importer and exporter documentation.
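Continuing the cascading sketch above, the in-memory results can be read back like this:

```cpp
// The top-level result, then each cascaded child via its parent's processed scene.
Simplygon::spScene sgLod1Scene = sgLod1->GetProcessedScene();
Simplygon::spScene sgLod2Scene = sgLod1Scene->GetCascadedSceneForIndex( 0 );
Simplygon::spScene sgProxyScene = sgLod2Scene->GetCascadedSceneForIndex( 0 );
```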
# Batching
Pipelines can be added to a batch for parallel execution, mostly intended for distribution using either the built-in Simplygon Grid, FASTBuild or Incredibuild. A pipeline batch is a queue of any number of pipelines operating on a scene. The pipelines and scenes can be reused any number of times in the same batch, and can be either in-memory objects or file paths.
Pipelines and scenes are serialized when added to the batch with a call to the Queue function. Any modifications to the scene and/or pipeline object after the Queue call returns will NOT be reflected in the batch execution.
# Batch processing
Pipelines and pipeline batches can be serialized for batch processing and/or distribution by use of the batch executable tool.
IMPORTANT NOTE
Serialized pipeline files are transient and should never be stored or used on other machines; they must only be used to transfer settings information between processes on the same machine. They are not guaranteed to be compatible between different releases of Simplygon.
# Serializing a pipeline and batches
Pipelines and pipeline batches are serialized using the IPipelineSerializer interface.
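A sketch; the serializer creation call and the SavePipelineToFile method name follow the API naming pattern but are assumptions to verify against the API reference:

```cpp
// Serialize a pipeline to a file that can be passed to the batch tool.
Simplygon::spPipelineSerializer sgSerializer = sg->CreatePipelineSerializer();
sgSerializer->SavePipelineToFile( "pipeline.json", sgReductionPipeline );
```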
# Executing a pipeline using batch tool
Run the batch tool executable with three arguments specifying the serialized pipeline file, the input scene file and the output scene file. Created textures are output according to the texture output path setting in the pipeline.
`SimplygonBatch.exe <path/to/pipeline.json> <path/to/input.scene> <path/to/output.scene>`
The batch tool uses the scene importer and exporter to read and write files; see the importer and exporter documentation for a list of supported file formats.
If you use cascaded pipelines, you must output the resulting data as a Simplygon scene (.sg) in order to fully capture all the cascaded scene results. Other scene exporters might only output the top-level scene from the root pipeline, or emulate multiple scenes using top-level nodes. Once you have loaded a Simplygon scene (.sg) produced by a cascaded pipeline, use GetCascadedSceneCount and GetCascadedSceneForIndex to access the corresponding cascaded scene results.
# Executing a pipeline batch using batch tool
Run the batch tool executable with a single argument specifying the serialized pipeline batch file.
`SimplygonBatch.exe <path/to/pipelinebatch.json>`
# Progress reporting
If you want to parse the progress of the pipeline, you can pass -Progress as an extra first parameter to the batch tool executable. This suppresses any other output from the pipeline execution and instead prints progress on stdout as an integer percentage in the [0,100] range, separated by newlines. The calling process can then read the batch tool's output and parse it as integers, one per line.
`SimplygonBatch.exe -Progress <path/to/pipeline.json> <path/to/input.obj> <path/to/output.obj>`