Centerprise API User Guide

Introduction

Centerprise brings a new level of functionality, openness, extensibility, and performance to data integration tools. Its framework-based design makes it suitable for a wide range of deployment scenarios. Centerprise is designed to be a high-performance, extensible integration framework and to provide easy-to-use Microsoft .NET APIs that enable customers and partners to:

  • Extend Centerprise by adding new sources, destinations, transformations, workflow tasks, functions, and more
  • Create customized solutions for data mapping, conversion, or integration
  • Integrate Centerprise as part of applications and workflows to provide high-performance data management functionality

This document contains a discussion of key APIs and usage scenarios. The objective is to provide an overview of these APIs and introduce selected sample components.

Centerprise Flow Document Model

Centerprise flow documents contain dataflow and workflow graphs. These documents are stored as standard XML when persisted. Centerprise APIs expose the object model for these documents, enabling their dynamic creation and modification. The following subsections provide an overview of the flow document classes.

Flow Document

FlowDocument is the base class for all flow documents, including dataflow and workflow documents. It contains information about actions, links, and maps.

Actions

These represent boxes in a flow diagram. In a dataflow, these boxes represent sources, destinations, and transformations, while in a workflow document, actions represent workflow tasks, data sources, and decision objects. Action contains all the properties of the specific item as well as size and position information.

Links

Links are primarily used in workflow documents and represent connections between actions. A link contains a start action, an end action, and a link type.

Maps

Maps are used in dataflow documents. Maps link tree nodes. A map contains start path, end path, and map type.

Methods

The FlowDocument class also provides APIs for verification, persistence, and UI support.

Dataflow Document

The DataflowDocument class contains information about a dataflow document. Dataflow documents are saved as standard XML files with the extension .df. Here is the XML for an empty DataflowDocument:

<?xml version="1.0" encoding="utf-8"?>
<DataflowDocument xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Id>0</Id>
  <Name />
  <Description />
  <CreateDtTm>0001-01-01T00:00:00</CreateDtTm>
  <UpdateDtTm>0001-01-01T00:00:00</UpdateDtTm>
  <Actions />
  <Links />
  <Maps />
</DataflowDocument>

SubDataflowDocument

Represents a subflow document. Subflows are dataflows that can be used inside dataflows and other subflows. This enables the development of modular dataflows.

RDataflowDocument

Represents a real-time dataflow. Real-time dataflows will be implemented in a future version of Centerprise.

WorkflowDocument

Represents a workflow graph. Centerprise workflows provide job sequencing, dependency, and looping capabilities.

Key Concepts

Centerprise Type System

Centerprise uses a type system for meta-information about record layouts. This type system is used by the Centerprise designer and runtime to process data objects. Two key interfaces define the type system: IMetaObject and IDataObject. IMetaObject specifies type information while IDataObject specifies instance information.

IMetaObject

IMetaObject contains meta-information about a record layout. Its structure consists of three collections: Elements, Children, and Referenced. Elements represent layout fields; Children are collection member objects; Referenced are single-instance member objects.

Astera.Core.MetaObject provides an implementation of IMetaObject. This implementation is serializable and can be used for most situations.
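To make this concrete, here is a minimal sketch of assembling a MetaObject. The Referenced.Add call mirrors usage shown in the samples later in this guide; the Children usage is an assumption based on the description above.

// A minimal sketch, assuming the collections described above.
IMetaObject customerMeta = new MetaObject("Customer");

// A single-instance member object (e.g., a nested Address record) goes into Referenced.
MetaObject address = new MetaObject("Address");
customerMeta.Referenced.Add(address);

// A collection member object (e.g., a repeating Orders collection) goes into Children.
// (Children usage is an assumption based on the description above.)
MetaObject orders = new MetaObject("Orders");
customerMeta.Children.Add(orders);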

IDataObject

IDataObject represents instance information. At runtime, Centerprise converts all incoming data into classes that implement IDataObject. CDataObject is the base class for most of these classes. IDataObject mirrors IMetaObject in that it contains a collection of element values, a collection of referenced objects, and a collection of child collections.

IDataObjectWithMessages interface defines APIs for data quality messages.

You should not create your own implementation of this interface; use the RecordObject class instead.

Flow Actions

Flow actions are boxes on the flow diagram. To create a flow action, you must create four different classes implementing the following interfaces:

  • IFlowItemData - Data structure that stores action properties at runtime and when persisted. Also provides inbound and outbound layouts and design-time verification support.
  • IFlowActionEditorPage - Editor control page(s) in the action properties wizard.
  • IProcessor - Runtime processor for the action. Different sub-interfaces must be implemented for different types of actions.
  • IFlowActionTemplate - Information for the Centerprise designer and toolbox.

Classes implementing IFlowActionEditorPage and IFlowActionTemplate are UI classes and must be compiled in separate assemblies from IFlowItemData and IProcessor implementations. This is necessary to ensure that the .NET user interface framework is not needed on the server where only back-end assemblies will be installed.

FlowDocumentItemAction

This class represents an instance of flow action inside a flow document. It contains ItemData and designer template information. This class is sealed and cannot be subclassed.

Item Data

ItemData implements the IFlowItemData interface. ItemData classes store all the design-time properties for the action. An item data class must have the following characteristics (a minimal skeleton follows the list):

  • Is decorated with the FlowTemplateName attribute. This attribute binds the ItemData class to a specific designer template.
  • Must be XML serializable. Centerprise stores documents in XML and uses the XmlSerializer class to load and save documents.
  • Implements the ICreatesProcessor interface, which creates the runtime processor for the action.
  • Implements the Verify method to perform design-time and runtime verification of action properties.
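
A minimal skeleton illustrating these characteristics might look as follows. The overrides mirror the samples later in this guide; the class and property names here are illustrative, and MyCustomProcessor is a hypothetical processor class.

[FlowTemplateName("MyCustomAction")]
public class DataflowItemDataMyCustom : DataflowItemDataRecordTransformation
{
    // Public read/write properties keep the class XML serializable.
    public string SomeOption { get; set; }

    // Creates the runtime processor for this action.
    public override IProcessor CreateProcessor(DataflowRuntimeContext context)
    {
        return new MyCustomProcessor(context, this);  // hypothetical processor class
    }

    // Design-time and runtime verification of action properties.
    public override void Verify(string name, FlowDocumentVerifier verifier)
    {
        base.Verify(name, verifier);
        if (string.IsNullOrEmpty(this.SomeOption))
            verifier.AddMessage(this.Name, "SomeOption must be provided.");
    }
}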

Editor Pages

These are .NET controls that implement the IFlowActionEditorPage interface. Centerprise action properties are organized as wizards, and these controls represent wizard pages.

Processors

Processors are runtime components that are instantiated during job execution or preview. Processors perform actions or transformations at runtime. All processors implement the IProcessor interface. Additionally, Centerprise defines interfaces for specific processor types. Examples of processor interface types include IReader, IWriter, ISingleProcessor, IAsyncSetProcessor, and ITaskProcessor. The Centerprise engine is highly parallel, and processors must be written to be thread safe.

Commands

Commands are menu items or toolbar buttons that can be added to the context menu of an action or to the toolbar in the properties wizard. Centerprise commands implement the ICommand interface. Use context commands to provide functionality specific to an action or a property page in the action.

Flow Templates

Flow templates are items that appear in the toolbox when using Centerprise flow documents. Flow templates implement the IFlowActionTemplate interface. Flow templates are used only in the designer and have no runtime role. They should reside in assemblies separate from the back-end classes, and you should not access flow template classes from your ItemData or Processor classes. Flow templates are used to specify editor pages, commands, action images, default names, custom renderers, and the template type.

How it Comes Together - Design Time

When the Centerprise Data Integrator client starts, it loads the built-in as well as custom templates. Custom templates are loaded from the plug-in directory. When a user opens or creates a flow document, Centerprise displays the relevant templates in the toolbox.

When a user drags a template to a flow document, an instance of FlowDocumentItemAction is created and added to the flow document. FlowDocumentItemAction creates an instance of ItemData class and uses template properties to set up user interface and rendering.

When a user edits the properties of an action, Centerprise clones the ItemData, and the properties dialog (IFlowActionEditorPage) works on the clone. This cloning is done to provide cancel functionality and support the undo stack. If the ItemData is ICloneable, the Clone method is called to create a copy; otherwise, Centerprise uses XML Serialize and Deserialize to create a copy. When a user presses OK in the property editor, the original action in the document is replaced with the modified version.
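
For example, an ItemData class might opt into the faster cloning path like this; a minimal sketch, assuming a shallow copy is sufficient for the class's members:

public class DataflowItemDataMyCustom : DataflowItemDataRecordTransformation, ICloneable
{
    public object Clone()
    {
        // MemberwiseClone is a shallow copy; deep-copy any mutable members
        // (lists, layouts) here so the designer's clone is fully independent.
        return this.MemberwiseClone();
    }
}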

Centerprise passes an instance of the DesignerContext class to property editor pages. This class provides information about the document, which is helpful in validating action data. You should never modify any information outside the action being edited. Such modification could have unpredictable results and may cause Centerprise to malfunction.

Developing Dataflow Actions - Design Time

This chapter describes design-time support for creating dataflow actions. The design-time part of an action consists of creating templates, creating editor pages for properties, and using built-in features inside a custom dataflow action. A template is used to populate the toolbox in the designer and to draw the shape for the item when it is dragged and dropped on the designer. The editor pages are used for editing the properties of the transformation.

Creation of Flow Templates

A template is used as a stencil in the designer to create items for an action. A template contains information such as the name prefix to be used for the items, the description of the item, the image to be displayed in the designer, the template type for grouping in the toolbox, and information about the editor pages. In most cases, subclassing the base implementation DataflowTemplateBase and setting some properties is enough. If the transformation needs editor pages, you'll need to override the GetEditorPages method, which returns an array of editor pages to be used in the property editor.

Working With Action Property Editors

Action property editors follow a standard pattern of showing the property pages in a wizard. Centerprise allows custom actions to use the same structure. Actions need to provide a list of the editor pages. These editor pages need to implement the interface IFlowActionEditorPage. By implementing this interface, you can create your own editor that works with the item data object.

There are some built-in editor pages used in common scenarios. For example, the database connection dialog page (WizardPageDataflowDbInfoBase) and the layout builder page (WizardPageMetaObjectBuilder) are used in many actions.

To add editor pages to the action, you need to override the GetEditorPages method of DataflowTemplateBase in the template for the action.

Using Designer Services

There are many built-in constructs to facilitate the creation of new dataflow actions.

DesignerContext Class

The DesignerContext class contains information about the document and the current action. It is passed to the ItemData object via the OnActionModified method when the action is edited in the user interface.

Adding/Removing Elements Dynamically

If your action allows users to modify the layout dynamically, you will need to implement the ICanAddElement interface. When an ItemData implements this interface, an additional element is appended to the existing elements in the layout. Creating a map to this element causes Centerprise to add a new element to the layout; the new element is created using information from the source element. The Remove Element context menu command is also enabled for such an action.

Parameters

Interface ISupportsParameters is designed to support the scenario where certain properties of an action must be treated as parameters for the purpose of deployment. For example, the delimited file source action implements ISupportsParameters and designates the source file path as a parameter. This enables users to replace the source file path at runtime.

View Table Data and Schema

Interface ISupportsViewTableDataAndSchema is designed to support integration with the database query feature of Centerprise. By implementing this interface, an action gets context menu items to view table data and schema.

Edit File

Interface ISupportsEditFile is designed to support integration with the file editor feature of Centerprise. By implementing this interface, an action gets a context menu item to edit a file. For example, if an XML source action implements ISupportsEditFile, clicking the Edit File menu item opens the underlying XML file for editing.

Adding Context Commands

Interface IProvidesContextCommands is designed to provide context commands for actions. It is implemented by actions to add custom commands to the action context menu. When a user right-clicks an action, the designer checks whether the action implements IProvidesContextCommands; if it does, the designer calls the ContextCommandTypes method to obtain a list of commands.

Custom Renderers

Implementing the ITreeRenderer interface allows an action to control the rendering of its tree.

Database Connection Dialog

Centerprise provides a standard database connection dialog. This dialog is used in many built-in actions and is available to custom actions. If an action needs to work with the database connection dialog, the action must implement IHasDbInfo. WizardPageDataflowDbInfoBase works with this interface and must be added to the IFlowActionEditorPage collection for the action.

If you wish to work with shared connections, you can implement the interface IAcceptsSharedConnection, which is derived from IHasDbInfo. In this case, you can use the editor page WizardPageDataflowDbInfoAndSharedConnection.

Object Renaming

Renaming is a standard feature for all actions. The base item data interface, IFlowItemData, exposes the method to rename the action.

Developing Dataflow Actions - Runtime Processors

Centerprise Dataflow Pipeline Overview

The Centerprise dataflow pipeline is a multithreaded engine that supports a high degree of parallelism. At runtime, a dataflow graph is executed as a set of distinct asynchronous steps. Depending on the nature of the dataflow and the number of CPUs on the server, anywhere from a few to hundreds of records can be processed simultaneously. The Centerprise pipeline uses a thread pool architecture; the number of threads created depends on the complexity of the dataflow and the number of CPUs on the machine. The Centerprise threading model is designed for minimal blocking and starvation. This ensures automatic and continuous load balancing between the steps of a dataflow, and even between multiple dataflows running on the server.

IDataflowTask

This interface represents a distinct task in the dataflow pipeline and is used to pass data to a processor.

Runtime Context Class

The runtime context class represents the context for the current action. It provides information about preceding and succeeding tasks and indicates whether the flow being executed is a data preview, real-time, or batch dataflow. It also provides APIs for shared database connections, writing trace output, and resolving parameters.

IProcessor Interface

All Centerprise processors implement the IProcessor interface, the base interface for any kind of processing in Centerprise, including readers, writers, transformers, and more.

This interface provides APIs to initialize, close, and terminate a process and report unhandled exceptions.

Single and Set Processors

Centerprise dataflow has two broad types of processors: single and set. Set processors represent transformations, readers, and writers. In Centerprise, data always flows between two set processors. Single processors can be used to transform data for specific fields. Set processors may alter the number or sequence of records passing through them, while single processors return derived values.

Using Runtime Services

Centerprise pipeline provides a variety of support services to built-in and custom actions. These services enable creation of powerful components with minimal coding. This section discusses some of these services.

Input Ordering

Requesting Ordered Input

An action can notify the pipeline that it expects ordered input by implementing the IRequiresOrderedInput interface. This interface notifies the pipeline that the action expects input to be sorted or to be in the order released by the previous step. If sorted input is expected, the pipeline checks to ensure that the input is sorted on the specified key(s). This interface does not cause the pipeline to actually sort the incoming data. If you need the pipeline to sort incoming data, implement the INeedsInplaceSorting interface.

Requesting In-place Sorting

If an action requires the pipeline to sort incoming records before delivering them to its processor, it can request that sorting by implementing the INeedsInplaceSorting interface. This interface is derived from IRequiresOrderedInput.

RecordComparer

RecordComparer is an IComparer implementation for Centerprise IDataObjects and is used at runtime to compare records for sorting, ordered input validation, and other purposes. When requesting ordered input or in-place sorting, you provide a RecordComparer instance for each input stream to the action. The comparer provides information about comparison fields, sort direction (ascending/descending), case sensitivity, and null handling during sorting.

Supporting Expressions

Centerprise features a powerful expression engine. Expressions are used in several transformations in Centerprise Dataflow, including Route, Data Quality Rules, and Expression Map. Centerprise makes this feature available to customers and partners. You can use NetInterpreter and IRuntimeInterpreter to process expressions at design time and runtime, respectively.

Using Shared Database Connection

There are situations where a dataflow writes to multiple destinations and the user wants to ensure that all the updates are performed in a single transaction. For such situations, Centerprise provides a Shared Connection object. A shared connection can be used to write to multiple destinations using a single database connection and transaction. The DataflowRuntimeContext class provides APIs to obtain and return a shared connection. When using a shared connection, you must ensure that the connection is quickly returned to the pipeline. If a connection is acquired and not returned, it could create a deadlock in the pipeline, as other users of the shared connection must wait for it to be released. Shared connections should be released in a finally block to ensure that the connection is always released even if there was an exception in the user code.
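
The acquire/release discipline might look like the following sketch. The guide states only that DataflowRuntimeContext provides APIs to obtain and return the shared connection; the method names GetSharedConnection and ReturnSharedConnection below are hypothetical placeholders.

// Hypothetical method names; the guide only states that DataflowRuntimeContext
// provides APIs to obtain and return the shared connection.
var connection = context.GetSharedConnection();
try
{
    // Perform writes on the shared connection/transaction here.
}
finally
{
    // Always return the connection, even on exceptions, to avoid
    // deadlocking other users of the shared connection.
    context.ReturnSharedConnection(connection);
}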

Parameters

To support deployment and reusability of flow documents, Centerprise provides built-in support for parameterization. Centerprise has two types of parameters.

Replaceable Parameters

These parameters appear in the Parameter Replacement dialog when deploying a job on the server or when running a dataflow or workflow from within another workflow. Typically, you will use this type of parameter for file paths, database connection information, and other processing flags. The goal here is to provide end users with the capability to modify environment settings when moving between development, QA, and production. Centerprise replaces parameters during initialization, before creating and initializing processors. To support replaceable parameters, implement the ISupportsParameters interface in your item data.

Parameterized Strings

Parameterized strings are used to substitute values inside strings. An example of a parameterized string is a SQL statement that requires a specific value at runtime. The DataflowRuntimeContext class provides APIs to substitute parameters at runtime. The FlowDocumentVerifier class provides the functionality to parse and verify a parameter string.

Controlling Completion

By default, when Centerprise sends EndOfInput to a set processor, it marks that processor as completed. In some cases, a processor may want to control its own completion. An example is SortProcessor, which does a great deal of processing after receiving end of input and only then releases all its records. A processor can implement the INotifiesCompletion interface to indicate to the pipeline that it will notify the pipeline when it is done. If a processor implements this interface, the Centerprise pipeline waits for a Completed notification from it. The pipeline will hang if no notification is received.

Data Preview Considerations

Data preview is a key feature of Centerprise dataflows. End users consistently rate data preview as one of the most useful features in the product. DataflowRuntimeContext contains a flag indicating whether the action is being run in preview mode. If you update persistent data in your action, ensure that you do not perform such updates in preview mode.

Parallel Processing Considerations

Centerprise dataflow pipeline is a highly parallel engine and takes full advantage of today’s multicore and multiprocessor hardware. This helps Centerprise deliver the performance and scalability to efficiently process high data volumes. As a component developer, multithreaded programming brings you opportunities to create highly scalable components. However, it also requires you to ensure that your program performs correctly and efficiently in a parallel environment. Here are a few things you must watch for when developing Centerprise components.

Guarding Against Race Conditions

Assume that your code can be called simultaneously from multiple threads. You must ensure that any shared resources are locked before being used or updated. If you are using Shared Connection, ensure that it is released even if your code throws an exception.

Avoid Excessive Locking

Use locks only around the code that updates or accesses shared resources. Excessive locking will make your processor a serial component and may degrade the performance of the whole pipeline.
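
As a plain C# illustration of narrow lock scope (not a Centerprise API), keep per-record work outside the lock and protect only the shared-state update:

private readonly object _statsLock = new object();
private long _recordCount;

public void OnRecordProcessed(object record)
{
    // Per-record work happens outside the lock...
    // ...and only the brief shared-state update is locked.
    lock (_statsLock)
    {
        _recordCount++;
    }
}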

Throwing Exceptions

The Centerprise pipeline uses a thread pool to process data, so your code may be called from any number of threads concurrently. If you throw an unexpected exception, you may terminate the entire pipeline and, in some cases, bring down the server. The IProcessor interface provides an event to notify the pipeline of an unhandled exception. Raise this event instead of throwing an exception. This will cause Centerprise to gracefully terminate the pipeline and roll back any transactions if applicable.

If you encounter record-level errors that do not necessitate terminating the pipeline, add error information to the RecordObject being processed (see below) and pass the record to the pipeline by raising the TaskCompleted event. Do not throw an exception in this case.
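
A sketch of this record-level error pattern follows. RecordObject and the TaskCompleted event are named in this guide; the record accessor, AddErrorMessage call, and RaiseTaskCompleted helper are hypothetical names for the operations described.

public override void Process(IDataflowTask task)
{
    RecordObject record = task.Record as RecordObject;  // hypothetical accessor
    if (record != null && !IsValid(record))
    {
        // Attach a record-level error instead of throwing.
        record.AddErrorMessage("Value out of range");   // hypothetical API name
    }
    // Pass the record downstream by raising TaskCompleted.
    RaiseTaskCompleted(task);                           // hypothetical helper
}

private bool IsValid(RecordObject record)
{
    // Hypothetical validation; replace with real checks.
    return true;
}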

Trace Output

Trace output is displayed in the transfer progress window during job execution. Trace output is also stored in the job execution log on the server. DataflowRuntimeContext provides APIs for adding to the trace output.

RecordObject Class

This class represents an individual record passing through the dataflow pipeline. RecordObject implements IDataObject and provides additional capabilities for adding record-level error messages. Use RecordObject to release output to the pipeline using the TaskCompleted event.

Developing Workflow Actions

Centerprise Workflow actions are objects that are run sequentially or in parallel by the Workflow engine. They are built on the concept of tasks rather than records as in Dataflow actions. The biggest difference is that a dataflow usually has all actions in the graph “active” while it is being run, running the processor code for each record, while a workflow is generally processing one action at a time, running its processor code only once. This allows for the execution of business logic at a higher level than a dataflow and is generally reserved for tasks that complement a dataflow such as moving files around or initializing and cleaning up resources.

Workflow actions visually appear very similar to dataflow actions within the designer. Both rely on the same structures in order to be present in the user interface. A key difference is that Dataflow actions have maps, while Workflow actions have links. Also, Workflow actions only appear in a Workflow diagram.

IWorkflowItemData

A major component of a Workflow action is the ItemData object. This object is instantiated for each action in the Workflow graph and is serialized along with the Workflow and persisted to disk when Save is called on the Workflow. It contains all of the action's properties that vary per instance of the action, such as file paths, flags, and other specific properties.

To be used as a Workflow ItemData object, a class must implement IWorkflowItemData. By implementing this interface, a class is responsible for the OutputMeta, which should describe the object returned by the action. For example, if an action returns an object that has properties A, B, and C, the OutputMeta should define these properties.

ITaskProcessor

A Workflow action’s ITaskProcessor is the component that actually executes the runtime logic for an instance of a Workflow action. This object only exists at run-time. Any code that is involved in actually performing the task the Workflow action represents should go in here.

To be used as a Workflow ITaskProcessor, a class must implement ITaskProcessor. The main method in this interface is the Start method. A class should also implement the Started and Completed events to signal completion to the Workflow engine.

IFlowActionTemplate

A Workflow action's Template behaves identically to a Dataflow action's Template. The Template is responsible for defining how a Workflow action appears visually to an end user of the designer. A class that implements this interface should define the name, description, and category of the action. It should also provide graphical UI parts, such as the icon displayed on the left side of an action box, as well as define any custom editor pages necessary for editing an action's ItemData.

When developing a Workflow Action Template, there are a couple of spots to pay particular attention to. The first is the template name. This is important, as you must decorate your ItemData class with the FlowTemplateName attribute using this name. The other spot to mind is the ActionTemplate's TemplateType. This should be set to WorkflowAction in order for it to show up in the Workflow toolbox.

The GetEditorPages method should be overridden in order to supply the designer with a custom property editor. This method returns an array of types; for each type supplied in the array, a page will be present in the editor's navigation. Property editors must implement IFlowActionEditorPage.

ITaskMonitoringInfo

During a Workflow action's processing, it may be necessary to return information about the task's progress. The processor must expose an ITaskMonitoringInfo property to contain this information. A class implementing this interface must provide the start time, end time, a status update, and the percentage completed. Other properties can be added as custom monitoring information.

Parameter Replacing (IHasResolveableParameters)

In many cases, properties being set for an action’s ItemData should not be set at design time, but should instead be set from the outside based on some dynamic condition. For example, an action might rely on a file path and that file path should refer to whatever path was dropped to initiate a file trigger. To accommodate this scenario, an action’s ItemData must implement IHasResolveableParameters.

If an ItemData implements IHasResolveableParameters, it must implement the ResolveParameters method. In this method, an ItemData can set its properties to other values, but the most likely intent is to resolve paths referring to another action's output value, a context value, or a document parameter value (e.g., $Parameter1.IncomingFilePath). To resolve these paths, it is best to use the static Astera.Transfer.ParameterResolver.ResolveParameter method.

Output Objects

A Workflow action's ItemData must create an Output object. This output object must match the OutboundMeta structure defined. Most Workflow actions return the WorkflowOutboundBase class, which exposes output properties such as CompletedSuccessfully. These output properties can be used in Decision node expressions to control execution paths. They can also be resolved to set another action's ItemData properties.
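
A custom output object might look like the following sketch. WorkflowOutboundBase and CompletedSuccessfully come from the description above; the extra property is illustrative.

// Illustrative output object; WorkflowOutboundBase already exposes
// properties such as CompletedSuccessfully.
public class TracerOutbound : WorkflowOutboundBase
{
    // A custom value that downstream actions or Decision nodes can resolve.
    public string ProcessedFilePath { get; set; }
}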

Custom Property Editors

Most Workflow actions' ItemData classes provide properties that need to be edited and therefore require a custom user interface to set them. To create these UI pages, create a user control and implement IFlowActionEditorPage. If your editor requires multiple pages, return the types of these UI pages from the ActionTemplate's GetEditorPages method.

Adding Functions

Centerprise functions can be used in the dataflow function map as well as from the Centerprise rules language. The Centerprise rules language is used in the Expression Map, Filter, Router, Data Quality Rules, and Data Driven Write Strategy transformations. Centerprise has an extensive library of built-in functions, including a number of functions for string and date processing, name and address parsing, financial and mathematical calculations, validations, and more. Centerprise provides the capability to augment the built-in function library by adding custom functions. The following sample shows how to write custom functions.

This sample demonstrates how to create custom functions for Centerprise. The class needs to have the RuleFunctionClass attribute, and each of the rule functions needs to have a Function attribute. These two attribute classes are defined in the Astera.Core namespace.

[RuleFunctionClass]
class RuleFunctions
{
    [Function("SampleRule", "This sample rule demonstrates the addition of two integers", true)]
    public static int SampleRuleFunction(int a, int b)
    {
        return a + b;
    }
}

Building Flow Documents Programmatically

Overview

Centerprise provides an open document object model for all the flow documents. These documents are saved as standard XML files. Centerprise provides the functionality to create, modify, load, and save flow documents.

This capability can be used to create customized solutions that require data integration functionality. Some usage scenarios that we have encountered include:

  • Customer integration portal where, based on source layout and mapping defined by users, a flow document is created and executed dynamically to process customer data.
  • An application that provides mapping capability to customers and modifies existing flow documents on the fly based on customer data layout and mapping
  • An application that needs high-performance data management capability

Working with Centerprise Server

ServerProxy Class

The ServerProxy class manages all interaction between client and server. It shields the CIF developer from the underlying server interaction and marshalling issues and provides an API to interact with the Centerprise Integration Server. You can communicate with any instance of the server running on your network through the ServerProxy class. You can use the ServerProxy class to:

  • Submit job to server
  • Monitor and manage jobs
  • Schedule jobs
  • Administer server

The ServerProxy class throws Astera.ServerException to the caller if the server is not available or if any other type of error occurs.

Job Class

The Job class is used to build jobs that can be run on local or remote servers. Using the TransferJob class, you can build single or batch transfer jobs and submit them to run on any server on the network via the ServerProxy class. If your server is configured for email notification, you can set email notification properties in TransferJob, and the server will send notification emails based on these properties.

Running Transfer

Running transfers using ServerProxy is straightforward. The sample Running Jobs on Server shows how to run single and batch transfers on a server.

Monitoring

Centerprise Integration Server provides extensive monitoring capabilities. The ServerProxy class contains methods that help you monitor jobs on servers. When you submit a job to a server, the server returns a job id. Using that id, you can request monitoring information from the server using GetJobMonitoringInfo. The server returns a JobMonitoringInfo object that provides detailed information about the status of the job. You can terminate a running job by calling Terminate with the job id.
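
Putting these calls together, a monitoring flow might look like the sketch below. ServerProxy, GetJobMonitoringInfo, JobMonitoringInfo, and Terminate are named above; the submit call and status property are assumptions.

ServerProxy proxy = new ServerProxy();            // connection setup omitted
int jobId = proxy.SubmitJob(job);                 // hypothetical submit call
JobMonitoringInfo info = proxy.GetJobMonitoringInfo(jobId);
Console.WriteLine(info.Status);                   // hypothetical status property

// Terminate the job if needed, using the same id.
proxy.Terminate(jobId);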

Scheduling

ServerProxy provides the SaveScheduledJob, DeleteScheduledJob, and GetScheduledJobs methods to manage scheduled jobs from your programs. The sample Scheduling Jobs on Server provides an example of creating new scheduled jobs.

Retrieving Server Monitoring Info

You can perform a health check on the server by calling GetServerInfo, which returns server version, processes currently queued and running, as well as staging directory path.

Server Administration

The server administration capability available in the Server Manager module of the studio is also available through the API. You can use the GetServerProperties and SaveServerProperties methods to retrieve and save server properties.

Dataflow Sample 1 - Distinct Transformation

Overview

This sample illustrates how to create a transformation where the input and output layouts of the transformation are the same. One such transformation is Distinct, a built-in transformation in Centerprise. We will examine the implementation of this transformation step by step.

A Distinct transformation takes a set of records and returns a set of distinct records. Distinct requires a key or a set of keys to determine the uniqueness of records. Apart from that, it exposes options to make the uniqueness check case sensitive, to channel duplicate records to another port, and to indicate whether the incoming data is already sorted on the keys.

The process of creating the transformation is divided into three parts: the item data structure, the runtime processor, and the user interface.

Item Data

Item data for the transformation is the data structure used by the runtime processor and the user interface. For transformations, there is a base implementation for item data called DataflowItemDataRecordTransformation. We derive DataflowItemDataDistinct from this base class. The base class has a property called Layout of type BaseLayout. This layout is used as both the input and output layout for this category of transformations, where the two layouts are the same. Beyond what is inherited from the base class, we add members/properties to hold values for our implementation: a structure to hold the keys, and flags for case sensitivity, for exposing duplicate records as another port, and for whether the incoming data is already sorted by the keys. You also need to specify the runtime processor class by overriding the CreateProcessor method. Here is the code snippet after adding the properties:

public class DataflowItemDataDistinct : DataflowItemDataRecordTransformation
{
    public DataflowItemDataDistinct()
    {
        this.Layout = new RecordLayout();
    }

    private List<string> _DistinctKeyFieldNames = new List<string>();

    public List<string> KeyFieldNames
    {
        get { return this._DistinctKeyFieldNames; }
    }

    public bool IsInputSorted { get; set; }
    public bool IsCaseSensitive { get; set; }
    public bool AddDuplicatePort { get; set; }

    public override IProcessor CreateProcessor(DataflowRuntimeContext context)
    {
        return new DistinctTransformationProcessor(context, this);
    }
}

There are some other methods from the base class that you may override. Please see the API documentation for details of these virtual methods.

Processor

A processor is created at runtime to execute the logic; it receives the item data and record objects to work with. For Distinct, we use a base implementation for the processor called BufferedSetProcessorBase. Distinct needs to gather all incoming records in a buffer before deciding which records are duplicates, which is why we use this base processor. When a record arrives in the pipeline, the Process method of the processor is called with an IDataflowTask parameter, which contains the incoming record object. Here is a code snippet showing the processor:

class DistinctTransformationProcessor : BufferedSetProcessorBase
{
    public DistinctTransformationProcessor(DataflowRuntimeContext context, DataflowItemDataDistinct distinct)
        : base(distinct.Name)
    {
        this.Context = context;
        this.Distinct = distinct;
    }

    DataflowItemDataDistinct Distinct;

    public override void Process(IDataflowTask task)
    {
        // Do processing here
    }
}

User Interface

The user interface consists of two parts: the template and the editor pages. A template is used to populate the toolbox in the designer and to draw the shape for the item when it is dragged and dropped on the designer. The editor pages are used for editing the properties of the transformation.

Template

A template is used as a stencil in the designer to create items of a type. A template contains information such as the name prefix to be used for the items, the description of the item, the image to be displayed in the designer, the template type for grouping in the toolbox, and information about the editor. In most cases, subclassing the base implementation DataflowTemplateBase and setting some properties is enough. If the transformation needs editor pages, you'll need to override the GetEditorPages method and provide the editor pages to be used in the property editor. Here is the template for Distinct:

public class DataflowTemplateDistinctTransformation : DataflowTemplateBase
{
    public DataflowTemplateDistinctTransformation()
    {
        this.Name = this.NamePrefix = "Distinct";
        this.Description = "Distinct";
        this.Image = new System.Drawing.Bitmap("distinct.bmp");
        this.Category = "Transformations";
        this.TemplateType = FlowActionTemplateType.SetTransformation;
        this.IsStateless = true;
    }

    public override Type[] GetEditorPages()
    {
        return new Type[] { typeof(WizardPageMetaObjectBuilder), typeof(WizardPageDistinct) };
    }
}

Editor Pages

Centerprise provides a convenient way to seamlessly integrate your own editors for the properties of item data. These editor pages appear in the wizard when you open the editor for the item. The base implementation for editor pages is WizardPageBase. For Distinct, we need two pages in the wizard: one to edit the layout and a second to edit the properties (IsInputSorted, IsCaseSensitive, and AddDuplicatePort). For the layout editor, we use the standard layout editor page WizardPageMetaObjectBuilder.

For editing the properties, we create a new editor page, WizardPageDistinct. We need to specify page properties, such as the title and tooltip, in the PageProperties attribute. The OnActionChanged method is called when the page is about to be displayed; by overriding it, we can populate the user interface with the properties from the ItemData. The code snippet for WizardPageDistinct looks like this:

[PageProperties(Title = "Distinct Transformation Properties",
    Tooltip = "Distinct Transformation Properties")]
public class WizardPageDistinct : WizardPageBase
{
    public WizardPageDistinct()
    {
        this.InitializeComponent();
    }

    protected override void OnActionChanged()
    {
        // Page is about to be shown. Populate your user interface using the
        // ItemData object.
    }
}

Dataflow Sample 2 - Union Transformation

Overview

In this sample, we'll cover an example of a custom transformation that accepts two inputs and only one output. This scenario might apply to both single-value transformations (maps) and record transformations, but for this example, we'll cover the record transformation case.

This example will create a record transformation that takes records coming from two sources and combines them into one data set. This is very similar to the built-in Union transformation that comes standard with Centerprise.

Parts

As with any custom transformation in Centerprise, you have to create four classes. These classes are:

  • The item data which contains the descriptive information about your transformation
  • The template, which contains the code that interfaces your transformation with the diagram designer and toolbox
  • The properties editor, UI code that is used to edit your item data class instance. This class is optional, but you’ll most likely need it if your transformation requires any editing.
  • The processor, which provides runtime execution functionality

Item Data

As always, start with the ItemData class. At the very minimum, the class should implement IFlowItemData, but it is much easier to write if it inherits from one of the many ItemData base classes available. In this case, we'll inherit from DataflowItemDataRecordTransformation, since we are creating a record set transformation.

Make sure to add references to Astera.Shared and Centerprise.NET to your project and include the relevant namespaces with using directives at the top of the code file.

Build Layout

The first order of business is to establish a layout for our ItemData class. This is the structure that stores information about fields, including their names, data types, indices, etc. You can see this information as serialized XML whenever you save a Dataflow.

Since we are inheriting from DataflowItemDataRecordTransformation, we should also set a layout. Set the layout in the constructor of your ItemData class.

Set the Layout property to a new instance of RecordLayout. You can create a custom layout type if your transformation requires special properties to be associated with the layout, but in most cases, a plain, vanilla RecordLayout object will suffice.
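
Following the Distinct sample earlier in this guide, the constructor might look like this (the class name is illustrative for this walkthrough):

public DataflowItemDataMultiInSingleOut()
{
    // A plain RecordLayout suffices unless the transformation
    // needs special properties associated with the layout.
    this.Layout = new RecordLayout();
}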

Build Metas

Most transformations contain a few explicit metadata objects used to tell the Centerprise document how to map between Actions (boxes). The InputMeta object defines the structure that will be used as input. The OutputMeta object defines the structure that will be used as output. And the DisplayMeta object defines a wrapper object that is more conducive to the Action box's tree and usually contains both input and output Meta objects. In many cases, the DisplayMeta is exactly the same as the Input and Output Metas (e.g., the Filter Transformation), but since we have different structures for input and output, we'll need a separate object.

Single Output

The first Meta we’ll build is the output since it is the easiest. For this example, the output is basically just the layout itself.

To create it, instantiate a new MetaObject with the same name as the ItemData instance.

this._OutboundMeta = new MetaObject(this.Name);

This will create an empty meta object with no fields or objects. We'll want to add the output node so that an "Output" node shows in the designer tree.

Create a method to add referenced nodes to any Meta object so we can use it for the input as well.

private void AddReferenced(IMetaObject meta, string name, int index = 0)
{
    MetaObject referenced = MetaObject.Build(this.Layout.Meta);
    referenced.Name = name;
    meta.Referenced.Add(referenced);
}

Now, add the “Output” referenced node.

this.AddReferenced(this._OutboundMeta, ParameterConstants.OutboundName);

We have now completed our OutputMeta. If you think of it as a tree, it now looks like MultiInSingleOut1->Output->leaf elements. That is, the Output Meta starts at the root of the tree, not just the “Output” node.

Clone Inputs

The inputs are a little trickier. In this example, we're going to have multiple inputs, all with the same layout, similar to how the built-in Union transformation works. If we add a field to Input1, the layout will change and, therefore, Input2 will change.

First, create the root of the input Meta tree:

_InboundMeta = new MetaObject(this.Name);

Now, add the multiple input objects that will be added as referenced nodes:

this.AddInputs(_InboundMeta);

private void AddInputs(IMetaObject meta)
{
    // Add default items so there are always at least 2 inputs
    if (this._InputItems.Count == 0)
    {
        this.InputItems.Add(ParameterConstants.InboundName + "_1");
        this.InputItems.Add(ParameterConstants.InboundName + "_2");
    }

    // Clone the Meta to have the exact same structure under each input item
    for (int i = 0; i < this.InputItems.Count; i++)
    {
        this.AddReferenced(meta, Astera.Core.Common.MakeValidName(this.InputItems[i]));
    }
}

This method adds two items to our list of input items if they are not already present. It then goes through the list and adds a referenced object for every item.

Display Meta

The display Meta is for showing the ItemData's inbound and outbound Meta structures in a more user-friendly manner. Recall that Inbound and Outbound are completely separate trees with two roots. To display both sets of information in a single tree (as most Actions in Centerprise do), we combine them into the DisplayMeta.

Create the DisplayMeta by first creating a root, adding the output, and then adding all of the inputs:

this._DisplayMeta = new MetaObject(this.Name);
this.AddReferenced(this._DisplayMeta, ParameterConstants.OutboundName);
this.AddInputs(this._DisplayMeta);

Since everything is driven by the Layout, consistency between the trees is taken care of automatically.

Set Can Add Element

The next thing we want to do is override CanAddElement in order to tell Centerprise under which nodes we want the special "new element" node. It only makes sense under input nodes, so we'll write as much:

public override bool CanAddElement(string objectPath)
{
    ObjectPath path = new ObjectPath(objectPath);

    // First, validate that we have a good path
    path.Validate(this.DisplayMeta);

    // Make sure the path is not the output and not the message path
    ObjectPath outputPath = new ObjectPath(this.DisplayMeta, ParameterConstants.OutboundName);
    return path.ToString() != this.Name && path != outputPath && outputPath.IsAncestorOf(path) == false && IsMessagePath(path) == false;
}

This method is called for every object node in the tree when the UI is building the tree. The ObjectPath will come in Centerprise's standard dot notation (e.g., instancename1.subnodeinput).

This method basically checks that the path is not an output node and not the root. It also takes Data Quality mode into account by not showing the special node under the "Messages" node.

Set Port Directions

The last thing we need to do in our ItemData is set the ports. By default, every node and element will have input and output ports. This is probably not desirable. In our case, we do not want the root to have input or output ports.

public override MetaUsage GetPathUsage(ObjectPath path)
{
    // For this type of structure, you can make use of built-in functionality
    // to determine if paths are inputs and outputs
    return this.GetMultiInputSingleOutputPathUsage(path);
}

You are free to have custom logic for each path you encounter, but in most cases, you should make use of the base class's GetMultiInputSingleOutputPathUsage method, which takes care of this logic for you.

Map Groups

Finally, set the map groups. Since our transformation has multiple inputs, it has multiple sets of mappings.

#region IBuildsMapGroups Members

public bool BuildMapGroups
{
    get { return true; }
}

public MapGroup[] GetMapGroups(IDataflowDocumentContext verifier, VerificationMessageList messages)
{
    return GetMapGroupsForMultiInputOneOutput(verifier, messages);
}

#endregion

Make use of the base class's GetMapGroupsForMultiInputOneOutput method for convenience.

Note that if your transformation is a set transformation that makes use of multiple inputs and does not implement this interface, the document will be unable to determine how one record can come from two different sources.

Template

Now that our ItemData is complete, it is time to move on to the template. The template is responsible for how our transformation looks in the toolbox (name, category, icon, etc.), as well as which controls to use as its editor and which processor to use to modify the record/set. It's basically the glue that holds everything together.

Set Common Properties

Set the usual properties in the constructor of this class.

this.Category = "Sample";
this.Name = "MultiInSingleOut";
this.NamePrefix = "MultiInSingleOut";
this.Description = "Multiple In Single Out";
this.IsDefaultExpanded = true;
this.TemplateType = FlowActionTemplateType.SetTransformation;
this.Image = Resource1.rightbox.ToBitmap();

The one to take note of here is the TemplateType. Make sure to set it to the correct type of transformation as there are several. Since we are transforming the record set, this is a SetTransformation.

Deploy

At this point, we have the absolute minimum to deploy to Centerprise. The transformation will show up in the toolbox and you'll be able to drag and drop it onto the designer, but you will not be able to preview it, since it has no processor yet.

Please refer to this guide’s section on deployment for more information on that subject.

Processor

Our next task is to set up the brains for our transformation. Since this example is basically a simple union, it is really not that smart.

Create Processor Class

Inherit from ProcessorBase

For your convenience, you may choose to inherit your class from ProcessorBase. This class implements some of the basic functions that all processors must implement, but doesn't really do anything. If your processor needs special logic for termination, closing, or initialization, either implement ISetProcessor directly or override the appropriate methods.

Implement ISetProcessor

ISetProcessor vs. ISingleProcessor

Make sure your class implements one of the processor interfaces. For a set transformation, implement ISetProcessor. For a map, implement ISingleProcessor.

Override Process

Override the abstract Process method. This is where all of the processing should happen to modify a record/set. Since we're not really modifying anything, we leave the Process method essentially empty and pass each record on to the output.
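
A pass-through Process method might look like the sketch below; the exact mechanism for releasing a record downstream is an assumption based on the pipeline's TaskCompleted event described earlier in this guide.

public override void Process(IDataflowTask task)
{
    // No per-record modification for a simple union:
    // release the incoming record to the output.
    RaiseTaskCompleted(task);  // hypothetical helper that raises TaskCompleted
}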

Summary

With our ItemData, Template, and Processor completed, we now have all the pieces necessary to start using our transformation in dataflows. We should now be able to feed data into it from two different sources and have it output from a single port.

Workflow Sample 1 - Simple Workflow Task

Introduction

This example will focus on creating a simple workflow task in Centerprise. A workflow task differs from a dataflow transformation in that there is no record-by-record processing happening in a workflow. It executes once, and then passes control to the next task(s).

This example will create a very simple task action that will write to a log file the date and time that it was run. It will demonstrate the very basics of a workflow task without going into too much detail about other, more advanced options available for a workflow task, such as custom monitoring information.

Creating the Classes

To create our task, we must implement at least three classes. If we need a custom user interface to edit the stored data properties, we will have to create that class as well. This sample uses a hard-coded file path and no custom user interface; refer to the "Custom Property Editors" section for that.

The three mandatory classes for a workflow task are:

  • The ItemData class
  • The TaskProcessor
  • The Template

The ItemData class contains the data properties about your custom workflow task. For example, our workflow task contains a FilePath property that will be used to write to a specific location on disk.

The TaskProcessor is the component that actually performs the execution of the task. It usually contains a reference to the ItemData in order to use the task’s properties.

The Template is the class that describes how this task looks in the toolbox and on the designer. It also connects the UI to the back end. It is the glue that holds all of these classes together.

ItemData

The first class we should create for our task is the ItemData. To do so, create a class called WorkflowItemDataTracer.

You'll need to import the Astera.Core, Astera.Transfer, and Astera.Transfer.Workflow namespaces:

using Astera.Core;
using Astera.Transfer;
using Astera.Transfer.Workflow;

FlowTemplate Attribute

In order to connect this ItemData to the correct template in the toolbox, you'll have to decorate this class with the FlowTemplateName attribute:

[FlowTemplateName("WorkflowTracer")]
public class WorkflowItemDataTracer : WorkflowItemDataBase

The "WorkflowTracer" string parameter will be the name of the template that we will create.

Inherit from WorkflowItemDataBase

To make this a workflow ItemData class, you must implement IWorkflowItemData, but inheriting from WorkflowItemDataBase saves a lot of coding.

Properties

Since this is where we store all of our task's properties, we should write them here. In this case, we have a FilePath property that dictates where to write our file.

public WorkflowItemDataTracer()
{
    _LogFilePath.SetPath(@"C:\CenterpriseExampleLogFile.txt", true);
}

private FilePath _LogFilePath = new FilePath();

public FilePath LogFilePath
{
    get
    {
        return _LogFilePath;
    }
}

Here we're using the built-in Astera.Core.FilePath type to handle UNC resolutions automatically for us. We're also hard-coding this value in the constructor.

Verification

The ItemData class is also where verification should be written. In this example, we will check for the presence of a file path.

public override void Verify(string name, FlowDocumentVerifier verifier)
{
    base.Verify(name, verifier);

    if (this.LogFilePath == null || string.IsNullOrWhiteSpace(this.LogFilePath.Path))
    {
        verifier.AddMessage(this.Name, "File path must be provided.");
        return;
    }
}

CreateProcessor

The last thing our ItemData must do is create our processor, which is the next class we need.

TaskProcessor

The task processor is responsible for executing the logic for this task. In this case, the processor is the component that actually writes to the file.

Create a class and name it WorkflowTracerProcessor.

Implement ITaskProcessor or Inherit from TaskProcessorBase

In order for this class to be recognized as a task processor, you must implement ITaskProcessor. In our case, we will inherit from TaskProcessorBase, which takes care of most of the ITaskProcessor implementation.

Add an ItemData Property

We’ll use this property to control where we write our file. In general, all TaskProcessors should have a reference to the ItemData class that contains the editable properties.

public WorkflowItemDataTracer Data { get; private set; }

Create Constructor Method, Pass In Context and TracerItemData

The TaskProcessor should implement a constructor that takes as parameters the WorkflowRuntimeContext and the related ItemData instance.

The WorkflowRuntimeContext contains environment information about the task including the graph and the item instance in the graph. We make no use of it in this example, so just pass it along to the base class constructor.

The ItemData parameter we’ll use to set our local ItemData property.

public WorkflowTracerProcessor(WorkflowRuntimeContext context, WorkflowItemDataTracer tracer)
    : base(context, tracer.Name)
{
    this.Data = tracer;
}

Take note of the second parameter for the base constructor where we’re passing the name of the ItemData instance.

Implement CreateProcessor In ItemData

Now that we have our constructor set up, go back to our ItemData class and create an instance of our processor:

public override ITaskProcessor CreateTaskProcessor(WorkflowRuntimeContext context)
{
    return new WorkflowTracerProcessor(context, this);
}

Override Initialize / Cleanup methods

If your processor requires initialization or cleanup, make sure to override the appropriate methods from the base. Our processor doesn't require it, but the overrides are shown here for reference:

public override void Initialize()
{
    // Any initialization code for this processor should go here
}

public override void Close()
{
    // Any closing logic should go here
}

public override void Terminate()
{
    // Any early termination logic (error, user cancels) should go here
}

Implement StartTask

Now, implement the abstract StartTask method. This is the method that will do all of the work for the task.

protected override void StartTask()
{
    try
    {
        System.IO.File.AppendAllLines(this.Data.LogFilePath.Path, new string[] { string.Format("Tracer was run at {0}", DateTime.Now) });
    }
    catch (Exception ex)
    {
        //set the error info on the TaskMonitoringInfo to signal to Centerprise that there
        //was an error completing this task. If this is set, control will flow down the error
        //path instead of the normal path for the subsequent action
        this.TaskMonitoringInfo.ErrorInfo = new ErrorInfo(ex);
    }
    finally
    {
        RaiseCompleted();
    }
}

Use Properties from ItemData

Note that this method writes the file based on the file path we received from the ItemData instance. Nearly all TaskProcessors should follow this pattern.

Error Info

In the catch block, we set the TaskMonitoringInfo's ErrorInfo property with any errors we encounter. We do this to signal to Centerprise that this task is in error and that the error path should be followed instead of the normal control flow.

RaiseCompleted

Make sure to call RaiseCompleted. Failure to do this will result in the workflow job running forever and appearing to hang.

Template

To create the template, create a class called WorkflowTracerTemplate

Be sure to inherit from FlowActionTemplateBase.

public class WorkflowTracerTemplate : FlowActionTemplateBase

Set Common Properties

Set the usual properties for the template

public WorkflowTracerTemplate()
{
    this.Category = "Sample";
    this.Name = "WorkflowTracer";
    this.NamePrefix = "WorkflowTracer";
    this.Description = "Workflow Tracer";
    this.IsDefaultExpanded = true;
    this.TemplateType = FlowActionTemplateType.WorkflowAction;
    this.Image = Resource1.rightbox.ToBitmap();
}

The name is important because it is how you tie the ItemData class to the template. Make sure the name matches the value in the FlowTemplateName attribute on the ItemData class, shown below.

Set Template Type

The TemplateType is also important. Since this is a workflow task, make sure to set it to WorkflowAction. Failure to do so will result in this task not appearing in the correct flow document.

this.TemplateType = FlowActionTemplateType.WorkflowAction;

Refer To Template in ItemData Class

Go back to the ItemData class and decorate it with the following attribute:

[FlowTemplateName("WorkflowTracer")]
public class WorkflowItemDataTracer : WorkflowItemDataBase

Make sure that the name matches, or the template will not appear in the toolbox.

Summary

These three classes complete this example of a simple workflow task. In much of this example, we relied heavily on the base class implementations, but these are not required. Please refer to the API reference to see what interfaces are available for added flexibility.

Working with the Integration Server

ServerProxy Class

The ServerProxy class manages all of the interaction between client and server. It shields the CIF developer from the underlying server interaction and marshalling issues and provides an API to interact with Centerprise Integration Server. You can communicate with any instance of the server running on your network through the ServerProxy class. You can use the ServerProxy class to:

  • Submit jobs to the server
  • Monitor and manage jobs
  • Schedule jobs
  • Administer the server

The ServerProxy class throws Astera.ServerException to the caller if the server is not available or for any other type of error.
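
As a minimal sketch (using the same local host connection settings as the samples below), obtaining a proxy and handling a connection failure looks like this:

try
{
    ServerConnection connection = new ServerConnection();
    connection.Host = "127.0.0.1";
    connection.Port = 9257;
    //Servers.GetServer returns a ServerProxy for the server at the given URI
    ServerProxy proxy = Servers.GetServer(connection.URI);
}
catch (ServerException ex)
{
    //the server is unavailable or returned an error
    Console.WriteLine("Unable to reach server: " + ex.Message);
}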

Job Class

The Job class is used to build jobs that can be run on local or remote servers. Using the Job class, you can build dataflow or workflow jobs and submit them to run on any server in the network using the ServerProxy class. If your server is configured for email notification, you can set the email notification properties on the Job, and the server will send notification emails based on these properties.

Running a Job

Running jobs using ServerProxy is straightforward. The following sample shows how to run a dataflow on the server.

/// <summary>
/// This sample shows how to run a job on a server and monitor the progress of the job
/// </summary>
class JobRunner
{
    string FilesDirectory
    {
        get
        {
            return System.IO.Path.Combine(Environment.CurrentDirectory, "files");
        }
    }

    public JobRunner()
    {
        //this sample assumes a server running on local host
        this.ServerConnection.Host = "127.0.0.1";
        this.ServerConnection.Port = 9257;
    }

    /// <summary>
    /// This is the location of the dataflow setting file in the file system
    /// </summary>
    private string _DataflowSettingLocation = "runjob.df";
    public string DataflowSettingLocation
    {
        get
        {
            return System.IO.Path.Combine(this.FilesDirectory, _DataflowSettingLocation);
        }
        set
        {
            _DataflowSettingLocation = value;
        }
    }

    private string _DelimitedSourceLocation = "FileA.csv";
    public string DelimitedSourceLocation
    {
        get
        {
            return System.IO.Path.Combine(this.FilesDirectory, _DelimitedSourceLocation);
        }
    }

    /// <summary>
    /// This is the information of the server where we intend to run this job
    /// </summary>
    private ServerConnection _ServerConnection = new ServerConnection();
    public ServerConnection ServerConnection
    {
        get
        {
            return _ServerConnection;
        }
        set
        {
            _ServerConnection = value;
        }
    }

    /// <summary>
    /// Takes a job, gets the ServerProxy, and submits the job to the server
    /// </summary>
    public Int64 RunJobOnServer(Job job)
    {
        ServerProxy serverProxy = GetServer();
        Int64 jobId = serverProxy.StartJob(job);
        return jobId;
    }

    public void RunJobOnServer()
    {
        FlowDocument dataflowDoc = GetDataflow(this.DataflowSettingLocation);
        //setting the source path here so you don't have to place a file at a specific
        //spot on the drive; the setting saved in the .df file is hard-coded to c:\filea.csv
        DataflowItemDataSourceDelimited delimitedSourceData = dataflowDoc.Actions[0].ItemData as DataflowItemDataSourceDelimited;
        delimitedSourceData.FilePath = this.DelimitedSourceLocation;
        Job job = CreateJob(dataflowDoc);
        long jobId = RunJobOnServer(job);
        MonitorJob(jobId);
    }

    /// <summary>
    /// Monitors the job and writes the trace info to the console
    /// </summary>
    public void MonitorJob(long jobId)
    {
        Console.Title = "Monitoring Sample";
        ServerProxy serverProxy = GetServer();
        while (true)
        {
            JobMonitoringInfoBase monitoringInfo = serverProxy.GetJobMonitoringInfo(jobId);
            if (monitoringInfo.IsProcessRunning == false)
            {
                Console.ReadKey();
                break;
            }
            foreach (TaskTraceItem traceItem in monitoringInfo.TraceEntries)
            {
                Console.WriteLine(traceItem.Text);
            }
            //pause briefly between polls so we don't flood the server with requests
            System.Threading.Thread.Sleep(1000);
        }
    }

    /// <summary>
    /// Gets the ServerProxy from the ServerConnection (server name and port number)
    /// </summary>
    private ServerProxy GetServer()
    {
        //Servers.GetServer throws an Astera.ServerException if the server cannot be reached
        return Servers.GetServer(this.ServerConnection.URI);
    }

    FlowDocument GetDataflow(string dataFlowPath)
    {
        return new DataflowPersister().LoadXmlFromFile(dataFlowPath);
    }

    /// <summary>
    /// Creates a job for a dataflow setting
    /// </summary>
    public Job CreateJob(FlowDocument dataflow)
    {
        Job job = new Job();
        job.Title = this.DataflowSettingLocation;
        //uncomment below to use the file path instead of sending the contents directly as xml
        //job.FilePathResolved.Path = this.DataflowSettingLocation;
        job.TransferJobType = ServerJobType.Dataflow;
        job.TransferObjectXml = new DataflowPersister().GetXmlData(dataflow);
        return job;
    }

}

Monitoring

Centerprise Integration Server provides extensive monitoring capabilities. The ServerProxy class contains methods that help you monitor jobs on servers. When you submit a job to a server, the server returns a job id. Using that id, you can request monitoring information from the server using GetJobMonitoringInfo. The server returns a JobMonitoringInfo that provides detailed information about the status of the job. You can terminate a running job by calling Terminate with the job id.
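
As a sketch, here is a method you could add to the JobRunner sample above to cancel a job that is still running. We assume Terminate takes the job id, as described above; check the API reference for the exact signature.

public void TerminateJob(long jobId)
{
    ServerProxy serverProxy = GetServer();
    //ask the server for the current status of the job
    JobMonitoringInfoBase monitoringInfo = serverProxy.GetJobMonitoringInfo(jobId);
    if (monitoringInfo.IsProcessRunning)
    {
        //stop the job on the server using its id
        serverProxy.Terminate(jobId);
    }
}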

Scheduling

ServerProxy provides SaveScheduledJob, DeleteScheduledJob, and GetScheduledJobs methods to manage scheduled jobs from your programs. The following sample provides an example of creating a new scheduled job.

/// <summary>
/// This sample shows how to schedule a job on a server
/// </summary>
class ScheduleJob
{
    public ScheduleJob()
    {
        //this sample assumes a server running on local host
        this.ServerConnection.Host = "127.0.0.1";
        this.ServerConnection.Port = 9257;
    }

    string FilesDirectory
    {
        get
        {
            return System.IO.Path.Combine(Environment.CurrentDirectory, "files");
        }
    }

    private string _DataflowSettingLocation = "runjob.df";
    public string DataflowSettingLocation
    {
        get
        {
            return System.IO.Path.Combine(this.FilesDirectory, _DataflowSettingLocation);
        }
        set
        {
            _DataflowSettingLocation = value;
        }
    }

    private string _DelimitedSourceLocation = "FileA.csv";
    public string DelimitedSourceLocation
    {
        get
        {
            return System.IO.Path.Combine(this.FilesDirectory, _DelimitedSourceLocation);
        }
    }

    /// <summary>
    /// This is the information of the server where we intend to run this job
    /// </summary>
    private ServerConnection _ServerConnection = new ServerConnection();
    public ServerConnection ServerConnection
    {
        get
        {
            return _ServerConnection;
        }
        set
        {
            _ServerConnection = value;
        }
    }

    public void ScheduleJobOnServer()
    {
        DataflowDocument dataflowDoc = new DataflowPersister().LoadXmlFromFile(this.DataflowSettingLocation);
        Job job = CreateJob(dataflowDoc);
        ScheduleJobOnServer(job);
    }

    /// <summary>
    /// Takes a job, gets the ServerProxy, and schedules the job on the server
    /// </summary>
    public void ScheduleJobOnServer(Job job)
    {
        ServerProxy serverProxy = GetServer();
        //we are choosing a run-once schedule here; you can choose any kind of schedule
        ScheduledJobTransfer scheduledJob = new ScheduledJobTransfer();
        scheduledJob.Schedule = new RunOnceSchedule();
        scheduledJob.FilePathResolved = job.FilePathResolved;
        scheduledJob.Id = 0;
        scheduledJob.Name = job.Title;
        scheduledJob.NotificationEmail = job.NotificationEmail;
        scheduledJob.RedirectLocalOutputFilesToStaging = true;
        //save the schedule
        serverProxy.SaveScheduledJob(scheduledJob);
    }

    /// <summary>
    /// Gets the ServerProxy from the ServerConnection (server name and port number)
    /// </summary>
    private ServerProxy GetServer()
    {
        //Servers.GetServer throws an Astera.ServerException if the server cannot be reached
        return Servers.GetServer(this.ServerConnection.URI);
    }

    /// <summary>
    /// Creates a job for a dataflow setting
    /// </summary>
    public Job CreateJob(DataflowDocument dataflow)
    {
        Job job = new Job();
        job.Title = this.DataflowSettingLocation;
        job.FilePathResolved.Path = this.DataflowSettingLocation;
        job.TransferJobType = ServerJobType.Dataflow;
        job.TransferObjectXml = new DataflowPersister().GetXmlData(dataflow);
        return job;
    }

}

Retrieving Server Monitoring Info

You can perform a health check on the server by calling GetServerInfo, which returns the server version, the processes currently queued and running, and the staging directory path.
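
A minimal sketch of such a health check, reusing the ServerConnection settings from the samples above (the shape of the returned object is not documented here, so its individual properties are not shown; see the API reference):

ServerProxy serverProxy = Servers.GetServer(this.ServerConnection.URI);
//the returned object carries the server version, queued and running
//processes, and the staging directory path
var serverInfo = serverProxy.GetServerInfo();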

Server Administration

The server administration capability available in the Server Manager module of the studio is also available through the API. You can use the GetServerProperties and SaveServerProperties methods to retrieve and save server properties.
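
A minimal sketch of the round trip, again reusing the connection settings from the samples above (the individual server properties are not enumerated here; see the API reference):

ServerProxy serverProxy = Servers.GetServer(this.ServerConnection.URI);
//read the current server properties
var serverProperties = serverProxy.GetServerProperties();
//...adjust individual settings on serverProperties here...
//write the modified properties back to the server
serverProxy.SaveServerProperties(serverProperties);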