Transformer SDK For C  7.1.5.312996
Transformer SDK For C Documentation
About Transformer

Caplin Transformer is an event-driven real-time data transformation engine optimised for web trading services, with its processing logic implemented in Transformer Modules. It receives large volumes of raw real-time market data and republishes it as value-added data in real time, either to other DataSource applications or to generic output such as ODBC, XML or message queues. As a DataSource application, Caplin Transformer can also receive requests for data from other DataSource applications such as Caplin Liberator.

This is achieved by a set of business modules that implement the specific algorithms required. Examples of the calculations and other processing Caplin Transformer can perform include:

  • a module can calculate the top 10 shares in terms of price, yield, number of transactions per hour, and so on;
  • a module can log incoming data to a database and, if a corrected or marked trade appears in a feed, determine whether it is passed on;
  • a module can simplify updates so that, if multiple updates are contained within an update packet, only the last (and hence most significant) is sent out (see the sketch after this list);
  • a module can perform predetermined tasks according to market conditions, such as sending initialisation messages when markets open and running end-of-day calculations when they close;
  • a module can process suspensions in trading and determine what to do with the data in line with market rules.
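To make the update-simplification rule above concrete, the sketch below keeps only the last value seen for each field identifier in a packet. It is purely illustrative C: the struct and function names are invented for this example and are not part of the Transformer SDK.

    /* Illustrative only: keep just the last update per field id in a packet.
     * The types and function below are invented for this sketch and are not
     * part of the Transformer SDK. */
    #include <stdio.h>

    struct field_update {
        int  fid;          /* field identifier */
        char value[32];    /* field value as text */
    };

    /* Collapse repeated field ids in place; the later value wins.
     * Returns the new number of updates. */
    static int conflate(struct field_update *updates, int count)
    {
        int out = 0;
        for (int i = 0; i < count; i++) {
            int j;
            for (j = 0; j < out; j++) {
                if (updates[j].fid == updates[i].fid) {
                    updates[j] = updates[i];    /* overwrite earlier value */
                    break;
                }
            }
            if (j == out)
                updates[out++] = updates[i];    /* first time this fid is seen */
        }
        return out;
    }

    int main(void)
    {
        struct field_update packet[] = {
            { 22, "101.5" }, { 25, "102.0" }, { 22, "101.7" }
        };
        int n = conflate(packet, 3);
        for (int i = 0; i < n; i++)
            printf("FID %d = %s\n", packet[i].fid, packet[i].value);
        return 0;    /* prints FID 22 = 101.7 and FID 25 = 102.0 */
    }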

System architecture

The figure below shows how the Transformer fits into the Caplin real-time data architecture.

[Figure: Caplin Transformer System Architecture]

Several market data sources contribute information onto the DataSource API. This data is accepted by the Caplin Transformer, which processes it and can then output it in three ways:

  • back to the DataSource API, where it can be picked up by any suitably configured DataSource-enabled application.
  • to a database, where information can be kept for historical analysis and interrogated by a client web server.
  • to a Caplin Liberator, which can then publish the information over the Internet.
Internal Architecture

The figure below shows the internal structure of the Caplin Transformer.

[Figure: Caplin Transformer internal architecture]
  • The data sink element of Caplin Transformer extracts relevant data from the DataSource API.
  • Central processing sends the data to the modules, which perform the specific algorithms that adjust the data.
  • Amended data is then either output to external applications, such as databases that store historical records, or sent back to the core, which publishes it onto the DataSource API through its data source.
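A rough sketch of this flow in C is shown below: a record arrives, the module derives a value-added field, and the amended record is handed back for publication. The on_update and republish functions are hypothetical stand-ins for illustration, not part of the SDK API.

    /* Hypothetical sketch of the data flow: the core delivers an update, the
     * module amends it, and the result goes back to the core for output. */
    #include <stdio.h>

    struct record {
        const char *subject;   /* e.g. "/FX/GBPUSD" */
        double      bid;
        double      ask;
        double      mid;       /* value-added field derived by this module */
    };

    /* Stand-in for handing the amended record back to the core's data source. */
    static void republish(const struct record *rec)
    {
        printf("publish %s bid=%.4f ask=%.4f mid=%.4f\n",
               rec->subject, rec->bid, rec->ask, rec->mid);
    }

    /* Stand-in for the update callback the core would invoke per record. */
    static void on_update(struct record *rec)
    {
        rec->mid = (rec->bid + rec->ask) / 2.0;   /* the "business rule" */
        republish(rec);
    }

    int main(void)
    {
        struct record r = { "/FX/GBPUSD", 1.2745, 1.2751, 0.0 };
        on_update(&r);   /* simulate the core delivering one update */
        return 0;
    }
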
Caplin Transformer and DataSource peers

As well as being sources of data, products attached to the DataSource API can be destinations for data sent from Caplin Transformer, as illustrated in the figure below:

[Figure: Caplin Transformer acting as a data source and data sink]

A DataSource peer is a remote application that uses Caplin's DataSource protocol to attach to the DataSource API and both send and receive data.

The link between DataSource peers is therefore bidirectional, and the true relationship between the elements is shown in the figure below.

[Figure: Bidirectional links between DataSources]

Objects can be requested from individual DataSource peers or groups of peers.

Transformer Modules

The Transformer is supplied with the following modules:

  • pipeline - A module that implements a simple scripting API for the development of business rules.
  • jtm - A module that implements a Java API for the development of modules.
  • format - A module that exposes generic formatting functionality for use by other modules.
  • cluster - A module that implements clustering and data replication between Transformer nodes.
  • persistence - A module that provides support for persisting data.
Module Types

The Transformer C SDK supports the development of two types of module: conventional modules, which can only interact with other modules by passing data updates, and extension modules, which allow functionality to be shared between modules.

All modules should implement the initialisation function mod_init().

Conventional Transformer Modules

This is the most common module type; it is used to implement modules that perform some form of processing on the data.
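The skeleton below sketches what a conventional module might look like. Only the name mod_init() is taken from this documentation; its signature, the register_update_handler hook and the handler type are assumptions made for illustration, not the real SDK interface.

    /* Skeleton of a conventional module. Only mod_init() is named in this
     * documentation; its signature and register_update_handler() are
     * hypothetical placeholders for the real SDK calls. */
    #include <stdio.h>

    /* Hypothetical handler: called with a subject and one field update. */
    static void my_update_handler(const char *subject, int fid, const char *value)
    {
        /* Perform some processing on the data, then republish it. */
        printf("republishing %s: FID %d = %s\n", subject, fid, value);
    }

    /* Hypothetical registration hook; a real module would call into the core.
     * Here the handler is invoked once so the skeleton runs standalone. */
    static void register_update_handler(
        void (*handler)(const char *, int, const char *))
    {
        handler("/DEMO/VOD.L", 22, "101.5");
    }

    /* Initialisation function that every module should implement
     * (signature assumed for this sketch). */
    int mod_init(void)
    {
        register_update_handler(my_update_handler);
        return 0;   /* assume 0 indicates success */
    }

    int main(void)   /* stand-in driver so the sketch compiles and runs alone */
    {
        return mod_init();
    }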

API Extension Modules

This module type allows the functionality of a module to be exposed to other modules. Possible uses include:

  • unified debug handling between modules;
  • data parsing routines, e.g. handling of fractions;
  • database access routines.

The development of an API extension module is identical to that of a conventional module; however, it should also implement an interface that is exposed using the extension functionality.
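As a sketch of the general idea, the example below bundles a fraction-parsing routine into an interface structure and registers it for use by other modules. The register_extension function and the structure layout are hypothetical; the real mechanism is the extension functionality provided by the SDK.

    /* Hypothetical sketch of an API extension module exposing a
     * fraction-parsing routine to other modules. */
    #include <stdio.h>

    /* The interface this extension would expose to other modules. */
    struct fraction_api {
        double (*parse_fraction)(int whole, int num, int den);
    };

    static double parse_fraction(int whole, int num, int den)
    {
        return den ? (double)whole + (double)num / (double)den : (double)whole;
    }

    static const struct fraction_api api = { parse_fraction };

    /* Stand-in for the SDK's extension registration call. */
    static void register_extension(const char *name, const void *iface)
    {
        (void)iface;
        printf("extension '%s' registered\n", name);
    }

    int mod_init(void)
    {
        register_extension("fraction_api", &api);
        return 0;
    }

    int main(void)   /* stand-in driver: another module calling the interface */
    {
        if (mod_init() == 0)
            printf("31 5/8 = %.3f\n", api.parse_fraction(31, 5, 8));
        return 0;
    }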

