Introduction to Remote Automation

This chapter discusses the remote automation of processes in the eCognition Server® via the eCognition SOAP protocol interface.

The DIACmdClient.exe, DIACmdEngine.exe and DIAMkWksp tools are provided. While these are command line tools rather than a programming interface, they use the SOAP protocol interface to provide functionality that can be used, for instance, in a script to automate eCognition Server®.

Relevant Files for Remote Automation

The files included in the eCognition Developer installation that are relevant to the use of the Automation API for remote automation are listed below.

The lib files provided in the Automation installation are not required to build or run DSOAP applications; they are provided for use of the Automation DAMI-API. Go to the folder SDK, which is installed in the eCognition 10.5 installation directory. The default path is C:\Program Files\Trimble\eCognition 10.5\SDK

Help Files

API Header Files

API Library Files: Not applicable

Samples

The eCognition Remote Automation API

There are two ways of sending a job to the eCognition Server®:

	- using the SOAP protocol interface directly
	- using the DIA Command Line Client

Both methods have advantages and disadvantages. The main advantage of the SOAP interface is its breadth of functionality: more or less everything related to eCognition Server® can be done through it. In contrast, the DIA Command Line Client is limited to submitting jobs and waiting for them to finish, and it requires the creation of a workspace file. However, the DIA Command Line Client is easier to use and, in most cases, more than sufficient for script solutions.

Integration of Remote Automation Applications with eCognition Software

Starting a Client with a Preloaded Element

For integration projects you may want to start a client with project, image or rule set loaded. The following parameters are available:

	/image        - image file
	/ruleset      - rule set file
	/project      - dpr to be loaded
	/product      - product to be started
	/portal       - portal to be started
	/workspace    - dpj to be loaded

For example, you can open the Life portal using:

DIAClient.exe /product "Definiens Developer XD" /portal "Life"

Similarly, to open the Rule Set Mode portal of eCognition Developer, use:

"C:\Program Files\Trimble\eCognition Developer 10.5\bin\DIAClient.exe" /product "eCognition Developer" /portal "Rule Set Mode"

Command Line Tools

DIA Command-line Client - DIACmdClient.exe

The DIA Command Line Client is designed to submit analysis jobs to the DIA Job Scheduler and to monitor the submitted jobs. A job is represented by a list of projects stored in a workspace file. The workspace files can be created using the DIA Command Line Client. The workspace file can also point to a rule set file that will be used for the analysis of the projects; however, you can provide a different rule set file for an image analysis.

When you start the DIA Command Line Client, a list of commands and their definitions is displayed:

DIA Command Line Client.
Usage:    DIACmdClient    action [options]
		workspace_file [ruleset_file] [scene_name]
or:       DIACmdClient   action [options] -db user[:pwd]@storage
		workspace_id [ruleset_id]
 
actions:
s 	- submit the workspace to analysis
p 	- submit the workspace for stitching
w 	- wait for the workspace to finish
t 	- test the state of the analysis
r 	- rollback the workspace and delete results
d 	- delete single run (with results)
sw 	- submit to analysis and wait
pw 	- submit for stitching and wait
 
options:
-p           - analyse tiles only
-top         - analyse top scenes only
-u url       - use specified URL to access the Job Scheduler
-t sec       - maximum wait time for the Job Scheduler to start up
-run name    - specifies the run name
-fsr       - force submit even if other runs are not finished yet
-scn file  - submit only scenes listed in the given file
-ro        - read only; do not modify the workspace
-db        - open workspace from Data Management storage
-auth url  - use specified URL to access the Authentication Server
 

For example, if you only want to submit an image analysis job, use the (s) action. The DIA Command Line Client performs the command and, as soon as the submission is completed, control returns to you. If you want to submit and monitor the job (sw), or just monitor it (w), the command blocks until the job is finished.

When a job is monitored, it is identified in the Job Scheduler by the workspace GUID stored in the workspace file. When several jobs have been submitted from the same workspace, only the last job is monitored.

The (-t sec) option sets the maximum waiting time (in seconds) for the DIA Job Scheduler to start up. This is useful when, for instance, the DIA Job Scheduler is started in the background and the DIA Command Line Client is then started immediately from the same batch script; the DIA Job Scheduler needs some time for initialization and for connecting to the license server. The (-ro) option leaves the content of the workspace unchanged. Otherwise, a new operation ("Analyze") with the current user name and a timestamp is added to the operation history, and new versions of all submitted projects are created. Also, if the (w) action is used, the resulting status after analysis is stored in the workspace.

The (-db) option allows you to open a workspace from Data Management storage instead of a file. In this case the rule set is also taken from the storage; consequently, IDs should be used instead of file names. The user name and password are checked in the Authentication Server, which can be specified using the (-auth) option. Otherwise the default Authentication Server URL is taken from the configuration file.

The (-fsr) option forces the processing of a job even if other jobs are running. If (-fsr) is not included, submitting an action from the command line while another run is being processed returns the −2 error code.
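
For example, a submit-and-wait call that names the run and addresses a specific Job Scheduler could look like this (the URL, run name and file names below are placeholders):

DIACmdClient -sw -u http://jsserver:8184 -run "Nightly" myWksp.dpj myRules.dcp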

If you want to change the delimiters used in exported .csv files, you can do so via the CSV-settings XML node in the eCognition.cfg file stored in the config folder (shown below).

The eCognition.cfg file stored in the config folder is the configuration file used by the DIA Command Line Client.

<eCognition config-ver="2.0" >
  <general>
     <key name="temp path" string="" />
     <key name="license check timeout" integer="60" />
     <key name="default server configuration" string="eCognition"/>
  </general>

  <logging>
      <key comment="0 means no logging, any greater value is time in milliseconds to
      check for new loaded modules" integer="0" name="log loaded modules"></key>
  </logging>
  <storage>
      <key name="authentication server" string="http://localhost:4002/" />
  </storage>
  <ODBC>
      <key name="oracle dsn" string="Definiens Oracle" />
  </ODBC>
  <CSV-settings>
      <key name="decimal separator" string="." />
      <key name="column delimiter" string=";" />
  </CSV-settings>
  <default-config>
    <plugins>
      <key string="eCognition Basic Process Algorithms v0.1" name="plugin1"></key>
      <key string="eCognition Internal Process Algorithms v0.1" name="plugin2"></key>
      <key string="eCognition Thematic Layer Algorithms v0.1" name="plugin3"></key>
      <key string="eCognition Basic Features v0.1" name="plugin4"></key>
      <key string="eCognition Vector Features v0.1" name="plugin5"></key>
      <key string="eCognition Internal Export Process Algorithms v0.1" 
       name="plugin6"></key>
    </plugins>
  </default-config>
</eCognition>

Starting with version 7.0, all submitted jobs contain configuration requirement information. This information cannot be set with command options; for this purpose, use the eCognition.cfg file described above.

Example 1
D:\test>\DIA\bin\DIACmdClient -sw test1.dpj fastrule.dcp
DIA Command Line Client.

[Wed Sep 28 15:26:31 2005] : The job was submitted.
[Wed Sep 28 15:26:31 2005] : Waiting for results...
[Wed Sep 28 15:26:32 2005] : A1: processing: : lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:33 2005] : A1: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:33 2005] : A2: processing: : lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:34 2005] : A2: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:36 2005] : A3: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:36 2005] : A4: done: 0 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:36 2005] : A5: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:36 2005] : A6: processing: : lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:37 2005] : A6: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:37 2005] : A7: done: 0 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:37 2005] : A8: processing: : lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:38 2005] : A8: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:38 2005] : A9: done: 0 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:38 2005] : A10: processing: : lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:39 2005] : A10: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:40 2005] : A11: done: 1 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:40 2005] : A12: done: 0 s: lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:26:41 2005] : Total: done: 8 s

Example 2
D:\test>\DIA\bin\DIACmdClient -sw test2.dpj fastrule.dcp
DIA Command Line Client.

[Wed Sep 28 15:31:48 2005] : The job was submitted.
[Wed Sep 28 15:31:48 2005] : Waiting for results...
[Wed Sep 28 15:31:51 2005] : New Project: failed: 0 s: Error accessing image data: File '\Home\Exp\image_pw.tif' not found or not accessible / lcn001 / 2696 / 5.0.3 Build 367
[Wed Sep 28 15:31:52 2005] : Total: failed: 0 s

DIA Command-line Engine - DIACmdEngine.exe

In addition, eCognition offers DIACmdEngine.exe, the command-line interface to the engine.

The following command-line parameters are available:

- analyze image file:

DIACmdEngine image=path1 [image=pathN..] [thematic=pathN] ruleset=path [options]

- analyze image imported using connector:

DIACmdEngine image-dir=path import-connector=name [import-connector-file=path] [image=extra_image_pathN] [thematic=extra_thematic_pathN] ruleset=path [options]

- analyze existing project (.dpr):

DIACmdEngine dpr=path1 ruleset=path [options]

- analyze image imported using scene file list (multiple scenes within single run):

DIACmdEngine image-dir=path scene-xml=path ruleset=path [options]

- resave ruleset to force usage of latest algorithm versions:

DIACmdEngine --update-ruleset input_ruleset_path output_ruleset_path
 

- where:

image=path                 - path to a raster or point cloud data file (.tif, .asc, ...)
thematic=path              - path to a thematic data file (.shp, .gdb, ...)
ruleset=path               - path to a rule set file (.dcp)
image-dir=path             - root directory for image/thematic data files
import-connector=name      - name of a predefined import connector or custom import connector (.xml)
import-connector-file=path - path to an .xml file containing a customized import connector
dpr=path                   - path to a .dpr file to be used as analysis input

 

- options:

param:nameN=valueN - rule set parameter, where nameN is the name of a scene variable and valueN is the value of the scene variable. Any number of params can be given.

array-param:nameN=value1,value2,..,valueN - rule set array parameter, where nameN is the name of a rule set array and the value is a comma-separated list. Any number of array-params can be given. Example: array-param:myArray=0,90,180,270.

output-dir=path - output directory for export files

license-token=json - additional license information in JSON format

save-dpr (or save-dpr=path/to/project.dpr) - save the project file. If an explicit .dpr path is specified, it is used instead of the default path ({:Workspc.OutputRoot}\dpr\{:Project.Name}.v{:Project.Ver}.dpr)
 
--pause - pause the application when done

--map path1=path2 - local drive to network path mapping

--log-file=<file> - if this parameter is specified, the given log file path is used; if not, the default log file path (from config/eCognition.cfg) is used and the file name is auto-generated (e.g. DIACmdEngine.exe --log-file=C:\logs\CmdEngine.log)
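
For instance, a single-image analysis with one rule set parameter could look like this (all paths and the variable name below are placeholders):

DIACmdEngine image=C:\data\scene01.tif ruleset=C:\rules\classify.dcp param:Threshold=40 output-dir=C:\results save-dpr
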
DIA Workspace File Creator

DIA Workspace File Creator is designed to simplify the creation of workspace files. It can be used to automatically import all image files from a specified folder, recursively, into a workspace. DIA Workspace File Creator is a command line tool with the following syntax:

 

DIAMkWksp wksp_file [input_fldr] [import_tmplt_name] [export_tmplt_file] [add_tmplt_fldr]

- where:

wksp_file         - output workspace (.dpj) file to be created
input_fldr        - optional folder which will be searched recursively for image files
import_tmplt_name - optional name of the import template to be used; if not given, the default template is used; import templates are defined in the Default.scm configuration file
export_tmplt_file - optional path to a file containing an export template to be saved in the new workspace file
add_tmplt_fldr    - optional folder containing import template files to be used
Example 3

In this example, the new workspace file myWksp.dpj is created. The folder \\srvr\imgs\myImgs\ is scanned recursively and all image files found in it are imported into the workspace, one file per scene. Additionally, the export template found in the myOldWksp.dpj file is added to the new workspace:

DIAMkWksp myWksp.dpj \\srvr\imgs\myImgs\ "Generic - one file per scene" myOldWksp.dpj
About Importing Data

Starting with Definiens product line version 7.0, the description of the input data (the Input Data Specification) is kept in the workspace file in an SQLite3 database format. The workspace file (.dpj) is saved by the Definiens GUI client, but it can also be created and edited manually, or by a script (for instance in Perl), in the XML format of Definiens product line 5.0. When opened in GUI client 7.0 or later, it is automatically converted to the SQLite3 database format. Below is an example of a workspace file, in XML format, containing one folder with one project with two image layers:

<data-inst name="myWorkspace" scheme="High Content Analysis" guid="74D6C48C-C708-4CC1-A66A-3368987A8C26">
  <data-inst name="myFolder" type="Folder">
    <data-inst name="myProject" type="Project" oper-id="1" ver="1" status="1" scene-id="1">
      <data-inst name="Layer 1" type="Image Layer" file="D:\Home\Exp\image.frm" channel="0" raster-data-type="1">
        <geo-info lower-left-x="0." lower-left-y="0." resolution="1." source="0" size-x="640" size-y="640" />
      </data-inst>
      <data-inst name="Layer 2" type="Image Layer" file="D:\Home\Exp\image.frm" channel="1" raster-data-type="1">
        <geo-info lower-left-x="0." lower-left-y="0." resolution="1." source="0" size-x="640" size-y="640" />
      </data-inst>
      <geo-info lower-left-x="0." lower-left-y="0." resolution="1." source="2" size-x="640" size-y="640" />
    </data-inst>
  </data-inst>
  <export-tmpl>
     ..........................
  </export-tmpl>
  <oper-history>
    <operation type="0" time="1128411820" user="somebody" />
  </oper-history>
</data-inst>

 

The workspace is organized in a hierarchical structure of data objects. Each data object is represented as a data-inst node. The types of data objects are defined in the Default.scm schema configuration file. The parent (top root) object is the workspace itself. The following attributes have to be provided:

name   - workspace name
scheme - name of the scheme; it has to match the scheme defined in the Default.scm configuration file
guid   - global unique identifier of the workspace

The workspace can contain folders or projects. The workspace also includes an export-tmpl node and an oper-history node, which is the list of operations performed on the workspace. Each operation is represented by an operation node containing the following attributes:

type - operation type: 0 means Create, 1 means Edit, 2 means Analyze
time - timestamp of the operation (number of seconds since 01.01.1970)
user - name of the user who performed the operation

The folder can contain projects or other folders. The folder node has two attributes:

name - folder name
type - usually equals Folder; further data types exist and can be used for grouping; these are defined in the Default.scm schema configuration file

The project node requires the following attributes:

name     - project name
type     - it usually equals Project
scene-id - scene/project identifier; it has to be unique within the workspace file
ver      - project version; it can always be 1 if you do not want to keep a history of changes
oper-id  - one-based index of the operation which created this project in the oper-history node
status   - status of the project; 1 means Created

The project consists of image layers. It also includes a geo-info node with geocoding information for the entire scene. The geo-info node has the following attributes:

lower-left-x, lower-left-y - coordinates of the lower left corner in units
resolution                 - unit to pixel ratio; 1 means that unit = pixel
size-x, size-y             - size of the image in pixels
source                     - 0 means no geo-coding, 1 means geo-coding retrieved from the data file, 2 means geo-coding edited by the user

The image layers have their own geo-info node, which can be used to align the layers with each other. The image layer node requires the following attributes:

name             - image layer name
type             - it has to equal Image Layer
file             - path to the file containing the image data
channel          - zero-based index of the channel in the file to be used
raster-data-type - data type of the pixels in the image: 1 = unsigned 8-bit, 2 = unsigned 16-bit, 4 = signed 16-bit, 8 = unsigned 32-bit, 16 = signed 32-bit, 32 = floating point 32-bit
About the Export Specification List

The Export Specification List defines exported data from a project. It is simply a list of exported items. Each item has a name, type, driver code and path.

Example of export specification in XML code:

<export-spec>
  <item name="ProjectFile" type="Scene" driver="DPR" path="\\ourSrvr\Projects\myWksp\dpr\myProject.v2.dpr" />
  <item name="MyStatsA" type="Table" driver="CSV" path="\\ourSrvr\Projects\myWksp\results\MyStatsA\myProject.v2.csv" />
  <item name="ProtocolExport" type="Path" path="\\ourSrvr\Projects\myWksp\results\ProtocolExport\myProject.v2\" />
</export-spec>

The item name is used as an identifier connecting the item with the rule set, which means the rule set has to use the same item name. There are two special item names: ProjectFile and ProtocolExport. The ProjectFile item name, which is not used in a rule set, identifies the binary project file containing image object data and classification results; it can also be used by the engine as input for the next analysis. The ProtocolExport item name defines the output folder for deprecated protocol operations: all protocol operations used in the rule set store their exported files in the folder defined by this export item.

The item type and driver code define the format of the exported item. Currently supported (type, driver) pairs are:

("Scene","DPR")     - complete binary project file containing image objects and the rule set
("Scene","DPR_MIN") - minimized binary project file without the rule set and unused classes
("Table","CSV")     - comma separated values text file
("Table","ODBC")    - database accessed via ODBC driver
("Raster","TIF")    - tagged image files
("Raster","JPG")    - JPEG JFIF files
("Raster","JP2")    - JPEG 2000 files
("Raster","PNG")    - portable network graphics
("Raster","BMP")    - Windows or OS/2 bitmaps
("Raster","PIX")    - PCIDSK
("Raster","ASC")    - ESRI ASCII GRID files
("Raster","IMG")    - Erdas Imagine images
("Raster","NTF")    - National Imagery Transmission (NITF)
("Raster","SAR")    - ProSmart SAR files
("Vector","SHP")    - shapefile
("Path","")         - used for the ProtocolExport item

The item path is the complete path, including the file name, where the exported data should be stored. The exception is the ProtocolExport item, for which only the folder should be given. For the ODBC driver there are two more item attributes, insert and select, and the path attribute has a slightly different meaning; see below.

About Export Templates

The Export Template enables the definition of the Export Specifications for all projects in the workspace in a general form. This is achieved by using special variables which are substituted with the proper values for each particular project.

Example of export template in XML code:

<export-tmpl>
  <item name="ProjectFile" type="Scene" driver="DPR" path="{:Workspc.Dir}\dpr\{:Project.Name}.v{:Project.Ver}.dpr" />
  <item name="MyStatsA" type="Table" driver="CSV" ext="csv" path="{:Workspc.Dir}\results\{:Item.Name}\{:Project.Name}.v{:Project.Ver}.{:Ext}" />
  <item name="ProtocolExport" type="Path" path="{:Workspc.Dir}\results\{:Item.Name}\{:Project.Name}.v{:Project.Ver}\" />
</export-tmpl>

 

Possible variables are:

{:ActionLib.Dir}        - directory of the action library (if any)
{:Application.Dir}      - directory of the loaded application/portal (if any)
{:ArcSDE.Connect.Dir}   - path of the directory with "das" files, specific to ArcSDE database connections
{:Desktop.Dir}          - desktop directory (Windows only)
{:Ext}                  - file format extension for an item's driver, e.g. for algorithms that have an export driver specified, like TIF, GeoTiff and Shape
{:ImgLayer(n).Dir}      - directory of the input image layer n
{:ImgLayer(n).FileName} - file name of the input image layer n
{:ImgLayer(n).Path}     - path of the input image layer n
{:Install.Dir}          - path to the bin folder of the installation directory of the currently running software instance (e.g. Architect or Developer)
{:Item.Name}            - name of the export item (if available)
{:Project.Guid}         - GUID of the scene
{:Project.Name}         - name of the scene (absolute in the workspace)
{:Project.Ver}          - scene state version number (if a workspace is available, this is the project version in the workspace DB)
{:Scene.Dir}            - directory of the scene
{:Scene.Name}           - name of the scene
{:Variable:abcde}       - value of variable abcde
{:Workspc.Dir}          - path of the workspace file
{:Workspc.Guid}         - GUID of the workspace
{:Workspc.InputRoot}    - path of the input root folder
{:Workspc.Name}         - name of the workspace
{:Workspc.OutputRoot}   - path of the output root folder; if no workspace is available, this is the same as {:Scene.Dir}

 

The export item in the export template has an additional attribute, ext, whose value is substituted for the {:Ext} variable.

About the ODBC Driver

The ODBC driver in the export item enables the export of statistical data into a database and the loading of it back for viewing in the DIA Command Line Client.

Example of using ODBC driver in the export template:

<export-tmpl>
  <item name="MyStatsB" type="Table" driver="ODBC"
   path="DSN=MS Access Database;DBQ={:Workspc.Dir}\DIA.mdb"
   insert="INSERT INTO statistics VALUES (%scene_id%,%scene_ver%,%feature%,%value%)"
   select="SELECT * FROM statistics_cross_view" />
</export-tmpl>
 

The path attribute of an export item using the ODBC driver contains the ODBC Connect String. In the example above, the Connect String uses the default ODBC Data Source for accessing Microsoft Access database files.

The insert attribute defines an SQL statement to be used for inserting a statistical value into the database. The statement contains special variables which will be substituted with proper values while exporting the data. The special variables are:

%scene_id%  - ID of the scene
%scene_ver% - scene state version number
%feature%   - name of the exported statistical feature
%value%     - value of the exported statistical feature
%row%       - incremented for each value when the exported feature has multiple values per scene

The select attribute defines an SQL statement to be used for selecting the results from the database. The returned record set should have the same layout as the exported CSV files. In the example above it is a cross query over the statistics table.
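
For illustration, assuming the statistics table has columns named scene_id, scene_ver, feature and value, the statistics_cross_view used above could be defined in Microsoft Access as a crosstab query roughly like this (the exact query depends on your database):

TRANSFORM First([value])
SELECT scene_id, scene_ver
FROM statistics
GROUP BY scene_id, scene_ver
PIVOT feature;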

The eCognition SOAP Interface

The eCognition SOAP Interface allows the sending of commands to, and the retrieval of information from, the Job Scheduler. The Job Scheduler is a web service used for submitting and monitoring user jobs. It is also used by the analysis engines to request jobs for processing.

The Job Scheduler web service can be accessed from any application using the SOAP protocol. The service is described using WSDL (Web Service Description Language); the description can be found in the DiaJS.wsdl file. Client applications can be developed using a WSDL parser and SOAP tools for the particular programming language. For example, eCognition uses gSOAP for C++.

The gSOAP compiler generates the DiaJS.wsdl file from a DiaJS.h include file. It also generates .cpp files with the necessary stub classes for accessing the web service. The workflow with other SOAP tools and/or other programming languages may vary slightly; in that case, please refer to the documentation of the appropriate SOAP tool.

Two service methods are most important for remote job processing: use the js__SubmitJob() service method to submit a user job, and use the js__RequestStatusInfo() service method to monitor submitted jobs.
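
Both methods are called through the proxy class generated by gSOAP. The skeleton below condenses the complete samples shown at the end of this chapter; the types and method signatures are taken from those samples:

	const char* guid = "81B9641D-6671-4979-9224-F1E4AC4AD553"; // workspace GUID
	DiaJS proxy;                                // generated proxy class
	proxy.endpoint = "http://localhost:8184";   // Job Scheduler URL
	js__UserJob job;                            // fill in rule set and scenes as in the samples
	job.strWkspGUID = guid;
	int job_id;                                 // assigned by the Job Scheduler
	if(proxy.js__SubmitJob(&job, job_id) == SOAP_OK)
	{
		SVector<js__StatusInfo> infos;          // index 0 holds Job Scheduler info
		proxy.js__RequestStatusInfo(guid, job_id, 0, infos);
	}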

Building a Remote Automation Application

Automating in Script

When automating eCognition software in a script, you will most likely use the provided tools DIACmdClient and/or DIAMkWksp. The example code in this section is written in the Perl scripting language, but it can easily be adapted to another scripting language.

The DIA Command Line Client is provided with the eCognition Server® package and can be found in the bin folder; the complete file path is C:\Program Files\eCognition Server Version number\bin\DIACmdClient.exe.

Getting the Images from a Repository

Your image files need to be stored in a location where you can access them using the Microsoft Network (for example \\image-server\repository\input…).

Consider the folder mentioned above as a drop-and-analyze folder: if files or folders are dropped into it, they should be analyzed. A script should detect whether there is a new image and send it for processing. A suitable solution is shown in step 1 of the example script.

Import Images and Analyze them using a Specified Rule Set

The DIA Command Line Client can provide the eCognition Server® with workspace files.

If a workspace file does not already exist on your machine, you should create one.

Exporting the Results into a CSV File or a Database

This is covered by the workspace file. You can decide whether you want to write the data into a database (DB) or a CSV file. For the DB solution, an ODBC connection must be established between the eCognition software and the DB server.

Removing Image Reference Files

The image reference files are no longer included in the database; their removal can be handled by the script.

Set up the Environment

  1. Install the eCognition Server® on a computer of your choice. During the installation you can allocate as many instances of Analysis Engine Software as you want to be connected to the eCognition Server®.
  2. If the automation scripts will be executed on the eCognition Server®, make sure your scripts can access the binary folder (C:\Program Files\eCognition Server Version number\bin).
  3. If a different machine is used, copy the binary folder to the local disk of that machine and make sure your scripts can access it. In addition, the eCognition Server® must be declared within the DIA Command Line Client: edit the configuration file (C:\Program Files\eCognition Server Version number\bin\config\eCognition.cfg), find the entry <js-server ip="localhost" port="8184"/> and replace "localhost" with the IP address or DNS name of your eCognition Server® (see the example after this list).
  4. Create an export template file.
  5. Run your scripts in an endless loop.
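
For step 3, the edited entry could look as follows (the IP address shown is a placeholder for your own server):

<js-server ip="192.168.0.10" port="8184"/>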

Perl Script Example

The following Perl Script example illustrates the steps that need to be followed when you use eCognition Server® with DIA Command Line Client.

  1. Detect the new files.
  2. Create a workspace file.
  3. Send the workspace file to the processing system.
  4. Wait until job is completed.
  5. Delete all input files.

Step 1: Detect the New Files

#!/usr/bin/perl
$input = "\\\\server\\Test\\input";
$output = "\\\\server\\Test\\output";
loop:
@dir= `dir /b \"$input\"`;
if (! $dir[0]) {
	print "Directory empty. Go for loop.\n";
	sleep 1;
	goto loop;
} else {
	print "Image(s) or folder(s) found in the input directory. Go for analysis\n";
 

Step 2: Create a Workspace File

print "Creating the workspace file(s)\n";
   system "if exist \"$output\\myWorkspace1.dpj\" 
   del \"$output\\myWorkspace1.dpj\"";
   $command = "c:\\bin\\DIAMkWksp.exe $output\\myWorkspace1.dpj
   \"$input\" \
   "Generic - one file per scene\" C:\\Example\\ExportTemplate.dpj";
   print "$command\n";
   system $command;
   system "if exist \"$output\\myWorkspace2.dpj\" del
   \"$output\\myWorkspace2.dpj\"";
   $command = "c:\\bin\\DIAMkWksp.exe $output\\myWorkspace2.dpj \"$input\" \
   "Generic - one file per scene\" C:\\Example\\ExportTemplate.dpj";
   print "$command\n";
   system $command;

Step 3: Send the Workspace File to the Processing System

print "Sending the job(s) to be analyzed\n";
	$command = "c:\\bin\\DIACmdClient.exe
	-s $output\\myWorkspace1.dpj ruleset1.dcp"; # analyze with RuleSet1
	print "$command\n";
	system $command;
	$command = "c:\\bin\\DIACmdClient.exe
	-s $output\\myWorkspace2.dpj ruleset2.dcp"; # analyze with RuleSet2
	print "$command\n";
	system $command;

Step 4: Wait Until Job Is Completed

print "Waiting for job(s) finish\n";
   $command = "c:\\bin\\DIACmdClient.exe -w $output\\myWorkspace1.dpj";
   # Wait for Workspace 1
   print "$command\n";
   system $command;
   $command = "c:\\bin\\DIACmdClient.exe -w $output\\myWorkspace2.dpj";
   # Wait for Workspace 2
   print "$command\n";
   system $command;

Step 5: Delete All Input Files

print "Delete the image(s) input files\n";
	$command = "del /q $input\\*.*";
	# delete all files in the input directory
	print "$command\n";
	system $command;
	@dir= `dir /b \"$input\"`;
	# delete all folders in the input directory
	for $i (@dir){
		chomp $i;
		print "Deleting directory \"$input\\$i\"";
		system "rmdir /s /q \"$input\\$i\"";
	}
print "Go for loop\n";
	goto loop;  }

Building a SOAP Application

Submitting a Job

The js__SubmitJob() service method should be used to submit a job. It takes js__UserJob as the input parameter.

The js__UserJob contains the list of jobs to be executed, stored in the vJobs member of its base class js__JobList. So if you want to analyze a single scene, you put a single element into the vJobs array. js__UserJob has one more important member, strWkspGUID, which contains the GUID used to identify a submitted job if it has to be monitored later. Because more than one job may be sent with the same GUID, the Job Scheduler assigns a new integer ID to each job it receives. This jobID is returned as the output parameter of the js__SubmitJob() method.

The vJobs array is a member of js__JobList, which is the base class of js__UserJob. Each element of vJobs can be either a js__Job or a js__JobList, because both of them derive from js__JobBase. In this way you can build nested lists of jobs which form a tree structure.

The js__Job describes a single scene to be analyzed. It has an mScene member which describes the input and an mExportSpec member which describes the output. It also has an eTask member which should be set to EAnalyse. See the “Automation API Reference” HTML Help for details.

The js__JobList has an eOrderType member, which can be set to either EParallel or ESerial. ESerial means that the list of jobs should be executed in sequence; EParallel means that the list of jobs can be executed simultaneously.

The js__JobBase has another important member, strRuleSet, which contains the rule set, in XML form, to be used for the analysis of a job. In the case of a job list, the rule set is applied to all elements of the list which have empty rule sets; in other words, an element's rule set can be empty, in which case the rule set is inherited from the parent job.

Starting with version 7.0, all submitted jobs contain configuration requirement information. This information can be set with the oCnfg member of the js__UserJob class object. The oCnfg member is an object of class js__JobConfig, which is intended for providing job configuration information. The class has a strConfig member that contains the required server product name and version. Normally you fill it with a string like eCognition.10.5.0.last, which says that the job has to be executed on a server product whose name starts with eCognition, version 10.5.0, of the last installed build.

Other members of the class are optional and intended for advanced users.

Monitoring a Job

The js__RequestStatusInfo() service method can be used to monitor a job. The job is identified by wkspGUID and jobID.

Theoretically, it would be enough to provide the jobID, because it is unique within the Job Scheduler; however, for security reasons it is also necessary to provide the wkspGUID, the same one that was used to submit the job.

The service method returns an array of js__StatusInfo messages via a vector output parameter. If since is greater than 1, it represents a timestamp: for each scene (a job node, if we view the job structure as a tree) that has changed its status after the given timestamp, a js__StatusInfo is added to the returned array. This allows the retrieval of the status only of scenes that changed recently, which reduces network load when polling the status of jobs containing a significant number of scenes (sub-jobs). The first member of the array (index 0) contains some information about the Job Scheduler itself and is normally not of interest.

The js__StatusInfo contains the status (waiting, processing, done, etc.) of a job or a job list. In the case of a single job, the scene ID can be found in the iSceneID member. The js__StatusInfo also contains the submitting, starting and finishing timestamps, and additionally the timestamp of when the status was fetched by the service method, which can then be used as the since parameter for subsequent calls to js__RequestStatusInfo(). If a job has failed, a failure description can be found in the strRemarks member.

Cancel a Job

To cancel a job the js__CancelJob() service method has to be used. The job is identified by wkspGUID and jobID.

As with monitoring a job, the wkspGUID is used for security reasons. If only the analysis of particular scenes is to be cancelled, the list of scene IDs has to be provided in the last input array parameter, sceneIDs.
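
Following the same proxy pattern as the samples below, a cancel call might be sketched as follows. The exact generated signature should be verified against DiaJS.wsdl; the trailing output parameter is an assumption here, and wkspGUID and jobID are the identifiers described above:

	DiaJS proxy;
	proxy.endpoint = "http://localhost:8184";   // Job Scheduler URL
	SVector<int> sceneIDs;                      // leave empty to cancel the whole job
	sceneIDs.AddTail(1);                        // or cancel only the scene with ID 1
	int result;                                 // output parameter (assumed)
	if(proxy.js__CancelJob(wkspGUID, jobID, sceneIDs, result) != SOAP_OK)
		fprintf(stderr, "Can't cancel the job");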

Step by Step Sample of Remote Automation

Submitting a Job

This example project for submitting a job to the eCognition Server® can be found in the Samples folder.

The code is provided below for your convenience, to cross-check against the instructions on how to build a SOAP application.

Includes
#include <sys/stat.h>
#include "soap/DiaJS.nsmap"
#include "soap/soapClient.cpp"
#pragma warning( push, 0 )
#include "soap/jsSrvcC.cpp"
#include "soap/stdsoap2.cpp"
#pragma warning( pop )
#include "soap/jsSrvcDiaJSProxy.h"
using namespace jsSrvc;
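
The sample refers to a few constants (server URL, workspace GUID, rule set path, image paths and export targets) that are defined elsewhere in the sample project. A minimal set of definitions might look like this; all values are placeholders, but the identifier names match those used in the code below:

// --------------------------------------------------------------
//   Sample data (placeholder values; adjust to your environment)
// --------------------------------------------------------------
const char* szServer        = "http://localhost:8184";                // Job Scheduler URL
const char* szGuid          = "81B9641D-6671-4979-9224-F1E4AC4AD553"; // workspace GUID
const char* szRullsetPath   = "C:\\Samples\\ruleset.dcp";             // rule set to submit
const char* szLayer1ImgPath = "C:\\Samples\\image1.tif";              // image data, scene 1
const char* szLayer2ImgPath = "C:\\Samples\\image2.tif";              // image data, scene 2
const char* szExpItem1      = "results\\MyStats1.csv";                // CSV export target
const char* szExpItem2      = "results\\scene1.dpr";                  // project file, scene 1
const char* szExpItem3      = "results\\scene2.dpr";                  // project file, scene 2
bool bLocal = true;   // if true, GetFullPath() expands paths relative to the working directory
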
Helper Functions
const char *GetFullPath(const char* szPath)
{
	static char szBuffer[_MAX_PATH];
	if(bLocal)
	{
		char* pPart = NULL;
                GetFullPathName(szPath,_MAX_PATH,szBuffer,&pPart);
                return szBuffer;
	}
	else
	{
		return szPath;
	}
}
The Main Application
int main(int argc, char* argv[])
{
//---------------------------------------------------------------
//   load ruleset
//---------------------------------------------------------------
	char* ruleset = NULL;
	{
		FILE* rlstfile = fopen(szRullsetPath, "rb");
                if(!rlstfile)
		{
			fprintf(stderr,"Can't load ruleset");
                        exit(-1);
		}
		struct stat st;
		fstat(fileno(rlstfile), &st);
		size_t length = st.st_size;
		ruleset = new char[length+1];
		fread(ruleset, 1, length, rlstfile);
		ruleset[length] = '\0';
		fclose(rlstfile);	// close the rule set file
	}
//---------------------------------------------------------------
//   Job creation
//---------------------------------------------------------------
	js__UserJob userjob;
	userjob.strWkspGUID = szGuid;
	userjob.strWkspName = "test workspace";
	userjob.strUserName = "john";
	userjob.eOrderType = EParallel;
	userjob.strRuleSet.str = ruleset;
	
	//-------------------------------------------------------------
	// add a scene to the job
	//-------------------------------------------------------------
	{
		js__Job* scene = new js__Job;
		scene->eTask = EAnalyse;
		scene->mScene.iID = "1";
		scene->mScene.iVer = 1;
		scene->mScene.strName = "test scene 1";
		scene->mScene.strFilePath = "";
		scene->mScene.dPxlRes = 1;
		scene->mScene.dXPos = 0;
		scene->mScene.dYPos = 0;
		scene->mScene.iSrc = 0;
		scene->mScene.iSizeX = 890;
		scene->mScene.iSizeY = 600;
		scene->mScene.iCSType = 0;
		scene->mScene.iUnit = 1;
 
		//-----------------------------------------------------
		// image layers
		//-----------------------------------------------------
		js__ImgLayer layer;
		layer.dPxlRes = 1;
		layer.dXPos = 0;
		layer.dYPos = 0;
		layer.iSrc = 0;
		layer.iSizeX = 890;
		layer.iSizeY = 600;
		layer.iCSType = 0;
		layer.iNoDataType = 0;
		layer.dNoDataVal = 0;
		layer.iDataType = 1;
		
		//-----------------------------------------------------
		// image layer 1
		//-----------------------------------------------------
		layer.strAlias = "Layer 1";
		layer.strFilePath = GetFullPath(szLayer1ImgPath);
		layer.iIndx = 0;
		scene->mScene.vImgLayers.AddTail(layer);
 
		//-----------------------------------------------------
		// image layer 2
		//-----------------------------------------------------
		layer.strAlias = "Layer 2";
		layer.strFilePath = GetFullPath(szLayer1ImgPath);
		layer.iIndx = 1;
		scene->mScene.vImgLayers.AddTail(layer);
		
		//-----------------------------------------------------
		// export spec
		//-----------------------------------------------------
		js__ExportItem expitem;
		expitem.strName = "MyStats1";
		expitem.strPath = GetFullPath(szExpItem1);
		expitem.strDriver = "CSV";
		expitem.iType = 1; /* TABLE */
		scene->mExportSpec.vExportItems.AddTail(expitem);
		expitem.strName = "ProjectFile";
		expitem.strPath = GetFullPath(szExpItem2);
		expitem.strDriver = "DPR";
		expitem.iType = 0; /* PROJECT */
		scene->mExportSpec.vExportItems.AddTail(expitem);
 
		//-----------------------------------------------------
		// add the scene
		//-----------------------------------------------------
		userjob.vJobs.AddTail(scene);
	}
	//-------------------------------------------------------------
	// add a second scene to the job
	//-------------------------------------------------------------
	{
		js__Job* scene = new js__Job;
		scene->eTask = EAnalyse;
		scene->mScene.iID = "2";
		scene->mScene.iVer = 1;
		scene->mScene.strName = "test scene 2";
		scene->mScene.strFilePath = "";
		scene->mScene.dPxlRes = 1;
		scene->mScene.dXPos = 0;
		scene->mScene.dYPos = 0;
		scene->mScene.iSrc = 0;
		scene->mScene.iSizeX = 890;
		scene->mScene.iSizeY = 600;
		scene->mScene.iCSType = 0;
		scene->mScene.iUnit = 1;
		
		//-----------------------------------------------------
		// image layers
		//-----------------------------------------------------
		js__ImgLayer layer;
		layer.dPxlRes = 1;
		layer.dXPos = 0;
		layer.dYPos = 0;
		layer.iSrc = 0;
		layer.iSizeX = 890;
		layer.iSizeY = 600;
		layer.iCSType = 0;
		layer.iNoDataType = 0;
		layer.dNoDataVal = 0;
		layer.iDataType = 1;
	
		//-----------------------------------------------------
		// image layer 1
		//-----------------------------------------------------
		layer.strAlias = "Layer 1";
		layer.strFilePath = GetFullPath(szLayer2ImgPath);
		layer.iIndx = 0;
		scene->mScene.vImgLayers.AddTail(layer);
 
		//-----------------------------------------------------
		// image layer 2
		//-----------------------------------------------------
		layer.strAlias = "Layer 2";
		layer.strFilePath = GetFullPath(szLayer2ImgPath);
		layer.iIndx = 1;
		scene->mScene.vImgLayers.AddTail(layer);
 
		//-----------------------------------------------------
		// export spec
		//-----------------------------------------------------
		js__ExportItem expitem;
		expitem.strName = "MyStats1";
		expitem.strPath = GetFullPath(szExpItem1);
		expitem.strDriver = "CSV";
		expitem.iType = 1; /* TABLE */
		scene->mExportSpec.vExportItems.AddTail(expitem);
		expitem.strName = "ProjectFile";
		expitem.strPath = GetFullPath(szExpItem3);
		expitem.strDriver = "DPR";
		expitem.iType = 0; /* PROJECT */
		scene->mExportSpec.vExportItems.AddTail(expitem);
		// add the scene
		userjob.vJobs.AddTail(scene);
	}
userjob.oCnfg.strConfig = "eCognition.10.5.0.last";
	//-------------------------------------------------------------
	// submit the job
	//-------------------------------------------------------------
 
	DiaJS proxy;
	proxy.endpoint = szServer;      // set URL
	
	int job_id;
	if(proxy.js__SubmitJob(&userjob, job_id) != SOAP_OK)
	{
		fprintf(stderr,"Can't submit job to %s."
		" Make sure JobScheduler is running.", szServer);
		exit(-1);
	}
	return 0;
}

Monitoring a Job

This example project for Monitoring a Job submitted to the eCognition Server® can be found in the Samples folder.

The code is provided below for your convenience, to cross-check against the instructions on how to build a SOAP application.

Includes
#include <assert.h>
#define ASSERT assert
#include "soap/jsSrvc.nsmap"
#include "soap/jsSrvcClient.cpp"
#pragma warning( push, 0 )
#include "soap/jsSrvcC.cpp"
#include "soap/stdsoap2.cpp"
#pragma warning( pop )
#include "soap/jsSrvcDiaJSProxy.h"
using namespace jsSrvc;
Helper Functions
// --------------------------------------------------------------
//   GetElapsedTime
// --------------------------------------------------------------
int GetElapsedTime(js__StatusInfo& s)
{
	if(s.timeStop) {
		// job stopped; return total processing time
		return s.timeStart? int(s.timeStop - s.timeStart) : 0;
	}
	if(s.timeStart) {
		// job started; return current processing time
		return int(time(NULL) - s.timeStart);
	}
	// job is still waiting; return waiting time
	return int(time(NULL) - s.timeSubmit);
}
 
// --------------------------------------------------------------
// GetStatusString
// --------------------------------------------------------------
const char* GetStatusString(js__StatusInfo& s)
{
	switch(s.eStatus)
	{
	case EWaiting:      return "waiting";
	case EWorking:      return "processing";
	case EJobDone:      return "done";
	case EJobFailed:   return "failed";
	case ETimeout:      return "timeout";
	case ECrashed:      return "crashed";
	case ECancelled:   return "cancelled";
	};
	return NULL;
}
 
The Main Application
//---------------------------------------------------------------
//   data
//---------------------------------------------------------------
const char* szServer = "http://localhost:8182";         // server url
char* guid = "81B9641D-6671-4979-9224-F1E4AC4AD553";   // job guid
//---------------------------------------------------------------
//   main
//---------------------------------------------------------------
int main(int argc, char* argv[])
{
	DiaJS proxy;
	proxy.endpoint = szServer;   // set URL
 
//---------------------------------------------------------------
//   get maximum job id
//---------------------------------------------------------------
	SVector<int> ids;
	
	if(proxy.js__RequestJobIDs(guid, ids) != SOAP_OK)
	{
		fprintf(stderr,"Can't monitor the job");
		exit(-1);
	}
 
	if(ids.IsEmpty())
	{
		fprintf(stderr,"There is no jobs in the Job Scheduler"
		" for the given workspace");
		exit(-1);
	}
 
	printf("Waiting for results...\n");
	long long last_status_time = 1;
	for(;;)
	{
 
	//---------------------------------------------------------------
	// request overall status
	//---------------------------------------------------------------
		SVector<js__StatusInfo> infos;
		if(proxy.js__RequestStatusInfo(guid, ids.Last(), 0, infos) != SOAP_OK
			|| infos.GetSize() <= 1)
		{
			fprintf(stderr,"Can't monitor the job");
			exit(-1);
		}
		js__StatusInfo overall = infos[1];
 
	//---------------------------------------------------------------
	// wait a bit
	//---------------------------------------------------------------
		Sleep(1100);
 
	//---------------------------------------------------------------
	// request detailed statuses
	//---------------------------------------------------------------
		infos.clear();
		if(proxy.js__RequestStatusInfo  (guid,
						ids.Last(),
						last_status_time,
						infos) != SOAP_OK)
		{
			fprintf(stderr,"Can't monitor the job");
			exit(-1);
		}
		
		if(infos.IsEmpty())
		{
 
			//-------------------------------------------------------
			// detail statuses are empty; show the overall status
			//-------------------------------------------------------
			if(overall.eStatus == EWaiting || overall.eStatus == EWorking)
			{
				continue;
			}
			//------------------------------------------------
			// job finished
			//------------------------------------------------
			printf   ("Total: %s: %d s\n", GetStatusString(overall),
				GetElapsedTime(overall));
			break;
		}
 
		last_status_time = infos[0].timeStatus;
 
		//--------------------------------------------------------
		// go thru all detailed statuses and print them
		//--------------------------------------------------------
	
		for(int i = 1; i < infos.GetSize(); i++)
		{
			if(infos[i].eStatus != EWaiting)
			{	
				if(!infos[i].iSceneID)
				{
					continue;
				}
 
 
				if(infos[i].eStatus == EWorking)
				{
					printf("Scene (ID=%d): %s: : %s\n",
					infos[i].iSceneID, GetStatusString(infos[i]),
					(const char*)infos[i].strRemarks);
				} else
				{
					printf("Scene (ID=%d): %s: %d s: %s\n",
					infos[i].iSceneID, GetStatusString(infos[i]),
					GetElapsedTime(infos[i]),
					(const char*)infos[i].strRemarks);
				}
			}
		}
	}
	return 0;
}