Palladio Workflow Engine/Hello World Tutorial

Introduction

In this tutorial we build a simple Job and successively extend the example.

As a general note: the code provided in this tutorial is exemplary and not intended for production use. To keep the tutorial compact, most comments have been removed. This is not in line with our coding guidelines but is accepted here for the sake of brevity.

An Eclipse project containing the HelloWorld example sources is available on github:
https://github.com/PalladioSimulator/Palladio-Supporting-WorkflowEngine/tree/master/bundles/de.uka.ipd.sdq.workflow.helloworld


The First Hello World Job

First, we start with a simple job that just says hello to whoever is passed to its constructor:

package de.uka.ipd.sdq.workflow.helloworld;

import org.eclipse.core.runtime.IProgressMonitor;

import de.uka.ipd.sdq.workflow.jobs.CleanupFailedException;
import de.uka.ipd.sdq.workflow.jobs.IJob;
import de.uka.ipd.sdq.workflow.jobs.JobFailedException;
import de.uka.ipd.sdq.workflow.jobs.UserCanceledException;

public class HelloWorldJob implements IJob {
  private String who = "World";
	
  public HelloWorldJob(String who) {
      this.who = who;
  }  
  
  @Override
  public void execute(IProgressMonitor monitor) throws JobFailedException, UserCanceledException {
      System.out.println("Hello "+who); // job specific processing
  }

  @Override
  public String getName() {
      return "Hello World Job";
  }

  @Override
  public void cleanup(IProgressMonitor monitor) throws CleanupFailedException {
      // Nothing to clean up after a run.
  }
}

The most straightforward way to realize a job is to implement the IJob interface. The interface requires an execute() method, which takes a progress monitor as a parameter to which the job can report its specific progress.
Within the execute() method we print our hello message to the system output. This is the place where your custom logic / processing has to be implemented.
During execution, a job can check whether the user has canceled the processing and throw a UserCanceledException.
In addition, a JobFailedException can be thrown to indicate that the job could not be processed successfully.

Via the getName() method, a job has to return a name that identifies it, e.g. in the context of a workflow.
The cleanup() method can release blocked resources etc. and is called by the workflow lifecycle management.
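
The interplay of these methods can be illustrated with a slightly richer job. The following sketch is not part of the tutorial sources; CountingJob is a hypothetical example, and it assumes that UserCanceledException offers a no-argument constructor. It reports its progress to the monitor and honors a cancellation request:

package de.uka.ipd.sdq.workflow.helloworld;

import org.eclipse.core.runtime.IProgressMonitor;

import de.uka.ipd.sdq.workflow.jobs.CleanupFailedException;
import de.uka.ipd.sdq.workflow.jobs.IJob;
import de.uka.ipd.sdq.workflow.jobs.JobFailedException;
import de.uka.ipd.sdq.workflow.jobs.UserCanceledException;

public class CountingJob implements IJob {

  @Override
  public void execute(IProgressMonitor monitor) throws JobFailedException, UserCanceledException {
      monitor.beginTask(getName(), 10);
      for (int i = 1; i <= 10; i++) {
          if (monitor.isCanceled()) {
              throw new UserCanceledException(); // the user aborted the run
          }
          // On an unrecoverable error the job would throw a JobFailedException instead.
          System.out.println("Step " + i); // job specific processing
          monitor.worked(1);               // report progress to the monitor
      }
      monitor.done();
  }

  @Override
  public String getName() {
      return "Counting Job";
  }

  @Override
  public void cleanup(IProgressMonitor monitor) throws CleanupFailedException {
      // Release any blocked resources here.
  }
}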

Job Execution

Jobs can be executed directly. To give this a first try, we use a Main class with a standalone main() method and provide a NullProgressMonitor to the job:

package de.uka.ipd.sdq.workflow.helloworld;

import org.eclipse.core.runtime.NullProgressMonitor;  
import de.uka.ipd.sdq.workflow.jobs.JobFailedException;
import de.uka.ipd.sdq.workflow.jobs.UserCanceledException;

public class Main {
  
  public static void main(String[] args) throws JobFailedException, UserCanceledException {
    HelloWorldJob job = new HelloWorldJob("World");
    job.execute(new NullProgressMonitor());
  }
  
}

The example will print "Hello World" to the console.

The Hello World Workflow

In the example above, we executed the job in a very raw manner. For example, the cleanup() method has not been called afterwards, which matters because a job might need to free resources.
A better way to run a job is therefore to execute it in the context of a Workflow. A Workflow is itself a specific type of job which encapsulates other jobs and takes care of their lifecycle. In addition, a workflow provides logging facilities that report on the executed jobs, as well as error handling.

In the example below, we create a workflow, and add our hello world job to it.

package de.uka.ipd.sdq.workflow.helloworld;

import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.ConsoleAppender;
import org.apache.log4j.PatternLayout;
import de.uka.ipd.sdq.workflow.Workflow;

public class MainWorkflow {

  public static void main(String[] args) {
    
    // set up a basic logging configuration
    BasicConfigurator.resetConfiguration();
    BasicConfigurator.configure(new ConsoleAppender(new PatternLayout("%m%n")));
        
    HelloWorldJob job = new HelloWorldJob("World");
    Workflow myWorkflow = new Workflow(job);
    myWorkflow.run();
  }
}

Note: Our main method no longer throws exceptions. The workflow's run() method already encapsulates and handles any exceptions thrown by the jobs (a small sketch illustrating this follows at the end of this section). The first two statements of main() initialize a basic log4j logging infrastructure. The workflow automatically logs workflow information to a log4j logger, which requires a prepared log4j environment. Typically, this is already done in your environment. Furthermore, the Palladio Workflow Engine provides infrastructure to send those logs to the console of your active Eclipse instance: Palladio Workflow Engine/Logging.

Now, when we execute our program we get the following output on our console:

 Creating workflow engine and starting workflow
 Palladio Workflow-Engine: Running job Hello World Job
 Hello World
 Task Sequential Job Execution completed in 1.57792E-4 seconds
 Cleaning up...
 Workflow engine completed task

As you can see, the workflow takes care of and reports on the workflow lifecycle.
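
As noted above, exceptions thrown by jobs are handled by the workflow rather than by the caller. The following sketch is not part of the tutorial sources; FailingJob is a hypothetical class, it assumes that JobFailedException provides a constructor accepting a message string, and the exact log output depends on your logging configuration. Running it inside a workflow requires no try/catch block in main(), because the workflow catches and logs the JobFailedException:

package de.uka.ipd.sdq.workflow.helloworld;

import org.eclipse.core.runtime.IProgressMonitor;

import de.uka.ipd.sdq.workflow.jobs.CleanupFailedException;
import de.uka.ipd.sdq.workflow.jobs.IJob;
import de.uka.ipd.sdq.workflow.jobs.JobFailedException;
import de.uka.ipd.sdq.workflow.jobs.UserCanceledException;

public class FailingJob implements IJob {

  @Override
  public void execute(IProgressMonitor monitor) throws JobFailedException, UserCanceledException {
      throw new JobFailedException("Something went wrong"); // signal that the job could not complete
  }

  @Override
  public String getName() {
      return "Failing Job";
  }

  @Override
  public void cleanup(IProgressMonitor monitor) throws CleanupFailedException {
      // Nothing to clean up.
  }
}

Passing a FailingJob to new Workflow(...).run() logs the failure instead of terminating the program with an uncaught exception.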

Composed Hello World Workflow Chain

A workflow executing a single job is not very interesting. To build "real" workflows, jobs can be plugged together into more complex composite jobs. The Palladio Workflow Engine implements its composition concept based on specific composite job types (see also Composite Job Types).

Two main composite types are provided, which can themselves be composed into more complex jobs: SequentialJob and ParallelJob. In the following, we extend our hello world example to make use of these two types.

Sequential Job Composition

To execute a sequence of jobs, simply create a SequentialJob and add the jobs to be executed one after another (see the code example below).
In addition to the default constructor, the SequentialJob provides a constructor that takes a boolean parameter deciding whether the cleanup() method of each job is called immediately after its execution (true) or only after all jobs have completed (false); a short sketch of this follows after the output below. Immediate clean up is the default behaviour, so resources are freed as soon as possible.

package de.uka.ipd.sdq.workflow.helloworld;

import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.ConsoleAppender;
import org.apache.log4j.PatternLayout;

import de.uka.ipd.sdq.workflow.Workflow;
import de.uka.ipd.sdq.workflow.jobs.SequentialJob;

public class MainSequentialWorkflow {
 
  public static void main(String[] args) {
    
    // set up a basic logging configuration
    BasicConfigurator.resetConfiguration();
    BasicConfigurator.configure(new ConsoleAppender(new PatternLayout("%m%n")));
        
    SequentialJob jobSequence = new SequentialJob();
    
    jobSequence.add(new HelloWorldJob("Palladio"));
    jobSequence.add(new HelloWorldJob("Workflow"));
    jobSequence.add(new HelloWorldJob("Engine"));
    
    Workflow myWorkflow = new Workflow(jobSequence);
    myWorkflow.run();
  }
}

The code above creates a workflow executing three HelloWorldJobs greeting "Palladio", "Workflow", and "Engine", as shown in this activity diagram:
Palladio-workflow-engine-sequential-hello-world-example.png

Executing the code above will produce the following console output:

 Creating workflow engine and starting workflow
 Palladio Workflow-Engine: Running job Sequential Job
 Palladio Workflow-Engine: Running job Hello World Job
 Hello Palladio
 Palladio Workflow-Engine: Running job Hello World Job
 Hello Workflow
 Palladio Workflow-Engine: Running job Hello World Job
 Hello Engine
 Task Sequential Job Execution completed in 2.32203E-4 seconds
 Task Workflow Execution completed in 5.90558E-4 seconds
 Cleaning up...
 Workflow engine completed task
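
If the jobs' cleanup() calls should be deferred until the whole sequence has completed, the sequence could instead be created with the boolean constructor mentioned above. A one-line sketch:

SequentialJob jobSequence = new SequentialJob(false); // false = clean up only after all jobs have run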

Parallel Job Composition
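
Analogously to the sequential case, jobs can be executed concurrently by creating a ParallelJob and adding the jobs to it: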

package de.uka.ipd.sdq.workflow.helloworld;

import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.ConsoleAppender;
import org.apache.log4j.PatternLayout;
import de.uka.ipd.sdq.workflow.Workflow;
import de.uka.ipd.sdq.workflow.jobs.ParallelJob;

public class MainParallelWorkflow {

  public static void main(String[] args) {
    
    // set up a basic logging configuration
    BasicConfigurator.resetConfiguration();
    BasicConfigurator.configure(new ConsoleAppender(new PatternLayout("%m%n")));
        
    ParallelJob parallelJob = new ParallelJob();
    
    parallelJob.add(new HelloWorldJob("Palladio"));
    parallelJob.add(new HelloWorldJob("Workflow"));
    parallelJob.add(new HelloWorldJob("Engine"));
    
    Workflow myWorkflow = new Workflow(parallelJob);
    myWorkflow.run();
  }
}


The code above creates a workflow executing three HelloWorldJobs greeting "Palladio", "Workflow", and "Engine", as shown in this activity diagram:
Palladio-workflow-engine-parallel-hello-world-example.png

Executing this workflow will produce the following logging output:

 Creating workflow engine and starting workflow
 Palladio Workflow-Engine: Running job CompositeJob <Hello World Job Hello World Job Hello World Job >
 Hello Palladio
 Hello Workflow
 Hello Engine
 Task Workflow Execution completed in 0.002190325 seconds
 Cleaning up...
 Workflow engine completed task

As you can see, the hello messages happen to appear in the order in which the jobs were added to the parallel job, but this order is not guaranteed. Each parallelized job is executed in a separate thread, and the order of job execution depends on the thread scheduling.
The ParallelJob provides a constructor to specify the number of threads used to execute the parallel jobs. The default setting is -1, which uses a default based on the number of threads available on your CPU.
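
For example, to restrict the execution to two worker threads, the parallel job above could be created as follows (a one-line sketch based on the constructor described above):

ParallelJob parallelJob = new ParallelJob(2); // run the contained jobs on at most two threads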

An important point, however, is that the parallelized jobs of the Workflow Engine are not daemon threads, in contrast to parallel tasks executed via Eclipse. They are joined again, so one can build workflows in which parallelized jobs are followed by another job that is executed only after the parallel job executions have been joined.
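
Such a combination can be sketched by nesting a ParallelJob inside a SequentialJob; the final job then runs only after all parallel greetings have been joined. The class name MainCombinedWorkflow and the job arguments below are only illustrative and not part of the tutorial sources:

package de.uka.ipd.sdq.workflow.helloworld;

import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.ConsoleAppender;
import org.apache.log4j.PatternLayout;

import de.uka.ipd.sdq.workflow.Workflow;
import de.uka.ipd.sdq.workflow.jobs.ParallelJob;
import de.uka.ipd.sdq.workflow.jobs.SequentialJob;

public class MainCombinedWorkflow {

  public static void main(String[] args) {

    // set up a basic logging configuration (as in the previous examples)
    BasicConfigurator.resetConfiguration();
    BasicConfigurator.configure(new ConsoleAppender(new PatternLayout("%m%n")));

    // three greetings executed in parallel
    ParallelJob greetings = new ParallelJob();
    greetings.add(new HelloWorldJob("Palladio"));
    greetings.add(new HelloWorldJob("Workflow"));
    greetings.add(new HelloWorldJob("Engine"));

    // the follow-up job is only executed after all parallel jobs have been joined
    SequentialJob sequence = new SequentialJob();
    sequence.add(greetings);
    sequence.add(new HelloWorldJob("sequential follow-up"));

    new Workflow(sequence).run();
  }
}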