WorkflowHadoopAdaptor

From Gcube Wiki
Revision as of 17:13, 24 February 2010 by Giorgos.papanikos (Talk | contribs)


Overview

This adaptor, as part of the adaptors offered by the WorkflowEngine, constructs an Execution Plan that mediates the submission of a job written under the Map Reduce design pattern and against the utilities offered by the Hadoop infrastructure. After its submission the job is monitored for its status, and once it completes the output files are retrieved and stored in the StorageSystem. The resources that are provided and need to be moved to the Hadoop infrastructure are all transferred through the StorageSystem: they are stored once the plan is constructed and are retrieved once the execution is started. The Hadoop infrastructure utilized resides in a cloud infrastructure with which the ExecutionEngine negotiates resource availability.
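The store-at-construction, retrieve-at-execution staging described above can be illustrated with a small sketch. The names below (InMemoryStorageSystem, store, retrieve, construct_plan, start_execution) are hypothetical stand-ins for illustration, not the actual gCube StorageSystem API: each provided resource is stored under a generated identifier when the plan is built, and the identifiers are resolved back to payloads when execution starts.

```python
import uuid

class InMemoryStorageSystem:
    """Hypothetical stand-in for the StorageSystem used to move resources."""

    def __init__(self):
        self._payloads = {}

    def store(self, payload: bytes) -> str:
        # Store the payload and hand back an identifier to keep in the plan.
        ref = str(uuid.uuid4())
        self._payloads[ref] = payload
        return ref

    def retrieve(self, ref: str) -> bytes:
        # Resolve an identifier recorded in the plan back to its payload.
        return self._payloads[ref]

def construct_plan(storage, resources):
    """Plan construction: every provided resource is stored up front."""
    return {name: storage.store(data) for name, data in resources.items()}

def start_execution(storage, plan):
    """Execution start: the stored resources are retrieved on the target side."""
    return {name: storage.retrieve(ref) for name, ref in plan.items()}
```

For example, a job archive and its configuration stored at plan-construction time come back byte-for-byte identical when the execution later retrieves them through their plan references.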

Plan Template

The entire execution process takes place in the gLite Grid UI node. This node is picked from the InformationSystem and is currently chosen randomly from all the available ones. Once the node has been picked, the execution cannot be moved to a different one, even if there is a problem communicating with that node. The execution consists of the following steps, performed sequentially:

  • Contact the remote node
  • Retrieve the data stored in the StorageSystem; this includes the resources marked as Configuration, Input Data, and JDL description
  • Submit the job using the provided JDL file, any optional additional configuration, and the provided user proxy certificate
  • Loop until either the job is completed or the timeout, if one has been set, has expired
    • Wait for a defined period
    • Retrieve the job status
    • Retrieve the job logging info
    • Process the results of the above two steps
  • Check the reason the loop ended
  • If a timeout occurred, cancel the job
  • If the job terminated successfully, retrieve its output files
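The monitoring loop in the steps above can be sketched as follows. The job interface (get_status, get_logging_info, cancel, fetch_outputs) is a hypothetical abstraction over the commands actually issued on the remote node, and the polling interval and timeout values are illustrative only.

```python
import time

class Job:
    """Hypothetical job handle; completes after a fixed number of status polls."""

    def __init__(self, finishes_after_polls):
        self._polls = 0
        self._finishes_after = finishes_after_polls
        self.cancelled = False

    def get_status(self):
        self._polls += 1
        return "DONE" if self._polls >= self._finishes_after else "RUNNING"

    def get_logging_info(self):
        return f"poll #{self._polls}"

    def cancel(self):
        self.cancelled = True

    def fetch_outputs(self):
        return ["output.part-00000"]

def monitor(job, poll_interval=0.01, timeout=None):
    """Poll the job until it completes or the optional timeout expires."""
    deadline = None if timeout is None else time.monotonic() + timeout
    while True:
        time.sleep(poll_interval)          # wait for the defined period
        status = job.get_status()          # retrieve the job status
        logs = job.get_logging_info()      # retrieve the job logging info
        # Process the results of the two retrievals above:
        if status == "DONE":               # terminated successfully
            return job.fetch_outputs()     # retrieve the output files
        if deadline is not None and time.monotonic() >= deadline:
            job.cancel()                   # timeout expired: cancel the job
            return None
```

A job that completes within the deadline yields its output files; one that does not is cancelled and yields nothing, mirroring the two exit branches of the loop.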