Semantic composition of Web and Grid Services

Duration: 02/2007-12/2009
Grant: APVV project APVV-0391-06

Project WIKI (access restricted)

Goal of the project

The project aims to deliver an innovative solution in the domain of constructing distributed applications and workflows of web and grid services. At present, no existing system comprehensively solves the problem of automatic workflow composition together with the lookup of effective compositions of components and of suitable data and parameters. The result of the project will also be innovative in the broader domain of SOA applications, since automatic semantics-based composition remains unsolved in other technological branches of SOA as well, not only in the domain of web and grid services. An important innovation in the domain of workflows is the exploitation of semantic representations of web services for the dynamic composition of process models. One possible application of this model is in text mining and text annotation tasks. For example, different pre-processing steps for text documents can be chosen based on semantic information extracted from a set of documents, and the subsequent search for an optimal service workflow can then be automated. A specific innovative element of the proposal is the extension of existing results in service workflow composition to grid services, which are currently experiencing a boom in grid computing and scientific applications.


User Interface

The user interface will be realized as a web portal. Its core will be a portlet container based on the JSR-168 standard, for example GridSphere. Other parts will be interfaces to the Integration and Control modules, and a set of collaboration tools, mainly tools for annotation and experience exchange, discussion groups, and instant messaging. These tools will use the Data Store to manage their information.

Data Store

The core of the system. It will contain semantic information about the available components. This information will be organized in an ontology based on the OWL standard and its dialect OWL-S, which will extend the existing standards with both an ontology of WSRF services and an ontology of application data and parameters.
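The role of the Data Store can be illustrated with a minimal sketch: a registry of service descriptions, each advertising the ontology concepts it consumes and produces, queryable by concept. The class and concept names below (`ServiceProfile`, "PlainText", "TokenizedText") are hypothetical stand-ins for the OWL-S profiles the project envisions, not part of the actual system.

```python
from dataclasses import dataclass

# Hypothetical, drastically simplified stand-in for an OWL-S style profile:
# a service advertises the ontology concepts it consumes and produces.
@dataclass(frozen=True)
class ServiceProfile:
    name: str
    inputs: frozenset   # input concepts, e.g. "PlainText"
    outputs: frozenset  # output concepts, e.g. "TokenizedText"

class DataStore:
    """In-memory registry of semantic service descriptions."""
    def __init__(self):
        self._profiles = []

    def register(self, profile):
        self._profiles.append(profile)

    def providers_of(self, concept):
        """All registered services whose outputs include the given concept."""
        return [p for p in self._profiles if concept in p.outputs]

store = DataStore()
store.register(ServiceProfile("Tokenizer", frozenset({"PlainText"}), frozenset({"TokenizedText"})))
store.register(ServiceProfile("Tagger", frozenset({"TokenizedText"}), frozenset({"TaggedText"})))
print([p.name for p in store.providers_of("TokenizedText")])  # -> ['Tokenizer']
```

A real implementation would of course query an OWL reasoner over the ontology rather than an in-memory list, but the lookup-by-concept interface is the same.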


Collaboration Tools

A set of tools and user interface components that will support team management of service workflows. A specific tool leveraging the semantic capabilities of the system will be an experience-sharing tool based on textual notes. It will enable users to enter context-bound textual information and later provide this information to any other user working in a similar context.
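The "similar context" matching can be sketched as follows, assuming (as an illustration only) that a context is represented as a set of ontology concepts and that similarity is measured by set overlap; the function name, note format, and Jaccard threshold are all assumptions.

```python
# Sketch of context-bound note sharing: each note is stored with the set of
# concepts describing the context it was written in, and is surfaced to any
# user whose current working context overlaps it sufficiently.
def similar_notes(notes, context, threshold=0.5):
    """Return note texts whose stored context overlaps the current context
    by at least `threshold` (Jaccard similarity)."""
    hits = []
    for text, note_ctx in notes:
        union = note_ctx | context
        if union and len(note_ctx & context) / len(union) >= threshold:
            hits.append(text)
    return hits

notes = [
    ("Tokenizer fails on UTF-16 input", {"Tokenizer", "PlainText"}),
    ("Tagger needs model v2", {"Tagger", "TaggedText"}),
]
print(similar_notes(notes, {"Tokenizer", "PlainText", "Corpus"}))
```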


Integration

An important moment in the lifecycle of a compatible application is its initial discovery and semantic description, which is crucial for its later use in semantically driven workflow composition. This will be covered by the Integration module. It will contain a set of tools for semi-automatic generation of semantic data covering service instances, their interfaces, data, and parameters. The module will be accessible through the User Interface.
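The semi-automatic part of that generation can be sketched as a proposal step: a (hypothetical) mapping from low-level interface types to ontology concepts lets the tool suggest a candidate semantic profile, which a human curator then confirms or corrects. The type names, mapping, and operation format below are illustrative assumptions, not the project's actual schema.

```python
# Assumed mapping from interface-level types to ontology concepts; in the
# envisioned system this correspondence would come from the ontology itself.
TYPE_TO_CONCEPT = {
    "xsd:string": "PlainText",
    "app:TokenList": "TokenizedText",
}

def propose_profile(operation):
    """Derive a candidate semantic profile from a raw interface description.
    Unmapped types are flagged as "Unknown" for the curator to resolve."""
    return {
        "service": operation["name"],
        "inputs": [TYPE_TO_CONCEPT.get(t, "Unknown") for t in operation["in"]],
        "outputs": [TYPE_TO_CONCEPT.get(t, "Unknown") for t in operation["out"]],
    }

op = {"name": "tokenize", "in": ["xsd:string"], "out": ["app:TokenList"]}
print(propose_profile(op))
```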


Inference

Establishes a feedback loop that brings updated information into the Data Store. It will receive data from the Control module. This data, containing information about service accessibility, functionality, and other QoS parameters, will be analyzed, and new facts will be entered into the Data Store. These facts will then be used in subsequent workflow construction and execution.
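As a minimal sketch of this feedback step, monitoring events from workflow execution can be aggregated into QoS facts that later composition runs consult. Here the event format and the single derived fact (an availability ratio per service) are assumptions chosen for illustration.

```python
# Aggregate raw monitoring events into QoS facts for the Data Store.
def derive_qos_facts(events):
    """events: list of (service, succeeded: bool) pairs.
    Returns the observed availability ratio per service."""
    totals, ok = {}, {}
    for service, succeeded in events:
        totals[service] = totals.get(service, 0) + 1
        ok[service] = ok.get(service, 0) + (1 if succeeded else 0)
    return {s: ok[s] / totals[s] for s in totals}

events = [("Tokenizer", True), ("Tokenizer", True), ("Tagger", False), ("Tagger", True)]
print(derive_qos_facts(events))  # -> {'Tokenizer': 1.0, 'Tagger': 0.5}
```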


Control

Responsible for the construction and execution of service workflows, using the information contained in the Data Store. Workflow construction will consist of several phases, covering mainly the identification of the user request, construction of the workflow at the profile (interface) level, and instantiation of this workflow with a suitable set of deployed services. The process will be highly automated, but the user will be able to modify the proposed workflow solutions as he or she deems necessary. The resulting workflow will be executed and controlled until the desired results are obtained. The whole execution will be monitored, with all events forwarded to the Inference module.
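The profile-level construction phase can be sketched as simple forward chaining over the registered profiles: starting from the concepts the user already has, greedily add any service whose inputs are satisfied and which produces something new, until the requested output concept appears. This is only an illustration under strong assumptions (flat concept sets, no subsumption reasoning, greedy search rather than planning); the `compose` function and profile format are hypothetical.

```python
# Profile-level workflow construction by forward chaining over service profiles.
def compose(profiles, have, goal):
    """profiles: {name: (input_concepts, output_concepts)}.
    Returns an ordered list of service names producing `goal`, or None."""
    plan, have = [], set(have)
    while goal not in have:
        step = next((name for name, (ins, outs) in profiles.items()
                     if name not in plan and ins <= have and outs - have), None)
        if step is None:
            return None  # no workflow exists at the profile level
        plan.append(step)
        have |= profiles[step][1]
    return plan

profiles = {
    "Tokenizer": ({"PlainText"}, {"TokenizedText"}),
    "Tagger": ({"TokenizedText"}, {"TaggedText"}),
}
print(compose(profiles, {"PlainText"}, "TaggedText"))  # -> ['Tokenizer', 'Tagger']
```

In the envisioned system this step would be followed by instantiation, i.e. binding each profile in the plan to a concrete deployed service, guided by the QoS facts fed back by the Inference module.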