Ontology Alignment Evaluation Initiative - SEALS platform evaluation modalities for OAEI 2011 Campaign

SEALS platform evaluation modalities for OAEI 2011

SEALS

The SEALS project is dedicated to the evaluation of semantic web technologies. To that end, it is creating a platform for easing this evaluation, organising evaluation campaigns, and building a community of tool providers and tool users around this evaluation activity.

OAEI and SEALS are closely coordinated in the area of ontology matching. The SEALS platform covers other areas as well. We plan to progressively integrate the SEALS platform into the OAEI evaluation. Starting in 2010, three tracks (benchmark, anatomy, conference) have been supported by SEALS through a web-based evaluation service. In 2011 we go a step further and will deploy and execute participating matching tools on the SEALS platform. Currently, the same three tracks (benchmark, anatomy, conference) are planned to be conducted in this modality.

Process

Participants of SEALS-supported tracks have to follow this procedure (some participants of OAEI 2010 have already completed the first two steps):

  1. Create a user account in the SEALS platform and register your matching tool. Go to the SEALS portal (http://www.seals-project.eu/join-the-community) and create a user account. Logged in as a user, you can register a tool on the platform (click on the link 'Register your tool'). This requires that you briefly describe your tool and categorize it by choosing the menu entry 'Ontology Mapping Tool'. It does not require that you upload the tool itself.
  2. Wrap the current version of your matching tool. This step is described in detail in a tutorial (see the section below). It requires creating a small Java class that acts as a bridge between the platform and your tool; a minimal sketch of such a bridge is shown after this list. Further, you have to put the libraries used by your tool into a specific folder structure.
  3. Upload the wrapped version of your matching tool and ask for a dry run. (Deadline 01.09.2011; earlier requests are highly welcome and recommended.) Once you have finished wrapping your tool and successfully run the tests described in the tutorial, you can upload it. Go to the SEALS portal (http://www.seals-project.eu/) and log in. Click on the link 'Upload a tool version', choose your tool, specify version information and upload the zip file. On request, we evaluate your system on the platform to ensure compatibility. This step is not yet automated and requires us to manually run your wrapped tool inside the platform. To request this, write an email (contact addresses at the end of this page) with your evaluation request, referring to the uploaded version of your tool. We will answer a few days (up to one week) later to tell you whether your system can be run on the SEALS platform and generates meaningful results. In the future this step will be fully automated; for OAEI 2011 it will be handled as described here. In case of problems, we suggest iterating over steps 2 and 3 until your tool runs correctly.
  4. Prepare, wrap and upload the final version of your matching tool. (Final Deadline 23.09.2011) Many system developers work hard to improve their systems until the final deadline. Please wrap the final version of your system as you did for the test version in Step 3. For the final evaluation we will use the final version that has been uploaded to the SEALS portal prior to the deadline. Please add a remark in the description of the version to indicate that you want to participate with this version in OAEI 2011.
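
As an illustration of Step 2, the following is a minimal sketch of a bridge class. It assumes the pattern described in the tutorial: the platform calls the bridge with the URLs of the two ontologies and expects back the URL of a file containing the alignment in the Alignment format. The exact interface (or abstract class) to implement, its package and its method signatures are specified in the tutorial; the class and method names used here (MyMatcherBridge, runMyMatcher) are only placeholders.

    import java.io.File;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;

    // Sketch of a bridge class between the SEALS platform and your matcher.
    // The real bridge has to implement the interface defined in the tutorial;
    // only the general shape of the align method is shown here.
    public class MyMatcherBridge {

        // Called with the URLs of the source and target ontology; returns the
        // URL of a file containing the alignment produced by your tool.
        public URL align(URL source, URL target) throws Exception {
            String alignmentRdf = runMyMatcher(source, target);
            File out = File.createTempFile("alignment", ".rdf");
            Files.write(out.toPath(), alignmentRdf.getBytes(StandardCharsets.UTF_8));
            return out.toURI().toURL();
        }

        // Placeholder for the actual matching logic of your system; it should
        // return the alignment serialised in the Alignment format.
        private String runMyMatcher(URL source, URL target) {
            return "<?xml version='1.0' encoding='utf-8'?><rdf:RDF xmlns:rdf='http://www.w3.org/1999/02/22-rdf-syntax-ns#'/>";
        }
    }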

Once these steps have been completed, we run all systems on the SEALS platform and generate the results. Result visualization in the SEALS portal is currently an open issue. It has not yet been decided whether the results will finally be presented via the SEALS portal (with a personal view for each participant) or via result pages (similar to previous years).

Tutorial on Tool Wrapping

We have prepared a comprehensive PDF tutorial on wrapping an ontology matching tool. You can download it here:

Several additional materials mentioned in the tutorial are available here.

We encourage developers to use the Alignment API. For developers using it, the following Ant package is available for packaging and validating the wrapped tools:
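
If you are new to the Alignment API, the sketch below illustrates the usual implementation pattern from its documentation: extend URIAlignment and implement AlignmentProcess. The single correspondence added here (Person/Human with confidence 1.0) is purely illustrative and stands in for real matching logic.

    import java.net.URI;
    import java.util.Properties;

    import org.semanticweb.owl.align.Alignment;
    import org.semanticweb.owl.align.AlignmentException;
    import org.semanticweb.owl.align.AlignmentProcess;

    import fr.inrialpes.exmo.align.impl.URIAlignment;

    // Minimal matcher built on the Alignment API: extend URIAlignment and
    // implement AlignmentProcess. Real matching logic goes into align().
    public class MySimpleMatcher extends URIAlignment implements AlignmentProcess {

        public void align(Alignment initialAlignment, Properties params) throws AlignmentException {
            // The two ontologies have been set beforehand via init(uri1, uri2).
            URI onto1 = getOntology1URI();
            URI onto2 = getOntology2URI();
            try {
                // Illustrative correspondence: an equivalence with confidence 1.0.
                addAlignCell(new URI(onto1 + "#Person"), new URI(onto2 + "#Human"), "=", 1.0);
            } catch (java.net.URISyntaxException e) {
                throw new AlignmentException("Invalid entity URI: " + e.getMessage());
            }
        }
    }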

The tutorial also shows how you can use your wrapped tool to run a full evaluation locally. This is an additional benefit of the tutorial and the client delivered with it: you can compute precision and recall for the OAEI 2010 datasets (some of which will be changed slightly for OAEI 2011) and for the new Benchmark2 dataset of 2011. You can report these results in your results paper in case the full OAEI 2011 results are not available in time. A sketch of such a local comparison against a reference alignment is shown below.
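
The following is a minimal sketch of comparing one produced alignment against a reference alignment with the Alignment API's PRecEvaluator; the file names are placeholders, and the client delivered with the tutorial automates this over whole datasets.

    import java.io.File;
    import java.util.Properties;

    import org.semanticweb.owl.align.Alignment;

    import fr.inrialpes.exmo.align.impl.eval.PRecEvaluator;
    import fr.inrialpes.exmo.align.parser.AlignmentParser;

    // Sketch: compute precision and recall of one produced alignment against
    // the reference alignment of a test case. File names are placeholders.
    public class LocalEvaluation {
        public static void main(String[] args) throws Exception {
            AlignmentParser parser = new AlignmentParser(0);
            Alignment reference = parser.parse(new File("refalign.rdf").toURI());
            Alignment produced  = parser.parse(new File("myalignment.rdf").toURI());

            PRecEvaluator evaluator = new PRecEvaluator(reference, produced);
            evaluator.eval(new Properties());

            System.out.println("Precision: " + evaluator.getPrecision());
            System.out.println("Recall:    " + evaluator.getRecall());
        }
    }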

Track specific participation

A system that plans to participate in one of the SEALS-supported tracks will be evaluated on all tracks supported by SEALS. This means that it is no longer possible to participate in only one of these tracks, e.g., just the anatomy track. We know that this can be a problem for some systems that have been developed specifically for, e.g., matching biomedical ontologies. However, this point can still be emphasized in the results paper that you have to write about your system. In other words, if the results generated for some specific track are not good at all, there is a place where this can be explained in an appropriate way.

This rule holds only for the tracks supported by SEALS!

Web service based evaluation (used in 2010)

In its first, simple incarnation, the SEALS platform allowed matchers to be evaluated as web services. Participants had to implement a very simple web service, provide its URL to the platform and could view the results online. This service is still available at http://www.seals-project.eu/ontology-matching-evaluation-ui. It is still maintained, but it will not be used for generating the alignments and results of OAEI 2011. Note that using this service requires a user account in the SEALS platform. The instructions for implementing your matcher as a web service are still hosted at http://alignapi.gforge.inria.fr/tutorial/tutorial5/
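
For reference, a matcher in this 2010 modality was essentially a web service exposing an align operation that takes the two ontology URLs and returns the alignment. The sketch below uses JAX-WS and is only an illustration; the actual operation name, parameters and WSDL contract expected by the platform are specified in the tutorial linked above.

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Illustrative JAX-WS service with a single align operation; the exact
    // interface expected by the SEALS web service modality is defined in the
    // tutorial referenced above.
    @WebService
    public class MatcherWebService {

        @WebMethod
        public String align(String ontologyUrl1, String ontologyUrl2) {
            // Call your matching system here and return the alignment
            // serialised in the Alignment format.
            return "<?xml version='1.0' encoding='utf-8'?><rdf:RDF xmlns:rdf='http://www.w3.org/1999/02/22-rdf-syntax-ns#'/>";
        }

        public static void main(String[] args) {
            // Publish the service locally so that its URL can be registered with the platform.
            Endpoint.publish("http://localhost:8080/matcher", new MatcherWebService());
        }
    }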

Contacts

Do not hesitate to contact christian # informatik.uni-mannheim : de and Cassia.Trojahn # inrialpes : fr with any questions, whether they concern the overall procedure, problems with tool wrapping, or anything else. And do not forget to send us your evaluation request (the earlier, the better)!

Acknowledgement

While developing and improving the tutorial, we have been in contact with several matching tool developers in order to have some 'reference' matchers for testing the tutorial and the client that comes with it. Thanks go to Hua Wei Khong Watson (Eff2match), Peigang Xu (Falcon-OA), Faycal Hamdi (Taxomap), Peng Wang (Lily), Zhichun Wang (RiMOM), and Cosmin Stroe (AgreementMaker).