The SEALS project was dedicated to the evaluation of semantic web technologies. To that end, it created an infrastructure for easing this evaluation, organizing evaluation campaigns, and building a community of tool providers and tool users around this evaluation activity. The SEALS infrastructure is now routinely used in ontology matching evaluation.
To ease communication between participants and track organizers, we will have two OAEI contact points: Ernesto Jimenez-Ruiz (ernesto [.] jimenez [.] ruiz [at] gmail [.] com) and Daniel Faria (daniel [.] faria [.] 81 [at] gmail [.] com). General questions about the OAEI campaign (deadlines, registration, tool wrapping, submissions, rules, datasets, etc.) should be addressed to Ernesto. Technical questions about the use of the SEALS client and potential library incompatibilities should be addressed to Daniel.
Participants have to follow this procedure:
In previous campaigns, some tracks experienced problems running all the tools under the same JDK version: some participants still use JDK 1.6.x, while new participants tend to use JDK 1.7 or even 1.8. To facilitate the evaluation process, please try to run your tool under JDK 1.7. If this is not possible for you, please keep us informed.
Once these steps have been completed, we run all systems on the SEALS infrastructure and generate the results.
We have prepared a comprehensive PDF tutorial on wrapping and testing an ontology matching tool. You can download it here:
Several additional materials mentioned in the tutorial are available here.
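To give a first idea of what the tutorial covers, the sketch below shows the general shape of a SEALS wrapper: a bridge class that the SEALS client calls with the URLs of the two ontologies and that returns the URL of a file containing the produced alignment. The package, interface and exception names are written from memory and may differ from the current client version, and runMyMatcher stands for your own matching code, so please rely on the tutorial for the exact signatures.

    import java.io.File;
    import java.net.URL;

    import eu.sealsproject.platform.res.domain.omt.IOntologyMatchingToolBridge;
    import eu.sealsproject.platform.res.tool.api.ToolBridgeException;
    import eu.sealsproject.platform.res.tool.impl.AbstractPlugin;

    public class MyToolBridge extends AbstractPlugin implements IOntologyMatchingToolBridge {

        // Called by the SEALS client with the URLs of the two ontologies to match.
        public URL align(URL source, URL target) throws ToolBridgeException {
            try {
                // Run your matching code and write the resulting alignment
                // (in the Alignment RDF format) to a file.
                File result = File.createTempFile("alignment", ".rdf");
                runMyMatcher(source, target, result); // hypothetical entry point to your matcher
                return result.toURI().toURL();
            } catch (Exception e) {
                throw new ToolBridgeException("Matching failed", e);
            }
        }

        // Variant used by tracks that provide an input alignment; ignored here.
        public URL align(URL source, URL target, URL inputAlignment) throws ToolBridgeException {
            return align(source, target);
        }

        public boolean canExecute() {
            return true;
        }

        private void runMyMatcher(URL source, URL target, File output) {
            // ... your matching logic goes here ...
        }
    }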
We have also prepared a tutorial on communicating with the Oracle in the interactive track:
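As a rough illustration of what that tutorial describes, an interactive matcher asks the Oracle whether a candidate correspondence is correct and uses the answer to decide whether to keep it. The class and method names below are assumptions based on past campaigns; the tutorial documents the exact API and package.

    import eu.sealsproject.omt.client.interactive.Oracle; // assumed location; see the tutorial

    public class OracleExample {

        // Ask the simulated user whether an equivalence correspondence between
        // two entities holds; the Oracle answers according to the reference
        // alignment and records every question asked by the tool.
        public boolean validate(String sourceUri, String targetUri) {
            return Oracle.check(sourceUri, targetUri, "=");
        }
    }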
In the past we detected problems caused by libraries that conflict with those used by the SEALS client. We encourage you, whenever possible, to use the same library versions as the SEALS client. Please check the Maven dependencies of the SEALS client here. Note that you must not include the SEALS client jar file within your wrapped system.
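If you happen to build your tool with Maven (an assumption, not a requirement), one way to satisfy the last point is to declare the SEALS client dependency with 'provided' scope, so that your code compiles against it without the jar being packaged into your wrapped system. The coordinates below are placeholders; take them from the dependency list linked above.

    <dependency>
      <!-- placeholders: use the groupId, artifactId and version from the SEALS client POM linked above -->
      <groupId>...</groupId>
      <artifactId>...</artifactId>
      <version>...</version>
      <!-- "provided": available at compile time, but not bundled with your wrapped system -->
      <scope>provided</scope>
    </dependency>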
We also encourage developers to use the Alignment API. For developers using it, the following ant package is available for packaging and validating the wrapped tools:
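For reference, a matcher built on the Alignment API is typically a class that extends one of its alignment implementations and implements AlignmentProcess, roughly as in the minimal sketch below. The matched entities are purely illustrative; a real matcher would parse both ontologies and compare their entities.

    import java.net.URI;
    import java.util.Properties;

    import org.semanticweb.owl.align.Alignment;
    import org.semanticweb.owl.align.AlignmentException;
    import org.semanticweb.owl.align.AlignmentProcess;

    import fr.inrialpes.exmo.align.impl.URIAlignment;

    public class MyAlignmentMatcher extends URIAlignment implements AlignmentProcess {

        // Called after the alignment has been initialised with the two ontology URIs.
        public void align(Alignment inputAlignment, Properties parameters) throws AlignmentException {
            try {
                // Purely illustrative: add a single equivalence correspondence.
                URI e1 = new URI(getOntology1URI().toString() + "#Person");
                URI e2 = new URI(getOntology2URI().toString() + "#Person");
                addAlignCell(e1, e2, "=", 1.0);
            } catch (Exception e) {
                throw new AlignmentException("Cannot align", e);
            }
        }
    }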
The tutorial also shows how you can use your wrapped tool to run a full evaluation locally. Thus, you can compute precision and recall for any of the test suites listed on the track web pages at any point in your development process.
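Besides running the full evaluation with the client, you can also check precision and recall programmatically during development using the Alignment API evaluators. Below is a minimal sketch, assuming the Alignment API is on the classpath; the file names are illustrative.

    import java.net.URI;
    import java.util.Properties;

    import org.semanticweb.owl.align.Alignment;

    import fr.inrialpes.exmo.align.impl.eval.PRecEvaluator;
    import fr.inrialpes.exmo.align.parser.AlignmentParser;

    public class LocalEvaluation {
        public static void main(String[] args) throws Exception {
            AlignmentParser parser = new AlignmentParser(0);
            Alignment reference = parser.parse(new URI("file:refalign.rdf"));    // reference alignment
            Alignment produced  = parser.parse(new URI("file:myalignment.rdf")); // your tool's output

            PRecEvaluator evaluator = new PRecEvaluator(reference, produced);
            evaluator.eval(new Properties());
            System.out.println("Precision: " + evaluator.getPrecision());
            System.out.println("Recall:    " + evaluator.getRecall());
        }
    }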
Do not hesitate to contact Ernesto Jimenez-Ruiz (ernesto [.] jimenez [.] ruiz [at] gmail [.] com) and Daniel Faria (daniel [.] faria [.] 81 [at] gmail [.] com) for any question you may have.
While developing and improving the tutorial, we have been in contact with several matching tool developers in order to have some 'reference' matchers for testing our tutorial and the client that comes with it. Thanks go out to Hua Wei Khong Watson (Eff2match), Peigang Xu (Falcon-OA), Faycal Hamdi (Taxomap), Peng Wang (Lily), Zhichun Wang (RiMOM), and Cosmin Stroe (AgreementMaker).