Ontology Alignment Evaluation Initiative - OAEI 2015 Campaign

MultiFarm Preliminary Results for OAEI 2015

This page presents the results of the OAEI 2015 campaign for the MultiFarm track. Details on this data set can be found on the MultiFarm data set page. If you notice any kind of error (wrong numbers, incorrect information on a matching system, etc.), do not hesitate to contact us (see the contact information in the last paragraph of this page).

Experimental setting

Since 2014, part of the data set has been used for a kind of blind evaluation. This subset includes all matching tasks involving the edas and ekaw ontologies (resulting in 55x24 matching tasks -- including Arabic and Italian translations), which were not used in previous campaigns. We refer to this blind evaluation as the edas and ekaw based evaluation in the following. Participants were able to test their systems on the available subset of matching tasks (open evaluation) via the SEALS repository. The open subset comprises 45x25 tasks (it does not include the Italian translations).

We can distinguish two types of matching tasks: (i) those where two different ontologies (cmt-confOf, for instance) have been translated into two different languages; and (ii) those where the same ontology (cmt-cmt) has been translated into two different languages. For the tasks of type (ii), good results are not directly related to the use of specific techniques for dealing with cross-lingual ontologies, but rather to the ability to exploit the fact that both ontologies have an identical structure.
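As a simple illustration of this distinction (a minimal Python sketch; the Task structure and helper below are our own, not part of the evaluation tooling), a task is of type (ii) exactly when both sides are translations of the same ontology:

    from typing import NamedTuple

    class Task(NamedTuple):
        onto1: str  # e.g. "cmt"
        lang1: str  # e.g. "de"
        onto2: str  # e.g. "confOf"
        lang2: str  # e.g. "pt"

    def task_type(task: Task) -> str:
        # Type (ii): the same ontology translated into two languages;
        # type (i): two different ontologies in two different languages.
        return "ii" if task.onto1 == task.onto2 else "i"

    print(task_type(Task("cmt", "de", "confOf", "pt")))  # -> i
    print(task_type(Task("cmt", "es", "cmt", "ru")))     # -> ii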

This year, 5 systems implement cross-lingual matching strategies: AML, CLONA, LogMap, LYAM++ and XMap. This number has increased with respect to the last campaign (3 in 2014, after 7 in 2013 and 7 in 2012). Most of them integrate a translation module in their implementation (AML, CLONA, LogMap and XMap), while one of them, LYAM++, applies an alternative strategy based on the multilingual resource BabelNet.
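The general idea behind the translation-based strategy can be sketched as follows (a minimal, hypothetical Python sketch: the toy dictionary stands in for an external translator or a multilingual resource such as BabelNet, and the code is not taken from any of the systems above, which rely on richer lexical similarity measures):

    # Toy translation table; real systems query a translator or a
    # multilingual resource (e.g. BabelNet) instead.
    TOY_DICTIONARY = {("de", "en"): {"autor": "author", "beitrag": "contribution"}}

    def translate(label, source_lang, target_lang):
        table = TOY_DICTIONARY.get((source_lang, target_lang), {})
        return table.get(label.lower(), label)

    def cross_lingual_candidates(labels1, lang1, labels2, lang2):
        # Translate the labels of the first ontology into the language of the
        # second one and match them by normalised string equality.
        index = {label.lower(): entity for entity, label in labels2.items()}
        candidates = []
        for entity, label in labels1.items():
            translated = translate(label, lang1, lang2).lower()
            if translated in index:
                candidates.append((entity, index[translated]))
        return candidates

    # A German "Autor" class is matched to an English "Author" class.
    print(cross_lingual_candidates({"#Autor": "Autor"}, "de",
                                   {"#Author": "Author"}, "en"))
    # -> [('#Autor', '#Author')]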

Evaluation results

Open evaluation

Execution setting and runtime

The systems have been executed on a Debian Linux VM configured with four processors and 20GB of RAM, running on a Dell PowerEdge T610 with 2*Intel Xeon Quad Core 2.26GHz E5607 processors. All measurements are based on a single run. Some systems have been executed in a different setting (MAMBA due to issues with the Gurobi optimizer, LogMap due to network problems when accessing the translator servers, and LYAM++ due to issues with the BabelNet license).

We can observe large differences in the time required for a system to complete the 45x25 matching tasks. However, we experienced some problems when accessing the SEALS test repositories due to the many concurrent accesses to the server (i.e., tracks running their evaluations in parallel). Hence, the reported runtime may not reflect the real execution time required for completing the tasks.

Overall results

The table below presents the aggregated results for the open subset, for the test cases of types (i) and (ii). The results have been computed using the Alignment API 4.6. We do not apply any threshold on the confidence measure. We observe significant differences between the results obtained for each type of matching task, especially in terms of precision, for most systems, with smaller differences in terms of recall. As expected, in terms of F-measure, the systems implementing cross-lingual techniques outperform the non-cross-lingual systems for test cases of type (i). For these cases, non-specific matchers achieve good precision but generate very few correspondences. While LogMap has the best precision (at the expense of recall), AML has similar results in terms of precision and recall and outperforms the other systems in terms of F-measure (which is the case for both types of tasks). For type (ii), CroMatcher takes advantage of the ontology structure and performs better than some specific cross-lingual systems. The reader can refer to the OAEI paper for a more detailed discussion of these results.

Different ontologies (i) Same ontologies (ii)
System Time #pairs Size Prec. F-m. Rec. Size Prec. F-m. Rec.
AML 10 45 11.58 .53(.53) .51(.51) .50(.50) 58.29 .93(.93) .64(.64) .50(.50)
CLONA 1629 45 9.45 .46(.46) .39(.39) .35(.35) 50.89 .91(.91) .58(.58) .42(.42)
LogMap* 36 45 6.37 .75(.75) .41(.41) .29(.29) 42.83 .95(.95) .45(.45) .30(.30)
LYAM++* - 13 12.29 .14(.50) .14(.49) .14(.44) 64.20 .26(.90) .19(.66) .15(.53)
XMap 4012 45 36.39 .22(.23) .24(.25) .27(.28) 61.65 .66(.69) .37(.39) .27(.29)
CroMatcher 257 45 10.72 .30(.30) .07(.07) .04(.04) 66.02 .78(.78) .55(.55) .45(.45)
DKP-AOM 11 19 2.53 .39(.92) .03(.08) .01(.04) 4.23 .50(.99) .01(.02) .01(.01)
GMap 2069 21 1.69 .37(.80) .03(.06) .01(.03) 3.13 .67(.98) .01(.02) .01(.01)
LogMap-C 56 19 1.41 .38(.90) .03(.09) .02(.04) 3.68 .35(.56) .01(.03) .01(.01)
LogMapLite 13 19 1.29 .39(.91) .04(.08) .02(.04) 3.70 .32(.57) .01(.03) .01(.01)
Mamba* 297 21 1.52 .36(.78) .06(.13) .03(.07) 3.68 .48(.99) .02(.05) .01(.03)
RSDLWB 14 45 30.71 .01(.01) .01(.01) .01(.01) 43.71 .20(.20) .11(.11) .08(.08)
MultiFarm aggregated results per matcher, for each type of matching task -- different ontologies (i) and same ontologies (ii). Time is measured in minutes. Tools marked with an * have been executed in a different setting. #pairs indicates the number of pairs of languages for which the tool is able to generate (non-empty) alignments. Size indicates the average number of correspondences generated for the tests where a (non-empty) alignment has been produced. Two kinds of results are reported: those that do not distinguish empty and erroneous (or not generated) alignments, and those -- indicated between parentheses -- that consider only the non-empty alignments generated for a pair of languages.
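As a minimal sketch of how such scores can be obtained (an approximation for illustration, assuming simple micro-averaging over set-based comparisons of correspondences; this is not the code of the Alignment API evaluators):

    import math

    def prf(results, only_non_empty=False):
        # `results` is a list of (found, reference) pairs, where `found` is the
        # set of generated correspondences (empty or None when no alignment was
        # produced) and `reference` is the set of reference correspondences.
        if only_non_empty:                 # the values shown in parentheses
            results = [(f, r) for f, r in results if f]
        found = sum(len(f) for f, _ in results if f)
        correct = sum(len(set(f) & set(r)) for f, r in results if f)
        expected = sum(len(r) for _, r in results)
        precision = correct / found if found else math.nan
        recall = correct / expected if expected else math.nan
        f_measure = (2 * precision * recall / (precision + recall)
                     if precision + recall else 0.0)
        return precision, f_measure, recall

    # One task with 2 correct correspondences out of 4 found (3 expected),
    # and one task where the matcher produced no alignment at all.
    tasks = [({("a", "x"), ("b", "y"), ("c", "z"), ("d", "w")},
              {("a", "x"), ("b", "y"), ("e", "v")}),
             (set(), {("f", "u"), ("g", "t")})]
    print(prf(tasks))                       # counts the empty alignment
    print(prf(tasks, only_non_empty=True))  # parenthesised variant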

Language specific results (type i)

The table below presents the results per pair of languages, for the tasks involving the matching of different ontologies (test cases of type i). With the exception of CroMatcher and RSDLWB, non-specific systems are not able to deal with all pairs of languages, in particular those involving Arabic, Chinese and Russian. In the absence of specific strategies, they instead take advantage of the similarities in the vocabulary of some languages. This is corroborated by the fact that most of them obtain their best F-measure for the pair es-pt (followed by de-en): CroMatcher (es-pt .28, de-en .23), DKP-AOM (es-pt .25, de-en .22), GMap (es-pt .21, fr-nl .20), LogMap-C (es-pt .26, de-en .18), LogMapLite (es-pt .25, de-en .22), and Mamba (es-pt .29, en-nl .23, de-en .22). This behavior was also observed last year. On the other hand, although it is likely harder to find correspondences for cz-pt than for es-pt, this pair is present in the top-3 F-measure of some non-specific systems (with the exception of Mamba: es-pt, en-nl, de-en, en-es, de-nl).

For the group of systems implementing cross-lingual strategies, some pairs involving Czech (cz-en, cz-es, cz-pt, cz-de, cz-ru) are again present in the top-5 F-measure of 4 systems out of 5 (the exception being LYAM++): AML - cz-en (.63), cz-ru (.62), cz-es (.61), cz-nl (.60), en-es (.59); CLONA - es-ru (.53), cz-es (.51), es-pt (.51), cz-en (.50) and cz-ru (.49); LogMap - cz-de (.55), cz-pt (.54), cz-ru (.53), cz-nl and cz-en (.52); XMAP - cz-es (.52), cz-pt (.50), en-es (.48), cz-ru (.45), and de-es (.45). LYAM++ is the exception, since it was not able to generate alignments for some of these pairs: es-fr (.56), en-es (.53), es-pt (.52), en-ru (.52) and en-fr (.52). A different behavior is observed for the tasks of type (ii), for which these systems perform better for the pairs en-pt, es-fr, en-fr, de-en and es-pt. The exception is LogMap (es-ru, es-nl and fr-nl).

MultiFarm results per pair of languages (H-mean) for the test cases of type (i). For each system, the three values in a cell are precision, F-measure and recall.
Cross-lingual systems Non-specific systems
AML CLONA LogMap LYAM++ XMap CroMatcher DKP-AOM GMap LogMap-C LogMap-Lite Mamba RSDLWB
ar-cn .46 .33 .26 .44 .29 .22 .75 .25 .15 NaN NaN .00 .15 .17 .18 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .01
ar-cz .51 .50 .49 .47 .36 .29 .82 .44 .30 NaN NaN .00 .25 .27 .29 .02 .03 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .01 .01
ar-de .32 .30 .29 .29 .22 .18 .46 .21 .14 NaN NaN .00 .14 .15 .15 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .01
ar-en .56 .56 .56 .44 .33 .27 .77 .44 .31 NaN NaN .00 .21 .24 .28 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .01 .01
ar-es .45 .46 .48 .45 .37 .32 .74 .43 .30 NaN NaN .00 .28 .29 .30 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .01 .02
ar-fr .12 .12 .11 .14 .12 .10 .18 .09 .06 NaN NaN .00 .04 .05 .05 .00 .00 .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 NaN .00
ar-nl .47 .45 .43 .36 .26 .20 .81 .43 .30 NaN NaN .00 .22 .24 .26 .02 .03 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .01
ar-pt .50 .50 .49 .48 .38 .31 .84 .45 .31 NaN NaN .00 .29 .28 .28 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .01 .02
ar-ru .48 .44 .41 .44 .33 .26 .79 .46 .32 NaN NaN .00 .18 .20 .23 .02 .03 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .00
cn-cz .58 .45 .36 .48 .36 .28 .86 .39 .26 NaN NaN .00 .28 .29 .29 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .01 .01
cn-de .58 .49 .42 .49 .34 .26 .78 .27 .16 NaN NaN .00 .27 .28 .29 .02 .02 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .01
cn-en .60 .55 .50 .39 .29 .23 .89 .29 .17 NaN NaN .00 .25 .27 .30 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .02 .03
cn-es .58 .53 .49 .47 .39 .34 .73 .34 .22 NaN NaN .00 .30 .30 .30 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .01 .02
cn-fr .57 .52 .48 .42 .36 .32 .73 .31 .20 NaN NaN .00 .30 .31 .33 .02 .02 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .02 .02 .04
cn-nl .48 .34 .27 .40 .30 .24 .79 .28 .17 NaN NaN .00 .22 .23 .24 .02 .02 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .01 .02
cn-pt .56 .50 .46 .48 .35 .28 .81 .31 .19 NaN NaN .00 .28 .27 .26 .02 .02 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .01 .02
cn-ru .50 .39 .32 .42 .34 .28 .78 .41 .28 NaN NaN .00 .23 .25 .28 .01 .01 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .01 .02
cz-de .55 .52 .49 .50 .42 .37 .82 .55 .41 NaN NaN .00 .39 .42 .45 .46 .11 .07 .85 .08 .04 1.00 .04 .02 .85 .08 .04 .85 .08 .04 1.00 .11 .06 .00 .00 .01
cz-en .61 .63 .65 .53 .50 .46 .82 .52 .39 NaN NaN .00 .37 .43 .52 .59 .12 .07 1.00 .05 .02 1.00 .06 .03 1.00 .06 .03 1.00 .07 .04 .79 .19 .11 .01 .01 .02
cz-es .58 .61 .65 .52 .51 .50 .74 .50 .38 NaN NaN .00 .48 .52 .59 .40 .10 .06 1.00 .12 .07 1.00 .06 .03 1.00 .12 .07 1.00 .12 .07 1.00 .16 .09 .00 .01 .01
cz-fr .57 .57 .57 .46 .42 .39 .77 .49 .36 NaN NaN .00 .41 .45 .50 .18 .02 .01 1.00 .02 .01 1.00 .02 .01 1.00 .02 .01 1.00 .02 .01 .75 .09 .05 .01 .01 .02
cz-nl .60 .60 .60 .48 .44 .40 .77 .52 .39 NaN NaN .00 .39 .43 .49 .82 .11 .06 1.00 .04 .02 .56 .07 .04 1.00 .04 .02 1.00 .06 .03 .61 .10 .06 .00 .01 .01
cz-pt .57 .59 .62 .49 .47 .46 .80 .54 .41 NaN NaN .00 .46 .50 .54 .80 .20 .11 1.00 .14 .07 1.00 .11 .06 1.00 .14 .07 1.00 .14 .08 1.00 .18 .10 .01 .02 .03
cz-ru .63 .62 .62 .53 .49 .45 .81 .53 .39 NaN NaN .00 .39 .45 .52 .02 .02 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .02 .03
de-en .53 .54 .55 .43 .38 .35 .79 .50 .36 NaN NaN .00 .37 .41 .46 .62 .23 .14 1.00 .22 .12 1.00 .04 .02 1.00 .18 .10 1.00 .22 .12 .91 .22 .13 .01 .01 .02
de-es .51 .52 .52 .51 .45 .40 .79 .48 .34 NaN NaN .00 .46 .45 .45 .61 .16 .09 1.00 .06 .03 1.00 .03 .02 1.00 .11 .06 1.00 .06 .03 .92 .17 .09 .01 .01 .02
de-fr .52 .53 .55 .49 .44 .40 .76 .44 .31 NaN NaN .00 .37 .39 .42 .40 .06 .03 .80 .03 .02 1.00 .01 .00 .80 .03 .02 .80 .03 .02 .62 .06 .03 .01 .01 .02
de-nl .54 .50 .46 .48 .40 .35 .80 .44 .30 NaN NaN .00 .32 .33 .35 .83 .17 .10 .86 .05 .02 .67 .03 .02 .86 .05 .02 .86 .05 .02 .91 .21 .12 .01 .01 .01
de-pt .50 .50 .50 .52 .46 .41 .75 .47 .35 NaN NaN .00 .42 .43 .43 .90 .14 .07 1.00 .08 .04 1.00 .04 .02 1.00 .08 .04 1.00 .08 .04 1.00 .09 .05 .00 .01 .01
de-ru .56 .51 .47 .50 .42 .36 .79 .46 .33 NaN NaN .00 .36 .39 .42 .01 .02 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .01
en-es .56 .59 .62 .49 .46 .44 .81 .43 .29 .48 .53 .59 .45 .48 .52 .63 .14 .08 1.00 .05 .02 1.00 .06 .03 1.00 .14 .08 1.00 .05 .02 .91 .21 .12 .01 .01 .02
en-fr .52 .55 .57 .45 .43 .41 .74 .38 .26 .45 .51 .58 .37 .42 .48 .42 .06 .03 .63 .04 .02 .61 .10 .06 .44 .03 .02 .45 .04 .02 .62 .17 .10 .01 .01 .02
en-nl .56 .57 .58 .44 .39 .34 .82 .39 .26 .51 .50 .49 .01 .02 .04 .75 .19 .11 .83 .08 .04 .76 .12 .07 .73 .08 .04 .80 .12 .07 .89 .23 .13 .00 .00 .00
en-pt .56 .58 .61 .49 .46 .43 .82 .52 .38 .48 .50 .52 .02 .03 .05 .88 .17 .09 1.00 .06 .03 .91 .08 .04 1.00 .06 .03 1.00 .09 .04 .87 .10 .05 .01 .02 .03
en-ru .59 .59 .59 .44 .39 .35 .79 .46 .33 .60 .52 .46 .01 .01 .02 .02 .03 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .01 .01 .02
es-fr .54 .57 .60 .47 .48 .49 .67 .32 .21 .50 .56 .64 .01 .02 .09 .62 .18 .11 1.00 .02 .01 .78 .05 .03 .91 .08 .04 1.00 .02 .01 .69 .07 .04 .01 .01 .02
es-nl .57 .59 .62 .48 .46 .43 .68 .39 .27 .50 .47 .45 .00 .01 .04 .46 .05 .02 NaN NaN .00 .17 .01 .00 NaN NaN .00 NaN NaN .00 .29 .02 .01 .00 .00 .01
es-pt .54 .57 .59 .50 .51 .52 .74 .51 .39 .50 .52 .54 .05 .08 .28 .70 .28 .17 .79 .25 .15 .70 .21 .13 .76 .26 .16 .73 .25 .15 .71 .29 .18 .00 .00 .00
es-ru .56 .57 .57 .55 .53 .50 .66 .39 .27 .51 .51 .50 .00 .01 .02 .01 .02 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .00
fr-nl .55 .55 .56 .48 .46 .44 .70 .30 .19 .46 .44 .41 .02 .03 .15 .62 .13 .07 .82 .13 .07 .88 .20 .11 .82 .13 .07 .82 .13 .07 .82 .13 .07 .00 .01 .01
fr-pt .55 .56 .57 .50 .47 .45 .74 .44 .32 .48 .48 .48 .01 .03 .11 .85 .08 .04 NaN NaN .00 .50 .03 .02 NaN NaN .00 NaN NaN .00 .69 .07 .04 .01 .01 .02
fr-ru .57 .57 .57 .42 .40 .39 .67 .38 .26 .54 .46 .41 .01 .01 .08 .01 .01 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .01 .01
nl-pt .56 .57 .58 .52 .46 .41 .76 .43 .30 NaN NaN .00 .01 .01 .06 .65 .10 .05 1.00 .02 .01 .33 .04 .02 1.00 .02 .01 1.00 .02 .01 .50 .03 .02 .01 .01 .02
nl-ru .57 .55 .54 .46 .41 .37 .79 .48 .34 .52 .47 .42 .00 .01 .03 .01 .01 .02 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .01
pt-ru .54 .52 .50 .50 .47 .45 .75 .51 .38 NaN NaN .00 .00 .01 .02 .02 .03 .03 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 NaN NaN .00 .00 .00 .01

NaN: division by zero, likely due to an empty alignment.
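For reference, a minimal sketch of a harmonic-mean (H-mean) aggregation and of how a NaN entry arises (an assumption for illustration only: the campaign tooling may aggregate the underlying counts rather than the per-task scores):

    import math

    def harmonic_mean(values):
        values = list(values)
        if any(v == 0 for v in values):
            return 0.0                   # a single zero score drags the H-mean to 0
        return len(values) / sum(1.0 / v for v in values)

    def precision(correct, found):
        # An empty alignment has found == 0, so the division is undefined;
        # this is reported as NaN in the tables above.
        return correct / found if found else math.nan

    print(harmonic_mean([0.50, 0.60, 0.40]))   # ~0.49
    print(precision(0, 0))                     # nan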

Generated alignments and additional table of results

You can download the complete set of generated alignments. These alignments have been generated by executing the tools with the help of the SEALS infrastructure. All results presented above are based on these alignments. You can also download additional tables of results (including precision and recall for each matching task) for both types of matching tasks, (i) and (ii).

Edas and ekaw based evaluation

The table below presents the aggregated results for the matching tasks involving the edas and ekaw ontologies. LYAM++ participated only in the open test. The overall results here are close to what has been observed for the open evaluation. For both types of tasks, LogMap outperforms all systems in terms of precision and AML in terms of F-measure. Both of them required more time to finish the tasks because new translations (for Italian) had to be computed on the fly. For CLONA, its runtime may be explained by the fact that it did not experience the (concurrent access) problems with the SEALS repositories. Note again that the runtimes this year may not reflect the real time required for finishing the tasks.

Different ontologies (i) Same ontologies (ii)
System Time #pairs Size Prec. F-m. Rec. Size Prec. F-m. Rec.
AML 128 55 13.33 .52(.52) .47(.47) .42(.42) 68.62 .93(.93) .64(.64) .49(.49)
CLONA* 931 55 9.62 .40(.40) .29(.29) .23(.23) 61.98 .88(.88) .57(.57) .42(.42)
LogMap* 253 55 7.43 .71(.71) .38(.38) .27(.27) 52.69 .97(.97) .44(.44) .30(.30)
LYAM++** - - - - - - - - - -
XMap 11877 52 182.55 .14(.15) .13(.13) .17(.18) 285.53 .40(.44) .22(.24) .19(.21)
CroMatcher 297 55 13.53 .32(.32) .09(.09) .06(.06) 75.08 .81(.81) .54(.54) .44(.44)
DKP-AOM 20 24 2.58 .43(.98) .04(.09) .02(.05) 4.37 .49(1.0) .02(.03) .01(.01)
GMap 2968 27 1.81 .45(.92) .05(.11) .03(.06) 4.4 .49(.99) .02(.05) .01(.02)
LogMap-C 73 26 1.24 .38(.81) .05(.10) .03(.05) 93.69 .02(.04) .01(.03) .01(.02)
LogMapLite 17 25 1.16 .36(.78) .04(.09) .02(.05) 94.5 .02(.04) .01(.03) .01(.02)
Mamba* 383 28 1.81 .48(.93) .08(.15) .04(.09) 3.74 .59(.99) .03(.05) .01(.02)
RSDLWB 19 55 32.12 .01(.01) .01(.01) .01(.01) 43.31 .19(.10) .10(.10) .06(.06)
MultiFarm aggregated results per matcher for the edas and ekaw based evaluation, for each type of matching task -- different ontologies (i) and same ontologies (ii). Time is measured in minutes (for completing the 55x24 matching tasks). Tools marked with an * have been executed in a different setting.

Language specific results (type i)

Looking at the overall results of non-specific systems, for cases of type (i), DKP-AOM still generates good values of precision but has been outperformed by GMap and Mamba. For cases of type (ii), CroMatcher corroborates the good results obtained with its structural strategy, while LogMap-C and LogMapLite decrease their precision, considerably increasing the number of generated correspondences (in particular for the edas-edas task).

With respect to the pairs of languages for test cases of type (i), although the overall results remain relatively stable, new pairs of languages appear in the top-3 F-measure. For non-specific systems, this is the case for the pairs es-it and it-pt: CroMatcher (es-it .25, it-pt .25, en-it .24, and en-nl .21), DKP-AOM (es-pt .20, de-en .20, it-pt .17, es-it .16), GMap (it-pt .31, en-it .25, en-fr .19), LogMap-C (de-en .23, es-pt .21, it-pt .20, es-it .19), LogMapLite (de-en .20, es-pt .20, it-pt .17, es-it .16), and Mamba (de-en .27, en-it .26, en-nl .25, it-pt .24). For the group of systems implementing cross-lingual strategies, this has been observed for 2 (AML and XMAP) out of 4 systems. For those systems, some pairs involving Czech (cn-cz, cz-de, cz-en or cz-ru) are again present in the top-5 F-measure of 3 out of 4 systems: AML (es-it .58, en-pt .58, en-nl .57, cz-en .57, nl-pt .57, es-nl .56, cz-nl .55, en-es .55, cz-es .54), CLONA (cn-cz .38, cz-pt .38, de-pt .38, de-en .37, fr-pt .37, pt-ru .36, es-pt .36, es-ru .35, fr-ru .35, cz-de .35), LogMap (en-nl .53, en-pt .51, cz-en .49, en-ru .48, cz-nl .46, cz-ru .46). The exception is XMAP (nl-pt .53, nl-ru .43, it-pt .41, pt-ru .37, fr-ru .37). Finally, with respect to type (ii), the pair it-pt appears in the top-3 F-measure of AML and CLONA.

MultiFarm results per pair of languages (H-mean) for the test cases of type (i). For each system, the three values in a cell are precision, F-measure and recall.
Cross-lingual systems Non-specific systems
AML CLONA LogMap XMap CroMatcher DKP-AOM GMap LogMap-C LogMap-Lite Mamba RSDLWB
ar-cn .39.22.16.36.15.09.61.19.11.17.17.17.02.02.03 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.00.01 .01
ar-cz .51.43.37.44.25.18.71.40.28.30.30.30.02.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
ar-de .46.39.34.41.27.20.75.36.24.30.29.29.02.03.03 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
ar-en .58.54.51.40.28.22.73.41.28.31.33.35.02.03.03 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
ar-es .42.38.36.39.26.20.68.37.25.32.30.29.02.02.03 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
ar-fr .51.43.37.37.26.20.64.31.20.29.28.27.02.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
ar-it .51.46.42.42.30.23.70.32.21.58.12.07.02.03.03 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.00.01 .01
ar-nl .48.41.36.38.24.18.73.42.30.30.30.30.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.00.01 .01
ar-pt .51.46.42.41.30.23.73.38.25.35.33.31.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
ar-ru .41.33.27.45.28.21.76.41.28.21.21.21.02.02.02 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
cn-cz .47.34.27.34.20.15.72.27.17.32.29.27.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
cn-de .47.32.24.30.16.11.71.23.13.26.26.25.03.04.05 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.02.02 .03
cn-en .52.44.38.29.18.13.77.22.13.27.27.27.02.03.03 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
cn-es .45.37.31.28.17.12.66.25.15.32.30.28.03.04.05 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
cn-fr .46.36.29.25.14.10.70.23.14.30.29.28.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.00.01 .01
cn-it .50.37.29.30.16.11.66.20.11.03.04.04.03.04.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
cn-nl .50.33.25.28.14.09.67.21.12.00.01.02.02.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.02 .02
cn-pt .50.39.33.30.17.11.74.25.15.00.01.02.03.04.05 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.00.01 .01
cn-ru .53.39.31.35.21.15.72.31.19.00.01.96.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
cz-de .52.47.42.44.35.29.71.39.27.02.03.14.54.12.071.00.13.071.00.06.03.93.13.07.93.13.071.00.13.07.01.01 .02
cz-en .60.57.55.40.32.26.76.49.37.01.02.04.49.10.051.00.07.04.79.08.04.87.07.04.65.07.04.85.21.12.01.02 .02
cz-es .55.54.52.41.33.28.70.41.29.02.03.11.53.04.021.00.05.021.00.02.01.82.05.02.82.05.021.00.06.03.01.01 .02
cz-fr .57.51.47.43.33.27.69.44.33.01.01.05.40.03.02 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .001.00.10.05.01.01 .02
cz-it .48.43.40.43.30.23.62.37.26.01.02.06.47.09.051.00.05.03.90.10.05.83.05.03.83.05.031.00.09.05.02.02 .03
cz-nl .61.55.51.44.31.23.73.46.34.01.02.08.47.10.051.00.08.04.86.12.07.90.09.05.80.08.04.97.17.10.01.01 .02
cz-pt .51.49.47.48.38.32.71.43.31.02.03.12.60.09.051.00.11.06.96.12.07.91.11.06.88.11.061.00.14.07.01.01 .02
cz-ru .57.51.46.41.33.28.75.46.33.01.01.05.03.04.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
de-en .52.49.47.45.37.32.76.44.31.02.04.09.62.19.11.95.20.11.95.10.05.89.23.13.89.20.11.91.27.16.00.01 .01
de-es .46.42.38.43.32.26.75.39.27.01.01.06.62.09.051.00.01.011.00.02.01.50.02.01.50.01.011.00.07.04.01.01 .01
de-fr .51.47.43.45.35.29.77.43.30.01.02.08.50.06.03.89.04.021.00.02.01.80.04.02.75.05.02.95.10.05.01.01 .02
de-it .54.48.44.43.31.24.69.37.25.01.02.05.59.17.101.00.05.03.94.09.05.89.08.04.83.05.03.94.16.09.01.01 .01
de-nl .55.48.42.47.33.25.78.45.32.01.02.10.71.21.121.00.10.05.91.05.03.91.11.06.90.10.05.92.23.13.01.01 .01
de-pt .52.45.40.49.38.31.71.38.26.01.02.09.68.12.061.00.07.041.00.05.02.87.07.04.87.07.041.00.07.04.01.01 .01
de-ru .54.44.37.44.30.23.78.44.30.01.01.04.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
en-es .56.55.54.42.34.29.75.45.32.01.01.03.81.17.101.00.03.021.00.10.05.90.10.05.75.03.02.94.16.08.01.01 .02
en-fr .56.53.50.39.31.26.69.43.32.02.03.05.67.14.08.84.08.04.71.19.11.79.15.08.79.10.05.76.24.14.01.01 .02
en-it .55.52.49.40.30.24.66.42.30.00 NaN .00.66.24.14.95.09.05.91.25.14.83.15.08.86.09.05.86.26.15.00.00 .00
en-nl .60.57.55.39.29.23.77.53.40.02.02.05.65.21.121.00.12.07.91.15.08.88.14.08.86.13.07.87.25.15.01.01 .01
en-pt .60.58.57.48.38.31.76.51.39.02.02.05.60.14.081.00.09.05.97.15.08.92.12.06.86.09.05.97.16.09.01.01 .02
en-ru .56.51.46.37.27.21.87.48.34.00 NaN .00.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
es-fr .53.50.47.37.32.29.70.40.28.01.02.07.65.10.05 NaN NaN .00.88.04.02.25.01.00.00 NaN .00.95.09.05.01.01 .01
es-it .58.58.57.38.33.29.64.42.31.02.03.06.80.25.141.00.16.091.00.12.06.95.19.10.94.16.09.97.19.10.01.01 .01
es-nl .58.56.54.38.33.30.69.40.28.01.01.09.79.11.06 NaN NaN .001.00.03.01.75.03.02.00 NaN .00.78.04.02.01.01 .02
es-pt .55.53.51.40.36.33.70.44.33.04.06.23.66.21.13.95.20.11.74.15.08.92.21.12.82.20.11.82.19.11.01.01 .02
es-ru .55.50.46.42.36.31.75.41.28.00.00.02.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
fr-it .55.53.51.35.30.27.64.41.30.01.01.03.63.15.09 NaN NaN .00.81.09.05.00 NaN .00.00 NaN .00.91.20.11.01.01 .01
fr-nl .55.52.48.39.34.30.71.43.31.02.04.16.52.14.081.00.09.051.00.15.08.92.12.06.90.09.05.96.11.06.01.01 .02
fr-pt .53.49.46.46.37.32.65.40.29.01.02.07.59.10.051.00.01.011.00.09.05.40.01.01.50.01.011.00.12.07.00.01 .01
fr-ru .55.50.45.43.35.29.73.36.24.35.36.36.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02
it-nl .52.48.45.39.30.24.67.41.29.47.17.10.59.16.091.00.06.03.78.13.07.75.09.05.85.06.03.85.12.06.01.01 .02
it-pt .53.52.52.34.29.26.66.41.29.57.41.31.67.25.15.97.17.09.99.31.19.93.20.11.92.17.09.98.24.13.01.02 .02
it-ru .49.45.43.35.24.18.75.39.27.00 NaN .00.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.00.00 .01
nl-pt .59.57.55.43.34.29.69.44.33.53.53.54.48.10.061.00.06.03.96.11.06.86.06.03.86.06.031.00.11.06.00.00 .00
nl-ru .59.51.46.37.27.21.78.46.33.42.43.44.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .01
pt-ru .49.45.40.50.37.29.75.46.33.37.37.36.03.03.04 NaN NaN .00 NaN NaN .00.00 NaN .00.00 NaN .00 NaN NaN .00.01.01 .02

NaN: division by zero, likely due to an empty alignment.

References

[1] Christian Meilicke, Raul Garcia-Castro, Fred Freitas, Willem Robert van Hage, Elena Montiel-Ponsoda, Ryan Ribeiro de Azevedo, Heiner Stuckenschmidt, Ondrej Svab-Zamazal, Vojtech Svatek, Andrei Tamilin, Cassia Trojahn, Shenghui Wang. MultiFarm: A Benchmark for Multilingual Ontology Matching. Accepted for publication in the Journal of Web Semantics.

An author's version of the paper can be found at the MultiFarm homepage, where the data set is described in detail.

Contact

This track is organised by Cassia Trojahn dos Santos. If you have any problems working with the ontologies, or any questions or suggestions, feel free to write an email to cassia [.] trojahn [at] irit [.] fr.