Ontology Alignment Evaluation Initiative - OAEI-2023 Campaign

Evaluation results for the Biodiv track at OAEI 2023

Participants

In our preliminary evaluation, only five systems (LogMap, LogMapLt, LogMapKG, Matcha, and OLaLa) managed to generate an output for at least one of the track's tasks.

Experimental setting

We conducted the experiments using the MELT client. Each system was executed with its standard settings, and we calculated precision, recall, and F-measure. Execution times cover the whole processing pipeline, starting from ontology upload and environment preparation.

We ran the evaluation on a Windows 10 (64-bit) desktop with an Intel Core i7-4770 CPU @ 3.40GHz x 4, allocating 16GB of RAM.
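For readers unfamiliar with how these scores are derived, the sketch below shows the standard set-based computation of precision, recall, and F-measure over a system alignment versus a reference alignment, each represented as a set of (source, target) correspondence pairs. This is an illustrative reimplementation, not the actual MELT code; the entity names in the toy example are made up.

```python
def evaluate(system, reference):
    """Return (precision, recall, f_measure) for two sets of mappings."""
    tp = len(system & reference)  # true positives: correct mappings found
    precision = tp / len(system) if system else 0.0
    recall = tp / len(reference) if reference else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall > 0 else 0.0)
    return precision, recall, f_measure

# Toy example: 3 reference mappings, 4 system mappings, 2 of them correct.
reference = {("envo:A", "sweet:A"), ("envo:B", "sweet:B"), ("envo:C", "sweet:C")}
system = {("envo:A", "sweet:A"), ("envo:B", "sweet:B"),
          ("envo:X", "sweet:Y"), ("envo:Z", "sweet:W")}

p, r, f = evaluate(system, reference)
# p = 2/4 = 0.5, r = 2/3 ≈ 0.667, f = 4/7 ≈ 0.571
```

The tables below report exactly these three quantities per system, alongside the number of mappings produced and the wall-clock runtime.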

Results

1. Results for the ENVO-SWEET matching task

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
LogMapKG  00:00:28         677         0.781      0.657   0.714
LogMap    00:00:36         681         0.780      0.655   0.713
LogMapLt  00:05:40         595         0.829      0.594   0.693
OLaLa     06:46:18         1081        0.484      0.650   0.555
Table 1: Results for ENVO-SWEET.

2. Results for the MACROALGAE-MACROZOOBENTHOS matching task

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
OLaLa     00:08:30         10          0.700      0.388   0.500
LogMapLt  00:00:00         9           0.857      0.333   0.480
LogMap    00:00:03         29          0.275      0.444   0.340
LogMapKG  00:00:04         29          0.275      0.444   0.340
Matcha    00:00:07         45          0.200      0.500   0.285
Table 2: Results for MACROALGAE-MACROZOOBENTHOS.

3. Results for the FISH-ZOOPLANKTON matching task

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
OLaLa     00:07:59         13          1.000      0.866   0.928
LogMapLt  00:00:00         8           1.000      0.533   0.695
Matcha    00:00:11         47          0.276      0.866   0.419
LogMapKG  00:00:04         55          0.218      0.800   0.342
LogMap    00:00:03         32          0.093      0.200   0.127
Table 3: Results for FISH-ZOOPLANKTON.

4. Results for the NCBITAXON-TAXREFLD matching task

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
OLaLa     68:27:32         70821       0.679      0.998   0.808
Matcha    00:04:18         71008       0.674      0.993   0.803
LogMapLt  00:00:43         72010       0.665      0.993   0.796
LogMap    00:00:43         72899       0.660      0.998   0.795
LogMapKG  00:11:32         72898       0.660      0.998   0.795
Table 4: Results for NCBITAXON-TAXREFLD Animalia.

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
LogMapLt  00:00:00         290         0.600      0.994   0.748
OLaLa     00:19:32         294         0.593      0.994   0.743
Matcha    00:00:14         300         0.580      0.994   0.732
LogMap    00:00:01         304         0.575      1.000   0.730
LogMapKG  00:00:01         304         0.575      1.000   0.730
Table 5: Results for NCBITAXON-TAXREFLD Bacteria.

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
LogMapLt  00:00:01         2165        0.637      0.982   0.773
OLaLa     01:59:05         2173        0.634      0.981   0.771
LogMap    00:00:04         2218        0.623      0.985   0.764
LogMapKG  00:00:04         2218        0.623      0.985   0.764
Matcha    00:00:48         2213        0.624      0.984   0.764
Table 6: Results for NCBITAXON-TAXREFLD Chromista.

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
OLaLa     11:54:37         12549       0.807      0.996   0.891
Matcha    00:01:43         12925       0.785      0.998   0.879
LogMap    00:00:39         12949       0.783      0.998   0.878
LogMapKG  00:00:40         12949       0.783      0.998   0.878
LogMapLt  00:00:07         12929       0.783      0.997   0.877
Table 7: Results for NCBITAXON-TAXREFLD Fungi.

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
OLaLa     22:04:48         25667       0.769      0.991   0.866
LogMapLt  00:00:17         26359       0.746      0.987   0.849
Matcha    00:03:16         26597       0.741      0.989   0.847
LogMap    00:01:44         26912       0.731      0.988   0.840
LogMapKG  00:01:36         26910       0.731      0.988   0.840
Table 8: Results for NCBITAXON-TAXREFLD Plantae.

System    Time (HH:MM:SS)  # Mappings  Precision  Recall  F-measure
OLaLa     00:32:17         476         0.750      1.000   0.857
LogMapLt  00:00:00         477         0.746      0.997   0.853
Matcha    00:00:44         493         0.724      1.000   0.840
LogMap    00:00:01         496         0.719      1.000   0.837
LogMapKG  00:00:01         496         0.719      1.000   0.837
Table 9: Results for NCBITAXON-TAXREFLD Protozoa.

Contact

This evaluation was run by Naouel Karam and Alsayed Algergawy. If you have any problems working with the ontologies, any questions related to tool wrapping, or any suggestions related to the Biodiv track, feel free to write an email to: naouel [.] karam [at] fokus [.] fraunhofer [.] de