Ontology Alignment Evaluation Initiative - OAEI 2024 Campaign

Evaluation results for the Biodiv track at OAEI 2024

Participants

In our preliminary evaluation, only four systems (LogMap, LogMapLt, LogMapKG, and Matcha) managed to generate an output for at least one of the track's tasks.

Experimental setting

We conducted the experiments using the MELT client. Each system was executed with its default settings, and we computed precision, recall, and F-measure against the reference alignments. Execution times cover the whole processing pipeline, starting from ontology upload and environment preparation.
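For completeness, these are the standard definitions, where A denotes the set of mappings produced by a system and R the reference alignment:

```latex
\mathrm{Precision} = \frac{|A \cap R|}{|A|}, \qquad
\mathrm{Recall} = \frac{|A \cap R|}{|R|}, \qquad
\mathrm{F\text{-}measure} = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}
```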

We ran the evaluation on a Windows 10 (64-bit) desktop with an Intel Core i7-4770 CPU @ 3.40 GHz (4 cores), allocating 16 GB of RAM.
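As an illustration of this setup, the sketch below drives a MELT evaluation programmatically. It is a minimal sketch, not the harness used for the campaign: LabelEqualityMatcher is a toy baseline standing in for a participating system, TrackRepository.Biodiv.Default is assumed to be the Biodiv track handle, and the package names follow recent MELT releases.

```java
import java.io.File;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

import org.apache.jena.ontology.OntModel;
import org.apache.jena.rdf.model.RDFNode;
import org.apache.jena.vocabulary.RDFS;

import de.uni_mannheim.informatik.dws.melt.matching_data.TrackRepository;
import de.uni_mannheim.informatik.dws.melt.matching_eval.ExecutionResultSet;
import de.uni_mannheim.informatik.dws.melt.matching_eval.Executor;
import de.uni_mannheim.informatik.dws.melt.matching_eval.evaluator.EvaluatorCSV;
import de.uni_mannheim.informatik.dws.melt.matching_jena.MatcherYAAAJena;
import de.uni_mannheim.informatik.dws.melt.yet_another_alignment_api.Alignment;

public class BiodivEvaluation {

    /** Toy baseline: align entities whose rdfs:label strings match exactly. */
    static class LabelEqualityMatcher extends MatcherYAAAJena {
        @Override
        public Alignment match(OntModel source, OntModel target,
                               Alignment inputAlignment, Properties properties) {
            // Index source entities by lower-cased label.
            Map<String, String> sourceLabels = new HashMap<>();
            source.listStatements(null, RDFS.label, (RDFNode) null).forEachRemaining(s -> {
                if (s.getSubject().isURIResource() && s.getObject().isLiteral()) {
                    sourceLabels.put(s.getLiteral().getString().toLowerCase(),
                                     s.getSubject().getURI());
                }
            });
            // Emit a correspondence for every target entity with a matching label.
            Alignment alignment = new Alignment();
            target.listStatements(null, RDFS.label, (RDFNode) null).forEachRemaining(s -> {
                if (s.getSubject().isURIResource() && s.getObject().isLiteral()) {
                    String sourceUri = sourceLabels.get(s.getLiteral().getString().toLowerCase());
                    if (sourceUri != null) {
                        alignment.add(sourceUri, s.getSubject().getURI());
                    }
                }
            });
            return alignment;
        }
    }

    public static void main(String[] args) {
        // Run the matcher on every test case of the track; MELT measures the
        // runtime of the whole pipeline per test case.
        ExecutionResultSet results = Executor.run(
                TrackRepository.Biodiv.Default,   // assumption: Biodiv track handle
                new LabelEqualityMatcher());

        // Write per-test-case results as CSV files.
        new EvaluatorCSV(results).writeToDirectory(new File("./results"));
    }
}
```

EvaluatorCSV writes per-test-case precision, recall, F-measure, and runtime figures, i.e. the quantities reported in the tables below.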

Results

1. Results for the ENVO-SWEET matching task

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
LogMap      00:00:21          683          0.776       0.659    0.713
LogMapKG    00:00:24          683          0.775       0.658    0.711
LogMapLt    00:04:47          595          0.803       0.595    0.683
Table 1: Results for ENVO-SWEET.
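As a quick consistency check, the F-measure is the harmonic mean of precision and recall; taking LogMap's row from Table 1:

```latex
F = \frac{2 \times 0.776 \times 0.659}{0.776 + 0.659} \approx 0.713
```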

2. Results for the MACROALGAE-MACROZOOBENTHOS matching task

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
LogMapLt    00:00:00          9            0.667       0.333    0.444
LogMap      00:00:02          29           0.276       0.444    0.340
LogMapKG    00:00:03          29           0.276       0.444    0.340
Matcha      00:00:05          45           0.200       0.500    0.286
Table 2: Results for MACROALGAE-MACROZOOBENTHOS.

3. Results for the FISH-ZOOPLANKTON matching task

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
LogMapLt    00:00:00          10           0.800       0.533    0.640
Matcha      00:00:08          47           0.277       0.867    0.419
LogMapKG    00:00:03          55           0.218       0.800    0.343
LogMap      00:00:02          32           0.094       0.200    0.128
Table 3: Results for FISH-ZOOPLANKTON.

4. Results for the NCBITAXON-TAXREFLD matching task

This task is evaluated separately per kingdom; Tables 4 to 9 report the results for Animalia, Bacteria, Chromista, Fungi, Plantae, and Protozoa, respectively.

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
Matcha      00:01:18          71008        0.675       0.994    0.804
LogMapLt    13:56:54          72010        0.665       0.993    0.797
LogMap      00:06:31          72899        0.661       0.999    0.795
LogMapKG    00:06:13          72898        0.661       0.999    0.795
Table 4: Results for NCBITAXON-TAXREFLD Animalia.

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
LogMapLt    00:00:00          290          0.600       0.994    0.748
Matcha      00:00:04          303          0.578       1.000    0.732
LogMap      00:00:00          304          0.576       1.000    0.731
LogMapKG    00:00:00          304          0.576       1.000    0.731
Table 5: Results for NCBITAXON-TAXREFLD Bacteria.

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
LogMapLt    00:00:00          2165         0.637       0.982    0.773
LogMap      00:00:02          2218         0.624       0.985    0.764
LogMapKG    00:00:02          2218         0.624       0.985    0.764
Matcha      00:00:14          2219         0.623       0.984    0.763
Table 6: Results for NCBITAXON-TAXREFLD Chromista.

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
Matcha      00:00:36          12936        0.785       0.998    0.879
LogMap      00:00:25          12949        0.784       0.998    0.878
LogMapKG    00:00:24          12949        0.784       0.998    0.878
LogMapLt    00:00:03          12929        0.784       0.997    0.878
Table 7: Results for NCBITAXON-TAXREFLD Fungi.

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
LogMapLt    00:00:06          26359        0.746       0.987    0.850
Matcha      00:01:01          26675        0.741       0.993    0.849
LogMap      00:01:00          26912        0.731       0.988    0.841
LogMapKG    00:00:55          26910        0.732       0.988    0.841
Table 8: Results for NCBITAXON-TAXREFLD Plantae.

System      Time (HH:MM:SS)   # Mappings   Precision   Recall   F-measure
LogMapLt    00:00:00          477          0.746       0.997    0.854
Matcha      00:00:11          494          0.723       1.000    0.839
LogMap      00:00:00          496          0.720       1.000    0.837
LogMapKG    00:00:01          496          0.720       1.000    0.837
Table 9: Results for NCBITAXON-TAXREFLD Protozoa.

Contact

This evaluation has been run by Naouel Karam and Alsayed Algergawy. If you have any problems working with the ontologies, any questions related to tool wrapping, or any suggestions related to the Biodiv track, feel free to write an email to: karam [at] infai [.] org