Results for OAEI 2021 - Knowledge Graph Track
Matching systems
As a pre-test, we executed all systems submitted to OAEI (even those not registered for the track) on a very small matching task
with a structure and shape similar to the real knowledge graphs (in fact, it is a small subset of them).
This showed that not all matching systems are able to complete even this small task, due to exceptions or other failures.
The following matching systems produced an exception:
- ALIN (NullPointerException, similar to last year)
- GMap (NoSuchMethodError, probably due to a wrong OWLAPI version)
- Lily (invalid version number, similar to last year)
For AMD, we adjusted the requirements.txt file to also include lxml, which is necessary to execute the matcher.
Thus, we executed the following systems:
- ALOD2Vec
- AMD
- AML
- ATMatcher
- BaselineAltLabel
- BaselineLabel
- KGMatcher
- LogMap
- LSMatch
- OTMapOnto
- Fine-TOM
- TOM
- Wiktionary
The source code for the baseline matchers is available.
The BaselineLabel matcher matches all resources that share the same rdfs:label (if multiple resources share the same label, all of them are matched).
BaselineAltLabel additionally uses skos:altLabel; again, when multiple resources share a common label, all of them are matched in a cross-product manner.
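The core idea can be sketched in a few lines of Jena-based Java (a minimal illustration of the cross-product label matching, not the actual baseline source linked above):

```java
import org.apache.jena.rdf.model.*;
import org.apache.jena.vocabulary.RDFS;

import java.util.*;

/** Minimal sketch of the BaselineLabel idea: match all resources that share an rdfs:label. */
public class LabelBaseline {

    /** Index all named resources of a model by the lower-cased lexical form of their rdfs:label. */
    static Map<String, List<Resource>> labelIndex(Model model) {
        Map<String, List<Resource>> index = new HashMap<>();
        StmtIterator it = model.listStatements(null, RDFS.label, (RDFNode) null);
        while (it.hasNext()) {
            Statement s = it.next();
            if (s.getObject().isLiteral() && s.getSubject().isURIResource()) {
                String label = s.getObject().asLiteral().getLexicalForm().toLowerCase();
                index.computeIfAbsent(label, k -> new ArrayList<>()).add(s.getSubject());
            }
        }
        return index;
    }

    /** Emit the cross product of all source/target resources that share a label. */
    static List<String[]> match(Model source, Model target) {
        Map<String, List<Resource>> sourceIndex = labelIndex(source);
        Map<String, List<Resource>> targetIndex = labelIndex(target);
        List<String[]> alignment = new ArrayList<>();
        for (Map.Entry<String, List<Resource>> entry : sourceIndex.entrySet()) {
            List<Resource> targets = targetIndex.get(entry.getKey());
            if (targets == null) continue;
            for (Resource s : entry.getValue())
                for (Resource t : targets)   // cross product when several resources share a label
                    alignment.add(new String[]{s.getURI(), t.getURI()});
        }
        return alignment;
    }
}
```

BaselineAltLabel follows the same scheme, additionally feeding skos:altLabel values into the index.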
Experimental setting
The evaluation was executed on a virtual machine (VM) with 32GB of RAM and 16 vCPUs (2.4 GHz).
The operating system is Debian 9 with OpenJDK version "1.8.0_265".
We used the MELT toolkit for the evaluation,
which internally uses the SEALS client (version 7.0.5) to execute matchers packaged with SEALS.
Matching systems which use the Web packaging are executed with the MatcherHTTPCall class.
The reported times include the environment preparation of SEALS as well as the file upload to the Docker container (the start of the container is not timed).
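Condensed, such an evaluation run looks roughly as follows in MELT (a sketch only: the class names exist in MELT, but package paths and the track version identifier depend on the MELT release and the year):

```java
// Sketch of the MELT-based evaluation workflow; package paths may differ between MELT versions.
import de.uni_mannheim.informatik.dws.melt.matching_base.external.http.MatcherHTTPCall;
import de.uni_mannheim.informatik.dws.melt.matching_data.TrackRepository;
import de.uni_mannheim.informatik.dws.melt.matching_eval.ExecutionResultSet;
import de.uni_mannheim.informatik.dws.melt.matching_eval.Executor;
import de.uni_mannheim.informatik.dws.melt.matching_eval.evaluator.EvaluatorCSV;

import java.io.File;
import java.net.URI;

public class RunEvaluation {
    public static void main(String[] args) {
        // A matcher using the Web packaging is reachable via an HTTP endpoint
        // (the URL below is a placeholder for the started Docker container).
        MatcherHTTPCall matcher = new MatcherHTTPCall(URI.create("http://localhost:8080/match"));

        // Run the matcher on all test cases of the knowledge graph track
        // (the track version identifier is an example and depends on the year).
        ExecutionResultSet results = Executor.run(TrackRepository.Knowledgegraph.V4, matcher);

        // Write precision/recall/F-measure tables and per-correspondence CSV files.
        new EvaluatorCSV(results).writeToDirectory(new File("./results"));
    }
}
```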
The alignments were evaluated based on precision, recall, and F-measure for classes, properties, and instances (each in isolation).
Our partial gold standard consists of 1:1 mappings extracted from links contained in wiki pages (cross-wiki links).
The schema was matched by ontology experts.
We assume that in each knowledge graph, only one representation of each concept exists.
This means that if our gold standard contains the mapping <A, B> and a matcher produces the mapping <A, C>, we can count <A, C> as a false positive
(the assumption here is that no concept similar to B exists in the second knowledge graph besides B itself).
The number of false negatives is only increased if a 1:1 mapping is contained in the gold standard and not found by a matcher.
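This counting scheme can be summarized as follows (a schematic sketch of the logic described above, not the actual MELT evaluation code; correspondences are simplified to URI pairs):

```java
import java.util.Map;

/**
 * Schematic sketch of precision/recall/F-measure computation against a partial
 * 1:1 gold standard (not the actual MELT evaluation code). Both alignments are
 * simplified to maps from source URI to target URI.
 */
public class PartialGoldStandardEvaluation {

    public static double[] evaluate(Map<String, String> gold, Map<String, String> system) {
        int tp = 0, fp = 0, fn = 0;
        for (Map.Entry<String, String> g : gold.entrySet()) {
            String found = system.get(g.getKey());
            if (found == null) {
                fn++;                              // gold mapping <A, B> not found at all
            } else if (found.equals(g.getValue())) {
                tp++;                              // matcher found exactly <A, B>
            } else {
                fp++;                              // matcher maps A to some C != B: wrong,
                fn++;                              // because only one representation exists,
            }                                      // and <A, B> itself is missed
        }
        // System correspondences whose source does not appear in the partial gold
        // standard are ignored; they can neither be confirmed nor rejected.
        double precision = (tp + fp) == 0 ? 0 : (double) tp / (tp + fp);
        double recall    = (tp + fn) == 0 ? 0 : (double) tp / (tp + fn);
        double fMeasure  = (precision + recall) == 0
                ? 0 : 2 * precision * recall / (precision + recall);
        return new double[]{precision, recall, fMeasure};
    }
}
```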
The source code for generating the evaluation results is also available.
We imposed a maximum execution time of 24 hours per task; however, this time limit was never exceeded.
Generated dashboard / CSV file
We also generated an online dashboard with the help of the MELT framework.
The knowledge graph results can be explored there (the page may take some seconds to load because it contains about 200,000 correspondences).
Moreover, we generated a CSV file which allows analyzing each matcher at the correspondence level.
This should help matcher developers to improve their systems' performance.
Alignment results
The generated alignment files are also available.
Results overview
| Matcher | Time | #testcases | Level | Size | Prec. | F-m. | Rec. |
|---|---|---|---|---|---|---|---|
| ALOD2Vec | 00:21:52 | 5 | class | 20.0 | 1.00 (1.00) | 0.80 (0.80) | 0.67 (0.67) |
| | | | property | 76.8 | 0.94 (0.94) | 0.95 (0.95) | 0.97 (0.97) |
| | | | instance | 4893.4 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
| | | | overall | 4990.2 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
| AMD | 00:37:47 | 2 | class | 23.0 | 0.40 (1.00) | 0.25 (0.62) | 0.18 (0.45) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | overall | 23.0 | 0.40 (1.00) | 0.00 (0.00) | 0.00 (0.00) |
| AML | 00:50:26 | 5 | class | 23.6 | 0.98 (0.98) | 0.89 (0.89) | 0.81 (0.81) |
| | | | property | 48.4 | 0.92 (0.92) | 0.70 (0.70) | 0.57 (0.57) |
| | | | instance | 6802.8 | 0.90 (0.90) | 0.85 (0.85) | 0.80 (0.80) |
| | | | overall | 6874.8 | 0.90 (0.90) | 0.85 (0.85) | 0.80 (0.80) |
| ATMatcher | 00:19:34 | 5 | class | 25.6 | 0.97 (0.97) | 0.87 (0.87) | 0.79 (0.79) |
| | | | property | 78.8 | 0.97 (0.97) | 0.96 (0.96) | 0.95 (0.95) |
| | | | instance | 4859.0 | 0.89 (0.89) | 0.85 (0.85) | 0.80 (0.80) |
| | | | overall | 4963.4 | 0.89 (0.89) | 0.85 (0.85) | 0.81 (0.81) |
| BaselineAltLabel | 00:11:37 | 5 | class | 16.4 | 1.00 (1.00) | 0.74 (0.74) | 0.59 (0.59) |
| | | | property | 47.8 | 0.99 (0.99) | 0.79 (0.79) | 0.66 (0.66) |
| | | | instance | 4674.8 | 0.89 (0.89) | 0.84 (0.84) | 0.80 (0.80) |
| | | | overall | 4739.0 | 0.89 (0.89) | 0.84 (0.84) | 0.80 (0.80) |
| BaselineLabel | 00:11:27 | 5 | class | 16.4 | 1.00 (1.00) | 0.74 (0.74) | 0.59 (0.59) |
| | | | property | 47.8 | 0.99 (0.99) | 0.79 (0.79) | 0.66 (0.66) |
| | | | instance | 3641.8 | 0.95 (0.95) | 0.81 (0.81) | 0.71 (0.71) |
| | | | overall | 3706.0 | 0.95 (0.95) | 0.81 (0.81) | 0.71 (0.71) |
| Fine-TOM | 14:55:09 | 5 | class | 19.2 | 1.00 (1.00) | 0.80 (0.80) | 0.66 (0.66) |
| | | | property | 29.0 | 0.40 (0.40) | 0.39 (0.39) | 0.38 (0.38) |
| | | | instance | 4116.0 | 0.92 (0.92) | 0.83 (0.83) | 0.76 (0.76) |
| | | | overall | 4164.2 | 0.92 (0.92) | 0.83 (0.83) | 0.75 (0.75) |
| KGMatcher | 04:55:32 | 5 | class | 23.2 | 1.00 (1.00) | 0.79 (0.79) | 0.66 (0.66) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 3789.6 | 0.94 (0.94) | 0.82 (0.82) | 0.74 (0.74) |
| | | | overall | 3812.8 | 0.94 (0.94) | 0.82 (0.82) | 0.72 (0.72) |
| LogMap | 01:04:45 | 5 | class | 19.4 | 0.93 (0.93) | 0.81 (0.81) | 0.71 (0.71) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 4012.4 | 0.90 (0.90) | 0.78 (0.78) | 0.69 (0.69) |
| | | | overall | 4031.8 | 0.90 (0.90) | 0.77 (0.77) | 0.68 (0.68) |
| LSMatch | 02:02:55 | 5 | class | 18.4 | 1.00 (1.00) | 0.78 (0.78) | 0.64 (0.64) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | overall | 18.4 | 1.00 (1.00) | 0.01 (0.01) | 0.00 (0.00) |
| OTMapOnto | 00:48:25 | 4 | class | 122.5 | 0.59 (0.73) | 0.61 (0.77) | 0.64 (0.80) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | overall | 122.5 | 0.59 (0.73) | 0.01 (0.01) | 0.00 (0.01) |
| TOM | 23:30:25 | 5 | class | 19.4 | 1.00 (1.00) | 0.83 (0.83) | 0.71 (0.71) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 311.4 | 0.91 (0.91) | 0.12 (0.12) | 0.06 (0.06) |
| | | | overall | 330.8 | 0.92 (0.92) | 0.12 (0.12) | 0.06 (0.06) |
| Wiktionary | 00:43:18 | 5 | class | 22.0 | 1.00 (1.00) | 0.80 (0.80) | 0.67 (0.67) |
| | | | property | 79.8 | 0.94 (0.94) | 0.95 (0.95) | 0.97 (0.97) |
| | | | instance | 4894.4 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
| | | | overall | 4996.2 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
Aggregated results per matcher, divided into class, property, instance, and overall alignments.
Time is displayed as HH:MM:SS. Column #testcases indicates the number of test cases for which the tool was able to generate (non-empty) alignments.
Column Size indicates the average number of system correspondences.
Two kinds of results are reported: (1) those not distinguishing empty and erroneous (or not generated) alignments,
and (2) those considering only non-empty alignments (values in parentheses).
AMD and OTMapOnto could not return results for all test cases. Furthermore, AMD, LSMatch, and OTMapOnto return only class correspondences.
KGMatcher and TOM do not return property alignments, probably because they do not handle properties typed as rdf:Property.
The longest runtime was observed for TOM, which needed 23 hours and nearly 31 minutes for all five test cases.
Test case specific results
Overall results
The following table shows the overall performance of the matchers (without dividing into class, property, and instance alignments).
Each cell shows: Size / Prec. / F-m. / Rec.

| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 3093 / 0.86 / 0.76 / 0.68 | 13455 / 0.92 / 0.91 / 0.90 | 3402 / 0.92 / 0.92 / 0.93 | 2185 / 0.92 / 0.83 / 0.75 | 2816 / 0.93 / 0.92 / 0.92 |
| AMD | 0 / 0.00 / 0.00 / 0.00 | 21 / 1.00 / 0.00 / 0.00 | 25 / 1.00 / 0.01 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| AML | 4687 / 0.85 / 0.68 / 0.56 | 18439 / 0.91 / 0.89 / 0.87 | 3795 / 0.93 / 0.93 / 0.92 | 3515 / 0.90 / 0.81 / 0.74 | 3938 / 0.93 / 0.91 / 0.90 |
| ATMatcher | 3516 / 0.67 / 0.59 / 0.53 | 13006 / 0.96 / 0.93 / 0.91 | 3281 / 0.96 / 0.94 / 0.92 | 2243 / 0.93 / 0.84 / 0.76 | 2771 / 0.95 / 0.93 / 0.91 |
| BaselineAltLabel | 2574 / 0.86 / 0.76 / 0.68 | 13514 / 0.88 / 0.89 / 0.89 | 3230 / 0.88 / 0.90 / 0.92 | 1712 / 0.92 / 0.74 / 0.63 | 2665 / 0.92 / 0.91 / 0.90 |
| BaselineLabel | 1879 / 0.90 / 0.69 / 0.56 | 10552 / 0.95 / 0.85 / 0.77 | 2582 / 0.98 / 0.90 / 0.83 | 1245 / 0.96 / 0.68 / 0.53 | 2272 / 0.95 / 0.89 / 0.84 |
| Fine-TOM | 2054 / 0.86 / 0.68 / 0.56 | 11315 / 0.93 / 0.84 / 0.78 | 2696 / 0.95 / 0.88 / 0.81 | 2018 / 0.93 / 0.81 / 0.72 | 2738 / 0.93 / 0.91 / 0.90 |
| KGMatcher | 1909 / 0.89 / 0.69 / 0.56 | 10764 / 0.94 / 0.85 / 0.77 | 2577 / 0.98 / 0.89 / 0.82 | 1493 / 0.94 / 0.75 / 0.62 | 2321 / 0.94 / 0.88 / 0.83 |
| LogMap | 2255 / 0.84 / 0.59 / 0.46 | 11648 / 0.89 / 0.82 / 0.76 | 2491 / 0.88 / 0.81 / 0.75 | 1577 / 0.94 / 0.79 / 0.68 | 2188 / 0.94 / 0.84 / 0.75 |
| LSMatch | 8 / 1.00 / 0.00 / 0.00 | 21 / 1.00 / 0.00 / 0.00 | 24 / 1.00 / 0.01 / 0.00 | 11 / 1.00 / 0.01 / 0.00 | 28 / 1.00 / 0.02 / 0.01 |
| OTMapOnto | 0 / 0.00 / 0.00 / 0.00 | 149 / 0.50 / 0.00 / 0.00 | 171 / 0.67 / 0.01 / 0.01 | 66 / 1.00 / 0.01 / 0.00 | 104 / 0.76 / 0.02 / 0.01 |
| TOM | 119 / 0.93 / 0.08 / 0.04 | 1001 / 0.96 / 0.15 / 0.08 | 248 / 0.87 / 0.16 / 0.09 | 36 / 0.90 / 0.03 / 0.02 | 250 / 0.94 / 0.18 / 0.10 |
| Wiktionary | 3095 / 0.86 / 0.76 / 0.68 | 13465 / 0.92 / 0.91 / 0.90 | 3408 / 0.92 / 0.92 / 0.93 | 2191 / 0.92 / 0.83 / 0.75 | 2822 / 0.93 / 0.92 / 0.92 |
Class results
All matchers were able to generate class correspondences.
Each cell shows: Size / Prec. / F-m. / Rec.

| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 7 / 1.00 / 1.00 / 1.00 | 21 / 1.00 / 0.44 / 0.29 | 29 / 1.00 / 0.76 / 0.62 | 14 / 1.00 / 0.75 / 0.60 | 29 / 1.00 / 0.93 / 0.87 |
| AMD | 0 / 0.00 / 0.00 / 0.00 | 21 / 1.00 / 0.44 / 0.29 | 25 / 1.00 / 0.76 / 0.62 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| AML | 8 / 1.00 / 1.00 / 1.00 | 36 / 0.91 / 0.80 / 0.71 | 32 / 1.00 / 0.82 / 0.69 | 12 / 1.00 / 0.89 / 0.80 | 30 / 1.00 / 0.93 / 0.87 |
| ATMatcher | 11 / 1.00 / 1.00 / 1.00 | 39 / 0.83 / 0.77 / 0.71 | 34 / 1.00 / 0.87 / 0.77 | 13 / 1.00 / 0.75 / 0.60 | 31 / 1.00 / 0.93 / 0.87 |
| BaselineAltLabel | 8 / 1.00 / 1.00 / 1.00 | 19 / 1.00 / 0.44 / 0.29 | 19 / 1.00 / 0.63 / 0.46 | 9 / 1.00 / 0.57 / 0.40 | 27 / 1.00 / 0.89 / 0.80 |
| BaselineLabel | 8 / 1.00 / 1.00 / 1.00 | 19 / 1.00 / 0.44 / 0.29 | 19 / 1.00 / 0.63 / 0.46 | 9 / 1.00 / 0.57 / 0.40 | 27 / 1.00 / 0.89 / 0.80 |
| Fine-TOM | 8 / 1.00 / 1.00 / 1.00 | 22 / 1.00 / 0.44 / 0.29 | 27 / 1.00 / 0.76 / 0.62 | 11 / 1.00 / 0.75 / 0.60 | 28 / 1.00 / 0.89 / 0.80 |
| KGMatcher | 8 / 1.00 / 1.00 / 1.00 | 27 / 1.00 / 0.44 / 0.29 | 29 / 1.00 / 0.70 / 0.54 | 22 / 1.00 / 0.75 / 0.60 | 30 / 1.00 / 0.93 / 0.87 |
| LogMap | 10 / 1.00 / 1.00 / 1.00 | 21 / 0.88 / 0.64 / 0.50 | 26 / 0.78 / 0.64 / 0.54 | 12 / 1.00 / 0.89 / 0.80 | 28 / 1.00 / 0.85 / 0.73 |
| LSMatch | 8 / 1.00 / 1.00 / 1.00 | 21 / 1.00 / 0.44 / 0.29 | 24 / 1.00 / 0.70 / 0.54 | 11 / 1.00 / 0.75 / 0.60 | 28 / 1.00 / 0.89 / 0.80 |
| OTMapOnto | 0 / 0.00 / 0.00 / 0.00 | 149 / 0.50 / 0.53 / 0.57 | 171 / 0.67 / 0.71 / 0.77 | 66 / 1.00 / 1.00 / 1.00 | 104 / 0.76 / 0.81 / 0.87 |
| TOM | 8 / 1.00 / 1.00 / 1.00 | 21 / 1.00 / 0.44 / 0.29 | 26 / 1.00 / 0.76 / 0.62 | 13 / 1.00 / 0.89 / 0.80 | 29 / 1.00 / 0.93 / 0.87 |
| Wiktionary | 8 / 1.00 / 1.00 / 1.00 | 26 / 1.00 / 0.44 / 0.29 | 33 / 1.00 / 0.76 / 0.62 | 14 / 1.00 / 0.75 / 0.60 | 29 / 1.00 / 0.93 / 0.87 |
Property results
While last year most matchers struggled with mappings typed as rdf:Property (instead of owl:ObjectProperty or owl:DatatypeProperty), this year many matchers were able to produce property mappings in that case as well.
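The pitfall can be illustrated with Apache Jena (a small illustration; the input file name is hypothetical): enumerating only the OWL property views silently skips properties that are merely typed as rdf:Property.

```java
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.vocabulary.RDF;

// Illustration: why properties typed only as rdf:Property are easy to miss.
public class PropertyEnumeration {
    public static void main(String[] args) {
        OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);
        model.read("file:kg-ontology.ttl"); // hypothetical input file

        // Enumerating only the OWL views misses plain rdf:Property declarations:
        model.listObjectProperties()
             .forEachRemaining(p -> System.out.println("owl:ObjectProperty   " + p));
        model.listDatatypeProperties()
             .forEachRemaining(p -> System.out.println("owl:DatatypeProperty " + p));

        // This query also catches properties typed only as rdf:Property:
        model.listSubjectsWithProperty(RDF.type, RDF.Property)
             .forEachRemaining(p -> System.out.println("rdf:Property         " + p));
    }
}
```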
Each cell shows: Size / Prec. / F-m. / Rec.

| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 21 / 1.00 / 1.00 / 1.00 | 111 / 0.84 / 0.88 / 0.92 | 88 / 0.87 / 0.91 / 0.95 | 53 / 1.00 / 1.00 / 1.00 | 111 / 0.98 / 0.98 / 0.98 |
| AMD | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| AML | 9 / 1.00 / 0.53 / 0.36 | 84 / 0.76 / 0.73 / 0.70 | 64 / 0.89 / 0.85 / 0.80 | 12 / 1.00 / 0.40 / 0.25 | 73 / 0.98 / 0.82 / 0.71 |
| ATMatcher | 24 / 0.91 / 0.91 / 0.91 | 103 / 0.98 / 0.95 / 0.92 | 85 / 0.95 / 0.95 / 0.95 | 61 / 1.00 / 1.00 / 1.00 | 121 / 1.00 / 0.99 / 0.98 |
| BaselineAltLabel | 7 / 1.00 / 0.53 / 0.36 | 41 / 1.00 / 0.51 / 0.34 | 46 / 0.97 / 0.80 / 0.68 | 42 / 1.00 / 1.00 / 1.00 | 103 / 1.00 / 0.94 / 0.89 |
| BaselineLabel | 7 / 1.00 / 0.53 / 0.36 | 41 / 1.00 / 0.51 / 0.34 | 46 / 0.97 / 0.80 / 0.68 | 42 / 1.00 / 1.00 / 1.00 | 103 / 1.00 / 0.94 / 0.89 |
| Fine-TOM | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 42 / 1.00 / 1.00 / 1.00 | 103 / 1.00 / 0.94 / 0.89 |
| KGMatcher | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMap | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LSMatch | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| OTMapOnto | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| TOM | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| Wiktionary | 22 / 1.00 / 1.00 / 1.00 | 112 / 0.84 / 0.88 / 0.92 | 89 / 0.87 / 0.91 / 0.95 | 59 / 1.00 / 1.00 / 1.00 | 117 / 0.98 / 0.98 / 0.98 |
Instance results
Only AMD, LSMatch, and OTMapOnto did not return any instance matches.
Each cell shows: Size / Prec. / F-m. / Rec.

| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 3065 / 0.86 / 0.76 / 0.68 | 13323 / 0.92 / 0.91 / 0.90 | 3285 / 0.92 / 0.93 / 0.93 | 2118 / 0.92 / 0.82 / 0.75 | 2676 / 0.92 / 0.92 / 0.91 |
| AMD | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| AML | 4670 / 0.85 / 0.68 / 0.56 | 18319 / 0.91 / 0.89 / 0.87 | 3699 / 0.93 / 0.93 / 0.93 | 3491 / 0.90 / 0.81 / 0.75 | 3835 / 0.93 / 0.92 / 0.90 |
| ATMatcher | 3481 / 0.66 / 0.58 / 0.52 | 12864 / 0.96 / 0.93 / 0.91 | 3162 / 0.96 / 0.94 / 0.92 | 2169 / 0.93 / 0.83 / 0.76 | 2619 / 0.94 / 0.92 / 0.91 |
| BaselineAltLabel | 2559 / 0.86 / 0.76 / 0.68 | 13454 / 0.88 / 0.89 / 0.89 | 3165 / 0.88 / 0.90 / 0.93 | 1661 / 0.92 / 0.74 / 0.62 | 2535 / 0.92 / 0.91 / 0.90 |
| BaselineLabel | 1864 / 0.90 / 0.69 / 0.56 | 10492 / 0.95 / 0.85 / 0.77 | 2517 / 0.98 / 0.91 / 0.84 | 1194 / 0.95 / 0.67 / 0.52 | 2142 / 0.95 / 0.89 / 0.84 |
| Fine-TOM | 2046 / 0.86 / 0.68 / 0.56 | 11293 / 0.93 / 0.85 / 0.78 | 2669 / 0.95 / 0.89 / 0.83 | 1965 / 0.93 / 0.81 / 0.72 | 2607 / 0.93 / 0.91 / 0.90 |
| KGMatcher | 1901 / 0.89 / 0.69 / 0.57 | 10737 / 0.94 / 0.85 / 0.78 | 2548 / 0.98 / 0.91 / 0.84 | 1471 / 0.94 / 0.76 / 0.63 | 2291 / 0.94 / 0.90 / 0.86 |
| LogMap | 2245 / 0.84 / 0.60 / 0.46 | 11627 / 0.89 / 0.82 / 0.76 | 2465 / 0.88 / 0.82 / 0.77 | 1565 / 0.94 / 0.80 / 0.69 | 2160 / 0.94 / 0.86 / 0.78 |
| LSMatch | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| OTMapOnto | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| TOM | 111 / 0.93 / 0.07 / 0.04 | 980 / 0.96 / 0.15 / 0.08 | 222 / 0.86 / 0.16 / 0.09 | 23 / 0.88 / 0.03 / 0.01 | 221 / 0.94 / 0.17 / 0.10 |
| Wiktionary | 3065 / 0.86 / 0.76 / 0.68 | 13327 / 0.92 / 0.91 / 0.90 | 3286 / 0.92 / 0.93 / 0.93 | 2118 / 0.92 / 0.82 / 0.75 | 2676 / 0.92 / 0.92 / 0.91 |
Runtime
| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 00:05:25 | 00:03:25 | 00:02:08 | 00:05:16 | 00:05:37 |
| AMD | 00:00:00 | 00:23:39 | 00:14:08 | 00:00:00 | 00:00:00 |
| AML | 00:14:43 | 00:16:34 | 00:05:24 | 00:06:54 | 00:06:49 |
| ATMatcher | 00:04:43 | 00:03:34 | 00:02:08 | 00:04:41 | 00:04:27 |
| BaselineAltLabel | 00:02:45 | 00:01:54 | 00:01:11 | 00:02:54 | 00:02:50 |
| BaselineLabel | 00:02:40 | 00:01:50 | 00:01:11 | 00:02:52 | 00:02:51 |
| Fine-TOM | 06:47:31 | 05:03:51 | 01:35:05 | 00:57:10 | 00:31:31 |
| KGMatcher | 00:47:19 | 00:50:48 | 00:31:31 | 01:28:55 | 01:16:57 |
| LogMap | 00:36:24 | 00:06:09 | 00:03:50 | 00:09:24 | 00:08:57 |
| LSMatch | 00:31:04 | 00:19:00 | 00:11:53 | 00:30:34 | 00:30:22 |
| OTMapOnto | 00:00:00 | 00:09:57 | 00:06:12 | 00:16:12 | 00:16:04 |
| TOM | 08:01:39 | 08:48:33 | 02:11:48 | 03:07:06 | 01:21:18 |
| Wiktionary | 00:12:03 | 00:05:14 | 00:05:38 | 00:09:47 | 00:10:34 |
Organizers
- Sven Hertling (University of Mannheim, Germany), main contact for the track, sven at informatik dot uni-mannheim dot de
- Heiko Paulheim (University of Mannheim, Germany)