Results for OAEI 2020 - Knowledge Graph Track
Matching systems
As a pre-test, we executed all systems submitted as SEALS packages to OAEI (even those not registered for the track) on a very small matching example
with a structure and shape similar to the real knowledge graphs (in fact, it is a small subset of them).
This showed that not all matching systems are able to complete even this small task, due to exceptions or other failures.
We contacted the developers of the systems that produced an exception to give them the opportunity to modify their system configuration,
but none of them was ultimately able to produce results on the test task of the Knowledge Graph track.
Thus, we executed the following systems:
- ALOD2Vec
- AML
- ATBox
- baselineAltLabel (SEALS package)
- baselineLabel (SEALS package)
- DESKMatcher
- LogMap
- LogMapBio
- LogMapIM
- LogMapKG
- LogMapLt
- Wiktionary
The source code for the baseline matchers is available.
The baselineLabel matcher matches all resources that share the same rdfs:label; in case multiple resources share the same label, all of them are matched.
baselineAltLabel additionally uses skos:altLabel; again, when multiple resources share a common label, all of those resources are matched in a cross-product manner.
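The baseline strategy can be sketched as follows. This is a minimal Python illustration (not the actual SEALS packages); the knowledge graphs are simplified to plain URI-to-labels dictionaries, and lowercasing as label normalization is an assumption here:

```python
from collections import defaultdict

def baseline_label_matcher(kg1_labels, kg2_labels):
    """Match all resources that share a label.

    kg1_labels / kg2_labels: dict mapping resource URI -> list of labels
    (rdfs:label; for baselineAltLabel, skos:altLabel values would be
    included in the same lists).
    Returns a set of (uri1, uri2) correspondences.
    """
    # Index the second knowledge graph by (lowercased) label.
    index = defaultdict(list)
    for uri, labels in kg2_labels.items():
        for label in labels:
            index[label.lower()].append(uri)

    alignment = set()
    for uri1, labels in kg1_labels.items():
        for label in labels:
            # Cross product: every resource sharing this label is matched.
            for uri2 in index[label.lower()]:
                alignment.add((uri1, uri2))
    return alignment
```

If two resources in the second graph carry the same label, both end up in the alignment, which is exactly the cross-product behavior described above.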
Experimental setting
The evaluation was executed on a virtual machine (VM) with 32 GB of RAM and 16 vCPUs (2.4 GHz).
The operating system is Debian 9 with OpenJDK version "1.8.0_265".
We used the "-o" option of SEALS (version 7.0.5) to provide the two knowledge graphs to be matched.
The two given URLs are file URLs rather than HTTP URLs, because downloading the knowledge graphs from another server would incur a large time overhead
(downloading 500 MB multiple times for different tasks).
The call of the SEALS client thus looks like the following:

```
java -Xmx25g -Xms15g -jar ../seals-omt-client.jar ${MATCHER_DIR} -z -o file:///data/ont1.xml file:///data/ont2.xml -f out.xml
```
This means that the reported time also includes the environment preparation of SEALS
(copying the configuration folder of each matcher at the start of the matching task) as well as copying the resulting alignment to the output file.
We could not use the "-x" option of SEALS because we had to modify the evaluation routine for two reasons:
first, to differentiate between class, property, and instance mappings,
and second, to deal with the partial gold standard of this track.
The alignments were evaluated based on Precision, Recall and F-Measure for classes, properties and instances (each in isolation).
Our partial gold standard consists of 1:1 mappings extracted from links contained in wiki pages (cross-wiki links).
The schema was matched by ontology experts.
We assume that in each knowledge graph, only one representation of each concept exists.
This means that if our gold standard contains the mapping (A, B) and a matcher maps A to a different resource C, we count (A, C) as a false positive
(the assumption here being that no concept similar to B exists in the second knowledge graph).
The number of false negatives is only increased if we have a 1:1 mapping in the gold standard that is not found by a matcher.
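This counting scheme against a partial gold standard can be sketched like so (an illustrative Python reimplementation under the one-representation-per-concept assumption, not the actual evaluation code):

```python
def evaluate_partial(gold, system):
    """Precision, recall, and F-measure against a partial 1:1 gold standard.

    gold, system: sets of (source, target) correspondence pairs.
    A system pair (a, c) only counts as a false positive if either entity
    occurs in the gold standard with a different partner (by the 1:1
    assumption, that mapping must then be wrong). Pairs whose entities do
    not occur in the gold standard at all are ignored, since the gold
    standard is only partial.
    """
    gold_src = {a for a, _ in gold}
    gold_tgt = {b for _, b in gold}

    tp = fp = 0
    for a, c in system:
        if (a, c) in gold:
            tp += 1
        elif a in gold_src or c in gold_tgt:
            # Entity is judged in the gold standard but mapped differently.
            fp += 1
        # Otherwise: outside the partial gold standard, ignored.

    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / len(gold) if gold else 0.0  # missed gold pairs are FNs
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Note how a correspondence between two entities that appear nowhere in the gold standard affects neither precision nor recall.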
The source code for generating the evaluation results is also available.
We imposed a maximum execution time of 24 hours per task; however, this limit was never exceeded.
Generated dashboard / CSV file
We also generated an online dashboard with the help of the MELT framework.
Have a look at the knowledge graph results here (the page may take a few seconds to load due to the roughly 200,000 correspondences).
Moreover, we generated a CSV file that allows analyzing each matcher at the level of individual correspondences.
This should help matcher developers to improve their systems.
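A per-correspondence CSV file like this can be tallied with a few lines of code. The sketch below is hypothetical: the column name `evaluationResult` and its values are assumptions for illustration; check the actual CSV header before use:

```python
import csv
from collections import Counter

def count_results(csv_path):
    """Tally correspondences per evaluation outcome.

    Assumes (hypothetically) one row per correspondence with a column
    'evaluationResult' holding values such as 'true positive' or
    'false positive'.
    """
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["evaluationResult"]] += 1
    return counts
```

Grouping the same rows by source entity or by confidence value works analogously and helps to spot systematic matcher errors.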
Alignment results
The generated alignment files are also available.
Results overview
| Matcher | Time | #testcases | Level | Size | Prec. | F-m. | Rec. |
|---|---|---|---|---|---|---|---|
| ALOD2Vec | 0:13:24 | 5 | class | 20.0 | 1.00 (1.00) | 0.80 (0.80) | 0.67 (0.67) |
| | | | property | 76.8 | 0.94 (0.94) | 0.95 (0.95) | 0.97 (0.97) |
| | | | instance | 4893.8 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
| | | | overall | 4990.6 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
| AML | 0:50:55 | 5 | class | 23.6 | 0.98 (0.98) | 0.89 (0.89) | 0.81 (0.81) |
| | | | property | 48.4 | 0.92 (0.92) | 0.70 (0.70) | 0.57 (0.57) |
| | | | instance | 6802.8 | 0.90 (0.90) | 0.85 (0.85) | 0.80 (0.80) |
| | | | overall | 6874.8 | 0.90 (0.90) | 0.85 (0.85) | 0.80 (0.80) |
| ATBox | 0:16:22 | 5 | class | 25.6 | 0.97 (0.97) | 0.87 (0.87) | 0.79 (0.79) |
| | | | property | 78.8 | 0.97 (0.97) | 0.96 (0.96) | 0.95 (0.95) |
| | | | instance | 4858.8 | 0.89 (0.89) | 0.84 (0.84) | 0.80 (0.80) |
| | | | overall | 4963.2 | 0.89 (0.89) | 0.85 (0.85) | 0.81 (0.81) |
| baselineAltLabel | 0:10:57 | 5 | class | 16.4 | 1.00 (1.00) | 0.74 (0.74) | 0.59 (0.59) |
| | | | property | 47.8 | 0.99 (0.99) | 0.79 (0.79) | 0.66 (0.66) |
| | | | instance | 4674.8 | 0.89 (0.89) | 0.84 (0.84) | 0.80 (0.80) |
| | | | overall | 4739.0 | 0.89 (0.89) | 0.84 (0.84) | 0.80 (0.80) |
| baselineLabel | 0:10:44 | 5 | class | 16.4 | 1.00 (1.00) | 0.74 (0.74) | 0.59 (0.59) |
| | | | property | 47.8 | 0.99 (0.99) | 0.79 (0.79) | 0.66 (0.66) |
| | | | instance | 3641.8 | 0.95 (0.95) | 0.81 (0.81) | 0.71 (0.71) |
| | | | overall | 3706.0 | 0.95 (0.95) | 0.81 (0.81) | 0.71 (0.71) |
| DESKMatcher | 0:13:54 | 5 | class | 91.4 | 0.76 (0.76) | 0.71 (0.71) | 0.66 (0.66) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 3820.6 | 0.94 (0.94) | 0.82 (0.82) | 0.74 (0.74) |
| | | | overall | 3912.0 | 0.93 (0.93) | 0.81 (0.81) | 0.72 (0.72) |
| LogMap | 2:55:14 | 5 | class | 24.0 | 0.95 (0.95) | 0.84 (0.84) | 0.76 (0.76) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 29190.4 | 0.40 (0.40) | 0.54 (0.54) | 0.86 (0.86) |
| | | | overall | 29214.4 | 0.40 (0.40) | 0.54 (0.54) | 0.84 (0.84) |
| LogMapBio | 4:35:29 | 5 | class | 24.0 | 0.95 (0.95) | 0.84 (0.84) | 0.76 (0.76) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | overall | 24.0 | 0.95 (0.95) | 0.01 (0.01) | 0.00 (0.00) |
| LogMapIM | 2:49:34 | 5 | class | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 29190.4 | 0.40 (0.40) | 0.54 (0.54) | 0.86 (0.86) |
| | | | overall | 29190.4 | 0.40 (0.40) | 0.54 (0.54) | 0.84 (0.84) |
| LogMapKG | 2:47:51 | 5 | class | 24.0 | 0.95 (0.95) | 0.84 (0.84) | 0.76 (0.76) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 29190.4 | 0.40 (0.40) | 0.54 (0.54) | 0.86 (0.86) |
| | | | overall | 29214.4 | 0.40 (0.40) | 0.54 (0.54) | 0.84 (0.84) |
| LogMapLt | 0:07:19 | 4 | class | 23.0 | 0.80 (1.00) | 0.56 (0.70) | 0.43 (0.54) |
| | | | property | 0.0 | 0.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) |
| | | | instance | 6653.8 | 0.73 (0.91) | 0.67 (0.84) | 0.62 (0.78) |
| | | | overall | 6676.8 | 0.73 (0.92) | 0.66 (0.83) | 0.61 (0.76) |
| Wiktionary | 0:30:12 | 5 | class | 22.4 | 1.00 (1.00) | 0.80 (0.80) | 0.67 (0.67) |
| | | | property | 80.0 | 0.94 (0.94) | 0.95 (0.95) | 0.97 (0.97) |
| | | | instance | 4893.8 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
| | | | overall | 4996.2 | 0.91 (0.91) | 0.87 (0.87) | 0.83 (0.83) |
Aggregated results per matcher, divided into class, property, instance, and overall alignments.
Time is displayed as HH:MM:SS. The column #testcases indicates the number of test cases for which the tool was able to generate (non-empty) alignments.
The column Size indicates the average number of system correspondences.
Two kinds of results are reported: (1) those not distinguishing empty and erroneous (or not generated) alignments,
and (2) those considering only non-empty alignments (values in parentheses).
All matchers except LogMapLt produced an alignment for all test cases (LogMapLt generated a multi-gigabyte alignment file for the test case marvelcinematicuniverse-marvel).
The longest runtimes were observed for the LogMap family, with more than two hours in total for all five test cases.
Test case specific results
Overall results
This table shows the overall performance of the matchers, without dividing into class, property, or instance correspondences. Each cell shows Size / Prec. / F-m. / Rec.
| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 3093 / 0.86 / 0.76 / 0.68 | 13457 / 0.92 / 0.91 / 0.90 | 3402 / 0.92 / 0.92 / 0.93 | 2185 / 0.92 / 0.83 / 0.75 | 2816 / 0.93 / 0.92 / 0.92 |
| AML | 4687 / 0.85 / 0.68 / 0.56 | 18439 / 0.91 / 0.89 / 0.87 | 3795 / 0.93 / 0.93 / 0.92 | 3515 / 0.90 / 0.81 / 0.74 | 3938 / 0.93 / 0.91 / 0.90 |
| ATBox | 3518 / 0.67 / 0.59 / 0.52 | 13002 / 0.96 / 0.93 / 0.91 | 3281 / 0.96 / 0.94 / 0.92 | 2244 / 0.93 / 0.84 / 0.76 | 2771 / 0.95 / 0.93 / 0.91 |
| baselineAltLabel | 2574 / 0.86 / 0.76 / 0.68 | 13514 / 0.88 / 0.89 / 0.89 | 3230 / 0.88 / 0.90 / 0.92 | 1712 / 0.92 / 0.74 / 0.63 | 2665 / 0.92 / 0.91 / 0.90 |
| baselineLabel | 1879 / 0.90 / 0.69 / 0.56 | 10552 / 0.95 / 0.85 / 0.77 | 2582 / 0.98 / 0.90 / 0.83 | 1245 / 0.96 / 0.68 / 0.53 | 2272 / 0.95 / 0.89 / 0.84 |
| DESKMatcher | 1938 / 0.89 / 0.69 / 0.56 | 10946 / 0.94 / 0.85 / 0.77 | 2677 / 0.97 / 0.89 / 0.82 | 1638 / 0.93 / 0.75 / 0.63 | 2361 / 0.94 / 0.88 / 0.83 |
| LogMap | 41080 / 0.24 / 0.35 / 0.71 | 54537 / 0.43 / 0.58 / 0.86 | 15560 / 0.47 / 0.63 / 0.92 | 15335 / 0.58 / 0.67 / 0.80 | 19560 / 0.27 / 0.42 / 0.92 |
| LogMapBio | 10 / 1.00 / 0.00 / 0.00 | 38 / 0.75 / 0.00 / 0.00 | 31 / 1.00 / 0.01 / 0.00 | 12 / 1.00 / 0.01 / 0.00 | 29 / 1.00 / 0.02 / 0.01 |
| LogMapIM | 41070 / 0.24 / 0.35 / 0.71 | 54499 / 0.43 / 0.58 / 0.86 | 15529 / 0.47 / 0.62 / 0.91 | 15323 / 0.58 / 0.67 / 0.80 | 19531 / 0.27 / 0.42 / 0.91 |
| LogMapKG | 41080 / 0.24 / 0.35 / 0.71 | 54537 / 0.43 / 0.58 / 0.86 | 15560 / 0.47 / 0.63 / 0.92 | 15335 / 0.58 / 0.67 / 0.80 | 19560 / 0.27 / 0.42 / 0.92 |
| LogMapLt | 0 / 0.00 / 0.00 / 0.00 | 16688 / 0.90 / 0.83 / 0.77 | 3577 / 0.94 / 0.88 / 0.82 | 2807 / 0.91 / 0.74 / 0.62 | 3635 / 0.91 / 0.87 / 0.83 |
| Wiktionary | 3096 / 0.86 / 0.76 / 0.68 | 13466 / 0.92 / 0.91 / 0.90 | 3405 / 0.92 / 0.92 / 0.93 | 2192 / 0.92 / 0.83 / 0.75 | 2822 / 0.93 / 0.92 / 0.92 |
Class results
All matchers except LogMapIM were able to generate class correspondences.
Each cell shows Size / Prec. / F-m. / Rec.

| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 7 / 1.00 / 1.00 / 1.00 | 21 / 1.00 / 0.44 / 0.29 | 29 / 1.00 / 0.76 / 0.62 | 14 / 1.00 / 0.75 / 0.60 | 29 / 1.00 / 0.93 / 0.87 |
| AML | 8 / 1.00 / 1.00 / 1.00 | 36 / 0.91 / 0.80 / 0.71 | 32 / 1.00 / 0.82 / 0.69 | 12 / 1.00 / 0.89 / 0.80 | 30 / 1.00 / 0.93 / 0.87 |
| ATBox | 11 / 1.00 / 1.00 / 1.00 | 39 / 0.83 / 0.77 / 0.71 | 34 / 1.00 / 0.87 / 0.77 | 13 / 1.00 / 0.75 / 0.60 | 31 / 1.00 / 0.93 / 0.87 |
| baselineAltLabel | 8 / 1.00 / 1.00 / 1.00 | 19 / 1.00 / 0.44 / 0.29 | 19 / 1.00 / 0.63 / 0.46 | 9 / 1.00 / 0.57 / 0.40 | 27 / 1.00 / 0.89 / 0.80 |
| baselineLabel | 8 / 1.00 / 1.00 / 1.00 | 19 / 1.00 / 0.44 / 0.29 | 19 / 1.00 / 0.63 / 0.46 | 9 / 1.00 / 0.57 / 0.40 | 27 / 1.00 / 0.89 / 0.80 |
| DESKMatcher | 40 / 1.00 / 1.00 / 1.00 | 115 / 0.33 / 0.31 / 0.29 | 120 / 0.62 / 0.62 / 0.62 | 116 / 1.00 / 0.75 / 0.60 | 66 / 0.86 / 0.83 / 0.80 |
| LogMap | 10 / 1.00 / 1.00 / 1.00 | 38 / 0.75 / 0.69 / 0.64 | 31 / 1.00 / 0.70 / 0.54 | 12 / 1.00 / 0.89 / 0.80 | 29 / 1.00 / 0.89 / 0.80 |
| LogMapBio | 10 / 1.00 / 1.00 / 1.00 | 38 / 0.75 / 0.69 / 0.64 | 31 / 1.00 / 0.70 / 0.54 | 12 / 1.00 / 0.89 / 0.80 | 29 / 1.00 / 0.89 / 0.80 |
| LogMapIM | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMapKG | 10 / 1.00 / 1.00 / 1.00 | 38 / 0.75 / 0.69 / 0.64 | 31 / 1.00 / 0.70 / 0.54 | 12 / 1.00 / 0.89 / 0.80 | 29 / 1.00 / 0.89 / 0.80 |
| LogMapLt | 0 / 0.00 / 0.00 / 0.00 | 23 / 1.00 / 0.44 / 0.29 | 27 / 1.00 / 0.70 / 0.54 | 12 / 1.00 / 0.75 / 0.60 | 30 / 1.00 / 0.85 / 0.73 |
| Wiktionary | 9 / 1.00 / 1.00 / 1.00 | 27 / 1.00 / 0.44 / 0.29 | 33 / 1.00 / 0.76 / 0.62 | 14 / 1.00 / 0.75 / 0.60 | 29 / 1.00 / 0.93 / 0.87 |
Property results
While last year most matchers struggled to create mappings for properties typed as rdf:Property
(instead of owl:ObjectProperty or owl:DatatypeProperty),
this year many matchers were able to produce property mappings in that case as well.
Each cell shows Size / Prec. / F-m. / Rec.

| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 21 / 1.00 / 1.00 / 1.00 | 111 / 0.84 / 0.88 / 0.92 | 88 / 0.87 / 0.91 / 0.95 | 53 / 1.00 / 1.00 / 1.00 | 111 / 0.98 / 0.98 / 0.98 |
| AML | 9 / 1.00 / 0.53 / 0.36 | 84 / 0.76 / 0.73 / 0.70 | 64 / 0.89 / 0.85 / 0.80 | 12 / 1.00 / 0.40 / 0.25 | 73 / 0.98 / 0.82 / 0.71 |
| ATBox | 24 / 0.91 / 0.91 / 0.91 | 103 / 0.98 / 0.95 / 0.92 | 85 / 0.95 / 0.95 / 0.95 | 61 / 1.00 / 1.00 / 1.00 | 121 / 1.00 / 0.99 / 0.98 |
| baselineAltLabel | 7 / 1.00 / 0.53 / 0.36 | 41 / 1.00 / 0.51 / 0.34 | 46 / 0.97 / 0.80 / 0.68 | 42 / 1.00 / 1.00 / 1.00 | 103 / 1.00 / 0.94 / 0.89 |
| baselineLabel | 7 / 1.00 / 0.53 / 0.36 | 41 / 1.00 / 0.51 / 0.34 | 46 / 0.97 / 0.80 / 0.68 | 42 / 1.00 / 1.00 / 1.00 | 103 / 1.00 / 0.94 / 0.89 |
| DESKMatcher | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMap | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMapBio | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMapIM | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMapKG | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMapLt | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| Wiktionary | 22 / 1.00 / 1.00 / 1.00 | 112 / 0.84 / 0.88 / 0.92 | 89 / 0.87 / 0.91 / 0.95 | 60 / 1.00 / 1.00 / 1.00 | 117 / 0.98 / 0.98 / 0.98 |
Instance results
Only LogMapBio did not return any instance matches.
Each cell shows Size / Prec. / F-m. / Rec.

| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 3065 / 0.86 / 0.76 / 0.68 | 13325 / 0.92 / 0.91 / 0.90 | 3285 / 0.92 / 0.93 / 0.93 | 2118 / 0.92 / 0.82 / 0.75 | 2676 / 0.92 / 0.92 / 0.91 |
| AML | 4670 / 0.85 / 0.68 / 0.56 | 18319 / 0.91 / 0.89 / 0.87 | 3699 / 0.93 / 0.93 / 0.93 | 3491 / 0.90 / 0.81 / 0.75 | 3835 / 0.93 / 0.92 / 0.90 |
| ATBox | 3483 / 0.66 / 0.58 / 0.52 | 12860 / 0.96 / 0.93 / 0.91 | 3162 / 0.96 / 0.94 / 0.92 | 2170 / 0.93 / 0.83 / 0.76 | 2619 / 0.94 / 0.92 / 0.91 |
| baselineAltLabel | 2559 / 0.86 / 0.76 / 0.68 | 13454 / 0.88 / 0.89 / 0.89 | 3165 / 0.88 / 0.90 / 0.93 | 1661 / 0.92 / 0.74 / 0.62 | 2535 / 0.92 / 0.91 / 0.90 |
| baselineLabel | 1864 / 0.90 / 0.69 / 0.56 | 10492 / 0.95 / 0.85 / 0.77 | 2517 / 0.98 / 0.91 / 0.84 | 1194 / 0.95 / 0.67 / 0.52 | 2142 / 0.95 / 0.89 / 0.84 |
| DESKMatcher | 1898 / 0.89 / 0.69 / 0.56 | 10831 / 0.94 / 0.85 / 0.78 | 2557 / 0.98 / 0.91 / 0.84 | 1522 / 0.93 / 0.76 / 0.64 | 2295 / 0.94 / 0.90 / 0.86 |
| LogMap | 41070 / 0.24 / 0.35 / 0.71 | 54499 / 0.43 / 0.58 / 0.87 | 15529 / 0.47 / 0.63 / 0.94 | 15323 / 0.58 / 0.68 / 0.82 | 19531 / 0.27 / 0.42 / 0.96 |
| LogMapBio | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 | 0 / 0.00 / 0.00 / 0.00 |
| LogMapIM | 41070 / 0.24 / 0.35 / 0.71 | 54499 / 0.43 / 0.58 / 0.87 | 15529 / 0.47 / 0.63 / 0.94 | 15323 / 0.58 / 0.68 / 0.82 | 19531 / 0.27 / 0.42 / 0.96 |
| LogMapKG | 41070 / 0.24 / 0.35 / 0.71 | 54499 / 0.43 / 0.58 / 0.87 | 15529 / 0.47 / 0.63 / 0.94 | 15323 / 0.58 / 0.68 / 0.82 | 19531 / 0.27 / 0.42 / 0.96 |
| LogMapLt | 0 / 0.00 / 0.00 / 0.00 | 16665 / 0.90 / 0.83 / 0.77 | 3550 / 0.94 / 0.89 / 0.84 | 2795 / 0.91 / 0.75 / 0.63 | 3605 / 0.91 / 0.89 / 0.87 |
| Wiktionary | 3065 / 0.86 / 0.76 / 0.68 | 13327 / 0.92 / 0.91 / 0.90 | 3283 / 0.92 / 0.92 / 0.93 | 2118 / 0.92 / 0.82 / 0.75 | 2676 / 0.92 / 0.92 / 0.91 |
Runtime
Most matchers have a low runtime, which shows that they scale to these large knowledge graphs.
Except for the LogMap family, all matchers solved each task in less than 30 minutes.
| Matcher | marvelcinematicuniverse-marvel | memoryalpha-memorybeta | memoryalpha-stexpanded | starwars-swg | starwars-swtor |
|---|---|---|---|---|---|
| ALOD2Vec | 0:03:05 | 0:02:10 | 0:01:20 | 0:03:22 | 0:03:27 |
| AML | 0:12:34 | 0:22:17 | 0:04:43 | 0:05:30 | 0:05:51 |
| ATBox | 0:03:54 | 0:03:26 | 0:01:49 | 0:03:28 | 0:03:45 |
| baselineAltLabel | 0:02:35 | 0:01:49 | 0:01:09 | 0:02:35 | 0:02:49 |
| baselineLabel | 0:02:30 | 0:01:43 | 0:01:17 | 0:02:38 | 0:02:36 |
| DESKMatcher | 0:03:08 | 0:02:28 | 0:01:41 | 0:03:20 | 0:03:17 |
| LogMap | 1:40:56 | 0:18:09 | 0:12:14 | 0:30:31 | 0:13:24 |
| LogMapBio | 0:37:37 | 1:24:45 | 0:59:48 | 0:26:18 | 1:07:01 |
| LogMapIM | 1:33:33 | 0:19:48 | 0:11:47 | 0:30:51 | 0:13:35 |
| LogMapKG | 1:33:40 | 0:17:58 | 0:12:00 | 0:31:08 | 0:13:05 |
| LogMapLt | 0:00:00 | 0:01:27 | 0:00:54 | 0:02:27 | 0:02:31 |
| Wiktionary | 0:06:46 | 0:05:50 | 0:03:57 | 0:06:49 | 0:06:50 |
Organizers
- Sven Hertling (University of Mannheim, Germany), main contact for the track, sven at informatik dot uni-mannheim dot de
- Heiko Paulheim (University of Mannheim, Germany)
References
[1] Sven Hertling, Heiko Paulheim: The Knowledge Graph Track at OAEI: Gold Standards, Baselines, and the Golden Hammer Bias. ESWC 2020. [pdf]
[2] Sven Hertling, Heiko Paulheim: DBkWik: A Consolidated Knowledge Graph from Thousands of Wikis. International Conference on Big Knowledge 2018. [pdf]
[3] Alexandra Hofmann, Samresh Perchani, Jan Portisch, Sven Hertling, Heiko Paulheim: DBkWik: Towards Knowledge Graph Creation from Thousands of Wikis. International Semantic Web Conference (Posters & Demos) 2017. [pdf]