Similarity-based link prediction from modular compression of network flows
Umeå University, Faculty of Science and Technology, Department of Physics. ORCID iD: 0000-0001-7881-2496
Umeå University, Faculty of Science and Technology, Department of Physics. ORCID iD: 0000-0003-0124-1909
Center for Artificial Intelligence and Data Science, University of Würzburg, Germany.
Umeå University, Faculty of Science and Technology, Department of Physics. ORCID iD: 0000-0002-7181-9940
2022 (English). In: Proceedings of the First Learning on Graphs Conference, ML Research Press, 2022, p. 52:1-52:18. Conference paper, Published paper (Refereed)
Abstract [en]

Node similarity scores are a foundation for machine learning in graphs for clustering, node classification, anomaly detection, and link prediction with applications in biological systems, information networks, and recommender systems. Recent works on link prediction use vector space embeddings to calculate node similarities in undirected networks with good performance. Still, they have several disadvantages: limited interpretability, need for hyperparameter tuning, manual model fitting through dimensionality reduction, and poor performance from symmetric similarities in directed link prediction. We propose MapSim, an information-theoretic measure to assess node similarities based on modular compression of network flows. Unlike vector space embeddings, MapSim represents nodes in a discrete, non-metric space of communities and yields asymmetric similarities in an unsupervised fashion. We compare MapSim on a link prediction task to popular embedding-based algorithms across 47 networks and find that MapSim's average performance across all networks is more than 7% higher than its closest competitor, outperforming all embedding methods in 11 of the 47 networks. Our method demonstrates the potential of compression-based approaches in graph representation learning, with promising applications in other graph learning tasks.
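The abstract describes the general setup of similarity-based link prediction in directed networks: score every unobserved ordered node pair with an asymmetric similarity and rank the pairs. A minimal sketch of that setup follows, using a hypothetical length-2-path count as the score; this is a stand-in baseline for illustration only, not the paper's MapSim measure, whose codelength-based formula is not reproduced in this record.

```python
from itertools import product

# Toy directed graph as a set of (source, target) edges.
edges = {(0, 1), (1, 2), (2, 0), (3, 4), (4, 3), (1, 3)}
nodes = {u for e in edges for u in e}

def score(u, v):
    """Hypothetical asymmetric similarity: the number of directed
    length-2 paths u -> w -> v. A placeholder baseline, NOT MapSim,
    which instead derives similarities from the modular compression
    (map-equation codelengths) of network flows."""
    successors = {w for (a, w) in edges if a == u}
    predecessors = {w for (w, b) in edges if b == v}
    return len(successors & predecessors)

# Rank all unobserved directed pairs by the score. Because
# score(u, v) need not equal score(v, u), the ranking respects
# edge direction, as the abstract requires for directed prediction.
candidates = [(u, v) for u, v in product(nodes, repeat=2)
              if u != v and (u, v) not in edges]
ranked = sorted(candidates, key=lambda p: score(*p), reverse=True)
```

For example, with the toy edges above, `score(1, 4)` is 1 (via the path 1 -> 3 -> 4) while `score(4, 1)` is 0, illustrating the asymmetry that symmetric embedding similarities miss.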

Place, publisher, year, edition, pages
ML Research Press, 2022, p. 52:1-52:18.
Series
Proceedings of Machine Learning Research, E-ISSN 2640-3498 ; 198
National Category
Computer Sciences; Computer Systems
Identifiers
URN: urn:nbn:se:umu:diva-212276
Scopus ID: 2-s2.0-85164537856
OAI: oai:DiVA.org:umu-212276
DiVA, id: diva2:1783301
Conference
LOG 2022, 1st Learning on Graphs Conference, Virtual, December 9-12, 2022
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Knut and Alice Wallenberg Foundation
Swedish Research Council, 2016-00796
Available from: 2023-07-20 Created: 2023-07-20 Last updated: 2023-07-20
Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Scopus
Publisher's full text

Authority records

Blöcker, Christopher; Smiljanic, Jelena; Rosvall, Martin

