  1. Dates
  2. Keynote Speakers
  3. Call for Papers
  4. Schedule
  5. Accepted Papers
  6. Awards
  7. Organizers and PC
  8. Previous Workshops

22nd International Workshop on
Mining and Learning with Graphs

Monday, 15th September 2025, Porto, held jointly with ECMLPKDD 2025

GPT-4o-generated picture of the bridge in Porto with a graph floating behind it.

Important Dates

Keynote Speakers

Call for Papers

This workshop is a forum for exchanging ideas and methods for mining and learning with graphs, developing new common understandings of the problems at hand, sharing data sets where applicable, and leveraging existing knowledge from different disciplines. It brings together researchers from academia and industry to discuss recent advances in graph analysis. In doing so, we aim to better understand the overarching principles and limitations of current methods and to inspire research on new algorithms and techniques for mining and learning with graphs.

To reflect the broad scope of work on mining and learning with graphs, we encourage submissions that span the spectrum from theoretical analysis to algorithms and implementation to applications and empirical studies. We are interested in the full spectrum of graph data, including but not limited to attributed graphs, labeled graphs, knowledge graphs, evolving graphs, and transactional graph databases.

We therefore invite submissions on theoretical aspects, algorithms and methods, and applications of the following (non-exhaustive) list of areas:

We welcome many kinds of papers, including but not limited to:

Submission Guidelines: Authors should clearly indicate in their abstracts which kind of submission their paper is, to help reviewers better assess its contribution. All papers will be peer-reviewed (single-blind). Submissions must be in PDF, with long papers no longer than 12 pages and short papers no longer than 8 pages, formatted according to the standard Springer LNCS style required for ECMLPKDD submissions. References and appendices do not count towards the page limit. Accepted papers will be published on the workshop website and are not considered archival for resubmission purposes. Authors of accepted papers will have the opportunity to participate in a pitch and poster session, and the best two papers will also be chosen for oral presentation.

Papers should be submitted via CMT: https://cmt3.research.microsoft.com/ECMLPKDDWorkshopTrack2025. Please select the MLG: Mining and Learning with Graphs track.

Post-Workshop Springer Proceedings: High-quality, original papers that are not dual submissions will be invited for publication in post-workshop proceedings, assuming that ECMLPKDD offers them as in previous years.

Dual Submission Policy: We accept submissions that are currently under review at other venues. However, in this case, our page limits apply. Please also check the dual submission policy of the other venue.

Tentative Schedule

9.00h Welcome
9.15h Keynote 1
10.15h Spotlight Talks (Group A)
10.30h Coffee + Poster Session (Group A)
12.00h Contributed Talk
Katharina Limbeck, Lydia Mezrag, Guy Wolf, Bastian Rieck:
Geometry-Aware Edge Pooling for Graph Neural Networks.
12.15h Contributed Talk
Patrick Indri, Tamara Drucks, Thomas Gärtner:
Private and Expressive Graph Representations.
12.30h Lunch Break
14.00h Keynote 2
15.00h Contributed Talk
Christoph Sandrock, Sebastian Lüderssen, Maximilian Thiessen, Thomas Gärtner:
Efficient Minimization of Peakless Functions on Bounded-degree Graphs.
15.15h Contributed Talk
Dionisia Naddeo, Tiago Azevedo, Nicola Toschi:
Do We Need Curved Spaces? A Critical Look at Hyperbolic Graph Learning in Graph Classification.
15.30h Spotlight Talks (Group B)
15.45h Coffee + Poster Session (Group B)
17.15h Contributed Talk
Pavel Prochazka, Michal Mares, Lukas Bajer:
Contrastive Learning as Optimal Homophilic Graph Structure Learning.
17.30h Contributed Talk
Adrian Arnaiz-Rodriguez, Federico Errica:
Oversmoothing, "Oversquashing", Heterophily, Long-Range, and more: Demystifying Common Beliefs in Graph Machine Learning.
17.45h Closing remarks and Awards

Accepted Papers

  1. Adarsh Jamadandi, Celia Rubio-Madrigal, Rebekka Burkholz (2025):
    Spectral Graph Pruning Against Over-Squashing and Over-Smoothing.

    [group tbd]

  2. Adrian Arnaiz-Rodriguez, Federico Errica (2025):
    Oversmoothing, "Oversquashing", Heterophily, Long-Range, and more: Demystifying Common Beliefs in Graph Machine Learning.

    [group tbd]

  3. Alessio Comparini, Lea Schmidt, Vanessa Siffredi, Damien Marie, Clara James, Jonas Richiardi (2025):
    Late and Early Fusion Graph Neural Network Architectures for Integrative Modeling of Multimodal Brain Connectivity Graphs.

    [group tbd]

  4. Andreas Roth, Thomas Liebig (2025):
    What Can We Learn From MIMO Graph Convolutions?

    [group tbd]

  5. Celia Rubio-Madrigal, Adarsh Jamadandi, Rebekka Burkholz (2025):
    GNNs Getting ComFy: Community and Feature Similarity Guided Rewiring.

    [group tbd]

  6. Christoph Sandrock, Sebastian Lüderssen, Maximilian Thiessen, Thomas Gärtner (2025):
    Efficient Minimization of Peakless Functions on Bounded-degree Graphs.

    [group tbd]

  7. Dehn Xu, Tim Katzke, Emmanuel Müller (2025):
    From Pixels to Graphs: Deep Graph-Level Anomaly Detection on Dermoscopic Images.

    [group tbd]

  8. Dionisia Naddeo, Tiago Azevedo, Nicola Toschi (2025):
    Do We Need Curved Spaces? A Critical Look at Hyperbolic Graph Learning in Graph Classification.

    [group tbd]

  9. Giorgio Venturin, Ilie Sarpe, Fabio Vandin (2025):
    Efficient Approximate Temporal Triangle Counting in Streaming with Predictions.

    [group tbd]

  10. Jakub Peleška, Gustav Šír (2025):
    Task-Agnostic Contrastive Pretraining for Relational Deep Learning.

    [group tbd]

  11. Joël Mathys, Henrik Christiansen, Federico Errica, Francesco Alesiani (2025):
    Long Range Ising Model: A Benchmark for Long Range Capabilities in Graph Learning.

    [group tbd]

  12. Katharina Limbeck, Lydia Mezrag, Guy Wolf, Bastian Rieck (2025):
    Geometry-Aware Edge Pooling for Graph Neural Networks.

    [group tbd]

  13. Lisi Qarkaxhija, Anatol Wegner, Ingo Scholtes (2025):
    Link Prediction with Untrained Message Passing Layers.

    [group tbd]

  14. Manuel Dileo, Matteo Zignani, Sabrina Gaito (2025):
    Evaluating explainability techniques on discrete-time graph neural networks.

    [group tbd]

  15. Maximilian Seeliger, Fabian Jogl, Thomas Gärtner (2025):
    Graph Product Representations.

    [group tbd]

  16. Namrata Banerji, Tanya Berger-Wolf (2025):
    DynaSTy: A Spatio-Temporal Transformer for Node Attribute Prediction in Dynamic Graphs.

    [group tbd]

  17. Pascal Plettenberg, André Alcalde, Bernhard Sick, Josephine Thomas (2025):
    Graph Neural Networks for Automatic Addition of Optimizing Components in Printed Circuit Board Schematics.

    [group tbd]

  18. Patrick Indri, Tamara Drucks, Thomas Gärtner (2025):
    Private and Expressive Graph Representations.

    [group tbd]

  19. Pavel Prochazka, Michal Mares, Lukas Bajer (2025):
    Contrastive Learning as Optimal Homophilic Graph Structure Learning.

    [group tbd]

  20. Quentin Haenn, Brice Chardin, Mickaël Baron, Allel Hadjali (2025):
    Iterative Graph-Based Radius Constraint Clustering.

    [group tbd]

  21. Saku Peltonen, Roger Wattenhofer (2025):
    On the Expressive Power of GNNs for Boolean Satisfiability.

    [group tbd]

  22. Sami Guembour, Catherine Dominguès, Sabine Ploux (2025):
    Semantic Analysis Experiments for French Citizens' Contribution: Combinations of Language Models and Community Detection Algorithms.

    [group tbd]

  23. Simon Rittel, Sebastian Tschiatschek (2025):
    Expressivity of Parametrized Distributions over DAGs for Causal Discovery.

    [group tbd]

  24. Victor Toscano-Duran, Bastian Rieck (2025):
    A Topological Molecular Representation for Molecular Learning Based on the Euler Characteristic Transform.

    [group tbd]

Organizers

Program Committee

Previous Workshops

This page aims to be minimalistic in layout, bandwidth, and tooling. It is hosted on GitHub Pages, uses the neat.css stylesheet, and uses bibtexparser to generate the lists of papers.
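
The lists of papers above are produced from a BibTeX file with bibtexparser. Below is a minimal sketch of how such a list could be rendered; it is not the actual build script of this page, and the input file name accepted.bib and the output format are assumptions.

    # Sketch: render a numbered list of accepted papers from a BibTeX file.
    # Assumes bibtexparser v1 and a hypothetical input file "accepted.bib".
    import bibtexparser

    with open("accepted.bib", encoding="utf-8") as f:
        db = bibtexparser.load(f)  # BibDatabase; db.entries is a list of dicts

    # Sort by the raw author field so the list is roughly alphabetical by first author.
    entries = sorted(db.entries, key=lambda e: e.get("author", ""))

    for i, entry in enumerate(entries, start=1):
        # BibTeX separates authors with " and "; join them with commas for display.
        authors = ", ".join(a.strip() for a in entry.get("author", "").split(" and "))
        title = entry.get("title", "").strip("{}")
        year = entry.get("year", "")
        print(f"{i}. {authors} ({year}):\n   {title}.\n")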