Francesco Tudisco

Associate Professor (Reader) in Machine Learning

School of Mathematics, The University of Edinburgh
The Maxwell Institute for Mathematical Sciences
Gran Sasso Science Institute
JCMB, King’s Buildings, Edinburgh EH9 3FD, UK
email: f dot tudisco at ed.ac.uk

Paper accepted @ ICML 2022

I am very happy that our paper Nonlinear Feature Diffusion on Hypergraphs – with Austin Benson and Konstantin Prokopchik – has been accepted to the proceedings of this year’s ICML. Congrats to my student Konstantin for another important achievement!

Numerical Methods for Compression and Learning

Nicola Guglielmi and I are organizing the workshop Numerical Methods for Compression and Learning at GSSI this week. The workshop will take place in the Main Lecture Hall of the Orange Building and will feature lectures from invited speakers as well as poster sessions open to all participants.

Excited to host great colleagues and looking forward to exciting talks!

Online participation will be possible via the zoom link: https://us02web.zoom.us/j/83830006962?pwd=SmI1MTVKRTllU3dBR01Ybko5bzBJdz09

As the amount of available data grows ever faster, the ability to handle and exploit very-large-scale data in an efficient and robust manner is becoming increasingly relevant. This workshop aims to bring together experts from signal processing, compressed sensing, low-rank methods and machine learning, with the goal of highlighting modern approaches as well as challenges in computational mathematics arising in all of these areas and at their intersection.


New paper out

A Variance-aware Multiobjective Louvain-like Method for Community Detection in Multiplex Networks

Abstract: In this paper, we focus on the community detection problem in multiplex networks, i.e., networks with multiple layers sharing the same node set and having no inter-layer connections. In particular, we look for groups of nodes that can be recognized as communities consistently across the layers. To this end, we propose a new approach that generalizes the Louvain method by (a) simultaneously updating the average and the variance of the modularity scores across the layers, and (b) reformulating the greedy search procedure in terms of a filter-based multiobjective optimization scheme. ... Read more

New paper out

Learning the right layers: a data-driven layer-aggregation strategy for semi-supervised learning on multilayer graphs

Abstract: Clustering (or community detection) on multilayer graphs poses several additional complications with respect to standard graphs as different layers may be characterized by different structures and types of information. One of the major challenges is to establish the extent to which each layer contributes to the cluster assignment in order to effectively take advantage of the multilayer structure and improve upon the classification obtained using the individual layers or their union. ... Read more

Talk @ Calcolo Scientifico e Modelli Matematici (CSMM) workshop

Presenting today in Rome at the third workshop “Calcolo Scientifico e Modelli Matematici”. The streaming of my talk (in Italian) is available here, and here you can find the PDF of my slides.

New paper out

Core-periphery detection in hypergraphs

Abstract: Core-periphery detection is a key task in exploratory network analysis where one aims to find a core, a set of nodes well-connected internally and with the periphery, and a periphery, a set of nodes connected only (or mostly) with the core. In this work we propose a model of core-periphery structure for higher-order networks modeled as hypergraphs, and we introduce a method for computing a core-score vector that quantifies how close each node is to the core. ... Read more

--- Core nodes on a ‘hyperplane’ hypergraph, color-coded according to our hypergraph-based vs clique-expansion-based core-periphery detection methods

Gianluca Ceruti from EPFL is visiting our group until Feb 18. Looking forward to some inspiring discussions on dynamical low-rank approximation!

Traveling to Naples for the 2ggALN

I am traveling to Naples today to take part in the Italian annual workshop on Numerical Linear Algebra (2ggALN), one of the nicest events of the year. Looking forward to seeing colleagues and friends and to meeting new people!

Presenting today @ MPI Leipzig

I am presenting my work on nonlinear Laplacians for hypergraphs, with applications to centrality and core-periphery detection, today at the Networks Seminar at the MPI for Mathematics in the Sciences in Leipzig. Thanks Raffaella Mulas and Leo Torres for the kind invitation!

STANDS pro3 grant

I am honored to have been awarded a MUR-Pro3 grant on Stability of Neural Dynamical Systems (STANDS), in collaboration with Michele Benzi (SNS) and Nicola Guglielmi (GSSI). I am looking for a postdoc to join our group to work on this project. Please reach out if you are interested!

New paper out

Core-periphery partitioning and quantum annealing

Abstract: We propose a new kernel that quantifies success for the task of computing a core-periphery partition for an undirected network. Finding the associated optimal partitioning may be expressed in the form of a quadratic unconstrained binary optimization (QUBO) problem, to which a state-of-the-art quantum annealer may be applied. We therefore make use of the new objective function to (a) judge the performance of a quantum annealer, and (b) compare this approach with existing heuristic core-periphery partitioning methods. ... Read more

New paper out

Nodal domain count for the generalized graph $p$-Laplacian

Abstract: Inspired by the linear Schrödinger operator, we consider a generalized $p$-Laplacian operator on discrete graphs and present new results that characterize several spectral properties of this operator, with particular attention to the nodal domain count of its eigenfunctions. Just like the one-dimensional continuous $p$-Laplacian, we prove that the variational spectrum of the discrete generalized $p$-Laplacian on forests is the entire spectrum. Moreover, we show how to transfer Weyl’s inequalities for the Laplacian operator to the nonlinear case and prove new upper and lower bounds on the number of nodal domains of every eigenfunction of the generalized $p$-Laplacian on generic graphs, including variational eigenpairs. ... Read more

COMPiLE Reading Group

We are starting a reading group on Computational Learning this week, which will concentrate on papers in the general area of Numerics/Optimization/Matrix Theory/Graph Theory, with particular emphasis on their application to, and connection with, Machine Learning, Data Mining and Network Science. The reading group is organized by PhD students and postdocs from the Numerics and Data Science group at GSSI. The organizer kicking off the initiative is Dayana Savostianova, who will also present the first paper. If you are interested in participating and/or presenting, please fill out this form (available to GSSI users only) to be notified about the meetings and kept in the loop.

New paper out

The self-consistent field iteration for p-spectral clustering

Abstract: The self-consistent field (SCF) iteration, combined with its variants, is one of the most widely used algorithms in quantum chemistry. We propose a procedure to adapt the SCF iteration to the p-Laplacian eigenproblem, which is an important problem in the field of unsupervised learning. We formulate the p-Laplacian eigenproblem as a type of nonlinear eigenproblem with one eigenvector nonlinearity, which then allows us to adapt the SCF iteration for its solution after the application of suitable regularization techniques. ... Read more
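For readers unfamiliar with the SCF template, here is a minimal sketch of a generic SCF loop for an eigenproblem with eigenvector nonlinearity, $A(v)v = \lambda v$: at each step one assembles the matrix at the current iterate, solves a standard linear symmetric eigenproblem, and feeds the selected eigenvector back in. The map `A_of_v` below is a toy placeholder, not the regularized p-Laplacian operator studied in the paper.

```python
# Minimal sketch of a generic SCF iteration for a nonlinear eigenproblem
# with eigenvector nonlinearity, A(v) v = lambda v.  This is NOT the
# regularized p-Laplacian scheme of the paper; A_of_v below is a toy
# placeholder used only to illustrate the fixed-point structure.
import numpy as np

def scf(A_of_v, v0, which=0, tol=1e-8, maxit=200):
    """Iterate: build A(v_k), take one of its eigenvectors as v_{k+1}."""
    v = v0 / np.linalg.norm(v0)
    for _ in range(maxit):
        w, U = np.linalg.eigh(A_of_v(v))       # linear eigenproblem at v_k
        v_new = U[:, which]                    # eigenvector of interest (e.g. smallest)
        v_new *= np.sign(v_new @ v) or 1.0     # fix the sign to compare iterates
        if np.linalg.norm(v_new - v) < tol:
            return w[which], v_new
        v = v_new
    return w[which], v

# Toy example: a symmetric matrix whose diagonal depends mildly on v.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5)); B = (B + B.T) / 2
A_of_v = lambda v: B + 0.1 * np.diag(v**2)
lam, v = scf(A_of_v, np.ones(5))
print(lam)
```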

Talk @ Numerical Methods and Scientific Computing (NMSC) conference

Presenting today at the Numerical Methods and Scientific Computing conference that is taking place this week in Luminy at the CIRM conference center. The meeting is dedicated to Claude Brezinski for his 80th birthday, to whom I send my very best wishes. Thanks to the whole organizing committee for the kind invitation! Below you can find the title of my talk, a link to the abstract, and the slides of the presentation, in case you wish to have a look at them.


Title: Optimal L-shaped matrix reordering via nonlinear matrix eigenvectors

Abstract: We are interested in finding a permutation of the entries of a given square matrix $A$ so that the maximum number of its nonzero entries are moved to one of the corners in an L-shaped fashion.
If we interpret the nonzero entries of the matrix as the edges of a graph, this problem boils down to the so-called core–periphery structure, consisting of two sets: the core, a set of nodes that is highly connected across the whole graph, and the periphery, a set of nodes that is well connected only to the nodes that are in the core.
Matrix reordering problems have applications in sparse factorizations and preconditioning, while revealing core–periphery structures in networks has applications in economic, social and communication networks. This optimal reordering problem is a hard combinatorial optimization problem, which we relax into the continuous problem:

\begin{equation}\label{eq:1ft} \max f(x) := \sum_{ij}|A_{ij}| \max\{x_i,x_j\} \qquad \mathrm{s.t.} \qquad \|x\|=1 \end{equation}

While $f$ is still highly nonconvex and thus hard to treat directly, we show that the global maximum of $f$ coincides with the nonlinear Perron eigenvector of a suitably defined parameter-dependent matrix $M(x)$, i.e. the positive solution to the nonlinear eigenvector problem $M(x) x = \lambda x$. Using recent advances in nonlinear Perron–Frobenius theory, we show that \eqref{eq:1ft} has a unique solution and we propose a nonlinear power-method-type scheme that solves \eqref{eq:1ft} with global convergence guarantees and effectively scales to very large and sparse matrices. We present several numerical experiments showing that the new method largely outperforms baseline techniques.
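As an illustration of the power-method-type scheme mentioned above, here is a minimal sketch of a nonlinear power iteration for a problem of the form $M(x)x = \lambda x$. The map `M_of_x` is a hypothetical positive placeholder, not the operator derived from \eqref{eq:1ft} in the paper; in the reordering application, the entries of the computed $x$ would then be sorted to produce the permutation.

```python
# Minimal sketch of a nonlinear power-method iteration for a
# parameter-dependent eigenvector problem M(x) x = lambda x.
# M_of_x below is a hypothetical placeholder (a reweighting of a
# nonnegative matrix A), not the operator derived in the paper.
import numpy as np

def nonlinear_power(M_of_x, x0, tol=1e-10, maxit=1000):
    x = np.abs(x0) / np.linalg.norm(x0)
    for _ in range(maxit):
        y = M_of_x(x) @ x
        y /= np.linalg.norm(y)
        if np.linalg.norm(y - x) < tol:
            break
        x = y
    lam = x @ (M_of_x(x) @ x)          # Rayleigh-quotient estimate of lambda
    return lam, x

rng = np.random.default_rng(1)
A = np.abs(rng.standard_normal((8, 8)))
M_of_x = lambda x: A * (1.0 + np.outer(x, x))   # illustrative positive M(x)
lam, x = nonlinear_power(M_of_x, np.ones(8))
order = np.argsort(-x)                 # candidate reordering: largest scores first
print(lam, order)
```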

Link to slideshare presentation: Optimal L-shaped matrix reordering via nonlinear matrix eigenvectors

Top 20 most cited papers in SIMAX

I’m very excited (and honored!) to find out that two papers of mine are among the top 20 most cited papers in SIMAX (SIAM Journal on Matrix Analysis and Applications) since 2018. Huge thanks to my amazing collaborators Matthias Hein and Antoine Gautier.

Dayana has successfully passed her admission exam and officially joined our Numerical Analysis and Data Science group, starting her PhD work on the Mathematics of Adversarial Attacks with me. Welcome!

Paper published open access in the Journal of Scientific Computing

Excited that our paper The Global Convergence of the Nonlinear Power Method for Mixed-Subordinate Matrix Norms, joint work with Antoine Gautier and Matthias Hein, has been published open access in the Journal of Scientific Computing.

New paper out

Hitting times for second-order random walks

Abstract: A second-order random walk on a graph or network is a random walk where transition probabilities depend not only on the present node but also on the previous one. A notable example is the non-backtracking random walk, where the walker is not allowed to revisit a node in one step. Second-order random walks can model physical diffusion phenomena in a more realistic way than traditional random walks and have been very successfully used in various network mining and machine learning settings. ... Read more
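To make the notion concrete, here is a minimal sketch that simulates a non-backtracking random walk on an undirected graph and estimates a hitting time by Monte Carlo; it is purely illustrative and unrelated to the closed-form expressions derived in the paper.

```python
# Minimal sketch: simulating a non-backtracking random walk (a simple
# second-order walk) on an undirected graph given as an adjacency list,
# and estimating a hitting time by Monte Carlo.  Illustrative only.
import random

def non_backtracking_walk(neighbors, start, target, max_steps=10_000):
    """Return the number of steps to reach `target`, never stepping
    straight back to the node visited one step earlier (unless forced)."""
    prev, cur = None, start
    for step in range(1, max_steps + 1):
        options = [v for v in neighbors[cur] if v != prev]
        if not options:                 # dead end: backtracking is the only move
            options = neighbors[cur]
        prev, cur = cur, random.choice(options)
        if cur == target:
            return step
    return max_steps

# Toy graph: a 5-cycle.
nbrs = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
samples = [non_backtracking_walk(nbrs, 0, 3) for _ in range(2000)]
print(sum(samples) / len(samples))      # Monte Carlo hitting-time estimate
```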

Minitutorial @ SIAM LA 2021

I am very excited to be giving a minitutorial today on Applied Nonlinear Perron–Frobenius Theory at the SIAM Conference on Applied Linear Algebra (LA21).
I will present the tutorial together with Antoine Gautier.

Here you can find the webpage of the minitutorial.

Abstract

Nonnegative matrices are pervasive in data mining applications. For example, distance and similarity matrices are fundamental tools for data classification, affinity matrices are key instruments for graph matching, adjacency matrices are at the basis of almost every graph mining algorithm, and transition matrices are the main tool for studying stochastic processes on data. The Perron-Frobenius theory makes the algorithms based on these matrices very attractive from a linear algebra point of view. At the same time, as the available data grows both in terms of size and complexity, more and more data mining methods rely on nonlinear mappings rather than just matrices, which however still have some form of nonnegativity. The nonlinear Perron-Frobenius theory allows us to transfer most of the theoretical and computational niceties of nonnegative matrices to the much broader class of nonlinear multihomogeneous operators. These operators include, for example, commonly used maps associated with tensors and are tightly connected to the formulation of nonlinear eigenvalue problems with eigenvector nonlinearities. In this minitutorial we will introduce the concept of multihomogeneous operators and we will present the state-of-the-art version of the nonlinear Perron-Frobenius theorem for nonnegative nonlinear mappings. We will discuss several numerical optimization implications connected to nonlinear and higher-order versions of the Power and the Sinkhorn methods and several open challenges, both from the theoretical and the computational viewpoints. We will also discuss a number of problems in data mining, machine learning and network science which can be cast in terms of nonlinear eigenvector problems and we will show how the nonlinear Perron-Frobenius theory can help solve them.
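As a reminder of one of the two algorithmic families mentioned above, here is a minimal sketch of the classical Sinkhorn iteration for scaling a positive matrix to (approximately) doubly stochastic form; the multihomogeneous and higher-order variants discussed in the minitutorial are not shown here.

```python
# Minimal sketch of the classical Sinkhorn iteration (matrix balancing).
# The higher-order / multihomogeneous versions discussed in the tutorial
# are not implemented here; this is only the basic two-sided scaling.
import numpy as np

def sinkhorn(A, tol=1e-10, maxit=10_000):
    """Scale a positive matrix A to (approximately) doubly stochastic form
    diag(r) @ A @ diag(c) by alternately normalizing columns and rows."""
    r = np.ones(A.shape[0])
    for _ in range(maxit):
        c = 1.0 / (A.T @ r)             # make columns sum to one
        r_new = 1.0 / (A @ c)           # make rows sum to one
        if np.max(np.abs(r_new - r)) < tol:
            r = r_new
            break
        r = r_new
    return np.diag(r) @ A @ np.diag(c)

A = np.random.default_rng(2).uniform(0.1, 1.0, (4, 4))
S = sinkhorn(A)
print(S.sum(axis=0).round(6), S.sum(axis=1).round(6))   # ~all ones
```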

Minisymposium @ SIAM LA

Piero Deidda, Mario Putti and I are organizing a minisymposium on Nonlinear Laplacians on graphs and manifolds with applications to data and image processing within the SIAM Conference on Applied Linear Algebra 2021, happening virtually during the week May 17-21, 2021. See also the conference’s virtual program.

Our mini is scheduled on May 17, starting at 9:35am Central Time (New Orleans).

Abstract

Nonlinear Laplacian operators on graphs and manifolds appear frequently in computational mathematics, as they are widely used in a diverse range of applications, including data and image processing problems such as clustering, semi-supervised learning and segmentation. A large body of recent work has shown that these operators can increase the performance of classical algorithms based on linear Laplacians. This has led to several developments on both the theoretical and the numerical aspects of nonlinear Laplacian operators on graphs and manifolds, including non-differentiable extreme cases such as the infinity-Laplacian, the 1-Laplacian and p-Laplacians with negative exponent. In this minisymposium we aim at sampling both theoretical and applied recent work in this active area of research.

Speakers:

  • Martin Burger — Nonlinear Spectral Decompositions Related to P-Laplacians
  • Qinglan Xia — P-Laplacians on Graphs with a Negative Exponent, and its Relation to Branched Optimal Transportation
  • Dong Zhang — Piecewise Multilinear Extension and Spectral Theory for Function Pairs
  • Piero Deidda — Nodal Domain Count and Spectral Properties of Generalized P-Laplacians on Graphs
  • Abderrahim Elmoataz — Game P-Laplacian on Graph: from Tug-of-War Games to Unified Processing in Image and Points Clouds Processing
  • Dimosthenis Pasadakis — Multiway P-Spectral Clustering on Grassmann Manifolds
  • Pan Li — Strongly Local Hypergraph Diffusions for Clustering and Semi-Supervised Learning
  • Shenghao Yang — P-Norm Hyper-Flow Diffusion

Presenting today @ TU Eindhoven

I am presenting today my work on semi-supervised learning with higher-order graph data at the CASA Colloquium at Eindhoven University of Technology (NL). Thanks Stefano Massei for the kind invitation!

Presenting today @ Joint GSSI-Sapienza AI seminar

I’m excited to open the joint GSSI-Sapienza seminar series on Artificial Intelligence today with a talk on nonlinear diffusion methods for semi-supervised learning, joint work with Austin Benson and Konstantin Prokopchik. Thanks Giacomo Gradenigo for the kind invitation!


New paper out

Nonlinear Feature Diffusion on Hypergraphs

Abstract: Hypergraphs are a common model for multiway relationships in data, and hypergraph semi-supervised learning is the problem of assigning labels to all nodes in a hypergraph, given labels on just a few nodes. Diffusions and label spreading are classical techniques for semi-supervised learning in the graph setting, and there are some standard ways to extend them to hypergraphs. However, these methods are linear models, and do not offer an obvious way of incorporating node features for making predictions. ... Read more
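For context, here is a minimal sketch of the classical linear label-spreading iteration on an ordinary graph, the kind of baseline that the hypergraph method in the paper generalizes; it is not the nonlinear feature-diffusion scheme itself.

```python
# Minimal sketch of classical *linear* label spreading on a graph, the
# baseline that the hypergraph/nonlinear method in the paper generalizes.
# S is the symmetrically normalized adjacency, Y the one-hot seed labels.
import numpy as np

def label_spreading(A, Y, alpha=0.9, tol=1e-8, maxit=1000):
    d = A.sum(axis=1)
    Dinv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = Dinv_sqrt @ A @ Dinv_sqrt
    F = Y.astype(float).copy()
    for _ in range(maxit):
        F_new = alpha * (S @ F) + (1 - alpha) * Y   # diffuse, then re-inject seeds
        if np.linalg.norm(F_new - F) < tol:
            return F_new
        F = F_new
    return F

# Toy graph: two triangles joined by one edge; one labeled node per class.
A = np.zeros((6, 6))
for i, j in [(0,1),(1,2),(0,2),(3,4),(4,5),(3,5),(2,3)]:
    A[i, j] = A[j, i] = 1
Y = np.zeros((6, 2)); Y[0, 0] = 1; Y[5, 1] = 1
print(label_spreading(A, Y).argmax(axis=1))   # predicted class per node
```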

New paper out

Node and Edge Eigenvector Centrality for Hypergraphs

Abstract: Network scientists have shown that there is great value in studying pairwise interactions between components in a system. From a linear algebra point of view, this involves defining and evaluating functions of the associated adjacency matrix. Recent work indicates that there are further benefits from accounting directly for higher order interactions, notably through a hypergraph representation where an edge may involve multiple nodes. Building on these ideas, we motivate, define and analyze a class of spectral centrality measures for identifying important nodes and hyperedges in hypergraphs, generalizing existing network science concepts. ... Read more
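As a toy illustration of the coupled node/edge viewpoint, here is a minimal sketch of a power-type iteration driven by the hypergraph incidence matrix; the specific nonlinearities are chosen arbitrarily for the example and are not necessarily those analyzed in the paper.

```python
# Minimal sketch of a coupled node/hyperedge centrality iteration driven by
# the hypergraph incidence matrix B (B[i, e] = 1 if node i is in hyperedge e).
# The update and the nonlinearity below are illustrative choices only.
import numpy as np

def node_edge_centrality(B, p=2.0, tol=1e-10, maxit=1000):
    n, m = B.shape
    x, y = np.ones(n), np.ones(m)
    for _ in range(maxit):
        x_new = (B @ y) ** (1.0 / p)    # nodes score high if they sit in important edges
        y_new = (B.T @ x) ** (1.0 / p)  # edges score high if they contain important nodes
        x_new /= np.linalg.norm(x_new)
        y_new /= np.linalg.norm(y_new)
        if max(np.linalg.norm(x_new - x), np.linalg.norm(y_new - y)) < tol:
            return x_new, y_new
        x, y = x_new, y_new
    return x, y

# Toy hypergraph on 4 nodes with hyperedges {0,1,2}, {2,3}, {0,3}.
B = np.array([[1, 0, 1], [1, 0, 0], [1, 1, 0], [0, 1, 1]], dtype=float)
x, y = node_edge_centrality(B)
print(x.round(3), y.round(3))
```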

--- Different nonlinear eigenvector centrality models recognize different important nodes on the nonuniform sunflower.

Paper accepted at The Web Conference 2021

I am very happy to hear that our paper Nonlinear Higher-order Label Spreading – with Austin Benson and Konstantin Prokopchik – has been accepted to the proceedings of this year’s WWW conference.

Presenting today @ Cornell

Presenting today my work on nonlinear eigenvectors and nonlinear Perron-Frobenius theory at the SCAN seminar at Cornell University (USA). Thanks Austin Benson for the kind invitation!

Presenting today @ UniPisa

Presenting today our work on semi-supervised learning at the NumPi seminar at University of Pisa (Italy), joint work with Austin Benson and Konstantin Prokopchik. Thanks Leo Robol for the kind invitation!

Paper accepted in ESAIM: Math Modelling and Num Analysis

Happy that our paper Nonlocal PageRank, joint work with Stefano Cipolla (Edinburgh) and Fabio Durastante (Pisa), has been accepted for publication in ESAIM: Mathematical Modelling and Numerical Analysis.

Minitutorial @ SIAM LA 2021

I am very excited to be giving a minitutorial on Applied Nonlinear Perron–Frobenius Theory at the SIAM Conference on Applied Linear Algebra (LA21).
I will present the tutorial together with Antoine Gautier.

We will introduce the concept of multihomogeneous operators and we will present the state-of-the-art version of the nonlinear Perron-Frobenius theorem for nonnegative nonlinear mappings. We will discuss several numerical optimization implications connected to nonlinear and higher-order versions of the Power and the Sinkhorn methods and several open challenges, both from the theoretical and the computational viewpoints. We will also discuss numerous problems in data mining, machine learning and network science which can be cast in terms of nonlinear eigenvalue problems with eigenvector nonlinearities and we will show how the nonlinear Perron-Frobenius theory can help solve them.

Editor for SIAM Review

I have accepted an invitation to serve as associate editor for the Survey & Review section of SIAM Review (SIREV), the flagship section of one of the highest-impact applied mathematics journals. Excited and looking forward to starting!

Paper accepted in SIAM J Math of Data Science

Excited that our paper Ergodicity coefficients for higher-order stochastic processes, joint work with Dario Fasino, has been accepted in the SIAM Journal on Mathematics of Data Science.

Talk @ SIAM Imaging Science Conference

Last day of the first virtual SIAM Imaging Science conference today. I am presenting a talk at the minisymposium Nonlinear Spectral Analysis with Applications in Imaging and Data Science organized by Leon Bungert (Friedrich-Alexander Universitaet Erlangen-Nuernberg, Germany), Guy Gilboa (Technion Israel Institute of Technology, Israel) and Ido Cohen (Israel Institute of Technology, Israel).

Here are the title and abstract of my talk:

Nodal Domain Theorem for the p-Laplacian on Graphs and the Related Multiway Cheeger Inequality
We consider the p-Laplacian on discrete graphs, a nonlinear operator that generalizes the standard graph Laplacian (obtained for p=2). We study a set of variational eigenvalues of this operator and analyze the nodal domain count of the corresponding eigenfunctions. In particular, we show that Courant’s famous nodal domain theorem for the linear Laplacian carries over almost unchanged to the nonlinear case. Moreover, we use the nodal domains to prove a higher-order Cheeger inequality that relates the k-way graph cut to the k-th variational eigenvalue of the p-Laplacian.
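For reference, a standard way to write the graph p-Laplacian eigenpair equation that the talk refers to is the following; the conventions for weights and normalization may differ slightly from those used in the slides.

```latex
% Graph p-Laplacian eigenpair (standard unnormalized convention):
% a pair (\lambda, x) with x \neq 0 such that, for every node i,
\[
(\Delta_p x)_i \;=\; \sum_{j \sim i} w_{ij}\,|x_i - x_j|^{p-2}(x_i - x_j)
\;=\; \lambda\, \mu_i\, |x_i|^{p-2} x_i ,
\]
% where w_{ij} are edge weights, \mu_i node weights, and p > 1.
% For p = 2 this reduces to the familiar linear eigenproblem L x = \lambda M x;
% nodal domains are the maximal connected subgraphs on which x has constant sign.
```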

Below you can find my slides, in case you wish to have a look at them.

Link to slideshare presentation: Nodal Domain Theorem for the p-Laplacian on Graphs and the Related Multiway Cheeger Inequality

SIAM NS happening virtually on July 9 and 10

I will present my first ever virtual poster at the first official virtual SIAM Network Science workshop!

Free registration — Tweet feed #SIAMNS20 — More info and schedule: https://ns20.cs.cornell.edu/

Des Higham will present our work on higher-order eigenvector-based network coefficients on July 10, 9am Pacific Time (5pm UK, 6pm EU)

My poster session room will be on nonlinear eigenvector centralities and will be open for 45 minutes starting at 4pm Pacific Time (midnight UK, 1am EU). Lots of coffee is planned for that day. You may wish to have a look at my poster:


MSCA Day @ GSSI

We are organizing a “Marie Skłodowska Curie Action Day” virtual event to illustrate some fundamental aspects of the Horizon 2020 MSC fellowships. We will discuss some of the application rules and evaluation criteria, explain how we think a successful application should be written, and share personal experiences as recipients and supervisors of MSC individual fellowships.

This event has been promoted and coordinated by my amazing colleague Elisabetta Baracchini

The event will be held virtually on July 2, 9am — 1pm (Italian CET time) via this Zoom meeting room. Details on the program can be found here. Participation is open to everyone and is totally free.

Virtual minisymposium @ SIAM MDS

Michael Schaub, Santiago Segarra and I are organizing a virtual minisymposium on Learning from data on networks within the SIAM Conference on Mathematics of Data Science 2020, happening virtually during the whole month of June. See also the conference’s virtual program.

Our mini will take place on June 30, starting at 10:00 am Eastern Time (Boston)
[7am California, 9am Texas, 3pm UK, 4pm EU, 10pm China]

For more details and to register to join the event online (free of charge), please see the minisymposium webpage.

Abstract

Modern societies increasingly depend on complex networked systems to support our daily routines. Electrical energy is delivered by the power grid; the Internet enables almost instantaneous world-wide interactions; our economies rest upon a complex network of inter-dependencies spanning the globe. Networks are ubiquitous in complex biological, social, engineering, and physical systems. Understanding structures and dynamics defined over such networks has thus become a prevalent challenge across many disciplines. A recurring question which appears in a wide variety of problems is how one can exploit the interplay between the topological structure of the system and available measurements at the nodes (or edges) of the networks. The goal of this minisymposium is to bring together researchers from different mathematical communities – from network science, machine learning, statistics, signal processing and optimization – to discuss and highlight novel approaches to understand and learn from data defined on networks.

Speakers:

  • Michael Schaub — Learning from graphs and data on networks: overview and outlook
  • Caterina De Bacco — Incorporating node attributes in community detection for multilayer networks
  • Danai Koutra — The Power of Summarization in Network Representation Learning (and beyond)
  • Ekaterina Rapinchuk — Applications of Auction Dynamics to Data Defined on Networks
  • David Gleich — Nonlinear processes on networks
  • Jan Overgoor — Choosing To Grow a Graph: Modeling Network Formation as Discrete Choice

New paper out

Nonlinear Higher-Order Label Spreading

Abstract: Label spreading is a general technique for semi-supervised learning with point cloud or network data, which can be interpreted as a diffusion of labels on a graph. While there are many variants of label spreading, nearly all of them are linear models, where the incoming information to a node is a weighted sum of information from neighboring nodes. Here, we add nonlinearity to label spreading through nonlinear functions of higher-order structure in the graph, namely triangles in the graph. ... Read more

--- Accuracy of Nonlinear Higher-order Label Spreading on synthetic stochastic block models. Table entries are the average accuracy over 10 random instances.

Data Science Open Day @ Uni of Padua

Excited to take part today in the Open House event for the Master’s Degree in Data Science at the Department of Mathematics of the University of Padova. I will give a high-level introduction to the problem of link prediction in networks and show how to use PageRank eigenvectors to compute a mathematically informed prediction. The live streaming of the event is available on YouTube.

New paper out

Ergodicity coefficients for higher-order stochastic processes

Abstract: The use of higher-order stochastic processes such as nonlinear Markov chains or vertex-reinforced random walks has grown significantly in recent years, as they are much better at modeling high-dimensional data and nonlinear dynamics in numerous application settings. In many cases of practical interest, these processes are identified with a stochastic tensor, and their stationary distribution is a tensor Z-eigenvector. However, fundamental questions such as the convergence of the process towards a limiting distribution and the uniqueness of such a limit are still not well understood and are the subject of a rich recent literature. ... Read more
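To fix ideas, here is a minimal sketch of the fixed-point iteration associated with the tensor Z-eigenvector equation for a second-order process; as the abstract stresses, this iteration need not converge in general, so the code is only an illustration of the equation, not a reliable solver.

```python
# Minimal sketch: fixed-point iteration x <- P[x, x] for the stationary
# distribution of a second-order (three-mode) stochastic process, where
# P[i, j, k] = Prob(next = i | current = j, previous = k) and
# P[x, x]_i = sum_{j,k} P[i, j, k] x_j x_k.  Convergence and uniqueness
# of the limit are not guaranteed in general; illustration only.
import numpy as np

def stationary_z_eigenvector(P, tol=1e-12, maxit=5000):
    n = P.shape[0]
    x = np.full(n, 1.0 / n)
    for _ in range(maxit):
        x_new = np.einsum('ijk,j,k->i', P, x, x)
        x_new /= x_new.sum()                     # keep it a probability vector
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

# Toy stochastic tensor: entries over the first index sum to one.
rng = np.random.default_rng(3)
P = rng.uniform(size=(4, 4, 4))
P /= P.sum(axis=0, keepdims=True)
print(stationary_z_eigenvector(P).round(4))
```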

One World seminars

The COVID-19 pandemic resulted in the mass cancellation of in-person conferences and seminars across the globe. Wonderful initiatives have emerged in response to this unfortunate situation. For example, many scientific communities worldwide have started “One World” online seminar series, and several conference committees are working to put forward online versions of traditional meetings.

Here I would like to list the initiatives related to my research interests that I am aware of. If you know of any other online meeting I have missed, please do let me know!


| Acronym | Title | When | Platform |
|---------|-------|------|----------|
| OWML | One World Seminar Series on the Mathematics of Machine Learning | Wednesdays @ 12 noon ET (UTC-4) | Zoom |
| OWSP | One World Signal Processing Seminar | Fridays | Zoom |
| MADS | Mathematical Methods for Arbitrary Data Sources | Mondays @ 2pm CET (UTC+2) | Zoom |
| E-NLA | Online seminar series on Numerical Linear Algebra | Wednesdays @ 4pm CET (UTC+2) | Zoom |
| MINDS | One World Mathematics of INformation, Data, and Signals Seminar | Thursdays @ 2:30pm EDT (UTC-4) | |
| OPT | One World Optimization Seminar | Mondays @ 3pm CEST (UTC+2) | Zoom |
| IMAGINE | Imaging & Inverse Problems | Wednesdays @ 4pm CET (UTC+2) | Zoom |
| GAMENET | One World Mathematical Game Theory Seminar | Mondays @ 3pm CEST (UTC+2) | Zoom |
| PROB | One World Probability Seminar | Weekends @ 3-4pm CEST (UTC+2) | Zoom |

Paper accepted in SIAM Applied Mathematics

Our paper Total variation based community detection using a nonlinear optimization approach, joint work with Andrea Cristofari and Francesco Rinaldi from the University of Padua, has been accepted in the SIAM Journal on Applied Mathematics.

Visit and Talk @ Uni Kent

I am traveling today to visit and give a talk at the Pure, Applicable and Numerical Mathematics seminar at the University of Kent, Canterbury (UK). Thanks Bas Lemmens and Marina Iliopoulou for the invitation and for hosting me!

New paper out

The Global Convergence of the Nonlinear Power Method for Mixed-Subordinate Matrix Norms

Abstract: We analyze the global convergence of the power iterates for the computation of a general mixed-subordinate matrix norm. We prove a new global convergence theorem for a class of entrywise nonnegative matrices that generalizes and improves a well-known result for mixed-subordinate $\ell^p$ matrix norms. In particular, exploiting the Birkhoff–Hopf contraction ratio of nonnegative matrices, we obtain novel and explicit global convergence guarantees for a range of matrix norms whose computation has recently been proven to be NP-hard in the general case, including the mixed-subordinate norms induced by vector norms formed as the sum of different $\ell^p$-norms of subsets of entries. ... Read more
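Here is a minimal sketch of the classical nonlinear power iteration for $\|A\|_{p\to q} = \max_{x\neq 0} \|Ax\|_q / \|x\|_p$, written for an entrywise nonnegative $A$ and a positive starting vector, which is the setting of the global convergence result; details such as stopping rules and exponent handling may differ from the scheme analyzed in the paper.

```python
# Minimal sketch of the nonlinear power iteration for ||A||_{p->q},
# written for entrywise nonnegative A and a positive starting vector
# (so all sign terms are positive and can be dropped).
import numpy as np

def mixed_norm_power(A, p=3.0, q=2.0, tol=1e-12, maxit=10_000):
    pd = p / (p - 1.0)                     # dual exponent p'
    x = np.ones(A.shape[1])
    x /= np.linalg.norm(x, p)
    for _ in range(maxit):
        g = A.T @ (A @ x) ** (q - 1.0)     # gradient direction of ||Ax||_q^q
        x_new = g ** (pd - 1.0)            # dual map back toward the p-sphere
        x_new /= np.linalg.norm(x_new, p)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return np.linalg.norm(A @ x, q), x     # norm estimate and maximizer

A = np.abs(np.random.default_rng(4).standard_normal((5, 7)))
est, x = mixed_norm_power(A)
print(est)
```

For p = q = 2 this reduces to the ordinary power method for the spectral norm, which is a quick sanity check for the implementation.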

Talk @ Rutherford Appleton Laboratory

I am in Oxford (UK) today, giving a talk at the joint Rutherford Appleton Laboratory and University of Oxford Numerical Analysis group seminar on Computational Mathematics and Applications. Thank you Michael Wathen and Tyrone Rees for the invitation!

Paper accepted in Proc Royal Society A

Our paper A framework for second order eigenvector centralities and clustering coefficients, joint work with Francesca Arrigo and Des Higham, has been accepted in the Proceedings of the Royal Society Series A.

Doctoral course @ Uni Padua

Starting March 1, I will be visiting the University of Padua to teach the doctoral course Eigenvector methods for learning from data on networks for the PhD program in Computational Mathematics. You can use this link if you wish to enroll in the course. Thanks Michela for the invitation!

Plenary talk @ HHXXI

I have been invited to give a plenary talk this summer at the Householder Symposium XXI. You can read the abstract of my talk in the book of abstracts. Looking forward to this exciting opportunity!

New paper out

Nonlocal PageRank

Abstract: In this work we introduce and study a nonlocal version of the PageRank. In our approach, the random walker explores the graph using longer excursions than just moving between neighboring nodes. As a result, the corresponding ranking of the nodes, which takes into account a long-range interaction between them, does not exhibit concentration phenomena typical for spectral rankings taking into account just local interactions. We show that the predictive value of the rankings obtained using our proposals is considerably improved on different real world problems. ... Read more
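As a toy illustration of the idea, here is a minimal sketch that builds a distance-decaying, row-stochastic "nonlocal" transition matrix and runs the usual PageRank iteration on it; the power-law decay used here is just one simple choice and may differ from the smoothing functions studied in the paper.

```python
# Minimal sketch: PageRank on a "nonlocal" transition matrix in which the
# walker can jump to any reachable node with probability decaying in the
# shortest-path distance (here a power-law decay d^(-alpha)).  Illustrative
# choice only; networkx is used just to get pairwise distances on a toy graph.
import numpy as np
import networkx as nx

def nonlocal_pagerank(G, alpha_decay=2.0, damping=0.85, tol=1e-12, maxit=1000):
    nodes = list(G.nodes())
    n = len(nodes)
    D = dict(nx.all_pairs_shortest_path_length(G))
    W = np.zeros((n, n))
    for a, i in enumerate(nodes):
        for b, j in enumerate(nodes):
            if i != j and j in D[i]:
                W[a, b] = D[i][j] ** (-alpha_decay)   # nonlocal, distance-decaying weight
    P = W / W.sum(axis=1, keepdims=True)              # row-stochastic transition matrix
    x = np.full(n, 1.0 / n)
    for _ in range(maxit):
        x_new = damping * (P.T @ x) + (1 - damping) / n
        if np.linalg.norm(x_new - x, 1) < tol:
            return dict(zip(nodes, x_new))
        x = x_new
    return dict(zip(nodes, x))

print(nonlocal_pagerank(nx.path_graph(6)))
```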

Konstantin successfully passed his preliminary PhD exam today. Congratulations!
