Francesco Tudisco

Associate Professor

School of Mathematics
Numerical Analysis and Data Science Group
GSSI Gran Sasso Science Institute
Viale Francesco Crispi 7 — 67100 — L’Aquila (Italy)
email: francesco dot tudisco at gssi dot it

New paper out

Quantifying the structural stability of simplicial homology

Abstract: The homology groups of a simplicial complex reveal fundamental properties of the topology of the data or the system, and the notion of topological stability naturally raises an important yet not fully investigated question. In this work, we study stability in terms of the smallest perturbation sufficient to change the dimension of the corresponding homology group. Such a definition requires an appropriate weighting and normalizing procedure for the boundary operators acting on the Hodge algebra’s homology groups. ... Read more

--- Continuous and discretized manifold with a 1-dimensional hole.

Paper accepted in Applied and Computational Harmonic Analysis

Excited that our paper Nodal domain count for the generalized p-Laplacian – with Piero Deidda and Mario Putti – has been accepted for publication in Applied and Computational Harmonic Analysis. Among the main results, we prove that the eigenvalues of the p-Laplacian on a tree are all variational (and thus there are exactly n of them), we show that the number of nodal domains of the p-Laplacian on general graphs can be bounded both from above and from below, and we deduce that the higher-order Cheeger inequality is tight on trees.
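For context, a common unweighted, potential-free form of the graph p-Laplacian eigenproblem reads as follows (this is an illustrative textbook form; the generalized operator studied in the paper may carry additional weights and a potential term):

```latex
% Illustrative standard form; the paper's generalized operator may differ.
(\Delta_p f)(u) \;=\; \sum_{v \sim u} |f(u) - f(v)|^{p-2}\,\big(f(u) - f(v)\big),
\qquad
(\Delta_p f)(u) \;=\; \lambda\, |f(u)|^{p-2} f(u).
```

A nodal domain of an eigenfunction f is a maximal connected subgraph on which f has constant sign; the accepted paper bounds the number of such domains from above and below.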

Arturo De Marinis from our team is attending the XMaths workshop at the University of Bari this week, presenting preliminary results of our work on the stability of neural dynamical systems.

I am giving an invited lecture today on Low-parametric deep learning at the Data Science in Action day organized by the University of Padua. You can find the slides of my talk here.


Presenting today @ Örebro University

I am presenting today my work on fast and efficient neural network training via low-rank gradient flows at the Research Seminars in Mathematics at the School of Science and Technology, Örebro University (Sweden). Thanks Andrii Dmytryshyn for the kind invitation!

Steffen Schotthöfer and Emanuele Zangrando from our lab are attending the NeurIPS conference this week in person and will present our work on low-rank training and pruning of neural networks. In this work we developed a framework to perform stable and efficient training on low-rank manifolds, resulting in an order of magnitude less memory cost and training time! Tested successfully on ImageNet1K, transformers, and several other benchmarks.

If you are there too, swing by our poster session at Hall J #604 on Wed 30 Nov, 9:30 am PST.


Emanuele Zangrando is presenting today our work on dynamical low-rank training of artificial neural networks at the SCDM seminar at Karlsruhe Institute of Technology.

Presenting today @ Texas A&M University

I am presenting today my work on the generalized $p$-Laplacian on graphs at the Mathematical Physics and Harmonic Analysis Seminar at Texas A&M University. Thanks Gregory Berkolaiko for the kind invitation!

Paper accepted @ NeurIPS 2022

Thrilled to hear that our paper on Low-rank lottery tickets has been accepted at NeurIPS 2022! We propose a method to speed up and reduce the memory footprint of the training phase (as well as the inference phase) of fully connected and convolutional NNs by interpreting the training process as a gradient flow and integrating the corresponding ODE directly on the manifold of low-rank matrices.
It has been a wonderful collaboration among a fantastic team, and it would not have been possible without the excellent work of the two PhD students Emanuele Zangrando and Steffen Schotthöfer.
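As a rough illustration of the idea (this is a minimal numpy sketch of low-rank training on a toy problem, not the algorithm from the paper; the sizes, data, and variable names are invented for illustration), one can parametrize a linear layer by factors W = UV and take explicit Euler steps of the gradient flow in the factors, so every iterate stays on the rank-r manifold by construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: the target map is exactly rank 2, so a rank-2
# factorized layer W = U @ V can fit it. All sizes are invented.
n, m, r = 20, 15, 2
W_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
X = rng.standard_normal((100, n))
Y = X @ W_true

# Small initialization of the factors. The trainable parameters are U and V
# only, so W = U @ V never leaves the rank-r manifold.
U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((r, m))

def loss(U, V):
    E = X @ (U @ V) - Y
    return 0.5 * np.sum(E ** 2) / X.shape[0]

loss0 = loss(U, V)

# Explicit Euler discretization of the gradient flow of the factors:
# U' = -G V^T, V' = -U^T G, where G = dL/dW is the full-space gradient.
lr = 0.02
for _ in range(3000):
    G = X.T @ (X @ (U @ V) - Y) / X.shape[0]
    U, V = U - lr * G @ V.T, V - lr * U.T @ G
```

The point of the factorized parametrization is that memory and per-step cost scale with r(n + m) instead of nm; the paper's contribution concerns doing this stably and rank-adaptively during training, which this sketch does not attempt.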


New paper out

Nonlinear Spectral Duality

Abstract: Nonlinear eigenvalue problems for pairs of homogeneous convex functions are particular nonlinear constrained optimization problems that arise in a variety of settings, including graph mining, machine learning, and network science. By considering different notions of duality transforms from both classical and recent convex geometry theory, in this work we show that one can move from the primal to the dual nonlinear eigenvalue formulation while keeping the spectrum, the variational spectrum, and the corresponding multiplicities unchanged. ... Read more
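Schematically (in notation supplied here for illustration, not taken from the paper), for a pair of positively homogeneous convex functions f and g of the same degree, the eigenpairs are the critical points of the Rayleigh quotient:

```latex
% Illustrative notation, not taken verbatim from the paper.
\nabla f(x) = \lambda\, \nabla g(x), \qquad x \neq 0,
\qquad \text{with } \lambda = \frac{f(x)}{g(x)}
\ \text{at critical points of } x \mapsto \frac{f(x)}{g(x)},
```

where the identity $\lambda = f(x)/g(x)$ follows from Euler's theorem for homogeneous functions. The duality result says that the analogous problem for the dual (conjugate-type) pair has the same spectrum, variational spectrum, and multiplicities.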

Our report on the XXI Householder Symposium on Numerical Linear Algebra appeared today in SIAM News. It was a great meeting that I really enjoyed!

Paper accepted in the Journal of Complex Networks

Excited that our paper A Variance-aware Multiobjective Louvain-like Method for Community Detection in Multiplex Networks – with Sara Venturini, Andrea Cristofari, and Francesco Rinaldi – has been accepted for publication in the Journal of Complex Networks, Oxford University Press.

Paper accepted in the European Journal of Applied Mathematics

Happy that our paper Hitting times for second-order random walks, joint work with Arianna Tonetto and Dario Fasino (University of Udine), has been accepted for publication in the European Journal of Applied Mathematics, Cambridge University Press.

Open postdoc position GSSI-SNS

We are looking for a postdoctoral research associate to join our group on a joint project with Michele Benzi from Scuola Normale Superiore in Pisa. The postdoctoral fellow will work on topics at the interface between Numerical Methods and Machine Learning and will be funded by the MUR-Pro3 grant “STANDS - Numerical STAbility of Neural Dynamical Systems”. The official call for applications will open soon. For more details and to express your interest, please refer to this form.

New paper out

Low-rank lottery tickets: finding efficient low-rank neural networks via matrix differential equations

Abstract: Neural networks have achieved tremendous success in a large variety of applications. However, their memory footprint and computational demand can render them impractical in application settings with limited hardware or energy resources. In this work, we propose a novel algorithm to find efficient low-rank subnetworks. Remarkably, these subnetworks are determined and adapted already during the training phase, and the overall time and memory resources required to both train and evaluate them are significantly reduced. ... Read more

--- By re-interpreting the weight-update phase as a time-continuous process we directly perform training within the manifold of low-rank matrices.

Presenting today @ USTC Hefei

I am presenting today my work on the generalized $p$-Laplacian on graphs at the Spectral Geometry Seminar at the University of Science and Technology of China in Hefei. Thanks Shiping Liu for the kind invitation!

Paper accepted @ KDD 2022

More great news! Our paper Core-periphery partitioning and quantum annealing – with Catherine Higham and Desmond Higham – has been accepted to the proceedings of this year’s ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.

Paper accepted @ ICML 2022

I am very happy that our paper Nonlinear Feature Diffusion on Hypergraphs – with Austin Benson and Konstantin Prokopchik – has been accepted to the proceedings of this year’s ICML. Congrats to my student Konstantin for one more important achievement!

Numerical Methods for Compression and Learning

Nicola Guglielmi and I are organizing the workshop Numerical Methods for Compression and Learning at GSSI this week. The workshop will take place in the Main Lecture Hall in the Orange Building and will feature lectures from invited speakers as well as poster sessions open to all participants.

Excited to host great colleagues and looking forward to exciting talks!

Online participation will be possible via the zoom link: https://us02web.zoom.us/j/83830006962?pwd=SmI1MTVKRTllU3dBR01Ybko5bzBJdz09

As the amount of available data grows ever faster, the ability to handle and exploit very-large-scale data efficiently and robustly is becoming increasingly relevant. This workshop aims to bring together experts from signal processing, compressed sensing, low-rank methods, and machine learning, with the goal of highlighting modern approaches as well as challenges in computational mathematics arising in these areas and at their intersection.