Workshops & seminars

Mathematics & Statistics Departmental Research Seminar


Date & time
Friday, March 27, 2026
12 p.m. – 1 p.m.
Speaker(s)

Dr. Sina Mohammad-Taheri

Cost

This event is free

Where

J.W. McConnell Building
1400 De Maisonneuve Blvd. W.
Room 921-4

Accessible location

Yes

Title: Greedy sparse recovery under structure: weighted generalization and deep unrolling

Abstract: Sparsity is an ode to conciseness. Sparse recovery methods seek the most parsimonious representation consistent with limited data acquisitions, a problem typically formulated as finding a sparse solution — a solution with mostly zero (negligible) entries — to an underdetermined linear system of equations, possibly corrupted by noise. In contrast, deep neural networks rely on ever-larger parameter counts to discover emergent patterns. Deep learning architectures are jungles of affine and nonlinear functions optimized over enormous amounts of data; despite their approximation capabilities, this makes them extremely hard to interpret and analyze. It also raises serious safety concerns, especially in critical domains and sectors such as health, as such networks tend to "hallucinate" for lack of stability and recovery bounds.
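The abstract's core formulation — finding a sparse x satisfying y = Ax for an underdetermined A — can be illustrated with Orthogonal Matching Pursuit, a standard greedy solver of the kind the talk discusses. This is a minimal illustrative sketch, not the speaker's algorithm; the dimensions and data are made up for the demo.

```python
import numpy as np

def omp(A, y, s):
    """Orthogonal Matching Pursuit: greedily pick s columns of A that
    best explain y, then least-squares fit on the selected support."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(s):
        # Select the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares solve restricted to the chosen support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - A @ x
    return x

# Demo: recover a 3-sparse vector from 30 noiseless random measurements
# of a length-100 signal (m = 30 << n = 100, so the system is
# underdetermined; sparsity is what makes recovery possible).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100)) / np.sqrt(30)
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = omp(A, y, 3)
```

With Gaussian measurements and this level of undersampling, OMP recovers the true support and coefficients exactly, which is the kind of recovery guarantee (under suitable conditions on the forward operator A) the talk refers to.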

In this talk, focusing on the class of greedy sparse recovery solvers as efficient alternatives to convex methods, I will consider two sparse reconstruction setups, each in tandem with the latent structure in the data. First, when a priori knowledge about the signal is available, we incorporate it as "weights" by designing weighted algorithms, which outperform their unweighted counterparts according to several criteria. Second, when no such knowledge is provided, hidden structures in the data are learned using explainable deep learning models built on sparse recovery algorithms. To do so, we resolve the non-differentiability of greedy sparse recovery algorithms — caused by the argsort operator — by approximating their permutation matrices in a differentiable manner. Both settings come with recovery guarantees under suitable conditions on the sparse recovery model's forward operator, which marks a step towards addressing the stability of deep networks. These settings span a range of applications; of particular interest are those arising in medical signal analysis and function approximation.
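The non-differentiability the abstract mentions stems from argsort producing a hard permutation matrix, which has zero gradient almost everywhere. One known way to relax it — shown here as a NeuralSort-style sketch, which is illustrative and not necessarily the speaker's construction — replaces each one-hot row of the permutation matrix with a row-wise softmax that sharpens toward the hard sort as the temperature shrinks.

```python
import numpy as np

def soft_sort_matrix(s, tau=0.05):
    """Differentiable relaxation of the argsort permutation matrix
    (NeuralSort-style). Row i of the result softly selects the i-th
    largest entry of s; as tau -> 0 each row approaches a one-hot
    indicator and the matrix approaches the hard descending-sort
    permutation."""
    n = s.shape[0]
    pairwise = np.abs(s[:, None] - s[None, :])      # |s_j - s_k|
    c = n + 1 - 2 * np.arange(1, n + 1)             # n-1, n-3, ..., 1-n
    logits = (c[:, None] * s[None, :]
              - pairwise.sum(axis=1)[None, :]) / tau
    # Numerically stable row-wise softmax.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

s = np.array([0.3, 2.0, -1.0, 0.9])
P = soft_sort_matrix(s, tau=0.05)
# P @ s approximates s sorted in descending order: [2.0, 0.9, 0.3, -1.0]
```

Because every entry of P is a smooth function of s, gradients can flow through the sorting step of an unrolled greedy iteration, which is what enables training such models end to end.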


© Concordia University