David A. R. Robin

Postdoc at Dauphine University

Machine Learning PhD


Publications

  1. Stab-SGD: Noise-Adaptivity in Smooth Optimization with Stability Ratios

    NeurIPS 25, Neural Information Processing Systems, San Diego, 2025

    DAR Robin, K. Bakong, K. Scaman


    A variant of SGD that computes the stability ratio (relative noise level) of its gradient estimates to automatically derive a step-size shrinking schedule, with proofs of adaptivity for expected last-iterate loss values, matching nearly all of the best rates of SGD with noise-tuned schedulers.

    Full text :  [ OpenReview ]
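
    For intuition, here is a minimal Python sketch of a noise-adaptive SGD loop in this spirit. It is not the paper's algorithm: the grad_samples oracle, the stability-ratio estimate and the shrink rule are simplified, hypothetical stand-ins for the quantities defined there.

      import numpy as np

      def stab_sgd(grad_samples, x0, eta0=0.1, n_steps=1000, k=8):
          # grad_samples(x, k): hypothetical oracle returning k independent
          # stochastic gradient estimates at x, as a (k, d) array.
          x = np.array(x0, dtype=float)
          noise_seen = 0.0
          for _ in range(n_steps):
              g = grad_samples(x, k)
              mean_g = g.mean(axis=0)
              signal = float(mean_g @ mean_g)            # squared norm of the mean gradient
              total = float((g * g).sum(axis=1).mean())  # mean squared gradient norm
              ratio = signal / (total + 1e-12)           # crude stability ratio, in [0, 1]
              noise_seen += 1.0 - ratio                  # accumulated relative noise
              eta = eta0 / (1.0 + noise_seen)            # shrinks only as noise accumulates
              x -= eta * mean_g
          return x

    With this rule the sketch keeps a near-constant step size while gradients are noiseless (ratio near one) and falls back to a 1/t-type schedule once noise dominates, which is the kind of adaptive behavior the paper proves rates for.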

  2. Random Sparse Lifts: Construction, Analysis and Convergence of finite sparse networks

    ICLR 24, International Conference on Learning Representations, Vienna, 2024

    DAR Robin, K. Scaman, M. Lelarge


    Proof of convergence of finite-width multi-layer networks (and transformer-like architectures) to arbitrarily low loss values by gradient flow, when initialization is diverse and sparse enough. This shows that Probably-Approximately-Correct learning is a type of structural guarantee achievable for large neural networks of essentially any architecture.

    Full text :  [ OpenReview ]

  3. Convergence beyond the overparameterized regime with Rayleigh quotients

    NeurIPS 22, Neural Information Processing Systems, New Orleans, 2022

    DAR Robin, K. Scaman, M. Lelarge


    Proof of convergence of two-layer neural networks of finite width to arbitrarily low loss values under gradient flow. Free of over-parameterization assumptions, and thus stronger than infinite-width simplifications, the result is obtained by integrating Kurdyka-Łojasiewicz inequalities, a technique for showing optimal convergence even without convexity.

    Full text & code :  [ OpenReview ] [ Github ]
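
    For reference, the standard Łojasiewicz-type inequality behind this technique reads (the paper's Rayleigh-quotient variant may differ in form):

      \|\nabla f(x)\| \;\ge\; c\,\bigl(f(x) - f^*\bigr)^{\theta}, \qquad \theta \in [1/2, 1),

    so that along the gradient flow \dot{x}_t = -\nabla f(x_t),

      \frac{d}{dt}\bigl(f(x_t) - f^*\bigr) = -\|\nabla f(x_t)\|^2 \le -c^2\,\bigl(f(x_t) - f^*\bigr)^{2\theta},

    and integrating this differential inequality drives f(x_t) to f^* at an explicit rate, with no convexity assumption (linearly when \theta = 1/2).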

  4. Periodic Signal Recovery with Regularized Sine Neural Networks

    NeurIPS 22, Neural Information Processing Systems, NeurReps Workshop, New Orleans, 2022

    DAR Robin, K. Scaman, M. Lelarge


    Neural networks fail to learn periodic functions of unknown frequency, even with sine-like activations, despite previously claimed fixes. The obstructions identified include the need for a more diverse (non-vanishing, high-variance) initialization and for non-convex sparsity-promoting regularization. With both, recovery is perfect far outside the training interval.

    Full text & code :  [ OpenReview ] [ Github ]
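
    A minimal PyTorch sketch combining the two ingredients; the freq_scale initialization and the log-type penalty are illustrative choices, not the paper's exact ones.

      import torch

      class SineNet(torch.nn.Module):
          # Two-layer sine network: x -> a . sin(w x + b).
          def __init__(self, width=64, freq_scale=20.0):
              super().__init__()
              # High-variance frequency init: diverse, non-vanishing frequencies.
              self.w = torch.nn.Parameter(freq_scale * torch.randn(width))
              self.b = torch.nn.Parameter(2 * torch.pi * torch.rand(width))
              self.a = torch.nn.Parameter(torch.zeros(width))

          def forward(self, x):  # x: (n, 1)
              return torch.sin(x * self.w + self.b) @ self.a

      def sparsity_penalty(a, eps=1e-3):
          # Non-convex (log-sum) penalty, pushing most output weights to zero.
          return torch.log(1.0 + a.abs() / eps).sum()

      model = SineNet()
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)
      x = torch.linspace(-1, 1, 256).unsqueeze(1)
      y = torch.sin(7.3 * x).squeeze(1)  # periodic target, frequency unknown to the model
      for _ in range(5000):
          opt.zero_grad()
          loss = ((model(x) - y) ** 2).mean() + 1e-4 * sparsity_penalty(model.a)
          loss.backward()
          opt.step()

    Extrapolation is then tested by evaluating the trained model far outside [-1, 1].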

  5. Attacking and Fixing the Android Protected Confirmation Protocol

    Euro S&P 25, IEEE European Symposium on Security and Privacy, Venice, 2025

    M. Arapinis, V. Danos, M. Racouchot, DAR Robin, T. Zacharias


    Android's Protected Confirmation (APC) protocol exhibits two vulnerabilities in its communication with the Trusted Execution Environment, allowing a possible bypass of user consent, demonstrated on Google Pixel devices. Patching both yields a protocol proven correct, with the intended APC user-consent guarantees, in the Universal Composability framework.

    Full text :  [ HAL ][ CISPA Link ]

  6. Return-oriented programming on RISC-V

    ASIA CCS 20, ACM Asia Conference on Computer and Communications Security, Taipei, 2020

    GA Jaloyan, K. Markantonakis, RN Akram, DAR Robin, K. Mayes, D. Naccache


    Variable-length (prefix-code) machine instructions allow hiding malicious instructions behind unaligned jumps: sequences of long (32-bit) instructions are crafted so that their last 16 bits form either a valid instruction or a valid prefix, chaining into overlapping sequences that fool ROP gadget detectors. A tree-based detection method identifies them correctly.

    Full text :  [ ACM Link ][ ArXiv ]
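
    The underlying encoding fact is simple: the low two bits of a 16-bit RISC-V parcel determine the instruction length. Below is an illustrative Python sketch of an unaligned decode walk; the paper's gadget construction and tree-based detector are not reproduced here.

      def parcel_len(halfword: int) -> int:
          # RISC-V length rule: low two bits == 0b11 means a 32-bit
          # instruction, anything else a 16-bit compressed (RVC) one
          # (longer reserved encodings are ignored here).
          return 4 if (halfword & 0b11) == 0b11 else 2

      def decode_walk(code: bytes, offset: int = 0):
          # Walk a code buffer from `offset`, yielding (address, length).
          pc = offset
          while pc + 2 <= len(code):
              hw = int.from_bytes(code[pc:pc + 2], "little")
              n = parcel_len(hw)
              if pc + n > len(code):
                  break
              yield pc, n
              pc += n

    Walking the same bytes from offset 0 and from offset 2 (i.e. jumping into the middle of a 32-bit instruction) can produce two different, equally valid instruction streams; the hidden one is what linear-sweep gadget detectors miss.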

See all Publications & Patents

Latest Positions

  1. Postdoc : Adversarial Training

    July 2025 - present, Dauphine University, Paris. LAMSADE / MILES team

    with Yann Chevaleyre (LAMSADE, Dauphine) and Rafaël Pinot (LPSM, Jussieu)

  2. PhD in Mathematics

    Oct 2021 - Jun 2025, INRIA - ENS, Paris. DYOGENE / ARGO Project-team

    Advised by Marc Lelarge and Kevin Scaman

    Construction and convergence of provably-correct neural networks.

Teaching

  1. Teaching Assistant : Deep Learning

    Deep Learning (MAP583) course by Kevin Scaman (INRIA - ENS), École Polytechnique


    Practical introduction to deep learning and all implementation details, with a focus on covering a wide range of data domains and network architectures.

    Resources :  [ Synapses page ][ Practicals repository ][ Custom python package ]

  2. Guest Lecture : Neural network compression

    Deep Learning course by Marc Lelarge (INRIA - ENS), ENS Paris


    Introduction to neural network compression concepts and recent results, with a focus on activation reconstruction and a dedicated practical session.

    Resources :  [ Lecture slides ][ Practical Session ][ Practical Session Solution ]

View Resume