David A. R. Robin


PhD candidate at INRIA


  1. Research Internship : Hypentropic reparameterization

    Laboratoire de Mathématiques LMO, Orsay

    Oct 2020 - Jun 2021

    Research internship with Lénaïc Chizat (CNRS) on the implicit bias induced by gradient descent on two-layer neural networks. Characterized the limit point of the continuous-time dynamics as the Bregman projection, under the hyperbolic entropy potential, of the initialization weights onto the set of zero-loss weights, with linear convergence under some technical assumptions.

    Internship report :  [ pdf ]
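
    The objects involved can be sketched as follows (a sketch only: the normalization, signs, and constants may differ from the conventions of the report):

    ```latex
    % Hyperbolic entropy potential at scale \varepsilon > 0 (one common normalization):
    \phi_\varepsilon(w) = \sum_i \left( w_i \operatorname{arcsinh}\!\left(\tfrac{w_i}{\varepsilon}\right) - \sqrt{w_i^2 + \varepsilon^2} \right)

    % Associated Bregman divergence:
    D_{\phi_\varepsilon}(w, w') = \phi_\varepsilon(w) - \phi_\varepsilon(w')
      - \langle \nabla \phi_\varepsilon(w'),\, w - w' \rangle

    % Characterization of the limit point: the Bregman projection of the
    % initialization w_0 onto the zero-loss set of the loss L:
    w_\infty = \operatorname*{arg\,min}_{w \,:\, L(w) = 0} \; D_{\phi_\varepsilon}(w, w_0)
    ```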

  2. Research Internship : Clifford-valued networks

    Upstride SAS, Station F, Paris

    Feb - Aug 2020

    Research internship with Wilder Lopes exploring the computational efficiency of variational auto-encoders defined over Clifford algebras. Experimentally demonstrated superior reconstruction performance of networks leveraging higher-dimensional algebras on small images.
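
    The core primitive of such networks is multiplication in the algebra. As an illustration (not the internship's code), the quaternions realize the Clifford algebra Cl(0,2), the smallest case beyond complex numbers, and a quaternion-valued layer replaces scalar multiplication by the Hamilton product:

    ```python
    import numpy as np

    def hamilton_product(q, p):
        """Hamilton product of quaternions q, p given as arrays of shape (..., 4)
        with components (w, x, y, z). Quaternions realize the Clifford algebra
        Cl(0,2); a quaternion-valued layer uses this product in place of
        scalar multiplication, sharing parameters across the four components."""
        w1, x1, y1, z1 = np.moveaxis(q, -1, 0)
        w2, x2, y2, z2 = np.moveaxis(p, -1, 0)
        return np.stack([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,   # real part
            w1*x2 + x1*w2 + y1*z2 - z1*y2,   # i component
            w1*y2 - x1*z2 + y1*w2 + z1*x2,   # j component
            w1*z2 + x1*y2 - y1*x2 + z1*w2,   # k component
        ], axis=-1)
    ```

    The parameter sharing is where the efficiency argument comes from: a quaternion "weight" couples four real channels with 4 parameters instead of 16.
    
    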

  3. Research Internship : Neural network compression

    Technicolor AI Lab (acquired by Interdigital), San Francisco (CA)

    Feb - Aug 2019

    Research internship with Swayambhoo Jain on compression of neural networks. Developed a fast compression method able to cut up to 90% of weights with no drop in accuracy by casting layerwise compression as a series of convex activation reconstruction problems.

    Internship report :  [ html ][ pdf ][ slides ]
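
    The idea behind activation reconstruction can be sketched as follows (an illustrative sketch, not the internship's exact method): prune a layer by finding sparse weights whose outputs reconstruct the original layer's activations, an L1-regularized least-squares problem solvable by proximal gradient descent (ISTA).

    ```python
    import numpy as np

    def prune_layer(X, W, lam=0.05, steps=500):
        """Sparse reconstruction of a layer's pre-activations X @ W.

        Solves min_V 0.5 * ||X W - X V||_F^2 + lam * ||V||_1 by ISTA
        (proximal gradient descent); zeros of V are the pruned weights.
        """
        Y = X @ W                                  # target activations
        V = W.copy()
        step = 1.0 / np.linalg.norm(X, 2) ** 2     # 1/L with L = ||X||_2^2
        for _ in range(steps):
            G = X.T @ (X @ V - Y)                  # gradient of the quadratic term
            V = V - step * G
            # soft-thresholding: proximal operator of the L1 penalty
            V = np.sign(V) * np.maximum(np.abs(V) - step * lam, 0.0)
        return V

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 32))             # layer inputs
    W = rng.standard_normal((32, 16))
    W[rng.random(W.shape) < 0.7] = 0.0             # compressible ground truth
    V = prune_layer(X, W)
    sparsity = np.mean(np.abs(V) < 1e-3)
    ```

    Because the problem is convex, each layer can be compressed independently with a guarantee on the activation reconstruction error.
    
    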

  4. Research Internship : Optimal Transport

    Massachusetts Institute of Technology, Cambridge (MA)

    Jun - Aug 2018

    Research internship with Philippe Rigollet (MIT) on the reconstruction of cellular trajectories in gene-expression space with optimal transport. The resulting toolkit for single-cell RNA sequencing time-series analysis is open source and available as a Python package.

    Waddington Optimal Transport : broadinstitute/wot (diverged since)

    Internship report (in French) :  [ html ][ pdf ][ slides ]
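
    The trajectory reconstruction couples cell populations at consecutive timepoints with entropy-regularized optimal transport. A minimal Sinkhorn sketch (illustrative only, not the wot package's API):

    ```python
    import numpy as np

    def sinkhorn(a, b, C, eps=0.5, iters=2000):
        """Entropic optimal transport between histograms a and b under cost
        matrix C (Sinkhorn iterations). Returns a coupling P whose columns
        sum to b exactly and whose rows sum approximately to a."""
        K = np.exp(-C / eps)
        v = np.ones_like(b)
        for _ in range(iters):
            u = a / (K @ v)        # rescale to match row marginals
            v = b / (K.T @ u)      # rescale to match column marginals
        return u[:, None] * K * v[None, :]

    rng = np.random.default_rng(0)
    x0 = rng.standard_normal((5, 2))   # cells at time t (toy expression space)
    x1 = rng.standard_normal((7, 2))   # cells at time t+1
    C = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)  # squared distances
    a = np.full(5, 1 / 5)
    b = np.full(7, 1 / 7)
    P = sinkhorn(a, b, C)
    ```

    Each entry P[i, j] is then read as the probability mass transported from cell i at time t to cell j at time t+1, i.e. a soft ancestor-descendant assignment.
    
    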


  1. Guest Lecture : Neural network compression

    Deep Learning course by Marc Lelarge (INRIA - ENS), ENS Paris

    Introduction to neural network compression concepts and recent results, with a focus on activation reconstruction and an accompanying practical session.

    Resources :  [ Lecture slides ][ Practical Session ][ Practical Session Solution ]


  1. PhD in Mathematics

    Oct 2021 - present

    INRIA - ENS, Paris. DYOGENE Project-team

    Advised by Marc Lelarge and Kévin Scaman

    Reparameterizations of deep neural networks for structured data with symmetries.

  2. Diplôme de l'ENS (Info-Maths)

    Final year of the ENS curriculum

    École Normale Supérieure, Paris, 2020-2021

    Additional advanced courses on stochastic processes and algebraic geometry.

  3. M. Sc. Computer Science

    Mathématiques, Vision & Apprentissage (MVA)

    École Normale Supérieure, Paris, 2018-2020

    Advanced mathematics and computer science, focused on Machine Learning

    Coursework includes:

    • Category theory
    • Network modeling
    • Parallel programming
    • General Robotics
    • Convex optimization
    • Computer vision
    • Deep Learning
    • General Topology
    • Differential Geometry
    • Reinforcement Learning
    • Natural Language Processing
    • Optimal Transport
    • Graphical Models
    • Kernel Methods
  4. B. Sc. Computer Science

    École Normale Supérieure, Paris, 2017-2018

    Solid basis in modern mathematics and computer science.

    Coursework includes:

    • Mathematical Logic
    • Formal languages
    • Algebra
    • Cryptology
    • Information theory
    • λ-calculus and computability
    • Processor architectures
    • Operating Systems
    • Databases
    • Compilation
    • Randomized algorithms
    • Semantics and Verification

  5. Classes Préparatoires

    Lycée Louis-le-Grand, Paris, 2015-2017

    Post-secondary program in advanced mathematics and physics leading to the nationwide entrance examinations to the Grandes Écoles for scientific studies

  6. Baccalaureate in science

    Lycée Hoche, Versailles, 2015

    French equivalent of A-levels

    Awarded highest honours


  1. Raspberry Pi 3 64-bit OS

    UNIX-like 64-bit micro-kernel with MMU handling, dynamic memory allocation, hardware interrupts, multi-processing, and a basic filesystem for the Raspberry Pi 3, written before official 64-bit Linux support for the board

    Source code available on github: robindar/sysres-os

  2. SMT Solver

    Small SMT solver for equality theory decision procedures.

    Implements DPLL with two-watched literals, and is fully unit-tested.

    Source code available on github: robindar/semver-smt
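
    The skeleton of DPLL can be sketched in a few lines (unit propagation plus splitting; the repository's solver additionally uses the two-watched-literal scheme to make propagation fast, omitted here for brevity):

    ```python
    def dpll(clauses, assignment=None):
        """Minimal DPLL SAT solver sketch. Clauses are lists of nonzero
        ints, where -n denotes the negation of variable n. Returns a
        satisfying assignment as a dict {var: bool}, or None if UNSAT."""
        if assignment is None:
            assignment = {}
        changed = True
        while changed:                        # unit propagation to fixpoint
            changed = False
            simplified = []
            for clause in clauses:
                if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                    continue                  # clause already satisfied
                rest = [l for l in clause if abs(l) not in assignment]
                if not rest:
                    return None               # conflict: clause falsified
                if len(rest) == 1:            # unit clause forces a value
                    assignment[abs(rest[0])] = rest[0] > 0
                    changed = True
                else:
                    simplified.append(rest)
            clauses = simplified
        if not clauses:
            return assignment                 # all clauses satisfied
        var = abs(clauses[0][0])              # split on an unassigned variable
        for value in (True, False):
            result = dpll(clauses, {**assignment, var: value})
            if result is not None:
                return result
        return None
    ```
    
    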

  3. Rust compiler

    Compiler for a small (yet Turing-complete) subset of Rust.

    Borrow-checked and compiled down to x86 assembly.

    Source code available on github: robindar/compil-petitrust

  4. RISC V processor emulator

    "RISC V"-style basic processor emulator in Minijazz (Netlist superset) and Minijazz-to-C compiler. Supports few instructions but has a good build system and is unit-tested

    Source code available on gitlab: alpr-sysdig/processor

  5. Genetic algorithms for the Traveling Salesman Problem

    School project (TIPE)

    Genetic algorithm for finding good solutions to the Traveling Salesman Problem, with a testing harness around it to optimize meta-parameters such as population size, mutation probability, and crossover method

  6. Online portfolio


    A headless Debian server used to practice web design and server administration

    Also acts as a personal Git server and occasional blog

Let's work together

If you have a project you want to get started, think you need my help with something, or just fancy saying hi, send me a message. I'm always happy to help!

Message Me