David Sprunger

Date: September 10, 2019
Time: 12:00 am - 12:00 am
Location: ICT 616

We review recent results using Cartesian differential categories to model backpropagation through time, a technique from machine learning used to train recurrent neural networks. We show that the property of being a Cartesian differential category is preserved by a variant of a stateful construction commonly used with signal flow graphs. Using an abstracted version of backpropagation through time, we then lift the differential operator from the original differential category to the stateful one.
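
As background for the machine-learning side of the talk: backpropagation through time amounts to unrolling a recurrent network over an input sequence and then differentiating the resulting feedforward computation. The sketch below is a minimal illustrative example in JAX of that plain technique, not the categorical construction from the talk; the cell, loss, and parameter names are hypothetical.

    import jax
    import jax.numpy as jnp

    def rnn_cell(params, h, x):
        # One step of a simple (hypothetical) Elman-style recurrent cell.
        W_h, W_x, b = params
        return jnp.tanh(W_h @ h + W_x @ x + b)

    def unrolled_loss(params, h0, xs, ys):
        # Unroll the cell over the whole sequence; the result is an
        # ordinary feedforward computation that jax.grad can differentiate.
        h = h0
        loss = 0.0
        for x, y in zip(xs, ys):
            h = rnn_cell(params, h, x)
            loss = loss + jnp.sum((h - y) ** 2)
        return loss

    # Backpropagation through time = the gradient of the unrolled computation
    # with respect to the (shared) parameters.
    bptt_grads = jax.grad(unrolled_loss)

    key = jax.random.PRNGKey(0)
    k1, k2, k3 = jax.random.split(key, 3)
    dim = 3
    params = (jax.random.normal(k1, (dim, dim)),
              jax.random.normal(k2, (dim, dim)),
              jnp.zeros(dim))
    h0 = jnp.zeros(dim)
    xs = jax.random.normal(k3, (5, dim))   # length-5 input sequence
    ys = jnp.ones((5, dim))                # hypothetical targets

    grads = bptt_grads(params, h0, xs, ys)

Because the parameters are reused at every time step, the returned gradients accumulate contributions from all steps of the unrolling, which is exactly the "through time" part of the technique.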
Bio:
    David is a project researcher at the ERATO MMSD project in Tokyo.  This project aims to extend formal methods and software verification techniques to cyber-physical systems, with particular emphasis on applications to automotive control and manufacturing.
    David received a PhD in mathematics from Indiana University in August 2017 as a student of Larry Moss.  His academic research interests are primarily in coalgebra, logic, and category theory.  Since moving to Tokyo, he has been developing an interest in quantitative refinements of bisimulation and other coalgebraically defined structures.  He has also been looking into deep learning and neural networks.