Matthew Adams
PhD Student

About Me

I am currently a PhD student in Mathematics at the University of Calgary. My research is in the mathematical theory of artificial neural networks and deep learning. I am particularly interested in the functional analysis lurking behind these methods and their applications to computer vision.

In addition to my studies, I teach introductory Python courses through the Continuing Education department here at the university. In this role, I have designed and delivered curricula for both Foundations of Python and Python for Data Analysis. I also work as a teaching assistant for a variety of undergraduate courses, and last year I was honored to receive both the Fred A. McKinnon Award and the Students' Union Award for teaching excellence!

Research Interests

  • My interest in this field grew from my thorough enjoyment of undergraduate analysis courses. I'm most interested in Sobolev spaces: L^p spaces equipped with what are called 'weak derivatives' (a standard definition is sketched just after this list). These turn out to be very natural spaces in which to look for solutions to many mathematical problems. For example, artificial neural networks search for accurate parametrizations of functions contained in Sobolev spaces.

  • Neural networks (NNs) are enjoying immense popularity and are often treated as a universal solution to countless problems. Somewhat surprisingly, the mathematical theory behind them is not well developed or understood. Through my work, I hope to shed some theoretical light on the accuracy and stability of NNs in as much generality as possible.

    Of course, this means I'll be designing, testing, and comparing various NN models, so stay tuned!

  • How can the world around us be represented digitally? How can a computer learn to see as we do? I'm examining the current framework for understanding these long-standing questions through the lens of functional analysis. I hope to add some insight through the Littlewood-Paley theory and the theory of wavelets.

  • I think that sharing knowledge is just as important as gaining it. I am energized by the spark of understanding that I see in students' eyes when they grasp a difficult concept. I have designed curricula for several courses, including a graduate course on the connections between geometry, art, and nature, a first course in Python programming, and an intensive course to bridge the gap between high school and university mathematics.
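For readers meeting these spaces for the first time, here is the standard definition I have in mind (for an open set Ω ⊆ ℝⁿ, an exponent 1 ≤ p < ∞, and a smoothness order k):

```latex
% Sobolev space of order k and exponent p on a domain \Omega: all L^p functions
% whose weak (distributional) derivatives up to order k also lie in L^p.
W^{k,p}(\Omega) = \left\{ f \in L^p(\Omega) : D^{\beta} f \in L^p(\Omega)
    \text{ for every multi-index } \beta \text{ with } |\beta| \le k \right\},
\qquad
\| f \|_{W^{k,p}(\Omega)} = \Bigg( \sum_{|\beta| \le k} \big\| D^{\beta} f \big\|_{L^p(\Omega)}^{p} \Bigg)^{1/p}.
```

Here a weak derivative D^β f is any function g satisfying ∫ f D^β φ = (−1)^{|β|} ∫ g φ for every smooth, compactly supported test function φ; no pointwise differentiability is required, which is exactly why these spaces are so flexible.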

Recent Work

SSVM 2019, Hofgeismar, Germany

Computer Vision and Image Analysis

I was thrilled to travel to Hofgeismar, Germany to present my paper on the use of fractional derivatives in a classic image feature detector. The Harris corner detector at the heart of the Harris-Laplace method was introduced in the late 1980s and has been widely used ever since. My work demonstrated that using fractional derivatives in place of the traditional first-order derivatives improves the repeatability of detected features.
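
To give a concrete sense of the idea (this is a minimal sketch of mine, not the implementation from the paper), the code below swaps the first-order image gradients in the standard Harris corner response for one-sided Grünwald-Letnikov fractional derivatives. The function names, the default order alpha = 0.8, and the truncation length are illustrative choices; only NumPy and SciPy are assumed.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gl_weights(alpha, n_terms):
    """Grünwald-Letnikov weights w_k = (-1)^k * binom(alpha, k) for k = 0, ..., n_terms - 1."""
    w = np.empty(n_terms)
    w[0] = 1.0
    for k in range(1, n_terms):
        w[k] = w[k - 1] * (k - 1 - alpha) / k   # recurrence for (-1)^k * binom(alpha, k)
    return w

def fractional_gradient(img, alpha=0.8, n_terms=15):
    """Backward GL derivatives of order alpha along the x and y axes of a grayscale image."""
    w = gl_weights(alpha, n_terms)
    # (D^alpha f)[i] ~ sum_k w[k] * f[i - k]; a 'full' convolution truncated to the
    # original length realises exactly this one-sided (backward) stencil.
    deriv = lambda f: np.convolve(f, w, mode="full")[: f.size]
    Ix = np.apply_along_axis(deriv, 1, img.astype(float))  # along columns (x direction)
    Iy = np.apply_along_axis(deriv, 0, img.astype(float))  # along rows (y direction)
    return Ix, Iy

def harris_response(img, alpha=0.8, sigma=1.5, k=0.05):
    """Harris corner measure with fractional derivatives in place of first-order ones."""
    Ix, Iy = fractional_gradient(img, alpha)
    # Gaussian-smoothed entries of the structure tensor
    Sxx = gaussian_filter(Ix * Ix, sigma)
    Syy = gaussian_filter(Iy * Iy, sigma)
    Sxy = gaussian_filter(Ix * Iy, sigma)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2  # large positive values flag corner-like features
```

Setting alpha = 1 collapses the GL stencil to an ordinary backward difference, so the classic Harris measure falls out as a special case; non-integer orders are where the repeatability comparisons come in.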

This paper was a milestone: my first publication! I thoroughly enjoyed the conference, particularly the keynote addresses by Gitta Kutyniok and Julia Schnabel on the use and future of machine learning algorithms in computer vision. I was accompanied by my wonderful wife, and we enjoyed a few day trips to the beautiful communities surrounding Hofgeismar.

Photo of Schlösschen Schönburg, Gesundbrunnen, Hofgeismar © Raimond Spekking / CC BY-SA 4.0 (via Wikimedia Commons)

Fractional Derivatives in Python

Numerical Analysis and Programming

While working on my master's thesis, I noticed that it was difficult to find code implementations of fractional derivatives. There were a few scattered around the web, but I wanted a convenient repository for a variety of algorithms. Hence, I wrote differint: A Python Package for Numerical Fractional Calculus.

This Python package collects four numerical approximations to fractional derivatives. I had been sitting on it for two years before I finally uploaded it to the arXiv. Better late than never, I suppose!
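
If you'd like to try the package, the snippet below shows roughly how I compute a half-derivative on a uniform grid. I'm writing the GL call signature from memory as (order, function or samples, domain start, domain end, number of grid points), so treat it as a sketch and check the repository's README for the exact interface.

```python
import numpy as np
from scipy.special import gamma
import differint.differint as df

# Half-derivative of f(x) = x^2 on [0, 1] via the Grünwald-Letnikov algorithm.
# Call signature assumed from the package documentation; verify against the current release.
def f(x):
    return x ** 2

num_points = 128
approx = df.GL(0.5, f, 0.0, 1.0, num_points)

# Closed form for comparison: D^{1/2} x^2 = 2 * x^{3/2} / Gamma(5/2).
x = np.linspace(0.0, 1.0, num_points)
exact = 2 * x ** 1.5 / gamma(2.5)
print(np.max(np.abs(approx - exact)))
```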