Matthew Adams
PhD Candidate

About Me

I am currently a PhD candidate in Mathematics at the University of Calgary and a Data Analyst with the Alberta Children's Hospital Research Institute. My research is in the mathematical theory of artificial neural networks and deep learning; I am particularly interested in the functional analysis lurking behind these methods and their applications to computer vision. In my role as a Data Analyst, I'm examining the strength and breadth of collaborations in women's and children's health and wellness research across Canada.

In addition to my studies, I teach courses in introductory Python through the Continuing Education department here at the university. In this role, I have designed and delivered curricula for both Foundations of Python and Python for Data Analysis. I also work as a teaching assistant for a variety of undergraduate courses, and last year I was honored with two teaching awards: the Fred A. McKinnon Award and the Students' Union Award for teaching excellence!

Research Interests

  • My interest in functional analysis grew from my thorough enjoyment of undergraduate analysis courses. I'm most interested in Sobolev spaces: spaces of Lp functions whose 'weak derivatives' also lie in Lp. These turn out to be very natural spaces in which to search for solutions to many mathematical problems; for example, artificial neural networks search for accurate parametrizations of functions contained in Sobolev spaces. (The defining condition for a weak derivative is sketched after this list.)

  • Neural networks (NNs) are enjoying immense popularity and are often seen as a universal solution to countless problems. Somewhat surprisingly, the mathematical theory behind them is not well developed or understood. Through my work, I hope to shed some theoretical light on the accuracy and stability of NNs with as much generality as possible.

    Of course, this means I'll be designing, testing, and comparing various NN models, so stay tuned!

  • How can the world around us be represented digitally? How can a computer learn to see as we do? I'm examining the current framework for understanding these long-standing questions through the lens of functional analysis, and I hope to add some insight via Littlewood-Paley theory and the theory of wavelets.

  • I think that sharing knowledge is just as important as gaining it. I am energized by the spark of understanding that I see in students' eyes when they grasp a difficult concept. I have designed curricula for several courses, including a graduate course on the connections between geometry, art, and nature, a course in Python fundamentals (ICT 778 at the University of Calgary), a course in Python for data analysis (ICT 779 at the University of Calgary), and an intensive course to bridge the gap between high school and university mathematics. I also post lighthearted videos about mathematics and other STEM topics on my YouTube channel.
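
Weak derivatives sound exotic, but the defining condition is just integration by parts. Here is the standard one-dimensional definition, a textbook fact rather than anything specific to my own work:

    % v is the weak derivative of u on Omega if integration by parts
    % holds against every smooth, compactly supported test function:
    \int_\Omega u(x)\,\varphi'(x)\,dx = -\int_\Omega v(x)\,\varphi(x)\,dx
    \qquad \text{for all } \varphi \in C_c^\infty(\Omega).

    % The Sobolev space W^{1,p} then collects the L^p functions whose
    % weak derivative is itself an L^p function:
    W^{1,p}(\Omega) = \{\, u \in L^p(\Omega) : u' \in L^p(\Omega) \text{ weakly} \,\}.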

Recent Work

Gene Golub SIAM Summer School

Theory and Practice of Deep Learning

G2S3 2020/2021, Cape Town, South Africa

I was accepted to participate in this awesome summer school in early 2020, but then coronavirus came along and canceled everyone's travel plans! Even so, the experience was amazing. Over the summer school's two weeks, I attended the following five mini-courses:

  • Theory and Practice of Deep Learning by Dr. Bubacarr Bah
    • Focus on network architectures and implementation.
  • Perspectives on the Theoretical Understanding of Deep Networks by Dr. Jared Tanner
    • Focus on network initialization and choosing the network activation function.
  • Large-scale Optimization for Deep Learning by Dr. Coralia Cartis
    • Focus on algorithms for minimizing the network loss function (a toy sketch follows this list).
  • Functional Analysis and Approximation Theory for Deep Learning by Dr. Kasso Okoudjou
    • Focus on approximation of Sobolev functions by various methods, as well as methods for nonlinear dimensionality reduction.
  • The Modern Mathematics of Deep Learning by Dr. Gitta Kutyniok
    • Focus on image processing and computer vision applications of deep learning and shearlet transforms for learning image representations.
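
As a toy illustration of the optimization theme from Dr. Cartis's course (my own sketch on synthetic data, not course material), here is plain gradient descent driving down a least-squares loss:

    import numpy as np

    # Minimize the least-squares loss L(w) = ||Xw - y||^2 / (2n)
    # with plain gradient descent on synthetic data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + 0.01 * rng.normal(size=100)

    w = np.zeros(3)
    lr = 0.1                                 # step size (learning rate)
    for _ in range(500):
        grad = X.T @ (X @ w - y) / len(y)    # gradient of L at w
        w -= lr * grad

    print(w)                                 # should be close to w_true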

SSVM 2019

Computer Vision and Image Analysis

SSVM 2019, Hofgeismar, Germany

I was thrilled to travel to Hofgeismar, Germany to present my paper on the use of fractional derivatives in a classic image feature detector. The Harris-Laplace feature detector was defined in its simplest form in the late 1980s and has been widely used ever since. My work demonstrated that using fractional derivatives in place of the traditional first-order derivatives improves the repeatability of detected features.
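
To make that concrete, here is a minimal sketch of the classic integer-order Harris response, showing exactly where the derivatives enter; my contribution was to swap the first-order derivative filters below for fractional-order ones. This is an illustrative reimplementation, not the code from the paper:

    import numpy as np
    from scipy.ndimage import gaussian_filter, sobel

    def harris_response(img, sigma=1.0, k=0.05):
        # First-order image derivatives; the paper replaces these
        # Sobel filters with fractional-derivative filters.
        Ix = sobel(img.astype(float), axis=1)
        Iy = sobel(img.astype(float), axis=0)
        # Structure tensor entries, smoothed with a Gaussian window.
        Sxx = gaussian_filter(Ix * Ix, sigma)
        Syy = gaussian_filter(Iy * Iy, sigma)
        Sxy = gaussian_filter(Ix * Iy, sigma)
        # Harris corner measure det(M) - k * trace(M)^2; large
        # positive values indicate corner-like features.
        return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2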

This paper was a milestone: my first publication! I thoroughly enjoyed the conference, particularly the keynote addresses by Gitta Kutyniok and Julia Schnabel on the use and future of machine learning algorithms in computer vision. I was accompanied by my wonderful wife, and we enjoyed a few day trips to the beautiful communities surrounding Hofgeismar.

Photo of Schlösschen Schönburg, Gesundbrunnen, Hofgeismar © Raimond Spekking / CC BY-SA 4.0 (via Wikimedia Commons)

Fractional Derivatives in Python

Numerical Analysis and Programming

While working on my master's thesis, I noticed that it was difficult to find code implementations of fractional derivatives. There were a few scattered around the web, but I wanted a convenient repository for a variety of algorithms. Hence, I wrote differint: A Python Package for Numerical Fractional Calculus.

This Python package collects four numerical approximations to fractional derivatives. I had been sitting on it for two years before I finally uploaded it to the arXiv. Better late than never, I guess!
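
As a taste of what the package computes, here is a standalone sketch of one such scheme, the Grünwald-Letnikov approximation. This is my own minimal version for illustration, not the differint API itself:

    import numpy as np

    def gl_derivative(f_vals, alpha, h):
        # Grunwald-Letnikov fractional derivative of order alpha on
        # uniformly spaced samples f_vals with grid spacing h.
        N = len(f_vals)
        # Coefficients w_k = (-1)^k * binom(alpha, k), built recursively.
        w = np.empty(N)
        w[0] = 1.0
        for k in range(1, N):
            w[k] = w[k - 1] * (k - 1 - alpha) / k
        # D^alpha f(x_j) ~ h^(-alpha) * sum_{k=0}^{j} w_k * f(x_{j-k}).
        out = np.empty(N)
        for j in range(N):
            out[j] = np.dot(w[: j + 1], f_vals[j::-1]) / h ** alpha
        return out

    # Sanity check: the half-derivative of f(x) = x is 2 * sqrt(x / pi).
    x = np.linspace(0.0, 1.0, 101)
    approx = gl_derivative(x, 0.5, x[1] - x[0])
    exact = 2.0 * np.sqrt(x / np.pi)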