EEC289Q (Winter 2026): Neurally Inspired Algorithms and Architectures

Course Summary

This course provides an overview of topics at the intersection of neural computation and algorithm design. It is intended for students with a solid background in linear algebra and probability; prior exposure to the basic principles of statistical estimation and optimization will be helpful, but is not strictly required. The goals of the course are for students to learn about:

  • Mathematically rigorous tools for understanding and analyzing algorithms in neural computation.
  • The use of randomized methods in neural computation and algorithms.
  • Other paradigms for learning in addition to the currently dominant statistical one.
  • Connections between topics in the analysis of algorithms and neuroscience/neural computation.
  • Applications of some of the above to developing new kinds of computer hardware.

Course grades will be based on a mix of problem sets, in-class presentations, paper readings, and a final project.

Note: this is not a class about deep learning. There are many excellent courses at UCD that cover deep learning, but this is not one of them. This course will focus primarily on the mathematical analysis of algorithms in neural computation and related topics in computing.

Prerequisites

EEC161 (or equivalent background in probability), MAT 22A (or equivalent background in linear algebra). Prior exposure to machine learning, statistics, and optimization will be helpful but not strictly required.

Grading

Grades will be based on the following components:

  • Problem sets (3): 30%
  • Final Paper: 30%
  • Final Presentation: 30%
  • Class participation and feedback on presentations: 10%

You are expected to regularly attend class and engage with paper discussions. You are also required to submit feedback forms for presentations by your peers.

Final Project

The course final project is fairly open-ended. Examples include:

  • A thorough literature review on a topic (related to the course) of your choosing
  • A replication study of a paper of your choosing
  • Implementation of an algorithm from the course in hardware
  • Application of one or more concepts from the course to your own research

Depending on the class size, final projects may be completed in groups.

Tentative Course Outline

A tentative list of topics is as follows. Topics are subject to change depending on the interests and pace of the class. The boundaries between units are not sharp, and some may blend into one another.

Unit 1: Introduction

Unit 2: Algorithmic and Statistical Perspectives on Learning

Unit 3: Perceptrons and Hebbian Learning

  • Perceptrons
  • PCA and Eigenvalue Problems:
    • Power method and friends
    • Oja’s rule and friends
  • Related topics in ML:
    • Kernel machines
    • Kernel smoothers
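To give a flavor of the Unit 3 material, here is a minimal sketch (not course material; the data, step size, and iteration counts are illustrative assumptions) of the power method and Oja's rule, both of which recover the top principal component of a data set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with one dominant direction, so PCA has a clear answer.
X = rng.normal(size=(500, 5)) @ np.diag([3.0, 1.0, 0.5, 0.3, 0.1])
C = X.T @ X / len(X)  # sample covariance (data is zero-mean in expectation)

# Power method: repeatedly apply C and renormalize.
v = rng.normal(size=5)
for _ in range(100):
    v = C @ v
    v /= np.linalg.norm(v)

# Oja's rule: a streaming, "neural" update converging to the same eigenvector.
w = rng.normal(size=5)
w /= np.linalg.norm(w)
eta = 0.01
for _ in range(5):  # a few passes over the data
    for x in X:
        y = w @ x                    # neuron output
        w += eta * y * (x - y * w)   # Hebbian term minus a normalization term
w /= np.linalg.norm(w)

print(abs(v @ w))  # close to 1: both methods agree up to sign
```

The point of the comparison is that Oja's rule processes one sample at a time with a local, Hebbian-style update, while the power method operates on the full covariance matrix.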

Unit 4: Random Projection and Friends
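As a quick illustration of the Unit 4 theme, the following sketch (dimensions, seed, and scaling are illustrative assumptions) shows the Johnson-Lindenstrauss phenomenon: a random Gaussian projection to a much lower dimension approximately preserves pairwise distances.

```python
import numpy as np

rng = np.random.default_rng(1)

# Project n points from d = 1000 dimensions down to k = 200.
d, k, n = 1000, 200, 20
X = rng.normal(size=(n, d))
R = rng.normal(size=(d, k)) / np.sqrt(k)  # scaling keeps squared norms unbiased
Y = X @ R

# Compare one pairwise distance before and after projection.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(proj / orig)  # close to 1 (relative error on the order of 1/sqrt(k))
```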

Unit 5: Sparse Recovery and Friends

Unit 6: (Time permitting) Beyond McCulloch, Pitts, and Rosenblatt

  • Sigma-Pi Neurons
    • Relationship to tensor-sketching
  • Spiking neural networks
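For the sigma-pi neurons mentioned above, a minimal sketch (the pairwise weight layout is an assumption for illustration): unlike a standard linear unit, a sigma-pi unit sums weighted products of its inputs, here restricted to second-order terms.

```python
import numpy as np

def sigma_pi(x, w_pairs):
    # Second-order sigma-pi unit: output is sum_ij w_pairs[i, j] * x[i] * x[j].
    return np.sum(w_pairs * np.outer(x, x))

x = np.array([1.0, 2.0, 3.0])
w = np.zeros((3, 3))
w[0, 1] = 0.5            # this unit responds to the product x[0] * x[1]
print(sigma_pi(x, w))    # 0.5 * 1.0 * 2.0 = 1.0
```

Because the output is multilinear in the inputs, a single sigma-pi unit can compute conjunctions that a linear perceptron cannot, which is what makes the connection to tensor sketching natural.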

Additional Suggested Papers for Final Projects (will be updated periodically)