As computational power grows, so does the complexity of the problems we ask computers to solve. Optimization is at the heart of solving those problems efficiently.

ULethbridge post-doctoral fellow Sajad Fathi Hafshejani aims to advance optimization to keep pace with the ever-evolving computational power of the future.

Deep learning is modeled after the brain

Deep learning algorithms are essential to the services and processes we use in our daily lives. Essentially, deep learning is modeled after the brain: data passes through a neural network in which each layer hands a simplified representation of the data to the next. Services such as Google, Netflix, facial recognition software and virtual assistants all rely on these kinds of algorithms.
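As a rough illustration of that layered idea, the sketch below pushes an input through two layers, each producing a smaller representation for the next. It is a minimal NumPy example; the layer sizes, random weights and ReLU activation are illustrative assumptions, not the networks those services actually run.

```python
import numpy as np

def relu(x):
    # Simple non-linearity: keep positive values, zero out the rest.
    return np.maximum(0.0, x)

def forward(x, layers):
    # Each layer transforms its input and hands a smaller, simplified
    # representation to the next layer, as described above.
    for W, b in layers:
        x = relu(W @ x + b)
    return x

# Illustrative two-layer network that condenses 8 inputs to 4, then to 2.
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((4, 8)), np.zeros(4)),
          (rng.standard_normal((2, 4)), np.zeros(2))]
print(forward(rng.standard_normal(8), layers))
```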

“Nowadays, optimization algorithms are one of the most important subjects in mathematics,” says Sajad. “Finding algorithms that can converge to the optimal solution in the least number of iterations has become one of the challenges for researchers.”

With an impressive academic career behind him, Sajad currently holds a 2021-2023 post-doctoral fellowship from the Pacific Institute for the Mathematical Sciences (PIMS). His research is also supported by his supervisor, Dr. Robert Benkoczi (Natural Sciences and Engineering Research Council of Canada), and by Dr. Daya Gaur (Alberta Major Innovation Fund for Quantum Technologies), and it has made him proficient in the Python programming environment.

His current project focuses on an efficient stochastic gradient descent algorithm. Simply put, stochastic gradient descent searches for an approximate minimum of a function by repeatedly stepping in a direction that lowers its value, guided at each step by a randomly chosen piece of the data. With efficiency at the forefront, Sajad aims to reach that minimum in as few steps as possible.
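As a loose sketch of that idea, the example below runs plain stochastic gradient descent on a small least-squares problem, nudging the variables downhill using one randomly chosen data point per step. The data, loss function and fixed step size are illustrative assumptions, not the efficient algorithm Sajad is developing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic least-squares problem: recover w minimizing (x_i . w - y_i)^2.
X = rng.standard_normal((200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.standard_normal(200)

w = np.zeros(3)                 # starting guess
step_size = 0.05                # illustrative fixed step size

for _ in range(1000):
    i = rng.integers(len(X))                 # one random data point
    grad = 2.0 * (X[i] @ w - y[i]) * X[i]    # gradient of that point's loss
    w -= step_size * grad                    # step toward a lower value

print("estimated w:", w)        # should land near [2.0, -1.0, 0.5]
```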

“Deep learning is one of the most popular algorithms in the world,” he says. “The efficiency of this algorithm depends on the step size used in each iteration. Therefore, finding an efficient step size is crucial for this algorithm. In this project, we are looking to find an efficient step size for this algorithm.”
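To see why the step size matters, the sketch below reruns the same kind of SGD loop under two textbook step-size schedules, a constant step and a diminishing 1/t step, and compares the final loss. These schedules are standard examples, not the efficient step size the team is working toward.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.standard_normal(200)

def run_sgd(step_size_fn, iters=2000):
    # Run SGD with the given step-size schedule; report the final mean loss.
    w = np.zeros(3)
    for t in range(1, iters + 1):
        i = rng.integers(len(X))
        grad = 2.0 * (X[i] @ w - y[i]) * X[i]
        w -= step_size_fn(t) * grad
    return np.mean((X @ w - y) ** 2)

# Two textbook schedules; the final losses show how much the choice matters.
print("constant 0.05 :", run_sgd(lambda t: 0.05))
print("decaying 0.5/t:", run_sgd(lambda t: 0.5 / t))
```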

In practice, Sajad works as part of a team, alongside Dr. Robert Benkoczi, Dr. Daya Gaur and Shahadat Hossain, implementing these algorithms and testing their performance to make sure they run as fast as they possibly can.

Sajad has a natural passion for optimizing algorithms. Having worked on them during his master’s and PhD programs, he brings a familiarity that aids his current work, and he has drawn on past research papers to prove the convergence of his algorithms. Years of experience in the field and many completed projects have served Sajad well, and this project is giving him important skills to support his career trajectory.