## My courses, my interests

Most of my courses target efficient computation in computational linear algebra. Solving large systems of linear equations is a focal point of much of modern computational science, and the mathematical formulation of this task is inseparably connected not only to numerical analysis and numerical algorithms, but also to the theoretical background and algorithms of computer science. The central object in this field is the matrix, but to solve problems efficiently one must not treat matrices as black boxes; instead, one must look into their structure. Roughly speaking, we need to exploit matrix sparsity, as indicated in the following figures.
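As a toy illustration of what exploiting sparsity can mean (my own sketch, not course material): storing only the nonzeros of a matrix, for example in compressed sparse row (CSR) form, lets a matrix-vector product skip every zero entry entirely.

```python
# Minimal CSR (compressed sparse row) sketch: store only the nonzeros.
# Illustrative only; the function names and layout are my own choices.

def dense_to_csr(A):
    """Convert a dense row-list matrix to (values, col_idx, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in A:
        for j, a in enumerate(row):
            if a != 0:
                values.append(a)
                col_idx.append(j)
        row_ptr.append(len(values))  # row i spans row_ptr[i]:row_ptr[i+1]
    return values, col_idx, row_ptr

def csr_matvec(values, col_idx, row_ptr, x):
    """Compute y = A @ x touching only the stored nonzeros."""
    y = []
    for i in range(len(row_ptr) - 1):
        s = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

A = [[4.0, 0.0, 0.0],
     [0.0, 0.0, 2.0],
     [1.0, 0.0, 3.0]]
vals, cols, ptr = dense_to_csr(A)
print(csr_matvec(vals, cols, ptr, [1.0, 1.0, 1.0]))  # [4.0, 2.0, 4.0]
```

For a matrix with nnz nonzeros, the product costs O(nnz) operations instead of O(n^2), which is exactly the kind of saving the sparse-matrix viewpoint is after.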

Once we try to do this, we enter a completely new world. To understand it, we need not only concepts from graph theory and other tools of computer science, but also at least a rough understanding of developments and trends in computer architectures. The two courses I am involved in are about this.

The courses are oriented somewhat more in an algorithmic direction. To construct new computational algorithms, we first need understanding, and the main stress is put exactly on gaining that understanding, assuming that a sufficient theoretical background in general mathematics has been obtained in other courses.

### Sparse matrices in direct methods (2022/2023)

The goal of this course is to discuss the role of sparse matrices in numerical linear algebra tasks such as
• Solving systems of linear algebraic equations by direct methods (variants of elimination)
• Solving systems of linear algebraic equations by preconditioned iterative methods (discussing preconditioner constructions)
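As a small sketch of how a direct method can exploit structure (my own illustrative example, not taken from the course notes): for a tridiagonal matrix, Gaussian elimination reduces to the Thomas algorithm, which solves the system in O(n) operations instead of the O(n^3) of dense elimination.

```python
# Thomas algorithm: Gaussian elimination specialized to tridiagonal systems.
# Hypothetical illustration; no pivoting, so it assumes the usual stability
# conditions (e.g. diagonal dominance) hold.

def thomas(a, b, c, d):
    """Solve A x = d, where A has sub-diagonal a (length n-1),
    diagonal b (length n), and super-diagonal c (length n-1)."""
    n = len(b)
    cp = [0.0] * (n - 1)   # modified super-diagonal after elimination
    dp = [0.0] * n         # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):  # forward elimination along the single sub-diagonal
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = [0.0] * n          # back substitution
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1D Poisson-like system: diag(2) with -1 off-diagonals, rhs [1, 0, 1];
# the exact solution is x = [1, 1, 1].
print(thomas([-1.0, -1.0], [2.0, 2.0, 2.0], [-1.0, -1.0], [1.0, 0.0, 1.0]))
```

The point of the sketch is that elimination never touches the (zero) entries outside the three diagonals, so no fill-in outside them can occur; for general sparse matrices, controlling fill-in is precisely what makes direct methods interesting.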

Exercises are prepared to demonstrate the algebraic approaches discussed. They are based on Matlab, but even inside Matlab, the data structures considered are often those on which production codes are based.

### Parallel matrix computations (2022/2023)

To be described here (for summer semester)