EECS Seminar: Digging Performance From Patterns in Data - Case Studies on FFT and Convolution
Department of Electrical and Computer Engineering
University of Delaware
Abstract: The Fast Fourier Transform (FFT) and convolution are among the most fundamental computational routines used to understand data. For example, convolutional neural networks (CNN), a widely used deep-learning tool, typically spend more than 96% of their execution time on convolution steps. Despite fruitful exploration of more efficient ways to map FFT and convolution algorithms onto computer hardware, their efficiency has a theoretical ceiling: their computation is input oblivious. No matter how richly patterned the data fed into these steps may be, the algorithms are stubbornly unchanging; the same operations, in the same order, are prescribed in a fixed way for all inputs. This disregard of data patterns is an unfortunate but unavoidable consequence of the theory, though it always leaves the feeling that a huge amount of performance is going to waste.
We cannot break away from this theoretical input obliviousness. In this talk, however, we examine some artificially induced patterns in data, arising either from the way these computation workloads are invoked or from pre-processing of data in applications. With some interesting mathematical and software maneuvers, these patterns can be exploited to significantly accelerate real-world performance. Specifically, we present our case studies on streaming FFT and on convolution in CNNs, with special focus on the motivation, the formulation of the data patterns, and the high- and low-level transformations that yield better performance.
Bio: Xiaoming Li is an associate professor in the Department of Electrical and Computer Engineering at the University of Delaware. The main motivation of his research is to extract mathematical structures from computation and convert them into more efficient forms. Along these lines, he has contributed to fundamental computation problems including sorting, linear algebra, FFT, and convolutional neural networks.