Rafael Boloix-Tortosa
Tue 02 Oct 2018, 13:15 - 14:00
Hudson Beare Building, Classroom 8

If you have a question about this talk, please contact: Ardimas Purwita (s1600157)

Widely Linear Complex-Valued Kernel Methods for Regression

Rafael Boloix-Tortosa is currently an Associate Professor in the Department of Signal Processing and Communications at the Universidad de Sevilla.

Abstract:
Complex-valued signal processing finds application in a vast range of current systems in telecommunications, optics, electromagnetism, and acoustics, among others. It has been extensively studied in the linear case. Furthermore, it is well known that in many cases linear estimation does not provide the best solution, and it is necessary to use a more versatile widely-linear or augmented formulation. The widely-linear formulation has two linear terms, one depending on the complex-valued inputs and the other on their complex conjugate. This augmented formulation improves the solution in cases where the complex-valued signals are non-proper, i.e., have a non-zero pseudo-covariance.
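
As a toy illustration of this point (not taken from the talk), the following Python sketch fits a strictly linear and a widely-linear least-squares estimator to a hypothetical improper signal; the model, coefficients, and noise level are invented for the example.

# Minimal sketch: strictly linear vs. widely-linear least squares on an
# improper complex signal.  The model y = a*x + b*conj(x) + noise is a
# hypothetical example that a strictly linear estimator cannot capture.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Improper input: real and imaginary parts have different variances.
x = rng.normal(size=n) + 1j * 0.2 * rng.normal(size=n)
noise = 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))
y = (1.0 - 0.5j) * x + (0.3 + 0.8j) * np.conj(x) + noise

# Strictly linear estimate: y_hat = h * x (one complex coefficient).
h = np.vdot(x, y) / np.vdot(x, x)            # least-squares fit of h
err_linear = np.mean(np.abs(y - h * x) ** 2)

# Widely-linear (augmented) estimate: y_hat = h1 * x + h2 * conj(x).
A = np.column_stack([x, np.conj(x)])          # augmented regressor matrix
h_aug, *_ = np.linalg.lstsq(A, y, rcond=None)
err_wl = np.mean(np.abs(y - A @ h_aug) ** 2)

print(f"strictly linear MSE: {err_linear:.4f}")
print(f"widely linear   MSE: {err_wl:.4f}")

On such non-proper data the augmented fit recovers both coefficients and yields a markedly lower error than the strictly linear one.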

Regarding the non-linear processing of complex-valued signals, it has been addressed from the point of view of neural networks and, more recently, using reproducing kernel Hilbert spaces (RKHS). Complex kernel-based algorithms have been proposed for classification, regression, and kernel principal component analysis. There have been attempts to develop a 'widely linear' or augmented version of these algorithms, but the results were not as good as expected. In this seminar, I present a novel augmented or 'widely-linear' regression formulation within the framework of RKHS that provides full representation capabilities. This formulation includes a second kernel term that we denote as the pseudo-kernel. I will discuss the need for this pseudo-kernel and whether it, or the kernel term itself, must be complex-valued or real-valued. The conclusion is that the use of a pseudo-kernel is justified when the real and imaginary parts are correlated, so that learning them independently is at best suboptimal, and it is also needed when the real and imaginary parts are not best represented by the same kernel, i.e., the same measure of similarity.
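
To make that last conclusion concrete, here is a small sketch of my own (not from the talk) using the standard decomposition of a complex process f = u + j*v into real and imaginary parts: the pseudo-kernel vanishes only when the two parts are suitably uncorrelated and share the same kernel. The RBF kernel and length-scales below are illustrative choices.

# For f = u + j*v with real-part kernel Kuu, imaginary-part kernel Kvv and
# cross-kernels Kuv, Kvu, the kernel and pseudo-kernel are
#   K  = Kuu + Kvv + j*(Kvu - Kuv)
#   Kp = Kuu - Kvv + j*(Kuv + Kvu)
# so Kp vanishes only if Kuu == Kvv and Kuv == -Kvu (e.g. both zero).
import numpy as np

def rbf(X, Y, ell):
    """Real Gaussian (RBF) kernel with length-scale ell."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell ** 2)

x = np.linspace(-1.0, 1.0, 5)

# Case 1: uncorrelated real/imaginary parts with the same kernel -> Kp = 0.
Kuu = rbf(x, x, ell=0.5)
Kvv = Kuu.copy()
Kuv = np.zeros_like(Kuu)
Kvu = np.zeros_like(Kuu)
Kp = Kuu - Kvv + 1j * (Kuv + Kvu)
print(np.allclose(Kp, 0))   # True: no pseudo-kernel needed

# Case 2: different length-scales for real and imaginary parts -> Kp != 0.
Kvv = rbf(x, x, ell=0.1)
Kp = Kuu - Kvv + 1j * (Kuv + Kvu)
print(np.allclose(Kp, 0))   # False: the pseudo-kernel carries information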

To conclude the talk, I will present two examples that highlight the necessity of the pseudo-kernel term in any general complex-valued regression algorithm. These two examples are the well-known Gaussian process for regression and the kernel least-mean-squares algorithm.
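
For the Gaussian-process example, a minimal sketch of how a pseudo-kernel can enter the predictive mean through the augmented covariance is given below; it assumes proper (circular) Gaussian observation noise and is my own sketch of the standard augmented formulation, not necessarily the exact algorithm presented in the talk.

# Widely-linear complex GP predictive mean using both the kernel K and the
# pseudo-kernel Kp.  With Kp = 0 (and kps = 0) this collapses to the usual
# strictly-linear complex GP mean ks @ inv(K + noise_var*I) @ y.
import numpy as np

def wl_gp_mean(K, Kp, ks, kps, y, noise_var):
    """Widely-linear GP predictive mean at m test points.

    K, Kp     : n x n kernel and pseudo-kernel matrices on training inputs
    ks, kps   : m x n cross kernel / pseudo-kernel matrices (test vs. train)
    y         : n complex training targets
    noise_var : variance of proper (circular) Gaussian noise
    """
    n = K.shape[0]
    Ky = K + noise_var * np.eye(n)
    # Augmented covariance of the stacked vector [y; conj(y)].
    C = np.block([[Ky, Kp],
                  [Kp.conj(), Ky.conj()]])
    y_aug = np.concatenate([y, y.conj()])
    # Cross-covariance of the test values with [y; conj(y)].
    cross = np.hstack([ks, kps])
    return cross @ np.linalg.solve(C, y_aug)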

Biography:
Rafael Boloix-Tortosa received the M.Sc. degree in telecommunication engineering in 2000 and the Ph.D. degree in 2005, both from the Universidad de Sevilla, Spain. He is currently an Associate Professor in the Department of Signal Processing and Communications at the Universidad de Sevilla. He has also been a research scientist at the Department of Electronic Engineering (University of Seville), a visiting researcher at the University of Edinburgh (2012), and an independent consulting engineer. His research interests include algorithms for complex-valued machine learning and their application to communications and image processing.