
Information Theoretic Learning and Kernel Methods II

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 31 January 2025 | Viewed by 13

Special Issue Editors


Guest Editor
Department of Computer Science, Universitat Politècnica de Catalunya, 08034 Barcelona, Catalonia, Spain
Interests: feed-forward neural networks; support vector machines; kernel functions; similarity measures

Guest Editor
Computational NeuroEngineering Lab, University of Florida, Gainesville, FL 32611, USA
Interests: information theoretic learning; kernel methods; adaptive signal processing; brain machine interfaces

Special Issue Information

Dear Colleagues,

The extraction of information from data, and its feedback to the machine, is a crucial step in building a general framework for machine learning. We believe that information descriptors such as entropy and divergence are well suited to this role, since these scalar quantifiers of the information in data are easy to work with when deriving learning rules. Information theoretic learning (ITL) was originally developed for supervised learning applications. The idea is that the error distribution in supervised learning is often non-Gaussian, so the traditional mean square error (MSE) is not the optimal criterion; in such cases, costs based on information theoretic descriptors can yield better nonlinear models in a range of problems, from system identification to classification. Popular ITL criteria include the minimum error entropy (MEE) criterion, the maximum (or minimum) mutual information criterion, the minimum divergence criterion, and the maximum correntropy criterion (MCC). Kernel methods, on the other hand, are powerful tools for nonlinear system modeling in the machine learning community, and many kernel learning algorithms have been developed, including SVMs, kernel PCA, and kernel adaptive filtering (KAF). ITL and kernel methods are therefore efficient approaches for learning nonlinear mappings in non-Gaussian environments. It is also worth noting that there are close relationships between ITL and kernel methods; for example, the sample estimator of Rényi's quadratic entropy can be viewed as a similarity measure in kernel space.
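To illustrate the connection noted above: the Parzen plug-in estimator of Rényi's quadratic entropy reduces to the mean of pairwise Gaussian kernel evaluations over the samples (the "information potential"), which is exactly an average similarity in kernel space. A minimal NumPy sketch follows; the bandwidth `sigma` and the toy data are illustrative assumptions, not values prescribed by the literature:

```python
import numpy as np

def gaussian_kernel(u, sigma):
    # Normalized 1-D Gaussian (Parzen) kernel with bandwidth sigma
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def information_potential(x, sigma=1.0):
    # V(x) = (1/N^2) * sum_{i,j} G_{sigma*sqrt(2)}(x_i - x_j):
    # the mean pairwise kernel similarity of the samples
    # (the sqrt(2) comes from convolving two Parzen windows).
    diffs = x[:, None] - x[None, :]
    return gaussian_kernel(diffs, sigma * np.sqrt(2.0)).mean()

def renyi_quadratic_entropy(x, sigma=1.0):
    # H2(x) = -log V(x): plug-in estimator of Renyi's quadratic entropy
    return -np.log(information_potential(x, sigma))

rng = np.random.default_rng(0)
tight = rng.normal(0.0, 0.1, size=500)   # narrowly concentrated sample
spread = rng.normal(0.0, 2.0, size=500)  # widely spread sample

# A more spread-out sample has lower average kernel similarity,
# hence a larger estimated quadratic entropy.
print(renyi_quadratic_entropy(tight), renyi_quadratic_entropy(spread))
```

Because the information potential is a mean of kernel evaluations, maximizing it over prediction errors (as MEE does) is equivalent to making the errors maximally similar to one another in the kernel-induced feature space.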

In this Special Issue, we seek contributions that apply either information theoretic descriptors or kernel methods to various machine learning problems. The scope of contributions is broad, spanning theoretical studies as well as practical applications to regression, classification, system identification, deep learning, unsupervised learning, and reinforcement learning.

Dr. Lluís A. Belanche Muñoz
Prof. Dr. Jose C. Principe
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine learning
  • information theoretic learning
  • kernel methods
  • entropy
  • nonlinear systems
  • non-Gaussian signals

Published Papers

This special issue is now open for submission.