
Robust Methods in Complex Scenarios and Data Visualization

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Signal and Data Analysis".

Deadline for manuscript submissions: closed (30 September 2023) | Viewed by 9300

Special Issue Editor


Prof. Dr. Marco Riani
Guest Editor
Department of Economics and Management and Interdepartmental Centre for Robust Statistics, University of Parma, Parma, Italy
Interests: all aspects of robust statistics (regression, multivariate analysis, classification and time series)

Special Issue Information

Dear Colleagues,

I am pleased to announce a Special Issue on the use of robust methods in complex scenarios, combined with data visualization. Robust statistics aims to develop methods that are not sensitive to the presence of outliers and/or atypical observations. Robust tools are of paramount importance in the analysis of complex data affected by different forms of contamination, and data visualization plays a crucial role in highlighting the effect of aberrant observations and/or masked outliers. Because effective illustrations help users access, understand, and interpret complex data, visual representation techniques have evolved alongside the analysis of complex data. The objective of this Special Issue is to welcome papers that deal with novel robust methods in complex non-linear scenarios affected by different types of contamination and that make use of advanced visualization tools to communicate information in a clear and concise way.

We invite manuscripts on a wide range of topics, from regression to multivariate analysis and from time series to supervised and unsupervised classification, that combine novel data visualization methods with robust data analysis, for example by presenting new visualization tools, software developments, or novel graphical or tabular ways of representing complex data structures, robust methodologies, or outputs.

Prof. Dr. Marco Riani
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • regression
  • multivariate analysis
  • time series data
  • cluster analysis
  • discriminant analysis
  • data visualization
  • robust statistics
  • complex data

Published Papers (5 papers)


Research

17 pages, 14131 KiB  
Article
Denoising Non-Stationary Signals via Dynamic Multivariate Complex Wavelet Thresholding
by Kim C. Raath, Katherine B. Ensor, Alena Crivello and David W. Scott
Entropy 2023, 25(11), 1546; https://doi.org/10.3390/e25111546 - 16 Nov 2023
Viewed by 1005
Abstract
Over the past few years, we have seen an increased need to analyze the dynamically changing behaviors of economic and financial time series. These needs have led to significant demand for methods that denoise non-stationary time series across time and for specific investment horizons (scales) and localized windows (blocks) of time. Wavelets have long been known to decompose non-stationary time series into their different components or scale pieces. Recent methods satisfying this demand first decompose the non-stationary time series using wavelet techniques and then apply a thresholding method to separate and capture the signal and noise components of the series. Traditionally, wavelet thresholding methods rely on the discrete wavelet transform (DWT), which is a static thresholding technique that may not capture the time series of the estimated variance in the additive noise process. We introduce a novel continuous wavelet transform (CWT) dynamically optimized multivariate thresholding method (WaveL2E). Applying this method, we are simultaneously able to separate and capture the signal and noise components while estimating the dynamic noise variance. Our method shows improved results when compared to well-known methods, especially for high-frequency signal-rich time series, typically observed in finance.
(This article belongs to the Special Issue Robust Methods in Complex Scenarios and Data Visualization)
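The static DWT thresholding that the abstract contrasts with the proposed dynamic approach can be sketched in a few lines. Below is a minimal Donoho and Johnstone style universal-threshold baseline, assuming the PyWavelets (pywt) package; it is not the authors' WaveL2E method, which instead works with the CWT and a dynamically estimated noise variance.

```python
# A minimal sketch of classic *static* DWT universal-threshold denoising,
# the baseline the abstract contrasts with dynamic CWT thresholding.
# This is not the authors' WaveL2E method.
import numpy as np
import pywt  # assumes the PyWavelets package is installed

def dwt_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise scale from the finest detail coefficients via MAD.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    # One global (static) threshold for the whole series.
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)

t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)
clean = dwt_denoise(noisy)
```

Because the threshold is fixed across time, such a baseline cannot track a noise variance that changes over the sample, which is precisely the gap the dynamic method targets.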

29 pages, 8326 KiB  
Article
Towards Data-Driven Decision-Making in the Korean Film Industry: An XAI Model for Box Office Analysis Using Dimension Reduction, Clustering, and Classification
by Subeen Leem, Jisong Oh, Dayeong So and Jihoon Moon
Entropy 2023, 25(4), 571; https://doi.org/10.3390/e25040571 - 27 Mar 2023
Cited by 2 | Viewed by 2871
Abstract
The Korean film market has been rapidly growing, and the importance of explainable artificial intelligence (XAI) in the film industry is also increasing. In this highly competitive market, where producing a movie incurs substantial costs, it is crucial for film industry professionals to make informed decisions. To assist these professionals, we propose DRECE (short for Dimension REduction, Clustering, and classification for Explainable artificial intelligence), an XAI-powered box office classification and trend analysis model that provides valuable insights and data-driven decision-making opportunities for the Korean film industry. The DRECE framework starts with transforming multi-dimensional data into two dimensions through dimensionality reduction techniques, grouping similar data points through K-means clustering, and classifying movie clusters through machine-learning models. The XAI techniques used in the model make the decision-making process transparent, providing valuable insights for film industry professionals to improve box office performance and maximize profits. With DRECE, the Korean film market can be understood in new and exciting ways, and decision-makers can make informed decisions to achieve success.
(This article belongs to the Special Issue Robust Methods in Complex Scenarios and Data Visualization)
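As a rough illustration of the workflow the abstract describes (dimension reduction, then clustering, then classification, then explanation), the following scikit-learn sketch uses PCA, K-means, a random forest, and permutation importance as generic stand-ins; it is not the authors' DRECE implementation, and all data and parameters here are illustrative.

```python
# A hedged sketch of a DRECE-style pipeline: dimension reduction -> clustering
# -> classification -> explanation. Generic stand-ins only; the paper's actual
# model and XAI components are not reproduced.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))  # stand-in for multi-dimensional film features

# 1. Reduce to two dimensions, as in the abstract.
X2 = PCA(n_components=2).fit_transform(X)

# 2. Group similar movies with K-means.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X2)

# 3. Classify the movie clusters from the original features.
clf = RandomForestClassifier(random_state=0).fit(X, clusters)

# 4. Explain the classifier: permutation importance as a simple,
#    model-agnostic stand-in for the paper's XAI techniques.
imp = permutation_importance(clf, X, clusters, n_repeats=5, random_state=0)
print(imp.importances_mean)
```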

19 pages, 2620 KiB  
Article
Performance Evaluation of Complex Equipment Considering Resume Information
by Xiangyi Zhou, Zhijie Zhou, Guanyu Hu, Xiaoxia Han and Leiyu Chen
Entropy 2022, 24(12), 1811; https://doi.org/10.3390/e24121811 - 12 Dec 2022
Cited by 1 | Viewed by 1059
Abstract
It is of great significance to obtain the performance state of complex equipment in order to protect the equipment and maintain its normal operation. Most performance evaluation methods are based on test data, but resume information is not considered. With its wide applicability and completeness, resume information can be used in the comprehensive evaluation of equipment in various non-testing situations. By incorporating resume information into the performance evaluation of complex equipment, the flexible use of test data and resume information can result in a more comprehensive and accurate evaluation. Therefore, this paper focuses on an evaluation method for complex equipment performance based on evidential reasoning (ER) that considers resume information. In order to unify the test data and resume information in the same framework, a novel method is proposed to transform them into the ER-based performance evaluation. On this basis, according to the index types, different reliability calculation methods are put forward: one based on the first-order fitting coefficient of variation, and the other based on the average time to failure; the index weight is analyzed based on the method of expert weight construction. Then, the transformed information, with its reliability and weight, is fused by the ER rule. Finally, a performance evaluation case of a certain inertial measurement unit (IMU) is conducted to verify the effectiveness of the proposed method.
(This article belongs to the Special Issue Robust Methods in Complex Scenarios and Data Visualization)
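To make the fusion step concrete, the sketch below combines two belief distributions (one from test data, one from resume information) after discounting each by its reliability, using classical Shafer discounting followed by Dempster's rule. This is a simplified stand-in in the spirit of the ER rule, not the exact ER-rule implementation or the reliability and weight constructions used in the paper; all values are illustrative.

```python
# Simplified evidence fusion: reliability discounting + Dempster's rule over
# singleton hypotheses plus total ignorance. Illustrative only; not the
# paper's exact ER-rule implementation.
FRAME = ("good", "fair", "poor")

def discount(masses, reliability):
    """Shafer discounting: scale masses by reliability, move the rest to ignorance."""
    m = {h: reliability * v for h, v in masses.items() if h != "ignorance"}
    m["ignorance"] = 1.0 - reliability * (1.0 - masses.get("ignorance", 0.0))
    return m

def combine(m1, m2):
    """Dempster's rule restricted to singletons plus ignorance."""
    fused = {h: 0.0 for h in FRAME}
    fused["ignorance"] = m1["ignorance"] * m2["ignorance"]
    conflict = 0.0
    for h1 in FRAME:
        for h2 in FRAME:
            if h1 == h2:
                fused[h1] += m1[h1] * m2[h2]
            else:
                conflict += m1[h1] * m2[h2]
    for h in FRAME:
        # Mass agreeing with one source while the other is ignorant.
        fused[h] += m1[h] * m2["ignorance"] + m1["ignorance"] * m2[h]
    norm = 1.0 - conflict
    return {h: v / norm for h, v in fused.items()}

# Test data (reliability 0.9) fused with resume information (reliability 0.6).
test = discount({"good": 0.7, "fair": 0.2, "poor": 0.1}, 0.9)
resume = discount({"good": 0.4, "fair": 0.4, "poor": 0.2}, 0.6)
print(combine(test, resume))
```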

12 pages, 6403 KiB  
Article
Dynamic Mixed Data Analysis and Visualization
by Aurea Grané, Giancarlo Manzi and Silvia Salini
Entropy 2022, 24(10), 1399; https://doi.org/10.3390/e24101399 - 1 Oct 2022
Viewed by 1721
Abstract
One of the consequences of the big data revolution is that data are more heterogeneous than ever. A new challenge appears when mixed-type data sets evolve over time and we are interested in the comparison among individuals. In this work, we propose a new protocol that integrates robust distances and visualization techniques for dynamic mixed data. In particular, given a time t ∈ T = {1, 2, …, N}, we start by measuring the proximity of n individuals in heterogeneous data by means of a robustified version of Gower's metric (proposed by the authors in a previous work), yielding a collection of distance matrices {D(t), t ∈ T}. To monitor the evolution of distances and outlier detection over time, we propose several graphical tools: first, we track the evolution of pairwise distances via line graphs; second, a dynamic box plot is obtained to identify individuals which showed minimum or maximum disparities; third, to visualize individuals that are systematically far from the others and detect potential outliers, we use proximity plots, which are line graphs based on a proximity function computed on {D(t), t ∈ T}; fourth, the evolution of the inter-distances between individuals is analyzed via dynamic multiple multidimensional scaling maps. These visualization tools were implemented in a Shiny application in R, and the methodology is illustrated on a real data set related to healthcare, policy, and restriction measures during the 2020–2021 COVID-19 pandemic across EU Member States.
(This article belongs to the Special Issue Robust Methods in Complex Scenarios and Data Visualization)
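A minimal sketch of the first step of the protocol, under simplifying assumptions: the plain (non-robust) Gower distance for mixed numeric and categorical data is implemented by hand, and one multidimensional scaling map is computed from the resulting distance matrix. The authors' robustified Gower metric and the dynamic sequence of maps over t ∈ T are not reproduced here.

```python
# Plain Gower distance for mixed data, followed by one MDS map.
# Illustrative only; the paper uses the authors' robustified Gower metric.
import numpy as np
from sklearn.manifold import MDS

def gower_matrix(num, cat):
    """num: (n, p) numeric array; cat: (n, q) categorical array."""
    n = num.shape[0]
    col_range = num.max(axis=0) - num.min(axis=0)
    col_range[col_range == 0] = 1.0              # guard against constant columns
    D = np.zeros((n, n))
    for i in range(n):
        num_part = np.abs(num - num[i]) / col_range   # range-scaled numeric distances
        cat_part = (cat != cat[i]).astype(float)      # 0/1 mismatch for categoricals
        D[i] = np.hstack([num_part, cat_part]).mean(axis=1)
    return D

rng = np.random.default_rng(1)
num = rng.normal(size=(50, 3))
cat = rng.choice(["a", "b", "c"], size=(50, 2))
D = gower_matrix(num, cat)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
```

Computing one such map per time point t and juxtaposing or animating them yields the dynamic sequence of multidimensional scaling maps the abstract mentions.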

16 pages, 2179 KiB  
Article
Introducing Robust Statistics in the Uncertainty Quantification of Nuclear Safeguards Measurements
by Andrea Cerasa
Entropy 2022, 24(8), 1160; https://doi.org/10.3390/e24081160 - 19 Aug 2022
Cited by 2 | Viewed by 1394
Abstract
The monitoring of nuclear safeguards measurements consists of verifying the coherence between the operator declarations and the corresponding inspector measurements on the same nuclear items. Significant deviations may be present in the data as a consequence of problems with the operator and/or inspector measurement systems. However, they could also be the result of data falsification. In both cases, quantitative analysis and statistical outcomes may be negatively affected by their presence unless robust statistical methods are used. This article aims to investigate the benefits deriving from the introduction of robust procedures in the nuclear safeguards context. In particular, we introduce a robust estimator for the estimation of the uncertainty components of the measurement error model. The analysis demonstrates the capacity of robust procedures to limit bias in simulated and empirical contexts and to provide sounder statistical outcomes. For these reasons, the introduction of robust procedures may represent a step forward in the still-ongoing development of reliable uncertainty quantification methods for error variance estimation.
(This article belongs to the Special Issue Robust Methods in Complex Scenarios and Data Visualization)
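A hedged numerical illustration of the core point, not the paper's estimator: when a few gross deviations contaminate operator-inspector differences, the classical standard deviation inflates, while a MAD-based robust scale estimate stays close to the truth.

```python
# Classical vs. robust scale estimation under contamination. Generic example;
# the paper's specific robust estimator is not reproduced here.
import numpy as np

rng = np.random.default_rng(42)
errors = rng.normal(0.0, 1.0, size=200)   # operator-inspector differences
errors[:10] += 15.0                       # 5% gross deviations (e.g., falsification)

classical = errors.std(ddof=1)
# MAD scaled by 1.4826 for consistency with the normal standard deviation.
robust = 1.4826 * np.median(np.abs(errors - np.median(errors)))

print(f"classical sd: {classical:.2f}")   # badly inflated by the outliers
print(f"robust (MAD): {robust:.2f}")      # close to the true value of 1
```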
