This book explains how to perform data de-noising at large scale with
a satisfactory level of accuracy. Three main issues are considered.
First, how to prevent errors from propagating from one stage to
subsequent stages while a filtered model is being developed. Second,
how to maintain the positional importance of the data while purifying
it. Third, how to preserve the memory of the data, which is crucial
for extracting smart data from noisy big data. If the memory of the
data changes heavily after any form of smoothing or filtering is
applied, the final data may lose important information, which can
lead to erroneous conclusions. Yet even when some loss of information
due to smoothing or filtering is anticipated, denoising cannot be
avoided, since any analysis of big data in the presence of noise can
itself be misleading. The entire process therefore demands careful
execution with efficient and smart models.
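To make the point about memory concrete, the minimal sketch below is an
illustration of the general phenomenon rather than a method from this book;
the window size `k` and the choice of a plain moving average are assumptions
made purely for demonstration. It compares sample autocorrelations before and
after smoothing: even memoryless white noise acquires strong autocorrelation
once a moving average is applied, showing how a filter can alter the memory
structure of the data.

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(0)
raw = rng.standard_normal(100_000)   # white noise: no memory by construction

k = 5                                # assumed moving-average window, for illustration
smoothed = np.convolve(raw, np.ones(k) / k, mode="valid")

for lag in (1, 2, 3):
    print(f"lag {lag}: raw {autocorr(raw, lag):+.3f}, "
          f"smoothed {autocorr(smoothed, lag):+.3f}")
```

For a k-point moving average applied to white noise, the induced
autocorrelation at lag h is (k − h)/k for h < k, so with k = 5 the smoothed
values printed above should be close to 0.8, 0.6, and 0.4, while the raw
values stay near zero. The same mechanism, run in reverse on data that does
carry memory, is what can destroy the information the book warns about.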