Various Approaches Proposed for Eliminating Duplicate Data in a System

https://doi.org/10.26552/com.C.2021.4.A223-A232

Keywords: data duplication, distributed data processing, software architecture

Abstract

The growth of the big data processing market has led to increased load on computational data centers and to changes in the methods used to store data, in the communication between computing units, and in the computational time needed to process or edit data. Methods of distributed and parallel data processing have brought new problems related to data computation that need to be examined. Unlike conventional cloud services, big data services are characterized by a tight coupling between the data and the computation: computational tasks can be completed only if the relevant data are available. Three factors influence the speed and efficiency of data processing: data duplication, data integrity and data security. We are motivated to study the problems related to the growing time needed for data processing by optimizing these three factors in geographically distributed data centers.
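
As a minimal illustration of the paper's topic (not the specific approach evaluated by the authors), the Python sketch below eliminates byte-identical records by comparing SHA-256 content digests, a common baseline for duplicate detection; the function name deduplicate and the string-valued records are assumptions made for the example.

import hashlib

def deduplicate(records):
    # Keep the first occurrence of each record and drop byte-identical
    # duplicates. Records are identified by the SHA-256 digest of their
    # content, so only exact duplicates are removed; near-duplicates
    # would require fuzzy matching.
    seen = set()
    unique = []
    for record in records:
        digest = hashlib.sha256(record.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(record)
    return unique

# Example: the repeated "row-1" and "row-2" are discarded.
print(deduplicate(["row-1", "row-2", "row-1", "row-3", "row-2"]))

In a geographically distributed setting, fixed-size digests can be exchanged between data centers instead of the records themselves, which keeps the communication cost of duplicate detection low.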

Author Biographies

Roman Čerešňák

Department of Informatics, Faculty of Management Science and Informatics, University of Zilina, Zilina, Slovakia

Karol Matiaško

Department of Informatics, Faculty of Management Science and Informatics, University of Zilina, Zilina, Slovakia

Adam Dudáš

Department of Computer Science, Faculty of Natural Sciences, Matej Bel University, Banska Bystrica, Slovakia

Published
2021-10-01
How to Cite
Roman Čerešňák, Karol Matiaško, & Adam Dudáš. (2021). Various Approaches Proposed for Eliminating Duplicate Data in a System. Communications - Scientific Letters of the University of Zilina, 23(4), A223-A232. https://doi.org/10.26552/com.C.2021.4.A223-A232
Section
Operation and Economics in Transport