
Article


Title

Pre-Processing and Modeling Tools for Bigdata

Authors

Year of publication

2016

Published in

Foundations of Computing and Decision Sciences

Journal year: 2016 | Journal volume: vol. 41 | Journal number: no. 3

Article type

scientific article

Publication language

English

Keywords
EN
  • data modeling
  • BigData
  • NoSQL
  • MapReduce
  • pre-processing
Abstract

EN Modeling tools and operators help the user or developer identify the processing field at the top of the sequence and send to the computing module only the data relevant to the requested result. The remaining data is irrelevant and would only slow down the processing. The biggest challenge today is to obtain high-quality processing results with reduced computing time and cost. To do so, the processing sequence must be revised by adding several modeling tools. Existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and cost. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.
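The idea described in the abstract, filtering out irrelevant records before they reach the computing module, can be illustrated with a minimal sketch. This is not the paper's implementation; the record layout, the `preprocess` helper, and the predicate are hypothetical, chosen only to show pre-filtering ahead of a map-reduce style computation.

```python
# Illustrative sketch (not the paper's model): pre-process the input so
# that only data relevant to the requested result reaches the compute step.
from functools import reduce

# Hypothetical dataset: yearly sales records.
records = [
    {"year": 2014, "sales": 10},
    {"year": 2015, "sales": 20},
    {"year": 2016, "sales": 5},
]

def preprocess(data, predicate):
    """Keep only the records relevant to the requested result."""
    return [r for r in data if predicate(r)]

# Without pre-processing, all records would be mapped and reduced;
# with it, only the 2015+ records enter the computation.
relevant = preprocess(records, lambda r: r["year"] >= 2015)
total = reduce(lambda acc, x: acc + x, (r["sales"] for r in relevant), 0)
print(total)  # → 25
```

The design point mirrors the abstract: the filtering step shrinks the input before the (potentially expensive) computation runs, trading a cheap scan for reduced computing time downstream.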

Pages (from - to)

151 - 162

DOI

10.1515/fcds-2016-0009

URL

https://www.sciendo.com/article/10.1515/fcds-2016-0009

License type

CC BY-NC-ND (attribution - noncommercial - no derivatives)

Full text of article

Download file

Access level to full text

public

Ministry points / journal

15
