'Handbook of Data Intensive Computing' evaluates the state of the art in a new field
'Handbook of Data Intensive Computing' is a collection of essays by world experts in data intensive computing from academia, research laboratories and private industry, with real-world examples provided throughout. The book was edited by Armando Escalante, CTO of LexisNexis Risk Solutions and head of HPCC Systems, an open-source, enterprise-proven Big Data analytics platform, and Dr. Borko Furht, professor and chairman of the Department of Computer Science and Engineering at Florida Atlantic University.
Data intensive computing refers to capturing, managing, analyzing and understanding data at volumes and rates that push the frontiers of current technologies. Its central challenge is to provide hardware architectures and related software systems and techniques capable of transforming ultra-large data sets into valuable knowledge. Data intensive computing demands a fundamentally different set of principles from mainstream computing. Data-intensive applications are typically well suited to large-scale parallelism over the data, and they also require an extremely high degree of fault tolerance, reliability and availability.
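To make those two requirements concrete, here is a minimal sketch in Python of data parallelism with simple fault tolerance. It is an illustration only, not code from the handbook or from HPCC Systems: the partitions and the word-count task are hypothetical stand-ins for blocks of an ultra-large data set and the analysis run over them.

    # Illustrative sketch only: a toy word count showing parallelism over
    # data partitions, with failed partitions re-run (simplified fault
    # tolerance). Not from the book or from any real platform's API.
    from collections import Counter
    from concurrent.futures import ProcessPoolExecutor

    def count_words(partition):
        """Map step: count word occurrences in one data partition."""
        return Counter(partition.split())

    def process_all(partitions, retries=2):
        """Fan partitions out to worker processes and merge the partial
        counts, re-submitting any partition whose worker failed."""
        total = Counter()
        pending = list(partitions)
        with ProcessPoolExecutor() as pool:
            for _ in range(retries + 1):
                futures = {pool.submit(count_words, p): p for p in pending}
                pending = []
                for future, partition in futures.items():
                    try:
                        total += future.result()   # reduce step: merge results
                    except Exception:
                        pending.append(partition)  # schedule a retry
                if not pending:
                    return total
        raise RuntimeError("some partitions failed after all retries")

    if __name__ == "__main__":
        # Hypothetical partitions standing in for blocks of a huge data set.
        parts = ["big data big insight", "data at scale", "scale out big data"]
        print(process_all(parts).most_common(3))

Real data-intensive platforms apply the same pattern at far greater scale: the data is partitioned across many machines, the computation moves to the data, and the system transparently re-executes work lost to hardware failures.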
Handbook of Data Intensive Computing is designed as a reference for practitioners and researchers, including programmers, computer and system infrastructure designers and developers. It is also useful for business managers, entrepreneurs and investors. Escalante and Furht also edited the Handbook of Cloud Computing (Springer, 2010).
Provided by Springer