A Hadoop Based Framework for Secure Medical Datasets

Harinder Singh

$69.95   $59.49

Paperback


English
Mohammed Abdul Sattar
10 February 2024
Developments in the medical field have led to the production of massive amounts of medical data. In 2002, the Department of Radiology of a hospital in Geneva was already producing more than 12,000 images per day. These medical datasets are available for further exploration and research, which can have a far-reaching impact on the planning and execution of health programs. The information extracted from exploring medical datasets paves the way for health administration, e-health diagnosis and therapy. There is therefore an urgent need to accentuate research on medical data.

Medical data is a large and rapidly growing domain, with datasets whose size is normally measured in terabytes. Such big data poses many challenges and issues owing to its volume, variety, velocity, value and variability. Moreover, traditional file management systems are slowing down because they are incapable of managing unstructured, variable and complex big data. Managing such data is a cumbersome and time-consuming task that requires new computing techniques, so the exponential growth of medical data has necessitated a paradigm shift in the way data is managed and processed. Recent technological advancements have changed how big data is stored and processed, which motivated us to seek new solutions for managing volumetric medical datasets and for extracting valuable information efficiently.

Hadoop is a top-level Apache project written in Java. It was developed by Doug Cutting as a collection of open-source projects and is now widely used to process massive amounts of unstructured data. With Hadoop, data that was previously difficult to analyze can be harnessed, since it can process extremely large datasets with changing structure. The Hadoop ecosystem comprises different modules such as HBase, Pig, HCatalog, Hive, ZooKeeper, Oozie and Kafka, but the two core paradigms for big data are the Hadoop Distributed File System (HDFS) and MapReduce.
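The MapReduce paradigm mentioned above splits a computation into a map phase, which emits key–value pairs from input records, and a reduce phase, which aggregates the values grouped under each key. The following is a minimal in-process sketch of that idea in Python; it is an illustration only, not Hadoop's actual Java API, and the sample records are hypothetical.

```python
from collections import defaultdict

def map_phase(records):
    """Emit (key, value) pairs from each input record, as a Hadoop mapper would."""
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Group values by key and aggregate each group, as a Hadoop reducer would."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: sum(values) for key, values in grouped.items()}

# Hypothetical sample records standing in for medical metadata.
records = ["CT scan", "MRI scan", "CT image"]
counts = reduce_phase(map_phase(records))
# counts -> {"ct": 2, "scan": 2, "mri": 1, "image": 1}
```

In a real Hadoop cluster the map and reduce tasks run in parallel across nodes, with HDFS storing the input splits and the framework shuffling intermediate pairs between the two phases.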

By:  
Imprint:   Mohammed Abdul Sattar
Dimensions:   Height: 279mm,  Width: 216mm,  Spine: 7mm
Weight:   304g
ISBN:   9798224231140
Pages:   124
Publication Date:  
Audience:   General/trade, ELT Advanced
Format:   Paperback
Publisher's Status:   Active