Big data - Hadoop in MetaCentrum
It is our pleasure to announce that MetaCentrum has commissioned a dedicated Hadoop cluster for big data processing. The environment is intended primarily for running MapReduce jobs that process large, typically unstructured data sets. The service comes with the usual extensions (Pig, Hive, HBase, YARN, …) and is fully integrated with the MetaCentrum infrastructure. It is available to all MetaCentrum users who register in the dedicated 'hadoop' group. The cluster currently consists of 27 nodes with a total of 432 CPUs, 3.5 TB of RAM and 1 PB of disk space in HDFS. Please find additional information, including links to a registration form and to a growing Wiki, at http://www.metacentrum.cz/en/hadoop/
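For illustration, the canonical MapReduce job — a word count — can be sketched in plain Python in the style commonly used with Hadoop Streaming. This is only a minimal sketch; the function names and sample input are our own and not part of the service, and on the cluster the map and reduce steps would run as separate scripts reading from stdin:

```python
from collections import defaultdict

def map_words(lines):
    """Map phase: emit a (word, 1) pair for every word in the input.
    Under Hadoop Streaming this would be a standalone mapper script
    writing tab-separated key/value lines to stdout."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_counts(pairs):
    """Reduce phase: sum the counts per word. Hadoop sorts and groups
    mapper output by key before the reducers run; the defaultdict
    emulates that grouping here."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

if __name__ == "__main__":
    sample = ["big data needs big clusters"]
    print(reduce_counts(map_words(sample)))
    # -> {'big': 2, 'data': 1, 'needs': 1, 'clusters': 1}
```

On the cluster itself, a job of this shape would normally be submitted as a JAR or via Hadoop Streaming; consult the Wiki linked below for the exact submission workflow.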
With best regards,
Ivana Krenkova & Zdenek Sustr, MetaCentrum
Ivana Křenková, Mon Mar 09 13:57:00 CET 2015

