Cloud Journal

Oracle Scales Its Big Data Platform And Unveils New NoSQL Database Release


Written by  Ravi Vadugu | 20 December 2012

Oracle has revised its big data platform with improved hardware, software, Hadoop connectors and a new release of Oracle NoSQL Database. The revision aims at increased performance, more flexible application development and tighter integration with Oracle Database and Hadoop.

The big data appliance now runs 8-core Intel Xeon E5-2600 series processors across its 18 compute and storage servers with 648 TB of raw storage. Oracle claims 33% more processing power from the resulting 288 CPU cores (18 nodes, each with two 8-core CPUs) compared with the previous generation, along with 33 percent more memory per node for 1.1 TB of total main memory and roughly 30% better energy efficiency.

The software updates include support for the latest version of Cloudera's Hadoop distribution (CDH 4.1), including upgrades developed collaboratively with Cloudera to simplify NameNode High Availability, eliminating the single point of failure in a Hadoop cluster. The revised platform also ships the latest versions of the Oracle Linux distribution and the Oracle JDK, along with an updated distribution of open source R optimized to work with high-performance multi-threaded math libraries.

Oracle NoSQL Database 2.0, built atop Oracle Berkeley DB as its underlying storage engine, adds efficient storage and retrieval of large objects such as documents and images, as well as dynamic elasticity and automatic rebalancing so storage and compute resources can be reallocated in response to changing production data processing requirements. The release also brings a C-based API library for managing large objects and tighter integration with Oracle Database, including the ability to query and view NoSQL data using SQL.
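As a rough illustration of the key-value programming model the database exposes, the sketch below uses the Oracle NoSQL Database Java driver to write and read back a single record. The store name "kvstore" and the helper host "localhost:5000" are placeholder assumptions for a local test deployment, not values taken from the article.

import java.util.Arrays;

import oracle.kv.KVStore;
import oracle.kv.KVStoreConfig;
import oracle.kv.KVStoreFactory;
import oracle.kv.Key;
import oracle.kv.Value;
import oracle.kv.ValueVersion;

public class NoSqlQuickStart {
    public static void main(String[] args) {
        // Connect to an assumed local store named "kvstore" reachable at localhost:5000.
        KVStoreConfig config = new KVStoreConfig("kvstore", "localhost:5000");
        KVStore store = KVStoreFactory.getStore(config);

        // Keys are hierarchical paths (major/minor components); values are opaque byte arrays.
        Key key = Key.createKey(Arrays.asList("users", "42"), Arrays.asList("profile"));
        Value value = Value.createValue("{\"name\":\"Alice\"}".getBytes());

        // Write the record, then read it back.
        store.put(key, value);
        ValueVersion vv = store.get(key);
        System.out.println(new String(vv.getValue().getValue()));

        store.close();
    }
}

In practice the helper hosts and store name come from the deployed topology, and the 2.0 features described above (large object streaming, elastic rebalancing) sit on top of this same basic key-value interface.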

“An influx of raw datasets is flooding every enterprise. However, before businesses can take advantage of the potential opportunity, they face a significant challenge in organizing these diverse data sources,” said Cetin Ozbutun, vice president, Data Warehousing and Big Data Technologies, Oracle. “The latest updates further improve the abilities of our customers to optimize big data workloads and integrate them with their data warehouses to easily analyze all data throughout the enterprise.”

Ravi Vadugu

An IT professional with over 12 years of experience. Project management is what I do. Curious about upcoming technologies, trends, software methodologies (e.g. Agile) and software tools. I love sharing knowledge with the rest of the community.
