Symantec Corp., working with Hortonworks, has introduced an add-on solution for its Cluster File System that enables customers to run Big Data analytics on their existing infrastructure. Because the Hadoop file system is replaced with Symantec's Cluster File System, every node in the cluster can access data simultaneously, eliminating both the performance bottleneck and the single point of failure.
Able to scale up to 16 PB of structured and unstructured data, the solution lets enterprises leverage their existing infrastructure by integrating current storage assets into the Hadoop processing framework. Organizations can run analytics where the data already resides, thereby avoiding painful data migrations and the costs that come with them.
The Symantec Enterprise Solution for Hadoop is available now to existing Cluster File System customers at no additional charge. It supports Hortonworks Data Platform (HDP) 1.0 and Apache Hadoop 1.0.2. Customers running HDP 1.0 can obtain Hadoop support and training from Symantec's partner Hortonworks, a leading commercial vendor promoting the innovation, development and support of Apache Hadoop.
“Our Enterprise Solution for Hadoop helps connect Hadoop’s business analytics to the existing storage environment while addressing key challenges of server sprawl and high availability for critical applications. It’s now entirely possible to get the Big Data solution you want from the infrastructure you’ve got,” said Don Angspatt, vice president of product management, Storage and Availability Management Group, Symantec Corp.