NetApp FAS NFS Connector for Hadoop

Run big data analytics on existing data stored on NFS-based systems using NetApp FAS NFS Connector for Hadoop.

Use NetApp® FAS NFS Connector for Hadoop to run big data analytics on NFSv3 data—without moving the data, creating a separate analytics silo, or setting up a Hadoop cluster. You can start analyzing existing data with Hadoop right away. Your IT staff can support Hadoop with ease. Workflows are simplified because you don’t have to copy and manage data across silos.

Leverage NetApp FAS NFS Connector for Hadoop to run proof-of-concept projects, then set up a full Hadoop cluster using NetApp Solutions for Hadoop.

FAS NFS Connector for Hadoop lets you swap out the Hadoop Distributed File System (HDFS) for NFS, or run NFS alongside HDFS. NetApp FAS NFS Connector for Hadoop works with MapReduce for compute and processing and supports other Apache projects, including HBase (columnar database) and Spark (a processing engine compatible with Hadoop). These capabilities let FAS NFS Connector support diverse workloads, including batch, in-memory, streaming, and more.
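Like other Hadoop-compatible file systems, a connector of this kind is typically wired in through Hadoop's configuration rather than application code, which is why MapReduce, HBase, and Spark jobs can run unchanged. The sketch below shows the general shape of such a core-site.xml entry; the property values, URI scheme, hostname, and implementation class shown are illustrative assumptions, so consult the connector's technical report for the authoritative settings.

```xml
<!-- core-site.xml (sketch): registering an NFS-backed Hadoop file system.
     The implementation class and nfs:// URIs below are illustrative assumptions. -->
<configuration>
  <!-- Map the nfs:// URI scheme to the connector's FileSystem implementation. -->
  <property>
    <name>fs.nfs.impl</name>
    <value>org.apache.hadoop.fs.nfs.NFSv3FileSystem</value>
  </property>
  <!-- Option A: replace HDFS by making NFS the default file system. -->
  <property>
    <name>fs.defaultFS</name>
    <value>nfs://filer.example.com:2049/</value>
  </property>
  <!-- Option B: keep HDFS as the default and run NFS alongside it by
       addressing NFS paths with explicit URIs, e.g.:
       hadoop fs -ls nfs://filer.example.com:2049/data -->
</configuration>
```

In the first case, existing jobs read and write the same paths they would on HDFS with no code changes; in the second, jobs can mix hdfs:// and nfs:// URIs in a single workflow.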

To learn more, download the NetApp FAS NFS Connector for Hadoop, or read the NetApp FAS NFS Connector for Hadoop technical report (PDF) or the Apache Hadoop on Data Fabric Enabled by NetApp technical report (PDF).