Snappy Installation on Ubuntu:
1. The g++, make, and build-essential packages must be installed before building Snappy. To install them, run the following commands in order:
$ sudo apt-get update
$ sudo apt-get install g++ make build-essential
2. Download the Snappy tarball from http://google.github.io/snappy/ and extract it. From the extracted Snappy directory, run the following commands in order:
$ ./configure
$ make
$ sudo make install
Snappy Compression Configuration For Hadoop:
Solution:
After installing Snappy on Ubuntu, copy the libsnappy*.so* files from /usr/local/lib into the $HADOOP_HOME/lib/native/ directory, and set the LD_LIBRARY_PATH and JAVA_LIBRARY_PATH environment variables (try this approach first).
Alternatively, there is a simpler method: nowadays the latest Hadoop distributions (for example, hadoop-3.0.0-alpha4) already ship with Snappy support. We can copy all lib*.* files from the new distribution's lib/native/ directory into our own Hadoop native library location, $HADOOP_HOME/lib/native/.
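The copy step described above can be sketched as follows. This is only a sketch: HADOOP_HOME is assumed to be set in your environment, and the $HOME/hadoop fallback below is a placeholder, not a real default.

```shell
# Sketch of the library copy described above.
# HADOOP_HOME should point at your Hadoop install; the fallback
# ($HOME/hadoop) is only a placeholder -- adjust it to your setup.
HADOOP_HOME="${HADOOP_HOME:-$HOME/hadoop}"
mkdir -p "$HADOOP_HOME/lib/native"
# Copy every Snappy shared library the build installed
# (default ./configure prefix is /usr/local).
for f in /usr/local/lib/libsnappy*.so*; do
  [ -e "$f" ] && cp "$f" "$HADOOP_HOME/lib/native/"
done
# List what is now in the native library directory.
ls "$HADOOP_HOME/lib/native"
```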
Define these two environment variables in the .bashrc file:
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
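The two exports above can be appended to ~/.bashrc from the command line like this; a sketch that checks first so repeated runs do not add duplicate lines:

```shell
# Append the two variables to ~/.bashrc only if they are not already there.
BASHRC="$HOME/.bashrc"
touch "$BASHRC"
grep -q 'JAVA_LIBRARY_PATH' "$BASHRC" || cat >> "$BASHRC" <<'EOF'
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
EOF
# Open a new terminal, or run: source ~/.bashrc
```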
Add the below property to core-site.xml:
<property>
  <name>io.compression.codecs</name>
  <value>
    org.apache.hadoop.io.compress.GzipCodec,
    org.apache.hadoop.io.compress.DefaultCodec,
    org.apache.hadoop.io.compress.BZip2Codec,
    org.apache.hadoop.io.compress.SnappyCodec
  </value>
</property>
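Optionally, Snappy can also be used to compress intermediate map output. A hedged example for mapred-site.xml (these property names are from the Hadoop 2.x/3.x configuration; verify them against your version's defaults):

```xml
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```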
Restart the Hadoop daemons and try it out. Running "hadoop checknative -a" should now report snappy: true.
Got this info from, and thanks to: http://hadooptutorial.info/snappy-compressiondecompression-tool/