Get output from scans in hbase shell

I know that this post is quite old, but I was searching for something about HBase myself and came across it. Well, I don't know if this is the best way to do it, but you can definitely use the scripting option HBase gives you. Just open a shell (preferably go to the directory bin … Read more
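
For reference, a minimal sketch of that scripting approach, assuming a table called my_table and a command file named scan_commands.txt (both placeholders): put the shell commands in a plain text file, run it through hbase shell from the bin directory, and redirect stdout to capture the scan output.

    # scan_commands.txt -- plain HBase shell commands (file, table, and column names are hypothetical)
    scan 'my_table', {COLUMNS => ['cf:qual'], LIMIT => 10}
    exit

    # run from HBase's bin directory and capture the scan output in a file
    ./hbase shell scan_commands.txt > scan_output.txt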

How to connect to remote HBase in Java?

Here's a snippet from a system we use, showing how we create the HTable we use to connect to HBase:

    Configuration hConf = HBaseConfiguration.create(conf);
    hConf.set(Constants.HBASE_CONFIGURATION_ZOOKEEPER_QUORUM, hbaseZookeeperQuorum);
    hConf.setInt(Constants.HBASE_CONFIGURATION_ZOOKEEPER_CLIENTPORT, hbaseZookeeperClientPort);
    HTable hTable = new HTable(hConf, tableName);

HTH

EDIT: Example values:

    public static final String HBASE_CONFIGURATION_ZOOKEEPER_QUORUM = "hbase.zookeeper.quorum";
    public static final String HBASE_CONFIGURATION_ZOOKEEPER_CLIENTPORT = "hbase.zookeeper.property.clientPort";
    ...
    hbaseZookeeperQuorum = "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com";
    hbaseZookeeperClientPort = 10000;
    tableName = "HBaseTableName";
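
As an aside: on HBase 1.0 and later the HTable constructor is deprecated in favour of the Connection API. A minimal sketch of the equivalent setup, reusing the same configuration keys and the placeholder quorum, port, and table name from the example values above, might look like this:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Table;

    public class RemoteHBaseConnect {
        public static void main(String[] args) throws Exception {
            // Point the client at the remote ZooKeeper quorum (hosts and port are placeholders)
            Configuration hConf = HBaseConfiguration.create();
            hConf.set("hbase.zookeeper.quorum", "PDHadoop1.corp.CompanyName.com,PDHadoop2.corp.CompanyName.com");
            hConf.setInt("hbase.zookeeper.property.clientPort", 10000);

            // A Connection is heavyweight: create it once, share it, and close it when done
            try (Connection connection = ConnectionFactory.createConnection(hConf);
                 Table table = connection.getTable(TableName.valueOf("HBaseTableName"))) {
                // table.get(...), table.put(...), table.getScanner(...) etc. go here
            }
        }
    }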

How to read from HBase using Spark

A basic example of reading HBase data using Spark (Scala); you can also write this in Java:

    import org.apache.hadoop.hbase.client.{HBaseAdmin, Result}
    import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.spark._

    object HBaseRead {
      def main(args: Array[String]) {
        val sparkConf = new SparkConf().setAppName("HBaseRead").setMaster("local[2]")
        val sc = new SparkContext(sparkConf)
        val conf = HBaseConfiguration.create()
        val … Read more
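
Since the excerpt above is cut off, here is a self-contained sketch of the same approach (TableInputFormat fed into newAPIHadoopRDD); the table name and the local[2] master URL are placeholders:

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Result
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat
    import org.apache.spark.{SparkConf, SparkContext}

    object HBaseRead {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("HBaseRead").setMaster("local[2]")
        val sc = new SparkContext(sparkConf)

        // Tell TableInputFormat which table to scan (table name is a placeholder)
        val conf = HBaseConfiguration.create()
        conf.set(TableInputFormat.INPUT_TABLE, "my_table")

        // Each record is a (row key, Result) pair
        val hBaseRDD = sc.newAPIHadoopRDD(
          conf,
          classOf[TableInputFormat],
          classOf[ImmutableBytesWritable],
          classOf[Result])

        println("Number of rows: " + hBaseRDD.count())
        sc.stop()
      }
    }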