HBase SQL statement fails with Insufficient permissions for user

ISSUE: The error below comes up when creating a new HBase table:

hbase(main):001:0> create 'anoop','cf1'
ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'anoop' (global, action=CREATE)
at org.apache.hadoop.hbase.security.access.AccessController.requirePermission(AccessController.java:426)
at org.apache.hadoop.hbase.security.access.AccessController.preCreateTable(AccessController.java:563)

Solution: Once the cluster has been secured, a user has to authenticate to Kerberos by doing a kinit. By default, hbase is a superuser who was … Continue reading HBase SQL statement fails with Insufficient permissions for user
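A minimal sketch of the usual fix: authenticate as the hbase superuser, then grant the failing user global Create permission from the HBase shell. The keytab path and principal below are assumptions — substitute the ones from your cluster.

```shell
# Authenticate as the hbase superuser (keytab path/principal are assumptions):
kinit -kt /etc/security/keytabs/hbase.headless.keytab hbase

# Grant global Create permission to user 'anoop' via the HBase shell.
# Permission letters: R=read, W=write, X=execute, C=create, A=admin.
echo "grant 'anoop', 'C'" | hbase shell

# Verify the grant:
echo "user_permission" | hbase shell
```

After this, the user's own `kinit` followed by `create 'anoop','cf1'` should succeed.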


HBASE snapshots How To

What is a Snapshot? A snapshot is a set of metadata information that allows an admin to get back to a previous state of the table. A snapshot is not a copy of the table; it’s just a list of file names and doesn’t copy the data. A full snapshot restore means that you get … Continue reading HBASE snapshots How To
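The snapshot lifecycle described above can be sketched as an HBase shell session. Table and snapshot names here are illustrative:

```shell
hbase shell <<'EOF'
snapshot 'mytable', 'mytable_snap1'         # take a snapshot (metadata only, no data copy)
list_snapshots                              # verify the snapshot exists
disable 'mytable'                           # table must be disabled before a restore
restore_snapshot 'mytable_snap1'            # roll the table back to the snapshot state
enable 'mytable'
clone_snapshot 'mytable_snap1', 'newtable'  # or clone into a brand-new table instead
delete_snapshot 'mytable_snap1'             # clean up when no longer needed
EOF
```

Note that `restore_snapshot` replaces the current table state in place, while `clone_snapshot` leaves the original table untouched.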

Crontab not working for kerberized hadoop

ISSUE: Cron jobs are not working on a Kerberos-enabled Hadoop cluster, throwing the error below.

ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

Solution: This is because there is no TTY or profile setup for the … Continue reading Crontab not working for kerberized hadoop
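A common remedy is to have the cron job obtain its own Kerberos ticket from a keytab before running the Hadoop command, since cron inherits no interactive login environment. The keytab path, principal, realm, and script names below are all assumptions for illustration:

```shell
# crontab entry -- run the import script nightly at 2 AM:
#   0 2 * * * /home/anoopk/run_import.sh >> /home/anoopk/import.log 2>&1

# /home/anoopk/run_import.sh
#!/bin/bash
# cron provides a minimal environment, so set PATH explicitly:
export PATH=/usr/bin:/bin:/opt/cloudera/parcels/CDH/bin

# Obtain a fresh TGT non-interactively from a keytab
# (path and principal are assumptions -- use your own):
kinit -kt /home/anoopk/anoopk.keytab anoopk@EXAMPLE.COM

# Now the Hadoop/Sqoop command has valid Kerberos credentials:
sqoop import --connect jdbc:mysql://dbhost/db --table mytable
```

The key point is that `kinit -kt` needs no TTY or password prompt, so it works under cron.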

OOZIE -Sample workflow creation with multiple actions

Requirement: To create an Oozie workflow with 4 actions. Actions in this workflow: a Pig action, a Shell action, a Pig action, and a MapReduce action.

STEP 1: Created 2 simple Pig scripts as below:

A = LOAD '/user/anoopk/input1/input_an_ri.txt';
STORE A INTO 'out2';

A = LOAD '/user/anoopk/sites.txt';
STORE A INTO 'out3';

Loaded hadoop-mapreduce-examples.jar to the HDFS folder Workflow/mr/lib/hadoop-mapreduce-examples.jar. The above script … Continue reading OOZIE -Sample workflow creation with multiple actions
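For orientation, a workflow.xml chaining the four actions might look like the sketch below. All action names, script names, and the shell command are hypothetical; only the action sequence (pig → shell → pig → map-reduce) comes from the post.

```xml
<!-- Sketch only: names, paths, and schema version are assumptions. -->
<workflow-app name="sample-wf" xmlns="uri:oozie:workflow:0.4">
  <start to="pig-1"/>
  <action name="pig-1">
    <pig>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>script1.pig</script>
    </pig>
    <ok to="shell-1"/>
    <error to="fail"/>
  </action>
  <action name="shell-1">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <exec>myscript.sh</exec>
      <file>myscript.sh</file>
    </shell>
    <ok to="pig-2"/>
    <error to="fail"/>
  </action>
  <action name="pig-2">
    <pig>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>script2.pig</script>
    </pig>
    <ok to="mr-1"/>
    <error to="fail"/>
  </action>
  <action name="mr-1">
    <map-reduce>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <!-- mapper/reducer classes from the examples jar would be set here -->
    </map-reduce>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail"><message>Workflow failed</message></kill>
  <end name="end"/>
</workflow-app>
```

Each action's `<ok to="..."/>` transition is what links the four actions into a single sequential workflow.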

Running Hadoop Benchmarking TestDFSIO on Cloudera Clusters

Hadoop provides a benchmarking mechanism for the cluster. The steps to benchmark a Cloudera cluster's file system are below.

Set HADOOP_HOME:
HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop/

Run TestDFSIO as below:
#hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.3.0-tests.jar TestDFSIO -write -nrFiles 10 -fileSize 1000
#hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.3.0-tests.jar TestDFSIO -read -nrFiles 10 -fileSize 1000

Once you run the test you will see the TestDFSIO_results.log file in … Continue reading Running Hadoop Benchmarking TestDFSIO on Cloudera Clusters
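The steps above can be sketched as one runnable sequence, including the `-clean` pass that removes the benchmark's HDFS test data afterwards. The jar path matches the CDH 4.3.0 parcel layout quoted in the post; adjust the version string for your cluster.

```shell
export HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop
JAR=/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.3.0-tests.jar

hadoop jar "$JAR" TestDFSIO -write -nrFiles 10 -fileSize 1000   # write 10 files of 1000 MB each
hadoop jar "$JAR" TestDFSIO -read  -nrFiles 10 -fileSize 1000   # read the same files back
cat TestDFSIO_results.log                                       # per-run throughput/IO-rate summary
hadoop jar "$JAR" TestDFSIO -clean                              # delete the benchmark data from HDFS
```

Run the write phase before the read phase, since `-read` consumes the files that `-write` produced.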