SAP Knowledge Base Article - Preview

2813515 - Folder Permissions Problem on HDFS /tmp/hive SAP Cloud Platform BDS

Symptom

After setting the correct folder permissions in Ranger for HDFS /tmp/hive, the user gets the following error when running a Hive job:

play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x;]]
        at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:255)
        at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:180)
        at play.core.server.AkkaHttpServer$$anonfun$3.applyOrElse(AkkaHttpServer.scala:320)
        at play.core.server.AkkaHttpServer$$anonfun$3.applyOrElse(AkkaHttpServer.scala:318)
        at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:346)
        at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:345)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
        at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
        at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x;
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
        at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
        at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
        at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
        at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
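The final frames (SessionState.createRootHDFSDir / createSessionDirs) show that Hive validates the POSIX permission bits reported for the root scratch directory /tmp/hive before starting a session. A Ranger policy that grants write access therefore does not by itself satisfy this check: the directory must actually be world-writable (conventionally mode 1777, i.e. drwxrwxrwt). As a minimal sketch only, and not the official resolution from the full KBA, the permission bits could be inspected and widened with the standard Hadoop FileSystem API, assuming the code runs with HDFS superuser or directory-owner rights (the object name FixHiveScratchDir below is purely illustrative):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.fs.permission.{FsAction, FsPermission}

// Illustrative sketch only; not the official KBA resolution steps.
object FixHiveScratchDir {
  def main(args: Array[String]): Unit = {
    // Picks up core-site.xml / hdfs-site.xml from the classpath.
    val conf = new Configuration()
    val fs   = FileSystem.get(conf)
    val dir  = new Path("/tmp/hive")

    // Reported POSIX bits; in the failing case this prints rwxr-xr-x.
    val current = fs.getFileStatus(dir).getPermission
    println(s"Current permissions on $dir: $current")

    // Hive expects the root scratch dir to be world-writable; rwxrwxrwx with
    // the sticky bit set (drwxrwxrwt, octal 1777) is the conventional setting.
    // Equivalent shell command: hdfs dfs -chmod 1777 /tmp/hive
    val worldWritable = new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL, true)
    if (!current.equals(worldWritable)) {
      fs.setPermission(dir, worldWritable)
      println(s"Updated permissions on $dir to ${fs.getFileStatus(dir).getPermission}")
    }
  }
}

Once the permission bits themselves are corrected, Ranger policies continue to govern fine-grained access, and the Hive job should pass the scratch-directory check.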


Environment

  • SAP Cloud Platform Big Data Services (BDS) 4.3
  • SAP Cloud Platform Big Data Services (BDS) 5.0

Product

SAP Big Data Services all versions; SAP S/4HANA Cloud 1905

Keywords

Altiscale, Portal, HDFS, Ranger, Hive, Spark, Pig, Tez, Oozie, Alation, MapReduce, SparkContext, YARN, KBA, BC-NEO-BDS, HCP Big Data Service, Problem
