public class LocalHiveSparkClient extends Object implements HiveSparkClient
| Modifier and Type | Field and Description |
|---|---|
| protected static org.slf4j.Logger | LOG |
| Modifier and Type | Method and Description |
|---|---|
| void | close() |
| SparkJobRef | execute(DriverContext driverContext, SparkWork sparkWork): generates a Spark RDD graph from the given sparkWork and driverContext and submits the RDD graph to the Spark cluster. |
| int | getDefaultParallelism(): for standalone mode, this can be used to get the total number of cores. |
| int | getExecutorCount() |
| static LocalHiveSparkClient | getInstance(org.apache.spark.SparkConf sparkConf, HiveConf hiveConf) |
| org.apache.spark.SparkConf | getSparkConf() |
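
A minimal sketch of how the client might be obtained and queried. The SparkConf values and the class name LocalClientExample are illustrative assumptions, not part of the API; only getInstance, getDefaultParallelism, getExecutorCount, getSparkConf, and close come from the interface documented above.

```java
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.exec.spark.LocalHiveSparkClient;
import org.apache.spark.SparkConf;

public class LocalClientExample {
    public static void main(String[] args) throws Exception {
        // Spark configuration for an in-process (local) master; values are illustrative.
        SparkConf sparkConf = new SparkConf()
                .setMaster("local[2]")
                .setAppName("hive-on-spark-local-example");

        // HiveConf picks up hive-site.xml from the classpath if one is present.
        HiveConf hiveConf = new HiveConf();

        // Obtain the client backed by a local SparkContext.
        LocalHiveSparkClient client = LocalHiveSparkClient.getInstance(sparkConf, hiveConf);
        try {
            // In standalone/local mode this reports the total number of cores.
            System.out.println("default parallelism = " + client.getDefaultParallelism());
            System.out.println("executor count = " + client.getExecutorCount());

            // The SparkConf actually used by the client.
            SparkConf effective = client.getSparkConf();
            System.out.println("spark.master = " + effective.get("spark.master"));
        } finally {
            // Releases the underlying Spark resources.
            client.close();
        }
    }
}
```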
public static LocalHiveSparkClient getInstance(org.apache.spark.SparkConf sparkConf, HiveConf hiveConf) throws FileNotFoundException, MalformedURLException

public org.apache.spark.SparkConf getSparkConf()
Specified by: getSparkConf in interface HiveSparkClient

public int getExecutorCount()
Specified by: getExecutorCount in interface HiveSparkClient

public int getDefaultParallelism() throws Exception
Description copied from interface: HiveSparkClient
For standalone mode, this can be used to get the total number of cores.
Specified by: getDefaultParallelism in interface HiveSparkClient
Throws: Exception

public SparkJobRef execute(DriverContext driverContext, SparkWork sparkWork) throws Exception
Description copied from interface: HiveSparkClient
Generates a Spark RDD graph from the given sparkWork and driverContext and submits the RDD graph to the Spark cluster.
Specified by: execute in interface HiveSparkClient
Throws: Exception

public void close()
Specified by: close in interface Closeable
Specified by: close in interface AutoCloseable
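
As a sketch of how execute() might be driven: the helper below submits a SparkWork plan that the caller has already obtained from Hive's query planner and hands back the resulting SparkJobRef. The class and method names (SubmitSparkWork, submit) are hypothetical, the DriverContext and SparkWork arguments are assumed to come from an existing Hive query compilation, and the getJobId() accessor on SparkJobRef is an assumption about that interface.

```java
import org.apache.hadoop.hive.ql.DriverContext;
import org.apache.hadoop.hive.ql.exec.spark.HiveSparkClient;
import org.apache.hadoop.hive.ql.exec.spark.status.SparkJobRef;
import org.apache.hadoop.hive.ql.plan.SparkWork;

public class SubmitSparkWork {
    // Submits an already-compiled SparkWork plan and returns the job reference
    // produced by execute(); building the DriverContext and SparkWork is Hive's
    // job, so both are taken as parameters here (hypothetical helper).
    static SparkJobRef submit(HiveSparkClient client,
                              DriverContext driverContext,
                              SparkWork sparkWork) throws Exception {
        // execute() turns the SparkWork into an RDD graph and submits it to
        // the Spark cluster (an in-process local cluster for this client).
        SparkJobRef jobRef = client.execute(driverContext, sparkWork);
        // getJobId() is assumed here as the job identifier accessor.
        System.out.println("Submitted Spark job " + jobRef.getJobId());
        return jobRef;
    }
}
```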