Uses of ExecException in org.apache.pig |
---|
Methods in org.apache.pig that throw ExecException | |
---|---|
protected PigStats |
PigServer.launchPlan(LogicalPlan lp,
String jobName)
A common method for launching the jobs according to the logical plan |
Constructors in org.apache.pig that throw ExecException | |
---|---|
PigServer(ExecType execType)
|
|
PigServer(ExecType execType,
org.apache.hadoop.conf.Configuration conf)
|
|
PigServer(ExecType execType,
Properties properties)
|
|
PigServer(PigContext context)
|
|
PigServer(PigContext context,
boolean connect)
|
|
PigServer(Properties properties)
|
|
PigServer(String execTypeString)
|
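The PigServer constructors and launchPlan above surface ExecException when the execution engine cannot be initialized or a launched job fails. A minimal client-side sketch, assuming local execution and a hypothetical input file path:

```java
import java.util.Iterator;
import java.util.Properties;

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;
import org.apache.pig.backend.executionengine.ExecException;
import org.apache.pig.data.Tuple;

public class PigServerExample {
    public static void main(String[] args) throws Exception {
        try {
            // Constructing a PigServer can throw ExecException if the
            // execution engine cannot be initialized.
            PigServer pig = new PigServer(ExecType.LOCAL, new Properties());

            // "input.txt" is a hypothetical path; point it at real data.
            pig.registerQuery("A = LOAD 'input.txt' AS (line:chararray);");

            // Launching the plan and iterating results can also fail with ExecException.
            Iterator<Tuple> it = pig.openIterator("A");
            while (it.hasNext()) {
                System.out.println(it.next());
            }
        } catch (ExecException e) {
            System.err.println("Pig execution failed: " + e.getMessage());
        }
    }
}
```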
Uses of ExecException in org.apache.pig.backend.executionengine |
---|
Methods in org.apache.pig.backend.executionengine that throw ExecException | |
---|---|
String |
ExecJob.getAlias()
Returns the alias of the relation generated by this job. |
void |
ExecJob.getLogs(OutputStream log)
Collects various forms of output. |
Iterator<Tuple> |
ExecJob.getResults()
If the query executed successfully, the results can be retrieved by iterating over them. |
void |
ExecJob.getSTDError(OutputStream error)
|
void |
ExecJob.getSTDOut(OutputStream out)
|
boolean |
ExecJob.hasCompleted()
Returns true if the physical plan has executed successfully and results are ready to be retrieved. |
void |
ExecutionEngine.init()
This method is responsible for the initialization of the ExecutionEngine. |
void |
ExecJob.kill()
Kills current job. |
PigStats |
ExecutionEngine.launchPig(LogicalPlan lp,
String grpName,
PigContext pc)
This method is responsible for the actual execution of a LogicalPlan. |
void |
ExecutionEngine.setConfiguration(Properties newConfiguration)
Responsible for updating the properties for the ExecutionEngine. |
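Most ExecJob methods above throw ExecException when job status or results cannot be retrieved. A hedged sketch of inspecting a job returned by PigServer.store (the alias "A" and output path "out" are illustrative names):

```java
import java.util.Iterator;

import org.apache.pig.PigServer;
import org.apache.pig.backend.executionengine.ExecException;
import org.apache.pig.backend.executionengine.ExecJob;
import org.apache.pig.data.Tuple;

public class ExecJobExample {
    static void storeAndInspect(PigServer pig) throws Exception {
        ExecJob job = pig.store("A", "out");
        try {
            if (job.hasCompleted()) {            // throws ExecException
                System.out.println("alias: " + job.getAlias());
                Iterator<Tuple> results = job.getResults();
                while (results.hasNext()) {
                    System.out.println(results.next());
                }
            } else {
                job.kill();                      // throws ExecException
            }
        } catch (ExecException e) {
            System.err.println("Could not inspect job: " + e.getMessage());
        }
    }
}
```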
Uses of ExecException in org.apache.pig.backend.hadoop |
---|
Methods in org.apache.pig.backend.hadoop that throw ExecException | |
---|---|
static byte |
HDataType.findTypeFromNullableWritable(PigNullableWritable o)
|
static PigNullableWritable |
HDataType.getWritableComparable(String className)
|
static PigNullableWritable |
HDataType.getWritableComparableTypes(byte type)
|
static PigNullableWritable |
HDataType.getWritableComparableTypes(Object o,
byte keyType)
|
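HDataType wraps plain Pig values into PigNullableWritable keys for the shuffle and throws ExecException for values it cannot handle. A small sketch, assuming an integer key:

```java
import org.apache.pig.backend.executionengine.ExecException;
import org.apache.pig.backend.hadoop.HDataType;
import org.apache.pig.data.DataType;
import org.apache.pig.impl.io.PigNullableWritable;

public class HDataTypeExample {
    static PigNullableWritable wrapIntKey(Object key) throws ExecException {
        // Wraps the value in the writable matching DataType.INTEGER;
        // an unsupported key type surfaces as ExecException.
        return HDataType.getWritableComparableTypes(key, DataType.INTEGER);
    }
}
```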
Uses of ExecException in org.apache.pig.backend.hadoop.accumulo |
---|
Methods in org.apache.pig.backend.hadoop.accumulo that throw ExecException | |
---|---|
protected Collection<org.apache.accumulo.core.data.Mutation> |
AccumuloStorage.getMutations(Tuple tuple)
|
protected abstract Collection<org.apache.accumulo.core.data.Mutation> |
AbstractAccumuloStorage.getMutations(Tuple tuple)
|
void |
AbstractAccumuloStorage.putNext(Tuple tuple)
|
Uses of ExecException in org.apache.pig.backend.hadoop.executionengine |
---|
Methods in org.apache.pig.backend.hadoop.executionengine that throw ExecException | |
---|---|
String |
HJob.getAlias()
|
org.apache.hadoop.mapred.JobConf |
HExecutionEngine.getExecConf(Properties properties)
|
void |
HJob.getLogs(OutputStream log)
|
Iterator<Tuple> |
HJob.getResults()
|
void |
HJob.getSTDError(OutputStream error)
|
void |
HJob.getSTDOut(OutputStream out)
|
boolean |
HJob.hasCompleted()
|
void |
HExecutionEngine.init()
|
void |
HJob.kill()
|
PigStats |
HExecutionEngine.launchPig(LogicalPlan lp,
String grpName,
PigContext pc)
|
void |
HExecutionEngine.setConfiguration(Properties newConfiguration)
|
Uses of ExecException in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer |
---|
Methods in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer that throw ExecException | |
---|---|
PigStats |
MapReduceLauncher.launchPig(PhysicalPlan php,
String grpName,
PigContext pc)
|
Constructors in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer that throw ExecException | |
---|---|
MergeJoinIndexer(String funcSpec,
String innerPlan,
String serializedPhyPlan,
String udfCntxtSignature,
String scope,
String ignoreNulls)
|
Uses of ExecException in org.apache.pig.backend.hadoop.executionengine.physicalLayer |
---|
Methods in org.apache.pig.backend.hadoop.executionengine.physicalLayer that throw ExecException | |
---|---|
Result |
PhysicalOperator.getNext(byte dataType)
Implementations that call into the different versions of getNext are often identical, differing only in the signature of the getNext() call they make. |
Result |
PhysicalOperator.getNextBigDecimal()
|
Result |
PhysicalOperator.getNextBigInteger()
|
Result |
PhysicalOperator.getNextBoolean()
|
Result |
PhysicalOperator.getNextDataBag()
|
Result |
PhysicalOperator.getNextDataByteArray()
|
Result |
PhysicalOperator.getNextDateTime()
|
Result |
PhysicalOperator.getNextDouble()
|
Result |
PhysicalOperator.getNextFloat()
|
Result |
PhysicalOperator.getNextInteger()
|
Result |
PhysicalOperator.getNextLong()
|
Result |
PhysicalOperator.getNextMap()
|
Result |
PhysicalOperator.getNextString()
|
Result |
PhysicalOperator.getNextTuple()
|
Result |
PhysicalOperator.processInput()
A generic method for parsing input that either returns the attached input if it exists or fetches it from its predecessor. |
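The getNext* family and processInput above define the pull-based contract that physical operators follow. Below is a simplified, illustrative sketch of the conventional loop an operator's getNextTuple typically runs; Result and POStatus are the real result and status types, while the operator wiring is a hypothetical stand-in:

```java
import org.apache.pig.backend.executionengine.ExecException;
import org.apache.pig.backend.hadoop.executionengine.physicalLayer.POStatus;
import org.apache.pig.backend.hadoop.executionengine.physicalLayer.Result;

// Illustrative only: a real operator extends PhysicalOperator and is wired into a
// PhysicalPlan; this sketch shows just the shape of the getNextTuple loop.
class GetNextLoopSketch {
    // Hypothetical stand-in for the piece of PhysicalOperator this sketch relies on.
    interface InputSource {
        Result processInput() throws ExecException;
    }

    Result getNextTuple(InputSource op) throws ExecException {
        while (true) {
            // processInput either returns the attached input or pulls from the predecessor.
            Result input = op.processInput();
            if (input.returnStatus == POStatus.STATUS_EOP
                    || input.returnStatus == POStatus.STATUS_ERR) {
                return input;             // end of processing, or an error to propagate
            }
            if (input.returnStatus == POStatus.STATUS_NULL) {
                continue;                 // nothing usable this round; pull again
            }
            // STATUS_OK: input.result holds the Tuple; a real operator would
            // transform it here before handing it downstream.
            return input;
        }
    }
}
```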
Uses of ExecException in org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators |
---|
Uses of ExecException in org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators |
---|
Methods in org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators that throw ExecException | |
---|---|
protected Result |
POCounter.addCounterValue(Result input)
Add current task id and local counter value. |
Result |
PORank.addRank(Result input)
Reads the output tuple from POCounter and the cumulative sum previously calculated. |
void |
Packager.attachInput(Object key,
DataBag[] bags,
boolean[] readOnce)
|
void |
JoinPackager.attachInput(Object key,
DataBag[] bags,
boolean[] readOnce)
|
protected Tuple |
POLocalRearrange.constructLROutput(List<Result> resLst,
List<Result> secondaryResLst,
Tuple value)
|
protected Tuple |
POPreCombinerLocalRearrange.constructLROutput(List<Result> resLst,
Tuple value)
|
protected Tuple |
POCollectedGroup.constructOutput(List<Result> resLst,
Tuple value)
|
protected DataBag |
POPartitionRearrange.constructPROutput(List<Result> resLst,
Tuple value)
|
protected Tuple |
POForEach.createTuple(Object[] data)
|
Object |
Packager.getKey(PigNullableWritable key)
|
protected Object |
POLocalRearrange.getKeyFromResult(List<Result> resLst,
byte type)
|
Result |
Packager.getNext()
|
Result |
MultiQueryPackager.getNext()
Constructs the output tuple from the inputs. |
Result |
LitePackager.getNext()
Similar to POPackage.getNext except that only one input is expected with index 0 and ReadOnceBag is used instead of DefaultDataBag. |
Result |
JoinPackager.getNext()
Calls getNext to get the next ForEach result. |
Result |
CombinerPackager.getNext()
|
Result |
POStream.getNextHelper(Tuple t)
|
Result |
POUnion.getNextTuple()
Follows the single-threaded shared execution model, with execution being passed around each non-drained input. |
Result |
POStream.getNextTuple()
|
Result |
POStore.getNextTuple()
|
Result |
POSplit.getNextTuple()
|
Result |
POSortedDistinct.getNextTuple()
|
Result |
POSort.getNextTuple()
|
Result |
PORank.getNextTuple()
|
Result |
POPreCombinerLocalRearrange.getNextTuple()
Calls getNext on the generate operator inside the nested physical plan. |
Result |
POPartitionRearrange.getNextTuple()
Calls getNext on the generate operator inside the nested physical plan. |
Result |
POPartialAgg.getNextTuple()
|
Result |
POPackage.getNextTuple()
From the inputs, constructs the output tuple for this co-group in the required format which is (key, {bag of tuples from input 1}, {bag of tuples from input 2}, ...) |
Result |
POOptimizedForEach.getNextTuple()
Calls getNext on the generate operator inside the nested physical plan and returns it maintaining an additional state to denote the begin and end of the nested plan processing. |
Result |
POMergeJoin.getNextTuple()
|
Result |
POMergeCogroup.getNextTuple()
|
Result |
POLocalRearrange.getNextTuple()
Calls getNext on the generate operator inside the nested physical plan. |
Result |
POLoad.getNextTuple()
The main method used by this operator's successor to read tuples from the specified file using the specified load function. |
Result |
POLimit.getNextTuple()
Counts the number of tuples processed in the static variable soFar; if the number of tuples processed reaches the limit, returns EOP, otherwise returns the tuple. |
Result |
POForEach.getNextTuple()
Calls getNext on the generate operator inside the nested physical plan and returns it maintaining an additional state to denote the begin and end of the nested plan processing. |
Result |
POFilter.getNextTuple()
Attaches the processed input tuple to the expression plan and checks whether the comparison operator returns true. |
Result |
POFRJoin.getNextTuple()
|
Result |
PODistinct.getNextTuple()
|
Result |
PODemux.getNextTuple()
|
Result |
POCross.getNextTuple()
|
Result |
POCounter.getNextTuple()
|
Result |
POCollectedGroup.getNextTuple()
|
Tuple |
Packager.getValueTuple(PigNullableWritable keyWritable,
NullableTuple ntup,
int index)
|
Tuple |
MultiQueryPackager.getValueTuple(PigNullableWritable keyWritable,
NullableTuple ntup,
int index)
|
Tuple |
LitePackager.getValueTuple(PigNullableWritable keyWritable,
NullableTuple ntup,
int index)
Makes use of the superclass method, but this requires an additional parameter key passed by ReadOnceBag. |
Tuple |
CombinerPackager.getValueTuple(PigNullableWritable keyWritable,
NullableTuple ntup,
int index)
|
protected Result |
POForEach.processPlan()
|
void |
POLocalRearrange.setIndex(int index)
Sets the co-group index of this operator |
void |
POLocalRearrange.setMultiQueryIndex(int index)
Sets the multi-query index of this operator |
void |
POMergeJoin.throwProcessingException(boolean withCauseException,
Exception e)
|
Constructors in org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators that throw ExecException | |
---|---|
POFRJoin(OperatorKey k,
int rp,
List<PhysicalOperator> inp,
List<List<PhysicalPlan>> ppLists,
List<List<Byte>> keyTypes,
FileSpec[] replFiles,
int fragment,
boolean isLeftOuter,
Tuple nullTuple)
|
|
POFRJoin(OperatorKey k,
int rp,
List<PhysicalOperator> inp,
List<List<PhysicalPlan>> ppLists,
List<List<Byte>> keyTypes,
FileSpec[] replFiles,
int fragment,
boolean isLeftOuter,
Tuple nullTuple,
Schema[] inputSchemas,
Schema[] keySchemas)
|
Uses of ExecException in org.apache.pig.backend.hadoop.executionengine.util |
---|
Methods in org.apache.pig.backend.hadoop.executionengine.util that throw ExecException | |
---|---|
static FileSpec |
MapRedUtil.checkLeafIsStore(PhysicalPlan plan,
PigContext pigContext)
|
Uses of ExecException in org.apache.pig.backend.hadoop.streaming |
---|
Methods in org.apache.pig.backend.hadoop.streaming that throw ExecException | |
---|---|
void |
HadoopExecutableManager.configure(POStream stream)
|
Uses of ExecException in org.apache.pig.builtin |
---|
Methods in org.apache.pig.builtin that throw ExecException | |
---|---|
protected static Tuple |
LongAvg.combine(DataBag values)
|
protected static Tuple |
IntAvg.combine(DataBag values)
|
protected static Tuple |
FloatAvg.combine(DataBag values)
|
protected static Tuple |
DoubleAvg.combine(DataBag values)
|
protected static Tuple |
BigIntegerAvg.combine(DataBag values)
|
protected static Tuple |
BigDecimalAvg.combine(DataBag values)
|
protected static Tuple |
AVG.combine(DataBag values)
|
static void |
CubeDimensions.convertNullToUnknown(Tuple tuple)
|
protected static long |
LongAvg.count(Tuple input)
|
protected static long |
IntAvg.count(Tuple input)
|
protected static long |
FloatAvg.count(Tuple input)
|
protected static long |
DoubleAvg.count(Tuple input)
|
protected static BigInteger |
BigIntegerAvg.count(Tuple input)
|
protected static BigDecimal |
BigDecimalAvg.count(Tuple input)
|
protected static long |
AVG.count(Tuple input)
|
protected static Long |
AlgebraicLongMathBase.doTupleWork(Tuple input,
AlgebraicMathBase.KnownOpProvider opProvider)
|
protected static Integer |
AlgebraicIntMathBase.doTupleWork(Tuple input,
AlgebraicMathBase.KnownOpProvider opProvider)
|
protected static Float |
AlgebraicFloatMathBase.doTupleWork(Tuple input,
AlgebraicMathBase.KnownOpProvider opProvider)
|
protected static Double |
AlgebraicDoubleMathBase.doTupleWork(Tuple input,
AlgebraicMathBase.KnownOpProvider opProvider)
|
protected static BigInteger |
AlgebraicBigIntegerMathBase.doTupleWork(Tuple input,
AlgebraicMathBase.KnownOpProvider opProvider)
|
protected static BigDecimal |
AlgebraicBigDecimalMathBase.doTupleWork(Tuple input,
AlgebraicMathBase.KnownOpProvider opProvider)
|
protected static Double |
AlgebraicByteArrayMathBase.doTupleWork(Tuple input,
AlgebraicMathBase.KnownOpProvider opProvider,
byte expectedType)
|
protected static String |
StringMax.max(Tuple input)
|
protected static org.joda.time.DateTime |
DateTimeMax.max(Tuple input)
|
protected static String |
StringMin.min(Tuple input)
|
protected static org.joda.time.DateTime |
DateTimeMin.min(Tuple input)
|
protected static Long |
LongAvg.sum(Tuple input)
|
protected static Long |
IntAvg.sum(Tuple input)
|
protected static Double |
FloatAvg.sum(Tuple input)
|
protected static Double |
DoubleAvg.sum(Tuple input)
|
protected static Long |
COUNT_STAR.sum(Tuple input)
|
protected static Long |
COUNT.sum(Tuple input)
|
protected static BigInteger |
BigIntegerAvg.sum(Tuple input)
|
protected static BigDecimal |
BigDecimalAvg.sum(Tuple input)
|
protected static Double |
AVG.sum(Tuple input)
|
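The AVG/COUNT/MIN/MAX helpers above all read fields out of tuples, which is where ExecException originates: Tuple.get and the DataType coercions declare it. A minimal sketch of a custom EvalFunc showing the same pattern (SumFirstField is a hypothetical UDF, not part of Pig):

```java
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.DataBag;
import org.apache.pig.data.DataType;
import org.apache.pig.data.Tuple;

// Hypothetical UDF: sums the first field of every tuple in a bag, the same
// way the builtin aggregate helpers (e.g. LongAvg.sum) walk their input bag.
public class SumFirstField extends EvalFunc<Long> {
    @Override
    public Long exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0) {
            return null;
        }
        // Tuple.get and DataType.toLong both declare ExecException, which is
        // an IOException and therefore propagates through exec unchanged.
        DataBag bag = (DataBag) input.get(0);
        long sum = 0;
        for (Tuple t : bag) {
            Long v = DataType.toLong(t.get(0));
            if (v != null) {
                sum += v;
            }
        }
        return sum;
    }
}
```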
Uses of ExecException in org.apache.pig.data |
---|
Subclasses of ExecException in org.apache.pig.data | |
---|---|
class |
FieldIsNullException
|
Methods in org.apache.pig.data that throw ExecException | |
---|---|
static Schema.FieldSchema |
DataType.determineFieldSchema(Object o)
Determine the field schema of an object |
static Schema.FieldSchema |
DataType.determineFieldSchema(ResourceSchema.ResourceFieldSchema rcFieldSchema)
Determine the field schema of a ResourceFieldSchema |
protected abstract BigDecimal |
SchemaTuple.generatedCodeGetBigDecimal(int fieldNum)
|
protected abstract BigInteger |
SchemaTuple.generatedCodeGetBigInteger(int fieldNum)
|
protected abstract boolean |
SchemaTuple.generatedCodeGetBoolean(int fieldNum)
|
protected abstract byte[] |
SchemaTuple.generatedCodeGetBytes(int fieldNum)
|
protected abstract DataBag |
SchemaTuple.generatedCodeGetDataBag(int fieldNum)
|
protected abstract org.joda.time.DateTime |
SchemaTuple.generatedCodeGetDateTime(int fieldNum)
|
protected abstract double |
SchemaTuple.generatedCodeGetDouble(int fieldNum)
|
abstract Object |
SchemaTuple.generatedCodeGetField(int fieldNum)
|
protected abstract float |
SchemaTuple.generatedCodeGetFloat(int fieldNum)
|
protected abstract int |
SchemaTuple.generatedCodeGetInt(int fieldNum)
|
protected abstract long |
SchemaTuple.generatedCodeGetLong(int fieldNum)
|
protected abstract Map<String,Object> |
SchemaTuple.generatedCodeGetMap(int fieldNum)
|
protected abstract String |
SchemaTuple.generatedCodeGetString(int fieldNum)
|
protected abstract Tuple |
SchemaTuple.generatedCodeGetTuple(int fieldNum)
|
protected abstract SchemaTuple<T> |
SchemaTuple.generatedCodeSet(SchemaTuple<?> t,
boolean checkType)
|
protected abstract void |
SchemaTuple.generatedCodeSetBigDecimal(int fieldNum,
BigDecimal val)
|
protected abstract void |
SchemaTuple.generatedCodeSetBigInteger(int fieldNum,
BigInteger val)
|
protected abstract void |
SchemaTuple.generatedCodeSetBoolean(int fieldNum,
boolean val)
|
protected abstract void |
SchemaTuple.generatedCodeSetBytes(int fieldNum,
byte[] val)
|
protected abstract void |
SchemaTuple.generatedCodeSetDataBag(int fieldNum,
DataBag val)
|
protected abstract void |
SchemaTuple.generatedCodeSetDateTime(int fieldNum,
org.joda.time.DateTime val)
|
protected abstract void |
SchemaTuple.generatedCodeSetDouble(int fieldNum,
double val)
|
abstract void |
SchemaTuple.generatedCodeSetField(int fieldNum,
Object val)
|
protected abstract void |
SchemaTuple.generatedCodeSetFloat(int fieldNum,
float val)
|
protected abstract void |
SchemaTuple.generatedCodeSetInt(int fieldNum,
int val)
|
protected abstract void |
SchemaTuple.generatedCodeSetIterator(Iterator<Object> l)
|
protected abstract void |
SchemaTuple.generatedCodeSetLong(int fieldNum,
long val)
|
protected abstract void |
SchemaTuple.generatedCodeSetMap(int fieldNum,
Map<String,Object> val)
|
protected abstract void |
SchemaTuple.generatedCodeSetString(int fieldNum,
String val)
|
protected abstract void |
SchemaTuple.generatedCodeSetTuple(int fieldNum,
Tuple val)
|
Object |
Tuple.get(int fieldNum)
Get the value in a given field. |
Object |
TargetedTuple.get(int fieldNum)
|
Object |
SchemaTuple.get(int fieldNum)
|
Object |
DefaultTuple.get(int fieldNum)
Get the value in a given field. |
Object |
AppendableSchemaTuple.get(int fieldNum)
|
protected Object |
AppendableSchemaTuple.getAppendedField(int i)
|
BigDecimal |
TypeAwareTuple.getBigDecimal(int idx)
|
BigDecimal |
SchemaTuple.getBigDecimal(int fieldNum)
|
BigInteger |
TypeAwareTuple.getBigInteger(int idx)
|
BigInteger |
SchemaTuple.getBigInteger(int fieldNum)
|
boolean |
TypeAwareTuple.getBoolean(int idx)
|
boolean |
SchemaTuple.getBoolean(int fieldNum)
|
byte[] |
TypeAwareTuple.getBytes(int idx)
|
byte[] |
SchemaTuple.getBytes(int fieldNum)
|
DataBag |
TypeAwareTuple.getDataBag(int idx)
|
DataBag |
SchemaTuple.getDataBag(int fieldNum)
|
org.joda.time.DateTime |
TypeAwareTuple.getDateTime(int idx)
|
org.joda.time.DateTime |
SchemaTuple.getDateTime(int fieldNum)
|
double |
TypeAwareTuple.getDouble(int idx)
|
double |
SchemaTuple.getDouble(int fieldNum)
|
float |
TypeAwareTuple.getFloat(int idx)
|
float |
SchemaTuple.getFloat(int fieldNum)
|
abstract byte |
SchemaTuple.getGeneratedCodeFieldType(int fieldNum)
|
int |
TypeAwareTuple.getInt(int idx)
|
int |
SchemaTuple.getInt(int fieldNum)
|
long |
TypeAwareTuple.getLong(int idx)
|
long |
SchemaTuple.getLong(int fieldNum)
|
Map<String,Object> |
TypeAwareTuple.getMap(int idx)
|
Map<String,Object> |
SchemaTuple.getMap(int fieldNum)
|
String |
TypeAwareTuple.getString(int idx)
|
String |
SchemaTuple.getString(int fieldNum)
|
Tuple |
TypeAwareTuple.getTuple(int idx)
|
Tuple |
SchemaTuple.getTuple(int fieldNum)
|
byte |
Tuple.getType(int fieldNum)
Find the type of a given field. |
byte |
TargetedTuple.getType(int fieldNum)
|
byte |
SchemaTuple.getType(int fieldNum)
|
byte |
AppendableSchemaTuple.getType(int fieldNum)
|
byte |
AbstractTuple.getType(int fieldNum)
Find the type of a given field. |
protected Object |
SchemaTuple.getTypeAwareBase(int fieldNum,
String type)
|
protected Object |
AppendableSchemaTuple.getTypeAwareBase(int fieldNum,
String type)
|
abstract boolean |
SchemaTuple.isGeneratedCodeFieldNull(int fieldNum)
|
boolean |
Tuple.isNull(int fieldNum)
Find out if a given field is null. |
boolean |
SchemaTuple.isNull(int fieldNum)
|
boolean |
AppendableSchemaTuple.isNull(int fieldNum)
|
boolean |
AbstractTuple.isNull(int fieldNum)
Find out if a given field is null. |
Object |
InterSedes.readDatum(DataInput in)
Get the next object from DataInput in |
static Object |
DataReaderWriter.readDatum(DataInput in)
|
Object |
BinInterSedes.readDatum(DataInput in)
|
Object |
InterSedes.readDatum(DataInput in,
byte type)
Get the next object of the given type from the DataInput in; the type information has already been read from the DataInput. |
static Object |
DataReaderWriter.readDatum(DataInput in,
byte type)
|
Object |
BinInterSedes.readDatum(DataInput in,
byte type)
Expects binInterSedes data types (NOT DataType types!) |
void |
Tuple.set(int fieldNum,
Object val)
Set the value in a given field. |
void |
TargetedTuple.set(int fieldNum,
Object val)
|
void |
SchemaTuple.set(int fieldNum,
Object val)
|
void |
DefaultTuple.set(int fieldNum,
Object val)
Set the value in a given field. |
void |
AppendableSchemaTuple.set(int fieldNum,
Object val)
|
SchemaTuple<T> |
SchemaTuple.set(List<Object> l)
|
SchemaTuple<T> |
AppendableSchemaTuple.set(List<Object> l)
|
SchemaTuple<T> |
SchemaTuple.set(SchemaTuple<?> t)
|
protected SchemaTuple<T> |
SchemaTuple.set(SchemaTuple<?> t,
boolean checkType)
|
protected SchemaTuple<T> |
AppendableSchemaTuple.set(SchemaTuple<?> t,
boolean checkType)
|
SchemaTuple<T> |
SchemaTuple.set(Tuple t)
|
protected SchemaTuple<T> |
SchemaTuple.set(Tuple t,
boolean checkType)
|
void |
TypeAwareTuple.setBigDecimal(int idx,
BigDecimal val)
|
void |
SchemaTuple.setBigDecimal(int fieldNum,
BigDecimal val)
|
void |
TypeAwareTuple.setBigInteger(int idx,
BigInteger val)
|
void |
SchemaTuple.setBigInteger(int fieldNum,
BigInteger val)
|
void |
TypeAwareTuple.setBoolean(int idx,
boolean val)
|
void |
SchemaTuple.setBoolean(int fieldNum,
boolean val)
|
void |
TypeAwareTuple.setBytes(int idx,
byte[] val)
|
void |
SchemaTuple.setBytes(int fieldNum,
byte[] val)
|
void |
TypeAwareTuple.setDataBag(int idx,
DataBag val)
|
void |
SchemaTuple.setDataBag(int fieldNum,
DataBag val)
|
void |
TypeAwareTuple.setDateTime(int idx,
org.joda.time.DateTime val)
|
void |
SchemaTuple.setDateTime(int fieldNum,
org.joda.time.DateTime val)
|
void |
TypeAwareTuple.setDouble(int idx,
double val)
|
void |
SchemaTuple.setDouble(int fieldNum,
double val)
|
void |
TypeAwareTuple.setFloat(int idx,
float val)
|
void |
SchemaTuple.setFloat(int fieldNum,
float val)
|
void |
TypeAwareTuple.setInt(int idx,
int val)
|
void |
SchemaTuple.setInt(int fieldNum,
int val)
|
void |
TypeAwareTuple.setLong(int idx,
long val)
|
void |
SchemaTuple.setLong(int fieldNum,
long val)
|
void |
TypeAwareTuple.setMap(int idx,
Map<String,Object> val)
|
void |
SchemaTuple.setMap(int fieldNum,
Map<String,Object> val)
|
void |
TypeAwareTuple.setString(int idx,
String val)
|
void |
SchemaTuple.setString(int fieldNum,
String val)
|
void |
TypeAwareTuple.setTuple(int idx,
Tuple val)
|
void |
SchemaTuple.setTuple(int fieldNum,
Tuple val)
|
protected void |
SchemaTuple.setTypeAwareBase(int fieldNum,
Object val,
String type)
|
protected void |
AppendableSchemaTuple.setTypeAwareBase(int fieldNum,
Object val,
String type)
|
static DataBag |
DataType.toBag(Object o)
If this object is a bag, return it as a bag. |
static BigDecimal |
DataType.toBigDecimal(Object o)
|
static BigDecimal |
DataType.toBigDecimal(Object o,
byte type)
|
static BigInteger |
DataType.toBigInteger(Object o)
|
static BigInteger |
DataType.toBigInteger(Object o,
byte type)
|
static Boolean |
DataType.toBoolean(Object o)
|
static Boolean |
DataType.toBoolean(Object o,
byte type)
Force a data object to a Boolean, if possible. |
static byte[] |
DataType.toBytes(Object o)
|
static byte[] |
DataType.toBytes(Object o,
byte type)
|
static org.joda.time.DateTime |
DataType.toDateTime(Object o)
|
static org.joda.time.DateTime |
DataType.toDateTime(Object o,
byte type)
Force a data object to a DateTime, if possible. |
String |
Tuple.toDelimitedString(String delim)
Write a tuple of values into a string. |
String |
AbstractTuple.toDelimitedString(String delim)
Write a tuple of values into a string. |
static Double |
DataType.toDouble(Object o)
Force a data object to a Double, if possible. |
static Double |
DataType.toDouble(Object o,
byte type)
Force a data object to a Double, if possible. |
static Float |
DataType.toFloat(Object o)
Force a data object to a Float, if possible. |
static Float |
DataType.toFloat(Object o,
byte type)
Force a data object to a Float, if possible. |
static Integer |
DataType.toInteger(Object o)
Force a data object to an Integer, if possible. |
static Integer |
DataType.toInteger(Object o,
byte type)
Force a data object to an Integer, if possible. |
static Long |
DataType.toLong(Object o)
Force a data object to a Long, if possible. |
static Long |
DataType.toLong(Object o,
byte type)
Force a data object to a Long, if possible. |
static Map<String,Object> |
DataType.toMap(Object o)
If this object is a map, return it as a map. |
static String |
DataType.toString(Object o)
Force a data object to a String, if possible. |
static String |
DataType.toString(Object o,
byte type)
Force a data object to a String, if possible. |
static Tuple |
DataType.toTuple(Object o)
If this object is a tuple, return it as a tuple. |
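Almost every accessor above reports a bad field index or an impossible coercion as ExecException. A short sketch of defensive field access using the DataType helpers (the assumed field layout, an id followed by a name, is illustrative):

```java
import org.apache.pig.backend.executionengine.ExecException;
import org.apache.pig.data.DataType;
import org.apache.pig.data.Tuple;

public class TupleAccessExample {
    // Assumes field 0 is numeric and field 1 is a chararray; adjust as needed.
    static String describe(Tuple t) {
        try {
            Integer id = DataType.toInteger(t.get(0)); // coercion may throw ExecException
            String name = DataType.toString(t.get(1));
            byte type = t.getType(0);                  // also declares ExecException
            return id + "/" + name + " (field 0 type: " + DataType.findTypeName(type) + ")";
        } catch (ExecException e) {
            return "unreadable tuple: " + e.getMessage();
        }
    }
}
```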
Uses of ExecException in org.apache.pig.impl |
---|
Methods in org.apache.pig.impl that throw ExecException | |
---|---|
void |
PigContext.connect()
|
|
ExecutableManager |
PigContext.createExecutableManager()
Create a new ExecutableManager depending on the ExecType. |
|
static <T> T |
PigContext.instantiateObjectFromParams(org.apache.hadoop.conf.Configuration conf,
String classParamKey,
String argParamKey,
Class<T> clazz)
A common Pig pattern for initializing objects via system properties is to support passing something like this on the command line: -Dpig.notification.listener=MyClass
-Dpig.notification.listener.arg=myConstructorStringArg
This method will properly initialize the class with the args, if they exist. |
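The instantiateObjectFromParams description above captures a how-to: a class name and an optional single String constructor argument are read from two configuration keys. A hedged sketch of that pattern; MyListener and the my.listener keys are illustrative stand-ins, not Pig names:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.pig.backend.executionengine.ExecException;
import org.apache.pig.impl.PigContext;

public class ListenerFactoryExample {
    // Hypothetical interface standing in for whatever type the configured
    // class is expected to implement.
    public interface MyListener {
        void onEvent(String what);
    }

    static MyListener createConfiguredListener(Configuration conf) throws ExecException {
        // Looks up -Dmy.listener=SomeClass and -Dmy.listener.arg=ctorArg in conf
        // and instantiates the class with the single String argument, if present.
        return PigContext.instantiateObjectFromParams(
                conf, "my.listener", "my.listener.arg", MyListener.class);
    }
}
```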
Uses of ExecException in org.apache.pig.impl.builtin |
---|
Methods in org.apache.pig.impl.builtin that throw ExecException | |
---|---|
void |
SampleLoader.computeSamples(ArrayList<Pair<FileSpec,Boolean>> inputs,
PigContext pc)
|
Object |
IdentityColumn.exec(Tuple input)
|
Constructors in org.apache.pig.impl.builtin that throw ExecException | |
---|---|
StreamingUDF(String language,
String filePath,
String funcName,
String outputSchemaString,
String schemaLineNumber,
String execType,
String isIllustrate)
|
Uses of ExecException in org.apache.pig.impl.streaming |
---|
Subclasses of ExecException in org.apache.pig.impl.streaming | |
---|---|
class |
StreamingUDFException
|
Methods in org.apache.pig.impl.streaming that throw ExecException | |
---|---|
void |
ExecutableManager.configure(POStream stream)
Configure and initialize the ExecutableManager. |
static InputHandler |
HandlerFactory.createInputHandler(StreamingCommand command)
Create an InputHandler for the given input specification of the StreamingCommand. |
static OutputHandler |
HandlerFactory.createOutputHandler(StreamingCommand command)
Create an OutputHandler for the given output specification of the StreamingCommand. |
Constructors in org.apache.pig.impl.streaming that throw ExecException | |
---|---|
FileInputHandler(StreamingCommand.HandleSpec handleSpec)
|
|
FileOutputHandler(StreamingCommand.HandleSpec handleSpec)
|
Uses of ExecException in org.apache.pig.impl.util.avro |
---|
Methods in org.apache.pig.impl.util.avro that throw ExecException | |
---|---|
Object |
AvroTupleWrapper.get(int pos)
|
static byte |
AvroStorageSchemaConversionUtilities.getPigType(org.apache.avro.Schema s)
Determines the Pig object type of the Avro schema. |
byte |
AvroTupleWrapper.getType(int arg0)
|
boolean |
AvroTupleWrapper.isNull(int arg0)
|
void |
AvroTupleWrapper.set(int arg0,
Object arg1)
|
String |
AvroTupleWrapper.toDelimitedString(String arg0)
|
Uses of ExecException in org.apache.pig.pen |
---|
Methods in org.apache.pig.pen that throw ExecException | |
---|---|
Map<LOLoad,DataBag> |
AugmentBaseDataVisitor.getNewBaseData()
|
Uses of ExecException in org.apache.pig.pen.util |
---|
Methods in org.apache.pig.pen.util that throw ExecException | |
---|---|
Object |
ExampleTuple.get(int fieldNum)
|
byte |
ExampleTuple.getType(int fieldNum)
|
boolean |
ExampleTuple.isNull(int fieldNum)
|
void |
ExampleTuple.set(int fieldNum,
Object val)
|
Uses of ExecException in org.apache.pig.piggybank.evaluation |
---|
Methods in org.apache.pig.piggybank.evaluation that throw ExecException | |
---|---|
protected static Tuple |
ExtremalTupleByNthField.extreme(int pind,
int psign,
Tuple input,
PigProgressable reporter)
|
protected static Tuple |
MaxTupleBy1stField.max(Tuple input,
PigProgressable reporter)
|
protected static int |
ExtremalTupleByNthField.parseFieldIndex(String inputFieldIndex)
|
Constructors in org.apache.pig.piggybank.evaluation that throw ExecException | |
---|---|
ExtremalTupleByNthField.HelperClass()
|
|
ExtremalTupleByNthField.HelperClass(String fieldIndexString)
|
|
ExtremalTupleByNthField.HelperClass(String fieldIndexString,
String order)
|
|
ExtremalTupleByNthField()
Constructors |
|
ExtremalTupleByNthField(String fieldIndexString)
|
|
ExtremalTupleByNthField(String fieldIndexString,
String order)
|
Uses of ExecException in org.apache.pig.piggybank.evaluation.datetime.truncate |
---|
Methods in org.apache.pig.piggybank.evaluation.datetime.truncate that throw ExecException | |
---|---|
static org.joda.time.DateTime |
ISOHelper.parseDateTime(Tuple input)
|
Uses of ExecException in org.apache.pig.scripting |
---|
Methods in org.apache.pig.scripting that throw ExecException | |
---|---|
Map<String,List<PigStats>> |
ScriptEngine.run(PigContext pigContext,
String scriptFile)
Runs a script file. |
Uses of ExecException in org.apache.pig.scripting.groovy |
---|
Methods in org.apache.pig.scripting.groovy that throw ExecException | |
---|---|
static Object |
GroovyUtils.groovyToPig(Object groovyObject)
Converts an object created on the Groovy side to its Pig counterpart. |
static Object |
GroovyUtils.pigToGroovy(Object pigObject)
Converts an object created on the Pig side to its Groovy counterpart. |
Uses of ExecException in org.apache.pig.scripting.jruby |
---|
Methods in org.apache.pig.scripting.jruby that throw ExecException | |
---|---|
void |
RubyDataBag.add(org.jruby.runtime.ThreadContext context,
org.jruby.runtime.builtin.IRubyObject[] args)
The add method accepts a varargs argument; each argument can be an arbitrary object, a DataBag, or a RubyArray. |
|
org.jruby.runtime.builtin.IRubyObject |
RubyDataBag.each(org.jruby.runtime.ThreadContext context,
org.jruby.runtime.Block block)
This is an implementation of the each method which opens up the Enumerable interface, and makes it very convenient to iterate over the elements of a DataBag. |
|
org.jruby.runtime.builtin.IRubyObject |
RubyDataBag.flatten(org.jruby.runtime.ThreadContext context,
org.jruby.runtime.Block block)
This is a convenience method which will run the given block on the first element of each tuple contained. |
|
static <T> org.jruby.RubyHash |
PigJrubyLibrary.pigToRuby(org.jruby.Ruby ruby,
Map<T,?> object)
A type specific conversion routine for Pig Maps. |
|
static org.jruby.runtime.builtin.IRubyObject |
PigJrubyLibrary.pigToRuby(org.jruby.Ruby ruby,
Object object)
This is the method which provides conversion from Pig to Ruby. |
|
static org.jruby.RubyArray |
PigJrubyLibrary.pigToRuby(org.jruby.Ruby ruby,
Tuple object)
A type specific conversion routine. |
|
static Object |
PigJrubyLibrary.rubyToPig(org.jruby.runtime.builtin.IRubyObject rbObject)
This method facilitates conversion from Ruby objects to Pig objects. |
|
static Tuple |
PigJrubyLibrary.rubyToPig(org.jruby.RubyArray rbObject)
A type specific conversion routine. |
|
static Map<String,?> |
PigJrubyLibrary.rubyToPig(org.jruby.RubyHash rbObject)
A type specific conversion routine. |
Uses of ExecException in org.apache.pig.scripting.jython |
---|
Methods in org.apache.pig.scripting.jython that throw ExecException | |
---|---|
static Object |
JythonUtils.pythonToPig(org.python.core.PyObject pyObject)
|
Uses of ExecException in org.apache.pig.tools |
---|
Methods in org.apache.pig.tools that throw ExecException | |
---|---|
List<ExecJob> |
ToolsPigServer.runPlan(LogicalPlan newPlan,
String jobName)
Given a (modified) new logical plan, run the script. |
Constructors in org.apache.pig.tools that throw ExecException | |
---|---|
ToolsPigServer(ExecType execType,
Properties properties)
|
|
ToolsPigServer(PigContext ctx)
|
|
ToolsPigServer(String execTypeString)
|
Uses of ExecException in org.apache.pig.tools.grunt |
---|
Constructors in org.apache.pig.tools.grunt that throw ExecException | |
---|---|
Grunt(BufferedReader in,
PigContext pigContext)
|
|