MOST IMPORTANT ISSUES

  Overhead              Problem
  Section 2. Thread throwing OutOfMemoryError: found
  found                 Thread name: "main"
  Section 3. Where Memory Goes, by Class
  1,106,967Kb (68.0%)   High memory usage by byte[]
  243,821Kb (15.0%)     High memory usage by j.u.HashSet
  Section 4. Where Memory Goes, by GC Root: Leak candidate(s) found
  1,047,625Kb (64.3%)   High memory amount retained by Object tree for GC root(s) Java Static java.beans.ThreadGroupContext.contexts
  571,788Kb (35.1%)     High memory amount retained by Object tree for GC root(s) Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
                        Found leak candidate(s) in Object tree for GC root(s) Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
  Section 9. Bad Collections: overhead 15.0%
  243,793Kb (15.0%)     High overhead of empty j.u.HashSet
  Section 11. Bad Primitive Arrays: overhead 64.6%
  1,047,628Kb (64.3%)   High overhead of empty byte[]
  Section 13. Duplicate Objects: overhead 3.7%
  60,519Kb (3.7%)       High overhead of all duplicate objects




1. Top-Level Stats
Generated by JXRay version 2.6a

Heap dump hs2-hive-jira-20153.hprof created on Wed Jul 11 08:07:13 PDT 2018
JVM version: 1.8.0_144

            Instances            Object arrays      Primitive arrays       Total
  Objects   17,932,614           999,814            601,294                19,533,722
  Bytes     456,336Kb (28.0%)    55,852Kb (3.4%)    1,116,733Kb (68.6%)    1,628,922Kb (100.0%)


            Live                    Garbage           Total
  Objects   19,517,030              16,692            19,533,722
  Bytes     1,628,172Kb (100.0%)    749Kb (< 0.1%)    1,628,922Kb (100.0%)


  Number of classes   Number of threads
  6,870               11


  JVM pointer size   Object header size
  4                  12
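
A 4-byte pointer together with a 12-byte object header indicates a 64-bit HotSpot JVM running with compressed object pointers (the default for heaps below 32GB). These two numbers determine every per-object size quoted later in the report. As a worked check (assuming the standard HotSpot field layout), the shallow size of the CountAgg aggregation buffers counted in Section 3 follows directly from them:

    // Worked check, assuming standard HotSpot layout with the pointer/header sizes above.
    // One GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg (fields visible in this dump:
    // a long 'value' and a reference 'uniqueObjects') occupies
    //   12-byte header + 8-byte long + 4-byte reference = 24 bytes (already 8-byte aligned).
    // Multiplying by the instance count from Section 3 reproduces its reported shallow size.
    public class ShallowSizeCheck {
        public static void main(String[] args) {
            long perInstance = 12 + 8 + 4;             // bytes per CountAgg instance
            long instances   = 2_925_498L;             // CountAgg instances in Section 3
            System.out.println(perInstance * instances / 1024 + "Kb");   // prints 68566Kb
        }
    }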




2. Thread throwing OutOfMemoryError: found
This dump was created after OutOfMemoryError in the following thread:

Thread name: "main", daemon: false
java.lang.OutOfMemoryError.<init>(OutOfMemoryError.java:48)
java.util.regex.Matcher.<init>(Matcher.java:225)
  Local variables java.util.regex.Matcher(parentPattern : java.util.regex.Pattern@c0939488, groups : null, from : 0, to : 0, lookbehindTo : 0, text : "@Kcat@VArts & Entertainment > Hobbies & Creative Arts > Artwork > Posters@V", acceptMode : 0, first : -1, last : 0, oldLast : -1, lastAppendPosition : 0, locals : null, hitEnd : false, requireEnd : false, transparentBounds : false, anchoringBounds : true)

java.util.regex.Pattern(pattern : "@K(.*?)@V(.*?)@V", flags : 0, compiled : true, normalizedPattern : "@K(.*?)@V(.*?)@V", root : java.util.regex.Pattern$Start@c0939518, matchRoot : java.util.regex.Pattern$Slice@c0939530, buffer : null, namedGroups : null, groupNodes : null, temp : null, capturingGroupCount : 3, localCount : 2, cursor : 16, patternLength : 0, hasSupplementary : false)

java.util.regex.Pattern.matcher(Pattern.java:1093)
com.criteo.hadoop.hive.udf.UDFExtraDataToMap.toMap(UDFExtraDataToMap.java:42)
  Local variables com.criteo.hadoop.hive.udf.UDFExtraDataToMap(container : j.u.LinkedHashMap(size: 1))

com.criteo.hadoop.hive.udf.UDFExtraDataToMap.evaluate(UDFExtraDataToMap.java:82)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
  Local variables org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject(eager : false, eval : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f50b30, evaluated : false, version : 4412885, obj : j.u.LinkedHashMap(size: 1), this$0 : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f51398)

org.apache.hadoop.hive.ql.udf.generic.GenericUDFIndex.evaluate(GenericUDFIndex.java:100)
  Local variables "stars"

org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
  Local variables org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject(eager : false, eval : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f51398, evaluated : false, version : 4412885, obj : null, this$0 : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f51340)

org.apache.hadoop.hive.ql.udf.generic.GenericUDFWhen.evaluate(GenericUDFWhen.java:104)
  Local variables org.apache.hadoop.hive.ql.udf.generic.GenericUDFWhen(argumentOIs : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 3), returnOIResolver : org.apache.hadoop.hive.ql.udf.generic.GenericUDFUtils$ReturnObjectInspectorResolver@c4f509e8)

org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorHead._evaluate(ExprNodeEvaluatorHead.java:44)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:68)
org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
  Local variables org.apache.hadoop.hive.ql.exec.SelectOperator(configuration : org.apache.hadoop.mapred.JobConf@80045688, cContext : null, childOperators : j.u.ArrayList(size: 1), parentOperators : j.u.ArrayList(size: 1), operatorId : "SEL_1", abortOp : java.util.concurrent.atomic.AtomicBoolean@c073b3e8, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798, rootInitializeCalled : true, runTimeNumRows : 4412884, indexForTezUnion : -1, hconf : org.apache.hadoop.mapred.JobConf@80045688, asyncInitOperations : j.u.HashSet(size: 0), state : org.apache.hadoop.hive.ql.exec.Operator$State@c0713228, useBucketizedHiveInputFormat : false, conf : org.apache.hadoop.hive.ql.plan.SelectDesc@c073b438, done : false, rowSchema : org.apache.hadoop.hive.ql.exec.RowSchema@c076aa60, statsMap : j.u.HashMap(size: 0), out : null, LOG : org.slf4j.impl.Log4jLoggerAdapter@c05255d0, PLOG : org.slf4j.impl.Log4jLoggerAdapter@c05227a0, isLogInfoEnabled : false, isLogDebugEnabled : false, isLogTraceEnabled : false, alias : null, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, id : "1", inputObjInspectors : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 1), outputObjInspector : org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@c4b385d8, colExprMap : j.u.HashMap(size: 12), jobCloseDone : false, childOperatorsArray : org.apache.hadoop.hive.ql.exec.Operator[](size: 1), childOperatorsTag : int[](size: 1), groupKeyObject : null, eval : org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator[](size: 12), output : Object[](size: 12), isSelectStarNoCompute : false)

Object[2]{org.apache.hadoop.hive.serde2.columnar.ColumnarStruct(lengthNullSequence : 2, ...), Object[](1)@c0aa5078}

org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
  Local variables org.apache.hadoop.hive.ql.exec.TableScanOperator(configuration : org.apache.hadoop.mapred.JobConf@80045688, cContext : null, childOperators : j.u.ArrayList(size: 1), parentOperators : j.u.ArrayList(size: 1), operatorId : "TS_0", abortOp : java.util.concurrent.atomic.AtomicBoolean@c076b2b0, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798, rootInitializeCalled : true, runTimeNumRows : 4412885, indexForTezUnion : -1, hconf : org.apache.hadoop.mapred.JobConf@80045688, asyncInitOperations : j.u.HashSet(size: 0), state : org.apache.hadoop.hive.ql.exec.Operator$State@c0713228, useBucketizedHiveInputFormat : false, conf : org.apache.hadoop.hive.ql.plan.TableScanDesc@c076b300, done : false, rowSchema : org.apache.hadoop.hive.ql.exec.RowSchema@c076b490, statsMap : j.u.HashMap(size: 0), out : null, LOG : org.slf4j.impl.Log4jLoggerAdapter@c05226c8, PLOG : org.slf4j.impl.Log4jLoggerAdapter@c05227a0, isLogInfoEnabled : false, isLogDebugEnabled : false, isLogTraceEnabled : false, alias : null, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, id : "0", inputObjInspectors : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 1), outputObjInspector : org.apache.hadoop.hive.serde2.objectinspector.UnionStructObjectInspector@c06c6a00, colExprMap : null, jobCloseDone : false, childOperatorsArray : org.apache.hadoop.hive.ql.exec.Operator[](size: 1), childOperatorsTag : int[](size: 1), groupKeyObject : null, jc : null, inputFileChanged : true, tableDesc : null, currentStat : null, stats : null, rowLimit : -1, currCount : 0, insideView : false, defaultPartitionName : null, schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", schemaEvolutionColumnsTypes : "string,array<struct<h:int,coltp:int,catname:string,cattype:string>>,int,string,string,int,string,boolean,int,string,boolean,string,string,in ...[length 210]")

Object[2]{org.apache.hadoop.hive.serde2.columnar.ColumnarStruct(lengthNullSequence : 2, ...), Object[](1)@c0aa5078}

org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:547)
  Local variables org.apache.hadoop.hive.ql.exec.MapOperator(configuration : org.apache.hadoop.mapred.JobConf@80045688, cContext : org.apache.hadoop.hive.ql.CompilationOpContext@c07366f8, childOperators : j.u.ArrayList(size: 1), parentOperators : j.u.ArrayList(size: 0), operatorId : "MAP_0", abortOp : java.util.concurrent.atomic.AtomicBoolean@c076c660, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798, rootInitializeCalled : true, runTimeNumRows : 0, indexForTezUnion : -1, hconf : org.apache.hadoop.mapred.JobConf@80045688, asyncInitOperations : j.u.HashSet(size: 0), state : org.apache.hadoop.hive.ql.exec.Operator$State@c0713228, useBucketizedHiveInputFormat : false, conf : org.apache.hadoop.hive.ql.plan.MapWork@c076c6c0, done : false, rowSchema : null, statsMap : j.u.HashMap(size: 2), out : null, LOG : org.slf4j.impl.Log4jLoggerAdapter@c0523508, PLOG : org.slf4j.impl.Log4jLoggerAdapter@c05227a0, isLogInfoEnabled : false, isLogDebugEnabled : false, isLogTraceEnabled : false, alias : null, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, id : "0", inputObjInspectors : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 1), outputObjInspector : null, colExprMap : null, jobCloseDone : false, childOperatorsArray : org.apache.hadoop.hive.ql.exec.Operator[](size: 0), childOperatorsTag : int[](size: 0), groupKeyObject : null, deserialize_error_count : org.apache.hadoop.io.LongWritable@c0740358, recordCounter : org.apache.hadoop.io.LongWritable@c0740340, numRows : 4412884, connectedOperators : j.u.TreeMap(size: 0), normalizedPaths : j.u.HashMap(size: 674), cntr : 1, logEveryNRows : 0, opCtxMap : j.u.HashMap(size: 1,000), childrenOpToOI : j.u.HashMap(size: 1), currentCtxs : org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx[](size: 1))

org.apache.hadoop.hive.serde2.columnar.BytesRefArrayWritable(bytesRefWritables : org.apache.hadoop.hive.serde2.columnar.BytesRefWritable[](size: 23), valid : 23)

org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext(lastInputPath : org.apache.hadoop.fs.Path@c0503200, currentInputPath : org.apache.hadoop.fs.Path@c0503200, inputFileChecked : true, fileId : null, localWork : null, fetchOperators : null, jc : org.apache.hadoop.mapred.JobConf@80045688, ioCxt : org.apache.hadoop.hive.ql.io.IOContext@c06ff8c0, currentBigBucketFile : null)

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx[1]{(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=704}", ...)}

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx(alias : "$hdt$_0:partnerdb_catalogs", op : org.apache.hadoop.hive.ql.exec.TableScanOperator@c0736768, partDesc : org.apache.hadoop.hive.ql.plan.PartitionDesc@c0aa4770, partObjectInspector : org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@c06c6958, vcsObjectInspector : null, rowObjectInspector : org.apache.hadoop.hive.serde2.objectinspector.UnionStructObjectInspector@c06c6a00, partTblObjectInspectorConverter : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters$IdentityConverter@c0aa5050, rowWithPart : Object[](size: 2), rowWithPartAndVC : null, deserializer : org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe@c0aa50a0, tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=704}", vcs : null, vcValues : null)

Object[2]{org.apache.hadoop.hive.serde2.columnar.ColumnarStruct(lengthNullSequence : 2, ...), Object[](1)@c0aa5078}

org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
  Local variables org.apache.hadoop.hive.ql.exec.mr.ExecMapper(mo : org.apache.hadoop.hive.ql.exec.MapOperator@c0736640, oc : org.apache.hadoop.mapred.MapTask$OldOutputCollector@c5008b80, jc : org.apache.hadoop.mapred.JobConf@80045688, abort : false, rp : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, localWork : null, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798)

org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
  Local variables org.apache.hadoop.mapred.MapRunner(mapper : org.apache.hadoop.hive.ql.exec.mr.ExecMapper@c05d69f8, incrProcCount : false)

org.apache.hadoop.mapred.MapTask$TrackedRecordReader(rawIn : org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader@c05d6c68, fileInputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057eac8, inputRecordCounter : org.apache.hadoop.mapred.Counters$Counter@c057ec28, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, bytesInPrev : 0, bytesInCurr : 0, fsStats : null, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.mapred.MapTask$OldOutputCollector(partitioner : org.apache.hadoop.hive.ql.io.DefaultHivePartitioner@c5008b98, collector : org.apache.hadoop.mapred.MapTask$MapOutputBuffer@c0612f40, numPartitions : 1009)

org.apache.hadoop.mapred.Task$TaskReporter(umbilical : com.sun.proxy.$Proxy9@80183238, split : org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit@c0502ce0, taskProgress : org.apache.hadoop.util.Progress@8015bd88, pingThread : j.l.Thread@c0503a50, done : false, lock : Object@c0503e00, progressFlag : java.util.concurrent.atomic.AtomicBoolean@c0503e10, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.hive.shims.CombineHiveKey(key : org.apache.hadoop.io.LongWritable@db2fc0b8)

org.apache.hadoop.hive.serde2.columnar.BytesRefArrayWritable(bytesRefWritables : org.apache.hadoop.hive.serde2.columnar.BytesRefWritable[](size: 23), valid : 23)

org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
  Local variables org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : org.apache.hadoop.util.Progress@c05175b0)

org.apache.hadoop.mapred.JobConf(quietmode : true, allowNullValueProperties : false, resources : j.u.ArrayList(size: 1), finalParameters : j.u.Collections$SetFromMap@800457f0, loadDefaults : true, updatingResource : java.util.concurrent.ConcurrentHashMap(size: 1,913), properties : j.u.Properties(size: 1,915), overlay : j.u.Properties(size: 28), classLoader : sun.misc.Launcher$AppClassLoader@8001d198, credentials : org.apache.hadoop.security.Credentials@8001fbc8)

org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex(splitLocation : "viewfs://root/tmp/hadoop-yarn/sz.ho/.staging/job_1531301354100_57311/job.split", startOffset : 909032)

com.sun.proxy.$Proxy9(h : org.apache.hadoop.ipc.WritableRpcEngine$Invoker@80183248)

org.apache.hadoop.mapred.Task$TaskReporter(umbilical : com.sun.proxy.$Proxy9@80183238, split : org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit@c0502ce0, taskProgress : org.apache.hadoop.util.Progress@8015bd88, pingThread : j.l.Thread@c0503a50, done : false, lock : Object@c0503e00, progressFlag : java.util.concurrent.atomic.AtomicBoolean@c0503e10, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit(paths : null, startoffset : null, lengths : null, locations : null, totLength : 0, job : null, shrinkedLength : 0, _isShrinked : false, inputFormatClassName : "org.apache.hadoop.hive.ql.io.RCFileInputFormat", inputSplitShim : org.apache.hadoop.hive.shims.HadoopShimsSecure$InputSplitShim@c0502da8, pathToPartitionInfo : null)

org.apache.hadoop.mapred.MapTask$TrackedRecordReader(rawIn : org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader@c05d6c68, fileInputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057eac8, inputRecordCounter : org.apache.hadoop.mapred.Counters$Counter@c057ec28, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, bytesInPrev : 0, bytesInCurr : 0, fsStats : null, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.mapred.MapTask$MapOutputBuffer(partitions : 1009, job : org.apache.hadoop.mapred.JobConf@80045688, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, keyClass : j.l.Class@c0613010, valClass : j.l.Class@801a87e0, comparator : org.apache.hadoop.hive.ql.io.HiveKey$Comparator@c0613080, serializationFactory : org.apache.hadoop.io.serializer.SerializationFactory@c06130a0, keySerializer : org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer@c0613140, valSerializer : org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer@c06131c0, combinerRunner : null, combineCollector : null, codec : org.apache.hadoop.io.compress.SnappyCodec@c06131d8, kvmeta : java.nio.ByteBufferAsIntBufferL@c06131e8, kvstart : 268173308, kvend : 268173308, kvindex : 268173308, equator : 0, bufstart : 0, bufend : 0, bufmark : 0, bufindex : 0, bufvoid : 1072693248, kvbuffer : byte[](size: 1,072,693,248), b0 : byte[](size: 0), maxRec : 67043328, softLimit : 858154624, spillInProgress : false, bufferRemaining : 858154624, sortSpillException : null, numSpills : 0, minSpillsForCombine : 3, sorter : org.apache.hadoop.util.QuickSort@c0613260, spillLock : java.util.concurrent.locks.ReentrantLock@c0613270, spillDone : java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@c0613280, spillReady : java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@c0612da0, bb : org.apache.hadoop.mapred.MapTask$MapOutputBuffer$BlockingBuffer@c0613158, spillThreadRunning : true, spillThread : org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread@c0612c20, rfs : org.apache.hadoop.fs.RawLocalFileSystem@c0613298, mapOutputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057ebc8, mapOutputRecordCounter : org.apache.hadoop.mapred.Counters$Counter@c057ebf8, fileOutputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057eb98, indexCacheList : j.u.ArrayList(size: 0), totalIndexCacheMemory : 0, indexCacheMemoryLimit : 1048576, mapTask : org.apache.hadoop.mapred.MapTask@801532d8, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, sortPhase : org.apache.hadoop.util.Progress@c05175b0, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, META_BUFFER_TMP : byte[](size: 16))

org.apache.hadoop.mapred.MapRunner(mapper : org.apache.hadoop.hive.ql.exec.mr.ExecMapper@c05d69f8, incrProcCount : false)

org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
  Local variables org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : org.apache.hadoop.util.Progress@c05175b0)

org.apache.hadoop.mapred.JobConf(quietmode : true, allowNullValueProperties : false, resources : j.u.ArrayList(size: 1), finalParameters : j.u.Collections$SetFromMap@800457f0, loadDefaults : true, updatingResource : java.util.concurrent.ConcurrentHashMap(size: 1,913), properties : j.u.Properties(size: 1,915), overlay : j.u.Properties(size: 28), classLoader : sun.misc.Launcher$AppClassLoader@8001d198, credentials : org.apache.hadoop.security.Credentials@8001fbc8)

com.sun.proxy.$Proxy9(h : org.apache.hadoop.ipc.WritableRpcEngine$Invoker@80183248)

org.apache.hadoop.mapred.Task$TaskReporter(umbilical : com.sun.proxy.$Proxy9@80183238, split : org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit@c0502ce0, taskProgress : org.apache.hadoop.util.Progress@8015bd88, pingThread : j.l.Thread@c0503a50, done : false, lock : Object@c0503e00, progressFlag : java.util.concurrent.atomic.AtomicBoolean@c0503e10, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
  Local variables org.apache.hadoop.mapred.YarnChild$2(val$taskFinal : org.apache.hadoop.mapred.MapTask@801532d8, val$job : org.apache.hadoop.mapred.JobConf@80045688, val$umbilical : com.sun.proxy.$Proxy9@80183238)

java.security.AccessController.doPrivileged(Native method)
javax.security.auth.Subject.doAs(Subject.java:422)
  Local variables javax.security.auth.Subject(principals : j.u.Collections$SynchronizedSet@80171fd8, pubCredentials : j.u.Collections$SynchronizedSet@801720a0, privCredentials : j.u.Collections$SynchronizedSet@801720f0, readOnly : false)

org.apache.hadoop.mapred.YarnChild$2(val$taskFinal : org.apache.hadoop.mapred.MapTask@801532d8, val$job : org.apache.hadoop.mapred.JobConf@80045688, val$umbilical : com.sun.proxy.$Proxy9@80183238)

java.security.AccessControlContext(context : java.security.ProtectionDomain[](size: 2), isPrivileged : false, isAuthorized : true, privilegedContext : null, combiner : null, permissions : null, parent : null, isWrapped : false, isLimited : false, limitedContext : null)

org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
  Local variables org.apache.hadoop.security.UserGroupInformation(subject : javax.security.auth.Subject@80171fb8, user : org.apache.hadoop.security.User@80172040, isKeytab : false, isKrbTkt : false, isLoginExternal : false)

org.apache.hadoop.mapred.YarnChild$2(val$taskFinal : org.apache.hadoop.mapred.MapTask@801532d8, val$job : org.apache.hadoop.mapred.JobConf@80045688, val$umbilical : com.sun.proxy.$Proxy9@80183238)

org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
  Local variables String[4]{"10.224.22.18", "46104", "attempt_1531301354100_57311_m_001303_1", "285873023226794"}

org.apache.hadoop.mapred.JobConf(quietmode : true, allowNullValueProperties : false, resources : j.u.ArrayList(size: 1), finalParameters : j.u.Collections$SetFromMap@800457f0, loadDefaults : true, updatingResource : java.util.concurrent.ConcurrentHashMap(size: 1,913), properties : j.u.Properties(size: 1,915), overlay : j.u.Properties(size: 28), classLoader : sun.misc.Launcher$AppClassLoader@8001d198, credentials : org.apache.hadoop.security.Credentials@8001fbc8)

"10.224.22.18"

java.net.InetSocketAddress(holder : java.net.InetSocketAddress$InetSocketAddressHolder@800469b0)

org.apache.hadoop.mapred.TaskAttemptID(id : 1, taskId : org.apache.hadoop.mapred.TaskID@8004ee28)

org.apache.hadoop.mapred.JVMId(isMap : true, jobId : org.apache.hadoop.mapred.JobID@8004edd8, jvmId : 285873023226794)

org.apache.hadoop.security.Credentials(secretKeysMap : j.u.HashMap(size: 1), tokenMap : j.u.HashMap(size: 5))

org.apache.hadoop.security.UserGroupInformation(subject : javax.security.auth.Subject@80046a18, user : org.apache.hadoop.security.User@80046aa0, isKeytab : false, isKrbTkt : false, isLoginExternal : false)

org.apache.hadoop.security.token.Token(identifier : byte[](size: 24), password : byte[](size: 20), kind : org.apache.hadoop.io.Text@80046e68, service : org.apache.hadoop.io.Text@80046db8, renewer : null)

com.sun.proxy.$Proxy9(h : org.apache.hadoop.ipc.WritableRpcEngine$Invoker@80183248)

org.apache.hadoop.mapred.JvmContext(jvmId : org.apache.hadoop.mapred.JVMId@8004ed70, pid : "-1000")

org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : org.apache.hadoop.util.Progress@c05175b0)

org.apache.hadoop.security.UserGroupInformation(subject : javax.security.auth.Subject@80171fb8, user : org.apache.hadoop.security.User@80172040, isKeytab : false, isKrbTkt : false, isLoginExternal : false)

java.util.concurrent.Executors$DelegatedScheduledExecutorService(e : java.util.concurrent.ScheduledThreadPoolExecutor@8001f838, e : java.util.concurrent.ScheduledThreadPoolExecutor@8001f838)

org.apache.hadoop.mapred.JvmTask(t : org.apache.hadoop.mapred.MapTask@801532d8, shouldDie : false)

org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : org.apache.hadoop.util.Progress@c05175b0)
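
The topmost application frames show com.criteo.hadoop.hive.udf.UDFExtraDataToMap calling Pattern.matcher() on the row value; the pattern and input text are captured in the local variables above. Below is a minimal, self-contained reconstruction of that step (values copied from the dump; class and variable names are illustrative, not the UDF's actual source). The OutOfMemoryError surfacing in Matcher.<init> is most likely incidental: the Matcher's small internal arrays were simply the first allocation attempted after the heap had already been exhausted elsewhere (see Sections 3 and 4).

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Illustrative reconstruction of the failing step; the pattern and input text are
    // copied from the frame's local variables. Not the actual UDF source code.
    public class ExtraDataMatchSketch {
        // Same pattern as the Matcher's parentPattern captured in the dump.
        private static final Pattern KV = Pattern.compile("@K(.*?)@V(.*?)@V");

        public static void main(String[] args) {
            String text = "@Kcat@VArts & Entertainment > Hobbies & Creative Arts > Artwork > Posters@V";
            Matcher m = KV.matcher(text);   // the frame where the OutOfMemoryError was thrown
            while (m.find()) {
                // group(1) is the key, group(2) the value: "cat" -> "Arts & Entertainment > ..."
                System.out.println(m.group(1) + " -> " + m.group(2));
            }
        }
    }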




3. Where Memory Goes, by Class

  # instances   Shallow size          Impl-incl. size       Direct retained       Class name
  492,169       1,106,967Kb (68.0%)   1,106,967Kb (68.0%)   1,106,967Kb (68.0%)   byte[]
Reference chains
Expensive data fields

1,047,552Kb (64.3%): byte[]: 1 object

 Random sample 
byte[1072693248]{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...}

 ↖org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer
48,016Kb (2.9%): byte[]: 488,662 objects

 Random sample 
byte[46]{'M', 'A', 'I', 'S', 'O', 'N', ' ', 'M', 'A', 'R', 'G', 'I', 'E', 'L', 'A', ' ', 'c', 'o', 'n', 't', 'r', 'a', 's', 't', ' ', 's', 'l', 'e', 'e', 'v', ...}
byte[5]{'s', 'z', '.', 'h', 'o'}
byte[54]{'O', 'c', 'c', 'i', 'd', 'e', 'n', 't', 'a', 'l', ' ', 'L', 'e', 'a', 't', 'h', 'e', 'r', ' ', '5', '0', '1', '9', ' ', 'P', 'r', 'o', ' ', 'L', 'e', ...}
byte[15]{'o', 'f', 'f', 'i', 'c', 'i', 'a', 'l', 'r', 'a', 't', 'i', 'n', 'g', 0}
byte[64]{'G', 'i', 'c', 'l', 'e', 'e', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'B', 'l', 'a', 'c', 'k', ' ', 'D', 'o', 'g', '\'', 's', ' ', 'T', 'r', 'e', 'a', ...}
byte[12]{'h', 'a', '-', 'h', 'd', 'f', 's', ':', 'r', 'o', 'o', 't'}
byte[2]{'\', 'N'}
byte[2]{'\', 'N'}
byte[2]{'\', 'N'}
byte[73]{'S', 'k', 'e', 't', 's', 'a', ' ', 'B', 'e', 'a', 'r', ' ', 'A', 'r', 't', ' ', 'P', 'o', 'l', 'a', ' ', 'P', 'h', 'o', 'n', 'e', ' ', 'C', 'a', 's', ...}
byte[2]{'\', 'N'}
byte[55]{'A', 'r', 't', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'H', 'o', 'c', 'k', 'e', 'y', ' ', 'S', 'h', 'o', 'e', ' ', 'P', 'a', 't', 'e', 'n', 't', ' ', ...}
byte[41]{'J', 'u', 'b', 'i', 'l', 'e', 'e', ' ', 'C', 'r', 'o', 'w', 'n', ' ', 'C', 'h', 'a', 'n', 'd', 'e', 'l', 'i', 'e', 'r', ' ', '-', ' ', '3', ' ', 'L', ...}
byte[27]{'O', 'u', 'r', ' ', 'L', 'e', 'g', 'a', 'c', 'y', ' ', 'p', 'u', 'l', 'l', 'o', 'v', 'e', 'r', ' ', 's', 'w', 'e', 'a', 't', 'e', 'r'}
byte[2]{'\', 'N'}
byte[2]{'\', 'N'}
byte[62]{'D', 'o', 'm', 'p', 'e', 't', ' ', 'F', 'l', 'i', 'p', ' ', 'L', 'e', 'a', 't', 'h', 'e', 'r', ' ', 'C', 'a', 's', 'e', ' ', 'u', 'n', 't', 'u', 'k', ...}
byte[2]{'\', 'N'}
byte[64]{'A', 'r', 't', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'L', 'o', 's', ' ', 'A', 'n', 'g', 'e', 'l', 'e', 's', ' ', 'W', 'a', 't', 'e', 'r', 'c', 'o', ...}
byte[2]{'\', 'N'}

 ↖org.apache.hadoop.io.Text.bytes
8,719Kb (0.5%): byte[]: 11 objects

 Random sample 
byte[74736]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}
byte[115190]{'2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', ...}
byte[88284]{'1', '0', '2', '0', '0', '0', '\', 'N', '\', 'N', '1', '6', '1', '0', '0', '0', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '2', '8', '5', '0', '0', '0', ...}
byte[292730]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}
byte[527936]{'C', 'h', 'e', 'v', 'r', 'o', 'n', ' ', 'A', 'n', 'c', 'h', 'o', 'r', ' ', 'B', 'o', 'a', 't', ' ', 'P', 'o', 'l', 'a', ' ', 'P', 'h', 'o', 'n', 'e', ...}
byte[2753314]{'@', 'K', 's', 'a', 'm', 's', 'u', 'n', 'g', '_', 'i', 'd', 's', '@', 'V', '0', '@', 'V', '@', 'K', 's', 'e', 'l', 'l', 'e', 'r', '_', 'i', 'd', '@', ...}
byte[1343412]{'h', 't', 't', 'p', 's', ':', '/', '/', 'i', 'd', '-', 'l', 'i', 'v', 'e', '-', '0', '1', '.', 's', 'l', 'a', 't', 'i', 'c', '.', 'n', 'e', 't', '/', ...}

 ↖org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data


Full reference chains

1,047,552Kb (64.3%): byte[]: 1 object

 Random sample 
byte[1072693248]{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...}

org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer
org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.this$0
j.l.Thread[]
j.l.ThreadGroup.threads
java.beans.WeakIdentityMap$Entry.referent
java.beans.WeakIdentityMap$Entry[]
java.beans.ThreadGroupContext$1.table
↖Java Static java.beans.ThreadGroupContext.contexts
47,987Kb (2.9%): byte[]: 487,567 objects

 Random sample 
byte[46]{'M', 'A', 'I', 'S', 'O', 'N', ' ', 'M', 'A', 'R', 'G', 'I', 'E', 'L', 'A', ' ', 'c', 'o', 'n', 't', 'r', 'a', 's', 't', ' ', 's', 'l', 'e', 'e', 'v', ...}
byte[110]{'C', 'a', 'h', 'a', 'y', 'a', ' ', 'T', 'e', 'r', 'a', 'n', 'g', ' ', 'U', 'P', ' ', 'F', 'l', 'a', 's', 'h', ' ', 'L', 'E', 'D', ' ', 'S', 'e', 'l', ...}
byte[54]{'O', 'c', 'c', 'i', 'd', 'e', 'n', 't', 'a', 'l', ' ', 'L', 'e', 'a', 't', 'h', 'e', 'r', ' ', '5', '0', '1', '9', ' ', 'P', 'r', 'o', ' ', 'L', 'e', ...}
byte[58]{'1', '4', 'k', ' ', 'G', 'o', 'l', 'd', ' ', 'o', 'v', 'e', 'r', ' ', 'S', 't', 'e', 'r', 'l', 'i', 'n', 'g', ' ', 'S', 'i', 'l', 'v', 'e', 'r', ' ', ...}
byte[64]{'G', 'i', 'c', 'l', 'e', 'e', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'B', 'l', 'a', 'c', 'k', ' ', 'D', 'o', 'g', '\'', 's', ' ', 'T', 'r', 'e', 'a', ...}
byte[52]{'A', 'r', 't', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'F', 'l', 'o', 'r', 'a', 'l', ' ', 'E', 'c', 'l', 'i', 'p', 's', 'e', ' ', 'I', ' ', 'b', 'y', ...}
byte[56]{'2', '0', '1', '2', ' ', 'M', 'a', 'z', 'd', 'a', ' ', '2', ' ', '1', '.', '5', ' ', '(', -32, -72, -101, -32, -72, -75, ' ', '0', '9', '-', '1', '4', ...}
byte[99]{'B', 'Y', 'T', ' ', 'U', 'l', 't', 'r', 'a', ' ', 'T', 'i', 'p', 'i', 's', ' ', 'W', 'i', 'n', 'd', 'o', 'w', ' ', 'F', 'l', 'i', 'p', ' ', 'C', 'o', ...}
byte[58]{'R', '-', 'J', 'u', 's', 't', ' ', 'M', 'e', 't', 'a', 'l', ' ', 'A', 'l', 'u', 'm', 'i', 'n', 'u', 'm', ' ', 'P', 'h', 'o', 'n', 'e', ' ', 'C', 'a', ...}
byte[73]{'S', 'k', 'e', 't', 's', 'a', ' ', 'B', 'e', 'a', 'r', ' ', 'A', 'r', 't', ' ', 'P', 'o', 'l', 'a', ' ', 'P', 'h', 'o', 'n', 'e', ' ', 'C', 'a', 's', ...}
byte[60]{'S', 't', 'r', 'e', 't', 'c', 'h', 'e', 'd', ' ', 'C', 'a', 'n', 'v', 'a', 's', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'L', 'i', 'g', 'h', 't', 'h', ...}
byte[55]{'A', 'r', 't', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'H', 'o', 'c', 'k', 'e', 'y', ' ', 'S', 'h', 'o', 'e', ' ', 'P', 'a', 't', 'e', 'n', 't', ' ', ...}
byte[41]{'J', 'u', 'b', 'i', 'l', 'e', 'e', ' ', 'C', 'r', 'o', 'w', 'n', ' ', 'C', 'h', 'a', 'n', 'd', 'e', 'l', 'i', 'e', 'r', ' ', '-', ' ', '3', ' ', 'L', ...}
byte[27]{'O', 'u', 'r', ' ', 'L', 'e', 'g', 'a', 'c', 'y', ' ', 'p', 'u', 'l', 'l', 'o', 'v', 'e', 'r', ' ', 's', 'w', 'e', 'a', 't', 'e', 'r'}
byte[99]{'9', ' ', 'H', ' ', 'T', 'e', 'm', 'p', 'e', 'r', 'e', 'd', ' ', 'S', 't', 'e', 'e', 'l', ' ', 'G', 'l', 'a', 's', 's', ' ', 'U', 'l', 't', 'r', 'a', ...}
byte[80]{'O', 'c', 'e', 'a', 'n', ' ', 'S', 'p', 'r', 'i', 'n', 'g', 'b', 'e', 'd', ' ', 'M', 'a', 'g', 'i', 'c', ' ', 'W', 'o', 'n', 'd', 'e', 'r', ' ', 'S', ...}
byte[62]{'D', 'o', 'm', 'p', 'e', 't', ' ', 'F', 'l', 'i', 'p', ' ', 'L', 'e', 'a', 't', 'h', 'e', 'r', ' ', 'C', 'a', 's', 'e', ' ', 'u', 'n', 't', 'u', 'k', ...}
byte[63]{'S', 't', 'r', 'e', 't', 'c', 'h', 'e', 'd', ' ', 'C', 'a', 'n', 'v', 'a', 's', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'A', 'v', 'o', 'c', 'a', 'd', ...}
byte[64]{'A', 'r', 't', ' ', 'P', 'r', 'i', 'n', 't', ':', ' ', 'L', 'o', 's', ' ', 'A', 'n', 'g', 'e', 'l', 'e', 's', ' ', 'W', 'a', 't', 'e', 'r', 'c', 'o', ...}
byte[34]{'M', -31, -70, -73, 't', ' ', 'n', -31, -70, -95, ' ', 'L', 'a', 'n', 'e', 'i', 'g', 'e', ' ', 'c', 'l', 'e', 'a', 'r', ' ', 'C', ' ', 'm', 'i', -31, ...}

org.apache.hadoop.io.Text.bytes
Object[]
org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys
{j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
4,942Kb (0.3%): byte[]: 7 objects

 Random sample 
byte[74736]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}
byte[186840]{'1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', ...}
byte[88284]{'1', '0', '2', '0', '0', '0', '\', 'N', '\', 'N', '1', '6', '1', '0', '0', '0', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '2', '8', '5', '0', '0', '0', ...}
byte[86628]{'9', '7', '0', '0', '0', '\', 'N', '\', 'N', '1', '5', '4', '0', '0', '0', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '2', '7', '1', '0', '0', '0', '\', ...}
byte[527936]{'C', 'h', 'e', 'v', 'r', 'o', 'n', ' ', 'A', 'n', 'c', 'h', 'o', 'r', ' ', 'B', 'o', 'a', 't', ' ', 'P', 'o', 'l', 'a', ' ', 'P', 'h', 'o', 'n', 'e', ...}
byte[2753314]{'@', 'K', 's', 'a', 'm', 's', 'u', 'n', 'g', '_', 'i', 'd', 's', '@', 'V', '0', '@', 'V', '@', 'K', 's', 'e', 'l', 'l', 'e', 'r', '_', 'i', 'd', '@', ...}
byte[1343412]{'h', 't', 't', 'p', 's', ':', '/', '/', 'i', 'd', '-', 'l', 'i', 'v', 'e', '-', '0', '1', '.', 's', 'l', 'a', 't', 'i', 'c', '.', 'n', 'e', 't', '/', ...}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
Object[]
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
3,776Kb (0.2%): byte[]: 4 objects

 Random sample 
byte[2344694]{'h', 't', 't', 'p', ':', '/', '/', 'i', '1', '.', 's', 't', 'a', 't', 'i', 'c', '-', 's', 'h', 'o', 'p', 'c', 'a', 'd', 'e', '.', 'c', 'o', 'm', '/', ...}
byte[115190]{'2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', ...}
byte[1114812]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}
byte[292730]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
Object[]
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.rowObject
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator[]
org.apache.hadoop.hive.ql.exec.SelectOperator.eval
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
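
Almost the entire byte[] footprint is the single 1,047,552Kb array in the first chain above: the map-side sort buffer MapTask$MapOutputBuffer.kvbuffer, still reachable through the SpillThread and, via java.beans.ThreadGroupContext.contexts, the dominant GC root from Section 4. Its size, 1,072,693,248 bytes, is exactly 1023 * 2^20, i.e. it was sized from a mapreduce.task.io.sort.mb setting of about 1023MB, close to two thirds of the 1,628,922Kb heap reported in Section 1. A small sketch of that arithmetic (the property name is the standard Hadoop 2.x one; the exact sizing logic inside MapOutputBuffer may vary slightly by version):

    // Hedged sketch of the sizing arithmetic. In Hadoop 2.x the map-side sort buffer
    // (MapTask$MapOutputBuffer.kvbuffer) is allocated as roughly mapreduce.task.io.sort.mb
    // megabytes; the value below is inferred from this dump, not read from the job config.
    public class SortBufferSizeSketch {
        public static void main(String[] args) {
            int ioSortMb = 1023;                          // inferred: 1,072,693,248 / 2^20
            long kvbufferBytes = (long) ioSortMb << 20;   // size of the byte[] in the chain above
            System.out.println(kvbufferBytes);            // prints 1072693248
        }
    }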


  3,900,708     60,948Kb (3.7%)       243,821Kb (15.0%)     243,839Kb (15.0%)     j.u.HashSet
Reference chains
Expensive data fields

182,843Kb (11.2%): j.u.HashSet: 2,925,498 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects
60,947Kb (3.7%): j.u.HashSet: 975,166 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects


Full reference chains

182,843Kb (11.2%): j.u.HashSet: 2,925,492 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
60,947Kb (3.7%): j.u.HashSet: 975,164 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
10Kb (< 0.1%): j.u.HashSet: 1 object

 Random sample 
j.u.HashSet<j.l.Class>(size: 275, capacity: 512) {j.l.Class@c4c99470, ...}

j.u.Collections$SynchronizedSet.c
org.apache.hadoop.hive.ql.exec.Registry.builtIns
↖Java Static org.apache.hadoop.hive.ql.exec.FunctionRegistry.system
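
Virtually all of these 3.9 million HashSets are empty: they are the uniqueObjects field of the per-group CountAgg and SumLongAgg aggregation buffers held in GroupByOperator.hashAggregations, retained in turn via the static Utilities.gWorkMap flagged as a leak candidate in Section 4. Even an empty HashSet costs roughly 64 bytes (the set plus its backing HashMap, at the 4-byte reference size from Section 1), so 3,900,708 of them account for the ~243,821Kb (15.0%) charged to this class. The generic way to avoid paying for sets that are never populated is to allocate them lazily; a minimal sketch of that pattern follows (illustrative only, not Hive's GenericUDAFCount code):

    import java.util.HashSet;
    import java.util.Set;

    // Minimal sketch of lazy allocation for a count-style aggregation buffer.
    // The set is created only when DISTINCT semantics actually require it, so the
    // common non-DISTINCT path never pays the ~64 bytes per empty HashSet.
    // Field and method names are illustrative, not Hive's actual code.
    public class LazyCountBuffer {
        private long value;                  // running count, always needed
        private Set<Object> uniqueObjects;   // stays null for the non-DISTINCT case

        void add(Object row, boolean distinct) {
            if (distinct) {
                if (uniqueObjects == null) {
                    uniqueObjects = new HashSet<>();
                }
                if (!uniqueObjects.add(row)) {
                    return;                  // duplicate row, already counted
                }
            }
            value++;
        }

        long result() {
            return value;
        }
    }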


  2,925,498     68,566Kb (4.2%)       68,566Kb (4.2%)       251,409Kb (15.4%)     org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg
Reference chains
Expensive data fields

68,566Kb (4.2%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg: 2,925,492 objects (37% of all objects referenced here)

 Random sample 
(uniqueObjects : j.u.HashSet(size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 4)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 3)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

68,566Kb (4.2%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg: 2,925,492 objects (37% of all objects referenced here)

 Random sample 
(uniqueObjects : j.u.HashSet(size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 4)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 3)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
144b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg: 6 objects (37% of all objects referenced here)

 Random sample 
(uniqueObjects : j.u.HashSet(size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


  487,583       38,092Kb (2.3%)       38,092Kb (2.3%)       190,462Kb (11.7%)     org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
Reference chains
Expensive data fields

38,092Kb (2.3%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]: 487,582 objects

 Random sample 
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cf685df8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c35408b8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c5b4a8b0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7c77758, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c9da5058, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cbf1e5c0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ce05fc28, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@d019c970, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c404fe58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c6658518, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c878e338, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ca8b0ca0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cca34190, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ceb734d8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c2a317e8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c505b138, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7167a58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c929a340, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cb3c3788, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cd548fa0, ...}

{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

38,092Kb (2.3%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]: 487,582 objects

 Random sample 
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cf685df8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c35408b8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c5b4a8b0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7c77758, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c9da5058, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cbf1e5c0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ce05fc28, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@d019c970, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c404fe58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c6658518, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c878e338, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ca8b0ca0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cca34190, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ceb734d8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c2a317e8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c505b138, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7167a58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c929a340, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cb3c3788, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cd548fa0, ...}

{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
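
The full chain above is the key piece of information: every AggregationBuffer[] is reachable from GroupByOperator.hashAggregations, which hangs off a MapWork plan object stored in the static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap, so none of it can be garbage-collected while that entry remains in the map. A minimal sketch of that retention shape follows; the classes are deliberately simplified stand-ins for the Hive classes named in the chain, not the real implementations.

// Simplified stand-ins for the classes named in the chain above; only the references
// that matter for retention are kept (an assumption, not Hive source).
import java.util.*;

class GroupByOpSketch {                          // ~ GroupByOperator
    // one AggregationBuffer[] per distinct group key -- 487,582 entries in this dump
    final Map<Object, Object[]> hashAggregations = new HashMap<>();
}

class MapWorkSketch {                            // ~ MapWork
    // aliasToWork -> TableScanOperator -> SelectOperator -> GroupByOperator
    final Map<String, Object> aliasToWork = new LinkedHashMap<>();
}

class UtilitiesSketch {                          // ~ Utilities / GlobalWorkMapFactory
    // a JVM-wide static root: while a MapWork entry stays here, everything the
    // operator tree references (including hashAggregations) remains reachable
    static final Map<String, MapWorkSketch> gWorkMap = new HashMap<>();
}

Under that reading, the 38,092Kb of buffer arrays only becomes collectible once the corresponding MapWork entry is removed from gWorkMap (or the map is cleared after the task completes); the sketch is meant only to make the chain readable, not to describe Hive internals.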
80b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]: 1 object

 Random sample 
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : true, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : true, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c501bcf8, ...}

org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


 1,950,332   30,473Kb (1.9%)   30,473Kb (1.9%)   32,585Kb (2.0%)   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
Reference chains
Expensive data fields

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
64b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 4 objects (25% of all objects referenced here)

 Random sample 
(o : null)
(o : null)
(o : null)
(o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
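
The MaxAgg counts above are internally consistent: the "(25% of all objects referenced here)" note implies 4 of the 16 slots in each AggregationBuffer[16] hold a MaxAgg, so the 487,582 buffer arrays in hashAggregations account for almost all of the 1,950,332 instances, with the remaining 4 coming from the GroupByOperator.aggregations chain. A quick check of that arithmetic, assuming exactly four MaxAgg slots per buffer:

// Back-of-the-envelope check; "4 slots per buffer" is inferred from the 25% note above.
class MaxAggCountCheck {
    public static void main(String[] args) {
        long bufferArrays   = 487_582;   // entries in GroupByOperator.hashAggregations
        long maxAggPerArray = 4;         // 25% of the 16 evaluator slots
        System.out.println(bufferArrays * maxAggPerArray);  // prints 1950328; plus the 4 reachable
                                                            // via aggregations = 1,950,332 total
    }
}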


 1,950,332   30,473Kb (1.9%)   30,473Kb (1.9%)   32,585Kb (2.0%)   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg
Reference chains
Expensive data fields

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)
(o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
64b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 4 objects (25% of all objects referenced here)

 Random sample 
(o : null)
(o : null)
(o : null)
(o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


 975,166   22,855Kb (1.4%)   22,855Kb (1.4%)   83,805Kb (5.1%)   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg
Reference chains
Expensive data fields

22,855Kb (1.4%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg: 975,164 objects (12% of all objects referenced here)

 Random sample 
(empty : false, sum : j.l.Long(0), uniqueObjects : j.u.HashSet(size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

22,855Kb (1.4%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg: 975,164 objects (12% of all objects referenced here)

 Random sample 
(empty : false, sum : j.l.Long(0), uniqueObjects : j.u.HashSet(size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
48b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg: 2 objects (12% of all objects referenced here)

 Random sample 
(empty : true, sum : j.l.Long(0), uniqueObjects : j.u.HashSet(size: 0))
(empty : true, sum : (0), uniqueObjects : (size: 0))

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
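
Two details stand out in the SumLongAgg samples above: 975,164 instances is again two evaluator slots per buffer array (12% of 16, plus the 2 instances reachable via GroupByOperator.aggregations gives the 975,166 in the class row), and every instance carries a uniqueObjects HashSet of size 0, exactly the kind of always-empty collection Section 9 flags. As a hedged illustration only (a generic pattern, not the actual Hive SumLongAgg code), the usual remedy for that shape is to allocate the set lazily:

// Generic lazy-allocation sketch; field names mirror the sample above, but the class
// is an illustration under that assumption, not Hive source.
import java.util.*;

class SumAggSketch {
    boolean empty = true;
    long sum;
    private Set<Object> uniqueObjects;            // stays null instead of an empty HashSet

    void add(long value, Object distinctKey) {
        sum += value;
        empty = false;
        if (distinctKey != null) {
            if (uniqueObjects == null) {
                uniqueObjects = new HashSet<>();  // pay for the set only when it is used
            }
            uniqueObjects.add(distinctKey);
        }
    }
}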


 3,901,164   182,867Kb (11.2%)   19,599Kb (1.2%)   73,708Kb (4.5%)   j.u.HashMap
Reference chains
Expensive data fields

19,333Kb (1.2%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper, org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]>(size: 487582, capacity: 1048576) {(key:(hashcode : 2047900177, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d5309e40), (key:(hashcode : -1907323306, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4b2d818), (key:(hashcode : 1322274519, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c44c19c8), (key:(hashcode : 1036008904, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d9c4e5b0), (key:(hashcode : -1604280216, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@dd28a590), (key:(hashcode : 1890611389, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c469e808), (key:(hashcode : -1981773329, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4697318), (key:(hashcode : 210766976, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e1e1d4a8), (key:(hashcode : 1387287201, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c400d998), (key:(hashcode : -826224943, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e370ee48), ...}

 ↖org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

19,333Kb (1.2%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper, org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]>(size: 487582, capacity: 1048576) {(key:(hashcode : 2047900177, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d5309e40), (key:(hashcode : -1907323306, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4b2d818), (key:(hashcode : 1322274519, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c44c19c8), (key:(hashcode : 1036008904, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d9c4e5b0), (key:(hashcode : -1604280216, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@dd28a590), (key:(hashcode : 1890611389, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c469e808), (key:(hashcode : -1981773329, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4697318), (key:(hashcode : 210766976, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e1e1d4a8), (key:(hashcode : 1387287201, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c400d998), (key:(hashcode : -826224943, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e370ee48), ...}

org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
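
The 19,333Kb attributed to this single map follows directly from its reported shape (size 487,582, capacity 1,048,576) and the 4-byte pointer size in the dump header. A rough estimate, assuming roughly 32 bytes per HashMap.Node with compressed references:

// Rough self-footprint estimate for GroupByOperator.hashAggregations (assumptions:
// 4-byte compressed references as reported in the dump, ~32 bytes per HashMap.Node).
class HashMapFootprintEstimate {
    public static void main(String[] args) {
        long capacity   = 1_048_576;              // table length reported above
        long entries    = 487_582;                // map size reported above
        long tableBytes = capacity * 4;           // ~4,096Kb reference array
        long nodeBytes  = entries * 32;           // ~15,237Kb of Node objects
        System.out.printf("~%,dKb%n", (tableBytes + nodeBytes) / 1024);  // ~19,332Kb, in line with
                                                                          // the 19,333Kb entry above
    }
}

That figure covers only the map's own structures; the ListKeyWrapper keys and AggregationBuffer[] values are accounted for separately in their own entries.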
39Kb (< 0.1%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<String, j.u.LinkedHashMap>(size: 1000, capacity: 2048) {(key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=86", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=85", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=88", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=87", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=82", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=81", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=84", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=83", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=850", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=851", val:(size : 1, modCount : 1, threshold : 12, ...)), ...}

org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
35Kb (< 0.1%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<String, org.apache.hadoop.hive.conf.HiveConf$ConfVars>(size: 877, capacity: 2048) {(key:"hive.exec.reducers.bytes.per.reducer", val:(name : "BYTESPERREDUCER", varname : "hive.exec.reducers.bytes.per.reducer", defaultExpr : "256000000", ...)), (key:"hive.metastore.client.capability.check", val:(name : "METASTORE_CAPABILITY_CHECK", varname : "hive.metastore.client.capability.check", defaultExpr : "true", ...)), (key:"datanucleus.storeManagerType", val:(name : "METASTORE_STORE_MANAGER_TYPE", varname : "datanucleus.storeManagerType", defaultExpr : "rdbms", ...)), (key:"hive.aux.jars.path", val:(name : "HIVEAUXJARS", varname : "hive.aux.jars.path", defaultExpr : "", ...)), (key:"hive.exec.stagingdir", val:(name : "STAGINGDIR", varname : "hive.exec.stagingdir", defaultExpr : ".hive-staging", ...)), (key:"hive.metastore.hbase.aggregate.stats.false.positive.probability", val:(name : "METASTORE_HBASE_AGGREGATE_STATS_CACHE_FALSE_POSITIVE_PROBABILITY", varname : "hive.metastore.hbase.aggregate.stats.false.positive.probability", defaultExpr : "0.01", ...)), (key:"hive.exec.default.partition.name", val:(name : "DEFAULTPARTITIONNAME", varname : "hive.exec.default.partition.name", defaultExpr : "__HIVE_DEFAULT_PARTITION__", ...)), (key:"mapreduce.input.fileinputformat.split.minsize.per.rack", val:(name : "MAPREDMINSPLITSIZEPERRACK", varname : "mapreduce.input.fileinputformat.split.minsize.per.rack", defaultExpr : "1", ...)), (key:"hive.metastore.event.expiry.duration", val:(name : "METASTORE_EVENT_EXPIRY_DURATION", varname : "hive.metastore.event.expiry.duration", defaultExpr : "0s", ...)), (key:"hive.exec.mode.local.auto.input.files.max", val:(name : "LOCALMODEMAXINPUTFILES", varname : "hive.exec.mode.local.auto.input.files.max", defaultExpr : "4", ...)), ...}

 ↖Java Static org.apache.hadoop.hive.conf.HiveConf.vars


 487,583   15,236Kb (0.9%)   15,236Kb (0.9%)   26,664Kb (1.6%)   org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper
Reference chains
Expensive data fields

15,236Kb (0.9%): org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper: 487,582 objects

 Random sample 
(hashcode : -277661827, keys : Object[](size: 2), equalComparer : org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer@c4b38458, this$0 : org.apache.hadoop.hive.ql.exec.KeyWrapperFactory@c4b38558)
(hashcode : -1559694932, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1692096880, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -747006738, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1293666561, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 426306545, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -200973971, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1852010205, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -260380835, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1768075374, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1613810886, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1468453948, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1803465267, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1021019654, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 189431968, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -503903149, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1048294277, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 87405895, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 2077607202, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1773541156, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)

{j.u.HashMap}.keys org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

15,236Kb (0.9%): org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper: 487,582 objects

 Random sample 
(hashcode : -277661827, keys : Object[](size: 2), equalComparer : org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer@c4b38458, this$0 : org.apache.hadoop.hive.ql.exec.KeyWrapperFactory@c4b38558)
(hashcode : -1559694932, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1692096880, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -747006738, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1293666561, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 426306545, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -200973971, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1852010205, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -260380835, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1768075374, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1613810886, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1468453948, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1803465267, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1021019654, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 189431968, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -503903149, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1048294277, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 87405895, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 2077607202, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1773541156, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)

{j.u.HashMap}.keys org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
32b (< 0.1%): org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper: 1 object

 Random sample 
(hashcode : 1569083627, keys : Object[](size: 2), equalComparer : org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer@c4b38888, this$0 : org.apache.hadoop.hive.ql.exec.KeyWrapperFactory@c4b38558)

org.apache.hadoop.hive.ql.exec.GroupByOperator.newKeys {j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
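
Note: every chain above ends at the static Utilities.gWorkMap, so these ~487K group-by key wrappers stay reachable only because the per-partition MapWork plans, and the operator trees hanging off them (including each GroupByOperator's in-memory hashAggregations map), remain cached in that JVM-wide map. Assuming the common compressed-pointer layout (12-byte object headers, 4-byte references), one ListKeyWrapper costs 12 + 4 (hashcode) + 3 × 4 (references) = 28 bytes, padded to 32, and 487,582 × 32 bytes ≈ 15,236Kb, matching the figure above. The sketch below is a minimal, hypothetical illustration of this retention pattern (a static plan cache pinning an operator tree and its aggregation buffers); it is not Hive's actual code, and every name in it is a stand-in.

// Minimal sketch of the retention pattern shown above. All names are
// hypothetical stand-ins, not Hive's real classes: a JVM-wide plan cache
// keeps each cached plan, its operator tree, and each group-by operator's
// per-key aggregation map reachable from a GC root until the plan entry
// is explicitly removed.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

final class PlanCacheRetentionSketch {
    // Plays the role of the static gWorkMap GC root in the chains above.
    static final Map<String, WorkPlan> WORK_MAP = new HashMap<>();

    static final class WorkPlan {
        // Plays the role of MapWork.aliasToWork -> root operators.
        final Map<String, Op> aliasToWork = new HashMap<>();
    }

    static class Op {
        final List<Op> childOperators = new ArrayList<>();
    }

    static final class GroupByOp extends Op {
        // Plays the role of GroupByOperator.hashAggregations: one entry per
        // distinct group-by key; each key wraps the key columns of one group.
        final Map<List<Object>, Object> hashAggregations = new HashMap<>();
    }

    public static void main(String[] args) {
        WorkPlan plan = new WorkPlan();
        GroupByOp gby = new GroupByOp();
        Op scan = new Op();
        scan.childOperators.add(gby);
        plan.aliasToWork.put("catalogs", scan);
        WORK_MAP.put("viewfs://.../some-plan-path", plan);

        // Every new key adds a wrapper plus its key columns, all reachable
        // from the static WORK_MAP even after the query finishes.
        gby.hashAggregations.put(List.<Object>of(15598, "a text column value"), new Object());

        // Only removing the plan entry breaks the chain from the GC root.
        WORK_MAP.remove("viewfs://.../some-plan-path");
    }
}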


 495,823 objects   11,972Kb (0.7%) shallow   11,711Kb (0.7%) impl-inclusive   31,216Kb (1.9%) retained   Object[]
Reference chains
Expensive data fields

11,427Kb (0.7%): Object[]: 487,583 objects

 Random sample 
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 21, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 76, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 161, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 17, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 49, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 46, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 19704), org.apache.hadoop.io.Text(length : 58, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 120, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 109, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 60, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 94, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 10704), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 22, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 111, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 59, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 67, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 46704), org.apache.hadoop.io.Text(length : 61, ...)}

 ↖org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys


Full reference chains

11,427Kb (0.7%): Object[]: 487,582 objects

 Random sample 
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 21, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 76, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 161, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 17, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 49, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 46, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 19704), org.apache.hadoop.io.Text(length : 58, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 120, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 109, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 60, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 94, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 10704), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 22, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 111, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 59, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 67, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 46704), org.apache.hadoop.io.Text(length : 61, ...)}

org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys {j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
159Kb (< 0.1%): Object[]: 2,433 objects (36% of all objects referenced here)

 Random sample 
Object[20](element class String){null, null, null, null, null, "GET", null, "TRACE", null, null, null, null, null, null, null, "POST", "HEAD", "OPTIONS", "PUT", "DELETE"}
Object[3](element class String){",name=", "monitor", "control"}
Object[6](element class String){null, null, null, null, null, "9223372036854775808"}
Object[1]{null}
Object[2]{null, null}
Object[2]{null, null}
Object[26](element class String){"control", null, null, null, null, ".handlers", "java.util.logging.config.class", null, null, "", "java.util.logging.config.file", "java.home", null, "lib", "logging.properties", "config", null, null, null, null, ...}
Object[17](element class String){"http://www.w3.org/TR/1998/REC-xml-19980210", "MSG_GRAMMAR_NOT_FOUND", "RootElementTypeMustMatchDoctypedecl", "ElementUnterminated", "http://www.w3.org/TR/1999/REC-xml-names-19990114", "ElementXMLNSPrefix", "ElementPrefixUnbound", "AttributePrefixUnbound", "AttributeNSNotUnique", "AttributeNotUnique", "ElementEntityMismatch", "EqRequiredInAttribute", "CantBindXMLNS", "CantBindXML", "EmptyPrefixedAttName", "ETagRequired", "ETagUnterminated"}
Object[7](element class String){"Malformed \uxxxx encoding.", "#", "8859_1", "=", "UTF-8", "-- listing properties --", "..."}
Object[11](element class String){"env:", "system:", null, null, null, null, null, "", "Variable substitution depth is deeper than ", " for expression ", "\$\{[^\}\$ ]+\}"}
Object[1](element class String){"spark"}
Object[17](element class String){null, null, null, null, null, null, null, null, null, null, "modifyThread", null, null, null, null, null, null}
Object[8](element class String){"os.arch", "", "i386", "x86", "amd64", null, null, null}
Object[1]{null}
Object[6](element class String){"META-INF/services/", null, null, null, null, null}
Object[2]{null, null}
Object[2]{null, null}
Object[9]{null, null, null, null, null, null, null, null, null}
Object[2]{null, null}
Object[7](element class String){"ISO-8859-1", "#!", null, null, "include", "line.separator", null}

↖Unreachable
  All or some of these objects may have started life as:

11,427Kb (0.7%): Object[]: 487,582 objects

 Random sample 
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 21, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 76, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 161, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 17, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 49, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 46, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 19704), org.apache.hadoop.io.Text(length : 58, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 120, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 109, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 60, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 94, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 10704), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 22, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 111, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 59, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 67, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 46704), org.apache.hadoop.io.Text(length : 61, ...)}

org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys {j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
23Kb (< 0.1%): Object[]: 999 objects

 Random sample 
Object[2](element class Object[]){null, Object[](1)@c0e16198}
Object[2](element class Object[]){null, Object[](1)@c0bbe190}
Object[2](element class Object[]){null, Object[](1)@c11972b0}
Object[2](element class Object[]){null, Object[](1)@c0b2f310}
Object[2](element class Object[]){null, Object[](1)@c0d3cae0}
Object[2](element class Object[]){null, Object[](1)@c104f4c0}
Object[2](element class Object[]){null, Object[](1)@c0814c20}
Object[2](element class Object[]){null, Object[](1)@c0992ef0}
Object[2](element class Object[]){null, Object[](1)@c0f95e18}
Object[2](element class Object[]){null, Object[](1)@c07b4990}
Object[2](element class Object[]){null, Object[](1)@c10c2b58}
Object[2](element class Object[]){null, Object[](1)@c0cb4358}
Object[2](element class Object[]){null, Object[](1)@c0f141f0}
Object[2](element class Object[]){null, Object[](1)@c10a68a0}
Object[2](element class Object[]){null, Object[](1)@c1184990}
Object[2](element class Object[]){null, Object[](1)@c07f52c8}
Object[2](element class Object[]){null, Object[](1)@c089c520}
Object[2](element class Object[]){null, Object[](1)@c0fb1fd8}
Object[2](element class Object[]){null, Object[](1)@c0f706c8}
Object[2](element class Object[]){null, Object[](1)@c0b253d0}

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart {j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
23Kb (< 0.1%): Object[]: 999 objects (99% of all objects referenced here)

 Random sample 
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 933)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 38)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 153)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 825)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 248)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 370)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 425)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 638)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 901)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 203)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 335)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 517)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 931)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 318)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 124)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 422)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 665)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 775)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 753)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 456)}

Object[] org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
16Kb (< 0.1%): Object[]: 4 objects

 Random sample 
Object[1024](element class com.google.common.collect.MapMakerInternalMap$WeakEntry){null, null, (referent : "hive.llap.daemon.am.liveness.heartbeat.interval.ms", hash : 283482114, ...), (referent : "fs.df.interval", hash : 667159555, ...), null, null, null, (referent : "20.0", hash : 1036568583, ...), null, (referent : "hive.blobstore.use.blobstore.as.scratchdir", hash : 694263817, ...), (referent : "hive.spark.client.rpc.max.size", hash : 269829130, ...), null, (referent : "hive.tez.input.generate.consistent.splits", hash : 1039710220, ...), null, (referent : "hive.groupby.skewindata", hash : 879368206, ...), null, null, (referent : "hive.server2.webui.max.threads", hash : 448924689, ...), null, null, ...}
Object[1024](element class com.google.common.collect.MapMakerInternalMap$WeakEntry){(referent : "48-df-37-27-5b-f0.hpc.criteo.preprod:50070", hash : 1635065856, ...), null, (referent : "hive.map.aggr", hash : 1598769154, ...), null, (referent : "hive_zookeeper_namespace", hash : 1201139716, ...), null, (referent : "dfs.datanode.transfer.socket.send.buffer.size", hash : 2091598854, ...), (referent : "org.apache.hadoop.yarn.server.sharedcachemanager.RemoteAppChecker", hash : 1746011143, ...), null, null, (referent : "hive.llap.io.threadpool.size", hash : 1644576778, ...), null, (referent : "mapreduce.app-submission.cross-platform", hash : 1674049548, ...), null, (referent : "hive.exec.script.maxerrsize", hash : 1553343502, ...), null, (referent : "yarn.am.liveness-monitor.expiry-interval-ms", hash : 1408093200, ...), null, (referent : "org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe", hash : 1577140242, ...), (referent : "hive.auto.convert.sortmerge.join.bigtable.selection.policy", hash : 1799067667, ...), ...}
Object[1024](element class com.google.common.collect.MapMakerInternalMap$WeakEntry){(referent : "ftp.blocksize", hash : -1366956032, ...), (referent : "dfs.default.chunk.view.size", hash : -1336341503, ...), null, (referent : "10s", hash : -1842974717, ...), null, (referent : "hive.server2.webui.use.spnego", hash : -1098757115, ...), null, (referent : "dfs.namenode.num.extra.edits.retained", hash : -1286292473, ...), null, null, null, (referent : "mapreduce.cluster.temp.dir", hash : -1264724981, ...), null, (referent : "yarn.timeline-service.leveldb-timeline-store.start-time-write-cache-size", hash : -1452300275, ...), (hash : -1338645490, ...), null, null, null, null, null, ...}
Object[1024](element class com.google.common.collect.MapMakerInternalMap$WeakEntry){null, (referent : "hive.variable.substitute.depth", hash : -320883711, ...), null, (referent : "fs.s3a.metadatastore.impl", hash : -450872317, ...), null, (referent : "datanucleus1", hash : -170013691, ...), (referent : "org.apache.hadoop.fs.s3a.S3AFileSystem", hash : -691271674, ...), null, (referent : "64", hash : -750568440, ...), (referent : "hive.optimize.cte.materialize.threshold", hash : -763467767, ...), null, null, null, (referent : "hive.server2.thrift.client.password", hash : -107592691, ...), (referent : "hdfs://preprod-pa4/user", hash : -739640306, ...), (referent : "mapreduce.task.tmp.dir", hash : -7145457, ...), null, null, (referent : "dfs.namenode.rpc-address.am5.6c-3b-e5-bd-4b-ac", hash : -85121006, ...), (referent : "hive.metastore.aggregate.stats.cache.max.full", hash : -505258989, ...), ...}

java.util.concurrent.atomic.AtomicReferenceArray.array com.google.common.collect.MapMakerInternalMap$Segment.table
com.google.common.collect.MapMakerInternalMap$Segment[]
com.google.common.collect.MapMakerInternalMap.segments
com.google.common.collect.Interners$CustomInterner.map
↖Java Static org.apache.hadoop.util.StringInterner.weakInterner


23Kb (< 0.1%): Object[]: 999 objects

 Random sample 
Object[2](element class Object[]){null, Object[](1)@c0e16198}
Object[2](element class Object[]){null, Object[](1)@c0bbe190}
Object[2](element class Object[]){null, Object[](1)@c11972b0}
Object[2](element class Object[]){null, Object[](1)@c0b2f310}
Object[2](element class Object[]){null, Object[](1)@c0d3cae0}
Object[2](element class Object[]){null, Object[](1)@c104f4c0}
Object[2](element class Object[]){null, Object[](1)@c0814c20}
Object[2](element class Object[]){null, Object[](1)@c0992ef0}
Object[2](element class Object[]){null, Object[](1)@c0f95e18}
Object[2](element class Object[]){null, Object[](1)@c07b4990}
Object[2](element class Object[]){null, Object[](1)@c10c2b58}
Object[2](element class Object[]){null, Object[](1)@c0cb4358}
Object[2](element class Object[]){null, Object[](1)@c0f141f0}
Object[2](element class Object[]){null, Object[](1)@c10a68a0}
Object[2](element class Object[]){null, Object[](1)@c1184990}
Object[2](element class Object[]){null, Object[](1)@c07f52c8}
Object[2](element class Object[]){null, Object[](1)@c089c520}
Object[2](element class Object[]){null, Object[](1)@c0fb1fd8}
Object[2](element class Object[]){null, Object[](1)@c0f706c8}
Object[2](element class Object[]){null, Object[](1)@c0b253d0}

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart {j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
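
The dominant entry here is one Object[2] per cached group-by key (one IntWritable plus one Text, as the samples show). Under the same layout assumption (16-byte array header plus 2 × 4-byte element references), each such array is 24 bytes, and 487,583 × 24 bytes ≈ 11,427Kb — exactly the figure reported above. The arrays themselves are therefore cheap; the cost lies in the number of keys kept alive through the static gWorkMap and in the IntWritable/Text values the arrays point to, which are accounted for in the entries that follow.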


 498,673 objects   11,687Kb (0.7%) shallow   11,687Kb (0.7%) impl-inclusive   59,704Kb (3.7%) retained   org.apache.hadoop.io.Text
Reference chains
Expensive data fields

11,427Kb (0.7%): org.apache.hadoop.io.Text: 487,567 objects (49% of all objects referenced here)

 Random sample 
(bytes : byte[](size: 81), length : 81)
(bytes : (size: 125), length : 125)
(bytes : (size: 63), length : 63)
(bytes : (size: 105), length : 105)
(bytes : (size: 67), length : 67)
(bytes : (size: 72), length : 72)
(bytes : (size: 54), length : 54)
(bytes : (size: 59), length : 59)
(bytes : (size: 62), length : 62)
(bytes : (size: 125), length : 125)
(bytes : (size: 98), length : 98)
(bytes : (size: 77), length : 77)
(bytes : (size: 139), length : 139)
(bytes : (size: 118), length : 118)
(bytes : (size: 69), length : 69)
(bytes : (size: 57), length : 57)
(bytes : (size: 73), length : 73)
(bytes : (size: 100), length : 100)
(bytes : (size: 79), length : 79)
(bytes : (size: 42), length : 42)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys


Full reference chains

11,427Kb (0.7%): org.apache.hadoop.io.Text: 487,567 objects (49% of all objects referenced here)

 Random sample 
(bytes : byte[](size: 81), length : 81)
(bytes : (size: 125), length : 125)
(bytes : (size: 63), length : 63)
(bytes : (size: 105), length : 105)
(bytes : (size: 67), length : 67)
(bytes : (size: 72), length : 72)
(bytes : (size: 54), length : 54)
(bytes : (size: 59), length : 59)
(bytes : (size: 62), length : 62)
(bytes : (size: 125), length : 125)
(bytes : (size: 98), length : 98)
(bytes : (size: 77), length : 77)
(bytes : (size: 139), length : 139)
(bytes : (size: 118), length : 118)
(bytes : (size: 69), length : 69)
(bytes : (size: 57), length : 57)
(bytes : (size: 73), length : 73)
(bytes : (size: 100), length : 100)
(bytes : (size: 79), length : 79)
(bytes : (size: 42), length : 42)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys
{j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
233Kb (< 0.1%): org.apache.hadoop.io.Text: 9,980 objects

 Random sample 
(bytes : byte[](size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)

org.apache.hadoop.hive.serde2.lazy.LazyString.data org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.field
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
11Kb (< 0.1%): org.apache.hadoop.io.Text: 511 objects

 Random sample 
(bytes : byte[](size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)

org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.nullSequence org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
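
Two observations here. First, org.apache.hadoop.io.Text is a thin wrapper (a byte[] reference plus an int length, about 24 bytes shallow under the assumed layout), so 487,567 × 24 bytes ≈ 11,427Kb; the actual string payloads live in the backing byte[] arrays, which this breakdown attributes separately (hence the much larger retained figure for Text above). Second, the 9,980 empty Text objects are pre-allocated field buffers inside the cached per-partition deserializers (the ColumnarSerDe/LazyString chain above): with the 999 cached MapOpCtx entries seen in the other chains, that works out to roughly ten empty buffers per cached deserializer — consistent with one buffer per lazily-deserialized string column, multiplied by every partition whose plan stays in gWorkMap.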


 102,739 objects   2,407Kb (0.1%) shallow   11,290Kb (0.7%) impl-inclusive   11,290Kb (0.7%) retained   String
Reference chains
Expensive data fields

2,111Kb (0.1%): String: 13,710 objects

 Random sample 
"http://static.criteo.net/images/produits/14842/stars/2.png"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"https://www.viator.com/images/stars/orange/16_5.gif"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"https://www.viator.com/images/stars/orange/16_4.gif"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_5.gif"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"https://www.viator.com/images/stars/orange/16_5.gif"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o
2,111Kb (0.1%): String: 13,710 objects

 Random sample 
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"https://www.viator.com/images/stars/orange/16_5.gif"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"https://www.viator.com/images/stars/orange/16_5.gif"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o


Full reference chains

2,111Kb (0.1%): String: 13,710 objects

 Random sample 
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"https://www.viator.com/images/stars/orange/16_5.gif"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"https://www.viator.com/images/stars/orange/16_5.gif"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
2,111Kb (0.1%): String: 13,710 objects

 Random sample 
"http://static.criteo.net/images/produits/14842/stars/2.png"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"https://www.viator.com/images/stars/orange/16_5.gif"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"http://static.criteo.net/images/produits/14842/stars/4.png"
"https://www.viator.com/images/stars/orange/16_4.gif"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_5.gif"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/5.png"
"http://static.criteo.net/images/produits/14842/stars/0.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"
"https://www.viator.com/images/stars/orange/16_5.gif"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
1,365Kb (< 0.1%): String: 22,977 objects

 Random sample 
"description"
"id"
"reallyrecommendable"
"smallimage"
"reallyrecommendable"
"statflag"
"lasttouch"
"id"
"smallimage"
"additionalimagelinks"
"discount"
"discount"
"recommendable"
"hashcode"
"statflag"
"bigimage"
"filtered"
"smallimage"
"partnerid"
"recommendable"

String[] j.u.Arrays$ArrayList.a
org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.columnNames
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
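
The two 2,111Kb entries above are the min- and max-aggregation buffers of a group-by, each holding one full URL string per group (13,710 strings on each side); the samples show only a handful of distinct values repeated over and over, so most of this memory is duplicate copies of the same few strings. The third chain adds 22,977 column-name strings: 22,977 / 999 cached deserializers = 23 names per partition, i.e. the same set of ~23 column names is held once per cached partition plan. The snippet below is a generic illustration, not Hive code, of how canonicalizing such low-cardinality strings through a small map collapses the duplicates before they accumulate; all names in it are hypothetical.

// Generic illustration (hypothetical names, not Hive's implementation):
// canonicalize strings from a low-cardinality column so that equal values
// share one instance instead of one copy per group.
import java.util.HashMap;
import java.util.Map;

final class StringCanonicalizer {
    private final Map<String, String> seen = new HashMap<>();

    // Returns an equal String seen earlier, or remembers and returns this one.
    String canonical(String s) {
        String prior = seen.putIfAbsent(s, s);
        return prior != null ? prior : s;
    }

    public static void main(String[] args) {
        StringCanonicalizer c = new StringCanonicalizer();
        String a = c.canonical(new String("https://www.viator.com/images/stars/orange/16_5.gif"));
        String b = c.canonical(new String("https://www.viator.com/images/stars/orange/16_5.gif"));
        System.out.println(a == b); // true: the second copy is never retained
    }
}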


 493,612 objects   7,712Kb (0.5%) shallow   7,712Kb (0.5%) impl-inclusive   7,712Kb (0.5%) retained   org.apache.hadoop.io.IntWritable
Reference chains
Expensive data fields

7,618Kb (0.5%): org.apache.hadoop.io.IntWritable: 487,582 objects (50% of all objects referenced here)

 Random sample 
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys


Full reference chains

7,618Kb (0.5%): org.apache.hadoop.io.IntWritable: 487,582 objects (50% of all objects referenced here)

 Random sample 
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys
{j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
77Kb (< 0.1%): org.apache.hadoop.io.IntWritable: 4,990 objects

 Random sample 
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)

org.apache.hadoop.hive.serde2.lazy.LazyInteger.data org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.field
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
15Kb (< 0.1%): org.apache.hadoop.io.IntWritable: 999 objects

 Random sample 
(value : 830)
(value : 269)
(value : 49)
(value : 491)
(value : 778)
(value : 866)
(value : 357)
(value : 712)
(value : 392)
(value : 570)
(value : 846)
(value : 562)
(value : 699)
(value : 384)
(value : 669)
(value : 133)
(value : 594)
(value : 625)
(value : 622)
(value : 310)

Object[] Object[]
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
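
Under the assumed layout an org.apache.hadoop.io.IntWritable is 16 bytes (a 12-byte header plus a 4-byte int), so 487,582 × 16 bytes ≈ 7,618Kb, matching the figure above. The values sampled here (8598, 15598) and in the Object[] keys earlier (1704, 23704, ...) suggest a low-cardinality id column, meaning almost all of these wrappers box the same handful of int values — one boxed copy per group-by key, kept alive through the same gWorkMap root as the rest of the aggregation state.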


 2,026 objects   94Kb (< 0.1%) shallow   2,279Kb (0.1%) impl-inclusive   3,486Kb (0.2%) retained   j.u.Properties
Reference chains
Full reference chains

967Kb (< 0.1%): j.u.Properties: 999 objects

 Random sample 
j.u.Properties<String, String>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187675"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1981871005"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187686"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"143404896"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187808"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"155266763"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187688"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"72058740"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187808"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"98299908"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187663"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1849065731"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187676"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"195372478"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187858"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"64989485"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187846"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"428864255"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187763"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"93614981"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187786"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"193388137"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187676"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"52798226"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187738"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"580719978"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187774"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1443123224"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187788"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"304819133"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187654"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"52489296"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187656"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"767087998"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187751"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"2827944636"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187685"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"266628232"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187688"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"455181474"), (key:"bucket_count", val:"-1"), ...}

org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.tableProperties org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
937Kb (< 0.1%): j.u.Properties: 1,000 objects

 Random sample 
j.u.Properties<String, String>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187741"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"68788226"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187808"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"98299908"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187719"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"3123319604"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187675"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1981871005"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187809"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1194311409"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187848"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"188408033"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187775"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"223325319"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187789"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"53200451"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187847"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"228290831"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187739"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"155860491"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187764"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"66279468"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187869"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"125401724"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187822"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"935602865"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187798"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"63010566"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187666"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"371722302"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187761"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"57333119"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187775"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"90464196"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187871"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"2437102344"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187707"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"915949031"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187833"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"72236656"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}

org.apache.hadoop.hive.ql.plan.PartitionDesc.properties {j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.pathToPartitionInfo
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
143Kb (< 0.1%): j.u.Properties: 2 objects

 Random sample 
j.u.Properties<String, String>(size: 1903, capacity: 3071) {(key:"yarn.app.mapreduce.client.job.retry-interval", val:"2000"), (key:"s3.stream-buffer-size", val:"4096"), (key:"dfs.namenode.servicerpc-address.preprod-pa3.9c-b6-54-7d-1c-7c", val:"9c-b6-54-7d-1c-7c.hpc.criteo.preprod:8018"), (key:"dfs.namenode.http-address.yarn-experimental.a4-5d-36-fc-ef-54", val:"a4-5d-36-fc-ef-54.hpc.criteo.preprod:50070"), (key:"hive.metastore.hbase.aggr.stats.cache.entries", val:"10000"), (key:"hive.allow.udf.load.on.demand", val:"false"), (key:"dfs.cluster.administrators", val:" Gu-sre-Lake,gu-sre-lake"), (key:"hadoop.registry.zk.root", val:"/registry"), (key:"hive.server2.map.fair.scheduler.queue", val:"true"), (key:"hive.txn.manager", val:"org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager"), ...}
j.u.Properties<>(size: 1903, capacity: 3071) {(key:"yarn.app.mapreduce.client.job.retry-interval", val:"2000"), (key:"s3.stream-buffer-size", val:"4096"), (key:"dfs.namenode.servicerpc-address.preprod-pa3.9c-b6-54-7d-1c-7c", val:"9c-b6-54-7d-1c-7c.hpc.criteo.preprod:8018"), (key:"dfs.namenode.http-address.yarn-experimental.a4-5d-36-fc-ef-54", val:"a4-5d-36-fc-ef-54.hpc.criteo.preprod:50070"), (key:"hive.metastore.hbase.aggr.stats.cache.entries", val:"10000"), (key:"hive.allow.udf.load.on.demand", val:"false"), (key:"dfs.cluster.administrators", val:" Gu-sre-Lake,gu-sre-lake"), (key:"hadoop.registry.zk.root", val:"/registry"), (key:"hive.server2.map.fair.scheduler.queue", val:"true"), (key:"hive.txn.manager", val:"org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager"), ...}

org.apache.hadoop.conf.Configuration.properties org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.conf
org.apache.hadoop.io.retry.RetryInvocationHandler.proxyProvider
com.sun.proxy.$Proxy15.h
org.apache.hadoop.hdfs.DFSClient.namenode
org.apache.hadoop.hdfs.DistributedFileSystem.dfs
{j.u.HashMap}.values
org.apache.hadoop.fs.FileSystem$Cache.map
↖Java Static org.apache.hadoop.fs.FileSystem.CACHE






4. Where Memory Goes, by GC Root:  Leak candidate(s) found  What's this?

1,047,625Kb (64.3%) Object tree for GC root(s) Java Static java.beans.ThreadGroupContext.contexts
Root object java.beans.ThreadGroupContext$1@8055e958 java.beans.ThreadGroupContext$1(queue : j.l.r.ReferenceQueue@8055e978, table : java.beans.WeakIdentityMap$Entry[](size: 8), threshold : 6, size : 1)
                              1. byte[] self 1,047,552Kb (64.3%), 1 object(s)
                            1. ... 18 more referenced objects retaining 936b (< 0.1%)
                        1. ... 5 more referenced objects retaining 416b (< 0.1%)
                        1. ... 23 referenced objects retaining 6Kb (< 0.1%)
                  1. String self 48b (< 0.1%), 1 object(s)
                1. ... 2 referenced objects retaining 64Kb (< 0.1%)
          1. j.l.r.ReferenceQueue$Lock self 16b (< 0.1%), 1 object(s)
571,788Kb (35.1%) Object tree for GC root(s) Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap LEAK CANDIDATE(S) FOUND 
Root object org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory@c078c300 org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory(threadLocalWorkMap : null, gWorkMap : j.u.HashMap(size: 1), dummy : org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory$DummyMap@c078c918)
                                                  1. j.u.HashSet self 182,843Kb (11.2%), 2,925,492 object(s)
                                                  1. j.u.HashSet self 60,947Kb (3.7%), 975,164 object(s)
                                                  1. j.l.Long self 2Kb (< 0.1%), 98 object(s)
                                                  1. String self 2,111Kb (0.1%), 13,710 object(s)
                                                  1. String self 2,111Kb (0.1%), 13,710 object(s)
                                                      1. byte[] self 47,987Kb (2.9%), 487,567 object(s)
                                                  1. org.apache.hadoop.io.IntWritable self 7,618Kb (0.5%), 487,582 object(s)
                                                      1. org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer$FieldComparer self 160b (< 0.1%), 2 object(s)
                                    1. ... 31 more referenced objects retaining 24Kb (< 0.1%)
                                                          1. byte[] self 3,776Kb (0.2%), 4 object(s)
                                                    1. ... 26 more referenced objects retaining 1,044Kb (< 0.1%)
                                              1. int[] self 48b (< 0.1%), 1 object(s)
                                              1. org.apache.hadoop.io.IntWritable self 16b (< 0.1%), 1 object(s)
                                    1. ... 3 more referenced objects retaining 928b (< 0.1%)
                                  1. ... 11 more referenced objects retaining 4Kb (< 0.1%)
                            1. ... 13 more referenced objects retaining 9Kb (< 0.1%)
                                                1. ... 5506 referenced objects retaining 2,950Kb (0.2%)
                                                        1. ... 45908 referenced objects retaining 1,380Kb (< 0.1%)
                                                1. ... 1485 more referenced objects retaining 69Kb (< 0.1%)
                                            1. ... 3996 more referenced objects retaining 1,061Kb (< 0.1%)
                                                              1. byte[] self 4,942Kb (0.3%), 7 object(s)
                                                          1. ... 23 referenced objects retaining 2Kb (< 0.1%)
                                                1. ... 2 more referenced objects retaining 96b (< 0.1%)
                                                  1. org.apache.hadoop.io.IntWritable self 15Kb (< 0.1%), 999 object(s)
                                        1. ... 1998 more referenced objects retaining 101Kb (< 0.1%)
                                    1. ... 1002 more referenced objects retaining 15Kb (< 0.1%)
                                1. ... 1001 more referenced objects retaining 187Kb (< 0.1%)
                            1. ... 17 more referenced objects retaining 30Kb (< 0.1%)
                    1. ... 13 more referenced objects retaining 16Kb (< 0.1%)
                  1. String self 96b (< 0.1%), 1 object(s)
                        1. ... 12137 referenced objects retaining 927Kb (< 0.1%)
                    1. ... 2001 more referenced objects retaining 320Kb (< 0.1%)
                1. ... 3 more referenced objects retaining 48b (< 0.1%)
            1. ... 11 more referenced objects retaining 892Kb (< 0.1%)
                1. ... 6 referenced objects retaining 1Kb (< 0.1%)
      1. org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory$DummyMap self 16b (< 0.1%), 1 object(s)
1,953Kb (0.1%) Object tree for GC root(s) Java Static org.apache.hive.com.esotericsoftware.reflectasm.AccessClassLoader.selfContextParentClassLoader
Root object sun.misc.Launcher$AppClassLoader@8001d198 sun.misc.Launcher$AppClassLoader(parent : sun.misc.Launcher$ExtClassLoader@8001d1f8, parallelLockMap : java.util.concurrent.ConcurrentHashMap(size: 4,821), package2certs : java.util.concurrent.ConcurrentHashMap(size: 283), classes : j.u.Vector(size: 4,373), defaultDomain : java.security.ProtectionDomain@80218448, domains : j.u.Collections$SynchronizedSet@801fcc10, packages : j.u.HashMap(size: 281), nativeLibraries : j.u.Vector(size: 1), assertionLock : Object@80200d18, defaultAssertionStatus : false, packageAssertionStatus : null, classAssertionStatus : null, initialized : true, pdcache : j.u.HashMap(size: 30), ucp : sun.misc.URLClassPath@80177708, acc : java.security.AccessControlContext@8004e2b0, closeables : j.u.WeakHashMap(size: 5), ucp : sun.misc.URLClassPath@80177708)
    1. ... 13 referenced objects retaining 1,953Kb (0.1%)
... and 7245 more GC roots together retaining 6,276Kb (0.4%)
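
The leak candidate above is a JVM-wide static map, reached via org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap through GlobalWorkMapFactory, which retains MapWork plan objects per query path and, through them, the operator trees and aggregation buffers reported in the sections below. The sketch that follows is illustrative only (PlanCache, register() and clearFor() are hypothetical names, not Hive code); it shows the general anti-pattern behind this kind of leak and the usual remedy, which is to make removal from the static cache part of query completion:

  import java.util.Map;
  import java.util.concurrent.ConcurrentHashMap;

  class PlanCache {
      // A JVM-wide GC root: everything stored here stays reachable until explicitly removed.
      private static final Map<String, Object> WORK_MAP = new ConcurrentHashMap<>();

      static void register(String queryPath, Object work) {
          WORK_MAP.put(queryPath, work);       // grows with every query
      }

      static Object lookup(String queryPath) {
          return WORK_MAP.get(queryPath);
      }

      // Remedy: invoke this (e.g. from a finally block) when the query finishes.
      static void clearFor(String queryPath) {
          WORK_MAP.remove(queryPath);
      }
  }

In Hive the analogous cleanup would be removing the finished query's entries from Utilities.gWorkMap; if that cleanup is skipped or fails, the per-query plan objects accumulate exactly as in this sketch.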




5. Live vs Garbage Objects What's this?

     Live   Garbage  Total 
 Objects 19,517,030 16,69219,533,722
 Bytes 1,628,172Kb (100.0%) 749Kb (< 0.1%)1,628,922Kb (100.0%)

Details:

  #instances garbage   Shallow size garbage   #instances live   Shallow size live  Class name 
 40 968b (< 0.1%) 492,129 1,106,966Kb (68.0%)byte[]
 96 4Kb (< 0.1%) 3,901,068 182,862Kb (11.2%)j.u.HashMap
 0 0b (0.0%) 2,925,498 68,566Kb (4.2%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg
 6 96b (< 0.1%) 3,900,702 60,948Kb (3.7%)j.u.HashSet
 0 0b (0.0%) 487,583 38,092Kb (2.3%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
 0 0b (0.0%) 1,950,332 30,473Kb (1.9%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
 0 0b (0.0%) 1,950,332 30,473Kb (1.9%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg
 0 0b (0.0%) 975,166 22,855Kb (1.4%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg
 265 8Kb (< 0.1%) 493,950 15,435Kb (0.9%)j.u.HashMap$Node
 0 0b (0.0%) 487,583 15,236Kb (0.9%)org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper
 2,568 164Kb (< 0.1%) 493,255 11,807Kb (0.7%)Object[]
 0 0b (0.0%) 498,673 11,687Kb (0.7%)org.apache.hadoop.io.Text
 3,776 297Kb (< 0.1%) 99,161 8,933Kb (0.5%)char[]
 0 0b (0.0%) 493,612 7,712Kb (0.5%)org.apache.hadoop.io.IntWritable
 239 13Kb (< 0.1%) 2,439 4,354Kb (0.3%)j.u.HashMap$Node[]
 3,775 88Kb (< 0.1%) 98,964 2,319Kb (0.1%)String
 0 0b (0.0%) 56,301 1,759Kb (0.1%)j.u.Hashtable$Entry




6. Fixed per-object overhead What's this?

  Fixed per-object overhead  Total overhead 
 12b228,910Kb (14.1%)
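
(For reference, this total is simply the fixed per-object header cost multiplied across the whole heap: 19,533,722 objects × 12 bytes per object header ≈ 228,910Kb. It can therefore only shrink by reducing the overall number of objects, for example by not allocating the millions of empty HashSet and aggregation-buffer instances listed in the details table below and in Section 9.)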

Details:

  #instances   (Average) object size   Total overhead per class   Class name 
 3,901,164 48b 45,716Kb (2.8%)j.u.HashMap
 3,900,708 16b 45,711Kb (2.8%)j.u.HashSet
 2,925,498 24b 34,283Kb (2.1%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg
 1,950,332 16b 22,855Kb (1.4%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
 1,950,332 16b 22,855Kb (1.4%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg
 975,166 24b 11,427Kb (0.7%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg
 498,673 24b 5,843Kb (0.4%)org.apache.hadoop.io.Text
 495,823 24b 5,810Kb (0.4%)Object[]
 494,215 32b 5,791Kb (0.4%)j.u.HashMap$Node
 493,612 16b 5,784Kb (0.4%)org.apache.hadoop.io.IntWritable
 492,169 2Kb 5,767Kb (0.4%)byte[]
 487,583 32b 5,713Kb (0.4%)org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper
 487,583 80b 5,713Kb (0.4%)org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]




7. Memory Retained by Objects Awaiting Finalization:  no significant overhead  What's this?





8. Duplicate Strings:  overhead 0.5%  What's this?

  Total strings   Unique strings   Duplicate values  Overhead 
 102,739 31,356 5,9427,397Kb (0.5%)

Top duplicate strings

  Overhead   # char[]s   # objects  Value 
 1,387Kb (< 0.1%) 8,884 8,884"http://static.criteo.net/images/produits/14842/stars/5.png"
 825Kb (< 0.1%) 5,281 5,281"http://static.criteo.net/images/produits/14842/stars/4.png"
 726Kb (< 0.1%) 4,649 4,649"http://static.criteo.net/images/produits/14842/stars/0.png"
 612Kb (< 0.1%) 3,921 3,921"http://static.criteo.net/images/produits/14842/stars/3.png"
 396Kb (< 0.1%) 2,824 2,824"https://www.viator.com/images/stars/orange/16_5.gif"
 124Kb (< 0.1%) 842 842"https://www.viator.com/images/stars/orange/16_4_5.gif"
 95Kb (< 0.1%) 2,033 2,033"root"
 93Kb (< 0.1%) 666 666"https://www.viator.com/images/stars/orange/16_4.gif"
 85Kb (< 0.1%) 1,001 1,001"transient_lastDdlTime"
 85Kb (< 0.1%) 1,000 1,000"COLUMN_STATS_ACCURATE"
 78Kb (< 0.1%) 1,003 1,003"serialization.format"
 78Kb (< 0.1%) 1,002 1,002"reallyrecommendable"
 78Kb (< 0.1%) 1,002 1,002"additionalimagelinks"
 78Kb (< 0.1%) 1,001 1,001"last_modified_time"
 70Kb (< 0.1%) 1,002 1,002"recommendable"
 70Kb (< 0.1%) 1,001 1,001"last_modified_by"
 63Kb (< 0.1%) 1,014 1,014"description"
 62Kb (< 0.1%) 1,003 1,003"partnerid"
 62Kb (< 0.1%) 1,002 1,002"producturl"
 62Kb (< 0.1%) 1,002 1,002"smallimage"



Reference Chains for Duplicate Strings

Expensive data fields

2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings  String value 
 4,380"http://static.criteo.net/images/produits/14842/stars/5.png"
 2,641"http://static.criteo.net/images/produits/14842/stars/4.png"
 2,375"http://static.criteo.net/images/produits/14842/stars/0.png"
 1,971"http://static.criteo.net/images/produits/14842/stars/3.png"
 1,412"https://www.viator.com/images/stars/orange/16_5.gif"
 421"https://www.viator.com/images/stars/orange/16_4_5.gif"
 333"https://www.viator.com/images/stars/orange/16_4.gif"
 177"http://static.criteo.net/images/produits/14842/stars/2.png"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o
2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings  String value 
 4,504"http://static.criteo.net/images/produits/14842/stars/5.png"
 2,640"http://static.criteo.net/images/produits/14842/stars/4.png"
 2,274"http://static.criteo.net/images/produits/14842/stars/0.png"
 1,950"http://static.criteo.net/images/produits/14842/stars/3.png"
 1,412"https://www.viator.com/images/stars/orange/16_5.gif"
 421"https://www.viator.com/images/stars/orange/16_4_5.gif"
 333"https://www.viator.com/images/stars/orange/16_4.gif"
 176"http://static.criteo.net/images/produits/14842/stars/2.png"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o
1,365Kb (< 0.1%), 23,002 / 100% dup strings (25 unique), 23,002 dup backing arrays:

  Num strings  String value 
 1,000"price"
 1,000"retailprice"
 1,000"name"
 1,000"additionalimagelinks"
 1,000"componentid"
 1,000"instock"
 1,000"extra"
 1,000"discount"
 1,000"category"
 1,000"hashcode"
... and 5,000 more strings, of which 15 are unique

String[] j.u.Arrays$ArrayList.a


Full reference chains

2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings  String value 
 4,504"http://static.criteo.net/images/produits/14842/stars/5.png"
 2,640"http://static.criteo.net/images/produits/14842/stars/4.png"
 2,274"http://static.criteo.net/images/produits/14842/stars/0.png"
 1,950"http://static.criteo.net/images/produits/14842/stars/3.png"
 1,412"https://www.viator.com/images/stars/orange/16_5.gif"
 421"https://www.viator.com/images/stars/orange/16_4_5.gif"
 333"https://www.viator.com/images/stars/orange/16_4.gif"
 176"http://static.criteo.net/images/produits/14842/stars/2.png"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings  String value 
 4,380"http://static.criteo.net/images/produits/14842/stars/5.png"
 2,641"http://static.criteo.net/images/produits/14842/stars/4.png"
 2,375"http://static.criteo.net/images/produits/14842/stars/0.png"
 1,971"http://static.criteo.net/images/produits/14842/stars/3.png"
 1,412"https://www.viator.com/images/stars/orange/16_5.gif"
 421"https://www.viator.com/images/stars/orange/16_4_5.gif"
 333"https://www.viator.com/images/stars/orange/16_4.gif"
 177"http://static.criteo.net/images/produits/14842/stars/2.png"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
1,364Kb (< 0.1%), 22,977 / 100% dup strings (23 unique), 22,977 dup backing arrays:

  Num strings  String value 
 999"price"
 999"name"
 999"retailprice"
 999"hashcode"
 999"additionalimagelinks"
 999"sqlid"
 999"instock"
 999"createddate"
 999"discount"
 999"extra"
... and 2,997 more strings, of which 13 are unique

String[] j.u.Arrays$ArrayList.a
org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.columnNames
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
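
Most of the duplicate values above are a handful of star-rating image URLs held in the min/max aggregation buffers (MinAgg.o / MaxAgg.o). One mitigation, sketched below with hypothetical names (StringCanonicalizer is not a Hive class), is to canonicalize such frequently repeated values before storing them, so that thousands of equal strings share one instance:

  import java.util.concurrent.ConcurrentHashMap;
  import java.util.concurrent.ConcurrentMap;

  class StringCanonicalizer {
      private final ConcurrentMap<String, String> pool = new ConcurrentHashMap<>();

      // Returns a shared instance equal to s; repeated values map to the same object.
      String canonicalize(String s) {
          if (s == null) return null;
          String existing = pool.putIfAbsent(s, s);
          return existing != null ? existing : s;
      }
  }

Alternatively, on a JDK 8u20+ JVM running the G1 collector, -XX:+UseStringDeduplication deduplicates the backing char[] arrays at GC time. It does not remove the String object headers themselves, but since the duplicated values here are fairly long URLs, the char[] arrays account for most of the 7,397Kb overhead reported above.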





9. Bad Collections:  overhead 15.0%  What's this?

  Total collections   Bad collections  Overhead 
 7,809,571 3,905,512244,518Kb (15.0%)

What can I do to fix Bad Collections?


The overhead (waste) of all 3,905,512 bad collections is 244,518Kb, or 15.0% of used heap.
That's how much memory you could save if:
- There were no empty collections

All of the collections referenced by the data structures below are empty.
They waste 243,791Kb, or 15.0% of used heap. Does your app really use/need them?
  org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects : reference 2,925,498 of bad j.u.HashSet , waste 182,843Kb, or 11.2% of used heap
  org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects : reference 975,166 of bad j.u.HashSet , waste 60,947Kb, or 3.7% of used heap

How to fix?

All of the collections above are empty.
This means that they were initialized but are likely never used.
Every initialized collection consumes some memory even when it is empty.
For example, you may have a class such as:
  class Foo<K, V> {
    Map<K, V> map = new HashMap<>(); // Allocated eagerly with default capacity 16
    void addToMap(K key, V val) { map.put(key, val); }
    ...
  }

but addToMap() is never called.

So, if the collections above are not used at all, the respective code is dead and should be removed.
Otherwise, such collections should be initialized lazily in method(s) that access them, e.g.
  void addToMap(K key, V val) {
    if (map == null) map = new HashMap<>();
    map.put(key, val);
  }
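
Applied to the specific fields flagged above, the same pattern might look roughly like this (a hedged sketch with a simplified buffer class, not the actual Hive GenericUDAFCount implementation):

  import java.util.HashSet;
  import java.util.Set;

  class CountAggBuffer {
      long count;
      private Set<Object> uniqueObjects;        // stays null for plain COUNT(*)

      void add() {
          count++;                              // non-DISTINCT path never allocates the set
      }

      void addDistinct(Object value) {
          if (uniqueObjects == null) {
              uniqueObjects = new HashSet<>();  // allocated only when DISTINCT is actually used
          }
          if (uniqueObjects.add(value)) {
              count++;
          }
      }
  }

With roughly 2.9 million CountAgg buffers and 1 million SumLongAgg buffers in this dump, skipping the eager HashSet allocation would avoid essentially all of the empty-collection overhead shown in the tables below.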




For detailed information about bad collections in this dump, check the tables below.
To learn more about fixing bad collections, check this article



Top bad collections:

  Overhead   Problem   # objects  Type 
 243,793Kb (15.0%) empty 3,900,687 / 99% j.u.HashSet


Reference Chains for Bad Collections

Expensive data fields

182,843Kb (11.2%): j.u.HashSet: 2,925,498 / 100% of empty 182,843Kb (11.2%)
 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects
60,947Kb (3.7%): j.u.HashSet: 975,166 / 100% of empty 60,947Kb (3.7%)
 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects


Full reference chains

182,843Kb (11.2%): j.u.HashSet: 2,925,492 / 100% of empty 182,843Kb (11.2%)
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
60,947Kb (3.7%): j.u.HashSet: 975,164 / 100% of empty 60,947Kb (3.7%)
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
171Kb (< 0.1%): j.u.LinkedHashMap: 1,000 / 100% of 1-elem 171Kb (< 0.1%)

 Random sample of non-empty containers 
j.u.LinkedHashMap<org.apache.hadoop.hive.ql.exec.TableScanOperator, org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=813}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=794}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=895}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=958}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=230}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=859}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=356}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=713}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=477}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=361}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=636}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=485}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=986}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=674}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=69}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=364}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=594}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=887}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=649}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=964}", ...))}

{j.u.HashMap}.values org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap





10. Bad Object Arrays:  no significant overhead  What's this?

  Total object arrays   Bad object arrays  Overhead 
 999,807 7,916268Kb (< 0.1%)


Reference Chains for Bad Object Arrays

Full reference chains

77Kb (< 0.1%): Object[]: 815 / 12% of empty 36Kb (< 0.1%), 140 / 2% of sparse 18Kb (< 0.1%), 565 / 8% of 1-length 13Kb (< 0.1%), 190 / 2% of 1-elem 9Kb (< 0.1%)

 Random sample of non-empty containers 
Object[50](element class String){"X509", ".SF", ".DSA", ".RSA", ".EC", null, null, null, "1.0", null, null, null, "./", "/", null, null, "META-INF/MANIFEST.MF", null, null, null, ...}
Object[41](element class String){null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ...}
Object[55](element class String){"webhdfs", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ...}
Object[17](element class String){null, null, null, null, null, "log4j.defaultInitOverride", null, "log4j.configuration", "log4j.configuratorClass", null, null, "Using URL [", "] for automatic log4j configuration.", null, null, null, null}
Object[15](element class String){null, null, null, null, null, null, null, null, null, null, "JobToken", "MapReduceShuffleToken", "MapReduceEncryptedSpillKey", null, null}
Object[94](element class String){null, null, null, null, null, null, null, null, null, "registerMBean", null, null, null, null, null, null, null, null, null, null, ...}
Object[7](element class String){null, null, null, null, null, "com.sun.proxy.", "$Proxy"}
Object[23](element class String){"hdfs", null, null, null, null, null, null, null, null, null, "dfs.client.retry.policy.enabled", "dfs.client.retry.policy.spec", "10000,6,60000,10", "dfs.client.failover.proxy.provider.", null, null, "Interface %s is not a NameNode protocol", null, null, null, ...}
Object[47](element class String){null, null, null, "serialization.format", null, null, null, null, "serialization.lib", null, null, null, null, null, null, null, null, null, null, null, ...}
Object[40](element class String){null, null, "IPC Client (", ") connection to ", " from ", null, null, null, null, null, "javax.security.sasl.qop", "Negotiated QOP is :", null, null, null, null, null, null, null, null, ...}
Object[34](element class String){null, null, null, null, null, null, null, null, null, null, "Get token info proto:", " info:", null, null, null, null, null, null, null, null, ...}
Object[51](element class String){null, null, null, null, null, "mapreduce.task.id", "mapreduce.task.attempt.id", "mapreduce.task.ismap", "mapreduce.task.partition", "mapreduce.job.id", null, "mapreduce.job.process-tree.class", "JVM_PID", " Using ResourceCalculatorProcessTree : ", "mapreduce.task.max.status.length", null, null, null, null, null, ...}
Object[14](element class String){null, null, null, null, null, null, null, null, null, null, "mapreduce.Counters", "org.apache.hadoop.mapred.Task$Counter", "org.apache.hadoop.mapred.JobInProgress$Counter", "FileSystemCounters"}
Object[21](element class String){"org.apache.hadoop.mapred.JobConf", "org.apache.hadoop.mapred.JobConfigurable", "configure", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class String){null, null, null, null, null, null, null, null, null, null, null, ": adding tracer ", null, "TracerPool(", ")", "Global"}
Object[31](element class String){"not a method or field: ", "not invocable, no method type", null, null, null, null, null, null, null, null, null, null, null, null, "<init>", null, null, null, null, null, ...}
Object[7](element class String){null, null, null, "yes", "no", null, null}
Object[7](element class String){null, null, null, "xml", "&", null, null}
Object[33](element class String){".", "hadoop-metrics2-", ".properties", "hadoop-metrics2.properties", "loaded properties from ", "Cannot locate configuration", null, null, null, null, "*.", null, null, null, null, null, null, "([^.*]+)\..+", null, null, ...}
Object[25](element class String){"hftp", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ...}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1](element class String){"+"}
Object[1]{null}
Object[1]{null}
Object[1](element class String){"java.util.Arrays.useLegacyMergeSort"}
Object[1]{null}
Object[1](element class String){"metric info"}
Object[1](element class String){"java.lang:type=ClassLoading"}
Object[1]{null}
Object[1]{null}
Object[1](element class String){""}
Object[1](element class String){"auth.policy.provider"}
Object[1]{null}
Object[1]{null}
Object[1](element class String){"attachment"}
Object[1](element class String){"Detect premature EOF"}
Object[1]{null}
Object[40](element class String){"har", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ...}
Object[2](element class String){"DFSOpsCountStatistics", null}
Object[13](element class String){null, null, null, "org.apache.xerces.impl.dv.ObjectFactory", null, null, null, null, null, null, null, null, null}
Object[3](element class java.lang.invoke.SimpleMethodHandle){(customizationCount : 0, ...), null, null}
Object[3](element class String){null, null, "INSTANCE"}
Object[4](element class String){null, null, null, ""}
Object[4](element class String){null, "/", null, null}
Object[2](element class String){"IPC Parameter Sending Thread #%d", null}
Object[30](element class String){null, null, null, "X.509", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ...}
Object[4](element class String){"Weights must be non-negative", null, null, null}
Object[6](element class String){null, "attempt", null, null, null, null}
Object[5](element class String){"job", null, null, null, null}
Object[6](element class String){"jvm", null, null, null, null, null}
Object[13](element class String){null, null, null, null, null, null, "-07-", null, null, null, null, null, null}
Object[2](element class String){"io.file.buffer.size", null}
Object[12](element class String){null, null, null, null, null, null, "/", null, null, null, null, null}
Object[17](element class String){null, null, null, null, null, null, null, null, null, null, "modifyThread", null, null, null, null, null, null}
Object[2](element class String){"ltrim", null}
Object[3](element class String){"ISO-2022-KR", null, null}
Object[12](element class String){null, null, null, null, null, null, null, null, null, null, null, "Null key for a Map not allowed in JSON (use a converting NullKeySerializer?)"}

↖Unreachable
  All or some objects may start live as:

23Kb (< 0.1%): Object[]: 999 / 99% of 1-length 23Kb (< 0.1%)

 Random sample of non-empty containers 
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 933)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 38)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 153)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 825)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 248)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 370)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 425)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 638)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 901)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 203)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 335)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 517)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 931)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 318)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 124)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 422)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 665)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 775)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 753)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 456)}

Object[] org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
23Kb (< 0.1%): Object[]: 998 / 99% of 1-elem 23Kb (< 0.1%)

 Random sample of non-empty containers 
Object[2](element class Object[]){null, Object[](1)@c0e1add8}
Object[2](element class Object[]){null, Object[](1)@c0bc0e70}
Object[2](element class Object[]){null, Object[](1)@c119bce0}
Object[2](element class Object[]){null, Object[](1)@c0b31c38}
Object[2](element class Object[]){null, Object[](1)@c0d3a4f8}
Object[2](element class Object[]){null, Object[](1)@c10511c8}
Object[2](element class Object[]){null, Object[](1)@c0814c20}
Object[2](element class Object[]){null, Object[](1)@c09954d0}
Object[2](element class Object[]){null, Object[](1)@c0fa1940}
Object[2](element class Object[]){null, Object[](1)@c07b4990}
Object[2](element class Object[]){null, Object[](1)@c10c4860}
Object[2](element class Object[]){null, Object[](1)@c0cb6938}
Object[2](element class Object[]){null, Object[](1)@c0f15ef8}
Object[2](element class Object[]){null, Object[](1)@c10a85a8}
Object[2](element class Object[]){null, Object[](1)@c1186d30}
Object[2](element class Object[]){null, Object[](1)@c07f52c8}
Object[2](element class Object[]){null, Object[](1)@c089eb00}
Object[2](element class Object[]){null, Object[](1)@c0fb3ce0}
Object[2](element class Object[]){null, Object[](1)@c0f723d0}
Object[2](element class Object[]){null, Object[](1)@c0b279b0}

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart {j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
15Kb (< 0.1%): Object[]: 194 / 99% of 1-elem 15Kb (< 0.1%), 1 / 0% of empty 80b (< 0.1%)

 Random sample of non-empty containers 
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){null, null, (off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){null, (off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){null, (off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}

j.u.ArrayDeque.elements java.util.jar.JarFile.inflaterCache
sun.misc.URLClassPath$JarLoader.jar
{j.u.ArrayList}
sun.misc.URLClassPath.loaders
sun.misc.Launcher$AppClassLoader.ucp
↖Java Static org.apache.hive.com.esotericsoftware.reflectasm.AccessClassLoader.selfContextParentClassLoader
7Kb (< 0.1%): Object[]: 88 / 96% of 1-elem 6Kb (< 0.1%), 3 / 3% of empty 240b (< 0.1%)

 Random sample of non-empty containers 
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){null, null, null, null, null, null, null, (off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){null, null, null, null, null, null, null, (off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){(off : 0, len : 0, finished : false, ...), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null}
Object[16](element class java.util.zip.Inflater){null, null, null, null, null, null, null, null, null, null, null, (off : 0, len : 0, finished : false, ...), null, null, null, null}

j.u.ArrayDeque.elements java.util.jar.JarFile.inflaterCache
sun.misc.URLClassPath$JarLoader.jar
{j.u.HashMap}.values
sun.misc.URLClassPath.lmap
sun.misc.Launcher$AppClassLoader.ucp
↖Java Static org.apache.hive.com.esotericsoftware.reflectasm.AccessClassLoader.selfContextParentClassLoader


23Kb (< 0.1%): Object[]: 999 / 99% of 1-length 23Kb (< 0.1%)

 Random sample of non-empty containers 
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 933)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 38)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 153)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 825)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 248)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 370)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 425)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 638)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 901)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 203)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 335)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 517)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 931)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 318)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 124)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 422)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 665)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 775)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 753)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 456)}

Object[] org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
23Kb (< 0.1%): Object[]: 998 / 99% of 1-elem 23Kb (< 0.1%)

 Random sample of non-empty containers 
Object[2](element class Object[]){null, Object[](1)@c0e1add8}
Object[2](element class Object[]){null, Object[](1)@c0bc0e70}
Object[2](element class Object[]){null, Object[](1)@c119bce0}
Object[2](element class Object[]){null, Object[](1)@c0b31c38}
Object[2](element class Object[]){null, Object[](1)@c0d3a4f8}
Object[2](element class Object[]){null, Object[](1)@c10511c8}
Object[2](element class Object[]){null, Object[](1)@c0814c20}
Object[2](element class Object[]){null, Object[](1)@c09954d0}
Object[2](element class Object[]){null, Object[](1)@c0fa1940}
Object[2](element class Object[]){null, Object[](1)@c07b4990}
Object[2](element class Object[]){null, Object[](1)@c10c4860}
Object[2](element class Object[]){null, Object[](1)@c0cb6938}
Object[2](element class Object[]){null, Object[](1)@c0f15ef8}
Object[2](element class Object[]){null, Object[](1)@c10a85a8}
Object[2](element class Object[]){null, Object[](1)@c1186d30}
Object[2](element class Object[]){null, Object[](1)@c07f52c8}
Object[2](element class Object[]){null, Object[](1)@c089eb00}
Object[2](element class Object[]){null, Object[](1)@c0fb3ce0}
Object[2](element class Object[]){null, Object[](1)@c0f723d0}
Object[2](element class Object[]){null, Object[](1)@c0b279b0}

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart {j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
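
Both 1-element Object[] groups above are reached only through the static root org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap, the leak candidate flagged in Section 4. The sketch below, which uses hypothetical stand-in classes rather than Hive's, illustrates the retention pattern being reported: a process-wide map keyed by plan path keeps each cached plan, and everything reachable from it, alive until something explicitly removes the entry.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class StaticPlanCache {
        // Hypothetical stand-in for a cached MapWork plan and its operator tree.
        static final class PlanLike { final Object[] operatorTree = new Object[1024]; }

        // Analogue of Utilities.gWorkMap: a JVM-wide map keyed by plan path.
        static final Map<String, PlanLike> WORK_MAP = new ConcurrentHashMap<>();

        static void runQuery(String planPath) {
            WORK_MAP.put(planPath, new PlanLike()); // cached for reuse while the task runs
            // ... execute the plan ...
            // Unless the entry is removed once the work is done, the plan and the
            // whole object graph hanging off it stay reachable for the life of the JVM.
        }

        public static void main(String[] args) {
            for (int i = 0; i < 1000; i++) runQuery("plan-" + i);
            System.out.println("plans still retained: " + WORK_MAP.size()); // 1000
        }
    }
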





11. Bad Primitive Arrays:  overhead 64.6%  What's this?

  Total primitive arrays   Bad primitive arrays   Overhead 
 601,294   7,267   1,052,826Kb (64.6%)

Top bad primitive arrays:

  Overhead   Problem   # objects   Type 
 1,047,628Kb (64.3%)   empty   1,076 / 0%   byte[]
 4,530Kb (0.3%)   trail-0s   186 / 0%   byte[]


Reference Chains for Bad Primitive Arrays

Expensive data fields

1,047,552Kb (64.3%): byte[]: 1 / 100% of empty 1,047,552Kb (64.3%)
 ↖org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer
3,513Kb (0.2%): byte[]: 9 / 81% of trail-0s 3,513Kb (0.2%)

 Random sample of non-empty containers 
byte[88284]{'1', '0', '2', '0', '0', '0', '\', 'N', '\', 'N', '1', '6', '1', '0', '0', '0', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '2', '8', '5', '0', '0', '0', ...}
byte[1114812]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}
byte[86628]{'9', '7', '0', '0', '0', '\', 'N', '\', 'N', '1', '5', '4', '0', '0', '0', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '2', '7', '1', '0', '0', '0', '\', ...}
byte[186840]{'1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', ...}
byte[1343412]{'h', 't', 't', 'p', 's', ':', '/', '/', 'i', 'd', '-', 'l', 'i', 'v', 'e', '-', '0', '1', '.', 's', 'l', 'a', 't', 'i', 'c', '.', 'n', 'e', 't', '/', ...}
byte[2753314]{'@', 'K', 's', 'a', 'm', 's', 'u', 'n', 'g', '_', 'i', 'd', 's', '@', 'V', '0', '@', 'V', '@', 'K', 's', 'e', 'l', 'l', 'e', 'r', '_', 'i', 'd', '@', ...}

 ↖org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data


Full reference chains

1,047,552Kb (64.3%): byte[]: 1 / 100% of empty 1,047,552Kb (64.3%)
org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.this$0
j.l.Thread[]
j.l.ThreadGroup.threads
java.beans.WeakIdentityMap$Entry.referent
java.beans.WeakIdentityMap$Entry[]
java.beans.ThreadGroupContext$1.table
↖Java Static java.beans.ThreadGroupContext.contexts
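
The single huge "empty" byte[] above is the map task's in-memory sort buffer (MapTask$MapOutputBuffer.kvbuffer). It is allocated in one piece when the task starts, sized from mapreduce.task.io.sort.mb, and stays all-zero until map output is actually serialized into it, which is why it is classified as empty; 1,047,552Kb corresponds to 1,023 MB. A minimal sketch of that sizing, assuming the standard Hadoop 2.x property name and treating the 1,023 MB value as inferred from the dump:

    import org.apache.hadoop.conf.Configuration;

    public class SortBufferSizing {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // Assumed setting; 1,047,552Kb in the dump corresponds to ~1023 MB.
            int sortMb = conf.getInt("mapreduce.task.io.sort.mb", 1023);
            // The collector allocates the whole buffer up front; until records are
            // written into it, every byte is 0 -- exactly what the report counts
            // as an "empty" byte[].
            byte[] kvbuffer = new byte[sortMb << 20];
            System.out.println("kvbuffer = " + (kvbuffer.length >> 10) + " Kb");
        }
    }
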
1,930Kb (0.1%): byte[]: 6 / 85% of trail-0s 1,930Kb (0.1%)

 Random sample of non-empty containers 
byte[88284]{'1', '0', '2', '0', '0', '0', '\', 'N', '\', 'N', '1', '6', '1', '0', '0', '0', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '2', '8', '5', '0', '0', '0', ...}
byte[74736]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}
byte[86628]{'9', '7', '0', '0', '0', '\', 'N', '\', 'N', '1', '5', '4', '0', '0', '0', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '2', '7', '1', '0', '0', '0', '\', ...}
byte[186840]{'1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', '1', '5', '5', '9', '8', ...}
byte[1343412]{'h', 't', 't', 'p', 's', ':', '/', '/', 'i', 'd', '-', 'l', 'i', 'v', 'e', '-', '0', '1', '.', 's', 'l', 'a', 't', 'i', 'c', '.', 'n', 'e', 't', '/', ...}
byte[2753314]{'@', 'K', 's', 'a', 'm', 's', 'u', 'n', 'g', '_', 'i', 'd', 's', '@', 'V', '0', '@', 'V', '@', 'K', 's', 'e', 'l', 'l', 'e', 'r', '_', 'i', 'd', '@', ...}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
Object[]
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
1,582Kb (< 0.1%): byte[]: 3 / 75% of trail-0s 1,582Kb (< 0.1%)

 Random sample of non-empty containers 
byte[115190]{'2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', '2', '3', '7', '0', '4', ...}
byte[1114812]{'\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', '\', 'N', ...}
byte[2344694]{'h', 't', 't', 'p', ':', '/', '/', 'i', '1', '.', 's', 't', 'a', 't', 'i', 'c', '-', 's', 'h', 'o', 'p', 'c', 'a', 'd', 'e', '.', 'c', 'o', 'm', '/', ...}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
Object[]
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.rowObject
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator[]
org.apache.hadoop.hive.ql.exec.SelectOperator.eval
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap





12. Boxed Numbers:  no significant overhead  What's this?

  Total boxed objects   Overhead 
 1,306   23Kb (< 0.1%)


Reference Chains for Boxed Numbers

Full reference chains

4Kb (< 0.1%): j.l.Byte: 253 objects

 Random sample 
j.l.Byte(82)
(-68)
(-117)
(-112)
(29)
(50)
(88)
(32)
(7)
(-52)
(35)
(-50)
(-49)
(28)
(59)
(-26)
(-8)
(62)
(-22)
(-42)

j.l.Byte[] ↖Java Static java.lang.Byte$ByteCache.cache
4Kb (< 0.1%): j.l.Short: 255 objects

 Random sample 
j.l.Short(-63)
(-108)
(-125)
(-126)
(30)
(51)
(-57)
(33)
(3)
(101)
(36)
(103)
(104)
(85)
(60)
(127)
(-10)
(63)
(-111)
(111)

j.l.Short[] ↖Java Static java.lang.Short$ShortCache.cache
3Kb (< 0.1%): j.l.Long: 178 objects

 Random sample 
j.l.Long(92)
(-18)
(-49)
(-68)
(-47)
(-14)
(99)
(-24)
(-100)
(-42)
(-41)
(-40)
(-39)
(-58)
(-5)
(85)
(-112)
(-2)
(-21)
(-32)

j.l.Long[] ↖Java Static java.lang.Long$LongCache.cache
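
All three groups above are the JDK's own static valueOf() caches rather than application allocations: Byte.valueOf() always returns one of 256 cached instances, and Short/Long.valueOf() cache the range -128..127, which is why each group holds roughly 250 objects and the total overhead is negligible. A quick demonstration of the caching behavior being counted here:

    public class BoxedCaches {
        public static void main(String[] args) {
            // Byte.valueOf always returns one of the 256 cached instances.
            System.out.println(Byte.valueOf((byte) 100) == Byte.valueOf((byte) 100)); // true
            // Short and Long only cache -128..127.
            System.out.println(Long.valueOf(100L) == Long.valueOf(100L));   // true (cached)
            System.out.println(Long.valueOf(1000L) == Long.valueOf(1000L)); // typically false (not cached)
        }
    }
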





13. Duplicate Objects:  overhead 3.7%  What's this?

  Total objects of types with duplicates   Unique objects   Duplicate values   Overhead 
 3,900,664   27,422   2   60,519Kb (3.7%)

Types of duplicate objects:

  Overhead   # objects   Unique objects   Class name 
 30,259Kb (1.9%)   1,950,332   13,711   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
 30,259Kb (1.9%)   1,950,332   13,711   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg


Top duplicate objects

  Overhead   # objects   Value 
 30,259Kb (1.9%)   1,936,622   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg(o : null)
 30,259Kb (1.9%)   1,936,622   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg(o : null)



Reference Chains for Duplicate Objects

Expensive data fields

30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618   (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618   (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618   (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618   (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
48b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 4 / 25% dup objects (1 unique)

  Num objects  Object value 
 4   (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
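
These duplicates are a by-product of hash-mode GROUP BY: GroupByOperator.hashAggregations keeps one tiny aggregation buffer per distinct grouping key per aggregate function, and every buffer whose field is still null is byte-for-byte identical to the others. A back-of-the-envelope check of the reported 30,259Kb, using a stand-in class rather than Hive's MaxAgg/MinAgg and assuming 16 bytes per instance (12-byte header padded plus one 4-byte compressed reference, matching the pointer and header sizes in Section 1):

    public class AggBufferOverhead {
        // Stand-in for GenericUDAFMaxEvaluator.MaxAgg / MinAgg: a single object field.
        static final class AggBufferLike { Object o; }

        public static void main(String[] args) {
            long buffersPerAggregate = 1_936_618L; // keys whose buffer is still (o : null)
            long bytesPerBuffer = 16;              // assumed instance size, see above
            System.out.printf("~%,d Kb per aggregate%n",
                    buffersPerAggregate * bytesPerBuffer / 1024); // ~30,259 Kb, matching the table
        }
    }
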





14. Duplicate Primitive Arrays:  no significant overhead  What's this?

  Total primitive arrays   Unique arrays   Duplicate values   Overhead 
 498,565   489,682   264   542Kb (< 0.1%)

Types of duplicate objects:

  Overhead   # objects   Unique objects   Class name 
 271Kb (< 0.1%)   1,086   40   boolean[]
 130Kb (< 0.1%)   492,169   488,472   byte[]
 113Kb (< 0.1%)   5,062   1,005   int[]
 26Kb (< 0.1%)   102,937   140   char[]
 592b (< 0.1%)   11   2   short[]
 32b (< 0.1%)   4   2   float[]
 32b (< 0.1%)   17   15   long[]
 32b (< 0.1%)   8   6   double[]
 0b (0.0%)   102,937   0   char[]
 0b (0.0%)   1,086   0   boolean[]
 0b (0.0%)   492,169   0   byte[]
 0b (0.0%)   11   0   short[]
 0b (0.0%)   5,062   0   int[]
 0b (0.0%)   4   0   float[]
 0b (0.0%)   17   0   long[]
 0b (0.0%)   8   0   double[]


Top duplicate arrays

  Overhead   # objects   Value 
 265Kb (< 0.1%)   1,001   boolean[256]{false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, ...}
 47Kb (< 0.1%)   1,006   byte[32]{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...}
 46Kb (< 0.1%)   1,000   int[7]{13, 14, 18, 12, 0, 19, 6}
 33Kb (< 0.1%)   2,158   int[0]{}
 23Kb (< 0.1%)   1,001   byte[2]{'\', 'N'}
 23Kb (< 0.1%)   1,001   byte[8]{1, 2, 3, 4, 5, 6, 7, 8}
 16Kb (< 0.1%)   3   byte[8192]{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...}
 16Kb (< 0.1%)   2   char[8192]{, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ...}
 7Kb (< 0.1%)   309   byte[1]{'?'}
 5Kb (< 0.1%)   152   int[6]{67, 82, 73, 84, 69, 79}
 4Kb (< 0.1%)   10   char[256]{, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, �, ...}
 4Kb (< 0.1%)   136   int[4]{80, 82, 79, 68}
 2Kb (< 0.1%)   27   int[17]{1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1}
 2Kb (< 0.1%)   72   int[3]{72, 80, 67}
 1Kb (< 0.1%)   76   int[2]{65, 68}
 1Kb (< 0.1%)   7   boolean[256]{false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, ...}
 1Kb (< 0.1%)   68   byte[4]{-2, -54, 0, 0}
 1Kb (< 0.1%)   4   char[256]{, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ...}
 1Kb (< 0.1%)   61   byte[6]{0, 1, 0, 'J', 0, 0}
 1Kb (< 0.1%)   26   char[19]{, , , , , , , , , , , , , , , , , , }



Reference Chains for Duplicate Primitive Arrays

Full reference chains

265Kb (< 0.1%): boolean[]: 999 / 100% dup arrays (1 unique)

  Num objects  Object value 
 999   boolean[256]{false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, ...}

org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.needsEscape org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
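
The 999 identical boolean[256] arrays are per-instance escape tables: each ColumnarSerDe's LazySerDeParameters holds its own needsEscape array, and every copy ends up identical. The overhead is small here, but the pattern and the usual remedy (sharing one canonical, never-mutated copy) look roughly like the hypothetical sketch below, which does not use Hive's actual classes:

    public class EscapeTableSharing {
        // One canonical table, built once and shared read-only.
        private static final boolean[] DEFAULT_NEEDS_ESCAPE = buildTable();

        static boolean[] buildTable() {
            boolean[] t = new boolean[256];
            t['\n'] = true; // example entries; real contents come from the serde properties
            t['\r'] = true;
            return t;
        }

        // Per-instance copy: 999 serde instances -> 999 identical 256-entry arrays.
        static final class CopyingParams { final boolean[] needsEscape = buildTable(); }

        // Shared canonical instance: 999 serde instances -> 1 array.
        static final class SharingParams { final boolean[] needsEscape = DEFAULT_NEEDS_ESCAPE; }

        public static void main(String[] args) {
            System.out.println(new SharingParams().needsEscape == new SharingParams().needsEscape); // true
        }
    }
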
46Kb (< 0.1%): byte[]: 999 / 100% dup arrays (1 unique)

  Num objects  Object value 
 999   byte[32]{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...}

org.apache.hadoop.hive.serde2.ByteStream$Output.buf org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serializeStream
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
46Kb (< 0.1%): int[]: 998 / 100% dup arrays (1 unique)

  Num objects  Object value 
 998   int[7]{13, 14, 18, 12, 0, 19, 6}

org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.prjColIDs org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap





15. Duplicate Object Arrays:  no significant overhead  What's this?



16. Duplicate Lists:  no significant overhead  What's this?



17. WeakHashMaps with hard references from values to keys:  no significant overhead  What's this?



18. Off-heap memory used by java.nio.DirectByteBuffers:  no significant overhead  What's this?

  # objects   Total capacity   Total size 
 18   529Kb (< 0.1%)   529Kb (< 0.1%)


Reference Chains for java.nio.DirectByteBuffers

Full reference chains

393Kb (< 0.1%): java.nio.DirectByteBuffer: 7 objects

 Random sample 
(mark : -1, position : 12839, limit : 65536, capacity : 65536, address : 140051011641680, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : sun.misc.Cleaner@c053cf10)
(mark : -1, position : 17866, limit : 17866, capacity : 65536, address : 140051017466848, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @c053cee8)
(mark : -1, position : 764, limit : 65536, capacity : 65536, address : 140051008614592, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @c053cec0)
(mark : -1, position : 64146, limit : 64146, capacity : 65536, address : 140051016144560, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @c053ce98)
(mark : -1, position : 64, limit : 64, capacity : 8192, address : 140050564898576, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @e21b39b8)
(mark : -1, position : 0, limit : 131231, capacity : 131231, address : 140051009967360, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @e3703988)
(mark : -1, position : 874, limit : 874, capacity : 874, address : 140050982019408, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @e3726010)

sun.misc.Cleaner.referent sun.misc.Cleaner.{prev}
java.nio.DirectByteBuffer.cleaner
java.nio.DirectByteBufferR.att
↖Java Static org.apache.hadoop.hdfs.DFSInputStream.EMPTY_BUFFER
128Kb (< 0.1%): java.nio.DirectByteBuffer: 1 object

 Random sample 
(mark : -1, position : 3512, limit : 131072, capacity : 131072, address : 140051009967519, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : java.nio.DirectByteBuffer@e3703928, cleaner : null)

org.apache.hadoop.hdfs.RemoteBlockReader2.curDataSlice org.apache.hadoop.hdfs.DFSInputStream.blockReader
org.apache.hadoop.hdfs.client.HdfsDataInputStream.in
org.apache.hadoop.hive.ql.io.RCFile$Reader.in
org.apache.hadoop.hive.ql.io.RCFileRecordReader.in
org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.recordReader
org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.curReader
org.apache.hadoop.mapred.MapTask$TrackedRecordReader.rawIn
↖Java Local(org.apache.hadoop.mapred.MapTask$TrackedRecordReader)

8Kb (< 0.1%): java.nio.DirectByteBuffer: 1 object

 Random sample 
(mark : -1, position : 16, limit : 16, capacity : 8192, address : 140050990345872, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : sun.misc.Cleaner@8006d9a8)

sun.misc.Cleaner.referent sun.misc.Cleaner.next
java.nio.DirectByteBuffer.cleaner
java.nio.DirectByteBufferR.att
↖Java Static org.apache.hadoop.hdfs.DFSInputStream.EMPTY_BUFFER
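
The buffers counted here are only the small on-heap java.nio.DirectByteBuffer wrapper objects; the 529Kb of capacity they describe lives in native memory outside the Java heap and is released by the owning buffer's sun.misc.Cleaner once the wrapper becomes unreachable. A minimal illustration of where that memory sits (the native portion can be capped with -XX:MaxDirectMemorySize):

    import java.nio.ByteBuffer;

    public class DirectBufferDemo {
        public static void main(String[] args) {
            // allocateDirect reserves 'capacity' bytes of native, off-heap memory;
            // only the small wrapper object counted in this section is on the heap.
            ByteBuffer buf = ByteBuffer.allocateDirect(64 * 1024);
            System.out.println("capacity=" + buf.capacity() + " direct=" + buf.isDirect());
        }
    }
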





19. Heap Size Configuration:  no significant overhead  What's this?
We estimate that the maximum heap size (-Xmx) is configured properly for your current working set = 1,628,922Kb
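
For reference, the heap ceiling this estimate is checked against is normally set per map task through the standard Hadoop 2.x properties sketched below; the values shown are hypothetical, chosen only to leave headroom above the ~1.6GB working set reported here:

    import org.apache.hadoop.conf.Configuration;

    public class TaskHeapConfig {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            conf.set("mapreduce.map.java.opts", "-Xmx2048m"); // JVM heap for each map task
            conf.setInt("mapreduce.map.memory.mb", 2560);     // YARN container size; keep it above -Xmx
            System.out.println(conf.get("mapreduce.map.java.opts"));
        }
    }
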



20. Very Long (Over 1000 Elements) Reference Chains:  not found  What's this?



21. Thread stacks (number of threads:  11 ) What's this?
Thread name: "main", daemon: false
java.lang.OutOfMemoryError.<init>(OutOfMemoryError.java:48)
java.util.regex.Matcher.<init>(Matcher.java:225)
java.util.regex.Pattern.matcher(Pattern.java:1093)
com.criteo.hadoop.hive.udf.UDFExtraDataToMap.toMap(UDFExtraDataToMap.java:42)
com.criteo.hadoop.hive.udf.UDFExtraDataToMap.evaluate(UDFExtraDataToMap.java:82)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
org.apache.hadoop.hive.ql.udf.generic.GenericUDFIndex.evaluate(GenericUDFIndex.java:100)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
org.apache.hadoop.hive.ql.udf.generic.GenericUDFWhen.evaluate(GenericUDFWhen.java:104)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorHead._evaluate(ExprNodeEvaluatorHead.java:44)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:68)
org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:547)
org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
java.security.AccessController.doPrivileged(Native method)
javax.security.auth.Subject.doAs(Subject.java:422)
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)


Thread name: "SpillThread", daemon: true
sun.misc.Unsafe.park(Native method)
java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1531)


Thread name: "Thread for syncLogs", daemon: true
org.apache.log4j.Hierarchy.getCurrentLoggers(Hierarchy.java:314)
org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:304)
org.apache.hadoop.mapred.TaskLog$3.run(TaskLog.java:350)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)


Thread name: "communication thread", daemon: true
java.lang.ref.SoftReference.get(SoftReference.java:113)
java.lang.StringCoding.deref(StringCoding.java:66)
java.lang.StringCoding.encode(StringCoding.java:330)
java.lang.String.getBytes(String.java:918)
java.io.UnixFileSystem.getBooleanAttributes0(Native method)
java.io.UnixFileSystem.getBooleanAttributes(UnixFileSystem.java:242)
java.io.File.isDirectory(File.java:849)
org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.getProcessList(ProcfsBasedProcessTree.java:510)
org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.updateProcessTree(ProcfsBasedProcessTree.java:209)
org.apache.hadoop.mapred.Task.updateResourceCounters(Task.java:898)
org.apache.hadoop.mapred.Task.updateCounters(Task.java:1067)
org.apache.hadoop.mapred.Task.access$500(Task.java:82)
org.apache.hadoop.mapred.Task$TaskReporter.run(Task.java:786)
java.lang.Thread.run(Thread.java:748)


Thread name: "Timer for 'MapTask' metrics system", daemon: true
java.lang.Object.wait(Native method)
java.util.TimerThread.mainLoop(Timer.java:552)
java.util.TimerThread.run(Timer.java:505)


Thread name: "Reference Handler", daemon: true
java.lang.Object.wait(Native method)
java.lang.Object.wait(Object.java:502)
java.lang.ref.Reference.tryHandlePending(Reference.java:191)
java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)


Thread name: "Finalizer", daemon: true
java.lang.Object.wait(Native method)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:209)


Thread name: "IPC Parameter Sending Thread #1", daemon: true
sun.misc.Unsafe.park(Native method)
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)


Thread name: "org.apache.hadoop.hdfs.PeerCache@522b2631", daemon: true
java.lang.Thread.sleep(Native method)
org.apache.hadoop.hdfs.PeerCache.run(PeerCache.java:255)
org.apache.hadoop.hdfs.PeerCache.access$000(PeerCache.java:46)
org.apache.hadoop.hdfs.PeerCache$1.run(PeerCache.java:124)
java.lang.Thread.run(Thread.java:748)


Thread name: "org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner", daemon: true
java.lang.Object.wait(Native method)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner.run(FileSystem.java:3212)
java.lang.Thread.run(Thread.java:748)


Thread name: "Thread-7", daemon: true
org.apache.hadoop.net.unix.DomainSocketWatcher.doPoll0(Native method)
org.apache.hadoop.net.unix.DomainSocketWatcher.access$900(DomainSocketWatcher.java:52)
org.apache.hadoop.net.unix.DomainSocketWatcher$2.run(DomainSocketWatcher.java:509)
java.lang.Thread.run(Thread.java:748)





22. System Properties (result of java.lang.System.getProperties()) What's this?

  Key  Value 
 awt.toolkit   sun.awt.X11.XToolkit
 file.encoding   UTF-8
 file.encoding.pkg   sun.io
 file.separator   /
 hadoop.metrics.log.level   WARN
 hadoop.root.logfile   syslog
 hadoop.root.logger   WARN,CLA
 java.awt.graphicsenv   sun.awt.X11GraphicsEnvironment
 java.awt.printerjob   sun.print.PSPrinterJob
 java.class.path/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034:/etc/hadoop/conf:/usr/lib/hadoop/parquet-format-javadoc.jar:/usr/lib/hadoop/parquet-format-sources.jar:/usr/lib/hadoop/parquet-format.jar:/usr/lib/hadoop/parquet-avro.jar:/usr/lib/hadoop/parquet-cascading.jar:/usr/lib/hadoop/parquet-column.jar:/usr/lib/hadoop/parquet-common.jar:/usr/lib/hadoop/parquet-encoding.jar:/usr/lib/hadoop/parquet-generator.jar:/usr/lib/hadoop/parquet-hadoop-bundle.jar:/usr/lib/hadoop/parquet-hadoop.jar:/usr/lib/hadoop/parquet-jackson.jar:/usr/lib/hadoop/parquet-pig-bundle.jar:/usr/lib/hadoop/parquet-pig.jar:/usr/lib/hadoop/parquet-protobuf.jar:/usr/lib/hadoop/parquet-test-hadoop2.jar:/usr/lib/hadoop/parquet-thrift.jar:/usr/lib/hadoop/parquet-tools.jar:/usr/lib/hadoop/hadoop-annotations-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-aws.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-azure-datalake.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop/hadoop-common-tests.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-aws-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/logredactor-1.0.3.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/avro.jar:/usr/lib/hadoop/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop/lib/aws-java-sdk-core-1.10.6.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/aws-java-sdk-dynamodb-1.10.6.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/aws-java-sdk-kms-1.10.6.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/aws-java-sdk-s3-1.10.6.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/aws-java-sdk-sts-1.10.6.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/azure-data-lake-store-sdk-2.1.4.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.5.jar:/usr/lib/hadoop/lib/commons-beanutils-1.9.2.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/slf4j-log4j12.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/zookeeper.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib
/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.11.0.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/avro.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.9.2.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop
-mapreduce/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-el-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/jasper-compiler-5.5.23.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/hadoop-ma
preduce-client-core-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.2.jar:/usr/lib/hadoop-mapreduce/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/okhttp-2.4.0.jar:/usr/lib/hadoop-mapreduce/okio-1.4.0.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/zookeeper.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/avro.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn
/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-yarn/lib/jline-2.11.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/zookeeper.jar:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/*:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/lib/*:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-datadisco-hive-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-hadoop-hiveutils-hive2-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/job.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/glup-schemas-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034:/etc/hadoop/conf:/usr/lib/hadoop/parquet-format-javadoc.jar:/usr/lib/hadoop/parquet-format-sources.jar:/usr/lib/hadoop/parquet-format.jar:/usr/lib/hadoop/parquet-avro.jar:/usr/lib/hadoop/parquet-cascading.jar:/usr/lib/hadoop/parquet-column.jar:/usr/lib/hadoop/parquet-com
mon.jar:/usr/lib/hadoop/parquet-encoding.jar:/usr/lib/hadoop/parquet-generator.jar:/usr/lib/hadoop/parquet-hadoop-bundle.jar:/usr/lib/hadoop/parquet-hadoop.jar:/usr/lib/hadoop/parquet-jackson.jar:/usr/lib/hadoop/parquet-pig-bundle.jar:/usr/lib/hadoop/parquet-pig.jar:/usr/lib/hadoop/parquet-protobuf.jar:/usr/lib/hadoop/parquet-test-hadoop2.jar:/usr/lib/hadoop/parquet-thrift.jar:/usr/lib/hadoop/parquet-tools.jar:/usr/lib/hadoop/hadoop-annotations-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-aws.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-azure-datalake.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop/hadoop-common-tests.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-aws-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/logredactor-1.0.3.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/avro.jar:/usr/lib/hadoop/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop/lib/aws-java-sdk-core-1.10.6.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/aws-java-sdk-dynamodb-1.10.6.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/aws-java-sdk-kms-1.10.6.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/aws-java-sdk-s3-1.10.6.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/aws-java-sdk-sts-1.10.6.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/azure-data-lake-store-sdk-2.1.4.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.5.jar:/usr/lib/hadoop/lib/commons-beanutils-1.9.2.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/slf4j-log4j12.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/zookeeper.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.11.0.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop/lib/httpclient-4.2.
5.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/avro.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.9.2.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-cl
ient-core.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-el-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/jasper-compiler-5.5.23.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0-tests
.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.2.jar:/usr/lib/hadoop-mapreduce/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/okhttp-2.4.0.jar:/usr/lib/hadoop-mapreduce/okio-1.4.0.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/zookeeper.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/avro.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/*:/lib/*:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/*:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/lib/*:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-datadisco-hive-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-hadoop-hiveutils-hive2-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/job.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/glup-schemas-17394-uber.jar
 java.class.version    52.0
 java.endorsed.dirs    /usr/lib/jvm/jdk1.8.0_144/jre/lib/endorsed
 java.ext.dirs    /usr/lib/jvm/jdk1.8.0_144/jre/lib/ext:/usr/java/packages/lib/ext
 java.home    /usr/lib/jvm/jdk1.8.0_144/jre
 java.io.tmpdir    /hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/tmp
 java.library.path    /hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034:/usr/lib/hadoop/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
 java.net.preferIPv4Stack    true
 java.runtime.name    Java(TM) SE Runtime Environment
 java.runtime.version    1.8.0_144-b01
 java.specification.name    Java Platform API Specification
 java.specification.vendor    Oracle Corporation
 java.specification.version    1.8
 java.vendor    Oracle Corporation
 java.vendor.url    http://java.oracle.com/
 java.vendor.url.bug    http://bugreport.sun.com/bugreport/
 java.version    1.8.0_144
 java.vm.info    mixed mode
 java.vm.name    Java HotSpot(TM) 64-Bit Server VM
 java.vm.specification.name    Java Virtual Machine Specification
 java.vm.specification.vendor    Oracle Corporation
 java.vm.specification.version    1.8
 java.vm.vendor    Oracle Corporation
 java.vm.version    25.144-b01
 line.separator
 log4j.configuration    container-log4j.properties
 os.arch    amd64
 os.name    Linux
 os.version    4.4.21-1.el7.elrepo.x86_64
 path.separator    :
 sun.arch.data.model    64
 sun.boot.class.path    /usr/lib/jvm/jdk1.8.0_144/jre/lib/resources.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/rt.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/sunrsasign.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/jsse.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/jce.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/charsets.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/jfr.jar:/usr/lib/jvm/jdk1.8.0_144/jre/classes
 sun.boot.library.path    /usr/lib/jvm/jdk1.8.0_144/jre/lib/amd64
 sun.cpu.endian    little
 sun.cpu.isalist
 sun.io.unicode.encoding    UnicodeLittle
 sun.java.command    org.apache.hadoop.mapred.YarnChild 10.224.22.18 46104 attempt_1531301354100_57311_m_001303_1 285873023226794
 sun.java.launcher    SUN_STANDARD
 sun.jnu.encoding    UTF-8
 sun.management.compiler    HotSpot 64-Bit Tiered Compilers
 sun.os.patch.level    unknown
 user.country    US
 user.dir    /hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034
 user.home    /home/sz.ho
 user.language    en
 user.name    sz.ho
 user.timezone    Etc/UTC
 yarn.app.container.log.dir    /hdfs/uuid/f444a50c-5326-4c38-869b-17715b806ac7/yarn/logs/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034
 yarn.app.container.log.filesize    0
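Editor's note: the properties above are the standard values exposed through System.getProperties() in the dumped YarnChild JVM (the "yarn.*" and "log4j.*" keys are container-specific and will not exist in other JVMs). The following is a minimal illustrative sketch, not part of the JXRay output, showing how the same keys can be queried on a live JVM when comparing it against this dump; the class name PrintJvmProps and the chosen key list are hypothetical.

    import java.io.File;
    import java.util.Arrays;

    public class PrintJvmProps {
        public static void main(String[] args) {
            String[] keys = {
                "java.version", "java.vm.name", "java.class.version",
                "os.name", "os.arch", "user.dir", "user.timezone",
                "sun.java.command"   // HotSpot-specific; may be absent on other JVMs
            };
            for (String key : keys) {
                // getProperty falls back to "<not set>" for keys missing in this JVM
                System.out.printf("%-22s %s%n", key, System.getProperty(key, "<not set>"));
            }
            // The classpath can be very long (as in this dump), so print only the first few entries
            Arrays.stream(System.getProperty("java.class.path", "").split(File.pathSeparator))
                  .limit(5)
                  .forEach(entry -> System.out.println("classpath entry: " + entry));
        }
    }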