demo@flex13:~/userlogs$ cat application_1483626889488_0625/container_1483626889488_0625_01_000008/stderr
Picked up JAVA_TOOL_OPTIONS: -XX:+PreserveFramePointer -ea
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/mnt/tmpfs/tmp/nm-local-dir/usercache/demo/filecache/1238/__spark_libs__3104534836721913725.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/demo/zac-deployment/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/02/08 11:32:05 0 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 9763@flex14
17/02/08 11:32:05 3 INFO SignalUtils: Registered signal handler for TERM
17/02/08 11:32:05 3 INFO SignalUtils: Registered signal handler for HUP
17/02/08 11:32:05 3 INFO SignalUtils: Registered signal handler for INT
17/02/08 11:32:06 326 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/08 11:32:06 377 INFO SecurityManager: Changing view acls to: demo
17/02/08 11:32:06 377 INFO SecurityManager: Changing modify acls to: demo
17/02/08 11:32:06 378 INFO SecurityManager: Changing view acls groups to:
17/02/08 11:32:06 378 INFO SecurityManager: Changing modify acls groups to:
17/02/08 11:32:06 378 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(demo); groups with view permissions: Set(); users with modify permissions: Set(demo); groups with modify permissions: Set()
############### Here cores : 1 for rpc
java.lang.Exception
at org.apache.spark.network.netty.SparkTransportConf$.fromSparkConf(SparkTransportConf.scala:59)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:49)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:442)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:56)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:43)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:195)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:188)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:284)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
17/02/08 11:32:06 656 INFO TransportClientFactory: Successfully created connection to /10.40.0.11:49221 after 50 ms (0 ms spent in bootstraps)
17/02/08 11:32:06 732 INFO SecurityManager: Changing view acls to: demo
17/02/08 11:32:06 732 INFO SecurityManager: Changing modify acls to: demo
17/02/08 11:32:06 732 INFO SecurityManager: Changing view acls groups to:
17/02/08 11:32:06 732 INFO SecurityManager: Changing modify acls groups to:
17/02/08 11:32:06 733 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(demo); groups with view permissions: Set(); users with modify permissions: Set(demo); groups with modify permissions: Set()
############### Here cores : 1 for rpc
java.lang.Exception
at org.apache.spark.network.netty.SparkTransportConf$.fromSparkConf(SparkTransportConf.scala:59)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:49)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:442)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:245)
at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:200)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:223)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:188)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:284)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
17/02/08 11:32:06 755 INFO TransportClientFactory: Successfully created connection to /10.40.0.11:49221 after 1 ms (0 ms spent in bootstraps)
17/02/08 11:32:06 766 WARN SortShuffleManager: spark.shuffle.spill was set to false, but this configuration is ignored as of Spark 1.6+. Shuffle will continue to spill to disk when necessary.
############### Here cores : 1 for shuffle
java.lang.Exception
at org.apache.spark.network.netty.SparkTransportConf$.fromSparkConf(SparkTransportConf.scala:59)
at org.apache.spark.shuffle.IndexShuffleBlockResolver.<init>(IndexShuffleBlockResolver.scala:52)
at org.apache.spark.shuffle.sort.SortShuffleManager.<init>(SortShuffleManager.scala:82)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.SparkEnv$.instantiateClass$1(SparkEnv.scala:270)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:323)
at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:200)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:223)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:188)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:284)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
############### Here cores : 1 for shuffle
java.lang.Exception
at org.apache.spark.network.netty.SparkTransportConf$.fromSparkConf(SparkTransportConf.scala:59)
at org.apache.spark.network.netty.NettyBlockTransferService.<init>(NettyBlockTransferService.scala:54)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:340)
at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:200)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:223)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:188)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:284)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
17/02/08 11:32:06 875 INFO DiskBlockManager: Created local directory at /mnt/tmpfs/tmp/nm-local-dir/usercache/demo/appcache/application_1483626889488_0625/blockmgr-478b1cd0-f671-4d93-bfd9-06d0d2a75586
17/02/08 11:32:06 889 INFO MemoryStore: MemoryStore started with capacity 34.0 GB
17/02/08 11:32:06 1058 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://[email protected]:49221
17/02/08 11:32:07 1073 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
17/02/08 11:32:07 1074 INFO Executor: Starting executor ID 7 on host flex14.zurich.ibm.com
17/02/08 11:32:07 1098 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60837.
17/02/08 11:32:07 1098 INFO NettyBlockTransferService: Server created on flex14.zurich.ibm.com:60837 with numCores: 1
17/02/08 11:32:07 1099 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/02/08 11:32:07 1100 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(7, flex14.zurich.ibm.com, 60837, None)
17/02/08 11:32:07 1106 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(7, flex14.zurich.ibm.com, 60837, None)
17/02/08 11:32:07 1106 INFO BlockManager: Initialized BlockManager: BlockManagerId(7, flex14.zurich.ibm.com, 60837, None)
17/02/08 11:32:07 1108 INFO Executor: Using REPL class URI: spark://10.40.0.11:49221/classes
17/02/08 11:32:15 9901 INFO CoarseGrainedExecutorBackend: Got assigned task 1
17/02/08 11:32:15 9905 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
17/02/08 11:32:15 9963 INFO TorrentBroadcast: Started reading broadcast variable 2
17/02/08 11:32:15 9992 INFO TransportClientFactory: Successfully created connection to /10.40.0.11:34805 after 2 ms (0 ms spent in bootstraps)
17/02/08 11:32:15 10028 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 15.0 KB, free 34.0 GB)
17/02/08 11:32:15 10037 INFO TorrentBroadcast: Reading broadcast variable 2 took 73 ms
17/02/08 11:32:16 10197 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 15.0 KB, free 34.0 GB)
############### Here cores : 1 for files
java.lang.Exception
at org.apache.spark.network.netty.SparkTransportConf$.fromSparkConf(SparkTransportConf.scala:59)
at org.apache.spark.rpc.netty.NettyRpcEnv.downloadClient(NettyRpcEnv.scala:353)
at org.apache.spark.rpc.netty.NettyRpcEnv.openChannel(NettyRpcEnv.scala:324)
at org.apache.spark.repl.ExecutorClassLoader.org$apache$spark$repl$ExecutorClassLoader$$getClassFileInputStreamFromSparkRPC(ExecutorClassLoader.scala:90)
at org.apache.spark.repl.ExecutorClassLoader$$anonfun$1.apply(ExecutorClassLoader.scala:57)
at org.apache.spark.repl.ExecutorClassLoader$$anonfun$1.apply(ExecutorClassLoader.scala:57)
at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:162)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:80)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:34)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.codehaus.janino.ClassLoaderIClassLoader.findIClass(ClassLoaderIClassLoader.java:83)
at org.codehaus.janino.IClassLoader.loadIClass(IClassLoader.java:288)
at org.codehaus.janino.UnitCompiler.findTypeByName(UnitCompiler.java:7611)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5906)
at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5751)
at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5732)
at org.codehaus.janino.UnitCompiler.access$13200(UnitCompiler.java:206)
at org.codehaus.janino.UnitCompiler$18.visitReferenceType(UnitCompiler.java:5668)
at org.codehaus.janino.UnitCompiler$18.visitReferenceType(UnitCompiler.java:5660)
at org.codehaus.janino.Java$ReferenceType.accept(Java.java:3356)
at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5660)
at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:6007)
at org.codehaus.janino.UnitCompiler.access$13000(UnitCompiler.java:206)
at org.codehaus.janino.UnitCompiler$18.visitArrayType(UnitCompiler.java:5666)
at org.codehaus.janino.UnitCompiler$18.visitArrayType(UnitCompiler.java:5660)
at org.codehaus.janino.Java$ArrayType.accept(Java.java:3425)
at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5660)
at org.codehaus.janino.UnitCompiler.access$1200(UnitCompiler.java:206)
at org.codehaus.janino.UnitCompiler$32.getParameterTypes2(UnitCompiler.java:9326)
at org.codehaus.janino.IClass$IInvocable.getParameterTypes(IClass.java:853)
at org.codehaus.janino.IClass$IMethod.getDescriptor2(IClass.java:1084)
at org.codehaus.janino.IClass$IInvocable.getDescriptor(IClass.java:865)
at org.codehaus.janino.IClass.getIMethods(IClass.java:211)
at org.codehaus.janino.IClass.getIMethods(IClass.java:200)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:439)
at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:420)
at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:206)
at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:374)
at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:369)
at org.codehaus.janino.Java$AbstractPackageMemberClassDeclaration.accept(Java.java:1309)
at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:369)
at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:345)
at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:396)
at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:311)
at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:229)
at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:196)
at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:91)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:935)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:998)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:995)
at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:890)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8.apply(WholeStageCodegenExec.scala:372)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8.apply(WholeStageCodegenExec.scala:371)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:843)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:843)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:283)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/02/08 11:32:16 10448 INFO TransportClientFactory: Successfully created connection to /10.40.0.11:49221 after 2 ms (0 ms spent in bootstraps)
17/02/08 11:32:16 10523 INFO CodeGenerator: Code generated in 232.190964 ms
17/02/08 11:32:16 10675 INFO CodeGenerator: Code generated in 28.404387 ms
17/02/08 11:32:16 10684 INFO FileScanRDD: Reading File path: hdfs://flex11-40g0:9000/sql/parquet-20m-s1k/part-00001-53bcd2d8-c1e8-41e5-82d7-c6012de58353.parquet, range: 0-2108144314, partition values: [empty row]
17/02/08 11:32:16 10691 INFO TorrentBroadcast: Started reading broadcast variable 0
17/02/08 11:32:16 10696 INFO TransportClientFactory: Successfully created connection to flex18.zurich.ibm.com/10.40.0.18:45864 after 1 ms (0 ms spent in bootstraps)
17/02/08 11:32:16 10705 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 65.9 KB, free 34.0 GB)
17/02/08 11:32:16 10708 INFO TorrentBroadcast: Reading broadcast variable 0 took 17 ms
17/02/08 11:32:16 10763 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 403.4 KB, free 34.0 GB)
[INFO] [SpecificParquetRecordReaderBase] New Parquet reader is initializing hdfs://flex11-40g0:9000/sql/parquet-20m-s1k/part-00001-53bcd2d8-c1e8-41e5-82d7-c6012de58353.parquet node: , and rowGroupOffset (isNULL): true
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
just footer reading took : 541669 usec
[INFO] [SpecificParquetRecordReaderBase] New Parquet reader initialized : hdfs://flex11-40g0:9000/sql/parquet-20m-s1k/part-00001-53bcd2d8-c1e8-41e5-82d7-c6012de58353.parquet with : 2000000 rows, callstack below: | time: 1: start 2541 usec 2: readFooter 546214 usec 1: parse 41808 usec
17/02/08 11:32:24 18187 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 2832 bytes result sent to driver, e2etime 8282.982 msec, sparktime: 7913 msec
17/02/08 11:32:24 18188 INFO Executor:
-------------[ USERS: 0 ----
totalBytes: 0 totalRecs: 0 sortTime: 0.0 ms shuffleIteratorTime: 0.0 ms networkTime: 0.0 ms end2endTime: 0.0 ms
[ShuffleStats]: localBytes: 0 remoteBytes: 0 localTime: 0 ms, networkTime: 0 ms
-----------------------------------------------------
=====================================================
RESULTS: bytes: 0 networkTime: 0.0 ms deSerTime: 0.0 ms joinTime: 0.0 ms netBW 0.0 Gbps networkRatio/end2end : 0.0 %
=====================================================
17/02/08 11:32:24 18192 INFO CoarseGrainedExecutorBackend: Got assigned task 15
17/02/08 11:32:24 18193 INFO Executor: Running task 4.0 in stage 1.0 (TID 15)
17/02/08 11:32:24 18221 INFO TorrentBroadcast: Started reading broadcast variable 3
17/02/08 11:32:24 18225 INFO TransportClientFactory: Successfully created connection to flex21.zurich.ibm.com/10.40.0.21:38123 after 1 ms (0 ms spent in bootstraps)
17/02/08 11:32:24 18228 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 15.0 KB, free 34.0 GB)
17/02/08 11:32:24 18230 INFO TorrentBroadcast: Reading broadcast variable 3 took 9 ms
17/02/08 11:32:24 18235 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 15.0 KB, free 34.0 GB)
17/02/08 11:32:24 18241 INFO FileScanRDD: Reading File path: hdfs://flex11-40g0:9000/sql/parquet-20m2-s1k/part-00004-58325887-b4e1-4625-ac87-5251c34287e2.parquet, range: 0-2108144314, partition values: [empty row]
17/02/08 11:32:24 18241 INFO TorrentBroadcast: Started reading broadcast variable 1
17/02/08 11:32:24 18244 INFO TransportClientFactory: Successfully created connection to flex19.zurich.ibm.com/10.40.0.19:44291 after 1 ms (0 ms spent in bootstraps)
17/02/08 11:32:24 18265 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 65.9 KB, free 34.0 GB)
17/02/08 11:32:24 18267 INFO TorrentBroadcast: Reading broadcast variable 1 took 26 ms
17/02/08 11:32:24 18284 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 403.4 KB, free 34.0 GB)
[INFO] [SpecificParquetRecordReaderBase] New Parquet reader is initializing hdfs://flex11-40g0:9000/sql/parquet-20m2-s1k/part-00004-58325887-b4e1-4625-ac87-5251c34287e2.parquet node: , and rowGroupOffset (isNULL): true
just footer reading took : 7561 usec
[INFO] [SpecificParquetRecordReaderBase] New Parquet reader initialized : hdfs://flex11-40g0:9000/sql/parquet-20m2-s1k/part-00004-58325887-b4e1-4625-ac87-5251c34287e2.parquet with : 2000000 rows, callstack below: | time: 1: start 78 usec 2: readFooter 7613 usec 1: parse 3152 usec
17/02/08 11:32:30 24219 INFO Executor: Finished task 4.0 in stage 1.0 (TID 15). 2824 bytes result sent to driver, e2etime 6026.2656 msec, sparktime: 5999 msec
17/02/08 11:32:30 24219 INFO Executor:
-------------[ USERS: 0 ----
totalBytes: 0 totalRecs: 0 sortTime: 0.0 ms shuffleIteratorTime: 0.0 ms networkTime: 0.0 ms end2endTime: 0.0 ms
[ShuffleStats]: localBytes: 0 remoteBytes: 0 localTime: 0 ms, networkTime: 0 ms
-----------------------------------------------------
=====================================================
RESULTS: bytes: 0 networkTime: 0.0 ms deSerTime: 0.0 ms joinTime: 0.0 ms netBW 0.0 Gbps networkRatio/end2end : 0.0 %
=====================================================
17/02/08 11:32:30 24606 INFO CoarseGrainedExecutorBackend: Got assigned task 24
17/02/08 11:32:30 24607 INFO Executor: Running task 4.0 in stage 2.0 (TID 24)
17/02/08 11:32:30 24612 INFO MapOutputTrackerWorker: Updating epoch to 2 and clearing cache
17/02/08 11:32:30 24613 INFO TorrentBroadcast: Started reading broadcast variable 4
17/02/08 11:32:30 24618 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 94.7 KB, free 34.0 GB)
17/02/08 11:32:30 24620 INFO TorrentBroadcast: Reading broadcast variable 4 took 7 ms
17/02/08 11:32:30 24626 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 94.7 KB, free 34.0 GB)
17/02/08 11:32:30 24675 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
17/02/08 11:32:30 24675 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
17/02/08 11:32:30 24677 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
17/02/08 11:32:30 24678 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
17/02/08 11:32:30 24680 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
17/02/08 11:32:30 24684 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 0, fetching them
17/02/08 11:32:30 24684 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://[email protected]:49221)
17/02/08 11:32:30 24689 INFO MapOutputTrackerWorker: Got the output locations
17/02/08 11:32:30 24694 INFO ShuffleBlockFetcherIterator: Getting 10 non-empty blocks out of 10 blocks
17/02/08 11:32:30 24696 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(8, flex13.zurich.ibm.com, 56768, None) hashSet: Set(shuffle_0_0_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24697 INFO TransportClientFactory: Successfully created connection to flex13.zurich.ibm.com/10.40.0.13:56768 after 0 ms (0 ms spent in bootstraps)
17/02/08 11:32:30 24697 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(2, flex19.zurich.ibm.com, 44291, None) hashSet: Set(shuffle_0_8_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24698 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(10, flex20.zurich.ibm.com, 54213, None) hashSet: Set(shuffle_0_2_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24699 INFO TransportClientFactory: Successfully created connection to flex20.zurich.ibm.com/10.40.0.20:54213 after 0 ms (0 ms spent in bootstraps)
17/02/08 11:32:30 24699 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(9, flex22.zurich.ibm.com, 34400, None) hashSet: Set(shuffle_0_4_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24700 INFO TransportClientFactory: Successfully created connection to flex22.zurich.ibm.com/10.40.0.22:34400 after 0 ms (0 ms spent in bootstraps)
17/02/08 11:32:30 24700 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(6, flex15.zurich.ibm.com, 43187, None) hashSet: Set(shuffle_0_3_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24702 INFO TransportClientFactory: Successfully created connection to flex15.zurich.ibm.com/10.40.0.15:43187 after 0 ms (0 ms spent in bootstraps)
17/02/08 11:32:30 24702 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(1, flex21.zurich.ibm.com, 38123, None) hashSet: Set(shuffle_0_7_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24702 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(4, flex18.zurich.ibm.com, 45864, None) hashSet: Set(shuffle_0_5_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24703 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(3, flex16.zurich.ibm.com, 33005, None) hashSet: Set(shuffle_0_6_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24704 INFO TransportClientFactory: Successfully created connection to flex16.zurich.ibm.com/10.40.0.16:33005 after 0 ms (0 ms spent in bootstraps)
17/02/08 11:32:30 24704 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(5, flex23.zurich.ibm.com, 43312, None) hashSet: Set(shuffle_0_9_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24705 INFO TransportClientFactory: Successfully created connection to flex23.zurich.ibm.com/10.40.0.23:43312 after 0 ms (0 ms spent in bootstraps)
17/02/08 11:32:30 24705 INFO ShuffleBlockFetcherIterator: Started 9 remote fetches in 14 ms
17/02/08 11:32:30 24706 INFO ShuffleBlockFetcherIterator: total time to fetch the local data 218420748 bytes in 717.597 usec bw : 2437051 Mbps
17/02/08 11:32:30 24745 INFO CodeGenerator: Code generated in 34.898144 ms
17/02/08 11:32:30 24767 INFO CodeGenerator: Code generated in 13.554722 ms
Creating the sorter in SortExec.createSorter : canUseRadix : true sortOrder.length 1 boundSortExpression: IntegerType
17/02/08 11:32:30 24787 INFO CodeGenerator: Code generated in 9.026521 ms
17/02/08 11:32:30 24813 INFO UnsafeInMemorySorter: Sorter init array size : 8388608 usableCapacity: 4194304
17/02/08 11:32:30 24814 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 1, fetching them
17/02/08 11:32:30 24814 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://[email protected]:49221)
17/02/08 11:32:30 24816 INFO MapOutputTrackerWorker: Got the output locations
17/02/08 11:32:30 24817 INFO ShuffleBlockFetcherIterator: Getting 10 non-empty blocks out of 10 blocks
17/02/08 11:32:30 24817 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(10, flex20.zurich.ibm.com, 54213, None) hashSet: Set(shuffle_1_6_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24817 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(1, flex21.zurich.ibm.com, 38123, None) hashSet: Set(shuffle_1_8_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24818 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(5, flex23.zurich.ibm.com, 43312, None) hashSet: Set(shuffle_1_3_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24818 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(8, flex13.zurich.ibm.com, 56768, None) hashSet: Set(shuffle_1_0_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24818 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(3, flex16.zurich.ibm.com, 33005, None) hashSet: Set(shuffle_1_9_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24818 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(9, flex22.zurich.ibm.com, 34400, None) hashSet: Set(shuffle_1_1_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24819 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(2, flex19.zurich.ibm.com, 44291, None) hashSet: Set(shuffle_1_7_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24819 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(4, flex18.zurich.ibm.com, 45864, None) hashSet: Set(shuffle_1_5_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24819 INFO ShuffleBlockFetcherIterator: Issuing request to BlockManagerId(6, flex15.zurich.ibm.com, 43187, None) hashSet: Set(shuffle_1_2_4) size 1 shuffleClient is org.apache.spark.network.netty.NettyBlockTransferService@60dcb391
17/02/08 11:32:30 24819 INFO ShuffleBlockFetcherIterator: Started 9 remote fetches in 3 ms
17/02/08 11:32:30 24820 INFO ShuffleBlockFetcherIterator: total time to fetch the local data 218538684 bytes in 72.162 usec bw : 24282076 Mbps
Creating the sorter in SortExec.createSorter : canUseRadix : true sortOrder.length 1 boundSortExpression: IntegerType
17/02/08 11:32:30 24839 INFO UnsafeInMemorySorter: Sorter init array size : 8388608 usableCapacity: 4194304
17/02/08 11:32:30 24874 INFO CodeGenerator: Code generated in 33.969684 ms
17/02/08 11:32:30 24884 INFO SQLHadoopMapReduceCommitProtocol: Using output committer class org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
[INFO] allocating a new Atr(NULL)OutputWriter to file: hdfs://flex11-40g0:9000/p-o/_temporary/0/_temporary/attempt_20170208113230_0002_m_000004_0/part-00004-4b8e7a37-89bc-4046-93d1-94ba959b5f2e.atr
17/02/08 11:32:30 24894 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 0
Constructor to loop: 1367 usec , loop time: 207615 usec, bytes: 218420748 resized: 1 records: 200019 BW (loop: 1052 , e2e: 1045 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:31 25595 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 0
17/02/08 11:32:31 25596 INFO ShuffleBlockFetcherIterator: shuffle_0_8_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25609 INFO ShuffleBlockFetcherIterator: shuffle_0_7_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25615 INFO ShuffleBlockFetcherIterator: shuffle_0_0_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25622 INFO ShuffleBlockFetcherIterator: shuffle_0_5_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25645 INFO ShuffleBlockFetcherIterator: shuffle_0_4_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25650 INFO ShuffleBlockFetcherIterator: shuffle_0_2_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25655 INFO ShuffleBlockFetcherIterator: shuffle_0_3_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25658 INFO ShuffleBlockFetcherIterator: shuffle_0_6_4 completed, remainingBlocks: Set()
17/02/08 11:32:31 25689 INFO ShuffleBlockFetcherIterator: shuffle_0_9_4 completed, remainingBlocks: Set()
Constructor to loop: 123 usec , loop time: 151071 usec, bytes: 218794212 resized: 1 records: 200361 BW (loop: 1448 , e2e: 1447 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:31 25748 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 7
Constructor to loop: 25 usec , loop time: 138497 usec, bytes: 217891128 resized: 1 records: 199534 BW (loop: 1573 , e2e: 1572 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:31 25887 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 6
Constructor to loop: 20 usec , loop time: 185480 usec, bytes: 218160852 resized: 1 records: 199781 BW (loop: 1176 , e2e: 1176 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:32 26073 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 5
17/02/08 11:32:32 26194 INFO ShuffleBlockFetcherIterator: shuffle_1_7_4 completed, remainingBlocks: Set()
Constructor to loop: 24 usec , loop time: 128465 usec, bytes: 217506744 resized: 1 records: 199182 BW (loop: 1693 , e2e: 1692 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:32 26202 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 4
17/02/08 11:32:32 26235 INFO ShuffleBlockFetcherIterator: shuffle_1_2_4 completed, remainingBlocks: Set()
Constructor to loop: 15 usec , loop time: 125196 usec, bytes: 218558340 resized: 1 records: 200145 BW (loop: 1745 , e2e: 1745 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:32 26327 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 3
17/02/08 11:32:32 26338 INFO ShuffleBlockFetcherIterator: shuffle_1_3_4 completed, remainingBlocks: Set()
17/02/08 11:32:32 26376 INFO ShuffleBlockFetcherIterator: shuffle_1_9_4 completed, remainingBlocks: Set()
17/02/08 11:32:32 26417 INFO ShuffleBlockFetcherIterator: shuffle_1_1_4 completed, remainingBlocks: Set()
17/02/08 11:32:32 26419 INFO ShuffleBlockFetcherIterator: shuffle_1_0_4 completed, remainingBlocks: Set()
Constructor to loop: 45 usec , loop time: 120703 usec, bytes: 218209992 resized: 1 records: 199826 BW (loop: 1807 , e2e: 1807 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:32 26448 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 2
17/02/08 11:32:32 26449 INFO ShuffleBlockFetcherIterator: shuffle_1_6_4 completed, remainingBlocks: Set()
17/02/08 11:32:32 26467 INFO ShuffleBlockFetcherIterator: shuffle_1_8_4 completed, remainingBlocks: Set()
17/02/08 11:32:32 26510 INFO ShuffleBlockFetcherIterator: shuffle_1_5_4 completed, remainingBlocks: Set()
Constructor to loop: 14 usec , loop time: 102033 usec, bytes: 218860824 resized: 1 records: 200422 BW (loop: 2145 , e2e: 2144 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:32 26550 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 1
Constructor to loop: 13 usec , loop time: 92764 usec, bytes: 218704668 resized: 1 records: 200279 BW (loop: 2357 , e2e: 2357 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:32 26643 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 0
Constructor to loop: 14 usec , loop time: 96422 usec, bytes: 218920884 resized: 1 records: 200477 BW (loop: 2270 , e2e: 2270 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
SortExec: shuffle iterator consumed in 1846698 usec
17/02/08 11:32:32 27051 INFO UnsafeInMemorySorter: animesh : SORT (useRadix = true) and time in usecs: 309701 numRecords: 2000026 arraySize: 8388608 nullBoundaryPos: 0 expanded: 0
17/02/08 11:32:32 27053 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 9
Constructor to loop: 32 usec , loop time: 144747 usec, bytes: 218538684 resized: 1 records: 200127 BW (loop: 1509 , e2e: 1509 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:33 27198 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 8
Constructor to loop: 14 usec , loop time: 100107 usec, bytes: 218860824 resized: 1 records: 200422 BW (loop: 2186 , e2e: 2185 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:33 27298 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 7
Constructor to loop: 12 usec , loop time: 101582 usec, bytes: 217457604 resized: 1 records: 199137 BW (loop: 2140 , e2e: 2140 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:33 27400 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 6
Constructor to loop: 12 usec , loop time: 134285 usec, bytes: 218496096 resized: 1 records: 200088 BW (loop: 1627 , e2e: 1626 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:33 27535 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 5
Constructor to loop: 16 usec , loop time: 96492 usec, bytes: 218519028 resized: 1 records: 200109 BW (loop: 2264 , e2e: 2264 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:33 27632 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 4
Constructor to loop: 13 usec , loop time: 108175 usec, bytes: 217483812 resized: 1 records: 199161 BW (loop: 2010 , e2e: 2010 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:33 27740 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 3
Constructor to loop: 14 usec , loop time: 521108 usec, bytes: 218428392 resized: 1 records: 200026 BW (loop: 419 , e2e: 419 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:34 28261 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 2
Constructor to loop: 16 usec , loop time: 132750 usec, bytes: 220126452 resized: 1 records: 201581 BW (loop: 1658 , e2e: 1658 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:34 28394 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 1
Constructor to loop: 460 usec , loop time: 122181 usec, bytes: 217859460 resized: 1 records: 199505 BW (loop: 1783 , e2e: 1776 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
17/02/08 11:32:34 28517 INFO ShuffleBlockFetcherIterator: >> after taking, the current result queue has : 0
Constructor to loop: 470 usec , loop time: 160663 usec, bytes: 218369424 resized: 1 records: 199972 BW (loop: 1359 , e2e: 1355 MB/sec ), perRowSz (approx) 1088 numFields: 1 ||
SortExec: shuffle iterator consumed in 1626059 usec
17/02/08 11:32:34 28892 INFO UnsafeInMemorySorter: animesh : SORT (useRadix = true) and time in usecs: 212620 numRecords: 2000128 arraySize: 8388608 nullBoundaryPos: 0 expanded: 0
17/02/08 11:32:35 29573 INFO FileFormatWriter: join iterator consumption took : 4677891 usec [1st: 3998983 + 678908 usec ], processed 18709
[INFO] closing Atr(Null)OutputWriter. initPause: 4007127.8 usec runTime: 679840.2 usec #InternalRow: 18709 time/row: 36337 nsec
17/02/08 11:32:35 29576 INFO SparkHadoopMapRedUtil: No need to commit output of task because needsTaskCommit=false: attempt_20170208113230_0002_m_000004_0
17/02/08 11:32:35 29587 INFO Executor: Finished task 4.0 in stage 2.0 (TID 24). 4342 bytes result sent to driver, e2etime 4979.999 msec, sparktime: 4900 msec
17/02/08 11:32:35 29587 INFO Executor:
-------------[ USERS: 0 ----
totalBytes: 4368168168 totalRecs: 4000154 sortTime: 522.3219 ms shuffleIteratorTime: 3472.7583 ms networkTime: 1051.9874 ms end2endTime: 4677.8916 ms
[ShuffleStats]: localBytes: 436959432 remoteBytes: 4136136912 localTime: 0 ms, networkTime: 22477 ms
-----------------------------------------------------
=====================================================
RESULTS: bytes: 4368168168 networkTime: 1051.9874 ms deSerTime: 2420.7708 ms joinTime: 682.8112 ms netBW 33.218403 Gbps networkRatio/end2end : 22.488497 %
=====================================================
17/02/08 11:32:44 38762 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown
17/02/08 11:32:44 38780 INFO CoarseGrainedExecutorBackend: Driver from 10.40.0.11:49221 disconnected during shutdown
17/02/08 11:32:44 38781 INFO CoarseGrainedExecutorBackend: Driver from 10.40.0.11:49221 disconnected during shutdown
17/02/08 11:32:45 39071 INFO MemoryStore: MemoryStore cleared
17/02/08 11:32:45 39072 INFO BlockManager: BlockManager stopped
17/02/08 11:32:45 39077 INFO ShutdownHookManager: Shutdown hook called
demo@flex13:~/userlogs$