Spark java.lang.OutOfMemoryError: Java heap space

My cluster: 1 master, 11 slaves; each node has 6 GB of memory.

My settings:

    spark.executor.memory=4g
    -Dspark.akka.frameSize=512

Here is the problem:

First, I read some data (2.19 GB) from HDFS into an RDD:

    val imageBundleRDD = sc.newAPIHadoopFile(…)

Second, do something on this RDD:

    val res = imageBundleRDD.map(data => {
      val desPoints = threeDReconstruction(data._2, bg)
      (data._1, desPoints)
    })

…
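For completeness, a minimal self-contained sketch of the setup above (the input path, the key/value/InputFormat types, and the bodies of threeDReconstruction and bg are simplified placeholders standing in for my real code):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.hadoop.io.{BytesWritable, Text}
    import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat

    object ImageBundleJob {
      // Placeholder stand-in for the real reconstruction step.
      def threeDReconstruction(img: BytesWritable, bg: Array[Byte]): Array[Double] =
        Array.empty[Double]

      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("ImageBundleReconstruction")
          // The two settings from above: 4 GB per executor on 6 GB nodes,
          // and a larger frame size for the (old, Akka-based) RPC layer.
          .set("spark.executor.memory", "4g")
          .set("spark.akka.frameSize", "512")
        val sc = new SparkContext(conf)

        val bg: Array[Byte] = Array.emptyByteArray // placeholder background data

        // Read the 2.19 GB input from HDFS as (key, value) records;
        // the path and input format here are illustrative only.
        val imageBundleRDD = sc.newAPIHadoopFile[Text, BytesWritable,
          SequenceFileInputFormat[Text, BytesWritable]]("hdfs:///data/imageBundles")

        // The transformation that is running when the OOM occurs.
        val res = imageBundleRDD.map(data => {
          val desPoints = threeDReconstruction(data._2, bg)
          (data._1, desPoints)
        })
        res.count() // force evaluation
        sc.stop()
      }
    }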