SparkContext fails to start because the JVM did not get enough memory

When trying to run a program directly in Spark, I ran into the following error:

java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.

Clearly, the JVM did not get enough memory, so the SparkContext could not start. But how should the memory be set?

Reading the Spark source code, the relevant method looks like this:

    /**
     * Return the total amount of memory shared between execution and storage, in bytes.
     */
    private def getMaxMemory(conf: SparkConf): Long = {
      val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
      val reservedMemory = conf.getLong("spark.testing.reservedMemory",
        if (conf.contains("spark.testing")) 0 else RESERVED_SYSTEM_MEMORY_BYTES)
      val minSystemMemory = reservedMemory * 1.5
      if (systemMemory < minSystemMemory) {
        throw new IllegalArgumentException(s"System memory $systemMemory must " +
          s"be at least $minSystemMemory. Please use a larger heap size.")
      }
      val usableMemory = systemMemory - reservedMemory
      val memoryFraction = conf.getDouble("spark.memory.fraction", 0.75)
      (usableMemory * memoryFraction).toLong
    }
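As a quick sanity check of the numbers: RESERVED_SYSTEM_MEMORY_BYTES is hard-coded to 300 MB in the same file, which is exactly where the 471859200 in the error message comes from:

    val reservedMemory = 300L * 1024 * 1024    // RESERVED_SYSTEM_MEMORY_BYTES = 314572800 bytes
    val minSystemMemory = reservedMemory * 1.5 // 471859200.0, the minimum quoted in the error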


So the line that matters is val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory): the system memory defaults to the JVM's maximum heap size unless spark.testing.memory is set.

The definition of conf.getLong() is:

    getLong(key: String, defaultValue: Long): Long
    Get a parameter as a long, falling back to a default if not set
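In other words, the default only kicks in when the key is absent; once spark.testing.memory is set explicitly, that value is used instead. A minimal sketch of the fallback behavior (the numeric value is just an example):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
    // Key not set: getLong falls back to the JVM's max heap size
    val before = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
    // Key set explicitly: the configured value wins over the default
    conf.set("spark.testing.memory", "2147480000")
    val after = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory) // 2147480000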
So we should set spark.testing.memory in the conf.

Through experimentation, I found two places where it can be set:

1. In your own source code, add this right after creating the conf:

    val conf = new SparkConf().setAppName("word count")
    conf.set("spark.testing.memory", "2147480000") // any value above 512 MB (i.e. over the 471859200-byte minimum) works
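Put together, a minimal runnable driver might look like the sketch below; the local[*] master and the input path are assumptions for running inside the IDE:

    import org.apache.spark.{SparkConf, SparkContext}

    object WordCount {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("word count")
          .setMaster("local[*]") // assumption: running locally from the IDE
        conf.set("spark.testing.memory", "2147480000") // above the 471859200-byte minimum

        val sc = new SparkContext(conf)
        sc.textFile("input.txt") // hypothetical input file
          .flatMap(_.split("\\s+"))
          .map(word => (word, 1))
          .reduceByKey(_ + _)
          .collect()
          .foreach(println)
        sc.stop()
      }
    }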

2. In Eclipse's Run Configuration there is an Arguments tab with a VM arguments box; add the following line there (again, any value above 512 MB works):

-Dspark.testing.memory=1073741824

Other parameters can also be set dynamically in the same place, e.g. -Dspark.master=spark://hostname:7077.
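This works because the no-argument SparkConf constructor loads every JVM system property whose name starts with spark., so these -D flags are picked up without any code changes. A small sketch to verify (the flag values are just the examples above):

    import org.apache.spark.SparkConf

    object CheckConf {
      def main(args: Array[String]): Unit = {
        // Launch with e.g.: -Dspark.testing.memory=1073741824 -Dspark.master=spark://hostname:7077
        val conf = new SparkConf() // loadDefaults = true: reads spark.* system properties
        println(conf.getOption("spark.testing.memory")) // Some(1073741824) if the flag was passed
        println(conf.getOption("spark.master"))         // Some(spark://hostname:7077)
      }
    }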

With that in place, the program runs again without this error.
