in Big Data Hadoop & Spark by (11.5k points)

I am using Spark and my programs run fine from the Spark prompt via pyspark scripts, but various log messages keep popping up on the Spark shell and I don't want them.

I read on a website that this can be controlled by editing the log4j.properties file, but I cannot figure out how to stop all of the verbose INFO logging that appears after each command.

Here are the contents of log4j.properties:

# Define the root logger with a console appender
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Settings to quiet third party logs that are too verbose
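For reference, Spark's bundled log4j.properties.template fills in this section with per-logger overrides along the following lines (the exact logger names vary between Spark versions, so treat these as examples rather than a definitive list):

```properties
# Quiet noisy third-party and REPL loggers (names taken from
# Spark's log4j.properties.template; adjust for your version)
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
```

Lines like these let you keep the root logger at WARN while silencing specific packages even further.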

1 Answer


If you don't want to see any log messages at all, start the Spark shell and run these commands:

import org.apache.log4j.Logger
import org.apache.log4j.Level

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)
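Since the question mentions pyspark: in the PySpark shell the simplest equivalent is sc.setLogLevel("OFF") on the existing SparkContext. The underlying idea, silencing an entire logger hierarchy by its name prefix, can be sketched with Python's standard logging module (plain Python, not Spark-specific; the logger names here are only illustrative):

```python
import logging

# Silence the whole "org" logger hierarchy, analogous to
# Logger.getLogger("org").setLevel(Level.OFF) in log4j.
logging.getLogger("org").setLevel(logging.CRITICAL + 1)

# Child loggers such as "org.apache.spark" inherit the effective
# level from "org", so all of their messages are suppressed.
spark_logger = logging.getLogger("org.apache.spark")
print(spark_logger.isEnabledFor(logging.ERROR))  # False: even ERROR is off

# Loggers outside the "org" hierarchy are unaffected.
print(logging.getLogger("myapp").isEnabledFor(logging.ERROR))  # True
```

Because log4j (and Python's logging) resolve levels hierarchically, setting a level on "org" covers "org.apache.spark", "org.apache.hadoop", and every other child logger in one line.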


If you only want to suppress the INFO messages, go to the log4j.properties file in the conf folder and change this single line:

log4j.rootCategory=INFO, console

to:

log4j.rootCategory=ERROR, console
