Intellipaat

0 votes
in Big Data Hadoop & Spark by (11.4k points)

I am using Spark and I can run my programs smoothly from the Spark prompt using pyspark scripts, but various log messages keep popping up on the Spark shell and I don't want that.

I read on a website that this can be suppressed by editing a configuration file, but I can't figure out how to stop all of the verbose INFO logging after each command.

Here are the contents of my conf/log4j.properties file:

# Define the root logger with appender file
log4j.rootCategory=WARN, console
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Settings to quiet third party logs that are too verbose
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO

1 Answer

0 votes
by (32.3k points)

If you don't want to see any log messages at all, start the Spark shell and run these commands:

import org.apache.log4j.Logger
import org.apache.log4j.Level

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)
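The mechanism here is simple level filtering: once a logger's level is raised, anything logged below that level is dropped. As an analogy only (this is Python's standard logging module, not Spark's log4j), the same effect can be sketched like this:

```python
import logging
from io import StringIO

# Capture log output in a buffer so the filtering effect is visible.
buffer = StringIO()
handler = logging.StreamHandler(buffer)
logger = logging.getLogger("org.example")  # hypothetical logger name
logger.addHandler(handler)

logger.setLevel(logging.INFO)
logger.info("verbose INFO message")            # emitted: INFO >= INFO

logger.setLevel(logging.ERROR)                 # analogous to rootCategory=ERROR
logger.info("this INFO message is suppressed") # dropped: INFO < ERROR
logger.error("errors still get through")       # emitted: ERROR >= ERROR

print(buffer.getvalue())
```

Setting the level to OFF, as in the Scala commands above, simply raises the threshold so high that nothing passes.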


If you only want to suppress the INFO messages, open the log4j.properties file in the conf folder and change this single line:

log4j.rootCategory=INFO, console

to:

log4j.rootCategory=ERROR, console
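Putting it together, the console section of conf/log4j.properties would look roughly like this (a sketch based on Spark's default log4j.properties template; only the rootCategory line is changed):

```
# Set everything to be logged to the console at ERROR level only
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Alternatively, from pyspark you can change the level at runtime without editing any file, using sc.setLogLevel("ERROR") on the SparkContext (available since Spark 1.4).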
