
I would like to make some modifications to the Scala code of spark.ml.classification.LogisticRegression without having to rebuild the whole of Spark. Since we can append jar files to a spark-submit or pyspark run, is it possible to compile a modified copy of LogisticRegression.scala and override Spark's default methods, or at least add new ones? Thanks.

1 Answer


1. Create a new class.

2. Extend org.apache.spark.ml.classification.LogisticRegression.

3. Override the respective methods, without modifying the Spark source code, as shown below.

import org.apache.spark.ml.classification.LogisticRegression

class CustomLogisticRegression extends LogisticRegression {
  override def toString(): String = "This is overridden Logistic Regression Class"
}
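The question also asks about adding new methods rather than only overriding existing ones; the same subclass can carry them. A minimal sketch, where summarize is a hypothetical method name (not part of Spark's API) and the getters come from Spark's shared param traits:

class CustomLogisticRegression extends LogisticRegression {
  override def toString(): String = "This is overridden Logistic Regression Class"

  // Hypothetical new method: report the key hyperparameters in one string
  def summarize(): String =
    s"maxIter=${getMaxIter}, regParam=${getRegParam}, elasticNetParam=${getElasticNetParam}"
}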

Running logistic regression with the new CustomLogisticRegression class:

import org.apache.spark.mllib.util.MLUtils

val data = sqlCtx.createDataFrame(
  MLUtils.loadLibSVMFile(sc, "/opt/spark/spark-1.5.2-bin-hadoop2.6/data/mllib/sample_libsvm_data.txt"))

val customLR = new CustomLogisticRegression()
  .setMaxIter(10)
  .setRegParam(0.3)
  .setElasticNetParam(0.8)
val customLRModel = customLR.fit(data)

val originalLR = new LogisticRegression()
  .setMaxIter(10)
  .setRegParam(0.3)
  .setElasticNetParam(0.8)
val originalLRModel = originalLR.fit(data)

// Print the intercept of each logistic regression model
println(s"Custom Class's Intercept: ${customLRModel.intercept}")
println(s"Original Class's Intercept: ${originalLRModel.intercept}")
println(customLR.toString())
println(originalLR.toString())

Output:

Custom Class's Intercept: 0.22456315961250317
Original Class's Intercept: 0.22456315961250317
This is overridden Logistic Regression Class
logreg_1cd811a145d7
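To use such a subclass from spark-submit or spark-shell, as the question suggests, compile it into its own jar and add that jar to the classpath with --jars; no rebuild of Spark is needed. A rough sketch, where custom-lr.jar, com.example.MyApp, and my-app.jar are hypothetical names:

spark-shell --jars custom-lr.jar
spark-submit --class com.example.MyApp --jars custom-lr.jar my-app.jar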
