
stop/start does not work #37

Open
parente opened this issue May 24, 2017 · 2 comments

parente (Contributor) commented May 24, 2017

For my spylon notebook I:

  1. Called spark.stop()
  2. Did not restart the notebook kernel
  3. Re-ran %%init_spark and referenced spark to start a new Spark application (roughly the cell sequence sketched below)
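
A sketch of the cell sequence, not the exact notebook contents (the %%init_spark configuration body is omitted):

// Cell 1: stop the current Spark application; the kernel itself keeps running
spark.stop()

// Cell 2: re-run the configuration cell magic (configuration body omitted here)
%%init_spark

// Cell 3: reference spark again so the kernel starts a new Spark application
spark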

What I found was that most operations still work, such as reading datasets through the sparkSession and showing them.
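
For example, something along these lines still ran fine (the path is illustrative, not my actual dataset):

// Reading and showing a DataFrame through the sparkSession still works
val df = sparkSession.read.parquet("/tmp/example/dataset")  // illustrative path
df.show(5)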

However, when I tried to use the sparkContext, it reports that it has been stopped. Here is the code I was running and the error:

// Broadcast the item ids from the trained ALS model's itemFactors
val bRetailersList = (sparkSession.sparkContext
                      .broadcast(trainedModel.itemFactors.select("id")
                                 .rdd.map(x => x(0).asInstanceOf[Int]).collect)
                      )

java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:236)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.GatewayConnection.run(GatewayConnection.java:214)
java.lang.Thread.run(Thread.java:745)

The currently active SparkContext was created at:

org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
org.apache.spark.ml.util.BaseReadWrite$class.sparkSession(ReadWrite.scala:69)
org.apache.spark.ml.util.MLReader.sparkSession(ReadWrite.scala:189)
org.apache.spark.ml.util.BaseReadWrite$class.sc(ReadWrite.scala:80)
org.apache.spark.ml.util.MLReader.sc(ReadWrite.scala:189)
org.apache.spark.ml.recommendation.ALSModel$ALSModelReader.load(ALS.scala:317)
org.apache.spark.ml.recommendation.ALSModel$ALSModelReader.load(ALS.scala:311)
org.apache.spark.ml.util.MLReadable$class.load(ReadWrite.scala:227)
org.apache.spark.ml.recommendation.ALSModel$.load(ALS.scala:297)
<init>(<console>:53)
<init>(<console>:58)
<init>(<console>:60)
<init>(<console>:62)
<init>(<console>:64)
<init>(<console>:66)
<init>(<console>:68)
<init>(<console>:70)
<init>(<console>:72)
<init>(<console>:74)
<init>(<console>:76)

  at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:101)
  at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:80)
  at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:77)
  ... 44 elided
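
My reading of the trace is that the reference I am holding still points at the SparkContext that was stopped, while a different, currently active context exists. As a rough, unverified sketch of a workaround (not tested against spylon's wiring), re-resolving the active session instead of reusing the old reference would look like this:

import org.apache.spark.sql.SparkSession

// Re-resolve whatever session is currently active rather than reusing
// the reference obtained before spark.stop()
val activeSession = SparkSession.builder().getOrCreate()
val sc = activeSession.sparkContext

// Same broadcast as above, but via the freshly resolved context
val bRetailersList = sc.broadcast(
  trainedModel.itemFactors.select("id")
    .rdd.map(x => x(0).asInstanceOf[Int]).collect)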
parente added the bug label May 24, 2017
mariusvniekerk (Collaborator) commented
What is mp.sparkSession?

parente (Contributor, Author) commented May 24, 2017

Sorry, this was transferred from vericast/spylon#45. It's a SparkSession.
