
Answer by Heapify for Spark Error - Unsupported class file major version

Just wanted to add my two cents here, as it may save several hours for people who are using PyCharm (especially its run configurations). After changing your .bashrc or .bash_profile to point to Java 8 by modifying the JAVA_HOME and PATH environment variables (as most answers here recommend), you'll notice that Spark still doesn't pick up the right Java when launched from a PyCharm run configuration. There appears to be an issue with PyCharm itself (I'm using PyCharm Professional 2020.2 on macOS Catalina): running the same code from PyCharm's integrated terminal works fine, which confirms the problem lies with PyCharm. To get the run configuration to pick up the new Java, I had to explicitly add the JAVA_HOME environment variable in the run configuration, as shown below:

(Screenshot: the PyCharm run configuration with JAVA_HOME added under Environment Variables.)

and it worked!
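As an alternative to editing each run configuration by hand, the same effect can be achieved from inside the script itself, before Spark starts the JVM. This is only a sketch: the Java 8 path below is an assumption and will differ on your machine (on macOS, `/usr/libexec/java_home -v 1.8` prints the real one).

```python
import os

# Example Java 8 location -- an assumption; replace with the path on
# your machine (on macOS, `/usr/libexec/java_home -v 1.8` prints it).
java8_home = "/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home"

# Set JAVA_HOME and prepend its bin/ to PATH *before* creating the
# SparkSession, so the JVM that Spark launches is the Java 8 one
# regardless of what PyCharm's run configuration inherited.
os.environ["JAVA_HOME"] = java8_home
os.environ["PATH"] = os.path.join(java8_home, "bin") + os.pathsep + os.environ["PATH"]

# Only now start Spark, e.g.:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.getOrCreate()
```

Because `os.environ` changes only affect the current process and its children, this works even when PyCharm's own environment still points at a newer Java.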

Another option that also works is checking the Include system environment variables option in the Environment Variables window of the run configuration (see the screenshot above) and restarting PyCharm.
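For the Include system environment variables route to work, JAVA_HOME has to actually be exported in your shell profile. A minimal sketch of the relevant lines for ~/.bash_profile or ~/.bashrc follows; the Java 8 path is an example and will differ per machine.

```shell
# Example Java 8 path -- an assumption; adjust for your machine.
# (On macOS, `/usr/libexec/java_home -v 1.8` prints the real one.)
export JAVA_HOME="/Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home"

# Put Java 8's binaries ahead of any other java on the PATH.
export PATH="$JAVA_HOME/bin:$PATH"
```

After editing the profile, open a new terminal (or `source` the file) so the exports take effect before restarting PyCharm.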


