A PySpark project running locally requires the JAVA_HOME environment variable to be set. If you're using conda or anaconda-project to manage packages, you don't need to install the bloated Oracle JDK; just add the java-jdk package from the bioconda channel (Linux) or the cyclus channel (Linux and Windows). Then point JAVA_HOME at your conda environment's root, i.e. the directory whose bin folder contains java (java.exe on Windows), since Spark launches the JVM as JAVA_HOME/bin/java rather than treating JAVA_HOME itself as the executable's folder.
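Here is a minimal sketch of that setup, assuming java-jdk is already installed into the active conda env. The env-root lookup via CONDA_PREFIX and the app name are illustrative choices, and the exact layout of the JDK inside the env can vary by platform and package, so verify that JAVA_HOME/bin/java (or java.exe) actually exists in your environment:

```python
import os
import sys

# Assumes java-jdk was installed into the active env beforehand, e.g.:
#   conda install -c bioconda java-jdk   # Linux
#   conda install -c cyclus  java-jdk    # Linux and Windows

# Point JAVA_HOME at the env root (the folder that *contains* bin/java),
# not at the bin folder itself: Spark resolves $JAVA_HOME/bin/java.
env_root = os.environ.get("CONDA_PREFIX", sys.prefix)
os.environ["JAVA_HOME"] = env_root

from pyspark.sql import SparkSession

# Start a local Spark session to confirm the JVM is found.
spark = (SparkSession.builder
         .master("local[*]")          # run Spark locally on all cores
         .appName("java-home-check")  # illustrative app name
         .getOrCreate())
print(spark.version)
spark.stop()
```

If getOrCreate() fails with a "Java gateway process exited" error, the usual culprit is JAVA_HOME pointing at the wrong directory, so double-check where the conda package placed the java executable.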