First, download a compiled Spark package from the official Spark download page and invoke spark-shell with a compiled Hivemall binary:

$ spark-shell --jars target/hivemall-all-<version>-incubating-SNAPSHOT.jar

Installation via Spark Packages

Alternatively, you can install Hivemall via the --packages option:

$ spark-shell --packages org.apache.hivemall:hivemall-all:<version>

Available Hivemall versions are listed on the Maven repository.
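For example, to pin a concrete release (the version below is an assumption for illustration; check the Maven repository for the releases that actually exist):

$ spark-shell --packages org.apache.hivemall:hivemall-all:0.6.0-incubating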


If you would like to try Hivemall functions on the latest Spark release, simply run bin/spark-shell from a Hivemall package. This command automatically downloads the latest Spark version, compiles Hivemall against it, and invokes spark-shell with the compiled Hivemall binary.

Then, load the scripts that define the Hivemall functions:

scala> :load resources/ddl/define-all.spark
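Once the script completes, the Hivemall functions are registered in the session and can be called through Spark SQL. As a quick sanity check, assuming the script registers the hivemall_version() function (as in recent Hivemall releases):

scala> spark.sql("SELECT hivemall_version()").show()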
