John Snow Labs Spark-NLP is a natural language processing library built on top of Apache Spark ML. It provides simple, performant & accurate NLP annotations for machine learning pipelines that scale easily in a distributed environment.

Project's website

Take a look at our official Spark-NLP page for user documentation and examples.



This library has been uploaded to the spark-packages repository.

To use the most recent version, just add --packages JohnSnowLabs:spark-nlp:1.4.0 to your Spark command:

spark-shell --packages JohnSnowLabs:spark-nlp:1.4.0
pyspark --packages JohnSnowLabs:spark-nlp:1.4.0
spark-submit --packages JohnSnowLabs:spark-nlp:1.4.0
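Once a shell starts with the package on the classpath, a basic annotation pipeline can be sketched as below. This is a hedged sketch, not an official example: it assumes a running `spark` session (as provided by spark-shell), and the exact annotator set and column names may vary between library versions.

```scala
// Sketch: a minimal Spark-NLP pipeline, assuming spark-shell was launched
// with --packages JohnSnowLabs:spark-nlp:1.4.0 and `spark` is in scope.
import com.johnsnowlabs.nlp.DocumentAssembler
import com.johnsnowlabs.nlp.annotators.Tokenizer
import org.apache.spark.ml.Pipeline
import spark.implicits._

val data = Seq("Spark NLP annotates text at scale.").toDF("text")

// Wraps the raw text column into Spark-NLP's internal document annotation type
val documentAssembler = new DocumentAssembler()
  .setInputCol("text")
  .setOutputCol("document")

// Splits each document into token annotations
val tokenizer = new Tokenizer()
  .setInputCols(Array("document"))
  .setOutputCol("token")

// Annotators are regular Spark ML stages, so they compose into a Pipeline
val pipeline = new Pipeline().setStages(Array(documentAssembler, tokenizer))
val annotated = pipeline.fit(data).transform(data)
annotated.select("token").show(truncate = false)
```

Because every annotator is a Spark ML stage, the resulting pipeline distributes across a cluster exactly like any other Spark ML workload.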

If you want to use an old version, check the spark-packages website to see all releases.

Maven Central

Our package is deployed to Maven Central. To add this package as a dependency in your application:




libraryDependencies += "com.johnsnowlabs.nlp" % "spark-nlp_2.11" % "1.4.0"

If your project is built against Scala 2.11, you can instead let sbt append the Scala version suffix automatically:

libraryDependencies += "com.johnsnowlabs.nlp" %% "spark-nlp" % "1.4.0"
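Putting it together, a minimal build.sbt for a Scala 2.11 Spark application might look like the following sketch. The project name and Spark version are illustrative assumptions; spark-core and spark-mllib are the standard Apache Spark artifacts, marked "provided" since a cluster supplies them at runtime.

```scala
// Minimal build.sbt sketch -- name and version numbers are illustrative
name := "spark-nlp-example"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // Spark itself is typically "provided" when submitting to a cluster
  "org.apache.spark" %% "spark-core"  % "2.2.1" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.2.1" % "provided",
  "com.johnsnowlabs.nlp" %% "spark-nlp" % "1.4.0"
)
```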

Using the jar manually

If for some reason you need to use the jar directly, you can download it from the project's website.

From there you can use it in your project by setting the --classpath.

To add jars to Spark programs, use the --jars option:

spark-shell --jars spark-nlp.jar
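The same option works with spark-submit when packaging your own application. In this sketch, my-app.jar and com.example.Main are placeholders for your application jar and entry point:

```shell
# Sketch: submit an application with the manually downloaded jar on the classpath
# my-app.jar and com.example.Main are placeholders, not real artifacts
spark-submit \
  --class com.example.Main \
  --jars spark-nlp.jar \
  my-app.jar
```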

The preferred way to use the library when running Spark programs is the --packages option, as described in the spark-packages section above.


We appreciate any sort of contribution:

  • ideas
  • feedback
  • documentation
  • bug reports
  • NLP training and testing corpora
  • development and testing

Clone the repo and submit your pull requests, or create issues directly in this repo.


[email protected]

John Snow Labs