Commit 55e0ee2

build: remove scala 2.12 and mentions of spark 3 (#821)
1 parent: 5952424

60 files changed: +5,121 −5,120 lines


.husky/pre-commit

Lines changed: 2 additions & 0 deletions
@@ -2,3 +2,5 @@

 ./mvnw sortpom:sort spotless:apply -f .teamcity
 ./mvnw sortpom:sort spotless:apply
+
+git update-index --again

.snyk

Lines changed: 9 additions & 5 deletions
@@ -7,10 +7,14 @@ ignore:
         reason: Spark Core is provided dependency
         expires: 2050-01-01T00:00:00.000Z
         created: 2025-09-18T08:33:31.014
-    - 'org.apache.spark:spark-core_2.12':
         <<: *spark-core
-  'SNYK-JAVA-COMFASTERXMLJACKSONCORE-7569538': *spark-core-exclusions
-  'SNYK-JAVA-COMGOOGLEPROTOBUF-8055227': *spark-core-exclusions
-  'SNYK-JAVA-ORGAPACHEIVY-5847858': *spark-core-exclusions
-  'SNYK-JAVA-ORGAPACHEZOOKEEPER-5961102': *spark-core-exclusions
+  'SNYK-JAVA-ORGGLASSFISHJERSEYCORE-14049172': *spark-core-exclusions
+  'SNYK-JAVA-IOAIRLIFT-14412703': &spark-sql-exclusions
+    - 'org.apache.spark:spark-sql_2.13': &spark-sql
+        reason: Spark SQL is provided dependency
+        expires: 2050-01-01T00:00:00.000Z
+        created: 2025-09-18T08:35:12.345
+        <<: *spark-sql
+  'SNYK-JAVA-ORGLZ4-14151788': *spark-core-exclusions
+  'SNYK-JAVA-ORGLZ4-14219384': *spark-core-exclusions
 patch: {}

README.md

Lines changed: 8 additions & 10 deletions
@@ -10,35 +10,33 @@ This neo4j-connector-apache-spark is Apache 2 Licensed

 The documentation for Neo4j Connector for Apache Spark lives at https://github.com/neo4j/docs-spark repository.

-## Building for Spark 3
+## Building for Spark 4

-You can build for Spark 3.x with both Scala 2.12 and Scala 2.13
+You can build for Spark 4.x with Scala 2.13

 ```
-./maven-release.sh package 2.12
 ./maven-release.sh package 2.13
 ```

 These commands will generate the corresponding targets
-* `spark-3/target/neo4j-connector-apache-spark_2.12-<version>_for_spark_3.jar`
-* `spark-3/target/neo4j-connector-apache-spark_2.13-<version>_for_spark_3.jar`
+* `spark/target/neo4j-connector-apache-spark_2.13-<version>_for_spark_4.jar`


 ## Integration with Apache Spark Applications

 **spark-shell, pyspark, or spark-submit**

-`$SPARK_HOME/bin/spark-shell --jars neo4j-connector-apache-spark_2.12-<version>_for_spark_3.jar`
+`$SPARK_HOME/bin/spark-shell --jars neo4j-connector-apache-spark_2.12-<version>_for_spark_4.jar`

-`$SPARK_HOME/bin/spark-shell --packages org.neo4j:neo4j-connector-apache-spark_2.12:<version>_for_spark_3`
+`$SPARK_HOME/bin/spark-shell --packages org.neo4j:neo4j-connector-apache-spark_2.13:<version>_for_spark_4`

 **sbt**

 If you use the [sbt-spark-package plugin](https://github.com/databricks/sbt-spark-package), in your sbt build file, add:

 ```scala
 resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
-libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.12" % "<version>_for_spark_3"
+libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.13" % "<version>_for_spark_4"
 ```

 **maven**
@@ -50,8 +48,8 @@ In your pom.xml, add:
     <!-- list of dependencies -->
     <dependency>
         <groupId>org.neo4j</groupId>
-        <artifactId>neo4j-connector-apache-spark_2.12</artifactId>
-        <version>[version]_for_spark_3</version>
+        <artifactId>neo4j-connector-apache-spark_2.13</artifactId>
+        <version>[version]_for_spark_4</version>
     </dependency>
 </dependencies>
 ```
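
For context, a minimal read through the connector once one of the jars above is on the classpath might look like the sketch below. The connection URL, credentials, and the `Person` label are illustrative placeholders, not part of this commit; the `org.neo4j.spark.DataSource` format and the options shown are the connector's documented basics.

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: URL, credentials, and the Person label are placeholders.
val spark = SparkSession.builder()
  .appName("neo4j-connector-example")
  .getOrCreate()

val people = spark.read
  .format("org.neo4j.spark.DataSource")               // the connector's DataSource entry point
  .option("url", "neo4j://localhost:7687")
  .option("authentication.basic.username", "neo4j")
  .option("authentication.basic.password", "password")
  .option("labels", "Person")                          // read nodes carrying the :Person label
  .load()

people.show()
```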

common/LICENSES.txt

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ libraries. For an overview of the licenses see the NOTICE.txt file.

 ------------------------------------------------------------------------------
 Apache Software License, Version 2.0
-  IntelliJ IDEA Annotations
+  JetBrains Java Annotations
   Kotlin Stdlib
   Neo4j Bolt Connection (Bolt Provider reference impl)
   Neo4j Bolt Connection (Pooled Source impl)

common/NOTICE.txt

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ Third-party licenses
 --------------------

 Apache Software License, Version 2.0
-  IntelliJ IDEA Annotations
+  JetBrains Java Annotations
   Kotlin Stdlib
   Neo4j Bolt Connection (Bolt Provider reference impl)
   Neo4j Bolt Connection (Pooled Source impl)

common/src/main/scala/org/neo4j/spark/service/SchemaService.scala

Lines changed: 1 addition & 1 deletion
@@ -734,7 +734,7 @@ class SchemaService(
       tx => {
         tx.run(
           s"CREATE CONSTRAINT $constraintName IF NOT EXISTS FOR $asciiRepresentation REQUIRE ($props) IS $constraintType"
-        )
+        ).consume()
       },
       sessionTransactionConfig
     )
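
The added `.consume()` is the substantive change here: the Neo4j Java driver fetches query results lazily, and consuming the result inside the transaction work forces the `CREATE CONSTRAINT` statement to complete (and surface any server-side failure) right there, which is likely the motivation for this change. A minimal sketch of the same pattern outside the connector, with placeholder URI, credentials, and constraint:

```scala
import org.neo4j.driver.{AuthTokens, GraphDatabase}

// Sketch only: the URI, credentials, and constraint below are illustrative placeholders.
val driver  = GraphDatabase.driver("neo4j://localhost:7687", AuthTokens.basic("neo4j", "password"))
val session = driver.session()
try {
  val summary = session
    .run("CREATE CONSTRAINT person_name IF NOT EXISTS FOR (p:Person) REQUIRE p.name IS UNIQUE")
    .consume() // blocks until the statement has fully executed and returns its ResultSummary
  println(s"Constraints added: ${summary.counters().constraintsAdded()}")
} finally {
  session.close()
  driver.close()
}
```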

common/src/test/scala/org/neo4j/spark/util/ValidationsTest.scala

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ class ValidationsTest extends SparkConnectorScalaBaseTSE {
       .map { _.version }
       .getOrElse("UNKNOWN")
     try {
-      Validations.validate(ValidateSparkMinVersion("3.10000"))
+      Validations.validate(ValidateSparkMinVersion("4.10000"))
       fail(s"should be thrown a ${classOf[IllegalArgumentException].getName}")
     } catch {
       case e: IllegalArgumentException =>
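
The test bumps the deliberately impossible minimum version from `3.10000` to `4.10000` so it still fails against a Spark 4 runtime. As a rough illustration of the idea only (a hypothetical helper, not the connector's actual `ValidateSparkMinVersion` implementation), a minimum-version check can compare dotted version strings segment by segment and raise `IllegalArgumentException`:

```scala
// Hypothetical sketch for illustration; the connector's real validation may differ.
def requireSparkAtLeast(currentVersion: String, minVersion: String): Unit = {
  // Keep the leading numeric part of each dot-separated segment, e.g. "4.0.1-preview" -> Seq(4, 0, 1).
  def numericParts(v: String): Seq[Int] =
    v.split('.').toSeq.map(_.takeWhile(_.isDigit)).filter(_.nonEmpty).map(_.toInt)

  val current = numericParts(currentVersion)
  val minimum = numericParts(minVersion)

  // Compare segment by segment, padding the shorter version with zeros.
  val atLeast = current.zipAll(minimum, 0, 0)
    .find { case (c, m) => c != m }
    .forall { case (c, m) => c > m }

  if (!atLeast) {
    throw new IllegalArgumentException(
      s"Spark $currentVersion is below the required minimum version $minVersion"
    )
  }
}

// requireSparkAtLeast("4.0.1", "4.10000") throws IllegalArgumentException,
// which is the behaviour the test above asserts for ValidateSparkMinVersion("4.10000").
```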
