Commit f305ebd

rgarciate authored and holdenk committed

Updates version on README.md to 0.7.4 (holdenk#207)

1 parent e7e7681 commit f305ebd

File tree

2 files changed: +6 −6 lines changed


README.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -12,10 +12,10 @@ This is not my beautiful code.
 
 ## How?
 
-So you include com.holdenkarau.spark-testing-base [spark_version]_0.7.2 and extend one of the classes and write some simple tests instead. For example to include this in a project using Spark 2.2.0:
+So you include com.holdenkarau.spark-testing-base [spark_version]_0.7.4 and extend one of the classes and write some simple tests instead. For example to include this in a project using Spark 2.2.0:
 
 ```scala
-"com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.7.2" % "test"
+"com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.7.4" % "test"
 ```
 
 or
@@ -24,7 +24,7 @@ or
 <dependency>
   <groupId>com.holdenkarau</groupId>
   <artifactId>spark-testing-base_2.11</artifactId>
-  <version>${spark.version}_0.7.2</version>
+  <version>${spark.version}_0.7.4</version>
   <scope>test</scope>
 </dependency>
 ```
````
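The snippets in the diff above only add the dependency; the README's advice is to "extend one of the classes and write some simple tests instead." As a hedged illustration of that, here is a minimal sketch using the library's `SharedSparkContext` trait with ScalaTest (the test class name and its body are hypothetical, and running it requires Spark and spark-testing-base on the test classpath):

```scala
import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

// Sketch only: SharedSparkContext supplies a SparkContext as `sc`,
// managed for the whole suite, so each test does not have to start
// and stop its own context.
class SimpleSumTest extends FunSuite with SharedSparkContext {
  test("summing a small RDD of ints") {
    val rdd = sc.parallelize(Seq(1, 2, 3))
    assert(rdd.sum() === 6.0)
  }
}
```

With the sbt line from the diff (`"com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.7.4" % "test"`) in place, a test like this runs under `sbt test`.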

RELEASE_NOTES.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -4,17 +4,17 @@
 - Re-add Scala 2.10 support up to and including Spark 2.2.X series
 - Attempt to make it so that users doing SQL tests without Hive don't need the hive jars.
 - Don't reset the SparkSession provider when in reuse mode.
-- Add workaround for inaccessiable active context info in Spark 2.0
+- Add workaround for inaccessible active context info in Spark 2.0
 - Upgrade to Hadoop 2.8.1 for mini cluster
 - Change build env after travis changes
 # 0.7.2
-- Add expiremental support to for reusing a SparkContext/Session accross multiple suites. For Spark 2.0+ only.
+- Add experimental support to for reusing a SparkContext/Session across multiple suites. For Spark 2.0+ only.
 # 0.7.1
 - Upgrade mini cluster hadoop dependencies
 - Add support for Spark 2.2.0
 - YARNCluster now requires SPARK_HOME to be set so as to configure spark.yarn.jars (workaround for YARN bug from deprecated code in Spark 2.2).
 # 0.7
-- Add Python RDD comparisions
+- Add Python RDD comparisons
 - Switch to JDK8 for Spark 2.1.1+
 - Add back Kafka tests
 - Make it easier to disable Hive support when running tests
```
