Commit e5949c2

[SPARK-6966][SQL] Use correct ClassLoader for JDBC Driver
Otherwise we cannot add jars with drivers after the fact.

Author: Michael Armbrust <[email protected]>

Closes apache#5543 from marmbrus/jdbcClassloader and squashes the following commits:

d9930f3 [Michael Armbrust] fix imports
73d0614 [Michael Armbrust] [SPARK-6966][SQL] Use correct ClassLoader for JDBC Driver
1 parent: 1e43851

File tree

1 file changed: +2 -1 lines changed


sql/core/src/main/scala/org/apache/spark/sql/jdbc/JDBCRelation.scala

Lines changed: 2 additions & 1 deletion
@@ -28,6 +28,7 @@ import org.apache.spark.sql.SQLContext
 import org.apache.spark.sql.catalyst.expressions.Row
 import org.apache.spark.sql.sources._
 import org.apache.spark.sql.types.StructType
+import org.apache.spark.util.Utils

 /**
  * Data corresponding to one partition of a JDBCRDD.
@@ -99,7 +100,7 @@ private[sql] class DefaultSource extends RelationProvider {
     val upperBound = parameters.getOrElse("upperBound", null)
     val numPartitions = parameters.getOrElse("numPartitions", null)

-    if (driver != null) Class.forName(driver)
+    if (driver != null) Utils.getContextOrSparkClassLoader.loadClass(driver)

     if (partitionColumn != null
       && (lowerBound == null || upperBound == null || numPartitions == null)) {
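
For background (not part of the commit): a minimal Scala sketch of why loading the driver through the context ClassLoader matters. Class.forName resolves against the loader that defined the calling class, which in Spark's case cannot see jars added after startup (for example via --jars or SparkContext.addJar), whereas the thread context ClassLoader, which Spark points at user-added jars, can. The object name and the driver class name below are illustrative only; the fallback logic merely mirrors the intent of Utils.getContextOrSparkClassLoader.

// Illustrative sketch only, not Spark source code.
object DriverLoadingSketch {
  // Prefer the thread context ClassLoader (set up to include user-added jars)
  // and fall back to this class's own loader if none is set.
  def loadDriver(driverClassName: String): Class[_] = {
    val loader = Option(Thread.currentThread().getContextClassLoader)
      .getOrElse(getClass.getClassLoader)
    loader.loadClass(driverClassName)
  }

  def main(args: Array[String]): Unit = {
    // "org.h2.Driver" is a placeholder; any driver class visible to the
    // context ClassLoader would work here.
    println(loadDriver("org.h2.Driver").getName)
  }
}

Because loadClass goes through an explicitly chosen loader rather than the caller's defining loader, a driver jar added after the SQLContext was created can still be resolved, which is exactly the scenario the commit message describes.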
