Commit 96fae31

SparkConnectPlugin and SparkConnectService
1 parent a9b10c5 commit 96fae31

File tree

2 files changed: +19 −10 lines changed


docs/server/SparkConnectPlugin.md

Lines changed: 5 additions & 3 deletions
````diff
@@ -1,8 +1,10 @@
 # SparkConnectPlugin
 
-`SparkConnectPlugin` is a Spark driver plugin (a `SparkPlugin` ([Apache Spark]({{ book.spark_core }}/plugins/SparkPlugin)) with the [driver-side component](#driverPlugin) only).
+`SparkConnectPlugin` is a Spark driver plugin that [starts a SparkConnectService](SparkConnectService.md#start) upon initialization.
 
-`SparkConnectPlugin` is the main entry point for Spark Connect in Apache Spark applications.
+`SparkConnectPlugin` is a `SparkPlugin` ([Spark Core]({{ book.spark_core }}/plugins/SparkPlugin)) with the [driver-side component](#driverPlugin) only.
+
+`SparkConnectPlugin` can be installed into Spark applications using `spark.plugins` ([Spark Core]({{ book.spark_core }}/configuration-properties/#spark.plugins)) configuration property (e.g., [Spark Connect Shell](../spark-connect-shell.md)).
 
 ## Driver-Side Component { #driverPlugin }
 
@@ -12,7 +14,7 @@
 driverPlugin(): DriverPlugin
 ```
 
-`driverPlugin` is part of the `SparkPlugin` ([Apache Spark]({{ book.spark_core }}/plugins/SparkPlugin/#driverPlugin)) abstraction.
+`driverPlugin` is part of the `SparkPlugin` ([Spark Core]({{ book.spark_core }}/plugins/SparkPlugin/#driverPlugin)) abstraction.
 
 `driverPlugin` creates a new `DriverPlugin` (Apache Spark) that does the following:
 
````
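As the diff above notes, the plugin is wired into a Spark application through the `spark.plugins` configuration property. A minimal sketch of doing that in `conf/spark-defaults.conf` (the fully-qualified class name `org.apache.spark.sql.connect.SparkConnectPlugin` is an assumption, not taken from the diff; verify it against your Spark version):

```text
# conf/spark-defaults.conf (sketch; class name assumed)
spark.plugins org.apache.spark.sql.connect.SparkConnectPlugin
```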

docs/server/SparkConnectService.md

Lines changed: 14 additions & 7 deletions
````diff
@@ -7,7 +7,7 @@
 * Apache Spark applications (as a [Spark driver plugin](SparkConnectPlugin.md#driverPlugin))
 * On command line as a [SparkConnectServer](SparkConnectServer.md) standalone application
 
-Right after `SparkConnectService` has been [started](#start), [a `SparkListenerConnectServiceStarted` event is posted](#postSparkConnectServiceStarted) with all the network connectivity information.
+Right after having been [started](#start), `SparkConnectService` [posts a SparkListenerConnectServiceStarted event](#postSparkConnectServiceStarted) with all the network connectivity information.
 
 ## Creating Instance
 
@@ -61,12 +61,6 @@ Configuration Property | Default Value
 
 `startGRPCService` [builds the server](#server) and starts it.
 
-If successful, prints out the following INFO message to the logs:
-
-```text
-Successfully started service [serviceName] on port [port].
-```
-
 ### createListenerAndUI { #createListenerAndUI }
 
 ```scala
@@ -139,3 +133,16 @@ postSparkConnectServiceStarted(): Unit
 `releaseSession` is generated by the gRPC proto compiler from `spark/connect/base.proto`.
 
 `releaseSession` creates a [SparkConnectReleaseSessionHandler](SparkConnectReleaseSessionHandler.md) to [handle](SparkConnectReleaseSessionHandler.md#handle) the `ReleaseSessionRequest` request.
+
+## Logging
+
+Enable `ALL` logging level for `org.apache.spark.sql.connect.service.SparkConnectService` logger to see what happens inside.
+
+Add the following line to `conf/log4j2.properties`:
+
+```text
+logger.SparkConnectService.name = org.apache.spark.sql.connect.service.SparkConnectService
+logger.SparkConnectService.level = all
+```
+
+Refer to [Logging](../logging.md).
````
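Beyond logging, a quick way to observe `SparkConnectService` in action is to run it standalone. The command below is a sketch based on the scripts shipped with a Spark distribution (the `--packages` coordinates are an assumption; match them to your Spark and Scala versions):

```text
# Sketch: start the standalone Spark Connect server (listens on port 15002 by default)
./sbin/start-connect-server.sh \
  --packages org.apache.spark:spark-connect_2.13:3.5.1
```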
