fix back end spelling #344

Open
wants to merge 7 commits into base: main
2 changes: 1 addition & 1 deletion README.md
@@ -156,7 +156,7 @@ We welcome contributions! Please review our [contribution guide](CONTRIBUTING.md)

This project would not have been possible without the outstanding work from the following communities:

- - [Apache Spark](https://spark.apache.org/): Unified Analytics Engine for Big Data, the underlying backend execution engine for .NET for Apache Spark
+ - [Apache Spark](https://spark.apache.org/): Unified Analytics Engine for Big Data, the underlying back-end execution engine for .NET for Apache Spark
- [Mobius](https://github.com/Microsoft/Mobius): C# and F# language bindings and extensions to Apache Spark, a precursor project to .NET for Apache Spark from the same Microsoft group.
- [PySpark](https://spark.apache.org/docs/latest/api/python/index.html): Python bindings for Apache Spark, one of the implementations .NET for Apache Spark derives inspiration from.
- [sparkR](https://spark.apache.org/docs/latest/sparkr.html): one of the implementations .NET for Apache Spark derives inspiration from.
2 changes: 1 addition & 1 deletion deployment/README.md
@@ -35,7 +35,7 @@ Deploying your App on the Cloud
```

# Preparing Worker Dependencies
- Microsoft.Spark.Worker is a backend component that lives on the individual worker nodes of your Spark cluster. When you want to execute a C# UDF (user-defined function), Spark needs to understand how to launch the .NET CLR to execute this UDF. Microsoft.Spark.Worker provides a collection of classes to Spark that enable this functionality.
+ Microsoft.Spark.Worker is a back-end component that lives on the individual worker nodes of your Spark cluster. When you want to execute a C# UDF (user-defined function), Spark needs to understand how to launch the .NET CLR to execute this UDF. Microsoft.Spark.Worker provides a collection of classes to Spark that enable this functionality.

## Microsoft.Spark.Worker
1. Select a [Microsoft.Spark.Worker](https://github.com/dotnet/spark/releases) Linux netcoreapp release to be deployed on your cluster.
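For context on the passage this hunk touches: the worker deployment described above exists so that C# lambdas can run on executors. A minimal sketch of such a UDF, based on the public Microsoft.Spark API (names like `udf-example` and the column alias are illustrative, not taken from this PR):

```csharp
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

class UdfExample
{
    static void Main()
    {
        // Driver-side session; Microsoft.Spark.Worker is only involved
        // on the executor nodes, where it launches the .NET CLR.
        SparkSession spark = SparkSession
            .Builder()
            .AppName("udf-example")
            .GetOrCreate();

        // range() produces a single bigint column named "id".
        DataFrame df = spark.Range(0, 5);

        // The C# lambda below is what the worker component executes on
        // each executor when the query runs.
        var square = Udf<long, long>(x => x * x);

        df.Select(square(df["id"]).Alias("id_squared")).Show();

        spark.Stop();
    }
}
```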
2 changes: 1 addition & 1 deletion docs/developer-guide.md
@@ -16,7 +16,7 @@ spark-submit \
<path-to-microsoft-spark-jar> \
debug
```
- and you will see the followng output:
+ and you will see the following output:
```
***********************************************************************
* .NET Backend running debug mode. Press enter to exit *
```
4 changes: 2 additions & 2 deletions src/csharp/Microsoft.Spark.E2ETest/SparkFixture.cs
@@ -41,7 +41,7 @@ public class EnvironmentVariableNames

private readonly Process _process = new Process();
private readonly TemporaryDirectory _tempDirectory = new TemporaryDirectory();

public const string DefaultLogLevel = "ERROR";

internal SparkSession Spark { get; }
@@ -110,7 +110,7 @@ public SparkFixture()
.Config("spark.ui.showConsoleProgress", false)
.AppName("Microsoft.Spark.E2ETest")
.GetOrCreate();

Spark.SparkContext.SetLogLevel(DefaultLogLevel);

Jvm = Spark.Reference.Jvm;
@@ -12,7 +12,7 @@ namespace Microsoft.Spark.Services
{
/// <summary>
/// Implementation of configuration service that helps getting config settings
-    /// to be used in .NET backend.
+    /// to be used in .NET back end.
/// </summary>
internal sealed class ConfigurationService : IConfigurationService
{
@@ -17,7 +17,7 @@ internal interface IConfigurationService
TimeSpan JvmThreadGCInterval { get; }

/// <summary>
-    /// The port number used for communicating with the .NET backend process.
+    /// The port number used for communicating with the .NET back-end process.
/// </summary>
int GetBackendPortNumber();

@@ -112,7 +112,7 @@ object DotnetRunner extends Logging {
val dotnetBackendThread = new Thread("DotnetBackend") {
override def run() {
// need to get back dotnetBackendPortNumber because if the value passed to init is 0
-        // the port number is dynamically assigned in the backend
+        // the port number is dynamically assigned in the back end
dotnetBackendPortNumber = dotnetBackend.init(dotnetBackendPortNumber)
logInfo(s"Port number used by DotnetBackend is $dotnetBackendPortNumber")
initialized.release()
@@ -112,7 +112,7 @@ object DotnetRunner extends Logging {
val dotnetBackendThread = new Thread("DotnetBackend") {
override def run() {
// need to get back dotnetBackendPortNumber because if the value passed to init is 0
-        // the port number is dynamically assigned in the backend
+        // the port number is dynamically assigned in the back end
dotnetBackendPortNumber = dotnetBackend.init(dotnetBackendPortNumber)
logInfo(s"Port number used by DotnetBackend is $dotnetBackendPortNumber")
initialized.release()
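The comment touched by the hunks above notes that passing 0 to `init` means the port is assigned dynamically, so the caller must read the port back after binding. A minimal C# sketch of that general mechanism (this is standard socket behavior, not the project's actual `DotnetBackend` code; the class name is hypothetical):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class DynamicPortDemo
{
    static void Main()
    {
        // Binding to port 0 asks the OS to pick any free port,
        // analogous to calling dotnetBackend.init(0) above.
        var listener = new TcpListener(IPAddress.Loopback, 0);
        listener.Start();

        // The assigned port is only known after the bind succeeds,
        // which is why the value has to be reported back to the caller.
        int assignedPort = ((IPEndPoint)listener.LocalEndpoint).Port;
        Console.WriteLine($"Bound to dynamically assigned port {assignedPort}");

        listener.Stop();
    }
}
```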