Add WordCount template sample #1469

Merged
merged 1 commit on Jun 18, 2019
Add WordCount template sample
davidcavazos committed Jun 17, 2019
commit c682b3fb6d0fc24624691be9d436cb57ae6ff777
152 changes: 152 additions & 0 deletions dataflow/templates/README.md
@@ -0,0 +1,152 @@
# Cloud Dataflow Templates

Samples showing how to create and run an [Apache Beam] template on [Google Cloud Dataflow].

## Before you begin

1. Install the [Cloud SDK].

1. [Create a new project].

1. [Enable billing].

1. [Enable the APIs](https://console.cloud.google.com/flows/enableapi?apiid=dataflow,compute_component,logging,storage_component,storage_api,bigquery,pubsub,datastore.googleapis.com,cloudresourcemanager.googleapis.com): Dataflow, Compute Engine, Stackdriver Logging, Cloud Storage, Cloud Storage JSON, BigQuery, Pub/Sub, Datastore, and Cloud Resource Manager.

1. Set up the Cloud SDK for your GCP project.

   ```bash
   gcloud init
   ```

1. [Create a service account key] as a JSON file.
   For more information, see [Creating and managing service accounts].

   * From the **Service account** list, select **New service account**.
   * In the **Service account name** field, enter a name.
   * From the **Role** list, select **Project > Owner**.

     > **Note**: The **Role** field authorizes your service account to access resources.
     > You can view and change this field later by using the [GCP Console IAM page].
     > If you are developing a production app, specify more granular permissions than **Project > Owner**.
     > For more information, see [Granting roles to service accounts].

   * Click **Create**. A JSON file that contains your key downloads to your computer.

1. Set your `GOOGLE_APPLICATION_CREDENTIALS` environment variable to point to your service account key file.

   ```bash
   export GOOGLE_APPLICATION_CREDENTIALS=path/to/your/credentials.json
   ```

1. Create a Cloud Storage bucket.

   ```bash
   gsutil mb gs://your-gcs-bucket
   ```

## Setup

The following instructions will help you prepare your development environment.

1. Download and install the [Java Development Kit (JDK)].
   Verify that the [JAVA_HOME] environment variable is set and points to your JDK installation.

1. Download and install [Apache Maven] by following the [Maven installation guide] for your specific operating system.

1. Clone the `java-docs-samples` repository.

   ```bash
   git clone https://github.com/GoogleCloudPlatform/java-docs-samples.git
   ```

1. Navigate to the sample code directory.

   ```bash
   cd java-docs-samples/dataflow/templates
   ```

## Templates

### WordCount

* [WordCount.java](src/main/java/com/example/dataflow/templates/WordCount.java)
* [WordCount_metadata](WordCount_metadata)
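
The linked `WordCount.java` is the sample's actual source and is not reproduced in this diff. As a rough sketch of the pattern such a template follows: the runtime parameters declared in `WordCount_metadata` (`inputFile`, `outputBucket`) are exposed as `ValueProvider` options so they can be supplied each time the staged template is executed, while `--isCaseSensitive` is a creation-time option baked in when the template is created. The class name, transforms, and output path below are illustrative assumptions, not the sample's exact code, and the `withSubstring` filter is omitted for brevity.

```java
package com.example.dataflow.templates;

import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.options.ValueProvider.NestedValueProvider;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

/** Hypothetical sketch of a templatable word count pipeline; not the sample's exact code. */
public class WordCountSketch {

  /**
   * Runtime parameters (the ones listed in WordCount_metadata) are declared as ValueProvider
   * so they can be supplied when the staged template is executed. Creation-time options such
   * as isCaseSensitive are ordinary values fixed when the template is created.
   */
  public interface Options extends PipelineOptions {
    @Description("Google Cloud Storage file pattern glob of the file(s) to read from.")
    @Default.String("gs://apache-beam-samples/shakespeare/kinglear.txt")
    ValueProvider<String> getInputFile();
    void setInputFile(ValueProvider<String> value);

    @Description("Google Cloud Storage bucket to store the outputs.")
    ValueProvider<String> getOutputBucket();
    void setOutputBucket(ValueProvider<String> value);

    @Description("Whether the word count should be case sensitive.")
    @Default.Boolean(true)
    boolean getIsCaseSensitive();
    void setIsCaseSensitive(boolean value);
  }

  public static void main(String[] args) {
    Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
    boolean caseSensitive = options.getIsCaseSensitive(); // fixed at template creation time

    Pipeline pipeline = Pipeline.create(options);
    pipeline
        .apply("Read lines", TextIO.read().from(options.getInputFile()))
        .apply("Split into words", FlatMapElements.into(TypeDescriptors.strings())
            .via((String line) -> Arrays.asList(line.split("[^\\p{L}]+"))))
        .apply("Drop empty words", Filter.by((String word) -> !word.isEmpty()))
        .apply("Normalize case", MapElements.into(TypeDescriptors.strings())
            .via((String word) -> caseSensitive ? word : word.toLowerCase()))
        .apply("Count words", Count.perElement())
        .apply("Format results", MapElements.into(TypeDescriptors.strings())
            .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
        // The output path is an illustrative guess based on the Cleanup section below.
        .apply("Write counts", TextIO.write().to(NestedValueProvider.of(
            options.getOutputBucket(),
            (String bucket) -> "gs://" + bucket + "/dataflow/wordcount/outputs")));

    pipeline.run();
  }
}
```

With this shape, `inputFile`, `outputBucket`, and `withSubstring` can be set per job with `gcloud dataflow jobs run --parameters`, while `--isCaseSensitive` must be chosen when the template is created.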

First, select the project and template location.

```bash
PROJECT=$(gcloud config get-value project)
BUCKET=your-gcs-bucket
TEMPLATE_LOCATION=gs://$BUCKET/dataflow/templates/WordCount
```

Then, create the template in the desired Cloud Storage location.

```bash
# Create the template.
mvn compile exec:java \
  -Dexec.mainClass=com.example.dataflow.templates.WordCount \
  -Dexec.args="\
    --isCaseSensitive=false \
    --project=$PROJECT \
    --templateLocation=$TEMPLATE_LOCATION \
    --runner=DataflowRunner"

# Upload the metadata file.
gsutil cp WordCount_metadata "$TEMPLATE_LOCATION"_metadata
```

> For more information, see [Creating templates].

Finally, you can run the template via `gcloud` or through the [GCP Console create Dataflow job page].

```bash
JOB_NAME=wordcount-$(date +'%Y%m%d-%H%M%S')
INPUT=gs://apache-beam-samples/shakespeare/kinglear.txt

gcloud dataflow jobs run $JOB_NAME \
  --gcs-location $TEMPLATE_LOCATION \
  --parameters inputFile=$INPUT,outputBucket=$BUCKET
```

> For more information, see [Executing templates].

You can check your submitted jobs in the [GCP Console Dataflow page].
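
As an alternative to `gcloud`, a staged template can also be launched programmatically through the Dataflow API's `projects.templates.launch` method. The sketch below is a hypothetical, minimal example using the generated Java API client; it assumes you add the `com.google.apis:google-api-services-dataflow`, `google-api-client`, and `google-auth-library-oauth2-http` dependencies (not part of this sample's `pom.xml`), and the project, bucket, and job names are placeholders.

```java
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.dataflow.Dataflow;
import com.google.api.services.dataflow.model.LaunchTemplateParameters;
import com.google.api.services.dataflow.model.LaunchTemplateResponse;
import com.google.api.services.dataflow.model.RuntimeEnvironment;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

/** Hypothetical example: launch the staged WordCount template via the Dataflow API client. */
public class LaunchWordCountTemplate {
  public static void main(String[] args) throws Exception {
    String project = "your-project-id";                                          // placeholder
    String templateLocation = "gs://your-gcs-bucket/dataflow/templates/WordCount"; // placeholder
    String jobName = "wordcount-launched-from-java";

    // Uses Application Default Credentials (GOOGLE_APPLICATION_CREDENTIALS).
    GoogleCredentials credentials = GoogleCredentials.getApplicationDefault()
        .createScoped(Collections.singletonList("https://www.googleapis.com/auth/cloud-platform"));

    Dataflow dataflow = new Dataflow.Builder(
            GoogleNetHttpTransport.newTrustedTransport(),
            JacksonFactory.getDefaultInstance(),
            new HttpCredentialsAdapter(credentials))
        .setApplicationName("wordcount-template-launcher")
        .build();

    // Runtime parameters, matching the names declared in WordCount_metadata.
    Map<String, String> parameters = new HashMap<>();
    parameters.put("inputFile", "gs://apache-beam-samples/shakespeare/kinglear.txt");
    parameters.put("outputBucket", "your-gcs-bucket");

    LaunchTemplateParameters launch = new LaunchTemplateParameters()
        .setJobName(jobName)
        .setParameters(parameters)
        .setEnvironment(new RuntimeEnvironment()
            .setTempLocation("gs://your-gcs-bucket/dataflow/temp")); // placeholder temp path

    LaunchTemplateResponse response = dataflow.projects().templates()
        .launch(project, launch)
        .setGcsPath(templateLocation)
        .execute();

    System.out.println("Launched job: " + response.getJob().getId());
  }
}
```

The same launch request can also be made directly against the REST API; see [Executing templates].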

## Cleanup

To avoid incurring charges to your GCP account for the resources used:

```bash
# Delete only the files created by this sample.
gsutil -m rm -rf \
  "gs://$BUCKET/dataflow/templates/WordCount*" \
  "gs://$BUCKET/dataflow/wordcount/"

# [optional] Remove the entire dataflow Cloud Storage directory.
gsutil -m rm -rf gs://$BUCKET/dataflow

# [optional] Remove the Cloud Storage bucket.
gsutil rb gs://$BUCKET
```

[Apache Beam]: https://beam.apache.org/
[Google Cloud Dataflow]: https://cloud.google.com/dataflow/docs/

[Cloud SDK]: https://cloud.google.com/sdk/docs/
[Create a new project]: https://console.cloud.google.com/projectcreate
[Enable billing]: https://cloud.google.com/billing/docs/how-to/modify-project
[Create a service account key]: https://console.cloud.google.com/apis/credentials/serviceaccountkey
[Creating and managing service accounts]: https://cloud.google.com/iam/docs/creating-managing-service-accounts
[GCP Console IAM page]: https://console.cloud.google.com/iam-admin/iam
[Granting roles to service accounts]: https://cloud.google.com/iam/docs/granting-roles-to-service-accounts

[Java Development Kit (JDK)]: https://www.oracle.com/technetwork/java/javase/downloads/index.html
[JAVA_HOME]: https://docs.oracle.com/javase/8/docs/technotes/guides/troubleshoot/envvars001.html
[Apache Maven]: http://maven.apache.org/download.cgi
[Maven installation guide]: http://maven.apache.org/install.html

[Creating templates]: https://cloud.google.com/dataflow/docs/guides/templates/creating-templates
[GCP Console create Dataflow job page]: https://console.cloud.google.com/dataflow/createjob
[Executing templates]: https://cloud.google.com/dataflow/docs/guides/templates/executing-templates
[GCP Console Dataflow page]: https://console.cloud.google.com/dataflow
38 changes: 38 additions & 0 deletions dataflow/templates/WordCount_metadata
@@ -0,0 +1,38 @@
// Copyright 2019 Google Inc.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
{
  "name": "WordCount",
  "description": "An example pipeline that counts words in the input file.",
  "parameters": [
    {
      "name": "inputFile",
      "label": "Input GCS File Pattern",
      "help_text": "Google Cloud Storage file pattern glob of the file(s) to read from.",
      "regexes": ["^gs:\/\/[^\n\r]+$"],
      "is_optional": true
    },
    {
      "name": "outputBucket",
      "label": "Output GCS Bucket",
      "help_text": "Google Cloud Storage bucket to store the outputs.",
      "regexes": ["^[a-z0-9][-_.a-z0-9]+[a-z0-9]$"]
    },
    {
      "name": "withSubstring",
      "label": "With Substring",
      "help_text": "Filter only words containing the specified substring.",
      "is_optional": true
    }
  ]
}
167 changes: 167 additions & 0 deletions dataflow/templates/pom.xml
@@ -0,0 +1,167 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright 2018 Google LLC

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.example</groupId>
<artifactId>dataflow-templates</artifactId>
<version>1.0</version>

<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>

<beam.version>2.13.0</beam.version>

<maven-compiler-plugin.version>3.8.1</maven-compiler-plugin.version>
<maven-exec-plugin.version>1.6.0</maven-exec-plugin.version>
<maven-jar-plugin.version>3.1.2</maven-jar-plugin.version>
<maven-shade-plugin.version>3.2.1</maven-shade-plugin.version>
<slf4j.version>1.7.26</slf4j.version>
</properties>

<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>

<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
</plugin>

<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>${maven-jar-plugin.version}</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<classpathPrefix>lib/</classpathPrefix>
<mainClass>com.example.dataflow.templates.WordCount</mainClass>
</manifest>
</archive>
</configuration>
</plugin>

<!--
Configures `mvn package` to produce a bundled jar ("fat jar") for runners
that require this for job submission to a cluster.
-->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>${maven-shade-plugin.version}</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<finalName>${project.artifactId}-bundled-${project.version}</finalName>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/LICENSE</exclude>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>

<pluginManagement>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>${maven-exec-plugin.version}</version>
<configuration>
<cleanupDaemonThreads>false</cleanupDaemonThreads>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>

<dependencies>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-core</artifactId>
<version>${beam.version}</version>
</dependency>

<!--
By default, the starter project has a dependency on the Beam DirectRunner
to enable development and testing of pipelines. To run on another of the
Beam runners, add its module to this pom.xml according to the
runner-specific setup instructions on the Beam website:
http://beam.apache.org/documentation/#runners
-->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-direct-java</artifactId>
<version>${beam.version}</version>
<scope>runtime</scope>
</dependency>

<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
<version>${beam.version}</version>
<scope>runtime</scope>
</dependency>

<!-- slf4j API frontend binding with JUL backend -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-jdk14</artifactId>
<version>${slf4j.version}</version>
</dependency>
</dependencies>
</project>