Hadoop DSL SparkJob should support namedAppParams construct #111

@convexquad

Description

Currently, you declare the arguments to a Hadoop DSL SparkJob using `appParams`, which simply lists the job arguments directly.

Another way to do this would be to support `namedAppParams`, which takes a list of keys for job properties that you have already set on the job and looks up their corresponding values to use as the job arguments. This could throw an error if you haven't declared a given key on the job.
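
A minimal sketch of how the two constructs might compare, assuming the usual Hadoop DSL `sparkJob` syntax. The job name, application class, jar name, and property keys here are hypothetical, and `namedAppParams` is the proposed construct, not an existing one:

```groovy
// Current approach: appParams lists the job arguments directly.
sparkJob('countByCountry') {
  uses 'com.example.spark.CountByCountry'   // hypothetical Spark app class
  executes 'count-by-country.jar'           // hypothetical application jar
  appParams ['/data/input', '/data/output']
}

// Proposed approach: namedAppParams takes keys of job properties that
// have already been set on the job and resolves their values as the
// job arguments. Referencing a key that has not been declared on the
// job would throw an error.
sparkJob('countByCountry') {
  uses 'com.example.spark.CountByCountry'
  executes 'count-by-country.jar'
  set properties: [
    'input.path'  : '/data/input',
    'output.path' : '/data/output'
  ]
  namedAppParams ['input.path', 'output.path']  // resolves to the property values above
}
```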
