Conversation


@ddelong ddelong commented Feb 9, 2024

Problem

On hosts that do not have libhadoop.so installed but do have SECCOMP enforcement, the execCommand chmod invocation is denied, causing failures.
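The failing path amounts to shelling out to an external chmod when the native library is absent. The helper below is a hypothetical stand-in to illustrate that pattern, not the actual Hadoop code:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ShellChmod {
    // Hypothetical sketch of the fallback taken when libhadoop.so is absent:
    // fork/exec an external chmod binary. A strict SECCOMP profile can deny
    // the fork/exec, which is what produces the failures described above.
    public static void chmodViaShell(Path file, String mode)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder("chmod", mode, file.toString())
                .inheritIO()
                .start();
        int exit = p.waitFor();
        if (exit != 0) {
            throw new IOException("chmod exited with status " + exit);
        }
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("chmod-demo", ".tmp");
        chmodViaShell(tmp, "644");
        System.out.println("exit ok");
        Files.delete(tmp);
    }
}
```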

Solution

Use the java.nio.file APIs introduced in Java 7 that enable direct file permission control. They only function on POSIX filesystems, but that's acceptable here because SECCOMP doesn't exist on Windows.
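A minimal sketch of that approach, assuming the standard `Files.setPosixFilePermissions` call (this illustrates the technique, not the exact patch):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class PosixChmod {
    // Set permissions directly through java.nio.file instead of forking an
    // external chmod, so no syscalls that SECCOMP would deny are issued.
    public static void chmod644(Path file) throws IOException {
        Set<PosixFilePermission> perms =
                PosixFilePermissions.fromString("rw-r--r--");
        Files.setPosixFilePermissions(file, perms);
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("perm-demo", ".tmp");
        chmod644(tmp);
        // Read the permissions back to confirm the change took effect.
        System.out.println(
                PosixFilePermissions.toString(Files.getPosixFilePermissions(tmp)));
        Files.delete(tmp);
    }
}
```

Note this throws `UnsupportedOperationException` on filesystems without POSIX attribute support, which is the Windows caveat mentioned above.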

Testing

Re-ran several Spark jobs that exercise this code path through S3A's local file copying. No SECCOMP violations occurred, and the jobs ran normally.

Alternative Solutions in Progress

If either of these solutions can be realized, this commit can be reverted:

  1. libhadoop.so is installed on relevant hosts. (Infra Support request: https://hubspot.slack.com/archives/C0DGP7SR3/p1707405452982579)
  2. This change is upstreamed.

@charlesconnell

I'd definitely prefer adding libhadoop to our base docker image over this, but if not possible, then this is a fine plan B.

@ddelong
Author

ddelong commented Feb 14, 2024

For clarity, I'm going to proceed with this as our solution for now. I'll give myself a task to get this upstreamed for a future version, but practically speaking, this code change solves the problem in all environments we care about without changing the environments themselves. Using the native library would require building a special-purpose RPM and altering every relevant image (and we could hit this issue again later if we missed an image).

I'll take ownership on this.

@ddelong ddelong merged commit 819b6f6 into hubspot-3.3.6 Feb 14, 2024