Spark plugin does not pass JAVA_OPTS and configurations
Bug #1452127 reported by
Ekasit Kijsipongse
This bug affects 1 person
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Sahara | Confirmed | Medium | Unassigned |
Bug Description
I used the Spark plugin (Juno) to submit a Spark job. I set some JAVA_OPTS, configurations, and arguments, but it seems that only the arguments are passed to spark-submit; the JAVA_OPTS and configurations are not. I wonder whether this is still work in progress or whether it is a bug.
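For context, a plain spark-submit invocation would normally receive such settings through its --driver-java-options and --conf flags. The sketch below only illustrates the kind of command the plugin would need to build; the helper name, jar/class names, and option values are hypothetical, not Sahara code.

    def build_spark_submit(jar, main_class, java_opts, configs, args):
        """Hypothetical helper: render job settings into a spark-submit call."""
        cmd = ["spark-submit", "--class", main_class]
        if java_opts:
            # Forward JAVA_OPTS to the driver JVM.
            cmd += ["--driver-java-options", java_opts]
        for key, value in configs.items():
            # Forward each configuration as a --conf key=value pair.
            cmd += ["--conf", "%s=%s" % (key, value)]
        cmd.append(jar)
        cmd += args
        return cmd

    cmd = build_spark_submit(
        "my-job.jar", "com.example.MyJob",
        java_opts="-Dmy.test.property=hello",
        configs={"spark.executor.memory": "2g",
                 "spark.executor.extraJavaOptions": "-Dmy.test.property=hello"},
        args=["input", "output"])
    # On a real node this list would be handed to spark-submit; here we just show it.
    print(" ".join(cmd))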
Changed in sahara:
importance: Undecided → Medium

Changed in sahara:
milestone: liberty-3 → liberty-rc1

Changed in sahara:
assignee: nobody → Ekasit Kijsipongse (ekasit-kijsipongse)
status: Triaged → In Progress

Changed in sahara:
milestone: liberty-rc1 → mitaka-1

Changed in sahara:
milestone: mitaka-1 → mitaka-2

Changed in sahara:
milestone: mitaka-2 → next

Changed in sahara:
milestone: next → none
assignee: Ekasit Kijsipongse (ekasit-kijsipongse) → nobody
status: In Progress → Confirmed

Changed in sahara:
assignee: nobody → Mikhail (mlelyakin)

Changed in sahara:
assignee: Mikhail (mlelyakin) → nobody
This was never implemented, although I believe there is a TODO comment for it. The Spark EDP engine only passes args at this point.
Can you provide a short example with realistic JAVA_OPTS and config values, and any suggestions on how to verify that they are correctly seen and used by the Spark job?
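One possible way to answer the verification part of that question (a hypothetical example, not attached to this bug): a tiny PySpark job can print the configuration it actually received and a driver-side JVM system property set through the java options (e.g. -Dmy.test.property=hello passed via --driver-java-options).

    from pyspark import SparkConf, SparkContext

    # Minimal verification job: submit it with the JAVA_OPTS and configs under
    # test and compare the output with what was requested in the EDP job.
    conf = SparkConf().setAppName("edp-options-check")
    sc = SparkContext(conf=conf)

    # Configurations passed as --conf key=value should show up here.
    for key, value in sorted(sc.getConf().getAll()):
        print("conf: %s=%s" % (key, value))

    # A driver-side system property set via java options (e.g. -Dmy.test.property=hello);
    # sc._jvm is a py4j handle to the driver JVM.
    print("my.test.property=%s" % sc._jvm.System.getProperty("my.test.property"))

    sc.stop()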
It may be possible for this to land in L3.