This repository was archived by the owner on Nov 23, 2017. It is now read-only.

Conversation

@shivaram
Contributor

Fixes #33

There is one problem that is not addressed by this PR: Spark 2.0 and higher only support Hadoop 2.3.0 or higher, and thus spark-ec2 should also use hadoop_major_version as YARN.

@shivaram
Contributor Author

Added check for hadoop version.

cc @thisisdhaas
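To make the versioning constraint concrete, here is a minimal sketch (not the actual spark-ec2 code; the function name and error message are hypothetical) of what a check like the one described above might look like:

```python
# Hypothetical sketch: Spark 2.0+ only supports Hadoop 2.3.0 or higher,
# so reject any hadoop_major_version other than "yarn" for those releases.
def validate_hadoop_version(spark_version, hadoop_major_version):
    # "2.0.0-preview".split(".")[0] -> "2"
    spark_major = int(spark_version.split(".")[0])
    if spark_major >= 2 and hadoop_major_version != "yarn":
        raise ValueError(
            "Spark 2.0+ only supports Hadoop 2.3.0 or higher; "
            "use hadoop_major_version=yarn")
    return hadoop_major_version
```

For example, `validate_hadoop_version("2.0.0-preview", "yarn")` passes, while passing `"1"` for a 2.x release raises an error.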

spark_ec2.py Outdated

if tachyon_v == "":
print("No valid Tachyon version found; Tachyon won't be set up")
modules = filter(lambda x: x != "tachyon", modules)
Member

More Pythonic is just modules.remove('tachyon').
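A short sketch contrasting the two equivalent approaches discussed above (module names are illustrative):

```python
# Sample module list, as spark-ec2 might build it.
modules = ["spark", "tachyon", "ganglia"]

# Original approach: rebuild the list, keeping everything but "tachyon".
# (list() is needed in Python 3, where filter returns an iterator.)
filtered = list(filter(lambda x: x != "tachyon", modules))

# Reviewer's suggestion: mutate the list in place.
modules.remove("tachyon")

assert filtered == modules == ["spark", "ganglia"]
```

One design difference worth noting: `list.remove` drops only the first occurrence and raises ValueError if the element is absent, whereas `filter` removes all occurrences and is a no-op when none exist. With unique module names, the two behave the same.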

@thisisdhaas
Member

Code LGTM w/ a minor comment; I'm not the right person to validate your Hadoop versioning logic.

@shivaram
Contributor Author

Thanks @thisisdhaas for taking a look. I tested this by launching a 3-node cluster, and 2.0.0-preview came up fine. Merging this into branch-2.0.

@shivaram shivaram merged commit d89a22e into amplab:branch-2.0 Jun 15, 2016