
New Hadoop 2.7 release. Also, more straightforward pkg_pre_download #62

Open: djr7m wants to merge 5 commits into master

Conversation

djr7m (Contributor) commented Jan 2, 2018

@jahhulbert-ccri this is super simple. I was tempted to push straight onto master.

djr7m requested a review from jahhulbert-ccri on January 2, 2018 at 20:41
@@ -38,8 +38,7 @@ function download_packages {
 if [[ ! -z ${pkg_pre_download+x} ]]; then
   # Does that folder actually exist?
   if [[ -d ${pkg_pre_download} ]] ; then
-    test -d ${CLOUD_HOME}/pkg || rmdir ${CLOUD_HOME}/pkg
-    test -h ${CLOUD_HOME}/pkg && rm ${CLOUD_HOME}/pkg
+    rm -f pkg
Contributor

I can agree with the simplification but I don't always run commands from the $CLOUD_HOME directory. Should we make it rm -f ${CLOUD_HOME}/pkg to be sure?

djr7m (Contributor, Author) commented Jan 2, 2018

Good catch, my bad. Will fix that.

@@ -38,8 +38,7 @@ function download_packages {
 if [[ ! -z ${pkg_pre_download+x} ]]; then
   # Does that folder actually exist?
   if [[ -d ${pkg_pre_download} ]] ; then
-    test -d ${CLOUD_HOME}/pkg || rmdir ${CLOUD_HOME}/pkg
-    test -h ${CLOUD_HOME}/pkg && rm ${CLOUD_HOME}/pkg
+    rm -f ${CLOUD_HOME}/pkg
Contributor

You'll likely need -rf or rmdir here? Any reason not to keep the test as is? That way no errors are thrown if the directory does not exist, I think...

Contributor

Also, the previous test -h unlinked and re-linked the pkg_pre_download dir, so that the script still works if it changes?
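For reference, these are the two removed lines under discussion, reconstructed from the diff above with inferred comments (the comments are a reading of the code, not part of the PR):

    # Removed lines from the diff above; comments are inferred, not original.
    # Note: with ||, rmdir runs only when pkg is NOT already a directory,
    # so a leftover pkg directory was never actually cleared here.
    test -d ${CLOUD_HOME}/pkg || rmdir ${CLOUD_HOME}/pkg
    # If pkg is a symlink, unlink it so pkg_pre_download can be re-linked.
    test -h ${CLOUD_HOME}/pkg && rm ${CLOUD_HOME}/pkg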

jahhulbert-ccri (Contributor) commented Jan 2, 2018

So here was the problem I hit... basically, if you have previously tried to download packages (which creates a pkg dir) and then want to enable the shared dir, it fails:

ahulbert@nuc16:/tmp/cloud-local$ bin/cloud-local.sh init
rm: cannot remove '/tmp/cloud-local/pkg': Is a directory
ln: failed to create symbolic link '/tmp/cloud-local/pkg/packages': File exists

I was thinking that block could remove the existing pkg dir if it exists and then link in the shared dir?

djr7m (Contributor, Author)

My feeling is primarily that the user should decide what they want to do with the contents of the existing folder... e.g. copy it onto the Synology? just plain delete it? rename the folder to keep the other stuff?

jahhulbert-ccri (Contributor) commented Jan 2, 2018

Yeah, that sounds like a good idea. Let's not do an rm at all then? We could instead just set a variable to be the pkg dir and pull stuff out of there. That would mean adding something new, like a cl_pkg_dir that becomes ${CLOUD_HOME}/pkg when the pre-download is not set. That way we don't remove anything and just use the new dir as is, with no symlinks and nothing to delete/remove.

So by default, if you had a ${CLOUD_HOME}/pkg dir and then enabled pkg_pre_download, it would just leave the existing pkg dir there, I guess.
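A minimal sketch of that cl_pkg_dir idea, assuming the existing pkg_pre_download and CLOUD_HOME config variables; cl_pkg_dir itself is only the name proposed above, not code in the repo:

    # Sketch only: cl_pkg_dir is the variable proposed in this thread.
    if [[ ! -z ${pkg_pre_download+x} && -d ${pkg_pre_download} ]]; then
      # Shared pre-download dir is configured and exists: use it directly.
      cl_pkg_dir="${pkg_pre_download}"
    else
      # Otherwise fall back to the local pkg dir, creating it if needed.
      cl_pkg_dir="${CLOUD_HOME}/pkg"
      mkdir -p "${cl_pkg_dir}"
    fi
    # Downloads then target ${cl_pkg_dir}; nothing is removed or symlinked,
    # so an existing ${CLOUD_HOME}/pkg is left untouched and the init
    # failure shown in the transcript above cannot occur.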

@@ -98,7 +97,7 @@ function download_packages {
"${mirror}/zookeeper/zookeeper-${pkg_zookeeper_ver}/zookeeper-${pkg_zookeeper_ver}.tar.gz"
"${mirror}/spark/spark-${pkg_spark_ver}/spark-${pkg_spark_ver}-bin-${pkg_spark_hadoop_ver}.tgz")


Contributor
👍

@@ -27,7 +27,7 @@ pkg_src_maven="https://repo1.maven.org/maven2"
 pkg_accumulo_ver="1.8.1"
 pkg_hbase_ver="1.2.6"
 # Note pkg_spark_hadoop_ver below if modifying
-pkg_hadoop_ver="2.7.4"
+pkg_hadoop_ver="2.7.5"
Contributor

👍 Actually, we need to use the "archive" site in general; I'll add a ticket for this. That way this would keep working instead of having to upgrade.
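For context on that ticket: Apache's archive host keeps every past release, so pointing downloads there avoids links breaking when a version rotates off the regular mirrors. A sketch, assuming the existing mirror variable used in the download list above (not the actual change):

    # Sketch: archive.apache.org/dist retains old releases permanently, so
    # pinned versions like hadoop 2.7.4 keep resolving even after mirrors
    # drop them. The path layout matches the ${mirror}/... URLs above.
    mirror="https://archive.apache.org/dist"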

Contributor

#63

djr7m (Contributor, Author)

@ddseapy yeah, here is the convo
