New Hadoop 2.7 release. Also, more straightforward pkg_pre_download #62
base: master
Conversation
bin/cloud-local.sh
Outdated
```diff
@@ -38,8 +38,7 @@ function download_packages {
   if [[ ! -z ${pkg_pre_download+x} ]]; then
     # Does that folder actually exist?
     if [[ -d ${pkg_pre_download} ]] ; then
-      test -d ${CLOUD_HOME}/pkg || rmdir ${CLOUD_HOME}/pkg
-      test -h ${CLOUD_HOME}/pkg && rm ${CLOUD_HOME}/pkg
+      rm -f pkg
```
I can agree with the simplification, but I don't always run commands from the `$CLOUD_HOME` directory. Should we make it `rm -f ${CLOUD_HOME}/pkg` to be sure?

Good catch, my bad. Will fix that.
```diff
@@ -38,8 +38,7 @@ function download_packages {
   if [[ ! -z ${pkg_pre_download+x} ]]; then
     # Does that folder actually exist?
     if [[ -d ${pkg_pre_download} ]] ; then
-      test -d ${CLOUD_HOME}/pkg || rmdir ${CLOUD_HOME}/pkg
-      test -h ${CLOUD_HOME}/pkg && rm ${CLOUD_HOME}/pkg
+      rm -f ${CLOUD_HOME}/pkg
```
You'll likely need `-rf` or `rmdir` here? Any reason not to keep the test as is? That way no errors are thrown if the directory does not exist, I think...
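A standalone repro (not project code, using a throwaway temp dir) of why a plain `rm -f` is not enough when `pkg` is a real directory, and why `rmdir` was there:

```shell
# Repro: rm -f refuses to remove a directory; rmdir handles the empty-dir case.
tmp=$(mktemp -d)
mkdir "${tmp}/pkg"   # simulate a previously created pkg directory

# rm -f only suppresses errors for missing files, not for directories
rm -f "${tmp}/pkg" 2>/dev/null && echo "removed" || echo "rm -f cannot remove a directory"

# rmdir succeeds on an empty directory
rmdir "${tmp}/pkg" && echo "rmdir works on an empty directory"
```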
Also, the previous `test -h` unlinked and relinked the `pkg_pre_download` dir, so that if it changes the script still works?
So here was the problem I hit... basically, if you have previously tried to download packages (which creates a `pkg` dir) and then want to enable the shared dir, it fails:

```
ahulbert@nuc16:/tmp/cloud-local$ bin/cloud-local.sh init
rm: cannot remove '/tmp/cloud-local/pkg': Is a directory
ln: failed to create symbolic link '/tmp/cloud-local/pkg/packages': File exists
```

I was thinking that block could remove the existing `pkg` dir if it exists and then link in the shared dir?
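A minimal sketch of that suggestion (not the actual `cloud-local.sh` change; the temp dirs stand in for the real `CLOUD_HOME` and shared download dir):

```shell
# Hypothetical: replace any existing pkg dir or stale symlink, then link in
# the shared pre-download dir.
CLOUD_HOME=$(mktemp -d)         # stand-in for the real CLOUD_HOME
pkg_pre_download=$(mktemp -d)   # stand-in for the shared package dir

mkdir -p "${CLOUD_HOME}/pkg"    # simulate a previous plain download dir

if [[ -d ${pkg_pre_download} ]]; then
  # rm -rf covers all three cases: plain dir, symlink, or nothing there
  rm -rf "${CLOUD_HOME}/pkg"
  ln -s "${pkg_pre_download}" "${CLOUD_HOME}/pkg"
fi

test -h "${CLOUD_HOME}/pkg" && echo "pkg is now a symlink"
```

The trade-off, as discussed below, is that `rm -rf` silently discards whatever was in the old `pkg` dir.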
My feeling is primarily that the user should decide what they want to do with the contents of the existing folder... e.g. copy it into the Synology? Just plain delete it? Rename the folder to keep the other stuff?
Yeah, that sounds like a good idea. Let's not do an `rm` at all then? We can instead just set a variable to be the pkg dir and pull stuff out of there. That would require making something new, like setting a `cl_pkg_dir` that, if the pre-download is not set, would become `${CLOUD_HOME}/pkg`? That way we don't remove anything and just use the new dir as is, with no symlinks and nothing to delete/remove. So by default, if you had a `${CLOUD_HOME}/pkg` dir and then enable `pkg_pre_download`, it just leaves the existing pkg dir there, I guess.
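A sketch of that `cl_pkg_dir` idea (hypothetical, not code from this PR; the variable name comes from the comment above, and the `mktemp`/`unset` lines just simulate a checkout without the shared dir enabled):

```shell
# Hypothetical: choose the package dir once; never rm or symlink anything.
CLOUD_HOME=$(mktemp -d)   # stand-in for the real CLOUD_HOME
unset pkg_pre_download    # simulate a config without the shared dir enabled

if [[ -n ${pkg_pre_download+x} && -d ${pkg_pre_download:-} ]]; then
  cl_pkg_dir="${pkg_pre_download}"   # use the shared pre-download dir directly
else
  cl_pkg_dir="${CLOUD_HOME}/pkg"     # fall back to the default location
  mkdir -p "${cl_pkg_dir}"
fi

echo "packages live in ${cl_pkg_dir}"
```

The rest of `download_packages` would then read and write `${cl_pkg_dir}` instead of hard-coding `${CLOUD_HOME}/pkg`, so an existing `pkg` dir is simply left alone.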
bin/cloud-local.sh
Outdated
```diff
@@ -98,7 +97,7 @@ function download_packages {
   "${mirror}/zookeeper/zookeeper-${pkg_zookeeper_ver}/zookeeper-${pkg_zookeeper_ver}.tar.gz"
   "${mirror}/spark/spark-${pkg_spark_ver}/spark-${pkg_spark_ver}-bin-${pkg_spark_hadoop_ver}.tgz")
```
👍
conf/cloud-local.conf
Outdated
```diff
@@ -27,7 +27,7 @@ pkg_src_maven="https://repo1.maven.org/maven2"
 pkg_accumulo_ver="1.8.1"
 pkg_hbase_ver="1.2.6"
 # Note pkg_spark_hadoop_ver below if modifying
-pkg_hadoop_ver="2.7.4"
+pkg_hadoop_ver="2.7.5"
```
👍 Actually we need to use the "archive" site in general; I'll add a ticket for this. That way this would keep working instead of having to upgrade.
@ddseapy yeah, here is the convo
@jahhulbert-ccri this is super simple. I was tempted to push straight onto master.