New Hadoop 2.7 release. Also, more straightforward pkg_pre_download #62

Open · wants to merge 5 commits into master (showing changes from 2 commits)
7 changes: 3 additions & 4 deletions bin/cloud-local.sh
@@ -38,8 +38,7 @@ function download_packages {
   if [[ ! -z ${pkg_pre_download+x} ]]; then
     # Does that folder actually exist?
     if [[ -d ${pkg_pre_download} ]] ; then
-      test -d ${CLOUD_HOME}/pkg || rmdir ${CLOUD_HOME}/pkg
-      test -h ${CLOUD_HOME}/pkg && rm ${CLOUD_HOME}/pkg
+      rm -f ${CLOUD_HOME}/pkg
Contributor:

You'll likely need -rf or rmdir here? Any reason not to keep the tests as they are? That way no errors are thrown if the directory does not exist, I think...

Contributor:

Also, the previous test -h unlinked and re-linked the pkg_pre_download dir, so that the script still works if it changes?

jahhulbert-ccri (Contributor), Jan 2, 2018:

So here was the problem I hit: basically, if you have previously tried to download packages (which creates a pkg dir) and then want to enable the shared dir, it fails:

ahulbert@nuc16:/tmp/cloud-local$ bin/cloud-local.sh init
rm: cannot remove '/tmp/cloud-local/pkg': Is a directory
ln: failed to create symbolic link '/tmp/cloud-local/pkg/packages': File exists

I was thinking that block could remove the existing pkg dir if it exists and then link in the shared dir?
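The failure above is easy to reproduce, because a plain rm (even with -f, as in this PR's `rm -f ${CLOUD_HOME}/pkg` line) removes files and symlinks but refuses directories. A minimal sketch; the temp path is an example, not taken from the PR:

```shell
#!/usr/bin/env bash
# Reproduce the "Is a directory" failure described above.
demo=$(mktemp -d)                        # stand-in for CLOUD_HOME
mkdir "${demo}/pkg"                      # an earlier download run left a real directory
rm -f "${demo}/pkg" 2>/dev/null || true  # rm: cannot remove '...': Is a directory
[ -d "${demo}/pkg" ] && echo "pkg dir survived rm -f"
```

The subsequent `ln -s` then finds the directory still in place and creates (or collides with) a link inside it, producing the "File exists" error.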

Contributor (Author):

My feeling is primarily that the user should decide what they want to do with the contents of the existing folder... e.g. copy it into the Synology? just plain delete it? rename the folder to keep the other stuff?

jahhulbert-ccri (Contributor), Jan 2, 2018:

Yeah, that sounds like a good idea. Let's not do an rm at all then? We can instead just set a variable to be the pkg dir and pull stuff out of there. That would require something new, like a cl_pkg_dir that becomes ${CLOUD_HOME}/pkg if the pre-download is not set. That way we don't remove anything and just use the new dir as-is, with no symlinks and nothing to delete/remove.

So by default, if you had a ${CLOUD_HOME}/pkg dir and then enabled pkg_pre_download, it would just leave the existing pkg dir there, I guess.
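The suggestion above might look something like this sketch. Note that cl_pkg_dir is the hypothetical name proposed in the comment, not code from this PR:

```shell
#!/usr/bin/env bash
# Pick the package dir once: use the shared pre-download dir when it is
# set and exists, otherwise fall back to the per-checkout pkg dir.
# No symlinks are created and nothing is removed.
CLOUD_HOME="${CLOUD_HOME:-/tmp/cloud-local}"

if [[ -n "${pkg_pre_download:-}" && -d "${pkg_pre_download}" ]]; then
  # Shared pre-download dir exists: read packages from it directly.
  cl_pkg_dir="${pkg_pre_download}"
else
  # Fall back to the per-checkout pkg dir, creating it if needed.
  cl_pkg_dir="${CLOUD_HOME}/pkg"
  mkdir -p "${cl_pkg_dir}"
fi

echo "packages will be read from: ${cl_pkg_dir}"
```

Later steps would then refer to ${cl_pkg_dir} instead of hard-coding ${CLOUD_HOME}/pkg, so a pre-existing pkg dir is simply left alone.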

ln -s ${pkg_pre_download} ${CLOUD_HOME}/pkg
echo "Skipping downloads... using ${pkg_pre_download}"
return 0
@@ -98,7 +97,7 @@ function download_packages {
"${mirror}/zookeeper/zookeeper-${pkg_zookeeper_ver}/zookeeper-${pkg_zookeeper_ver}.tar.gz"
"${mirror}/spark/spark-${pkg_spark_ver}/spark-${pkg_spark_ver}-bin-${pkg_spark_hadoop_ver}.tgz")


Contributor:

👍

if [[ "$kafka_enabled" -eq 1 ]]; then
urls=("${urls[@]}" "${mirror}/kafka/${pkg_kafka_ver}/kafka_${pkg_kafka_scala_ver}-${pkg_kafka_ver}.tgz")
fi
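As an aside, the conditional append above can also be written with bash's += array operator, which has the same effect. A small self-contained illustration, using example values rather than the script's real mirror URLs:

```shell
#!/usr/bin/env bash
# Conditionally extend an array, as the kafka_enabled block does.
urls=("url-one" "url-two")   # example contents, not the real mirror URLs
kafka_enabled=1

if [[ "$kafka_enabled" -eq 1 ]]; then
  urls+=("kafka-url")        # same effect as urls=("${urls[@]}" "kafka-url")
fi

echo "${#urls[@]} urls"      # 3 urls
```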
@@ -119,7 +118,7 @@ function download_packages {
fname=$(basename "$x");
echo "fetching ${x}";
wget -c -O "${CLOUD_HOME}/pkg/${fname}" "$x" || { rm -f "${CLOUD_HOME}/pkg/${fname}"; echo "Error Downloading: ${fname}"; errorList="${errorList} ${x} ${NL}"; };
done
done

if [[ -n "${errorList}" ]]; then
echo "Failed to download: ${NL} ${errorList}";
2 changes: 1 addition & 1 deletion conf/cloud-local.conf
@@ -27,7 +27,7 @@ pkg_src_maven="https://repo1.maven.org/maven2"
pkg_accumulo_ver="1.8.1"
pkg_hbase_ver="1.2.6"
# Note pkg_spark_hadoop_ver below if modifying
-pkg_hadoop_ver="2.7.4"
+pkg_hadoop_ver="2.7.5"
Contributor:

👍 Actually, we need to use the "archive" site in general; I'll add a ticket for this. That way this would keep working instead of having to upgrade.

Contributor:

#63

Contributor (Author):

@ddseapy yeah, here is the convo

# Note, just the major+minor from Hadoop, not patch level
hadoop_base_ver=${pkg_hadoop_ver:0:3}
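For reference, ${pkg_hadoop_ver:0:3} is bash substring expansion, taking the first three characters of the version string:

```shell
#!/usr/bin/env bash
# ${var:offset:length} is bash substring expansion; with a version like
# 2.7.5, the first three characters are the major+minor release.
pkg_hadoop_ver="2.7.5"
hadoop_base_ver=${pkg_hadoop_ver:0:3}
echo "${hadoop_base_ver}"   # 2.7
```

Note this assumes single-digit major and minor numbers; a version such as 2.10.0 would truncate to "2.1".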
