
💡 useful

A list of useful commands and bash scripts for many popular stacks.

Install Docker / Docker Compose

curl -sSL https://get.docker.com/ | sh
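
Verify the install (hello-world is Docker's standard smoke test; recent installs ship Compose v2 as a plugin):

docker --version
docker compose version
docker run hello-world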

Squid Http Proxy

Install an anonymous proxy server with port 8080 fully open and accessible to everyone:

bash <(wget -O - https://raw.githubusercontent.com/Athlon1600/useful/master/squid/install.sh)
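
To confirm the proxy works (assuming the default port 8080 above; SERVER_IP is a placeholder for your server's address):

curl -x http://SERVER_IP:8080 -I https://example.com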

Generate a custom Squid installation script:


## number of "open files" by process

lsof | awk '{print $1}' | sort | uniq -c | sort -n

youtube-dl/yt-dlp

Download binaries:

curl -L https://yt-dl.org/downloads/latest/youtube-dl -o /usr/local/bin/youtube-dl && chmod a+rx /usr/local/bin/youtube-dl
curl -L https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux -o /usr/local/bin/yt-dlp && chmod a+rx /usr/local/bin/yt-dlp

Check if YouTube is blocked:

curl -I -X GET https://www.youtube.com/watch?v=S0DRch3YLh0

Download all URLs in file:

yt-dlp --batch-file "batch.txt" -o "%(upload_date)s-%(title)s.%(ext)s"

Download entire channel slowly:

youtube-dl -o "%(upload_date)s %(title)s.%(ext)s" -f best -ciw --sleep-interval 30 --max-sleep-interval 90 https://www.youtube.com/@AlfoMedia
yt-dlp -o "%(upload_date)s %(title)s.%(ext)s" -f best -ciw --sleep-interval 30 --max-sleep-interval 90 https://www.youtube.com/@AlfoMedia
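
To make long channel downloads resumable, yt-dlp can log finished videos to an archive file and skip them on re-runs:

yt-dlp --download-archive archive.txt -o "%(upload_date)s %(title)s.%(ext)s" -f best -ciw --sleep-interval 30 --max-sleep-interval 90 https://www.youtube.com/@AlfoMedia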

Apache Bench

Download Windows binaries from https://www.apachelounge.com/download/

# -n = total number of requests
# -c = number of requests to perform at a time (concurrency)
# -k = enable KeepAlive, which is off by default

ab -n 1000 -c 30 -k http://example.com/

Siege

sudo apt-get install siege -y
siege -c 500 -r 5000 https://example.com/
siege -c 600 --time=5M https://example.com/

Also edit:

vim $HOME/.siege/siege.conf 

and raise the limit setting from its default of 255.
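
A one-liner sketch, assuming the stock limit = 255 line is present in siege.conf:

sed -i 's/^limit = 255/limit = 1000/' $HOME/.siege/siege.conf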

NGINX + PHP 7.3

sudo add-apt-repository ppa:ondrej/php
sudo apt-get update

Basic:

sudo apt-get -y install nginx php7.3-fpm

Complete:

sudo apt-get -y install nginx php7.3-fpm php7.3-curl php7.3-mbstring php7.3-zip php7.3-dom php7.3-mysql

Extra:

apt-get -y install php7.3-zip
apt-get -y install build-essential
ln -s /etc/nginx/sites-available/newsite.com /etc/nginx/sites-enabled/
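
Test the configuration and reload after enabling a site:

sudo nginx -t && sudo systemctl reload nginx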

Caddy

Install Caddy

curl -sS https://raw.githubusercontent.com/Athlon1600/chat/master/scripts/caddy.sh | sh

From static binary (Ubuntu)

curl -sL https://github.com/caddyserver/caddy/releases/download/v2.7.6/caddy_2.7.6_linux_amd64.tar.gz | tar -xz && rm -f caddy_2.7.6_linux_amd64.tar.gz LICENSE README.md && chmod +x caddy && mv caddy /usr/local/bin/caddy

From caddyserver.com with rate limiting extension:

curl -o caddy -sL "https://caddyserver.com/api/download?os=linux&arch=amd64&p=github.com/RussellLuo/caddy-ext/ratelimit" && chmod +x caddy && mv caddy /usr/local/bin/caddy

Launch a simple static file server in the current directory (add --domain example.com to enable automatic HTTPS):

pkill caddy
nohup caddy file-server --root ./ --browse --listen :50519 > /dev/null 2>&1 &

Show requests as they happen:

tail -F /var/log/caddy/access.log | jq -r '"\(.request.host)\t\(.request.uri)"'

Top Request URIs:

tail -n 9000 /var/log/caddy/access.log | jq -r '.request.uri' | sort | uniq -c | sort -nr | head -n 30

Top Request URIs in the last hour:

cat /var/log/caddy/access.log | jq -r --argjson time_ago "$(date -d '1 hour ago' +%s)" 'select(.ts > $time_ago) | .request.uri' | sort | uniq -c | sort -nr | head -n 30

Top Request URIs from some IP:

tail -n 9000 /var/log/caddy/access.log | jq 'select(.request.remote_ip == "192.0.2.4")' | jq -r '.request.uri' | sort | uniq -c | sort -nr | head -n 30

Top client IPs (useful during DDOS attack):

tail -n 9000 /var/log/caddy/access.log | jq -r '.request.remote_ip' | sort | uniq -c | sort -nr | head -n 30
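
Top response status codes (Caddy's JSON access log exposes them in the status field):

tail -n 9000 /var/log/caddy/access.log | jq -r '.status' | sort | uniq -c | sort -nr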

Composer

curl -sS https://getcomposer.org/installer | sudo php -- --install-dir=/usr/local/bin --filename=composer

Laravel

sudo chown -R www-data:www-data /var/www/unblockvideos.com/storage
php artisan key:generate
wget -O deploy.sh https://raw.githubusercontent.com/Athlon1600/useful/master/deploy-laravel.sh

Node.js

curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash - && sudo apt-get install -y nodejs
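
Verify:

node -v && npm -v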

Nginx + Express

Configuration that proxies all requests to a Node Express server running on port 3000.

wget -O /etc/nginx/nginx.conf https://raw.githubusercontent.com/Athlon1600/useful/master/nginx/express.conf

Then restart nginx.
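
On systemd-based Ubuntu:

sudo systemctl restart nginx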

Let's Encrypt

sudo snap install --classic certbot

sudo certbot --nginx --agree-tos --register-unsafely-without-email --redirect
sudo certbot --nginx --agree-tos --register-unsafely-without-email --no-redirect
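
Snap installs set up automatic renewal; a dry run verifies it works end to end:

sudo certbot renew --dry-run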

ngxtop

sudo apt-get -y install build-essential python-pip
pip install ngxtop

Usage:

ngxtop -n 30
docker logs --tail 1000 -f nginx | cat | ngxtop -n 30

## Request count by IP:
ngxtop top remote_addr -n 30
docker logs --tail 1000 -f nginx | cat | ngxtop top remote_addr -n 30

Block IP:

iptables -A INPUT -s 195.201.192.154 -j DROP

Add a block_ip IP_ADDRESS command to your shell:

echo 'block_ip() { iptables -A INPUT -s "$1" -j DROP; }' >> ~/.bashrc && source ~/.bashrc
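
Undo a block by deleting the matching rule; iptables -L INPUT -n lists what is currently blocked:

iptables -D INPUT -s 195.201.192.154 -j DROP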

PHP + SSL

Fixes this error:

cURL error 60: SSL certificate problem: unable to get local issuer certificate (see https://curl.haxx.se/libcurl/c/libcurl-errors.html)

wget https://curl.haxx.se/ca/cacert.pem
mv cacert.pem /etc/php/7.3/cacert.pem

echo "curl.cainfo=\"/etc/php/7.3/cacert.pem\"" >> /etc/php/7.3/cli/php.ini

Other useful checks:

php -r "var_dump(ini_get('curl.cainfo'));"

php -r "var_dump(openssl_get_cert_locations());"

Aria2 - download a file in parts, in parallel

sudo apt-get -y install snapd
sudo snap install aria2c

## extra
sudo snap install core
export PATH=$PATH:/snap/bin

aria2c -x 16 -j 16 https://archive.org/download/YouTubeCrawlSurveyDataset2009-2010/ytdata.sql.xz

urls.txt:

https://www.google.com/images/branding/googlelogo/1x/googlelogo_color_272x92dp.png
	out=logo.png

aria2c -i urls.txt -x 16 -j 16

Streamlink

streamlink https://site.com/index.m3u8 best -o best.mp4 --http-header "Referer=https://www.youtube.com"
streamlink https://site.com/index.m3u8 best -o streamlink.mp4 --http-header "Referer=https://www.youtube.com" --retry-streams 5 --retry-max 9999 --stream-segment-attempts 9999 --hls-playlist-reload-attempts 9999 --dash-manifest-reload-attempts 9999

Redis

sudo apt-get -y install redis-server
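
Verify it's running:

redis-cli ping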

Google Big Query

bash <(wget -O - https://raw.githubusercontent.com/Athlon1600/useful/master/google/bigquery.sh)

Java

sudo apt-get -y install default-jre
wget https://raw.githubusercontent.com/Athlon1600/useful/master/java/run.sh

AWS

apt-get install -y unzip curl
curl -L "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && unzip awscliv2.zip && sudo ./aws/install

## authenticate
aws configure set aws_access_key_id ACCESS_KEY_HERE ; aws configure set aws_secret_access_key SECRET_KEY_GOES_HERE

## copy all files from current directory to a directory on s3 bucket
aws s3 cp . s3://bucket/folder --recursive --storage-class ONEZONE_IA
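
## alternative: sync copies only new/changed files on repeat runs
aws s3 sync . s3://bucket/folder --storage-class ONEZONE_IA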

MySQL - Backup/Restore

Install client first:

sudo apt-get -y install mysql-client

Logical Backup:

mysqldump -u root -pPASSWORD --quick --max_allowed_packet=512M --verbose --databases db_name > db_name.sql
mysqldump -u root -pPASSWORD --quick --max_allowed_packet=512M --verbose --all-databases > all_databases.sql

## backup directly to AWS S3
mysqldump -u root -p --single-transaction --databases db1 db2 | gzip -9 | aws s3 cp - s3://bucket_name/$(date +%Y-%m-%d-%s)/dump.sql.gz --storage-class ONEZONE_IA

## backup from REMOTE server
mysqldump --ssl-mode=DISABLED --column-statistics=0 --host mysql.server.com -u root -pPASSWORD database > database.sql
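
Restore a logical backup (recreates the dumped databases):

mysql -u root -pPASSWORD < db_name.sql

## restore a gzipped dump
gunzip < dump.sql.gz | mysql -u root -pPASSWORD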

Physical Backup:

sudo service mysql stop
sudo cp -R /var/lib/mysql/. /root/backups
tar cfvz /root/db-$(date +%F).tar.gz /root/backups/*
sudo service mysql start

Move /var/lib/mysql from one server to another

sudo service mysql stop
rsync --archive --verbose --stats --progress --human-readable /var/lib/mysql/* [email protected]:/var/lib/mysql/
sudo service mysql start

Store Backups to AWS S3:

aws s3 cp db_name.sql s3://bucket_name/backups/$(date +%Y-%m-%d)/db_name.sql --storage-class ONEZONE_IA
aws s3 cp db_name.sql s3://bucket_name/backups/$(date +%Y-%m-%d-%s)/db_name.sql --storage-class ONEZONE_IA


Speedtest by Ookla

One-liner install for Ubuntu:

curl -sL https://install.speedtest.net/app/cli/ookla-speedtest-1.2.0-linux-x86_64.tgz | tar -xz && chmod +x speedtest && mv speedtest /usr/local/bin/speedtest
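
The first run asks you to accept Ookla's license; in scripts this can be done up front:

speedtest --accept-license --accept-gdpr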

Misc

Kernel messages with human-readable timestamps:

dmesg -T

Recently modified log files:

ls -lhrt /var/log

Git

Force-push the current folder to an existing repository.

git init
git add .
git commit -m "First commit"
git remote add origin https://github.com/Athlon1600/proxy-checker.git

git remote -v
git remote show origin

git push -f origin master

ffmpeg

Install ffmpeg (amd64) from a static binary and move it to /usr/local/bin:

curl -L https://johnvansickle.com/ffmpeg/builds/ffmpeg-git-amd64-static.tar.xz | tar -xJf - --strip-components=1 && rm -rf GPLv3.txt manpages model readme.txt && chmod +x ffmpeg ffprobe && mv ffmpeg /usr/local/bin/ffmpeg && mv ffprobe /usr/local/bin/ffprobe

add subtitles to existing video

ffmpeg -i subtitles.srt subtitles.ass
ffmpeg -i video.mkv -vf ass=subtitles.ass output.mp4

extract exactly 1 hour of video starting at 10 seconds (-t takes a duration, not an end time)

ffmpeg -i "input.mp4" -ss 00:00:10.0 -t 01:00:00.0 -c copy output.mp4

re-encode a clip instead of stream-copying (here, 10 seconds starting at 1 second)

ffmpeg -i "input.mp4" -ss 00:00:01.0 -t 00:00:10.0 -c:v libx264 -c:a aac -b:a 128k output.mp4

Combine image and audio into video:

ffmpeg -loop 1 -f image2 -r 2 -i image.jpg -i audio.mp3 -shortest result.mp4

download playlist

ffmpeg -i https://example.com/index.m3u8 -c copy -bsf:a aac_adtstoasc vod.mp4

Links

https://github.com/digitalocean/do_user_scripts