A collection of awesome lists, manuals, blogs, hacks, one-liners and tools for Awesome Ninja Admins.
Who are Ninja Admins?
- a race of pure evil who rule the network through a monarchic feudal system
- they never open the door for strangers (or anyone at all)
- they know very nasty pieces of code, like fork bombs
- they can make sure `dd` is not a destroyer of disks
- they know that `#!/usr/bin/env bash` is superior to `#!/bin/bash`
- they know that `su -` logs in completely as root
- they miss and cry for Slackware on production
- they love the old admin *nix-world
☑️ Todo
- Add useful shell functions
- Add one-liners for collected tools (e.g. CLI Tools)
- Add Ninja Admins T-Shirt stickers
- Generate an Awesome Ninja Admins book (e.g. PDF format)
Ninja Admins Collection
CLI Tools
▪️ Shells
🔸 Oh My ZSH! - the best framework for managing your Zsh configuration.
🔸 bash-it - a community Bash framework.
▪️ Managers
🔸 Midnight Commander - visual file manager, licensed under GNU General Public License.
🔸 screen - full-screen window manager that multiplexes a physical terminal.
🔸 tmux - terminal multiplexer, lets you switch easily between several programs in one terminal.
▪️ Network
🔸 Curl - command line tool and library for transferring data with URLs.
🔸 HTTPie - a user-friendly HTTP client.
🔸 gnutls-cli - client program to set up a TLS connection to some other computer.
🔸 netcat - networking utility which reads and writes data across network connections, using the TCP/IP protocol.
🔸 tcpdump - powerful command-line packet analyzer.
▪️ Databases
🔸 pgcli - postgres CLI with autocompletion and syntax highlighting.
🔸 mycli - terminal client for MySQL with autocompletion and syntax highlighting.
Web Tools
▪️ SSL
🔸 SSL Server Test - free online service performs a deep analysis of the configuration of any SSL web server.
🔸 SSL Server Test (DEV) - free online service performs a deep analysis of the configuration of any SSL web server.
🔸 ImmuniWeb® SSLScan - test SSL/TLS (PCI DSS, HIPAA and NIST).
🔸 Report URI - monitoring security policies like CSP and HPKP.
🔸 CSP Evaluator - allows developers and security experts to check whether a Content Security Policy is a strong mitigation.
🔸 Common CA Database - repository of information about CAs, and their root and intermediate certificates.
🔸 CERTSTREAM - real-time certificate transparency log update stream.
▪️ HTTP Headers
🔸 Security Headers - analyse the HTTP response headers (with rating system to the results).
🔸 Observatory by Mozilla - set of tools to analyze your website.
▪️ DNS
🔸 ViewDNS - one source for free DNS related tools and information.
🔸 DNS Spy - monitor, validate and verify your DNS configurations.
🔸 DNSlytics - online investigation tool.
🔸 MX Toolbox - all of your MX record, DNS, blacklist and SMTP diagnostics in one integrated tool.
▪️ Mass scanners (search engines)
🔸 Censys - platform that helps information security practitioners discover, monitor, and analyze devices.
🔸 Shodan - the world's first search engine for Internet-connected devices.
🔸 GreyNoise - mass scanner (such as Shodan and Censys).
▪️ Net-tools
🔸 Netcraft - detailed reports about sites, helping you make informed choices about their integrity.
🔸 Security Trails - APIs for Security Companies, Researchers and Teams.
🔸 Online Curl - curl test, analyze HTTP Response Headers.
🔸 Ping.eu - online Ping, Traceroute, DNS lookup, WHOIS and others.
🔸 Network-Tools - network tools for webmasters, IT technicians & geeks.
🔸 URL Encode/Decode - tool to either encode or decode a string of text.
🔸 Hardenize - deploy the security standards.
▪️ Performance
🔸 GTmetrix - analyze your site’s speed and make it faster.
🔸 Sucuri loadtimetester - test the performance of any of your sites from across the globe.
▪️ Passwords
🔸 Random.org - generate random passwords.
🔸 Gotcha? - search a list of 1.4 billion leaked accounts circulating around the Internet.
🔸 have i been pwned? - check if you have an account that has been compromised in a data breach.
Manuals/Howtos/Tutorials
▪️ Bash
🔸 pure-bash-bible - a collection of pure bash alternatives to external processes.
🔸 The Bash Hackers Wiki - hold documentation of any kind about GNU Bash.
▪️ Unix tutorials
🔸 nixCraft - linux and unix tutorials for new and seasoned sysadmin.
🔸 TecMint - the ideal Linux blog for Sysadmins & Geeks.
▪️ Hacking
🔸 Hacking Articles - Raj Chandel's Security & Hacking Blog.
Blogs
🔸 Brendan Gregg's Blog - Brendan Gregg is an industry expert in computing performance and cloud computing.
🔸 Gynvael "GynDream" Coldwind - Gynvael is an IT security engineer at Google.
🔸 Michał "lcamtuf" Zalewski - "white hat" hacker, computer security expert.
🔸 Mattias Geniar - developer, Sysadmin, Blogger, Podcaster and Public Speaker.
🔸 Nick Craver - Software Developer and Systems Administrator for Stack Exchange.
🔸 Robert Penz - IT security Expert.
🔸 Scott Helme - Security Researcher, international speaker and founder of securityheaders.com and report-uri.com.
🔸 Kacper Szurek - Detection Engineer at ESET.
🔸 Troy Hunt - Microsoft Regional Director and Microsoft Most Valuable Professional for Developer Security.
🔸 Linux Audit - the Linux security blog about Auditing, Hardening, and Compliance by Michael Boelen.
🔸 The Grymoire - collection of useful incantations for wizards, be you computer wizards, magicians, or whatever.
Systems/Services
▪️ Systems
🔸 Slackware - the most "Unix-like" Linux distribution.
🔸 OpenBSD - multi-platform 4.4BSD-based UNIX-like operating system.
🔸 HardenedBSD - HardenedBSD aims to implement innovative exploit mitigation and security solutions.
▪️ HTTP(s) Services
🔸 Varnish HTTP Cache - HTTP accelerator designed for content-heavy dynamic web sites.
▪️ Security/hardening
🔸 Emerald Onion - Seattle-based encrypted-transit internet service provider.
Lists
🔸 Awesome Sysadmin - amazingly awesome open source sysadmin resources.
🔸 Awesome Shell - awesome command-line frameworks, toolkits, guides and gizmos.
🔸 Awesome-Hacking - awesome lists for hackers, pentesters and security researchers.
Hacking/Penetration testing
▪️ Bounty programs
🔸 Openbugbounty - allows any security researcher to report a vulnerability on any website.
🔸 hackerone - global hacker community to surface the most relevant security issues.
🔸 bugcrowd - crowdsourced cybersecurity for the enterprise.
🔸 Crowdshield - crowdsourced Security & Bug Bounty Management.
▪️ Web Training Apps
🔸 DVWA - PHP/MySQL web application that is damn vulnerable.
🔸 OWASP Mutillidae II - free, open source, deliberately vulnerable web-application.
🔸 OWASP Juice Shop Project - the most bug-free vulnerable application in existence.
🔸 OWASP WebGoat Project - insecure web application maintained by OWASP designed to teach web app security.
🔸 Security Ninjas - open source application security training program.
One-liners
Table of Contents
Tool: terminal
Close shell keeping all subprocesses running
disown -a && exit
Exit without saving shell history
kill -9 $$
Perform a branching conditional
true && { echo success; } || { echo failed; }
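Beware that `A && B || C` is not a true if/else: `C` also runs whenever `B` itself fails, even if the condition `A` was true. A quick demonstration (plain bash, no external assumptions):

```shell
#!/usr/bin/env bash
# `cmd && a || b` runs b whenever the *last* executed command failed,
# so b also fires when the condition was true but a failed.
false && echo "then-branch" || echo "else-branch"   # prints: else-branch
true  && false             || echo "else-branch"    # prints: else-branch (because `false` failed)
```

Use a real `if`/`else` when the "then" command can fail.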
Pipe stdout and stderr to separate commands
some_command > >(/bin/cmd_for_stdout) 2> >(/bin/cmd_for_stderr)
Pipe stdout and stderr to separate commands and log files
(some_command 2>&1 1>&3 | tee errorlog ) 3>&1 1>&2 | tee stdoutlog
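A sketch of the same fd-juggling idiom routing each stream into its own file; the `demo` function and the `mktemp` paths are illustrative, not from the original:

```shell
#!/usr/bin/env bash
# Route stdout and stderr of one command into separate files using the
# 2>&1 1>&3 swap: stderr goes into the inner tee, stdout escapes via fd 3
# into the outer tee.
out_log=$(mktemp) err_log=$(mktemp)
demo() { echo "to-stdout"; echo "to-stderr" >&2; }
( demo 2>&1 1>&3 | tee "$err_log" >/dev/null ) 3>&1 \
  | tee "$out_log" >/dev/null
cat "$out_log"   # to-stdout
cat "$err_log"   # to-stderr
```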
List of commands you use most often
history | awk '{ a[$2]++ } END { for(i in a) { print a[i] " " i } }' | sort -rn | head
Quickly backup a file
cp filename{,.orig}
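The backup works because brace expansion (a bash/zsh feature) rewrites the arguments before `cp` ever runs; `echo` makes that visible:

```shell
# brace expansion happens first, so cp receives two arguments
echo cp filename{,.orig}     # prints: cp filename filename.orig
```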
Delete all files in a folder that don't match a certain file extension
rm !(*.foo|*.bar|*.baz)
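`!(...)` is an extended glob, so it only works after `shopt -s extglob` (often on in interactive shells, but off in scripts). A dry run with `echo` instead of `rm`, in a scratch directory (file names are illustrative):

```shell
#!/usr/bin/env bash
shopt -s extglob                      # enable !(...) patterns
work=$(mktemp -d) && cd "$work"
touch keep.foo keep.bar drop.txt drop.log
echo !(*.foo|*.bar)                   # prints: drop.log drop.txt
```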
Edit a file on a remote host using vim
vim scp://user@host//etc/fstab
Create a directory and change into it at the same time
mkd () { mkdir -p "$1" && cd "$1"; }
Convert uppercase files to lowercase files
rename 'y/A-Z/a-z/' *
Print a row of characters across the terminal
printf "%`tput cols`s" | tr ' ' '#'
Tool: mount
Mount a temporary ram partition
mount -t tmpfs tmpfs /mnt -o size=64M
-t - filesystem type
-o - mount options
Tool: fuser
Kills a process that is locking a file
fuser -k filename
Show which PID is listening on a specific port
fuser -v 53/udp
Tool: ps
Show a 4-way scrollable process tree with full details
ps awwfux | less -S
Processes per user counter
ps hax -o user | sort | uniq -c | sort -r
Tool: find
Find files that have been modified on your system in the past 60 minutes
find / -mmin -60 -type f
Find all files larger than 20M
find / -type f -size +20M
Find duplicate files (based on MD5 hash)
find -type f -exec md5sum '{}' ';' | sort | uniq --all-repeated=separate -w 33
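`-w 33` limits the `uniq` comparison to the 32-character MD5 digest plus its separator, so only the hash column decides what counts as a duplicate. A small demonstration in a scratch directory (GNU `md5sum`/`uniq` assumed; file names are illustrative):

```shell
#!/usr/bin/env bash
# two identical files and one different one; the duplicates come out grouped
work=$(mktemp -d) && cd "$work"
printf 'same' > a; printf 'same' > b; printf 'other' > c
find . -type f -exec md5sum '{}' ';' | sort | uniq --all-repeated=separate -w 33
```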
Tool: top
Use top to monitor only processes matching a specific string
top -p $(pgrep -d , <str>)
<str> - process name containing str (e.g. nginx, worker)
Tool: diff
Compare two directory trees
diff <(cd directory1 && find | sort) <(cd directory2 && find | sort)
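The process substitutions feed two sorted file listings to `diff`, so only structural differences show up. A sketch with two throwaway trees (names are illustrative):

```shell
#!/usr/bin/env bash
# two trees that differ by one file; diff reports only ./extra
work=$(mktemp -d)
mkdir -p "$work/d1" "$work/d2"
touch "$work/d1/common" "$work/d2/common" "$work/d2/extra"
diff <(cd "$work/d1" && find . | sort) <(cd "$work/d2" && find . | sort) || true
```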
Tool: tail
Annotate tail -f with timestamps
tail -f file | while read -r; do echo "$(date +%T.%N) $REPLY"; done
Analyse an Apache access log for the most common IP addresses
tail -10000 access_log | awk '{print $1}' | sort | uniq -c | sort -n | tail
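The same pipeline run against three hand-made log lines (the IPs and requests are made up) shows the count-per-IP output:

```shell
#!/usr/bin/env bash
# a tiny fake access log: 10.0.0.1 appears twice, 10.0.0.2 once
log=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [01/Jul/2018] "GET /  HTTP/1.1" 200' \
  '10.0.0.2 - - [01/Jul/2018] "GET /  HTTP/1.1" 200' \
  '10.0.0.1 - - [01/Jul/2018] "GET /a HTTP/1.1" 404' > "$log"
# count column then IP, least frequent first
tail -10000 "$log" | awk '{print $1}' | sort | uniq -c | sort -n | tail
```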
Tool: cpulimit
Limit the cpu usage of a process
cpulimit -p pid -l 50
Tool: pwdx
Show current working directory of a process
pwdx <pid>
Tool: taskset
Start a command on only one CPU core
taskset -c 0 <command>
Tool: tr
Show directories in the PATH, one per line
tr : '\n' <<<$PATH
Tool: chmod
Remove executable bit from all files in the current directory
chmod -R -x+X *
Tool: who
Find last reboot time
who -b
Tool: screen
Start screen in detached mode
screen -d -m [<command>]
Tool: du
Show 20 biggest directories with 'K M G'
du | sort -r -n | awk '{split("K M G",v); s=1; while($1>1024){$1/=1024; s++} print int($1)" "v[s]"\t"$2}' | head -n 20
Tool: inotifywait
Run a command every time a file in a directory is modified
while true ; do inotifywait -r -e MODIFY dir/ && ls dir/ ; done;
Tool: curl
curl -Iks https://www.google.com
-I - show response headers only
-k - insecure connection when using SSL
-s - silent mode (do not display body)
curl -Iks --location -X GET -A "x-agent" https://www.google.com
--location - follow redirects
-X - set method
-A - set user-agent
curl -Iks --location -X GET -A "x-agent" --proxy http://127.0.0.1:16379 https://www.google.com
--proxy [socks5://|http://] - set proxy server
Tool: httpie
http -p Hh https://www.google.com
-p - print request and response parts:
H - request headers
B - request body
h - response headers
b - response body
http -p Hh --follow --max-redirects 5 --verify no https://www.google.com
-F, --follow - follow redirects
--max-redirects N - maximum for --follow
--verify no - skip SSL verification
http -p Hh --follow --max-redirects 5 --verify no --proxy http:http://127.0.0.1:16379 https://www.google.com
--proxy [http:] - set proxy server
Tool: ssh
Compare a remote file with a local file
ssh user@host cat /path/to/remotefile | diff /path/to/localfile -
SSH connection through host in the middle
ssh -t reachable_host ssh unreachable_host
Run command over ssh on remote host
cat > cmd.txt << __EOF__
cat /etc/hosts
__EOF__
ssh host -l user $(<cmd.txt)
Tool: linux-dev
Testing remote connection to port
timeout 1 bash -c "</dev/<proto>/<host>/<port>" >/dev/null 2>&1 ; echo $?
<proto> - set protocol (tcp/udp)
<host> - set remote host
<port> - set destination port
Read and write to TCP or UDP sockets with common bash tools
exec 5<>/dev/tcp/<host>/<port>; cat <&5 & cat >&5; exec 5>&-
Tool: tcpdump
tcpdump -ne -i eth0 -Q in host 192.168.252.1 and port 443
-n - don't convert addresses
-e - print the link-level headers
-i [iface] - set interface
-Q|-D [in|out|inout] - choose send/receive direction (-D for old tcpdump versions)
host [ip|hostname] - set host, also [host not]
[and|or] - set logic
port [1-65535] - set port number, also [port not]
tcpdump -ne -i eth0 -Q in host 192.168.252.1 and port 443 -c 5 -w tcpdump.pcap
-c [num] - capture only num packets
-w [filename] - write packets to file
-r [filename] - read packets from file
Tool: ngrep
ngrep -d eth0 "www.google.com" port 443
-d [iface|any] - set interface
[domain] - set hostname
port [1-65535] - set port number
ngrep -d eth0 "www.google.com" (host 10.240.20.2) and (port 443)
(host [ip|hostname]) - filter by IP or hostname
(port [1-65535]) - filter by port number
ngrep -d eth0 -qt -O ngrep.pcap "www.google.com" port 443
-q - quiet mode (only payloads)
-t - add timestamps
-O [filename] - save output to file
-I [filename] - read from file
ngrep -d eth0 -qt 'HTTP' 'tcp'
HTTP - show HTTP headers
tcp|udp - set protocol
[src|dst] host [ip|hostname] - set direction for a specific node
Tool: hping3
hping3 -V -p 80 -s 5050 <scan_type> www.google.com
-V|--verbose - verbose mode
-p|--destport - set destination port
-s|--baseport - set source port
<scan_type> - set scan type:
-F|--fin - set FIN flag; port open if no reply
-S|--syn - set SYN flag
-P|--push - set PUSH flag
-A|--ack - set ACK flag (use when ping is blocked; RST response comes back if the port is open)
-U|--urg - set URG flag
-Y|--ymas - set Y unused flag (0x80 - nullscan); port open if no reply
-M 0 -UPF - set TCP sequence number and scan type (URG+PUSH+FIN); port open if no reply
hping3 -V -c 1 -1 -C 8 www.google.com
-c [num] - packet count
-1 - set ICMP mode
-C|--icmptype [icmp-num] - set ICMP type (default icmp-echo = 8)
hping3 -V -c 1000000 -d 120 -S -w 64 -p 80 --flood --rand-source <remote_host>
--flood - send packets as fast as possible (don't show replies)
--rand-source - random source address mode
-d|--data - data size
-w|--win - winsize (default 64)
Tool: netcat
nc -kl 5000
-l - listen for an incoming connection
-k - keep listening after the client has disconnected
>filename.out - save received data to file (optional)
nc 192.168.0.1 5051 < filename.in
< filename.in - send data to remote host
nc -vz 10.240.30.3 5000
-v - verbose output
-z - scan for listening daemons
nc -vzu 10.240.30.3 1-65535
-u - scan only UDP ports
Transfer data file (archive)
server> nc -l 5000 | tar xzvfp -
client> tar czvfp - /path/to/dir | nc 10.240.30.3 5000
Launch remote shell
server> nc -l 5000 -e /bin/bash
client> nc 10.240.30.3 5000
Simple file server
while true ; do nc -l 5000 | tar -xvf - ; done
Simple HTTP Server
The web server restarts after each request; remove the while loop to handle only a single connection.
cat > index.html << __EOF__
<!doctype html>
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title></title>
<meta name="description" content="">
<meta name="viewport" content="width=device-width, initial-scale=1">
</head>
<body>
<p>
Hello! It's a site.
</p>
</body>
</html>
__EOF__
server> while : ; do \
(echo -ne "HTTP/1.1 200 OK\r\nContent-Length: $(wc -c <index.html)\r\n\r\n" ; cat index.html;) \
| nc -l -p 5000 \
; done
-p - port number
Simple HTTP Proxy (single connection)
#!/usr/bin/env bash
if [[ $# != 2 ]] ; then
  printf "%s\\n" \
    "usage: ./nc-proxy listen-port bk_host:bk_port"
  exit 1
fi
_listen_port="$1"
_bk_host=$(echo "$2" | cut -d ":" -f1)
_bk_port=$(echo "$2" | cut -d ":" -f2)
printf " lport: %s\\nbk_host: %s\\nbk_port: %s\\n\\n" \
"$_listen_port" "$_bk_host" "$_bk_port"
_tmp=$(mktemp -d)
_back="$_tmp/pipe.back"
_sent="$_tmp/pipe.sent"
_recv="$_tmp/pipe.recv"
trap 'rm -rf "$_tmp"' EXIT
mkfifo -m 0600 "$_back" "$_sent" "$_recv"
sed "s/^/=> /" <"$_sent" &
sed "s/^/<= /" <"$_recv" &
nc -l -p "$_listen_port" <"$_back" \
| tee "$_sent" \
| nc "$_bk_host" "$_bk_port" \
| tee "$_recv" >"$_back"
server> chmod +x nc-proxy && ./nc-proxy 8080 192.168.252.10:8000
lport: 8080
bk_host: 192.168.252.10
bk_port: 8000
client> http -p h 10.240.30.3:8080
HTTP/1.1 200 OK
Accept-Ranges: bytes
Cache-Control: max-age=31536000
Content-Length: 2748
Content-Type: text/html; charset=utf-8
Date: Sun, 01 Jul 2018 20:12:08 GMT
Last-Modified: Sun, 01 Apr 2018 21:53:37 GMT
Create a single-use TCP or UDP proxy
### TCP -> TCP
nc -l -p 2000 -c "nc [ip|hostname] 3000"
### TCP -> UDP
nc -l -p 2000 -c "nc -u [ip|hostname] 3000"
### UDP -> UDP
nc -l -u -p 2000 -c "nc -u [ip|hostname] 3000"
### UDP -> TCP
nc -l -u -p 2000 -c "nc [ip|hostname] 3000"
Tool: socat
Testing remote connection to port
socat - TCP4:10.240.30.3:22
- - standard input (STDIO)
TCP4:<params> - set tcp4 connection with specific params:
[hostname|ip] - set hostname/IP
[1-65535] - set port number
Redirecting TCP-traffic to a UNIX domain socket under Linux
socat TCP-LISTEN:1234,bind=127.0.0.1,reuseaddr,fork,su=nobody,range=127.0.0.0/8 UNIX-CLIENT:/tmp/foo
TCP-LISTEN:<params> - set TCP listen with specific params:
[1-65535] - set port number
bind=[hostname|ip] - set bind hostname/IP
reuseaddr - allows other sockets to bind to the address
fork - keeps the parent process attempting to produce more connections
su=nobody - set user
range=[ip-range] - set allowed IP range
UNIX-CLIENT:<params> - communicates with the specified peer socket:
filename - define socket
Tool: lsof
Show processes that use an internet connection at the moment
lsof -P -i -n
Show processes that use a specific port number
lsof -i tcp:443
Lists all listening ports together with the PID of the associated process
lsof -Pan -i tcp -i udp
List all open ports and their owning executables
lsof -i -P | grep -i "listen"
Show open ports
lsof -Pni4 | grep LISTEN | column -t
List all files opened by a particular command
lsof -c "process"
View user activity per directory
lsof -u username -a +D /etc
Tool: netstat
Graph the number of connections for each host
netstat -an | grep ESTABLISHED | awk '{print $5}' | awk -F: '{print $1}' | grep -v -e '^[[:space:]]*$' | sort | uniq -c | awk '{ printf("%s\t%s\t",$2,$1) ; for (i = 0; i < $1; i++) {printf("*")}; print "" }'
Monitor open connections for a specific port (including LISTEN); count and sort them per IP
watch "netstat -plan | grep :443 | awk {'print \$5'} | cut -d: -f 1 | sort | uniq -c | sort -nk 1"
Tool: rsync
Rsync remote data as root using sudo
rsync --rsync-path 'sudo rsync' username@hostname:/path/to/dir/ /local/
Tool: awk
Remove duplicate entries in a file without sorting
awk '!x[$0]++' filename
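The array `x` counts how often each whole line has been seen; a line prints only when its counter is still zero, so the first occurrence wins and order is preserved. On sample input:

```shell
# later repeats are dropped, original order preserved
printf 'alpha\nbeta\nalpha\ngamma\nbeta\n' | awk '!x[$0]++'
# prints: alpha, beta, gamma (one per line)
```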
Exclude multiple columns using AWK
awk '{$1=$3=""}1' filename
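Note that the fields are blanked, not removed, so their separators remain and the output keeps extra spaces:

```shell
# fields 1 and 3 become empty strings; awk rejoins with single spaces
echo 'one two three four' | awk '{$1=$3=""}1'
# prints: " two  four" (leading space, doubled space where field 3 was)
```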
Tool: sed
To print a specific line from a file
sed -n 10p /path/to/file
Remove a specific line from a file
sed -i 10d /path/to/file
Remove a range of lines from a file
sed -i <file> -re '<start>,<end>d'
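The three sed one-liners above, run in sequence on a five-line scratch file (GNU sed assumed for in-place `-i` without a suffix):

```shell
#!/usr/bin/env bash
f=$(mktemp)
printf 'line1\nline2\nline3\nline4\nline5\n' > "$f"
sed -n 3p "$f"            # print line 3 only: line3
sed -i 2d "$f"            # delete line 2 in place
sed -i -re '2,3d' "$f"    # delete what are now lines 2-3 (line3, line4)
cat "$f"                  # remaining: line1, line5
```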
Tool: grep
Search for a "pattern" inside all files in the current directory
grep -RnisI "pattern" *
fgrep "pattern" * -R
Remove blank lines from a file and save output to new file
grep . filename > newfilename
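`grep .` matches any line containing at least one character, which is exactly what drops the blank lines:

```shell
# empty lines match nothing, so only non-blank lines survive
printf 'a\n\nb\n\n\nc\n' | grep .
# prints: a, b, c (one per line)
```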
Exclude multiple patterns
grep -vE '(error|critical|warning)' filename