Bash Scripts

I switched over to the bash shell from tcsh in early 2004. bash is such a powerful shell that I wish I had made the switch years ago. Since the switch I have ported a couple of my tcsh scripts to bash and greatly enhanced them, and I've written a few more. Below are brief descriptions of each script. I try to make all my scripts self-documenting; most of them have online help that is accessible with the "-h" option on the command line. I use these scripts every day in my work and they save me a lot of time. All of my command line scripts have a unique debugging facility built in that allows the user to easily choose what debug information to display. The command line option "-hd" outputs a help page describing how it works. All these scripts are copyrighted by John R Larsen but are freely available for personal use. Contact the author about commercial use.
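To give a feel for how a hex debug mask like "-d 0x4040" can work, here is a minimal sketch of the idea. It is an illustration only, not code taken from these scripts; the bit assignments and the dbg helper are invented for the example.

       #!/bin/bash
       # Sketch of a bitmask debug facility (illustration only).
       # Each bit of the mask turns on one category of debug output.
       DEBUG_MASK=0                 # default: no debug output

       DBG_ARGS=0x0040              # made-up bit: argument parsing
       DBG_NET=0x4000               # made-up bit: network activity

       dbg () {                     # dbg <bit> <message...>
           local bit=$1; shift
           (( DEBUG_MASK & bit )) && echo "DEBUG: $*" >&2
       }

       # "-d 0x4040" would enable both categories above
       if [ "$1" = "-d" ]; then
           DEBUG_MASK=$(( $2 ))     # bash arithmetic accepts 0x... hex
           shift 2
       fi

       dbg "$DBG_ARGS" "remaining arguments: $*"
       dbg "$DBG_NET"  "opening connection..."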
backup         dir_array         diskusage         h         pt         scan         ssh_tunnel         .bashrc



  • ssh_tunnel (Right click to download)
    ssh_tunnel is a script that establishes an ssh connection between a client behind a firewall and an ssh server somewhere on the internet. It is run on the client either as a cron job or a daemon. It forwards ports in both directions based on settings in a configuration file. It uses "heartbeats" to detect a broken connection and reestablishes the connection automatically. It sends emails whenever the tunnel goes down and when it is reestablished. It keeps logfiles on both the client and the server, which are automatically rotated weekly. It can periodically update a local or remote webserver with a statistics page using scp. Below is the output of the -h command line option, followed by a simplified sketch of the keep-alive idea.
This script has been tested on the following platforms:
  • Mandrake Linux 8.2, 9.1, and 10.1
  • Red Hat Workstation Enterprise 3
  • Solaris 8
  • Cygwin running on Windows 2000 and Windows XP Home and Professional

       ----- ssh_tunnel ($SCRIPT_VER $SCRIPT_DATE) --------------------------------------------
       usage:
       ssh_tunnel -p {loader | tunnel | pulse} [options]
       
       This program is started by a cron job on an ssh client.  It makes an ssh
       connection with an ssh server and forwards ports as defined in "tunnel.conf".  The
       client and server are defined in required file loader.conf.  All the files are
       located in the same directory as ssh_tunnel, which must be invoked with the full
       path.  Considerable configuration is required.  Use "ssh_tunnel -hs" to read.
    
       -p loader - Runs on ssh client.  Started by a cronjob, /etc/init.d/tunnel,
                   or manually.  It launches "tunnel".  It checks ssh server for 
                   updated tunnel.conf and restarts "tunnel" when new file is found.  
                   It does heartbeat verification on the client side and restarts 
                   "tunnel" upon heartbeat failure or if "tunnel" pid is inactive.
       -p tunnel - Runs on ssh client.  Started by "loader".  Establishes the ssh 
                   tunnel with the ssh server using port forwardings defined in 
                   tunnel.conf.  It writes and copies client and server heartbeats.  
                   It runs forever until killed by "loader" or kills itself if it 
                   detects incorrect pid.
       -p pulse -  Runs on ssh server.  Started by "tunnel" when it establishes the ssh
                   tunnel.  It does heartbeat verification on the server side.  It
                   runs forever until the tunnel is killed, heartbeat verification
                   fails, or it detects incorrect pid.
    
       Options:
       -c      cronjob processing
       -D n    Daemon mode of operation and daemon sleep time in seconds
       -d n    debug level (ie. -d 0x4040 )
       -dl     delete logs
       -ds     delete statistics
       -e adr  email_address (If "none" then no email is sent)
               (Separate multiple email addresses with commas and no spaces)
       -h      This help screen
       -hd     Help on debugging
       -hs     Help on configuration setup
       -k      kill a running instance of ssh_tunnel
       -md     Make an init.d daemon mode startup file. You MUST modify this for your situation!
       -ml     Make a loader.conf file. You MUST modify this for your situation!
       -mt     Make a tunnel.conf. You MUST modify this for your situation!
       -mw     Make a webpage.copy script. You MUST modify this for your situation!
       -s      show statistics
       -st n   sleep_time in seconds (Default: $SLEEP_TIME)
       -wr n   Web page update Rate in minutes
               Note: This feature enabled if script webpage.copy exists in `pwd`.
               The name of the generated web page (webpage.html) is passed
               in \$1 to the script to allow renaming.  webpage.copy must be stand 
               alone containing all steps needed to transfer the file, ie. scp or cp
                  Defaults: 10 minutes when tunnel is enabled
                            Each cron invocation when the tunnel is disabled
    
       Copyright: 2004 by John R Larsen
       Free for personal use.  Contact the author for commercial use.
       http://larsen-family.us            [email protected]
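    To show the general keep-alive pattern in miniature, here is a sketch that starts ssh with port forwards, watches it, and restarts it (with an email) when it dies. This is not the actual ssh_tunnel code: the host, ports, and addresses are made up, it assumes a working mail command, and it uses a simple process check plus ssh's ServerAlive options instead of the file-based heartbeats the real script exchanges with the server.

       #!/bin/bash
       # Keep-alive sketch (illustration only).  All names below are made up.
       SERVER=user@example.com
       FORWARDS="-L 8080:localhost:80 -R 2222:localhost:22"   # forwards in both directions
       EMAIL=me@example.com
       CHECK=60                       # seconds between liveness checks

       notify () { echo "$1" | mail -s "ssh tunnel: $1" "$EMAIL"; }

       while true; do
           # -N: forward ports only; ServerAlive* makes ssh exit on a dead link
           ssh -N -o ServerAliveInterval=30 -o ServerAliveCountMax=3 \
               $FORWARDS "$SERVER" &
           SSH_PID=$!
           notify "tunnel up (pid $SSH_PID)"

           # crude heartbeat: is the ssh process still running?
           while kill -0 "$SSH_PID" 2>/dev/null; do
               sleep "$CHECK"
           done

           notify "tunnel down - reconnecting"
           sleep 10                   # brief back-off before reconnecting
       done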
    


    Note: I ported ssh_tunnel to perl in 2006. It is much more stable. The perl version is available on SourceForge.


  • pt (Right click to download)
    pt (ping test) is a script that periodically pings an IP address or domain name. The ping rate is user-selectable. The user selects a threshold number of dropped pings at which an email is sent to report that network connectivity to the site is down. Another email is sent when a ping is successfully received, indicating that the site is back up. The script maintains overall statistics, weekly statistics, daily statistics, and optionally hourly statistics. Emails contain this statistical report, along with a log of all emails sent, which makes it easy to see all the up / down activity. The script also performs a traceroute to the site. The script maintains logfiles which are automatically rotated weekly. It can periodically update a local or remote webserver with a statistics page using scp. Click here to see working webpages. pt is run as a cron job. It is automatically restarted if it hangs for some reason. Below is the output of the -h option, followed by a simplified sketch of the ping loop.
    This script has been tested on the following platforms:
  • Mandrake Linux 8.2, 9.1, and 10.1
  • Red Hat Workstation Enterprise 3
  • Solaris 8
  • Cygwin running on Windows 2000 and Windows XP Home and Professional

       ----- ping_test ($SCRIPT_VER $SCRIPT_DATE) --------------------------------------------
       usage:
       pt IP | Domain_name  [ options ]
       
       The IP or domain name is pinged once every ping_rate until the program is
       halted.  Ping results stored in \$IP.logfile.  Email log is \$IP.emails.
       
       Options:
       -c       Cron job processing
       -d n     debug level (ie. -d 0x4040 )
       -ds      Delete Statistics
       -dl      Delete Logs
       -s       Show statistics
       -e adr   Email address (Default: $EMAIL_ADDRESS)
                (Separate multiple email addresses with commas and no spaces)
       -h       Help screen
       -hd      Help Debug
       -hourly  Enable hourly tracking of ping statistics (Default: Disabled)
       -k       Kill a running instance of pt
       -m       Make a \$IP.copy script. You MUST modify this for your situation!
       -mv IP   Rename all the files associated with this test to the IP entered.
       -r n     ping Rate in seconds (Default: $PING_RATE)
       -t n     Threshold number of dropped packets before sending an email (Default: $DROPPED_THRESHOLD)
       -tto n   Traceroute TimeOut value in seconds (Default: 15)
       -wr n    Web page update Rate in minutes (Default: 10)
                (Note: This feature enabled if script \$IP.copy exists in `pwd`.
                 The name of the generated web page (\$IP.html) is passed
                 in \$1 to the script to allow renaming.  \$IP.copy must be stand 
                 alone containing all steps needed to transfer the file, ie. scp or cp)
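    The sketch below shows the core ping loop in miniature: ping once per interval, count consecutive drops, and send "down" and "up" emails as the state changes. It is not the actual pt code; the address and threshold are made up, the ping flags are GNU/Linux style, and it assumes a working mail command.

       #!/bin/bash
       # Ping-test sketch (illustration only).
       HOST=${1:?usage: $0 host}
       PING_RATE=60                  # seconds between pings
       THRESHOLD=3                   # dropped pings before declaring the site down
       EMAIL=me@example.com          # made-up address

       dropped=0
       state=up

       while true; do
           if ping -c 1 -W 5 "$HOST" >/dev/null 2>&1; then       # GNU/Linux ping flags
               if [ "$state" = down ]; then
                   echo "$HOST is reachable again" | mail -s "$HOST up" "$EMAIL"
                   state=up
               fi
               dropped=0
           else
               dropped=$((dropped + 1))
               if [ "$state" = up ] && [ "$dropped" -ge "$THRESHOLD" ]; then
                   echo "$HOST unreachable after $dropped pings" | mail -s "$HOST down" "$EMAIL"
                   state=down
               fi
           fi
           sleep "$PING_RATE"
       done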
    			


    Note: I ported pt to perl in 2006. It is much more stable. The perl version is available on SourceForge.


  • backup (Right click to download)
    backup is a script that runs daily as a cron job. backup is highly user-configurable. As its name implies, it backs up directories and files using zip. I chose zip instead of tgz because zip is common everywhere. The user defines the starting directories in a config file. backup begins at each starting directory and recurses the directory structure looking for directories that contain a file named "backup.list". When it finds that file it uses its contents to direct the backup operation. The user defines the directory where the backups will be stored. backup has "daily" and "weekly" modes. The daily mode simply freshens an existing archive with any new files. The weekly mode creates a date-stamped archive file. backup's operation is directed by simple control files that the user puts in the directories. The user can specify that a directory shouldn't be entered by putting a "backup.norecurse" file in it. The user can define commands that should be executed upon entry to a directory by putting a "backup.enter_cmds" file in the directory. Likewise it performs commands before exiting if the file "backup.exit_cmds" is found. backup can generate a template "backup.exit_cmds" that performs a remote copy of an archive using scp, which makes it easy to perform secure offsite backups of critical information. backup sends emails if errors are encountered and an email at the end of the backup operation. backup can update a local or remote webserver progressively as a backup proceeds. Click here to view a sample webpage. This makes it convenient to see how far a backup has gone if backups take a long time. backup shows beginning and ending disk usage of all mounted volumes. backup has a test mode that allows the user to verify the configuration of all files: it performs a dry run of the backup and outputs a report indicating what would have been backed up. Below is the output of the -h command line option, followed by a simplified sketch of the directory walk.
    This script has been tested on the following platforms:
  • Mandrake Linux 8.2, 9.1, and 10.1
  • Red Hat Workstation Enterprise 3
  • Solaris 8
  • Cygwin running on Windows 2000 and Windows XP Home and Professional

       ----- backup ($SCRIPT_VER $SCRIPT_DATE) --------------------------------------------
       usage:
       backup {daily | weekly} [ options ]
       
       daily    Perform a daily backup.  This updates the zipfile with any changes.
       weekly   Perform a weekly backup.  This creates time stamped zip files.
       
       Options:
       -d 0xN   Debug flags value.  Use the "-hd" option to see how to use them.
       -e adr   Email address (Default is none)
                (Separate multiple email addresses with commas and no spaces)
       -f file  Name of File containing directories to backup (default: backup.dirlist)
       -fs n    FailSafe recursion level (default: 100)
       -h       Help screen
       -hd      Help Debug. This describes how the builtin debugging works.
       -hs      Help setup. This gives detailed operating and setup instructions.
       -test    Do everything but zip the files.  This shows what would have been done.
       
                You MUST modify these scripts for your situation!
       -md      Make template backup.dirlist script
       -mdf     Make template backup.df script
       -me      Make template backup.exit_cmds script
       -mw      Make template backup.webpage script
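    The sketch below shows the directory walk in miniature: recurse from a starting directory, honor backup.norecurse, run the enter/exit command files, and zip whatever backup.list names. It is not the actual backup code; the archive location and the zip options are assumptions for the example.

       #!/bin/bash
       # Directory-walk sketch (illustration only).
       MODE=${1:?usage: $0 daily|weekly [start_dir]}
       ARCHIVE_DIR=$HOME/backups              # made-up archive location

       walk () {
           local dir=$1 name zip_file sub
           [ -f "$dir/backup.norecurse" ] && return        # user said: stay out

           if [ -f "$dir/backup.list" ]; then
               [ -f "$dir/backup.enter_cmds" ] && ( cd "$dir" && . ./backup.enter_cmds )

               name=$(basename "$dir")
               if [ "$MODE" = weekly ]; then
                   zip_file=$ARCHIVE_DIR/$name-$(date +%Y-%m-%d).zip   # date-stamped archive
               else
                   zip_file=$ARCHIVE_DIR/$name.zip                     # existing archive, updated
               fi
               # -u adds new or changed files matching the patterns in backup.list
               ( cd "$dir" && zip -r -u "$zip_file" . -i@backup.list )

               [ -f "$dir/backup.exit_cmds" ] && ( cd "$dir" && . ./backup.exit_cmds )
           fi

           for sub in "$dir"/*/; do
               [ -d "$sub" ] && walk "${sub%/}"
           done
       }

       mkdir -p "$ARCHIVE_DIR"
       walk "${2:-$PWD}"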
       

    Note: I ported backup to perl in 2006. In 2010 I renamed it to "yabs", an acronym for "Yet Another Backup Script". This version greatly improves on the old bash version. It now supports the "rsync" utility, which is much more powerful. The perl version is available here.




  • scan (Right click to download)
    scan is a script that uses ping to walk through a user-entered range of IP addresses to find active IP addresses. The timeout period is user-selectable but defaults to 2 seconds. scan outputs to the terminal and optionally to a file. This is a useful script for finding active IPs on a LAN or on the internet. Below is the output of the -h command line option, followed by a simplified sketch of the scan loop.

       ----- scan ($SCRIPT_VER  -  $SCRIPT_DATE) --------------------------------------------
       usage:
       scan start_IP end_IP [ options ]
       
       Each IP address is tested between start_IP and end_IP.  Live IP addresses are reported.
       
       Options:
       -f name  Filename for output (Default is standard out)
       -h       This help screen
       -hd      Debug help
       -to n    Ping timeout count in seconds (Default 2)
    
    
       Note: 19 Apr 2010 - Fix made by Jim Guion ([email protected]) at line 486. Jim
                           figured out that strings needed to be converted to numbers for
                           proper comparison.  Thanks Jim for the updated version of scan.
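    The sketch below shows the idea for a single /24: ping each address once with a short timeout and report the ones that answer. It is not the actual scan code (which accepts arbitrary start and end addresses); the argument handling and the GNU/Linux ping flags are assumptions for the example.

       #!/bin/bash
       # IP range scan sketch (illustration only).
       PREFIX=${1:?usage: $0 prefix first last   (e.g. $0 192.168.1 1 254)}
       FIRST=${2:?}
       LAST=${3:?}
       TIMEOUT=2                     # seconds to wait for each reply

       for (( i = FIRST; i <= LAST; i++ )); do
           ip=$PREFIX.$i
           if ping -c 1 -W "$TIMEOUT" "$ip" >/dev/null 2>&1; then
               echo "$ip is alive"
           fi
       done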
    
             



  • dir_array (Right click to download)
    dir_array is a script that is sourced by a user's .bashrc script. It improves on the pushd and popd bash builtin commands, which are hard to use. dir_array provides a very flexible way to create and manipulate a list of directories. Single-letter commands let you jump quickly between directories. Directory lists can be saved and reloaded across sessions. The list below shows the output when "s" is pressed, followed by a tiny sketch of the underlying idea.

        ------------------------------------------------------( help: s ? )-- $DIR_ARRAY_VERSION ----
        Command summary:
        a - Append a directory to the end of the list (Accepts ".")
        b - Backward one directory in the list with wrap around
        c - Clear the directory list and put \`pwd\` in as entry 1 (See "u" below)
        d - Delete an entry in the list. (Accepts "." for current index position.)
        f - Forward one directory in the list with wrap around
        i - Insert a directory before the current directory (Accepts ".")
        j - Jump to directory in the list ("." is a no-op ) (Calls "a" if it isn't in the list)
        s - Show the directory list
            ?  - Show this command summary
            a  - Append default file "$DEFAULT_FILE" to directory list
            af - Append a file from "$FILE_DIR" to directory list
            df - Delete a file in "$FILE_DIR"
            l  - Load directory list from default file "$DEFAULT_FILE"
            lf - Load directory list from file in "$FILE_DIR"
            s  - Sort the directory list
            w  - Write directory list to default file "$DEFAULT_FILE"
            wf - Write directory list to file in "$FILE_DIR"
        t - Toggle back to previous directory
        u - Undo "c"
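    To show the flavor of the idea, here is a tiny sketch of a directory-list function meant to be sourced from .bashrc. It is far simpler than dir_array (one command with three forms instead of the full command set above), and the "da" name is made up for the example.

       # Directory-list sketch (illustration only).  Source it, then:
       #   da .   appends the current directory to the list
       #   da     shows the list
       #   da N   cd's to entry N
       DIRS=()

       da () {
           case $1 in
               .)  DIRS+=("$PWD") ;;                     # append current directory
               '') local i
                   for i in "${!DIRS[@]}"; do
                       printf '%2d  %s\n' "$((i + 1))" "${DIRS[i]}"
                   done ;;
               *)  cd "${DIRS[$1 - 1]}" || return ;;     # jump to entry N
           esac
       }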
    			



  • h (Right click to download)
    "h" is a function to be included in a user's .bashrc script. It provides a nice interface to the history feature of the bash shell. It displays the command history in reverse order. The number of commands displayed is set by $list_size. After the initial listing a "> " prompt appears. The user has three choices: 1) press dot "." to exit doing nothing more, 2) press "enter" to display the next listing of commands, or 3) enter the number of a command, which is then executed. Since the command is executed from this function it isn't added to the command history. Below is a sample output with $list_size set to 10; "h" was pressed, followed by "enter", followed by a dot "." to exit without executing a command. A simplified sketch of such a function follows the sample output.
       1000     man logrotate
       999      cat logrotate.conf
       998      cd /etc
       997      rm mnt.david-covey.c.Program^Files.Palm.zip
       996      cd /var/www/html/backups
       995      vi VLOG-2004-09-larsen-family.us.log
       994      vi VLOG-2004-10-www.larsen-family.us.log
       993      vi VLOG-2004-10-larsen-family.us.log
       992      cd /var/log/httpd
       >
       991      cd /var/log/http
       990      vi auth.log
       989      vi info
       988      cd /var/log/mail
       987      ln -s /home/jlarsen/bin/backup backup
       986      ln -s /home/jlarsen/bin/scan scan
       985      ln -s /home/jlarsen/bin/dir_array dir_array
       984      ln -s /home/jlarsen/bin/pt pt
       983      cp ../ssh_tunnel/.htaccess .
       982      ln -s /home/jlarsen/bin/ssh_tunnel ssh_tunnel
       > .
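    For comparison, here is a stripped-down sketch of such a history function, meant to live in .bashrc. It shows only one batch of commands rather than paging on "enter", it relies on GNU tac to reverse the listing, and the awk extraction is an assumption; it is not the actual "h" code.

       # History-picker sketch (illustration only).
       h () {
           local list_size=10 reply
           history | tail -n "$list_size" | tac          # newest command first (GNU tac)
           read -p "> " reply
           case $reply in
               .|'') ;;                                  # exit without running anything
               *)    # run the numbered entry; it is not added to the history
                     eval "$(history | awk -v n="$reply" '$1 == n { $1 = ""; print; exit }')" ;;
           esac
       }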
    		

  • .bashrc (Right click to download)
    .bashrc is the script that bash uses to set up a user's environment when starting a shell. When I was first learning bash it would have been helpful to have an example of a working .bashrc, so I'm including mine here. There are quite a few useful functions in it that enhance the shell's normal builtin functions. My .bashrc also shows how to detect different operating systems and set up the environment based on the OS.
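    As a taste of the OS-detection idea, here is a minimal sketch of the kind of case statement a .bashrc can use. The specific aliases and paths are made up for the example, not taken from the actual file.

       # Per-OS setup sketch (illustration only).
       case $(uname -s) in
           Linux)
               alias ls='ls --color=auto'                # GNU ls
               export PATH=$PATH:/usr/local/bin
               ;;
           SunOS)
               export PATH=/usr/xpg4/bin:$PATH           # prefer POSIX tools on Solaris
               ;;
           CYGWIN*)
               export PATH=$PATH:/cygdrive/c/WINDOWS
               ;;
       esac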

  • diskusage (Right click to download)
    diskusage is a script that is run as a cron job. Each time it runs it gathers the disk usage of all mounted drives. It reports usage by sending an email and it can also post the report to a local or remote webserver. Previous reports are kept in the email and webpage up to a user-configurable number; the default is 10. The disks to report on can be configured with a simple file. This script is a very stripped-down version of "backup", leaving just enough to handle email and webpage generation. Click here to view a sample webpage. Below is the output of the -h command line option, followed by a simplified sketch of the report generation.

       ----- diskusage (v1.7 2004-10-30) --------------------------------------------
       usage:
       diskusage [ options ]
       
       Options:
       -d 0xN   Debug flags value.  Use the "-hd" option to see how to use them.
       -e adr   Email address (Default is none)
                (Separate multiple email addresses with commas and no spaces)
       -h       Help screen
       -hd      Help Debug. This describes how the builtin debugging works.
       -hs      Help setup. This gives detailed operating and setup instructions.
       -n       Number of previous reports to include (Default: 10)
       
                You MUST modify these scripts for your situation!
       -mdf     Make template diskusage.df script
       -mw      Make template diskusage.webpage script
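    The sketch below shows the basic report cycle in miniature: capture df output, keep the last N reports in a history file, and mail the result. It is not the actual diskusage code; the address, the history file name, and the GNU df and mail usage are assumptions for the example.

       #!/bin/bash
       # Disk-usage report sketch (illustration only).
       EMAIL=me@example.com                   # made-up address
       KEEP=10                                # number of reports to keep
       HISTORY_FILE=$HOME/diskusage.history   # made-up history file

       report=$(date; df -h)                  # usage of all mounted filesystems (GNU df)

       # Prepend the new report and trim to the last $KEEP blank-line-separated reports.
       { echo "$report"; echo; [ -f "$HISTORY_FILE" ] && cat "$HISTORY_FILE"; } \
           | awk -v keep="$KEEP" 'BEGIN { RS = ""; ORS = "\n\n" } NR <= keep' \
           > "$HISTORY_FILE.new"
       mv "$HISTORY_FILE.new" "$HISTORY_FILE"

       mail -s "disk usage report for $(hostname)" "$EMAIL" < "$HISTORY_FILE"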