C. AP CHECKS
"chk_awp"
The script "chk_awp" (formerly named "checkit") runs for you all the commands required to perform the different types of system administration checks on 20/50 stations: APs, WPs, and AWs.
In our refinery, every 15 days at 7:00 am, all the stations in our systems run this script and report their status. At 7:15, all those reports are captured via FTP (or rmount) by a station connected to the company's LAN. That station consolidates the individual reports into a single report for the plant, which is then sent to my ccMail (or any Internet email) account.
This report makes it very easy to find which station requires your immediate attention. You can also compare consecutive reports to spot important differences in performance, disk usage, etc.
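One simple way (my suggestion, not part of the original setup) to compare two consecutive reports for one station is plain diff. Shown here on two small hypothetical files; on a real station you would diff the previous and current LETTERBUG.chk reports:

```shell
# Hypothetical previous and current reports for station HLAW01:
printf 'processes: 42\nfree blocks: 6100\n' > /tmp/HLAW01.old
printf 'processes: 42\nfree blocks: 1800\n' > /tmp/HLAW01.chk
# Lines marked "<" are from the old report, ">" from the new one:
diff /tmp/HLAW01.old /tmp/HLAW01.chk | more
```

Only the lines that changed between the two runs are printed, so a sudden drop in free blocks or a jump in the process count stands out immediately.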
The script determines the station type and runs only the corresponding commands. If you want less information, simply delete the commands you do not want from the script.
To install it, copy the script to every AP, AW, and WP on your system. Then modify the crontab file on each station, adding a line to run the script every 15 days (or weekly, monthly, etc.).
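The copy step can itself be scripted; a rough sketch, assuming rcp (remote shell) access is configured from one station to the rest, and using our HL letterbugs as placeholder names:

```shell
# Push chk_awp to every station in one loop (hypothetical sketch;
# substitute your own letterbugs, and note this assumes rcp access):
for x in HLAP01 HLAP02 HLAW01 HLAW03 HLWP01 HLWP02 HLWP03 HLWP04 HLWP05 HLWP06
do
    rcp /opt/ac/chk_awp $x:/opt/ac/chk_awp
done
```

If rcp is not enabled between your stations, FTP or rdist would do the job equally well.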
If you want a consolidated report, see the next section.
The parts of the script, with a brief explanation, follow:
If you want the script to be executed automatically, modify the crontab file on each 20/50 station, adding a line like this:
00 7 1,15 * * /opt/ac/chk_awp > /opt/ac/LTBUG.chk 2>&1
In this example, the script is executed twice a month (on days 1 and 15) at 7:00 am. Replace LTBUG with the station's letterbug; if this station's letterbug is HLAW01, for example, the output of the script will be HLAW01.chk.
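A variant I find convenient (my suggestion, not from the original): if each station's hostname is its letterbug, the very same crontab line can be installed unchanged on every station by deriving the output name at run time:

```shell
# Assumes `uname -n` returns the station letterbug (e.g. HLAW01):
00 7 1,15 * * /opt/ac/chk_awp > /opt/ac/`uname -n`.chk 2>&1
```

That way you do not have to edit the crontab entry per station when distributing it.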
You might want to take a look at a sample report for an AW51B.
If you want to learn more about those numbers from vmstat, processes, tables, etc., I recommend the book "System Performance Tuning" ($25) by Mike Loukides (O'Reilly & Associates, Inc.), subtitled "Help for UNIX System Administrators". It is cheaper if bought from www.amazon.com .
BIG FILES ANALYSIS:
The list of big files (/opt/ac/bigs) can be analyzed in several steps:
If you hide all the known files, the really big new files stand out. At first, do it step by step: view each group of files first (grep), and then hide it (grep -v).
All greps in one command:
cat bigs |grep -v sample|grep -v informix|grep -v openwin|grep -v "199[2-6]"|more
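The same filter can also be written as a single egrep with one alternation pattern (assuming egrep is available, as it is on Solaris). Demonstrated below on a small hypothetical sample list; on a real station you would run it against /opt/ac/bigs:

```shell
# Hypothetical sample of the big-files list:
printf '%s\n' \
    '9000000 /usr/sample/demo.dat' \
    '8000000 /opt/informix/logs/log1' \
    '7000000 /usr/openwin/lib/libx.so' \
    '6500000 /opt/old/dump Jan 1994' \
    '7200000 /opt/data/new_big_file' > /tmp/bigs
# One egrep instead of four chained grep -v commands:
egrep -v 'sample|informix|openwin|199[2-6]' /tmp/bigs | more
```

Only the entries matching none of the known patterns survive, so the new big file is the only line printed.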
CONSOLIDATED REPORT:
To get a consolidated plant report on one station, so you do not have to go to each station for the individual reports, create a script similar to "retrievr" on one of your stations, preferably one connected to the company's LAN.
After testing it, modify your crontab file to run the retriever script a few minutes after the time scheduled for "chk_awp":
15 7 1,15 * * /opt/ac/retrievr > /dev/null 2>&1
For example, our HL plant has the following stations:
HLAP01 (AP50), HLAP02 (AP51A), HLAW01 (AW50), HLAW03 (AW51B), HLPW01 (PW),
and six WP50s: HLWP01, HLWP02, HLWP03, HLWP04, HLWP05, HLWP06.
The "retrievr" script is as follows:
#!/bin/sh
#
# retrievr
#
# Script to get the individual reports from the other stations,
# consolidate them into a final plant report (PLANT.rpt),
# and finally send it to my email account.
#
# Author: Angel Corbera, TSID1, Refinery Isla, Curacao, N.A.
#
cd /opt/ac
#
# Retrieve reports via FTP
for x in HLAP01 HLAP02 HLAW01 HLAW03 HLWP01 HLWP02 HLWP03 HLWP04 HLWP05 HLWP06
do
    echo "user root xxxxxx"    > $x.ftp
    echo "ascii"              >> $x.ftp
    echo "get /opt/ac/$x.chk" >> $x.ftp
    echo "bye"                >> $x.ftp
    ftp -n $x < $x.ftp
    sleep 5
done
#
# Retrieve the PW report via rmount
rmount HLPW01 /rem/HLPW01
cp /rem/HLPW01/usr/ac/HLPW01.chk /opt/ac/HLPW01.chk
rumount HLPW01
#
# Consolidate the individual reports into PLANT.rpt
mv PLANT.rpt PLANT.old
echo "AP-AW-WP STATUS REPORT for `date`\n" > PLANT.rpt
for x in HLAP01 HLAP02 HLAW01 HLAW03 HLPW01 HLWP01 HLWP02 HLWP03 HLWP04 HLWP05 HLWP06
do
    echo "\n\n========================= $x =========================\n" >> PLANT.rpt
    cat $x.chk >> PLANT.rpt
done
#
# Send the final report to my email account
mail tsid1@email.isla.pdv.com < /opt/ac/PLANT.rpt
Replace "xxxxxx" with your root password.
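A possible hardening of the FTP step (my suggestion, not part of the original script): instead of writing the root password into every generated $x.ftp command file, keep it once in root's $HOME/.netrc (mode 600) and run ftp without -n, so it auto-logs-in from there:

```shell
# $HOME/.netrc (chmod 600) -- one line per station, hypothetical password:
#   machine HLAP01 login root password xxxxxx
#   machine HLAW01 login root password xxxxxx
#   ...
# The retrieval loop then needs no password in the command files:
for x in HLAP01 HLAW01 HLAW03
do
    echo "ascii"               > $x.ftp
    echo "get /opt/ac/$x.chk" >> $x.ftp
    echo "bye"                >> $x.ftp
    ftp $x < $x.ftp            # no -n: ftp reads the login from .netrc
    sleep 5
done
```

This keeps the password out of the per-station temporary files, though it still sits in clear text in one file on the retrieving station.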