Wednesday, April 3, 2013

Unix commands and tools you just can't live without

http://www.itworld.com/operating-systems/350405/unix-commands-and-tools-you-just-cant-live-without


Are you someone who never met a Unix command you didn't like? OK, maybe not. But are there commands you just can't imagine living without? Let's look at some that have made a big difference on my busiest days, along with those that people I've worked with over the years have named as their true essentials.

On Unix systems, there are commands that do what they need to do and then there are commands that knock your socks off day after day, saving you gobs of time, taking the tedium out of systems administration and giving you the insights that you need to keep your systems humming without making you work too hard. Here are some of the I-can't-live-without commands that my Unix buddies and I find we need every day -- and can't imagine living without -- and some of the ways we use them.

top

There's just no getting along without top. While there are other performance commands that provide a lot more detail on how a system is performing, top provides the most critical information about how your system is working in the most succinct fashion. Fortunately, top is installed by default on a lot of systems. If you don't have it, get it. This command shows what your system's load looks like and highlights the processes that are hogging the bulk of your system's resources. Top also displays memory stats and swapping activity. Top is one of my all-time favorite Unix commands and one I couldn't manage without.

ping

The ping command was one of the first things that a Unix consultant I worked with many years ago taught me and I've used it many thousands of times since. This command can tell you whether other systems are up and even whether your own system is functioning on the network. If I'm sitting in my home office and wondering whether my network connection is up, I'll ping a familiar system long before I'll go look at the state of my network interface, and I'll generally know within seconds whether the connection is working.
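For example, a quick three-packet check against a familiar host tells the story (server1 here is just a stand-in for whatever system you trust to be up; the -c count option works on Linux and BSD):
$ ping -c 3 server1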

tail -f

The tail command is handy for many things, but the -f option is special. The tail -f command allows you to watch as entries are being added to your log files. Much better than just running tail over and over, it shows entries as they're being added. Do something in one window and watch the resultant log entries in another. This helps you tie cause and effect together without having to think too hard about which log entries relate to which activities.
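For example, to watch a web server log as requests come in (the log's location varies from system to system; this path is just an example):
$ tail -f /var/log/httpd/access_log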

grep

I doubt that a day goes by without my "grepping" on something. I may be checking on running processes, pulling lines from a log file or looking through text files to analyze a problem, but grep is always at the tip of my tongue.
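A couple of typical uses from my day (httpd and the "error" pattern here are just examples): the first checks on running processes, the second pulls matching lines from a file, ignoring case.
$ ps -ef | grep httpd
$ grep -i "error" logfile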

tee

I don't use the tee command much at all, but friends of mine swear by it. They say that being able to add command output to log files while examining it saves them tons of time and helps them tremendously. They view the output that tells them what's happening on their systems while creating a record of it at the same time.
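The idea looks like this; the -a option appends to the log file rather than overwriting it (vmstat.log is just an example name):
$ vmstat 5 | tee -a vmstat.log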

find

I still use find quite often to hunt down large files or files with permissions I'd rather not support.

The find command is wonderfully versatile in that you can search for files by so many different criteria -- ownership, size, permissions, type, modification date, inode number, group, whether it's newer than some reference file, the number of links ... and, of course, name! You can even use find to locate files that have no recognized group or owner (i.e., no groups or owners on the system that are associated with the particular GIDs and UIDs). And then you can decide what to do with your finds -- just print the information or take some action such as removing the files or changing their permissions or ownership.
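A couple of sample hunts (the paths are examples, and the +100M size syntax is GNU find; other systems may want 512-byte block counts instead):
$ find /home -size +100M
$ find / -nouser -o -nogroup
The first locates files over 100 MB; the second finds files with no recognized owner or group.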

du -sk [dir]

The du command is, of course, valuable when evaluating disk space. You can use the du -sk * command to see how much space each file and directory in your current file system location is using, or du -sk . to see the space occupied by everything in your current directory. I've become particularly fond of these commands.
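Piping the output through sort makes the biggest space consumers jump right out:
$ du -sk * | sort -rn | head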

df -k .

I may have come to where I am sitting in the file system by some circuitous route, following symbolic links or not. This df command shows me both which file system I'm sitting in and how much space is available in it.
$ df -k .
Filesystem           1K-blocks      Used Available Use% Mounted on
boson:/data          201582336   4991232 186351104   3% /data/boson

lsof

The lsof (list open files) command is a powerful tool for displaying open files. It doesn't matter what kind of files are open or even if they're the kind of thing that most of us don't normally think of as files -- such as pipes, character and block special files, directories and sockets. The lsof command will provide valuable information. Want to see all open files? Just use the lsof command by itself. Want to see what processes are using a specific file? Use the command lsof filename. The lsof -u username command will show you all files currently open by a particular user. Very valuable information indeed!
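In command form, that's (the file name and username are placeholders):
$ lsof
$ lsof /var/log/messages
$ lsof -u username
The first lists all open files, the second shows which processes are using a specific file, and the third shows everything a particular user has open.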

fuser

The fuser command is one that I only learned sometime in the last ten years or so (i.e., recent for me). It is definitely the right tool for the job when you want to know what process is using a particular file or why you can't unmount a file system that the system keeps saying is "busy".
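For example (on Linux, where fuser's -v option adds detail and -m examines a mount point; /data is just a sample path):
$ fuser /var/log/messages
$ fuser -vm /data
The first shows the PIDs of processes using a particular file; the second shows what's keeping a mounted file system busy.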

netstat

I truly appreciate the netstat command, especially netstat -rn which shows you a system's routing table and netstat -a | grep "LISTEN " which shows you listening ports on Linux (netstat -a | grep LISTEN on Solaris).
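Those two favorites look like this:
$ netstat -rn
$ netstat -a | grep "LISTEN "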

awk

Another all-time winner for me is awk. Being able to select a single column from a file or from command output provides a huge number of shortcuts. I often use awk to manipulate huge data files.


If I want to know, for example, all the possible values that the third field in such a file can assume, a command like this works wonders (datafile standing in for the file in question):
awk -F: '{print $3}' datafile | sort | uniq -c
That command will show me each unique value along with a count of how many times each appears in the file. The -F: option tells awk to use the colon character to determine where "fields" start and stop. Commands like this are invaluable for getting quick answers from unwieldy data files.

sed and tr

I use sed as needed. Some of my Unix buddies use it as much as they do awk, but I use awk probably 50 times as often as I use sed. It's still among the basic tools that I need, just not as beloved as awk. I also use tr at least as often as I use sed. Both commands provide a way to modify text between pipes, just differently.
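A quick illustration of the difference between the two, using date output as the text between pipes:
$ date | sed 's/ /_/g'
$ date | tr 'a-z' 'A-Z'
The sed command substitutes based on a pattern (here, replacing spaces with underscores), while tr translates character by character (here, upshifting everything).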

rsync

I've been deeply impressed by rsync ever since I was first introduced to the command. For super efficient synchronization of files and directories between servers, rsync is a godsend. And, yes, I can't imagine working without it.
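A typical run looks something like this (the host name and paths are placeholders); the -a option preserves permissions, ownership and timestamps, and only the differences between the two sides actually travel over the wire:
$ rsync -av /local/dir/ remotehost:/remote/dir/
The trailing slash on the source matters: it copies the contents of the directory rather than the directory itself.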

scp

When I have to copy a file or set of files from one system to another, scp is my friend. I like that I can set it up for password-free operation for those automatic file transfers that I have to do from time to time.
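Password-free operation depends on ssh keys. Once a key pair is set up (ssh-copy-id is available on many systems; otherwise, append the public key to the remote ~/.ssh/authorized_keys by hand), copies just work. The host and paths here are examples:
$ ssh-keygen -t rsa
$ ssh-copy-id user@remotehost
$ scp report.txt user@remotehost:/tmp/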

perl

I'm still not a wizard when it comes to Perl scripting, but I'm good enough to do a lot of really cool file reformatting and manipulation. Perl's use of regular expressions gives it a high ranking in my list of vital tools.
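Even without wizardry, Perl one-liners go a long way. This one (the pattern and file name are purely illustrative) edits a file in place, keeping a .bak copy of the original:
$ perl -pi.bak -e 's/oldtext/newtext/g' datafile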

sar

I can't say that I use sar every day, but I definitely benefit from it every day. I get email from sar scripts that send me performance reports on some of my most critical servers. Every day. Long gone are the days when I only looked into system performance data when something was definitely wrong with my servers. These days, I look at performance data every day -- because it comes to me -- and I know what normal performance generally looks like on my systems.
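For an on-demand look rather than a mailed report, sar can sample live; this checks CPU usage five times at two-second intervals:
$ sar -u 2 5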

for loops

Lastly, but not leastly, I depend heavily on for loops. I can't go a day without some form of for SOMETHING in `some command`. For loops save me lots of time every single day. And I can't imagine how I'd get all my work done or stay focused if I didn't have the option of looping through a complete set of values, regardless of their source. Whether systems, files or values of some other kind, for loops take the tedium out of having to check N things or run the same command for some large number of members of a particular data set.
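A typical example, checking which of a list of systems are responding (hostlist here is a hypothetical file of host names, one per line):
$ for host in `cat hostlist`; do ping -c 1 $host > /dev/null && echo "$host is up"; done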
And, of course, I take vi/vim and commands like date for granted, like breathing. And cron for getting work done while I'm asleep.
