Monday, August 11, 2008

The War Against Tedium In The GUI

(Reproduced from The Furtive Penguin)

Extend Your Context Menu With Nautilus-Actions

In a recent post I extolled the virtues of the command line. Today I want to suggest a tip for avoiding GUI tedium. I know that many people argue that the 'average user' is comforted by uniformity in the appearance of the desktop apps that he/she uses. We are all supposed to prefer using ONE app for the same task on a recurring basis. Personally I crave both variety and easy access to it.


Nautilus is a redoubtable file browser and I have nothing but admiration for it. Occasionally, though, I may want to use 'Thunar' or 'Endeavour', or marvel at the concentric ring analysis of my current file usage that 'Filelight' provides. On these occasions it's nice to have simple context menu access to the appropriate app. Thanks to the wonder of 'Nautilus-Actions' this is easy to arrange.


In order to duplicate this setup you will need four packages:- thunar, filelight, endeavour2 and nautilus-actions. If you are using Ubuntu they can all be installed via Synaptic or apt-get.


Once the nautilus-actions package has been installed you will find an entry in System --> Preferences called Nautilus Actions Configuration. Click on it and then click 'Add'. If you want to add the programs listed above to your right-click context menu, simply ensure that the three configuration panels in the 'Add a New Action' dialog look like the screenshots below.
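( As a rough guide, the Thunar entry boils down to values along these lines. The '%d' parameter is assumed here to stand for the currently selected folder; the dialog displays its own parameter legend, so check that before relying on it. )

Label : Open with Thunar
Tooltip : Browse this folder with Thunar
Path : thunar
Parameters : %d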






Repeat three times, once for each application, and if all goes well ( difficult to see why it wouldn't ) your context menu should look something like this:-




Now you can browse folders and subfolders with four different file browsers simultaneously, all from the comfort of your context menu. An exercise in futility? Perhaps, but it all helps in the war against monotony on the desktop. Of course there are many other ways in which the nautilus-actions package can be used to customize the desktop, many of them, no doubt, much more useful than the above.


In conclusion it should be pointed out that filelight isn't really a file browser; it is more of a sophisticated and aesthetically pleasing disk usage analyzer. But since it will let you drill down into directories and open many types of file it almost does the job. Here's a screenshot for anyone who may not be familiar with it:-





Protect Your Files and Folders with Chattr and Lcap

(Reproduced from The Furtive Penguin)

A recurring theme in the endless series of "Is Linux Ready for the Desktop?" articles is the proposition that using the Bash shell is too complex for the average user. The underlying assumption is that the "average user" is only capable of clicking buttons in a GUI and will be confused beyond all hope of recall if he/she has to type a couple of syllables in a terminal. I believe that this is every bit as false as it is insulting. Here's the truth:-

The bash shell is:-

1. Easy

2. Fun

The specific purpose of this article is to introduce the chattr command and the lcap utility. Both of these tools are easy to master and of considerable use to any Linux user who wishes to protect vital files or folders. Let's suppose that your computer and user account are shared. Perhaps you allow the kids to use it from time to time to play godawful online flash-based games. The day will inevitably come when they decide to explore the contents of your home folder, and just as inevitably they will want to experiment with the right-click context menu. How can you prevent an orgy of "accidental" file deletion and protect your vital work or finance-related folders?

Most distros come with chattr installed by default. Lcap will need to be installed independently, though if you use Ubuntu it is available in the repositories. Simply fire up Synaptic and search for "lcap". If you are using another distro, packages are available from the following sources:-

packetstorm

caspian.dotconf.net

Now we will set the immutable bit on the files that we wish to protect. Files or folders with the immutable bit set cannot be moved, deleted, renamed or appended to. They are immutable and consequently safe from the ravages of the juvenile hordes. So, how does it work? Open a terminal. Firstly you will need to su to root on most Linux distros. On Ubuntu, of course, you would use the sudo command and enter your admin password when requested. Here's the full command:-

chattr +i /some/file/or/folder OR ( on Ubuntu )

sudo chattr +i /some/file/or/folder

This command effectively sets the immutable bit on your selected file or folder. If you want to make a folder and all its contents immutable, do this:-

chattr -R +i /some/file/or/folder OR ( on Ubuntu )

sudo chattr -R +i /some/file/or/folder

To remove the immutable bit you simply issue the following command:-

chattr -i /some/file/or/folder OR ( on Ubuntu )

sudo chattr -i /some/file/or/folder

What could be simpler?
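If you want to confirm that the bit has been set, the lsattr command will list a file's attributes; the 'i' flag should appear in the attribute column ( the path is just an example ):-

lsattr /some/file/or/folder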

If, on the other hand, you seek protection from a slightly more sophisticated threat, perhaps from someone with whom you share a computer who also knows your administrative password, you might resort to using lcap. Lcap removes from the superuser the capability to set or unset the immutable bit ( amongst other things ). If you summon lcap with no arguments you will be presented with a list of capabilities; we are primarily interested in CAP_LINUX_IMMUTABLE. To remove root's ability to set or unset this bit, do the following:-

lcap CAP_LINUX_IMMUTABLE

Below are some shots of the terminal before and after issuing this command. You will plainly see that the asterisk after CAP_LINUX_IMMUTABLE is missing from the second shot, denoting that this capability has been successfully removed from the superuser. But don't worry, this is not irreversible! It can only be reversed, however, by rebooting the system.

Before

After


OK, so this is not foolproof, but it does provide a fair degree of protection and should be sufficient to safeguard against any but the most determined and knowledgeable vandals. Anyone seeking further information about chattr or lcap should consult the appropriate man pages or the links provided on this article's linkslist page. Hope someone finds this helpful.



Share And Protect

(Reproduced from The Furtive Penguin)


Access Controls With ACL and Eiciel

Traditionally, Unix-based systems have dealt with the issue of sharing access to files and folders by tweaking group permissions. As the number of new Linux adopters grows in the wake of the Dell and Walmart initiatives, perhaps it is time to publicize a more intuitive ( and GUI based ) option. Linux is designed from the ground up to be a multi-user system, and family machines are likely to have more than one user account, if only to keep the kids from 'accidentally' deleting important files. But what if you want to share certain resources amongst all users on a system? The easiest solution is to create a 'shared' folder in the /home directory and manage access using ACLs.

To do this you need to issue the following command as root:-

mkdir /home/shared

or sudo mkdir /home/shared ( if you are using Ubuntu )

If you are reasonably confident that no adverse security consequences will result you can make this folder world-writable. ( Note that a directory needs the execute bit set before other users can enter it, so the mode to use is 777 rather than 666. ) Thus:-

chmod 777 /home/shared

or sudo chmod 777 /home/shared ( Ubuntu )

If you want some other user ( besides root ) to own this folder, let's say 'userone', you would do the following:-

chown userone /home/shared

or sudo chown userone /home/shared ( Ubuntu )

Now, in order to give you fine-grained control over the contents of this folder and generally make the whole thing work as intended, we need to install two packages and tweak one configuration file. The packages in question are 'acl' and 'eiciel'. On an Ubuntu system these can be installed with the following commands ( or via Synaptic if you prefer to use the GUI ):-

apt-get install acl

apt-get install eiciel

The 'acl' package gives you access to two commands, 'getfacl' and 'setfacl', which allow you to view and set access control lists at the command line. The 'eiciel' package adds a new tab to the 'properties' view in Nautilus which essentially does the same thing in the GUI. See screenshot below:-

Access Control List Tab in the Nautilus Properties Dialog Box

As you can see, this panel allows me, the owner ( userone ), to grant usertwo read, write or execute permissions on a per-file basis. Consequently you can add files to your shared folder with confidence. Each file can carry its own individualized set of permissions, and no one is granted more access than they need or can be trusted with. At the same time everything in the folder can be made readable by all users on the system.
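For anyone who prefers the command line, the same kind of grant can be made with 'setfacl' and inspected with 'getfacl'. A minimal sketch, assuming a file called accounts.ods in the shared folder ( the filename is just an example ):-

setfacl -m u:usertwo:rw /home/shared/accounts.ods

getfacl /home/shared/accounts.ods

The first command gives usertwo read and write access to that one file; the second prints the file's access control list so that you can confirm the change.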

In order to make this work there is one more essential step. You need to edit a system file called /etc/fstab. ( BE CAREFUL! Back it up first in case of disaster. ) You will need to open an editor and insert 'acl' in the appropriate place. See the 'before' and 'after' example below:-

BEFORE

/dev/hda1 /boot ext3 defaults 0 2 #size=100

/dev/hda2 none swap sw 0 0 #size=250

/dev/hda3 / ext3 defaults,errors=remount-ro 0 1 #size=remaining

/dev/fd0 /floppy auto defaults,user,noauto 0 0

/dev/cdrom /cdrom iso9660 defaults,ro,user,noauto 0 0

proc /proc proc defaults 0 0

AFTER

/dev/hda1 /boot ext3 defaults 0 2 #size=100

/dev/hda2 none swap sw 0 0 #size=250

/dev/hda3 / ext3 defaults,errors=remount-ro,acl 0 1 #size=remaining

/dev/fd0 /floppy auto defaults,user,noauto 0 0

/dev/cdrom /cdrom iso9660 defaults,ro,user,noauto 0 0

proc /proc proc defaults 0 0

Insert 'acl' in the options field of the line that refers to the partition on which you want to use access control lists, and reboot. When your machine restarts you will be able to use eiciel in the GUI ( or 'getfacl' and 'setfacl' from the command line ) to set up ACLs.

And that's all there is to it! I hope someone finds this helpful.


"Brevity Is The Soul Of Wit"..... Not According To Google!

(Reproduced from The Furtive Penguin)

An Experiment With 'Code to Text' Ratios


I have only recently begun to initiate myself into the mysteries of Search Engine Optimization. We all know that Inbound Links from highly ranked sites are the main determinants of Page Rank. Keywords continue to play a role with some of the minor search engines. I have been told that all but 'Teoma' and 'AllTheWeb' disregard the keywords metatag now. But what about code to text ratio? This is often overlooked and I wondered if it might help to explain one or two anomalies.

"Code to text ratio' is exactly what you might expect - its a comparison of the quantity of code and text on a given page expressed as a ratio. I am reliably informed that it plays a role in the Google and Yahoo page-ranking algorithms. Of course no one really knows precisely how these algorithms work ( except Google and Yahoo ) and they are constantly changing anyway. But according a higher rank to pages which have a lot of text in them does seem logical. The search engines presumably want to prioritize content-rich pages that are informative or useful to their visitors. A page that has nothing but links on it will have a poor 'code to text ratio' because the code involved will outweigh the text.

Does anyone need to be concerned about this? Well, if you're a blogger... probably not. The code to text ratio of the average blog is fairly high, typically above 30 percent. Furtive Penguin weighs in at about 32.98%. This is a consequence of the fact that much of the code needed to generate a blog does not appear on the index page. If, however, you are serving up static HTML, it is a different matter.

I have a site called 'Americymru' ( pronounced amerikumree ), which is a Welsh American heritage site. It has a Page Rank of 1. Or at least some of its pages do - the 'Index' and the 'News' pages, for instance. On this site there is a calendar called "This Day In Welsh History" which, for obvious reasons, has twelve pages ( here is a sample ). None of them have Page Rank and none of them have any external links that I am aware of. In my opinion these pages offer more of value to the target audience than much of the rest of the site. When I performed a 'code to text ratio' analysis on the 'calendar' pages ( tools for this can be found here and elsewhere on the web ) they scored a miserable average of 4.1%.

So!! I have included a block of text at the bottom of the pages ( some generic and some page specific ) which has boosted the code/text ratio to 15-20%. I am now eagerly awaiting the googlebot's next visit. Will an improved code/text ratio be enough in this case to increase Page Rank and bring the 'calendar' into line with the rest of the site?

An interesting experiment...or at least I think so! But then I am incorrigible.

As a side note, one wonders how all this impacts sites which are constructed entirely out of image files. Unless one does some fancy footwork and includes an overlay with the text in a form which can be read by the bots, such pages will appear to be devoid of content. Doing things this way also addresses the issue of developing content which is compliant with US accessibility standards for the disabled, since the text overlay would, of course, be readable by a text reader.




Sunday, August 10, 2008

D.I.Y. Apps: Part III The PDF Manager

(Reproduced from The Furtive Penguin )

Of the many tools which can be used to create and manipulate PDF files on a Linux system pdftk is probably the most powerful and useful. It can:-


Merge PDF Documents

Split PDF Pages into a New Document

Rotate PDF Pages or Documents

Decrypt Input as Necessary (Password Required)

Encrypt Output as Desired

Fill PDF Forms with FDF Data or XFDF Data and/or Flatten Forms

Apply a Background Watermark or a Foreground Stamp

Report on PDF Metrics such as Metadata, Bookmarks, and Page Labels

Update PDF Metadata

Attach Files to PDF Pages or the PDF Document

Unpack PDF Attachments

Burst a PDF Document into Single Pages

Uncompress and Re-Compress Page Streams

Repair Corrupted PDF (Where Possible)

This is quite an impressive feature set but of course, as supplied, it is a command line tool. A GUI frontend is available for it but you will need to install the Lazarus Pascal compiler before it will run. PDFTK-GUI is available here and the Lazarus compiler, together with instructions for installing it on Ubuntu, can be found here. This is quite an overhead just to run one GUI front-end, so I am offering an alternative script. The script allows you to access much of the functionality of pdftk without mastering the command line syntax. Just for good measure pdftotext is included as well.


In order to use the script you will need to install the following packages:- pdftk and poppler-utils ( or xpdf-utils ). Both packages are available from the Debian/Ubuntu repositories. Once installed, I suggest that you create a PDF directory in your home folder and store both the script and your collection of PDFs there. This script could easily be adapted for use with 'Dialog'. Anyone wishing to do so could usefully consult the previous two articles in this series. There are examples and plenty of code to cannibalize in Parts 1 and 2.
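For anyone curious about what the script is wrapping, these are the sorts of invocations that pdftk and pdftotext expect at the command line ( the filenames are just examples ):-

pdftk chapter1.pdf chapter2.pdf cat output book.pdf ( merge two documents )

pdftk book.pdf cat 1-5 output extract.pdf ( pull pages 1 to 5 into a new document )

pdftk book.pdf burst ( split a document into single pages )

pdftotext book.pdf book.txt ( convert a PDF to plain text )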

The obligatory screenshots are included below:-

The PDFManager Script (Available here)

PDFTK-GUI

Whichever you decide to install...have fun!

D.I.Y Apps Part IV Project Manager 2

( Reproduced from The Furtive Penguin )

Get the Script here.

This is the latest in a series of articles designed to encourage people to make their own apps on Linux. It is hoped that either:-


a) The script will be useful to someone as it stands, or:-

b) The code can be cannibalized and put to better use by anyone wishing to experiment and customize.


The script makes use of a few simple functions followed by a menu which is defined using the case/esac statement. In this sense it is not dissimilar to the last script in this series although, of course, it serves a completely different purpose. In Part II of this series I posted a script designed to serve a similar end, but it was somewhat buggy and offered considerably less functionality. The current revision offers the following options:-

0 Create Project Folder

1 List Folder Contents

2 Open Folder

3 Open Files For Editing (Gedit)

4 Open Files For Editing (OpenOffice)

5 Open Files For Editing (Bluefish)

6 Backup

7 Backup Individual File (You will need to enter the full path for both target and destination)

8 Encrypt Folder

9 Decrypt folder


Most of these are self-explanatory and the overall purpose of the script is fairly clear. It is designed to allow convenient grouping of associated files in 'project' folders. New folders can be created and files can be accessed with a variety of applications depending upon their type and file extension. There is also provision for backup ( both of the entire folder and of individual files ) and encryption. You will need to install 'ccrypt' and 'bluefish' to take advantage of options 5, 8 and 9. The best fun you can have with it, though, is to modify it to suit your own individual requirements. Enjoy!
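For anyone who would rather roll their own than download the script, here is a minimal sketch of the case/esac menu pattern it is built around. ( This is just the skeleton, with placeholder paths and only a couple of options - not the full Project Manager 2. )

#!/bin/bash
# minimal menu skeleton - not the full Project Manager 2 script
while true; do
  echo "0  Create Project Folder"
  echo "1  List Folder Contents"
  echo "q  Quit"
  read -p "Choose an option: " choice
  case "$choice" in
    0) read -p "Project name: " name
       mkdir -p "$HOME/projects/$name" ;;
    1) read -p "Project name: " name
       ls -l "$HOME/projects/$name" ;;
    q) break ;;
    *) echo "Unrecognized option" ;;
  esac
done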

(If you wish to create a launcher for this script simply right-click on the desktop and select 'Create Launcher' from the context menu. Give it a name and supply the path to the script. YOU MUST also check the 'Run In Terminal' box. Click on the 'No Icon' button and you will be presented with a selection of icons. Choose one and click 'OK'. The icon will appear on your desktop. Drag it onto your top Desktop toolbar. Subsequently all you will need to do is click on the icon on your toolbar and a terminal with the 'Project Manager2' menu will appear on your desktop.)

Get the script here.






D.I.Y Apps Part 5 Text Substitution with RPL

( Reproduced from The Furtive Penguin )

Script here

Recursive text substitution in multiple files is not a task that the average end user is called upon to perform very often. But let's suppose that you have a couple of web sites, either with a hosting company or on your own server. Let us suppose further that you want to change the mailto link address on every page on your site. Not a problem if you only have 5 or 6 pages, but what if you have five or six hundred? Clearly, in the absence of an automated text replacement utility, you are going to be spending a lot of quality time with the WYSIWYG editor of your choice.

Of course you could always employ the venerable 'sed' command with 'find' and 'exec', but that has limitations and the syntax is possibly the most bizarre and grotesque construction in the whole of Unix! Here is an example:-

find ./path/to/directory -type f -exec sed -i 's/oldtext/newtext/' {} \;

Enter 'rpl'!! The program was written for Debian as a free replacement for the non-free rpl program by Joe Laffey, which can be found here. Rpl defines its function in the following terms ( from the manual ):-

"Basic usage is to specify two strings and one or more filenames or directories on the command line.The first string is the string to replace,and the second string is the replacement string."

One of the joys of 'rpl' is that it will replace text recursively by simply specifying the -R option. If you are running Ubuntu/Debian 'rpl' is available from the repositories. It is of course a command line tool but the man page is amongst the most intelligible and comprehensible that I have ever read.
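To return to the mailto example above, a typical recursive invocation looks something like this ( the strings and the path are just examples ):-

rpl -R 'mailto:old@example.com' 'mailto:new@example.com' /path/to/website

The -R flag tells rpl to descend into every sub-directory under /path/to/website and perform the substitution in each file it finds.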

In keeping with the spirit of this series of articles I could not resist writing a 'Dialog' front end for the 'rpl' program which allows the user to deploy some of its most useful functionality from the GUI. Here is the help file included with the script:-

OPTION 1. Prints this help file.

OPTION 2. Will replace all instances of a text string with a new string in a given file.

OPTION 3. Will replace all instances of a text string with a new string in all files in a given directory.

OPTION 4. Will replace all instances of a text string with a new string in all files in a given directory and all its sub-directories.

WORKS WITH TEXT AND HTML FILES ONLY! You will need to enter the full path to all files and folders. This front end script should work equally well for single and multiple word substitutions. RPL is a command line program and it is capable of much more than this. In order to acquaint yourself with the full range of its capabilities consult the manual - man rpl. Enjoy!

As you can see, the script allows you to replace text in a single file, in a group of files in a directory, or in an entire directory tree. Having access to a tool like this can save hours of arduous labour with an HTML editor. In order to make this work you will need to install 'dialog' and 'rpl'. They are both in the Debian/Ubuntu repositories. I have tested this fairly extensively and it seems to work OK. If you find otherwise please let me know so that I can fix it. Enjoy!


Script here









Saturday, February 23, 2008

Embedding Documents With Scribd


Here at the 'Jolly Penguin' we love to embed stuff. What blogger doesn't? Easy access to free quality content is a dream come true. In the past the mighty 'Google' has led the way in this department:- YouTube, spreadsheet entries via Forms ( see earlier post ). Today, though, we want to sing the praises of Scribd.

Basically Scribd is a site where you can upload, publish and share your documents. They accept documents in a wide variety of formats (including non-proprietary ones) and offer free accounts with unlimited storage. It is also possible to embed documents in web pages or blog posts, although of course, if you are not the creator of the document you will need to check the licence first.

We have included an example document on this page that we think deserves to be more widely known. It is a collection of 'Unix Administration Horror Stories' compiled by Anatoly N Ivasyuk. The central thesis is that:- "More systems have been wiped out by admins than any hacker could do in a lifetime". Some parts of it make for very chucklesome reading.


Read this doc on Scribd: Unix Administration Horror Stories!!!



Below is another document from Scribd. It consists of the first two installments of an occasional series that we are putting together on Hubpages entitled: "The Linux Command Line For Beginners".

Embedding documents in this way allows you to escape the formatting restrictions of HTML and the many online WYSIWYG editors provided by Blogger, Squidoo, HubPages etc. It means that documents can be prepared for easy online distribution using word processing or desktop publishing software. Scribd is an extremely useful site with great potential.


Read this doc on Scribd: The Linux Command Line For Beginners


Tuesday, February 19, 2008

Cron: The Gentleman's Gentleman

(Reproduced from The Furtive Penguin)


Putting Cron To Use On The Desktop

Cron can be very intimidating for new Linux users. It can only be used from the command line, and for some unfathomable reason most Linux distributions insist on opening users' crontabs with a command line editor... sometimes nano, sometimes vi. Either one requires an hour-long tutorial in order to achieve basic competence. The easiest way to tame it is to open your .bashrc file and insert the following line:-

export EDITOR=gedit

The .bashrc file lives in your home folder and can be viewed by checking "Show Hidden Files" in the 'View' menu in Nautilus. Open it with gedit and insert the above line. This ensures that when you type crontab -e in the terminal to access your crontab file it will always open with gedit. There is no easier way to learn than by example, so without further ado I proudly present my own crontab file for your delight and delectation:-



# m h dom mon dow command

5 12 * * * rm /home/userone/.thumbnails/normal/*

5 12 * * * rm /home/userone/.thumbnails/fail/gnome-thumbnail-factory

5 12 * * * rm /home/userone/.xsession-errors

5 12 * * * rm /home/userone/.recently-used

10 12 * * * du -h /home/userone/dback | tee /home/userone/Desktop/duback

5 22 * * * /home/userone/backup


( see 'Examples' below for an explanation of these commands )


The abbreviations in the first line above translate as follows:-


Minute Hour Day of month Month Day of the week Command

Values for these fields are entered according to the following table:-

1. Minute [0,59]

2. Hour [0,23]

3. Day of the month [1,31]

4. Month of the year [1,12]

5. Day of the week ([0,6] 0=Sunday)
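For example, the first entry in the crontab above breaks down like this:-

5 12 * * * rm /home/userone/.thumbnails/normal/*

Minute 5, hour 12, any day of the month, any month, any day of the week - in other words, the thumbnail cache is emptied at 12:05 pm every day.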

Simple, right? Well, not exactly. Cron entries can take some getting used to and the whole procedure is easily forgotten if you don't use it regularly. Fortunately there are a number of sites which will generate your cron entries for you from the comfort of the web browser of your choice. Take your pick from the following list:-


Clockwatchers

Csgnetwork

Alternatively you might elect to use one of the many GUI front-ends that are available for the cron command :-

Kcron

Gcrontab ( also available in the Ubuntu repositories )

Vcron

The manual page for cron can be found here.

Of course there's much more to say about cron. You might even choose to discard it altogether in favor of fcron, which is an updated and more sophisticated version of the basic program. If you run a server its uses are legion and your crontab file will rapidly become one of your main administrative tools. But mastering the basics for everyday desktop use need not be too much of a chore, and ultimately it will pay dividends in terms of saved time and increased efficiency.

Please visit our suggestions page for some ideas for useful things to do with cron on the desktop.

Friday, February 08, 2008

Rootin' for Root





'Unicize' Your Ubuntu Box Now
Reprinted from Furtivepenguin.net


A recent 'Global Announcement' in the Ubuntu Forums has no doubt given many people cause for concern. The announcement, entitled 'ATTENTION ALL USERS: Malicious Commands', notes that there has been an increase in the number of malicious commands, masquerading as friendly advice to new users, being posted in the forums. There are an estimated 3 to 6 million Ubuntu users in the world today and the number is increasing rapidly due to the distribution's high profile and legendary ease of use. Given this tidal wave of adoption, it is not surprising that some flotsam and jetsam has washed up on the beach. As script-kiddies chortle with schoolboyish glee at their latest 'rm -rf' posting, perhaps it is time to consider increasing security.


I have never been entirely at one with the decision to scrap the old Unix division between 'root' and non-privileged users. Of course Ubuntu doesn't entirely abandon it either, but it does dilute it significantly. Entering the root password before issuing system commands is a far more sobering prospect than simply typing 'sudo' and re-entering your normal password. It conveys a more acute impression of the gravity, and perhaps finality, of what you are about to do. This is my real complaint. The Ubuntu way of doing things trivializes system administration in the interests of usability. I know that it is claimed that there are security advantages to this arrangement, but the fact that the root password is never exposed on the network does not compensate, in my view, for the shortcomings of this approach.

What if you are the 'sysadmin' on a family machine which has multiple users? It is to be hoped that all these bright new shiny Everex 200 machines are going to homes where there is at least one person who is prepared to master the basics of system security. How can you prevent grandma or the kids from being taken in by one of these rogue forum postings? The easiest way in my opinion is to bring your Ubuntu 'box' in line with the vast majority of Linux and Unix distros and issue a root password which will be known only to you ( the 'benevolent dictator' ) and strip all other users of their admin privileges.

In order to do this you simply open a terminal and issue the following command:-

sudo passwd root

You will be asked for your normal login password. Enter it. You will be asked for a new Unix password for 'root'. Enter it and confirm when prompted. Then open users-admin and click on each individual user entry. Go to the 'User privileges' tab and uncheck 'Executing system administration tasks'. This will disable access to the system commands in /sbin for normal users and vastly decrease the scope of other commands to do damage to your system.
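If you prefer to do that second step from the terminal, removing a user from the 'admin' group achieves the same thing on Ubuntu releases of this vintage ( 'admin' being the group that confers sudo rights; 'grandma' is just an example username ):-

sudo deluser grandma admin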


It will perhaps be objected that normal users can be denied access to system commands without resurrecting the old 'root' password arrangement. I agree, but I happen to think that the old system is the best.

If you follow this advice there are two things that you will need to remember:-

1. NEVER lose or forget your root password.

2. You will need to learn the names of all those programs in the 'System' > "Administration" menu because you will need to su to root in a terminal and issue the appropriate command there in order to open them.

Six Daily Checks For Server Health

Six Commands To Run Daily on A Linux Server To Monitor Performance and Security


Having acquainted themselves with Linux on the desktop, many people have opted to run an internet or home intranet server on their distribution of choice. Running a home server on a LAMP stack can seem an intimidating prospect. Installation from disk is easy enough and there are a wide range of distros to choose from, but there is also much to learn before your installation will do your bidding. Supposing that you persevere and succeed in hosting a few internet sites on your server, how do you then monitor its performance and secure it against intruders? It is not my intention to add to the plethora of excellent installation and setup guides which already exist in various places on the web. The purpose of this article is to introduce a number of elementary procedures which, if practiced regularly, should ensure healthy system performance and a reasonable degree of security. ( The prescribed commands are numbered 1 to 6 below. )



Update and Upgrade

You should probably check for updates on a daily basis and, if they are available, upgrade immediately. Platitudes, platitudes! But in all seriousness this is probably the single most important thing to do if you want to remain secure. In the world of open source the 'many eyes' poring over the code ensure that the good guys spot possible exploits first. To benefit from this constant scrutiny you must keep up to date. Of course if you are running an 'Ubuntu' server this is as simple as:-

1. apt-get update

apt-get upgrade

Security and Performance Monitoring

The next step in your daily security routine should be security and performance monitoring. The netstat command will display your incoming and outgoing network connections. If used with the appropriate options it will tell you which services are running and on which ports. Here is my preferred combination:-


2. netstat -pltun

And here is some sample output:-

Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name

tcp 0 0 127.0.0.1:3306 0.0.0.0:* LISTEN 4715/mysqld

tcp6 0 0 :::80 :::* LISTEN 5144/apache2

tcp6 0 0 :::21 :::* LISTEN 4847/sshd

udp 0 0 0.0.0.0:68 0.0.0.0:* 4209/dhclient

As you can see this server is running Apache, SSH and MySQL. If this is what you expected to see then all is well. If, on the other hand, some unidentified service is running on a non-standard port you may have a problem. The functions of the p, l, t, u and n options are explained in the netstat man pages, which can be found here. It may be the case that a different combination of options is better suited to your needs. It is well worth the trouble to acquaint yourself intimately with this powerful and versatile tool. If you have any suspicious processes running on your machine you should investigate them using 'lsof', e.g.:-

lsof -c dhclient

Of course there is nothing suspicious about the dhclient process in this case, but nonetheless 'lsof -c' will provide us with a list of all the open files that the process is using. We are then in a position to investigate further by checking the permissions on individual files etc. There are many options for lsof. It is one of the most critical tools to master on a Unix/Linux system. Ideally you should study the man page, but failing that here are two 'lsof' resources, one short and sweet and another which is much more detailed.



3. cat /var/log/auth.log

This command will present you with a list of all recent login attempts made on your server. This is particularly important if you are running SSH. If you find that continuous login attempts are being made with a variety of usernames then it is likely that you are being targeted by an automated script. Read this article for further details. If this is the case you should adopt one or more of these remedial measures immediately:-

a. If possible, deny remote logins and use SSH on your intranet only. To achieve this you simply need to disable port forwarding on your router.


b. Consider running SSH on a non-standard port. This involves a few changes to the configuration files ( see the sketch after this list ).


c. Abandon password logins and switch to public-key authentication. This will defeat any password-based login attempts, automated or otherwise.


d. Install and configure the excellent DenyHosts script. ( Not necessary if you resort to a. or c. above. )
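For options b and c, the relevant directives live in /etc/ssh/sshd_config. A minimal sketch ( the port number is just an example, and sshd must be restarted before the changes take effect ):-

# /etc/ssh/sshd_config ( excerpt )
Port 2222
PasswordAuthentication no

sudo /etc/init.d/ssh restart

Make sure your keys are in place and tested before disabling password authentication, or you may lock yourself out of a remote machine.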

4. ps; sleep 2; ps

This command will check that your server is not spawning an excessive number of processes. The output should be somewhat similar to this:-


PID TTY TIME CMD

26327 pts/3 00:00:00 bash

26351 pts/3 00:00:00 ps

PID TTY TIME CMD

26327 pts/3 00:00:00 bash

26353 pts/3 00:00:00 ps

Note that the PID of the second 'ps' command (26353) is two numbers higher than the PID of the first command (26351). This is as it should be. If the second PID number is consistently much higher than the first ( assuming that you repeat this test a number of times ) then you have a problem. PID numbers are assigned in sequence so that if the second number is 10, 20 or a 100 times greater than the first it follows that a great many processes are being spawned in a short period of time. If your server is not especially busy then this is problematic. Extensive troubleshooting may be necessary in order to resolve the issue.


For a much more in-depth analysis of server performance consult the man page for the 'vmstat' command. This command should be left to run for an extended period of time, and it is definitely not something that needs to be done on a daily basis. Analysis of the results will reveal much about your server's current performance.
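A typical invocation samples at a fixed interval for a set number of iterations, for example every five seconds, ten times:-

vmstat 5 10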

Check For Rootkits

But what if you have been duped? If a rootkit has been installed on your box then the output from all of the above commands is likely to be bogus. Rootkits install their own versions of the very system binaries which you would use to detect them. These are doctored in order to conceal the nefarious activities of the hacker who installed them. In order to guard against this possibility you should run daily rootkit checks. Chkrootkit, which is available from the Debian/Ubuntu repositories, runs a battery of tests which will detect the presence of known rootkits on your system. Since it relies on a number of system binaries in order to do its job, it is wise to back these up to an independent medium immediately after installing your server. CD is best... I don't think they will all fit on a floppy. You should then run chkrootkit using the '-p' flag to specify the path to your "known-good" binaries. A sample command would be:-

/path/to/chkrootkit -p /mnt/cdrom

The binaries you need to back up are as follows:-

id, cut, ps, find, head, awk, ls, netstat, egrep, uname, sed, strings

Assuming that you have not backed up and are using the installed system binaries just run:-

5. chkrootkit


Another tool which does roughly the same job ( plus a few extras ) is Rootkit Hunter. Rkhunter also has a much more pleasing interface, though it is still command-line based. The project page for Rkhunter can be found here. It's worth running both, because it never hurts to double-check. The command to run is:-

6. rkhunter -c

So...we have updated, monitored logins, connections and processes and checked for rootkits. I am not suggesting that the daily half-dozen listed above will preserve your server from all ills for ever and ever more. It is much more likely however, that you will have a trouble free experience if you stick to the above regimen.

(I know that all this material is available elsewhere on the web. My presumption in presenting it here is partly justified by the fact that I know of few other sites where it is all gathered in one place. Hope this helps!........If anyone wishes to suggest any other security or performance related commands which could usefully be run on a regular basis please post in the comments section below and I will add them to the list.)



Monday, January 07, 2008

My Top 5 Linux Funnies of 2007

OK so some of them are a little older than that but it's still a fine collection:-

Error Messages

Linux and Vista error messages compared and contrasted.

Penguin Attack

Penguin assaults Windows 2000 box with large hammer.


Linux Kernel Swear Count

Vital statistical resource.

Why Linux will not displace Windows

An expert speaks!

Running Windows Viruses With Wine

A golden oldie worth revisiting.