Hello everyone! I have awesome news – I just wrote my first e-book ever. It's called Awk One-Liners Explained and it's based on the Awk One-Liners Explained article series that I wrote here on my blog a few years ago and that has been read over 2,000,000 times.
I went through all the one-liners in the article series, improved the explanations, fixed the mistakes, added an introductory chapter on Awk one-liners, and wrote two new chapters: one on the most commonly used Awk special variables and one on idiomatic Awk.
Table of Contents
The e-book contains exactly 70 well-explained Awk one-liners. It's divided into the following chapters:
- Preface
- 1. Introduction to Awk One-Liners
- 2. Line Spacing
- 3. Numbering and Calculations
- 4. Text Conversion and Substitution
- 5. Selective Printing and Deleting of Certain Lines
- 6. String and Array Creation
- Appendix A: Awk Special Variables
- Appendix B: Idiomatic Awk
- Index
What is Awk?
Awk is this awesome little program that's present on nearly every Unix machine. It's designed to carry out various text processing tasks easily, such as numbering lines, replacing certain words, and deleting or printing certain lines.
Let's take a look at several examples.
Example 1: Print the second column from a file
awk '{ print $2 }'
That's all there is to it. Awk automatically splits each line into columns and puts each column into the variables $1, $2, $3, and so on. This one-liner prints just the 2nd column, which is in variable $2.
You can also specify the symbol or word that you want to split on with the -F command line switch. This switch is explained in more detail in the e-book and in the last example below.
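To give a quick illustration (this example isn't from the book, and users.csv is just a hypothetical comma-separated file), splitting on a comma instead of whitespace looks like this:
awk -F',' '{ print $2 }' users.csv
Here -F',' tells Awk to use the comma as the field separator, so $2 becomes the second comma-separated field rather than the second whitespace-separated word.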
Example 2: Number lines in a file
awk '{ print NR ": " $0 }' file
The whole line itself goes into the variable $0. This one-liner prints the line but prepends the NR special variable and a colon ": " before it. The special variable NR always contains the current line number.
There are many other special variables and they're all explained in the e-book and summarized in the appendix.
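To give a tiny taste of two more of them (this is just a sketch, and file1 and file2 are hypothetical file names), FILENAME holds the name of the current input file and FNR holds the line number within that file:
awk '{ print FILENAME ":" FNR ": " $0 }' file1 file2
Unlike NR, which keeps counting across all the input files, FNR starts over from 1 at the beginning of each file.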
Example 3: Count the number of words in a file
awk '{ total = total + NF } END { print total }'
Here another special variable is used. It's NF, which stands for the number of fields, or the number of columns, or the number of words in the current line. This one-liner sums up the number of words on each line and prints the grand total in the END block, which Awk executes after it has read all the input.
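If you'd like to try it, just pass it a file name (report.txt here is only a hypothetical example); for typical text files the result should match what wc -w prints:
awk '{ total = total + NF } END { print total }' report.txt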
Example 4: Print only lines shorter than 64 characters
awk 'length < 64'
This one-liner uses the length function to determine the length of the current line. If the current line is shorter than 64 characters, then length < 64 evaluates to true, which instructs Awk to print the line.
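The comparison can just as easily be flipped around. For example, a small sketch that prints only the lines that are 64 characters or longer is:
awk 'length >= 64'
Calling length without an argument is the same as calling length($0), that is, the length of the whole current line.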
Finally, let's take a look at an example that compares an Awk program with an equivalent C program. Suppose you want to print the list of all users on your system. With Awk it's as simple as this one-liner:
awk -F: '{ print $1 }' /etc/passwd
This one-liner says, "Take each line from /etc/passwd, split it on the colon and print the first field of each line." Very straightforward and easy to write once you know Awk!
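On a typical Linux system the first few lines of output would look something like this (the exact list of users varies from system to system):
root
daemon
bin
sys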
Suppose you didn't know Awk. Then you'd have to write it in some other language, such as C. Compare the one-liner above with the equivalent C program:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX_LINE_LEN 1024

int main() {
    char line[MAX_LINE_LEN];
    FILE *in = fopen("/etc/passwd", "r");
    if (!in) exit(EXIT_FAILURE);
    while (fgets(line, MAX_LINE_LEN, in) != NULL) {
        char *sep = strchr(line, ':');
        if (!sep) exit(EXIT_FAILURE);
        *sep = '\0';
        printf("%s\n", line);
    }
    fclose(in);
    return EXIT_SUCCESS;
}
This is much longer, and you have to compile the program before you can run it. If you make any mistakes, you have to fix them and recompile. That's why one-liners are called one-liners! They're short, easy to write, and they do one thing really well. I am pretty sure you're starting to see how mastering Awk and one-liners can make you much more efficient when working in the shell and with text files in general.
Once you read the e-book and work through the examples, you'll be able to solve the most common text processing tasks, such as joining lines in a file, numbering lines, replacing certain words and printing certain lines.
Book Preview
I prepared a book preview that contains the first 11 pages of the book. It includes the table of contents, preface, introduction and the first page of the second chapter.
Buy It Now
The price of the e-book is just $19.99 and you can buy it through PayPal.
After you have made the payment, my automated e-book processing system will send the PDF e-book to you in a few minutes!
Testimonials
Iain Dooley, CEO and founder of Working Software LTD:
It never ceases to amaze me that, even though I spend 50% - 70% of my day using a *nix command line and have done so for the past 6 years, there are still countless thousands of useful tools and tips to learn, each with their corresponding little productivity boosts. The trouble, of course, is finding the time to organise and prioritise them, deciding which are the most important and useful to learn. "Awk One Liners Explained" is a fantastic resource of curated examples that helped me rapidly pick up a few cool tricks that have already provided many times the value I initially paid for the book. Any professional who spends time working with *nix systems can benefit from this book.
Tweet About My Book
I am really excited about my book and I would appreciate your help spreading the word about it on Twitter.
My Other Books
I am so passionate about programming and writing about programming that I have now written my second e-book called Sed One-Liners Explained. It's written in the same style as this e-book and it explains sed, the Superman of Unix stream editing. Sed One-Liners Explained contains 100 well-explained one-liners and it's 98 pages long. Take a look!
And I am not stopping here - I am going to release several other books. My next e-book is called Perl One-Liners Explained and it's based on my Perl One-Liners Explained article series.
Enjoy!
Enjoy the book and let me know what you think about it. See you next time!