
Help with awk or sed


turtle
2012-11-17, 13:42
So I'm trying to run a one-liner that will parse a stack of log files and then give me a nice trimmed set of results. I'm getting stuck at one point because there is a pattern I want to use but I can't seem to find the right syntax for it. My preference is to use awk. Here is what I currently have:

# cat status.log*|grep "activated"|sort

Which gives a result of:
2012-11-17 12:17:45 [STATUS] turtle [/192.168.x.x:60172] activated

There are more lines than this, but this is the example I need. What I want to do is sed out that "[STATUS]" and then cut off the IP so the output would be date, time, and username, like this:
2012-11-17 12:17:45 turtle

If we went that route, the leading "[" would already be removed and I would be able to use the remaining one in my awk statement and be just fine. I would just do awk -F [ '{ print $1 }' or use cut and have the results I want.

At a minimum I would like to use the "[/" as the awk field separator and do something like awk -F [/ '{ print $1 }'. This only results in failure though. :(
awk: fatal: Unmatched [ or [^: /[//

So how about some unix guys helping a budding admin out? Man pages are looking like Chinese to me at this point.
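
From what I can tell, the fatal error is because awk treats a multi-character -F value as a regular expression, so the bare "[" opens a bracket expression that never gets closed. Escaping it gets past the error, although that still leaves "[STATUS]" in the first field, so it isn't the whole answer. A rough sketch of what I mean (the error message looks like GNU awk's, so I'm assuming gawk here):

$ echo '2012-11-17 12:17:45 [STATUS] turtle [/192.168.x.x:60172] activated' | awk -F '\\[/' '{ print $1 }'
2012-11-17 12:17:45 [STATUS] turtle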

Brad
2012-11-17, 18:15
I like a small challenge! :)

Here's my solution:
sed 's/\[[^]]*\]//' | awk '{print $1" "$2" "$3}'

And an example:
$ echo '2012-11-17 12:17:45 [STATUS] turtle [/192.168.x.x:60172] activated' | sed 's/\[[^]]*\]//' | awk '{print $1" "$2" "$3}'
2012-11-17 12:17:45 turtle

Here's the regex breakdown:

\[ = match a literal '[' character
[ = start a character class
^] = negate the class; since the ']' comes right after the '^', it is taken literally, so this means "any character except ']'"
] = end the character class
* = match zero or more of whatever the class matches
[^]]* = put those together, and this says, "match everything up to the first ']' character"
\] = match a literal ']' character
// = replace the match with nothing, which deletes it from the output

Then, I piped the output to awk to print the first three fields with single spaces between them.
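
Putting it back into your original pipeline, something like this should do it (a sketch; I haven't run it against your actual logs):
$ cat status.log* | grep "activated" | sort | sed 's/\[[^]]*\]//' | awk '{print $1" "$2" "$3}'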

Dunzo!
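
P.S. If that "[STATUS]" field is always in the same spot, you could probably skip sed entirely and just grab fields 1, 2 and 4 (a sketch, assuming the log format never shifts):
$ echo '2012-11-17 12:17:45 [STATUS] turtle [/192.168.x.x:60172] activated' | awk '{print $1, $2, $4}'
2012-11-17 12:17:45 turtle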

turtle
2012-11-17, 21:04
Thank you very much for the help and the explanation! It worked perfectly for me. :) Now I have a better idea of how to adapt it to other logs I have to go through. The more I thought about this one (a personal project), the more I realized I could use some of this stuff for the logs I go through at work too.

Access logs are a pain to sift through without good filters.