
Mark O'Neill is VP Innovation at Axway. Previously, he was CTO and co-founder at Vordel, acquired by Axway in 2012. He is the author of the McGraw-Hill book "Web Services Security" and a frequent speaker at conferences including JavaOne, the RSA Security Conference, and Oracle OpenWorld. Mark is based in Boston, Massachusetts. Mark is a DZone MVB (not an employee of DZone) and has posted 64 posts at DZone.

How to Set Logging Content for Tools like Splunk

09.12.2012
The Vordel API Server allows you to set the data which gets logged into its various log destinations: log files, database, syslog, etc. Very often, it's useful to log in a particular format for log-crunching tools such as Splunk to process. Here is how to do this:

In the "Settings" section of the API Server configuration in Policy Studio, there is an "audit log" tab. In the screenshot below, I'm setting the format of what gets put into the text logs. You can see that I've chosen it to be:

${timestamp},${id},'${text}'



 

So what does "${timestamp},${id},'${text}'" mean? It means that I'm logging the timestamp, the message ID (which is autogenerated for each message), and the text of each logging filter. I've chosen to separate them with a comma. So the next question is, what goes into ${text}?
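To make the substitution concrete, here is a minimal Python sketch of how such a format string expands. The `expand` helper and the `message` dictionary are my own illustration, not the API Server's internal code; the server populates these attributes automatically for each message.

```python
import re

# Hypothetical message attributes; in the API Server these are
# populated automatically for each message passing through a policy.
message = {
    "timestamp": "09.11.2012 22:59:11,884",
    "id": "Id-0599f187504fb42f112c0c5d",
    "text": "JoeDeveloper accessed the API",
}

# The format string configured in the "audit log" tab.
FORMAT = "${timestamp},${id},'${text}'"

def expand(fmt, attrs):
    """Replace each ${name} placeholder with the matching attribute value."""
    return re.sub(r"\$\{([\w.]+)\}", lambda m: str(attrs[m.group(1)]), fmt)

print(expand(FORMAT, message))
# 09.11.2012 22:59:11,884,Id-0599f187504fb42f112c0c5d,'JoeDeveloper accessed the API'
```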

The answer can be seen by looking at any filter in a policy in Policy Studio and clicking "Next". This is where you configure what gets logged, and what gets logged is what goes into that ${text} value. In the example below, I have a filter which logs the following on the "Success" condition:
${authentication.subject.id} accessed the API
If you're doing authentication in your policy, then the value of ${authentication.subject.id} is automatically populated with the authenticated client's ID. Note that I am overriding the default logging settings (where only the "Failure" case is logged), because I want the info in the "Success" case to go into the log. This is what goes into the ${text} part of ${timestamp},${id},'${text}'.



The log files for the API Server v7 are written out to \groups\group-n\instance-m\logs\audit, where n identifies the group the API Server belongs to and m identifies the API Server instance within that group.

So now, when I look at the log file being written out, I see this: 
# ProductName=apiserver1 7.0.0-2012-07-07 (Windows)
# CurrentDate=Tue, 11 Sep 2012 17:58:16 -0500
# CurrentDateUTC=1347400696
# TZ=Eastern Daylight Time

09.11.2012 22:59:11,884,Id-0599f187504fb42f112c0c5d,'JoeDeveloper accessed the API'
09.11.2012 22:59:11,916,Id-58e1fd41504fb42f06110000,'JaneDeveloper accessed the API'
09.11.2012 22:59:12,070,Id-ab634e7c504fb430152c0c5d,'JoeDeveloper accessed the API'
This is simple for a tool like Splunk to process.
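Because each field sits at a fixed position, a simple regular expression can recover the timestamp, message ID, and text from every line. This Python sketch (the pattern and field names are my own illustration, not Splunk's configuration) shows the kind of extraction a log-crunching tool performs:

```python
import re

# Sample lines copied from the audit log output above.
LOG_LINES = [
    "09.11.2012 22:59:11,884,Id-0599f187504fb42f112c0c5d,'JoeDeveloper accessed the API'",
    "09.11.2012 22:59:11,916,Id-58e1fd41504fb42f06110000,'JaneDeveloper accessed the API'",
    "09.11.2012 22:59:12,070,Id-ab634e7c504fb430152c0c5d,'JoeDeveloper accessed the API'",
]

# Timestamp (with millisecond part), message ID, then the quoted text.
PATTERN = re.compile(r"^(?P<ts>[\d.]+ [\d:]+,\d+),(?P<id>Id-\w+),'(?P<text>[^']*)'$")

def parse(line):
    """Split one audit-log line into its named fields, or None if malformed."""
    m = PATTERN.match(line)
    return m.groupdict() if m else None

entries = [parse(line) for line in LOG_LINES]
for entry in entries:
    print(entry["id"], "->", entry["text"])
```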

While we're looking at the logging settings, let's look at the database logging. You can see in the screenshot below that I'm logging to the database: 



The same content, ${authentication.subject.id} accessed the API, gets logged to the database. It can then be found in the API Server's Analytics, in the Audit Trail. In the screenshot below, you can see the Audit Trail view showing the entries. This is the same information as is written to the text log. In fact, if I also choose to write to syslog, say, then the same information is written there too.



Clicking on any of these entries, I can see the message ID and the filter which wrote that log text. If many filters in my policy wrote out log data for the same message, this is where I see the path through the policy, filter by filter.

I can also search the Audit Log here. Let's say I'm only interested in the API accesses by "Joe Developer". In the screenshot below, I construct a search for lines which contain "Joe":
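The same kind of search can be mimicked over the text log with a few lines of Python. The sample lines are copied from the log output earlier in the article; the per-user count at the end is just an illustration of the Splunk-style aggregations this format makes easy, not a feature of the Audit Trail itself.

```python
from collections import Counter

# Sample lines copied from the audit log output shown earlier.
LOG_LINES = [
    "09.11.2012 22:59:11,884,Id-0599f187504fb42f112c0c5d,'JoeDeveloper accessed the API'",
    "09.11.2012 22:59:11,916,Id-58e1fd41504fb42f06110000,'JaneDeveloper accessed the API'",
    "09.11.2012 22:59:12,070,Id-ab634e7c504fb430152c0c5d,'JoeDeveloper accessed the API'",
]

# Keep only the entries mentioning "Joe", like the Audit Trail search above.
joe_lines = [line for line in LOG_LINES if "Joe" in line]
print(len(joe_lines))  # 2 of the 3 sample entries

# A Splunk-style aggregation: count accesses per client ID.
# The client ID is the first word inside the quoted ${text} field.
counts = Counter(line.split("'")[1].split()[0] for line in LOG_LINES)
print(counts.most_common())
```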

So in summary, there is great flexibility in the Vordel API Server for both the format (e.g. comma-separated) and content (e.g. insert Client IDs) of logged data.
   
Published at DZone with permission of Mark O'Neill, author and DZone MVB. (source)
