Working with SQL Server error logs in Log Parser Studio

Posted on April 05, 2015

Microsoft Log Parser Studio (LPS) has become one of my favourite (free) tools whenever it comes time to work with log files of any significant size. In the past I have always used one of the pre-defined log type formats when working with log files. Recently I needed to work with SQL Server error logs, and I was initially a little concerned that there was no pre-defined format for these logs; trying one of the standard log types didn’t give me the output I expected (or needed).

With a little bit of effort I managed to figure out what was required, and it turned out to be fairly simple. The trick was using the TSV format and then customising the settings.

LPS - select TSVLOG
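To give a flavour of where this ends up (the exact input-format settings are covered in the rest of the post), here is a minimal sketch. With the header row turned off, Log Parser's TSV format exposes the columns as Field1, Field2 and so on; the mapping of fields to date, source and message below is an assumption for illustration, not necessarily what the error log yields:

/* Sketch only - assumes the error log is read as headerless TSV,
   so the columns surface as Field1..FieldN */
SELECT Field1 AS LogDate, Field2 AS Source, Field3 AS Message
FROM '[LOGFILEPATH]'
WHERE Field3 LIKE '%Error%'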

 

Read the rest of this entry »

CISA, CGEIT, CRISC. What is a good score in an ISACA exam?

Posted on March 11, 2015

Please note these are all my own opinions and comments, based on my experiences and results. This is not an official ISACA post in any way.

There is not a lot of information available on ISACA exam scores and what would constitute a “good” score. After I received my exam results from the most recent CISA exam (and before receiving confirmation of a placing), I thought I had done quite well and wanted an idea of just how good a score this was. I looked around and really couldn’t find much.

Many years ago ISACA used to score their exams with a simple percentage score; 75% was required to pass. A number of years ago they switched to a new system, where the results are provided as a scaled score with a maximum of 800. A scaled score of 450 or higher is required to pass, which represents the minimum consistent standard of knowledge as established by ISACA’s Exam Certification Committee (there is one for each qualification). The score represents a conversion of individually weighted raw scores onto a common scale. As such you cannot apply a simple arithmetic mean to convert area scores to your total scaled score. (Wording from various ISACA sources.)

The best I could find was a comment on one site that a score of over 700 was a tremendous achievement. I had written two other ISACA exams over the last few years, so I went back and looked at those scores and could see I had done quite a bit better this time, although I had done well in those exams too.

So, for those interested, I publish my results from the last three exams. The information below is edited from the results emails received from ISACA after each of the exams in question. I am fairly sure that a “good” score is exam dependent and varies from exam to exam and year to year. Nonetheless, take this for what it is, set yourself a lofty goal, and good luck with your studies towards achieving one of ISACA’s globally recognised and universally accepted qualifications in the space of Risk, Security, Governance and Compliance.

If you happen to be based in Durban and are planning to write the CISA exam (or one of the others), we are hosting a facilitated study group at the ITSec offices in Durban. This is a no-cost endeavour for the good of the community. Come along and join us. We had a planning session last night and will be meeting weekly from next Tuesday. Studying with a diverse bunch of your peers is a whole lot better than doing it alone. More details here: ITSec facilitated study group.

Justin J Williams

CA(SA), CISSP, MBA (UKZN), CISA (pend), CGEIT (pend), CRISC (pend)

Director, ITSec.

 

Exam Results: December 2014 CISA exam

We are pleased to inform you that you successfully PASSED the exam with a total scaled score of 727. Your score was in the top 5 percent of those testing. For your information, your exam results by area are provided below.

SCALED SCORES OF YOUR PERFORMANCE BY AREA:

  • The Process of Auditing Information Systems: 800
  • Governance and Management of IT: 714
  • Information Systems Acquisition, Development and Implementation: 767
  • Information Systems Operations, Maintenance and Support: 615
  • Protection of Information Assets: 759

This score of 727 was ranked 1st in the World for the December 2014 CISA exam. 

As an aside, this is not the first time I had written the ISACA exam. I wrote and passed it back in 1996, with a score of 83% under the old scoring system. Why would I write it again? A little “oopsie” with CPE credits along the way meant I lost the certification at some point. When I took up a new position late last year, my new employer asked me to write the exam again.

Exam Results: June 2013 CGEIT exam

We are pleased to inform you that you successfully PASSED the exam with a total scaled score of 644. Your score was in the top 5 percent of those testing. For your information, your exam results by area are provided below.

SCALED SCORES OF YOUR PERFORMANCE BY AREA:

  • Framework for the Governance of Enterprise IT: 722
  • Strategic Management: 702
  • Benefits Realization: 615
  • Risk Optimization: 598
  • Resource Optimization: 540

This score of 644 was ranked 1st in South Africa for the June 2013 CGEIT exam. 

 

Exam Results: December 2012 CRISC exam

We are pleased to inform you that you successfully PASSED the exam with a total scaled score of 634. For your information, your exam results by area are provided below.

SCALED SCORES OF YOUR PERFORMANCE BY AREA:

  • Risk Identification, Assessment and Evaluation: 534
  • Risk Response: 688
  • Risk Monitoring: 650
  • Information Systems Control Design and Implementation: 650
  • Information Systems Control Monitoring and Maintenance: 727

Again, congratulations on passing the CRISC exam, we look forward to having you join the more than 16,000 professionals worldwide who have earned the CRISC credential.

This score of 634 was ranked 3rd in South Africa for the December 2012 CRISC exam. 

Visualisation of time based attacks on DMZ (videos)

Posted on August 24, 2014

Visualisation of two weeks of IPS data

Critical and high-significance IPS events detected on a public-facing Palo Alto device, visualised using Microsoft Excel Power Map, for a period in November and December 2013.

The data is taken from daily detection summaries, so although it covers a nearly two-week period it has 24-hour time resolution.

The attacks are differentiated between Spyware and Vulnerability.

Note the fairly constant levels of vulnerability attacks from China, Turkey & Indonesia.

The practical application of such a visualisation in detecting or preventing attacks is limited; however, it provides an effective mechanism to explain the level of attack (directed and random) against the organisation on a pretty much constant basis.


Visualisation of 24 hours of IPS data

Critical and high-significance IPS events detected on a public-facing Palo Alto device, visualised using Microsoft Excel Power Map, for a 24-hour period on the 10th and 11th December 2013.

The source data is per event detected over that 24 hour period.

The attacks are differentiated between Spyware and Vulnerability.

The video shows two types of visualisation. The first is a “phased decay”, where each attack is plotted and then fades away if not detected again. This shows the attacks coming and going across the globe, with the exception of China, which is a fairly constant source of attack.

The second segment shows a continuous growth in the sizes of the attack bubbles over the period. This illustrates the overall relative number of attacks from the various sources.

Note the main sources of vulnerability attacks being China, Turkey, Argentina & Indonesia.

The practical application of such a visualisation in detecting or preventing attacks is limited; however, it provides an effective mechanism to explain the level of attack (directed and random) against the organisation on a pretty much constant basis.

 

Guest lecture to UKZN 2014 MBA Class : Security & Ethics

Posted on August 24, 2014

In this past week I once again had the pleasure of speaking with the UKZN MBA class. It is always a pleasure to speak to a large group of some of the brightest minds in KZN. Unlike other presentations, these sessions are normally quite interactive, with the class willing to share their ideas, experiences and questions.

What stood out for me in this set of discussions were three key diversions.

1. Bank fraud, and the divergence in opinions between the bank representatives and victims (customers)

There is always a lot of interest in, and debate over, on-line fraud as it affects individuals. We all know someone, if not ourselves, who has been hit through some kind of bank fraud. In the class were a number of (un-named) employees of various (nameless) banks. They were adamant that the banks do their utmost to refund their customers in the event of fraud. The victims, however, had a polar opposite view and experience. They contended that the banks make it difficult to get your money back, denying, obstructing and delaying in the process while the victim suffers through not having access to the affected funds. For a bank dealing with hundreds of thousands of affected customers and millions in losses, a month may be a short period in which to resolve such an incident. For a victim needing access to their funds, a month is a payday away, and that money could mean the difference between being able to pay your bills or defaulting.

2. Online identities (and password management)

Online identities are increasingly becoming integrated with your professional life. When hiring, more and more organisations scan these identities to decide whether they wish to employ you. Whether this is done as part of the background checks (for which prospective employees normally sign permission) or through other means varies. Either way, taking control of, and responsibility for, your on-line identity is important. Also don’t forget about your children. They may not yet comprehend the gravity of the situation, and could be creating a fun-filled but wholly undesirable persona that they come to regret later in life, when they join the job market and are unable to control or erase their past sharing.

Related to this discussion was the age-old one of passwords and password re-use. The dangers of password re-use were discussed in detail, along with some schemes for password protection. Consider the example of someone using the same password across all on-line services: the local camera club gets hacked and usernames and passwords are revealed; those same passwords are used to log into Gmail; a Facebook “I lost my password” request results in the reset being mailed to that Gmail account; and very quickly the entire on-line identity is stolen.

Some tips: use different passwords on-line, and at the very least don’t use your primary mail account password anywhere else. It is better to use a password manager on your mobile (LastPass, BlackBerry Password Keeper etc.) than to re-use passwords. Also don’t use your phone address book to store passwords, bank PINs or account numbers. If you use an iPhone or Android phone then this information is generally synchronised to the cloud, so when that Gmail account is hacked they also have all of your phone book without you ever knowing.

3. Return to old school

There was a comment / view put forward that, with all of the information security breaches and the discoveries of organisations and nation states lying to citizens about what is happening in this space, it would be better to return to the (golden) “olden days”. While that may appear to be the case, memory can be a strange thing. We often remember the good and forget the bad. Not so many years ago, when cheques were still in common use, cheque fraud was rife. The banks didn’t like to disclose information on fraud (and still don’t), but some of the stats I remember seeing flashed up at fraud conferences indicate that the fraud we are seeing now is just a fraction of what was experienced at the peak of cheque fraud. Social media and the online information era just increase the level and speed of information sharing. The fewer incidents that happen now are simply more widely reported and shared than ever before. Instances of misrepresentation and abuse by companies (and countries) are now more widely shared and reported; what is not clear is whether the actual occurrences are on the rise or just more visible.

We cannot go back in time; we need to move with the times. That said, a dose of healthy scepticism in all we are doing can only be a good thing. Ask questions until you are satisfied with the answers. You may choose to trust, but trust and verify; don’t trust blindly.

Finally

Embedded below is a link to download the slides. Thanks for attending the sessions and for participating.  Feel free to drop me any questions you may have (or leave them here).

Information Security and Ethics 2014 August 2014

 

Thanks Andrew for the invitation and facilitating the discussion.

 

Analysing SCCM Logs with Log Parser Studio

Posted on June 21, 2014

Microsoft System Center Configuration Manager (SCCM) is used in an Active Directory environment to, amongst other things, deliver patches and antivirus updates to servers and workstations.

In a large environment you can have quite a number of SCCM servers operating in tandem to deliver data to the client devices. Obtaining details of exactly which endpoints are being served by which servers (or how busy each server is) isn’t quite as straightforward as one might imagine. The client devices all connect to the servers through HTTP calls to collect the packages they require, so the IIS logs from the SCCM servers can be downloaded and analysed to figure out what is happening in the environment.

The IIS logs contain a number of useful fields (a sample is included below):

LogFilename : E:\logs\SCCMP101\SCCMP101\W3SVC1\u_ex140410.log
LogRow : 5
date : 41739
time : 36526.7730902778
c-ip : 10.75.xx.xx
cs-username :
s-sitename :
s-computername :
s-ip : 10.98.xx.xx
s-port : 80
cs-method : GET
cs-uri-stem : /SMS_MP/.sms_aut
cs-uri-query : MPLIST
sc-status : 200
sc-substatus : 0
sc-win32-status : 0
sc-bytes : 608
cs-bytes : 124
time-taken : 15
cs-version :
cs-host : sccmp101.xx.net
cs(User-Agent) : SMS+CCM

Further fields such as Cookie, Referer, event and process types are present, but I found these to be generally blank. The above example includes the data transferred (sc-bytes and cs-bytes), which is not logged by default and which I found quite useful. These fields can be activated in IIS easily enough.
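If you prefer the command line to the IIS Manager GUI, something along these lines should turn them on (a sketch only: this sets the server-level site defaults, the field list is up to you, and BytesSent and BytesRecv correspond to the sc-bytes and cs-bytes columns):

rem Hypothetical example - enable the byte-count fields in the W3C log definition
appcmd.exe set config -section:system.applicationHost/sites -siteDefaults.logFile.logExtFileFlags:"Date,Time,ClientIP,ServerIP,Method,UriStem,UriQuery,HttpStatus,BytesSent,BytesRecv,TimeTaken" /commit:apphost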

In my use case I obtained the logs from 83 servers, which amounted to 2859 files over 254 folders, coming to 122GB (uncompressed). This proves to be a little bit of a challenge when I don’t have SQL Server installed on my PC or laptop, MS Access is limited to a 2GB database, and even SQL Express 2014 is limited to 10GB.

I had previously heard of (but not used) Microsoft Log Parser. A quick search revealed version 2.2 (released some 9 years ago in 2005 – but don’t let that put you off) available for download. This is a pretty nifty tool, as it understands log files in many different formats (and even plain text files). This saves you from having to clean up the log files to strip out headers (which are written to the logs every time the web server starts or restarts) and from having to combine many different files. A real time-saver.
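By way of illustration, a typical invocation looks something like this (the path mirrors the sample above, the query is invented for the example, and IISW3C is Log Parser’s input format for IIS W3C logs):

rem Hypothetical example - count hits per server IP across all logs for one server
LogParser.exe -i:IISW3C -o:CSV "SELECT s-ip, COUNT(*) AS Hits INTO summary.csv FROM 'E:\logs\SCCMP101\SCCMP101\W3SVC1\u_ex*.log' GROUP BY s-ip"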

You can then write SQL-like queries and have them executed against your log files to get your results. With data of the size above, on my 8GB i7 870 @ 2.93GHz running off a Seagate 2TB 7200rpm SATA drive, it takes around 3.5 hours to run a query (if you know how to speed this up fundamentally, do share). Using Task Manager to monitor the execution of the query shows CPU utilisation of only around 8% (one thread sits at about 50% utilisation), memory utilisation from about 50MB upwards (as the result set grows) and disk throughput varying from about 8MB/s to 30MB/s. So I am not quite sure where the bottleneck lies.

Writing the queries at the command line is a little bit of a pain, so the next bit of genius is Log Parser Studio. Unlike Log Parser, the Studio is kept up to date, with the latest build being 2.0.0.100 from 23 May 2014. The Studio provides a library of over 170 queries covering logs from ActiveSync, IIS, Windows event logs, Exchange and Outlook Web Access, amongst others. It covers a huge number of ideas for useful analysis and provides the queries to do it for you. What is great is that you can use these, modify them for your own purposes or create your own from scratch, and add them all to your own personal library.

For example, to understand what the patterns of access look like over a month, a query such as this can be pretty useful.

/* Server total time and data transferred by date */
SELECT s-ip, date, COUNT(s-ip), SUM(time-taken), SUM(sc-bytes), SUM(cs-bytes)
FROM '[LOGFILEPATH]'
GROUP BY s-ip, date

A challenge you quickly come across is that both Log Parser and the Studio (which depends on the Parser) are 32-bit applications, so you need to be careful about which fields, and how much summarisation, are included in the result set. If the results grow too quickly then the query will crash as it runs out of memory. The trick is to find a balance between too much information (it crashes) and too little information (you need to run many queries). Finding the right balance means further analysis can be done in Excel or Access. I have found that 30 000+ rows still works if few enough fields are chosen and some smart queries are used.

When executing the above query, a problem is that the summed bytes transferred exceed the maximum value the data type can store, so you end up with negative numbers for some of the lines returned. Rewriting the query as follows helps resolve this problem:

/* Server total time and data transferred by date */
SELECT s-ip, date, COUNT(s-ip), SUM(DIV(time-taken,60)), SUM(DIV(sc-bytes,1024)), SUM(DIV(cs-bytes,1024))
FROM '[LOGFILEPATH]'
GROUP BY s-ip, date

Log Parser supports a huge number of functions, but you need to know the format to use them. Take a look here for a listing and examples: http://logparserplus.com/Functions

So whereas I would have expected sum(time-taken/60) to give me a result in minutes, it fails with an unknown field error. Even sum((time-taken)/60) fails. Checking the function reference shows that Log Parser wants it as SUM(DIV(time-taken,60)), and then life is happy again (strictly speaking, since IIS logs time-taken in milliseconds, DIV(time-taken,60000) would give minutes). Running your query again having just spent 3 hours waiting for the last one to complete – a little less so.
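One habit that avoids those 3-hour round trips is to sanity-check your field names and syntax against a single log file first, for example (file name taken from the sample above):

/* Quick field-name and syntax check against one file before the full run */
SELECT TOP 10 s-ip, date, time-taken, sc-bytes
FROM 'E:\logs\SCCMP101\SCCMP101\W3SVC1\u_ex140410.log'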

Using these tools and queries I was able to summarise the 160 gig of source data down to a few thousand rows that could be imported into Access and joined to CMDB data, producing really useful intelligence that can be analysed directly in Access, Excel or Gephi. Thanks Microsoft for these great free products.

I also looked for further pre-built queries for Log Parser Studio but was unable to find any. If you know where some may be found, please do share. If there is interest, and as I use the tool more, I will be happy to share anything I have created.


Timeline of my tweets for #itwebsec IT Web Security Summit 2014

Posted on June 03, 2014

I once again had the privilege of attending the IT Web Security Summit in May 2014. As always when attending these large events, I try to cover the presentations I attend through tweets. This creates quite comprehensive coverage, as I also monitor the hashtag for the event (#itwebsec in this case) and then re-tweet other bloggers’, journalists’ and active attendees’ tweets. In the end I believe my timeline is quite a useful archive of the social media (Twitter) coverage of the event.

Now trying to archive this for a particular event is somewhat problematic. The web interface for Twitter provides a nice view with the tweeters’ profile pics, stats (retweets etc.) for each tweet, and the like. It is, however, a real pain to put this into a format which I can post onto my blog. I tried editing the HTML of a saved page, but without decent tools that HTML code is just unmanageable.

I came across tweetbook.in, which provides a sort of journal creation facility: you give it a start and end date and it spews out a PDF. Sadly the formatting is quite poor and no pictures are included. It does, however, give the basic timeline, and the tweets are timestamped and in chronological order, so it is much better than nothing.

Below is the tweetbook

JJZA Tweetbook

And the PDF’d Twitter page – scroll down until you find the relevant tweets; sorry, no selection options, and it is in reverse chronological order.

(oops – the file was 72 meg so exceeds the filesize limit for inserting. Pity indeed).

I found another option (twournal.com) which lets you create (and even sell, if you like) books from your tweets. I generated a book for the period, but it will only be mailed to me in 24 hours. Depending on the size and outcome I will link that here too.

(twournal to come here)

If anybody knows of a better way of doing this then please do share. It would be sad to see the various events covered go to waste and be lost in the depths of cyberspace.


The Heartbleed bug: a short presentation given at the KZN ISACA Chapter meeting

Posted on June 03, 2014

I was honoured to be asked to make a (short) presentation at the May 2014 KZN ISACA Chapter meeting. The session went down well, with probably around 25 people attending.

Attached is the PDF of the presentation.

I hope that some of the members present found it useful and that you, my readers, do too.

Feedback as always most welcome.

The Heartbleed Bug ISACA presentation v3

 

Visualising Security Data : SCCM patching traffic flows

Posted on March 03, 2014

I have been experimenting a little recently with visualisation of security data.

We have had some challenges with SCCM and needed to understand which clients were connecting to which servers, where and why. This data seemed very hard to come by; after some discussions with some helpful Microsoft South Africa folk, the service provider pulled the IIS logs from most of our SCCM Primary Servers and Distribution Points.

I then added a Destination column (being the server from which the log was pulled) and combined the logs from all of the servers (6 Primary and 6 Distribution). In MS Access I then summarised the data by source and destination pairs, providing 13 952 connections. This was exported as CSV and headings were added using Notepad (Gephi wouldn’t read the data file without headings named to its liking).
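For reference, the headings Gephi’s spreadsheet import looks for are Source and Target, with an optional Weight column; the first lines of such a file look something like this (the values shown are invented for illustration):

Source,Target,Weight
10.75.12.34,SCCMP101,4821
10.75.12.35,SCCMDP03,962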

The data was then loaded into Gephi as edge data. I searched for each of the 12 servers in the node table, added a Label, and changed the colour and size (red, size 30, for Primary; blue, size 20, for Distribution), then selected the Force Atlas layout and let it plot my data. The 13 546 nodes and 13 952 edges were then plotted, providing the graph below (exported as PDF).

The graph was somewhat unexpected, in that I did not foresee so many of the workstations being served from Primary servers, nor so many devices receiving data from multiple servers. A few of the DPs (top and bottom of the image) are clearly not serving the numbers of workstations we would expect and need deeper investigation.

While Excel cross-tabs and more detailed Access queries provide deeper insight into what is going on, this visualisation very quickly demonstrates a very different picture from the one the service provider running the SCCM infrastructure had been describing.

Have you done anything similar? Please do share.

 

Map of SCCM links using Gephi

 

Download the PDF version here: map of sccm v2

 

UKZN MBA presentation 8th August 2013 : Information Security & Ethics

Posted on August 11, 2013

On Thursday the 8th August 2013 I was once again privileged to be the guest lecturer for the UKZN MBA programme. Despite Friday being a holiday and the start of the long weekend there was a great turnout. Thanks to all the students for all your questions and contributing to making it an entertaining session.

Below is the link to the slides. Please feel free to contact me if you have any questions or would like to discuss the subject further.

security and ethics UKZN MBA August 2013

UKZN MBA 2013 Presentation : Security & Ethics

Posted on March 02, 2013

On Thursday afternoon I was privileged to speak to the UKZN 2013 MBA class on information security and ethics. Below is a copy of the presentation. Lots of detail in here which we didn’t get to cover in the two hours together, and lots to remind you of the things we shared. I hope you all enjoyed the time as much as I did.

Feel free to mail me or post any questions here.

Justin

Download PDF presentation : security and ethics 2013 UKZN MBA Feb 2013

 

Transversal password cracking with NMAP (without downloading the hashes)

Posted on February 16, 2013

A few months back I discovered that our service desk had become a little “lazy” and were no longer using the defined process (identify user, randomly generate new password, set to change on first use) and were now handing out weak passwords without requiring the users to change them.

In order to assess the extent of the problem, I wanted to run a test against the domain to see how widespread it was. I Googled around a bit to try to identify a tool which could perform the exercise for me, but didn’t really find anything that looked suitable. I knew that I didn’t want to grab the hashes and do an off-line attack; I wanted instead to do it “live” against the domain, both to avoid the responsibility of having a copy of all the hashes (the risk is too high, and as Head of Infosec I didn’t want that on my head) and also to test the alertness of the security operations centre in detecting the attack.

My criterion was simple: find a tool that, given a file of usernames and a file of passwords, would test the usernames with the given passwords.
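The detail is in the rest of the entry, but for a flavour of the approach: NMAP’s NSE brute-force scripts take exactly these two inputs, via the userdb and passdb script arguments. A sketch (the target and file names are made up, and smb-brute is an illustrative script choice rather than a statement of exactly what I ran):

# Hypothetical example - test each username/password pair against a domain controller over SMB
nmap -p445 --script smb-brute --script-args userdb=users.txt,passdb=passwords.txt dc01.example.net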

Read the rest of this entry »

Zacon presentation : Game Hacking : Ross Simpson

Posted on October 27, 2012

Unfortunately I missed Zacon this year, again! So far I have only heard good things about it. I picked up on Twitter a short while ago that Ross Simpson had presented on “Game Hacking” and had made his presentation available for download, along with a number of the tools referenced. It makes a most interesting read, provides a good history of game hacking, and is an introduction for those who want to get further into it.

Go and download the presentation and take a read here: http://hypn.za.net/zacon4/

Thanks Ross for supporting Zacon, and for sharing a most interesting presentation.

Security considerations for Cloud Computing (ISACA publication)

Posted on October 13, 2012

ISACA has released its latest book on cloud computing: Security Considerations for Cloud Computing. Earlier in the week I received notification that my personal copy is with FedEx on its way to South Africa, one of the perks of being an expert reviewer on the panel for the publication.

Another publication in the Cloud Computing Vision Series, Security Considerations for Cloud Computing presents practical guidance to facilitate the decision process for IT and business professionals concerning the decision to move to the cloud. It helps enable effective analysis and measurement of risk through the use of decision trees and checklists outlining the security factors to be considered when evaluating the cloud as a potential solution.

There are five essential characteristics, three service models and four major deployment models to take into account relative to cloud computing. To ensure a common understanding, this publication describes each of these characteristics and models.

This guide is meant for all current and potential cloud users who need to ensure protection of information assets moving to the cloud.

If you are making any significant use of cloud computing I would recommend you get your hands on the publication. It is free for members to download; a hard copy is $35 for members and $70 for non-members.


I’m looking for staff : Security, Governance, Risk and Compliance

Posted on September 01, 2012

Six more positions are available in the Enterprise Information Security Management team at Transnet, within the IT Security, Governance, Risk and Compliance competency areas.

We have a lot of challenging but interesting work ahead of us. If you want to learn a lot, apply what you have learned, be part of a hard working and performing team, then please apply :)

  • ICT Continuity Compliance Manager
  • IT Risk and Compliance Manager
  • Information Security Subject Matter Expert
  • Information Security Analyst (SME) x 2
  • Senior Security Analyst (inc Forensic & Incident)

These positions are all based in the Johannesburg CBD (Carlton Centre) and are manager or senior consultant level positions.

External applicants must apply by submitting CVs electronically to recruitment@transnet.net by 16h00 on 07 September 2012. Any questions regarding the positions should be sent to linneth.mpete@transnet.net.

Further details for each of the positions can be found here: http://lnkd.in/gyy9FR (Google Plus)

We urge all our employees, clients, members of the public and our suppliers to report any kind of fraud or corruption at Transnet. Call the hotline toll free number: 0800 003 056 or email Transnet@tip-offs.com
