Today I noticed we were getting an increasing amount of spam on one of our form pages. I was curious to see if all of the user IP addresses were the same (in which case I’d just add them to the IIS7 IP Restrictions list). To quickly and easily figure this out I decided to use LogParser. Besides just querying for the page though, I wanted to add an additional condition to exclude rows that came from a certain internal IP address that we use for monitoring.

Here’s a generic version of the query I used:

LogParser.exe -q:on "SELECT * FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/' AND c-ip<>''" >c:\temp\PageVisitors.txt

I wanted to see the full logged data for the request, but if I didn’t, I could have very easily just pulled the IP addresses using:

LogParser.exe -q:on "SELECT c-ip FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/' AND c-ip<>''" >c:\temp\PageVisitors.txt

You can see that I’m redirecting the results to a text file (the “>c:\temp\PageVisitors.txt” part — note that it belongs outside the quoted query) so that I can easily work with them. You may also want to take note that I’m using the “-q:on” flag, which runs the command in quiet mode. If you don’t set this flag, LogParser shows results one page at a time. When the output is going to a text file rather than the command prompt window, you obviously can’t hit a key for “next page,” so without this flag the query will hang forever if there is more than one page’s worth of results.
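As an alternative to shell redirection, LogParser’s SQL dialect also supports an INTO clause that writes results straight to a file, which sidesteps the paging/quiet-mode issue entirely. A sketch using the same paths as above (the output format and file name are my own choices, not the author’s):

```shell
:: Write matching rows directly to a CSV file via the INTO clause;
:: no shell redirect (and no -q:on) is needed because LogParser
:: itself writes the output file.
LogParser.exe -i:W3C -o:CSV "SELECT c-ip INTO c:\temp\PageVisitors.csv FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/'"
```

The CSV output is also convenient if you later want to open the results in Excel to count distinct IPs.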

Happy hosting!

    11 thoughts on “Using LogParser to Check Visitor IPs to a Certain Page”


    • April 12, 2013 at 2:58 am

      Nice post. While LogParser can be a great tool, I’d prefer the ported GnuWin32 tools, like grep.exe, cut.exe, etc., for such easy tasks. My fingers type grep/cut commands faster than SQL.

      On the other hand, LogParser has the advantage of selecting multiple pieces of information at once, where you’d need more than one GnuWin32 tool to do the same job.

      PS: SANS and Symantec have great articles about using LogParser to gather forensic information. Google for them; I’m sure you’ll like them.

      • April 12, 2013 at 1:30 pm

        I haven’t used the GnuWin32 tools but I’ll check them out. Thanks.

        • April 19, 2013 at 7:17 am

          FYI: I created a small introduction on my blog/site.

          One of the problems I’m facing with LogParser is not being able to use it recursively when the log files are in different folders, for example DIR*\File.log; with DIR\*.log, recursive usage is possible.

          • April 20, 2013 at 6:45 am

            Jan – how about starting the query one folder level higher (folder that contains both of the other folders)? Then use the -Recurse flag but set the match to only catch files you want? It might take longer if it has to scan lots of other folders, but still should only catch the files you filter for.
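For what it’s worth, that suggestion might look something like the following — the parent folder and file pattern here are placeholder assumptions, and it relies on the -recurse input parameter documented for LogParser’s IIS log formats (-1 means unlimited depth):

```shell
:: Start the query at the common parent folder and let LogParser
:: recurse into subfolders; the FROM pattern filters which files
:: are actually read.
LogParser.exe -i:IISW3C -recurse:-1 "SELECT c-ip FROM x:\wwwlogs\u_ex*.log WHERE cs-uri-stem='/SomePage/'"
```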

            • July 17, 2013 at 2:17 am

              An update: I was browsing through my old archives and found the following, which kinda solves my *\file.log problem. Just use 'for' with 'dir' to list all log files, for example:

              for /f %i in ('dir /s/b u_ex1112.log') do @LogParser -i:w3c "SELECT COUNT(cs-method) AS nmb_get FROM %i WHERE date = '2011-12-05' AND time = '18:30' AND cs-method = 'GET'"

              How could I forget… :)

          • April 29, 2013 at 4:22 pm

            Re quiet mode, you should try specifying -o:csv instead. If you want multiple queries to output to append to the same txt file, you need -filemode:0 too.
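Putting the commenter’s two flags together might look like this — the paths reuse the placeholders from the post, and -fileMode:0 is the CSV output format’s append-instead-of-overwrite setting:

```shell
:: CSV output avoids the paging/quiet-mode issue; -fileMode:0 appends,
:: so several queries can accumulate results in the same file.
LogParser.exe -o:CSV -fileMode:0 "SELECT c-ip INTO c:\temp\PageVisitors.csv FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/'"
```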



    • July 4, 2013 at 2:16 am

      Log Parser Lizard is a GUI for MS LogParser. Very useful for quickly building and organizing queries (supports IntelliSense, charts, export to Excel, row filtering, etc.). Thought you should know about it.
