
How to enable logging in IIS 7.0

February 24, 2014 by kiranbadi1991 | Comments Off on How to enable logging in IIS 7.0 | Filed in Development, Environment, Performance Engineering, Web Server

IIS 7 on Windows Server 2008 has significantly improved logging capability compared to IIS 6. IIS 7 ships with the following logging-related modules, which are available by default on setup:

HTTP Logging Module (loghttp.dll)

Interacts with HTTP.sys and records request status information. It is required for the generation of logs.

Failed Request Tracing Module (iisfreb.dll)

Logs failed requests for debugging purposes.

Request Monitor Module (iisreqs.dll)

Monitors worker process activity.

Tracing Module (iisetw.dll)

Provides ETW tracing for capturing trace files.

Custom Logging Module (logcust.dll)

Logs information from custom logging modules.

All these modules are located in the %windir%\System32\inetsrv directory of the server (see the screenshot below).

[Screenshot: logging modules in the inetsrv directory]
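
If you prefer the command line, you can also list the modules that are registered with IIS using appcmd. A quick sketch, assuming appcmd.exe is in its default location:

%windir%\System32\inetsrv\appcmd.exe list modules

If HTTP logging is installed, you should see an entry such as HttpLoggingModule in the output.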

With IIS 7 it is possible to log information on a per-site basis or globally for the server. It is also possible to log only the failed requests or only the successful ones. Centralized logging can be done in either binary or W3C format.

To enable logging, follow the steps below:

  • Open Server Manager: open the Run dialog and type CompMgmtLauncher.exe.
  • Click Roles (Web Server role) and check whether HTTP Logging is installed. If it is not, you need to install it by adding the appropriate role service.
  • Check the HTTP Logging box and install it.
  • Close Server Manager.
  • Once enabled, you should see something like the screen below when you click on the HTTP Logging module (the screenshot is from IIS 8). A command-line alternative for installing the role service is shown after the screenshot.

[Screenshot: HTTP Logging module settings in IIS Manager (IIS 8)]
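
If you prefer not to click through Server Manager, the HTTP Logging role service can also be installed from the command line. A sketch, assuming Windows Server 2008/2008 R2; the feature names Web-Http-Logging and IIS-HttpLogging are the ones I recall, so verify them on your server before running:

ServerManagerCmd.exe -install Web-Http-Logging

or, using DISM on later versions of Windows:

dism /online /enable-feature /featurename:IIS-HttpLogging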

Once logging is enabled, the corresponding <log> tags are automatically created in the applicationHost.config file:

<log>
  <centralBinaryLogFile enabled="true" directory="%SystemDrive%\inetpub\logs\LogFiles" />
  <centralW3CLogFile enabled="true" directory="%SystemDrive%\inetpub\logs\LogFiles" />
</log>
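
The same settings can also be changed from the command line with appcmd. A minimal sketch, assuming the centralLogFileMode attribute of the system.applicationHost/log section (verify the attribute name against your applicationHost.config before running):

%windir%\System32\inetsrv\appcmd.exe set config -section:system.applicationHost/log -centralLogFileMode:CentralW3C /commit:apphost

This switches the server from per-site logging to centralized W3C logging.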

It is always a best practice to write logs to a drive separate from the system drive where IIS is installed, or from the drive where the heavy lifting of serving requests takes place. One way to do this is sketched below.
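
A minimal sketch of pointing the default per-site log directory at a separate drive with appcmd; D:\IISLogs is just an example path, and the siteDefaults.logFile.directory attribute should be verified against your applicationHost.config:

%windir%\System32\inetsrv\appcmd.exe set config -section:system.applicationHost/sites -siteDefaults.logFile.directory:"D:\IISLogs" /commit:apphost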


Transaction Response time–First Indicator of Application Performance

February 14, 2014 by kiranbadi1991 | Comments Off on Transaction Response time–First Indicator of Application Performance | Filed in Performance Center, Performance Engineering, Performance Test Tools, Quality, SilkPerformer, Small Businesses, Web Server

What do you mean by Transaction ?

What do you mean by Transaction Response time ?

Why is the Transaction Response time a key Performance Indicator for the application or system ?

A lot of young people who want to become performance engineers ask me these kinds of questions. I believe a transaction is always associated with a context. For a DBA, it could mean committing the output of a SQL statement to the database, or rolling back the entire statement and bringing the database back to its initial state after a failure. For a developer, it could mean one business requirement equals one transaction, and for a banker or domain specialist, it could be one entry in the ledger. From the perspective of a performance engineer, it could mean any of the three. It just depends on the objective of your tests and on what you are trying to measure in the system or application.

The way a performance engineer sees a transaction is a bit different from the way a developer sees it. Some examples of transactions as seen by a performance engineer are:

  • Time taken to log in to the application.
  • Time taken to generate and publish a report to the users.
  • Time taken to load data into the database, or time taken to process xx amount of data and load it into the database (batch jobs).

If you look closely at the examples above, you can see that a transaction is always associated with time. The time taken to do activity X is the prime focus of the performance engineer. The same might not be true for other folks like DBAs, domain experts, or developers.

One reason performance engineers always associate time with a transaction is that most performance tools have taught us to see transactions this way: wrap the activity/event/business functionality between start and end transaction markers, calculate the difference between the start and end times, and report that difference as the transaction response time for the activity. Most load testing tools work this way, and this logic holds for almost 99.9999% of applications. However, this approach does not gel well with technologies where a non-blocking UI is preferred over a blocking one. Comet/push are technologies where marker-based transactions do not work well unless you do some kind of hacking. So I believe that marker-based transaction solutions work only for technologies where the user waits for the response to come back.

Transaction response time is the most important characteristic of system performance from the user's point of view. If users are satisfied with the speed at which the system lets them get their work done, then probably no effort is required to speed it up; otherwise, a lot of effort is required. Sometimes users accept a somewhat higher response time because they know the program has a lot of analytical calculation to do before it can bring data back to them. However, as more and more users get into the system, they start experiencing slowness: the application takes more time to respond, and functionality that used to respond in 3 to 5 seconds now takes 5 to 8 seconds. As the user load increases, response time keeps increasing, and after a certain point the application fails to respond to user requests at all. Performance engineers call this behavior the breaking point of the application. There can be many reasons for an application to stop responding, but from a performance engineering perspective it is suggested that we do a bit of analysis based on queuing theory: the rate at which the application receives requests exceeds the rate at which it can serve them, given its environmental constraints. Response time degradation can happen not only when the number of users or volume of data exceeds some threshold, but also after the application has been up and running for a particular period, which can be as short as hours or as long as days or weeks. That is often an indication that some system resources are not released after transactions are completed, leaving too few resources to allocate to new transactions. This is called a resource leak and is usually due to programming/coding defects.

We say an application has performance issues after analyzing the transaction trend, and we can conclude the points below based on the data we have collected:

    • Transaction response time is getting longer as more users are actively
      working with the system.
    • Transaction response time is getting longer as more data is loaded
      into the application databases.
    • Transaction response time is getting longer over time, even for the same
      number of users or data volume, primarily due to leaking resources such as memory or connections.

Transaction response time matters a lot when your application is a web-based, public-facing site: the revenue of the company depends on how fast the application responds to the user's requests. If your application is internal, fast response times increase the productivity of the team and of the company overall.


Loading Microsoft IIS Logs into SQL Server 2012 with Log Parser

February 1, 2014 by kiranbadi1991 | Comments Off on Loading Microsoft IIS Logs into SQL Server 2012 with Log Parser | Filed in Database, Development, Performance Engineering, Performance Test Tools, Scripting, Web Server

If you ever think of building a customized tool for viewing and analyzing IIS logs, the first thing you will probably do is look for some way of loading the logs into a database, typically MS SQL Express or MySQL.

Parsing IIS logs and loading them into a database has its own challenges. We can always write custom code that reads each line of the log file and loads it into a database table, but that requires specialized programming skills: the program has to read the log file, strip the headers and other unnecessary data, and only then load the required fields into the database table. The program therefore needs to be aware of the complete format and structure of the log file so that it can handle any unexpected characters. It is quite a tedious and time-consuming task. Instead, I have loaded the IIS logs into an MS SQL Express database using PowerShell and Log Parser.

Log Parser is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory®. You tell Log Parser what information you need and how you want it processed. The results of your query can be custom-formatted in text-based output, or they can be persisted to more specialty targets like SQL, SYSLOG, or a chart.
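
To get a feel for the query syntax before loading anything into SQL Server, you can run a quick query straight against the IIS logs from the command line. A minimal sketch, assuming the default IIS log location and the IISW3C input format (adjust the path to your own log directory):

C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM C:\inetpub\logs\LogFiles\W3SVC1\u_ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C

This prints the ten most requested URLs to the console; the same SELECT syntax is what we will point at SQL Server below.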

So in this post, I will show you how to load IIS logs into an MS SQL Express database. The first thing you need to do is download Log Parser and install it on your local machine or server. Once you have installed the Log Parser tool, you need to know the location of your IIS log files and make sure you have the rights to access and read them. You also need to ensure that SQL Server is installed and that you have sufficient rights on the database; you will probably need rights to create tables and full rights on the database. If your database is located on another machine, then you need to ensure that the machine where Log Parser is installed can connect to the database and has access to the location of the IIS log files.

Once you have Log Parser installed and have access to SQL Server and to the IIS logs location, you can use the example queries below to load the IIS logs into the SQL Server database. (In the example below I have loaded multiple log files into the database.)

C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130327.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 75
Elements output:    75
Execution time:     0.07 seconds


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130318.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10
Task completed with parse errors.
Parse errors:
Cannot find '#Fields' directive in header of file "C:\Users\Kiran\Desktop\IIS
  Logs\u_ex130318.txt". Lines 1 to 285 have been ignored

Statistics:
-----------
Elements processed: 12968
Elements output:    12968
Execution time:     20.93 seconds


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130319.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 25219
Elements output:    25219
Execution time:     54.60 seconds


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130320.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 25922
Elements output:    25922
Execution time:     53.50 seconds


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130321.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 27395
Elements output:    27395
Execution time:     59.74 seconds


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130322.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 22772
Elements output:    22772
Execution time:     60.12 seconds (00:01:0.12)


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130323.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 17497
Elements output:    17497
Execution time:     42.62 seconds


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130325.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 670
Elements output:    670
Execution time:     1.35 seconds


C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130326.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

Statistics:
-----------
Elements processed: 1135
Elements output:    1135
Execution time:     1.60 seconds


C:\Program Files (x86)\Log Parser 2.2>

Let me explain the query below, which inserts the logs into the SQL Server database:
C:\Program Files (x86)\Log Parser 2.2>LogParser.exe "SELECT * INTO TBIIS76 FROM
C:\Users\Kiran\Desktop\IISLogs\u_ex130326.txt" -o:SQL -oConnString:"Driver=SQL S
erver;Server=KIRAN\SQLExpress; Database=IISLOGS;Trusted_Connection=yes" -createT
able:OFF -e:10

If you look closely at the above query, you can see that the connection string points to the local server KIRAN\SQLExpress, uses the SQL Server driver, and targets the database IISLOGS. Since I am using Windows authentication on my local machine, I set Trusted_Connection=yes. However, if you are connecting to a database over the network with SQL Server authentication, then you need to replace

 
Trusted_Connection=yes

with

-username:yourusername -password:yourpassword 
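
Put together, a command using SQL Server authentication would look something like the sketch below. The -server, -database, -username and -password parameters belong to the SQL output format, and yourusername/yourpassword are placeholders; confirm the exact parameter names with LogParser -h -o:SQL:

LogParser.exe "SELECT * INTO TBIIS76 FROM C:\Users\Kiran\Desktop\IISLogs\u_ex130326.txt" -o:SQL -server:KIRAN\SQLExpress -database:IISLOGS -username:yourusername -password:yourpassword -createTable:OFF -e:10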

I set the -createTable flag to OFF since I had already created the table TBIIS76 in the IISLOGS database. If you need Log Parser to create the table by itself, set -createTable:ON.
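
Rather than running one command per log file as in the transcript above, Log Parser also accepts wildcards in the FROM clause, so all the files can be loaded in a single run. A sketch, assuming the same paths and table as above and letting Log Parser create the table on the first run:

LogParser.exe "SELECT * INTO TBIIS76 FROM C:\Users\Kiran\Desktop\IISLogs\u_ex*.txt" -i:IISW3C -o:SQL -oConnString:"Driver=SQL Server;Server=KIRAN\SQLExpress;Database=IISLOGS;Trusted_Connection=yes" -createTable:ON -e:10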

The -e flag is a very important flag: it sets the maximum number of parse errors allowed (here 10) and gets Log Parser to report what went wrong if some records are not loaded into the database for any reason.

Below is an example of the parse errors reported for one of the log files. Without this flag, you will spend hours trying to figure out what went wrong if there are errors. If you look at the error below, you will also understand why it is painful to write custom code to parse log files: if a status field is missing, your program will probably die without giving you any indication.

Task completed with parse errors.
Parse errors:
Cannot find '#Fields' directive in header of file "C:\Users\Kiran\Desktop\IIS
  Logs\u_ex130318.txt". Lines 1 to 285 have been ignored

If the table does not exist and you want Log Parser to create it for you, it will create one. The table structure along with the column data types looks something like this:

[Screenshot: table structure and column data types created by Log Parser]

Once the data is loaded into SQL Server, it opens up a new world: you can query the results, build customized tools on top of them, or use SQL Server Management Studio to write custom queries and do whatever kind of analysis you want. This is especially helpful for analyzing production metrics and doing capacity planning.

Also, please note that you can load other sources such as event logs, registry data, and HTTPERR logs into the database using the Log Parser tool; see the sketch below. Using the above approach, you should be able to do all kinds of performance analysis on logs.
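
For example, here is a minimal sketch of loading the Windows System event log into its own table using the EVT input format; the table name SystemEvents is just an example:

LogParser.exe "SELECT * INTO SystemEvents FROM System" -i:EVT -o:SQL -oConnString:"Driver=SQL Server;Server=KIRAN\SQLExpress;Database=IISLOGS;Trusted_Connection=yes" -createTable:ON -e:10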

In the next post I will show you how to load the IIS logs into SQL Server at runtime using the system ODBC driver.
