
Archives: Browser

Understanding Various Types of Authentication in IIS

April 9, 2013 by kiranbadi1991 | Comments Off on Understanding Various Types of Authentication in IIS | Filed in Browser, Development, Environment, Web Server

A few days back I was part of a debate with a few ASP.NET developers where we discussed the pros and cons of various web servers available in the market, the types of authentication methods they support, and why public-facing sites so often use form-based authentication rather than any other type. I have been a fan of IIS since its 5.x days and have learned quite a bit about it from IIS product team members. (I am also a huge fan of the Apache web server; the only thing I dislike is that they stopped providing binaries for Windows users, and it is a pain to build it from source unless we are on a *nix platform.)

I describe authentication as the step where users supply their credentials in order to identify themselves to the server; based on the roles configured in some system, the server then determines their authorization, that is, what operations those users are entitled to perform in the application. Authorization always happens after authentication, usually almost immediately afterwards.

The latest versions of IIS provide the following ways of doing authentication:

  • Anonymous Authentication: In this type the end user does not supply credentials, effectively making an anonymous request. IIS 7 impersonates a fixed user account when attempting to process the request. This type of authentication is mostly used for public-facing web sites where visitors are not required to supply credentials. Users can access the site freely and browsers will not prompt them with any kind of challenge. It is enabled by default in IIS 7.0. One of the best ways to prevent anonymous users from accessing resources that require credentials is to create a separate group and assign appropriate permission levels to it. Also, execute permissions on Windows directories should be denied to anonymous users.
  • Basic Authentication: In this type the end user is prompted to supply credentials, which are then transmitted unencrypted, in Base64 encoding, across the network. Basic authentication is supported by all major browsers. This type of authentication should be used only when the traffic flows entirely over SSL, so that the data flowing on the wire is encrypted. Anonymous authentication should be disabled if we want to use Basic authentication (a configuration sketch for this appears after this list).
  • Digest Authentication: Here the end user is prompted to supply credentials; however, unlike Basic authentication, the user's password is not passed in clear text across the wire but is hashed using MD5. It is mostly used along with a Windows domain controller. Browsers need to be HTTP 1.1 compliant in order to use this type of authentication. In addition, Anonymous authentication should be disabled. Quite a few intranet applications in large companies use this type of authentication internally.
  • Integrated Windows Authentication: It contains two separate authentication schemes: NTLM v2 (NT Challenge/Response) and Kerberos. Enabling Integrated Windows authentication using IIS Manager enables support for both of these schemes. NTLM works similarly to Digest authentication (it hashes the user's password). Kerberos relies on shared secrets between the client, the Active Directory domain controller, and the IIS server to authenticate the user. Kerberos is only available for Windows Active Directory accounts, whereas NTLM can be used for local accounts as well. IIS 7.0 does not present Kerberos as a discrete authentication option to the client; instead it sends a "Negotiate" option, allowing the client to choose Kerberos or NTLM. NTLM can be presented as a discrete authentication option to the client. Microsoft recommends this type of authentication for intranet applications, since client and server share the same domain; it is generally not suitable for internet-facing applications, where clients are rarely members of the same domain.
  • Client Certificate Authentication: When using this type of authentication, the client presents a certificate to the server. The server is configured to map certificates to one or more Windows user accounts (it is possible to map multiple certificates to a single user account or to map each certificate to an individual user account). IIS logs on the mapped user account. Client certificate authentication requires that SSL/TLS be enabled for the resource being secured. Mapping client certificates lets you automatically authenticate users who log on with client certificates, without requiring other supported authentication methods such as Basic, Digest, or Integrated Windows authentication.
  • UNC Authentication: When the server needs to retrieve files from a remote network resource, e.g. a file share, a virtual directory in IIS can be mapped to a UNC path. When configuring this virtual directory, it is possible to specify a fixed user account that will be used to connect to that file share, irrespective of the identity of the end user.
  • Form Based Authentication: It relies on the supply of credentials via an HTML form as part of the HTTP traffic. In this model, the request for the login form itself is an anonymous request. After the user authenticates via the HTML form, an authentication cookie is set by the server. The client must return this cookie with each subsequent request in order for the request to be authenticated. Although this authentication can be configured using IIS Manager, it is effectively ASP.NET's Forms authentication. Forms-based authentication can be combined with either ASP.NET's authorization features (available with previous versions of ASP.NET) or IIS 7.0's new built-in URL Authorization feature to protect access to resources.
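
As a rough illustration of how these schemes are switched on and off, here is a minimal web.config sketch for a site that disables Anonymous authentication and enables Basic authentication. The element names follow the IIS 7 system.webServer schema, but the values are only placeholders; note that these sections are typically locked at the server level, so you may need to unlock them in applicationHost.config or use IIS Manager instead:

    <configuration>
      <system.webServer>
        <security>
          <authentication>
            <!-- Disable anonymous access so every request must carry credentials -->
            <anonymousAuthentication enabled="false" />
            <!-- Basic authentication: use only when the whole site runs over SSL/TLS -->
            <basicAuthentication enabled="true" />
            <!-- Leave the other schemes off in this sketch -->
            <windowsAuthentication enabled="false" />
            <digestAuthentication enabled="false" />
          </authentication>
        </security>
      </system.webServer>
    </configuration>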

All these types of authentication can be configured at the directory level, website level, or file level. Maybe sometime later I will write some posts on how to configure these types of authentication in IIS.
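
For example, protecting a single directory is commonly done with a location element; the path below is just a hypothetical sub-directory used for illustration:

    <configuration>
      <!-- Applies only to the /reports sub-directory of the site -->
      <location path="reports">
        <system.webServer>
          <security>
            <authentication>
              <anonymousAuthentication enabled="false" />
              <windowsAuthentication enabled="true" />
            </authentication>
          </security>
        </system.webServer>
      </location>
    </configuration>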


Checklist for Troubleshooting Web Application in Internet

March 6, 2013 by kiranbadi1991 | Comments Off on Checklist for Troubleshooting Web Application in Internet | Filed in Browser, Development, Environment, Quality, Small Businesses

One of the hard parts of troubleshooting production issues is identifying the root cause. When the application is deployed on the open internet, accessed by a variety of browsers, and separated from its users by a large number of network hops, this becomes quite a challenging task. So in this post I will walk you through the list of things that need to be checked, at a high level, on the client side in order to identify the root cause of an issue before proceeding to review the code base.

Browser Settings

  • What is the proxy setting of the browser? Does it access the application via a proxy, or does it connect directly to the internet?
  • How is the browser configured? Does the user have administrative rights or only basic rights over the browser settings?
  • Is the browser configured to show user-friendly messages? If yes, then can we reproduce the issue by removing those user-friendly messages and displaying the actual message the application is throwing? (Please note that if your application does not do proper error handling, there is a risk of displaying nasty code to the users, e.g. the famous yellow screens of .NET.)
  • Is the browser configured to run in standards mode or compatibility mode? This point applies to IE. (Ajax and UI issues are most often related to compatibility mode.)
  • In case the issue is related to certificates, checking whether the relevant certificate is present in the trusted store often helps.
  • If your application uses popup windows to display some information, then checking whether any add-on or browser setting is blocking popups also helps. Some browser add-ons silently block popups without giving any information to the users.
  • Are the browser settings at their defaults, i.e. the factory settings? One of the easiest ways to troubleshoot issues is to reset the browser to its default settings.

Client Computer Settings

  • Is the client computer behind a firewall? If yes, then verifying that it is correctly configured saves a lot of time.
  • Checking the antivirus software installed on the client machine also helps. In cases where your application uses special characters, there is a chance that a badly configured antivirus might filter out or block the incoming responses.
  • Check the hardware and software configuration of the user's machine. If your application does a lot of heavy lifting on the client side, it often helps to educate users that a minimum configuration needs to be met.

Network Infrastructure and Configuration

  • Is the network correctly configured? Using the ping command in both directions often helps to identify network issues (see the command sketch after this list).
  • Is the client able to resolve the application host name correctly?
  • Checking how many hops the user needs to make to connect to the server also helps in understanding the user experience. A general rule of thumb I often use: the more hops between the user and the server, the higher the response time.
  • Is there any load balancer or firewall between the server and the client? If yes, then checking whether they are correctly configured also helps.
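
As a rough sketch of the checks above, assuming a Windows client and a hypothetical host name app.example.com (a placeholder), these are the kinds of commands I mean:

    ping app.example.com        (basic reachability and round-trip time, run from both sides where possible)
    nslookup app.example.com    (does the client resolve the application host name correctly?)
    tracert app.example.com     (how many hops sit between the client and the server?)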

User Access or Login Issues

  • Is the user supplying the right credentials? If yes, then checking whether the server is validating those credentials correctly also helps.
  • Is the identity and access validation done by the application or by a third-party component like SiteMinder? If by a third party, then checking that component in isolation often helps.
  • Does the user have the appropriate access level for the resource? If yes, then further troubleshooting is required; if not, further debugging is a waste of time.
  • If the client is a browser, then disabling the friendly error message setting in the browser can reduce the time to identify the issue by almost 50%, since no extra debugging tools are required unless the case is one of missing or stolen headers.


Javascript URL’s and Caching

January 1, 2013 by kiranbadi1991 | 2 Comments | Filed in Browser, Development, LoadRunner, Performance Engineering, SilkPerformer

In this post, I wanted to share some tips and also clear up some misunderstandings I have seen among many performance engineers with regard to JavaScript URLs, which we often see during load testing of rich client or web-based applications that make heavy use of JavaScript for rendering and manipulating the user interface.

I am sure many of us have seen URLs like the one below during our load testing efforts:

/mysite/js/myjsfile.js?1357011161255

The above URL contains a Unix-style timestamp (in milliseconds) appended at the end. There are quite a few reasons why we append a timestamp to a JavaScript URL, the foremost being that we do not want the JavaScript to be cached. Since these scripts often interact with the DOM and build DOM elements on the fly based on user interaction, in certain situations it is a good thing that we do not cache these types of scripts. If the browser caches them, there is a risk of seeing browser quirks in the UI.
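
As a minimal sketch of how such a URL is typically produced on the client (the script path is just the one from the example above), one can append the current time when injecting the script tag:

    // Load a script with a cache-busting timestamp appended to its URL.
    // Date.now() yields the same millisecond-style value seen in the URL above.
    var script = document.createElement('script');
    script.src = '/mysite/js/myjsfile.js?' + Date.now();
    document.getElementsByTagName('head')[0].appendChild(script);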

Secondly, most of the JavaScript libraries that provide and implement XMLHttpRequest use this timestamp feature implicitly. I know the Dojo library and jQuery use these timestamps extensively while making Ajax calls so as to prevent caching in the user's browser.
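
For instance, here is a sketch of how the two libraries expose this; the URL is a placeholder, and the option names are the ones I believe jQuery and Dojo 1.x use for this purpose:

    // jQuery: cache:false makes jQuery append a "_=<timestamp>" parameter to the request URL.
    $.ajax({
        url: '/mysite/data/items',
        cache: false,
        success: function (data) { /* update the UI with the response */ }
    });

    // Dojo (1.x style): preventCache:true appends a similar timestamp parameter.
    dojo.xhrGet({
        url: '/mysite/data/items',
        preventCache: true,
        load: function (data) { /* update the UI with the response */ }
    });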

One can easily implement these timestamps in load testing scripts. I know both SilkPerformer and LoadRunner have built-in functions that support creating and replacing Unix-style timestamps. I suggest that wherever you see these types of URLs, you use those functions rather than commenting out those URLs. If you are commenting out those URLs, I feel you are building the scripts incorrectly and downloading at least 15% fewer bytes compared to the size of the entire page.

Also, when you comment out these types of URLs, you are making fewer calls and thereby reducing the load on the servers. We need to keep in mind that real users never comment out anything, and wherever these types of URLs are present, the browser will always be forced to make the call to fetch those JavaScript files. There might be some performance impact on the application, but again it is a choice between providing functionality and achieving performance.

However, if you are developing a rich client web-based application, then I suggest you append a timestamp to your JavaScript files in at least your development environment. It saves a lot of time when debugging various client-side issues and relieves you from clearing your cache every time you compile and build your code base. We need to keep in mind that browsers almost always cache JavaScript and CSS files, and the updated files are not available to the application unless we clear the browser history and close and reopen all browser windows. That is quite a painful process, especially when we are implementing multi-page functionality.
