Wednesday, December 6, 2006

Tips for website performance

Over the last year I have found myself constantly discovering new optimization tips for my sites. Every time I think the only way to get more performance / speed for my users is to make
big architectural changes - I seem to find another small strategy that makes my site more responsive. Below is my list of tips - which I will expand as I find more :)

I post it here in the hope that it will help you make your sites better, faster and less bandwidth intensive. Most of these strategies I tested myself on ASP.NET 1.1 - but I have started testing them on ASP.NET 2.0 as well.

Summary of strategies:

1. Server global config - threading settings
2. http compression - 2 options
3. w3compiler
4. dbspeed
5. image pre-caching - evil multiple images
6. image/js cache-control
7. image size
8. ajax async
9. debug params in web.config
10. in proc
11. cache
12. viewstate on server
13. componentart js / infragistics w3compiler
14. fiddler
15. check for broken links
16. https / https login
17. IIS logfiles
18. app pool + fresh
19. .net 2
20. out of process execution
21. js files for javascript
22. dynamic caching of elements / frames
23. browsers
24. ajax - json or xslt instead of html
25. upper and lowercase references
26. flush
27. cookies / http headers + url rewriter
28. compatibility js
29. Server.Transfer("default.aspx"); instead of response.redirect
30. Use IHTTPHandler instead of aspx pages for ajax calls
31. New SQL caching models in .net 2 and SQL 2005
32. Web Application Deployment
33. Batch Compilation settings
34. Precompile


1. Server global config - threading settings

It seems that when you install the .NET runtime - even on a server operating
system like Windows 2003 - the default settings favour a single-CPU
desktop machine. In scenarios where you have control over the webserver where your site / web application is hosted, you should always go and customize these settings.

This tip was best explained in a great CodeProject article.


The recommended settings, as far as I recall them from the article, were:

maxconnection = 12 * #CPUs

minFreeThreads = 88 * #CPUs

minLocalRequestFreeThreads = 76 * #CPUs

It is worth noting that ASP.NET 2.0 has an autoConfig setting for the processModel - although
I have not tested this myself:

autoConfig = "true"
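For reference, a sketch of roughly what those settings look like in config - the numbers below assume a 4-CPU box and are only illustrative, so check them against the article for your own hardware before copying:

```xml
<!-- machine.config sketch (values assume 4 CPUs; adjust per the formulas above) -->
<system.net>
  <connectionManagement>
    <!-- maxconnection = 12 * #CPUs -->
    <add address="*" maxconnection="48" />
  </connectionManagement>
</system.net>
<system.web>
  <processModel maxWorkerThreads="100" maxIoThreads="100" />
  <!-- minFreeThreads = 88 * #CPUs, minLocalRequestFreeThreads = 76 * #CPUs -->
  <httpRuntime minFreeThreads="352" minLocalRequestFreeThreads="304" />
</system.web>
```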

2. http compression - 2 options

It should be criminal not to enable http compression on your webserver. There are very
few reasons not to do it. Still, I regularly browse past websites where compression is not
enabled. On the web, size is king. The smaller you can get the data coming back, the
more responsive your site will appear.

In the world of IIS the 2 options I know of are:

2.1. Use IIS 6 built in compression

IIS 6 has fast, useable compression built in already - you just have to enable it.
I use a utility from Port80 Software called ZipEnable, which makes it easy to configure
IIS compression with a couple of mouse-clicks.

If you do not want to use ZipEnable, you can enable compression manually by:

1. Enable live metabase editing.
2. Enable http compression on the site root in the IIS configuration.
3. Edit the metabase and enable static and dynamic compression on the correct file types -
of course remembering to set dynamic compression on your .aspx files.
4. Add a parameter to the metabase to enable compression through proxy servers.
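For illustration, the metabase steps above roughly translate into adsutil.vbs commands like the following (run from the AdminScripts folder). Treat the property names here as a sketch and verify them against the IIS 6 metabase reference before using them:

```
cscript adsutil.vbs set W3SVC/Filters/Compression/Parameters/HcDoStaticCompression TRUE
cscript adsutil.vbs set W3SVC/Filters/Compression/Parameters/HcDoDynamicCompression TRUE
cscript adsutil.vbs set W3SVC/Filters/Compression/gzip/HcScriptFileExtensions "asp" "aspx"
cscript adsutil.vbs set W3SVC/Filters/Compression/gzip/HcNoCompressionForProxies FALSE
```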

2.2. Use Isapi Filter or httphandler to compress

If you are using IIS 5, want to do some custom handling of the compression or just want
a more manageable solution you can try:

HTTPZip from Port80, or blowery HttpCompress.

3. w3compiler

"W3Compiler got me 16%" :) Because the javascript, html and css in your pages are not compiled,
any comments, line ends, long variable names etc. add to the size of your pages. Again - size
is king. Browsers do not need nicely formatted html, or javascript with your 5 pages of comments in it.

I love the tool w3compiler. It removes unnecessary whitespace and comments just before I deploy my site, and also supports advanced stuff like renaming local js variables, file remapping and extension stripping. If you run a site that only supports IE, you can also
have it optimize size a little more.

I have also seen a utility published on CodeProject which does the whitespace
removal for you, but that was all it seemed to support.

4. dbspeed

It goes without saying that if you run a database-centric web app, most of your performance
gains will come from optimizing the database. Keep the queries that happen in postbacks minimal. Try to cache data which doesn't change. Implement "last changed" fields in your tables so you only query data again when necessary.

5. image pre-caching - evil multiple images

Images are evil. Period. Why? Because every image on your page constitutes another possible
callback to the server. I have a page that shows, say, 200 items in a grid. The page itself is
30 KB. Every row has 1 image in it, and it's always exactly the same image - a 200-byte gif. The page used to take about 20 or 30 seconds to load, on my dev machine where the webserver is local, and in those 20 or 30 seconds the cpu also spiked to 100%. The reason? Images. That page was making 201 calls to the webserver.

The moral of the story is that I always make sure I cache images on my pages as soon as possible. When my user logs in - so as soon as I know he may get to the innards of my web app - I load up all the images in javascript. This forces the browser to establish early on that
no images have changed on the server, and to cache any image it didn't have cached yet.

My page now loads in 2 seconds.

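A minimal sketch of such a pre-loader in javascript - the function names and urls here are illustrative, not from my actual app:

```javascript
// Deduplicate a list of urls. Split out as its own function so the logic
// is easy to test. Note: case sensitive on purpose - see tip 25.
function uniqueUrls(urls) {
  var seen = {};
  var out = [];
  for (var i = 0; i < urls.length; i++) {
    if (!seen[urls[i]]) {
      seen[urls[i]] = true;
      out.push(urls[i]);
    }
  }
  return out;
}

// Preload a list of image urls so the browser caches them up front.
function preloadImages(urls) {
  var list = uniqueUrls(urls);
  for (var i = 0; i < list.length; i++) {
    var img = new Image(); // browser only: fires a cacheable GET for the url
    img.src = list[i];
  }
}
```

Call it once after login, e.g. preloadImages(["images/row.gif", "images/logo.gif"]).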

6. image/js cache-control

There is a seldom-used http header called Cache-Control. IE adds specific cache-control
directives I have not seen in other browsers, which dramatically increase the performance
of your website by forcing the browser to cache certain items in a cool way.

Even though your browser has cached a certain image, it still sometimes sends
a request to the server to check whether it has changed.

Read the article on IE's post-check and pre-check cache-control extensions.

This changed my life :)

I set the header
Cache-Control: post-check=3600,pre-check=43200

statically in IIS on my folders of javascript, xslt, css and images - dramatically
improving the user experience of my sites.

You can also use the CacheRight product to handle some of this for you.

7. image size

This one goes in conjunction with item 5. You can always tell an unoptimized website by its
images. It makes a big difference whether that 1 image you use everywhere on your site is 10 KB or
30 bytes.

7.1. Choose the right file format. Compare the sizes of gif, jpg and png - depending on whether it is a photo, an icon etc.

7.2. Remove image comments. Image editors have the irritating habit of signing images with a comment string - which adds size to the image. Examples are "Made with Gimp" or "Edited with Irfanview" etc. Make sure you cull this waste of space after editing an image. I usually
edit with Gimp and explicitly go and remove the comment.

7.3. Remove extra colours. If you are displaying a nice photo of a lady in lingerie - I'm sure you
want to use as many millions of colours as you can. But if you have a 32x32 icon somewhere on the page, it might be hard to see the difference between 256 colours and 16 million colours. Obviously 256 colours make for way smaller images. This is one of the reasons I tend to like gifs for small images.

7.4. If you have 5 images that always display next to each other, rather combine them into 1 image. Less is more and makes for better compression.

8. ajax async

Although using less bandwidth with your site is a consideration, the biggest reason for
all these tips of course is user perception. Using ajax calls in places on your page can greatly
improve the perception of speed.

Something I find now and again in beginners' httprequest pages is that developers use
synchronous httprequests instead of async ones. This is catastrophic for your page - the browser blocks until the call returns.

Always make sure that the last parameter to your open method is true, and that you set
an async callback method:

var xmlRequest = window.XMLHttpRequest
    ? new XMLHttpRequest()
    : new ActiveXObject("Microsoft.XMLHTTP"); // older IE"POST", pageUrl, true); // true = asynchronous
xmlRequest.setRequestHeader("Content-Type", "text/xml");
xmlRequest.setRequestHeader("Accept", "text/xml");
xmlRequest.onreadystatechange = handleData; // called as the response arrives
xmlRequest.send(requestBody); // requestBody = your post data
9. debug params in web.config

Make sure you remove the debug parameters from your web.config files on the production server.
If debug is left on, the initial compile speed of your pages especially is affected.


In ASP.NET 2.0 you can make sure of this for all web apps in production by setting

deployment retail="true"

in your machine.config.
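In config terms that means something like the following (both elements live under system.web):

```xml
<!-- web.config on the production server -->
<compilation debug="false" />

<!-- machine.config, .NET 2.0 only: forces retail behaviour for every app on the box -->
<deployment retail="true" />
```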

10. in proc

Use the fastest Session management setting you can.

In a scenario where server-side processing doesn't hog the cpu too much, and you
do not run a webfarm or a failover session-sharing server, InProc is the best.

If you do a lot of server-side processing and see your cpu take a hit - and you're
running on a multi-cpu machine - you shouldn't run InProc, as InProc forces requests onto
single-cpu affinity.

Using an external in-memory session server is usually more scalable than InProc, and
faster than the database option - but the database option is the most persistent.
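For reference, the session mode is set in web.config - the connection strings below are the usual illustrative defaults, not values from my setup:

```xml
<!-- fastest: state lives in the worker process -->
<sessionState mode="InProc" />

<!-- external state server:
<sessionState mode="StateServer"
              stateConnectionString="tcpip=127.0.0.1:42424" />
-->

<!-- database backed, most persistent:
<sessionState mode="SQLServer"
              sqlConnectionString="data source=127.0.0.1;Integrated Security=SSPI" />
-->
```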

11. cache

The cache provides a very convenient mechanism for caching data so that
you do not always have to go and query it from the database. Make sure to
set a realistic timeout.

12. viewstate on server

Viewstate often seems to do more harm than good, because of the overhead
it adds to the size of your pages. The usual speed tip is to use viewstate only where
you really need it, and for the rest just turn it off. Viewstate was put in to make a webpage act
more like a Windows Forms application.

Viewstate is not used on the client side at all! It only seems to be sent over for
some obscure security reason. One of the best tips I have ever gotten is the following:
It shows a way to replace the huge blob of viewstate with a single id
that is sent to the client; when the id comes back, the state is looked up from the cache
again. That is, minimal changes to your pages are needed - and it enables you to use
as much viewstate information as you want without making your pages bigger. The only
things to look out for are: 1. you will use more memory on the server, 2. you cannot debug
viewstate on the client side.

Definitely go and put the following in:

protected override void SavePageStateToPersistenceMedium(object viewState)
{
    // Generate a unique key for this viewstate
    string str = "VIEWSTATE#" + Request.UserHostAddress
        + "#" + DateTime.Now.Ticks.ToString();
    // Save the viewstate data in the cache
    Cache.Add(str, viewState, null, DateTime.Now.AddMinutes(Session.Timeout),
        TimeSpan.Zero, CacheItemPriority.Default, null);
    RegisterHiddenField("__VIEWSTATE_KEY", str);
    // Keep the __VIEWSTATE hidden field, but empty, to avoid errors
    RegisterHiddenField("__VIEWSTATE", "");
}

protected override object LoadPageStateFromPersistenceMedium()
{
    // Load the viewstate back from the cache
    string str = Request.Form["__VIEWSTATE_KEY"];
    // Check the validity of the viewstate key
    if (!str.StartsWith("VIEWSTATE#"))
    {
        throw new Exception("Invalid viewstate key: " + str);
    }
    return Cache[str];
}

Following on from this - if you keep the viewstate on the server side and send only a random id to the client, you do not have to worry about viewstate tampering. So setting

enableViewStateMac = "false"

will gain you some more performance.

13. componentart js / infragistics w3compiler

This one should really be termed "go and look at what the third-party components you use
are doing". A good tip for ComponentArt is to include their javascript in your web project,
so that they don't render it dynamically as part of your page.

A good tip for Infragistics web components is to go and set caching headers on
their javascript folders.

14. fiddler

Every webdeveloper needs a good piece of software to inspect the http traffic the browser does against your website. Fiddler is a great one.

15. check for broken links

One aspect of IIS I've noticed is that the error page for a requested resource that
is missing on the server is oftentimes bigger than the intended resource.

For example - if you have a 30-byte gif you used to use in a lot of places, which doesn't exist
anymore - it might seem to affect nothing, but it could slow you down, as all the "not found" pages
coming back in the background are bigger than the image would have been.

For this reason, run through your pages with Fiddler every now and again, and make sure that any
css, js or image links that are not found are either corrected, or resolved to a 0-byte file.

16. https / https login

If you are in a secure environment, or your data is non-critical, do not use https. Https adds overhead and sacrifices a bit
of speed. Also, if it is only the login you need to secure, redirect back to http after the login.

17. IIS logfiles

IIS has the ability to log very verbose standard logfiles, and these logfiles contain very useful information.
Analyzing them will show you which pages/requests your visitors hit the most - enabling you
to optimize the relevant pages in your website. Another thing you might find useful is seeing how often a specific
IP requests a resource, showing you how well that browser is coping with caching the resources on your pages. In a controlled
environment you could follow up on that.

Obviously you should only log the relevant information, and switch off logging on folders in your website where you do not
need it, or which are highly utilized.

18. app pool + fresh

Assign your web application to its own application pool, and customize the settings for the pool. An application pool
allows you to recycle the worker process the application runs under in special circumstances, like high cpu or memory usage. It also
allows you to set some parameters around idle timeout. Use these settings.

19. .net 2

Without doing any special optimization, just converting my .NET 1.1 project over to .NET 2.0 / ASP.NET 2.0 seemed to gain me some performance. Microsoft must have done some work
on the core engine. If you cannot fully move over yet - try running your .NET 1.1 app unchanged
on ASP.NET 2.0.

20. out of process execution

Again - getting back to the user as soon as possible is king. Therefore, if there is a non-critical
piece of execution you need to do after a user action, consider moving it out of
process. One way to do this is to write a service or console app listening on an msmq. You
can then send commands/events to it transactionally, locally on the webserver - and return
to the user immediately. The service can perform whatever long task you want, as no user is waiting and events are queued up.

21. js files for javascript

This one seems kinda obvious, but I have seen it a million times. Do move almost all
javascript out to a separate .js file instead of embedding it in html/aspx.

This is important for:

1. Better caching - the browser can cache your js file for a long time, while your dynamic aspx output changes with every request.
2. Easier debugging and deployment
3. Reuse of code.
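For example, instead of a big inline script block in every aspx page, reference one shared file (the path here is illustrative):

```html
<script type="text/javascript" src="scripts/common.js"></script>
```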

22. dynamic caching of elements / frames

A strategy I have used in slow pages before - especially where the user can navigate around a lot
is to cache certain sections or pieces of data on the page in a "global" javascript variable.

The general approach is as follows:

- on page load, in javascript:
- if we have the data or the piece of dom in memory:
- look up the applicable place to insert it in the dom or page
- render the output into innerHTML or the dom element
- kick off a background httprequest to get the newest data or element
- if we do not: show "loading" and kick off the background httprequest immediately
- when the httprequest returns: replace the old or missing data or element with the newest, and save the
element html or data in the global javascript variable.
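The steps above can be sketched as follows - fetchFn stands in for an async httprequest, and all names here are illustrative rather than a real API:

```javascript
// "Global" cache of page sections, keyed by section name.
var elementCache = {};

// Show a section immediately from cache (or "loading..."), then always
// refresh it in the background and update the cache with the fresh copy.
function showSection(key, render, fetchFn) {
  if (elementCache[key]) {
    render(elementCache[key]); // instant paint from the cached copy
  } else {
    render("loading..."); // nothing cached yet
  }
  fetchFn(key, function (freshHtml) {
    elementCache[key] = freshHtml; // remember for next navigation
    render(freshHtml);
  });
}
```

In a real page, render would assign to an element's innerHTML and fetchFn would wrap an XMLHttpRequest.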

23. browsers

As a first point - obviously only support the browsers you need to.
Secondly - if you are in the position to dictate which browser to use - for example
if the user doesn't know the difference and its an intranet app - dictate the browser
with best performance for your app.

Safari seems to have a very aggressive caching policy - which makes it appear lightning fast
to the user.

Opera and Firefox also do well with caching.

IE 6 doesn't do well with caching, but general pageloads are very good.

IE 7 I have not tested intensively, but I would guess it gives better caching performance.

24. ajax - json or xslt instead of html

A common use of httprequests is to do a request in the background that replaces a traditional
browser request. The data returned is then usually either html, or xhtml to make it more xml-ish.

Because you have more control over rendering the returned data than on a regular request -
and your users don't have super slow machines - rather do more work on the client side than
waste bandwidth on html.

Options here include things like

- Send XML and parse it on the client side, or process it on the client side with XSLT into presentation or html.

- I like the XSLT approach as I can easily change how things are rendered outside of code,
but I know it's a contentious issue. Just a note on compatibility: you can do XSLT on IE 6 and 7, Firefox 1.5 and 2, Opera and Safari. It is no longer only IE's domain.

- Roll your own format - for example just send text with comma-separated values, then split
it and generate the html on the client side.

- Use a standard like JSON, which gives you a more standard way to
transfer data across for javascript.
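A hedged sketch of the "roll your own format" option - the server returns plain comma-separated rows, and the client splits them and builds the html itself (function name and format are my own illustration):

```javascript
// Turn a payload of newline-separated rows of comma-separated values
// into an html table string, ready to assign to someElement.innerHTML.
function rowsToHtml(payload) {
  var html = "<table>";
  var rows = payload.split("\n");
  for (var i = 0; i < rows.length; i++) {
    var cells = rows[i].split(",");
    html += "<tr>";
    for (var j = 0; j < cells.length; j++) {
      html += "<td>" + cells[j] + "</td>";
    }
    html += "</tr>";
  }
  return html + "</table>";
}
```

Note that a real version should html-escape the cell values before inserting them.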

25. upper and lowercase references

Let's say you have a common stylesheet for your pages, called mysheet.css, and you reference
it in every page you have. Using a common one makes sense, as it would be cached
for all your pages, right?

Of course - BUT - check with Fiddler or similar how many times the browser actually tries
to download the same file. I have found that if page 1 references "styles/mysheet.css",
page 2 references "Styles/mysheet.css" and page 3 references "styles/Mysheet.css" - in some browsers the same file will be downloaded 3 times!

This applies to .js files, css and images!
Make sure to go through your app after dev with a fine-tooth comb.

26. flush

Again - the faster you get back to the user/browser, the better. When using
with codebehind files, you will see that output is only sent to the browser:

1. After the rendering events
2. After a certain buffer of data has filled up

This means that especially with requests that return big pieces of data, or pages
where you custom generate html - it can be a while before the user gets any feedback.

You can try to get around this by flushing the response buffers to the browser
at certain places in your code. For example, you can output the header of the page - then flush, then go on with the big return of data.

Unfortunately, like I said, it is harder to do with regular page output because of the page lifecycle in - but doable if you play around a bit.

27. cookies / http headers + url rewriter

You can use cookies to store some information locally on the user's machine. Just make sure to inspect the amount of data this sends back to the server on every request.

Also investigate which headers are set in IIS on your live server. For image folders, for
example, there is no reason on earth to set X-Powered-By: ASP.NET. I mean, it's
great marketing for Microsoft and all - but do I really have to waste bandwidth on an extra +-20 characters in every response, for every image I have?

I have also found headers like "Microsoft Office" from the FrontPage extensions etc. that
make no sense sitting on my application.

Port80 has nice utilities for removing headers on the fly, and also for rewriting urls - where you could rewrite all requests for .aspx to extensionless requests etc.

28. compatibility js

Making your javascript compatible with multiple browsers can bloat its size considerably.
A classic approach to compatibility is big "if browser x else browser y" statements.
You can do better on size and code readability by extending javascript a bit via prototypes.

Check out prototype.js, or the codeproject articles on the subject.
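As a small illustration of the prototype approach (this particular helper is my own example, not something from prototype.js):

```javascript
// Instead of "if (isIE) {...} else {...}" blocks scattered around your code,
// extend the built-in String object once - and only when the browser
// doesn't already supply the method.
if (!String.prototype.trim) {
  String.prototype.trim = function () {
    return this.replace(/^\s+|\s+$/g, "");
  };
}
```

Every piece of code can then just call someString.trim() regardless of the browser.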

29. Server.Transfer("default.aspx"); instead of response.redirect

I have had limited success with this - but have used it in places:

A common way to move the user to another page is response.redirect. It
sends an http status code to the browser which causes the browser to request the new page or location -
also changing the browser's url field if applicable.

In some cases you can change over to Server.Transfer instead - which causes the
server to switch the request over to another page or location, and return the result to the browser in the same response. This means you save a lot of to-and-fro between the
server and client - BUT the client doesn't necessarily know that it has been redirected: the url is wrong, and the back button and bookmarking don't work as expected.

It can be very useful though when there are a couple of pages to run through.

30. Use IHTTPHandler instead of aspx pages for ajax calls

A tip I see on blogs and msdn a lot these days is to use the right handler for the right thing. For
example, when you do an httprequest call back to the server to an aspx file, it's kind of a waste of performance, as you don't usually need all the events and objects of a regular page lifecycle. For some requests back to the server it makes more sense to implement an IHttpHandler for that specific request, which has way less overhead than an aspx page.

31. New SQL caching models in .net 2 and SQL 2005

I have started investigating the exciting new caching models for sql data you can use in ASP.NET 2.0
and sql 2005, which can greatly improve response times for data. For example, SQL 2005 apparently can notify you when certain data has changed, instead of you always having to poll all the data, or poll some datetime fields to find out whether anything has changed.


32. Web Application Deployment

In ASP.NET 2.0 you now have several options for compiling your app before deployment. For the best performance you can even roll up most of the aspx content into a dll.

You will have to decide where the tradeoffs lie between performance and maintainability though.

33. Batch Compilation Settings

In your dev environment, try setting batch="false" in your compilation settings. If it slows general working and building down in Visual Studio, remove it again.

The setting means that .net will not build all the code from the aspx files in a directory when 1 of them changes, but on demand for individual aspx files. You want batch="true" - the default - on your production server, as you want as many pages compiled as soon as possible (if not precompiling in .net 2) - but when developing a large site, you do not want to wait for all the pages to compile every time you debug, if you're just going to look at 1.
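So on a dev machine the web.config compilation element would look something like:

```xml
<!-- dev only: compile aspx files on demand, one at a time -->
<compilation debug="true" batch="false" />
```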

34. Precompile

For ASP.NET 2.0: when you hit your pages after a rollout, batch compilation needs to take place for every directory. This can give a bad impression to users, and leads to you having to click around the application after every upgrade to ensure compilation. In ASP.NET 2.0 you can force compilation from the command line.


Do use aspnet_compiler after every upgrade.
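A hypothetical example - the tool lives in the framework directory, and /MyApp stands in for the IIS virtual path of your own application:

```
%windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_compiler.exe -v /MyApp
```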

