High Performance Web Sites

Steve Souders


Presents a collection of fourteen "rules" to help optimize the performance of a Web site.


Mentioned in questions and answers.

In all the documentation I see the jQuery script tag beneath <title> in the head, but then when I go into some other sites (the initializr template is the first off the top of my head), they drop it into the bottom of the body (you know, right before </body>).

Which of these two is right?

Steve Souders' book High Performance Web Sites recommends putting scripts at the bottom, before the </body> tag. You can also find this documented at the developer.yahoo.com site here. (It's the fourth recommendation in the list.)

I generally put the scripts at the bottom, just before the </body> tag, and have had few issues with this. I've noticed that this helps performance on older browsers like IE7 much more than on newer ones like Firefox 3.6, Chrome and IE9. The older browsers only support 2 parallel downloads per host, whereas the newer ones support something like 6 connections, so the blocking isn't as noticeable.
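A minimal sketch of that layout (file names are made up):

<!DOCTYPE html>
<html>
<head>
    <title>Example page</title>
    <!-- stylesheets stay in the head so the page renders styled from the start -->
    <link rel="stylesheet" href="css/style.css">
</head>
<body>
    <!-- ...page content... -->

    <!-- scripts go last so they don't block downloading and rendering of the content above -->
    <script src="js/jquery.js"></script>
    <script src="js/site.js"></script>
</body>
</html>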

Hope this helps!

A typical website consists of one index.html file and a bunch of javascript and css files. To improve the performance of the website, one can:

  • Minify the javascript and css files, to reduce the file sizes.
  • Concatenate the JavaScript files into one file, and do the same for the CSS files, to reduce the number of requests to the server. For commonly used (and shared) libraries like jQuery it makes sense to leave them external, allowing the browser to cache the library and reuse it in different web applications.

I'm wondering if it makes sense to put the concatenated JavaScript and CSS inline in one single HTML file, which would reduce the number of requests even further. Will this improve the performance of your site? Or will it have the opposite effect, making it impossible for the browser to cache anything?

Concatenating your CSS and JS files will reduce the number of requests and make the page load faster. But as commented, inlining them into the HTML won't make much sense unless you have a one-page site whose load time is very critical. So you're better off keeping CSS and JavaScript in separate external files, in my opinion.
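As a rough sketch of the concatenation step (file names are made up, and minification is left to a separate tool such as YUI Compressor), a tiny Node build script could look like this:

// build.js - concatenate your own scripts into a single file; leave shared libraries like jQuery external
var fs = require('fs');

var files = ['src/app.js', 'src/widgets.js', 'src/forms.js'];
var bundle = files.map(function (f) { return fs.readFileSync(f, 'utf8'); }).join(';\n');

fs.writeFileSync('public/js/site.js', bundle); // one request instead of three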

Here's a book where you can learn more about the topic:

High Performance Web Sites

Most web frameworks and "best practices" are not suitable for very high performance sites and the whitepapers from vendors out there ain't worth the paper they are printed on.

So where should someone look to find books, tutorials or other resources on this subject?

I have done quite a bit of searching on this. Finally it boiled down to these three books.

  1. High Performance Web Sites

  2. Even Faster Web Sites

  3. The Art of Scalability

Have a look at Cal Henderson's 'Building Scalable Web Sites' (O'Reilly):

http://www.amazon.com/Building-Scalable-Web-Sites-Applications/dp/0596102356

He's the guy behind Flickr.

Also have a look at highscalability.com; they cover the architectures of some of the most heavily loaded sites out there.

.button { background: url(../Images/button.png); }

Problem: for performance reasons all static content has expiration headers and is cached by the browser. When an image changes, users must force-refresh the cache (Ctrl+F5 in IE). I want images to be cached, but when necessary they must be reloaded automatically.

Question: is the following approach 'valid'?

.button {
    background: url(../Images/button.png?v=1234);
}

where v=1234 is the version of my site. I do not know whether it is 100% valid to write such things in CSS, and I do want browsers to still cache images if the version is the same. Do all modern browsers correctly cache data when the URL has a query-string part?

Thanks.

This is discussed in rule 3 of High Performance Web Sites: "Add an Expires or a Cache-Control Header". One of the approaches recommended is to version the files rather than the site.

From the accompanying blog:

Keep in mind, if you use a far future Expires header you have to change the component's filename whenever the component changes. At Yahoo! we often make this step part of the build process: a version number is embedded in the component's filename, for example, yahoo_2.0.6.js.
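Applied to the CSS above, the filename-versioning approach would look something like this (the exact naming scheme is up to your build process):

.button {
    /* version embedded in the file name instead of a query string: a new build writes
       button_2.0.7.png and updates this rule, so browsers fetch the new file automatically
       while the old cached copy is simply never requested again */
    background: url(../Images/button_2.0.6.png);
}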

What are the best practices for doing DOM insertion?

  • Is it faster to insert large chunks of html vs element at a time in a loop?
  • Does it matter what html you are inserting, or only how big the chunk is?
  • Is it faster to insert a table vs. inserting just the rows using the table hack?

It's definitely faster to do it all at once. Also check out Steve Souders' blog and his book.
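A rough sketch of the difference (the element id and the data are made up):

// suppose we have a list of strings to show
var items = ['alpha', 'beta', 'gamma'];
var list = document.getElementById('results');

// slow: touches the live DOM once per row
for (var i = 0; i < items.length; i++) {
    var li = document.createElement('li');
    li.appendChild(document.createTextNode(items[i]));
    list.appendChild(li);
}

// usually faster: build one big HTML string and insert it in a single operation
var html = '';
for (var j = 0; j < items.length; j++) {
    html += '<li>' + items[j] + '</li>';
}
list.innerHTML = html;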

I am willing to learn about the different architectures of highly scalable web applications like Gmail, Google, YouTube, Amazon, Orbitz, LinkedIn, eBay etc., and would certainly appreciate it if someone could point me to some online resource or book where I can learn about the details of their architectures and the trade-offs in selecting a particular design over another.

I'm into my first 3 months of web development and I have been dabbling with some server side scripting in the form of ColdFusion, along with some Javascript, JQuery and CSS.

I have read about CSS optimization and would like to know what are the other pertinent factors contributing to the better performance of a website. What all factors can a developer profile and optimize?

How big a part does picking (or rather, recommending) a particular browser play in this performance hunt?

cheers

I'd recommend reading Best Practices for Speeding Up Your Web Site and all of the content on Yahoo!'s Exceptional Performance page.

If you like books, you may be interested in High Performance Web Sites (note that a lot of the content in this is in the Best Practices for Speeding Up Your Web Site article) and Even Faster Web Sites.

Here are a few of my favourite rules from Best Practices for Speeding Up Your Web Site (a small server-side sketch of a couple of them follows the list):

  • Minimize HTTP Requests
  • Add an Expires or a Cache-Control Header
  • Gzip Components
  • Make JavaScript and CSS External
  • Minify JavaScript and CSS
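As a minimal sketch of the gzip and Cache-Control rules, assuming a Node/Express server with the third-party compression middleware installed (any other server such as Apache, nginx or IIS has equivalent settings):

// server.js - sketch only
var express = require('express');
var compression = require('compression');

var app = express();
app.use(compression());                         // gzip responses
app.use(express.static('public', {
    maxAge: '30d'                               // far-future Cache-Control for static assets
}));
app.listen(3000);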

Also, smush.it is good for compressing images (which have a big impact on how fast a webpage loads).

As far as browsers go, Safari 4 claims it is "the world's fastest browser", and I can say that the Mac version is certainly nice and fast (not to mention elegant!). However, the above suggestions will make much more of a difference than which browser you're using.

Steve

I'm involved in the kind of work where I think knowledge about large-scale applications and large-scale web sites will help me a lot. What do you think I should take up? I mean books to read, courses to take... etc. Thanks in advance for any suggestion.

PS: maybe the applications I mean are not large enough :D, something like a social network for >100k users or a realtime online game for 5,000 concurrent users. I'm after architecture and design: things to consider when you build these kinds of applications.

security: http://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project

scalability (db layer): http://oreilly.com/catalog/9780596003067 (half of the book is very relevant for any database)

scalability (app layer): http://www.javaconcurrencyinpractice.com/ (half of the book is very relevant for any language with shared state and threads)

front end: http://www.amazon.com/High-Performance-Web-Sites-Essential/dp/0596529309/ and http://www.amazon.com/Even-Faster-Web-Sites-Performance/dp/0596522304

Also, just to get a grasp of how difficult things sometimes are, you should start reading these two blogs:

I need to build a Windows Forms application to measure the time it takes to fully load a web page. What's the best approach to do that?

The purpose of this small app is to monitor some pages in a website, in a predetermined interval, in order to be able to know beforehand if something is going wrong with the webserver or the database server.

Additional info:

I can't use a commercial app, I need to develop this in order to be able to save the results to a database and create a series of reports based on this info.

The WebRequest solution seems to be the approach I'm going to use; however, it would be nice to be able to measure the time it takes to fully load the page (images, CSS, JavaScript, etc.). Any idea how that could be done?

Depending on the frequency you need, maybe you can try using Selenium (an automated testing tool for web applications); since it drives a real web browser internally, you will get a pretty close measure. I think it would not be too difficult to use the Selenium API from a .NET application (you can even use Selenium in unit tests).

Measuring this kind of thing is tricky because web browsers have some particularities in how they download all of a page's elements (JS, CSS, images, iframes, etc.). These particularities are explained in this excellent book (http://www.amazon.com/High-Performance-Web-Sites-Essential/dp/0596529309/).

A homemade solution would probably be too complex to code, or would fail to account for some of those particularities (measuring just the time spent downloading the HTML is not good enough).

Just curious why there is a performance enhancement when you standardize a site with the CSS at the top of the page and the JS at the bottom.

That's not entirely correct. Said simply:

  • Style declarations should be as close to the top as possible, since browsers won't render your page before loading the CSS (to avoid a flash of unstyled content)

  • Script tags should be as close to the bottom as possible, since they block the browser from parsing anything after the tag until the script is loaded and executed (because the script may change the document with document.write)

If you're interested in frontend performance, I highly recommend reading High Performance Web Sites: Essential Knowledge for Front-End Engineers by Steve Souders.

When designing an ASP.NET WebForms application, what are some important steps to take (or hacks, if you like that term) to ensure the best possible performance (in terms of speed, stability, and scalability)?

There is a phenomenal book on this subject by one of the Yahoo guys, Steve Souders. It taught me a lot.

Or you can just watch this video. It's a high level overview of the same information - you can pick up a lot in 45 minutes by watching this.

NOTE: This content is not WebForms-specific. It's general best practices for the web, and it is what you need if you are trying to roll out a high performance website.

I read High Performance Web Sites: Essential Knowledge for Front-End Engineers and in it the author suggests that all JavaScript code should be externalized and put at the bottom of the page instead of putting it in the head.

This is illustrated in this example. The external script tag blocks both downloading and progressive rendering of a page, so the solution was to put it at the bottom of the page.

However, in his second book Even Faster Web Sites: Performance Best Practices for Web Developers he talks about Inline JavaScript tags.

Inline scripts also block the downloading and rendering of a page, so he suggests moving them to the bottom of the page as well. However, this still blocks the rendering of the page entirely, as illustrated in this example.

Why does moving external scripts to the bottom of the page let the page render progressively, while moving inline scripts to the bottom blocks rendering completely until the script has executed?


PS:

The question isn't about why add JavaScript to the bottom of the page instead of putting them in the head. It's about why bottom inline scripts block rendering while bottom external scripts don't.

With an inline script, the time is taken up running the script, which might change the DOM. Trying to render the DOM while it's mutating is a recipe for a mess, so rendering only happens at points when the JS is stalled and the DOM is therefore stable.

While waiting for an external script to download, the running of scripts is stalled, so the DOM can be rendered safely. The downloaded JS won't be run until the rendering is complete.
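A contrived sketch of that difference (the busy loop just stands in for any expensive inline work, and the file name is made up):

<!-- ...all of the page content is above this point... -->

<!-- external: while this file downloads, no script is running, so the browser can paint the content above -->
<script src="js/analytics.js"></script>

<!-- inline: the parser must execute this right away, so painting waits until the loop finishes -->
<script>
    var end = new Date().getTime() + 3000;
    while (new Date().getTime() < end) { /* busy-wait for roughly 3 seconds */ }
</script>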

We are running a web service that is struggling with some pretty high page rendering times, especially in IE8 (around 20 sec). We are very skilled at building high-performing backend systems, but not as skilled at optimizing the frontend.

Currently it seems (from New Relic) that page rendering and DOM parsing are the biggest issues.

We have tried to optimize the JS scripts, and that helped a little, but the page still renders terribly slowly in IE8, and I have a feeling that some low-hanging fruit is out there. My problem is that I really have no idea where to start, what would work, and whether there are red lamps flashing that I'm not seeing. I need an experienced eye.

Can anyone point me in the right direction (I'm open to everything!)?

The slow page is here: the slow page

PS. we are running Rails 3.2

Hello Guys!

Question:


I have been trying to create a site (hosted on x10hosting), so I have been researching how to build a fast site. I found a page about improving jQuery code (linked below). On that page I read that including the jQuery framework from Google can speed up the site, because Google's copy is cached so the user doesn't have to download the jQuery framework again and again. I was wondering whether I can do that with my own site too. I'm on a Linux-based host. Can anyone suggest a page or some code to cache scripts or images on my site so the user doesn't have to download them again and again?


Links:


Improve JQuery


THANKS IN ADVANCE!

As Pekka said, fine-tuning caching is a very complex field. I recommend beginning with the book "High Performance Web Sites" and following the Yahoo! performance blog.
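For the jQuery part specifically, using Google's copy is just a matter of pointing your script tag at their CDN (pick the version you actually use; the one below is only an example):

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>

Because many sites reference the same URL, a visitor may already have the file in their browser cache before they ever reach your site. Caching of your own scripts and images, on the other hand, is controlled by the Expires/Cache-Control headers your server sends, not by anything inside the page.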

I am wondering: should I use these, or are they too much of a hassle, or do they have drawbacks that make them inadvisable? I am finding it impossible to find a good suite of tools that aren't just hacked together, now unmaintained, and full of quirks that make them useless.

I just wanted to ask Stack Overflow whether this is something I should worry about. I am a bit OCD about code formatting.

It depends on what you want to do with the code. If you want to focus on performance for your production environment, you probably don't need to worry about it being nice to look at. You actually want it as compact as possible to reduce the amount of bandwidth used by your application and the amount of data the user needs to download. Hence the existence of tools like YUI Compressor that will remove all white space and line breaks and even rename variables to make the code smaller (and in the process obfuscate it). Take a look at Rule 10, 'Minify JavaScript', in High Performance Web Sites from O'Reilly.

For your development environment, however, this doesn't apply. I am a big fan of clean, indented code since that makes it that much easier to debug and understand. Usually this is something you just get in the habit of doing with a little help from your IDE and some discipline (keep line length under 80 characters, add line breaks where useful, indent consistently, etc.).
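A tiny made-up illustration of the difference between the two environments:

// development: readable, indented, meaningful names
function addSalesTax(price, rate) {
    var tax = price * rate;
    return price + tax;
}

// production, after a minifier such as YUI Compressor: same behaviour, far fewer bytes on the wire
function addSalesTax(b,a){var c=b*a;return b+c};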

Does anyone know how I can reduce my page size while it's loading? For example, is there any way I can reduce my CSS file's size and JavaScript file's size?

There is a great book on this called High Performance Web Sites. It's an O'Reilly book that takes you through the various steps of client-side performance improvement. It covers the topics you mention, such as CSS and JS file sizes, amongst other things.

Even though my question is very similar to this one, it's not a duplicate.

Firebug NET Tab screenshot

The image shows the stats from Firebug's NET tab: each request takes a fraction of a second (all requests add up to 2.9 sec), yet the total time adds up to 6 seconds.

How do I figure out which request took the longest, and where the extra 3 seconds came from?

Requests are not necessarily made in parallel. Most browsers only pull 2 concurrent resources per host, so if all six of your resources are on the same host, they could simply be blocking one another. Furthermore, if some of these resources are JavaScript (or anything else that has to be parsed and executed once loaded), that time adds to the total as well.

Also note that the total time is when the page load event fires, so this doesn't necessarily mean the user is staring at a white screen for six seconds.

Check out the YSlow guidelines for more detailed tips on performance. I also recommend Building Faster Websites if you're really interested in this subject.

For our site, I'm using a lot of jQuery; right now I'm looking at 340 lines of jQuery code on top of the base library. How much is too much? I will be adding more, so when do I start trying to condense the code and eventually move to OOP?

Optimally, you should keep your script size as small as possible, but with today's 'Web 2.0' websites you will most probably accumulate quite a lot of JavaScript code.

The important thing is that before you deploy your website, you minify and gzip your script files so as to reduce their size as much as possible.

If you are really interested in optimizing and improving your website performance, I highly recommend taking a look at Steve Souders' High Performance Web Sites: Essential Knowledge for Front-End Engineers

I am currently a placement student (web developer) working for a university, and I have been assigned a few big web projects. The projects include a total revamp of the university IT help site, which draws around 14k hits a month from the university campus and around 4k externally. I also have a second project, which is a mobile version of the first. The projects will be sharing some resources.

To generalise this question so the answers could be useful to more people:

  • I have two websites that will share some resources, let's say index.php, functions.js and style.css, and these scripts will be used on almost all pages on both websites.
  • I have two audiences to cater for (in terms of download speed): users within the same network the sites are hosted on (approx. 100 Mb/s) and external users.

I would like to know the best way to cache each kind of script (.js, .css, .php), with examples of how this would be done and its pros and cons over other methods if possible. By caching I mean local, network and server caching.

Note: index.php is a dynamic page which should be refreshed from cache every 2 hours. It would be cool if you start your answer with .js, .css, .php or a combination so I can easily see what type of script you are talking about caching.

Thanks All!

Performance tuning through caching can be categorized into multiple layers:

A good introduction and practical code examples can be found in Chapter 9 (Performance) of Developing Large Web Applications. It talks about caching CSS, JavaScript, modules, pages, Ajax and Expires headers.

If you need to keep things simple on the server side, do the following:

  1. Install the APC extension, which will make PHP faster through so-called opcode caching. No special configuration is needed; it works silently for you.
  2. Cache the full page for two hours using the simple PEAR library PEAR::Cache_Lite.
  3. For each database SELECT query, cache the result in APC with a TTL of 5 minutes; MD5-hash the SELECT statement and use it as the APC cache key. Docs

In the future, if you have multiple servers and performance becomes crucial, you will need to look at:

  1. Shared memory caching between servers. Check Memcached or even Membase.
  2. A reverse proxy solution: basically a layer between your users and your server that serves HTTP requests instead of your server. You can use Varnish, Squid or Apache Traffic Server for that.
  3. MySQL's InnoDB engine is slow; you may need to go for a faster engine such as XtraDB.
  4. Then maybe you will find that relational databases are still too slow for you, and you will go for a key-value or document store such as MongoDB.

Finally, as references on web application performance, check:

  1. Front-end Performance: High Performance Web Sites, Even Faster Web Sites and High Performance JavaScript.
  2. Back-end Performance: Pro PHP Application Performance and High Performance MySQL

I would like to get some tips from our SO users about serving static files on a website — like JavaScript, CSS, images and Flash files — faster. Any useful tips?

The best tip I can give you is: buy Steve Souders' book High Performance Web Sites, which is full of easy-to-follow tips. On the subject of static content: use a content delivery network (CDN), which means placing your static content on another server, on another (sub)domain, and you get the best performance you can have for static content.

The advantages are: no cookies sent back and forth (this accounts for a lot of overhead!), no other HTTP overhead, good timeouts, solid performance when using an external CDN, and your own server gets much less traffic. There are many commercial (like Amazon S3) but also free CDN suppliers.
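As a small sketch, putting static content on another (sub)domain just means referencing your assets from there (the domain names are made up):

<link rel="stylesheet" href="http://static.example.com/css/site.css">
<script src="http://static.example.com/js/app.js"></script>
<img src="http://static.example.com/images/logo.png" alt="logo">

Since no cookies are ever set on static.example.com, the browser sends none with these requests, which shaves overhead off every single asset download.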

Some less important but still valuable tips:

Note: Stack Overflow is a fine example of a website that follows all these tips. Download YSlow to test your own website.

I am interested in adding a landscape footer image to my website, but the image size is 115 KB and it will load on every page... Is there any effective way to load a huge image such as this one:

http://gyazo.com/5b1b7312ec4370873368df0181e41b13.png

Here are a few things that may help you:

EDIT: I tried the second tip in the list below (tinypng.com) and it reduced the size of your image by 71%, to 39.1 KB. So that's a big win.

  1. Make sure to set the cache headers on your webserver so that the browser can cache the file. Also use the same URL for all other times you use the image. Doing these two simple things will make sure that the image will only get downloaded the first time the browser requests it. All other times it will be loaded from the browser's cache.

  2. Make sure to check if the image is as small as it can be. If you use a PNG then use tools like https://tinypng.com/ to squash all metadata out of the image. If you use a JPEG then maybe lower its quality. If you use Photoshop make sure to "save the image for web". This will also reduce the size. For photographs you are mostly better off using JPEGs; for text or other images that need to be lossless use PNG or GIF.

  3. Loading images will not really slow down your page that much. Not like JavaScript anyway. Loading Javascript will block the rendering of the page until the JS file is downloaded unless you use special loading techniques. That is not the case for images: the page will continue being rendered and the user can start using the page.

  4. If the image is placed using an IMG tag, then make sure to set the width and the height of the image in the CSS (or using the img width and height attributes). That way the browser does not need to reflow the page when the image is downloaded; it knows what size the image will be even before it arrives.

  5. There is a maximum number of parallel requests per domain that the browser will do. If the image has a very low priority you could postpone its loading and wait for the onLoad event. This will make sure the other resources (with a higher priority) are downloaded first. This requires some JavaScript, but not that much (use an image lazy loader, there are many; a minimal sketch follows this list).

  6. As I said in the previous item, there is a maximum number of requests PER DOMAIN. This is why you could also create a new (sub)domain and load some content from there. It will increase the total number of resources that can be downloaded in parallel. Using a CDN will also help here, because it is on a separate domain as well (and CDNs are optimised for this).

  7. If you want to read some REALLY GOOD books about this kind of optimising, read these: http://www.amazon.com/High-Performance-Web-Sites-Essential/dp/0596529309 http://www.amazon.com/Even-Faster-Web-Sites-Performance/dp/0596522304
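A minimal sketch of the lazy-loading idea from point 5 (the data-src convention and the file name are assumptions; real lazy-loader libraries do quite a bit more):

<img data-src="/images/footer-landscape.png" width="1200" height="300" alt="">

<script>
    // after everything else has loaded, swap data-src into src so the big image downloads last
    window.onload = function () {
        var imgs = document.querySelectorAll('img[data-src]');
        for (var i = 0; i < imgs.length; i++) {
            imgs[i].src = imgs[i].getAttribute('data-src');
            imgs[i].removeAttribute('data-src');
        }
    };
</script>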

The best way to include JavaScript on your website is to include as few files as possible at the end of the page*, after all of the markup and other assets have been loaded. Doing it that way also has the added benefit of encouraging a "progressive enhancement" development model.

*Yahoo actively spends a great deal of money researching this and has dedicated, full-time people (including the brilliant Steve Souders) who publish their findings online and in book form.

I've seen other approaches that attach a version number or MD5 hash to a JS src querystring.

e.g. <script src='/script/v1/'></script>

However, my JavaScript is still getting cached in multiple browsers (Chrome, Firefox) when I push a new version of my site.

This seems like a major problem that others have solved, and I seem to be doing the right things. How can I get this to work?

I added log messages and determined that the querystring method is working. Sorry for the unnecessary question.

However, in researching, I found some important points worth mentioning:

  1. One of the articles suggests using a querystring with the current time appended. You probably don't want to follow this suggestion as your files will never be cached. Using source control version numbers or an MD5 hash would be better.
  2. Steve Souders (of High Performance Web Sites fame) notes that certain web proxies never cache anything with a querystring. Thus, the version number should be embedded within the path to the file in order to ensure that your files are cached appropriately when accessed through these proxies (see the sketch just after this list). ( http://www.stevesouders.com/blog/2008/08/23/revving-filenames-dont-use-querystring/ )
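A small illustration of the two approaches (paths and version numbers are made up):

<!-- querystring versioning: fine for browsers, but some proxies refuse to cache URLs containing "?" -->
<script src="/js/app.js?v=2.0.6"></script>

<!-- path/filename versioning: cache-friendly everywhere; the build step renames the file on each release -->
<script src="/js/app_2.0.6.js"></script>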

The point of concatenation is to improve performance by having just one file to download. But that means that every time you change a bit of your own JavaScript, the whole package is recompiled and fingerprinted, including large libraries like jQuery that haven't changed and would have stayed cached if they were downloadable separately. Now jQuery is going to be redownloaded each time as part of your unified application.js.

What am I missing here? Wouldn't the best approach be to create two manifests - one for your own files (which are small and change frequently), and one for libraries (which are large and change infrequently)?
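(For reference, the two-manifest idea would look roughly like this with Sprockets directives; the file names are made up. Each manifest would then get its own javascript_include_tag in the layout and be added to config.assets.precompile, so a change to your own code would only invalidate the small bundle.)

// app/assets/javascripts/libs.js - big libraries that rarely change
//= require jquery
//= require jquery_ujs

// app/assets/javascripts/application.js - your own small, frequently changing code
//= require_tree ./site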

I will give it a try, with some speculation mixed in...

First, jQuery is provided by Rails itself, and depending on your layout it may come from a CDN. So let's look at the libraries that may change over time. What are the scenarios here?

  • A user visits the web site for the first time. The browser (depending on the type) has to load all JavaScript files before it can show anything that comes below them (therefore, move them to the end). Depending on the browser, it may load 2, 4, 6 or 8 resources at a time; if your site consists of dozens or even hundreds of them, this will slow down the presentation.
  • A user visits the web site (this page) a second time, normally on the same day, hour or even minute. The whole bundle is cached, so there is only one request to confirm that the cached copy can still be used; pretty fast. If all resources (hundreds) were loaded separately, there would be hundreds of such requests even when the cache is valid.
  • A user visits the web site a second time after some time has passed (let's say 15 days). Only one resource has changed; all the others could have been cached and reused. How probable is that?
  • A user (the developer) views the site during development. No asset pipeline is used, and no caching, because every change should be visible immediately.

So I think, from the web site's point of view, only scenario 3 may be (a little bit) slower, and it is the most improbable one. Normally, the overhead of many, many requests matters much more than the size of the requests.

If you have the time, just try a tool that displays the loading time of all resources. There may be edge cases where one resource changes often and should therefore not be included in the asset pipeline, but normally every change touches numerous resources, and caching them as one big blob helps to avoid a lot of requests.

Here are some references to literature that discusses this:

The two-connection limit can be particularly troublesome when you have multiple tabs open simultaneously. Besides "ignore the problem", what coping mechanisms have you seen used to get multiple tabs all doing heavily interactive Ajax despite the two-connection limit?

The two-connection limit is a "suggestion", and this article describes how to get around it where possible. Other Firefox configuration is discussed in this article about Firefox's about:config capability.

Also, if you own the website, you can tweak its performance using suggestions from this book by the Chief Performance Yahoo!.

What are the measures that reflect a highly scalable, high-performance ASP.NET form?

Like time in seconds to load the page?

Time in seconds to do data actions like add and delete?

Page size?

I have a developer designing a financial application to be used on a mobile phone via the phone's browser. Right now each page is 150 KB, which in my opinion is way too large. No images are used, as it is mostly HTML buttons and CSS, and possibly JavaScript. How can I minimize the page size?

This is the problem I'm currently tackling, and I can tell you that you will absolutely love the book High Performance Web Sites by Steve Souders.

The first thing you'll find out in the book is that you should make fewer HTTP requests, and one of the ways to do that is to put images into so-called "sprites" (although you said your developer doesn't use a lot of images, it would still help to push those few images into sprites).

To do that you will want some tool (because doing it manually is just a waste of time); for this I used Compass, and the book Pragmatic Guide to Sass was absolutely great for this topic (it goes straight to the point and shows how to use Compass to make sprites; also, I bet once you try Sass you will never go back to vanilla CSS).
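The generated CSS ends up looking roughly like this (class names, paths and offsets are made up):

/* one combined image for all the small icons: one HTTP request instead of three */
.icon-save, .icon-print, .icon-help {
    background-image: url(../Images/sprite.png);
    background-repeat: no-repeat;
    width: 16px;
    height: 16px;
}
.icon-save  { background-position: 0 0; }
.icon-print { background-position: -16px 0; }
.icon-help  { background-position: -32px 0; }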

Also, as Midhat mentioned, you should minify your JS/CSS/HTML, but on top of that you should put all your JS into one file (cutting down on the HTTP requests the browser has to make) and all your CSS into one CSS file. Also, you should put the stylesheets at the top of the page and the scripts at the bottom. Anyway, the few things I listed, plus a total of 14 rules, you'll find in Steve's book, so give it a try; you won't be disappointed.

I want to switch from web designer to front-end developer or PHP web developer. Which skills should I get? Is it easy to switch from designer to developer? I have two years' experience in web designing. Please advise.

Or should I stick to designing? What is the next big thing for designers after DIV-based layouts?

My current role: converting PSDs to HTML, fixing bugs in different browsers, with strong knowledge of HTML and CSS. I want to go with open-source programming like PHP and MySQL.

@wazdesign, I didn't come from a design background like you, but I found my niche in front-end web development nonetheless. I started with standards-based HTML and CSS and then started working, back in the day, on the views and helper functions in MVC frameworks (with a good team doing the controllers, models, etc.). Ask a competent web developer to give you a basic web-server architecture and process demo. Understand how data from the DB gets onto your users' pages, and all the checkpoints the data goes through on the way. Once you understand the principles, you can pretty much work with any technology after tooling up with the syntax.

I've listed some terms to research below that are tech-agnostic. I can't help you with the PHP side of things :)

Some books worth reading:

Some terms to research:

  • Interaction Design
  • MVC Frameworks
  • Templating systems
  • HTTP
  • User Interface

Some tools to use:

  • Firebug
  • YSlow for Firebug

Many web developers are probably aware of the various website optimization techniques described in this Yahoo Developer Network article and/or Steve Souders' book. Most of these techniques are very simple, yet make a huge difference to the download time of most web pages. As simple as they are, applying some of these rules again and again in all .NET web applications can easily become a tedious task. Combres automates many of the steps that you would otherwise have to do yourself when applying these optimization techniques in your ASP.NET MVC and Web Forms applications.