Optimizing JavaScript For Execution Speed
05.27.03

By Andy King
Contributing Writer

JavaScript can benefit from many of the same speed-optimization techniques that are used in other languages, like C [1, 2] and Java. Algorithms and data structures, caching frequently used values, loop unrolling and hoisting, removing tail recursion, and strength-reduction techniques all have a place in your JavaScript optimization toolbox. However, how you interact with the Document Object Model (DOM) in large part determines how efficiently your code executes.
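
To make two of these techniques concrete, here is a minimal sketch of caching a frequently used value (the array length) and unrolling a loop four elements at a time; the function names are illustrative, not from the original:

<script type="text/javascript">
// Straightforward loop: items.length is re-read on every iteration.
function sumSlow(items) {
  var total = 0;
  for (var i = 0; i < items.length; i++) {
    total += items[i];
  }
  return total;
}

// Optimized: cache the length once and unroll the loop body.
function sumFast(items) {
  var total = 0;
  var n = items.length;       // cached value
  var i = 0;
  var limit = n - (n % 4);    // largest multiple of 4 <= n
  while (i < limit) {         // unrolled: four elements per pass
    total += items[i] + items[i + 1] + items[i + 2] + items[i + 3];
    i += 4;
  }
  while (i < n) {             // mop up the remainder
    total += items[i++];
  }
  return total;
}
</script>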

Unlike other programming languages, JavaScript manipulates web pages through a relatively sluggish API, the DOM. Interacting with the DOM is almost always more expensive than straight computations. After choosing the right algorithm and data structure and refactoring, your next consideration should be minimizing DOM interaction and I/O operations.
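
For example, building markup in a JavaScript string and touching the DOM once is typically much faster than appending nodes one at a time inside a loop. A minimal sketch, assuming the page contains a <ul id="list"> element:

<script type="text/javascript">
// Slow: one DOM update per item.
function buildListSlow(items) {
  var list = document.getElementById("list");
  for (var i = 0; i < items.length; i++) {
    var li = document.createElement("li");
    li.appendChild(document.createTextNode(items[i]));
    list.appendChild(li);  // DOM touched on every pass
  }
}

// Faster: do the work in JavaScript, then touch the DOM once.
function buildListFast(items) {
  var html = [];
  for (var i = 0; i < items.length; i++) {
    html[html.length] = "<li>" + items[i] + "</li>";
  }
  document.getElementById("list").innerHTML = html.join("");
}
</script>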

With most programming languages, you can trade space for time complexity and vice versa [3]. But on the web, JavaScript must be downloaded. Unlike desktop applications, where you can trade another kilobyte or two for speed, with JavaScript you have to balance execution speed against file size.
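
A classic space-for-time trade is a lookup table: spend some memory (and a few extra lines of code) on precomputed values so that hot code can replace a function call with an array index. A hedged sketch; the table size and function names are illustrative:

<script type="text/javascript">
// Build a 360-entry sine table once, at load time.
var SIN_TABLE = [];
for (var d = 0; d < 360; d++) {
  SIN_TABLE[d] = Math.sin(d * Math.PI / 180);
}

// Lookup for non-negative whole degrees replaces a Math.sin() call.
function fastSin(degrees) {
  return SIN_TABLE[degrees % 360];
}
</script>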

How Fast Is JavaScript?

Unlike C, with its optimizing compilers that increase execution speed and decrease file size, JavaScript is an interpreted language that is usually run over a network connection (unless you count Netscape's Rhino, which can compile and optimize JavaScript into Java byte code for embedded applications [4]). This makes JavaScript relatively slow compared to compiled languages [5]. However, most scripts are so small and fast that users won't notice any speed degradation. Longer, more complex scripts are where this article can help jumpstart your JavaScript.




Design Levels

A hierarchy of optimization levels exists for JavaScript, what Bentley and others call design levels [6]. First come global changes, like using the right algorithms and data structures, that can speed up your code by orders of magnitude. Next comes refactoring, which restructures code in a disciplined way into a simpler, more efficient form [7]. Then comes minimizing DOM interaction and I/O or HTTP requests. Finally, if performance is still a problem, use local optimizations like caching frequently used values to save on recalculation costs. Here is a summary of the optimization process:
  1. Choose the right algorithm and data structure.

  2. Refactor to simplify code.

  3. Minimize DOM and I/O interaction.

  4. Use local optimizations last.

When optimizing your code, start at the highest level and work your way down until the code executes fast enough. For maximum speed, work at multiple levels.

Measure Your Changes

Measurement is a key part of the optimization process. Use the simplest algorithms and data structures you can, and measure your code's performance to see whether you need to make any changes. Use timing commands or profilers to locate any bottlenecks. Optimize these hot spots one at a time, and measure any improvement. You can use the Date object to time individual snippets:
<script type="text/javascript">
// Times Bench1() (local-variable test) or Bench2() (global-variable
// test) and reports the elapsed time in seconds.
function DoBench(x) {
  var startTime, endTime, gORl = 'local';
  if (x == 1) {
    startTime = new Date().getTime();
    Bench1();
    endTime = new Date().getTime();
  } else {
    gORl = 'global';
    startTime = new Date().getTime();
    Bench2();
    endTime = new Date().getTime();
  }
  alert('Elapsed time using ' + gORl + ' variable: ' +
    ((endTime - startTime) / 1000) + ' seconds.');
}
...
</script>
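
Bench1() and Bench2() are elided above; here is a minimal sketch of what they might look like, assuming they compare local versus global variable access (as the 'local'/'global' label in DoBench suggests):

<script type="text/javascript">
var globalCounter;            // assumed global used by Bench2()

function Bench1() {           // local-variable version
  var localCounter = 0;
  for (var i = 0; i < 100000; i++) {
    localCounter++;
  }
}

function Bench2() {           // global-variable version
  globalCounter = 0;
  for (var i = 0; i < 100000; i++) {
    globalCounter++;
  }
}
</script>

Calling DoBench(1) times the local version; any other argument times the global one.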

This is useful when comparing one technique to another. But for larger projects, only a profiler will do. Mozilla.org includes the Venkman profiler in the Mozilla browser distribution to help optimize your JavaScript.


[Figure: The Venkman JavaScript Profiler]

For more information on the Venkman profiler, see the Venkman project pages at mozilla.org.

The Pareto Principle

Economist Vilfredo Pareto found in 1897 that about 80 percent of Italy's wealth was owned by about 20 percent of the population [8]. This has become the 80/20 rule or the Pareto principle, which is often applied to a variety of disciplines. Although some say it should be adjusted to a 90/10 rule, this rule of thumb applies to everything from employee productivity and quality control to programming.


About the Author:
Andy King is the author of "Speed Up Your Site" and founder of WebReference.com and JavaScript.com. As Managing Editor of these award-winning sites, Andy became the "Usability Czar" at internet.com. A ten-year web veteran, he has written extensively on web site optimization; "Speed Up Your Site" is the culmination of that work. For more information on the book and Andy's consulting services, see the companion site at http://www.WebSiteOptimization.com. You can contact him at http://www.websiteoptimization.com/contact.
--
DevNewz is an iEntry, Inc. publication --
© 2003 iEntry, Inc. All Rights Reserved