JavaScript Optimization Theory--Part 1 of Chapter 10 from Speed Up Your Site (1/5) | WebReference



Speed Up Your Site, Chapter 10: Optimizing JavaScript for Execution Speed

Design Levels

A hierarchy of optimization levels exists for JavaScript, what Bentley and others call design levels.[6] First come global changes, such as using the right algorithms and data structures, which can speed up your code by orders of magnitude. Next comes refactoring, which restructures code in a disciplined way into a simpler, more efficient form.[7] Then comes minimizing DOM interaction and I/O or HTTP requests. Finally, if performance is still a problem, use local optimizations like caching frequently used values to save on recalculation costs. Here is a summary of the optimization process:

  1. Choose the right algorithm and data structure.

  2. Refactor to simplify code.

  3. Minimize DOM and I/O interaction.

  4. Use local optimizations last.
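As a sketch of step 1, consider membership testing. The function and variable names below are hypothetical examples, not code from this chapter: a linear scan of an array is O(n) per lookup, while an object used as a hash table is roughly O(1).

```javascript
// Step 1 sketch: choosing the right data structure (hypothetical names).
// Linear scan: cost grows with the length of the array.
function makeArrayLookup(ids) {
  return function (id) {
    for (var i = 0; i < ids.length; i++) {
      if (ids[i] === id) return true;
    }
    return false;
  };
}

// Hash lookup: build an object once, then each lookup is constant time.
function makeHashLookup(ids) {
  var table = {};
  for (var i = 0; i < ids.length; i++) {
    table[ids[i]] = true;
  }
  return function (id) {
    return table[id] === true; // strict test avoids inherited properties
  };
}
```

For a handful of items the difference is negligible, but for thousands of lookups against a large list, the hash version can be orders of magnitude faster, which is exactly the kind of gain no local tweak can match.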

When optimizing your code, start at the highest level and work your way down until the code executes fast enough. For maximum speed, work at multiple levels.
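As a sketch of the last level, here is the "caching frequently used values" idea from the list above. The function names are hypothetical examples: both versions compute the same sum, but the second hoists an invariant expression and the array length out of the loop.

```javascript
// Level 4 sketch: local optimization by caching (hypothetical names).
// Recomputes the radians-to-degrees factor on every pass.
function sumDegreesUncached(radians) {
  var total = 0;
  for (var i = 0; i < radians.length; i++) {
    total += radians[i] * (180 / Math.PI); // recomputed each iteration
  }
  return total;
}

// Computes the factor once and caches the array length.
function sumDegreesCached(radians) {
  var factor = 180 / Math.PI; // cached invariant
  var total = 0;
  for (var i = 0, n = radians.length; i < n; i++) { // length cached in n
    total += radians[i] * factor;
  }
  return total;
}
```

Note that this is the kind of change to make last: it saves a multiplication and a property lookup per iteration, not the order-of-magnitude gains of a better algorithm.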

Measure Your Changes

Measurement is a key part of the optimization process. Use the simplest algorithms and data structures you can, and measure your code's performance to see whether you need to make any changes. Use timing commands or profilers to locate any bottlenecks. Optimize these hot spots one at a time, and measure any improvement. You can use the Date object to time individual snippets:

<script type="text/javascript">
function DoBench(x){
  var startTime, endTime, gORl = 'local';
  var i, sum = 0;
  startTime = new Date().getTime();
  for (i = 0; i < x; i++) {  // the code being timed goes here
    sum += i;
  }
  endTime = new Date().getTime();
  alert('Elapsed time using ' + gORl +
        ' variable: ' + ((endTime - startTime) / 1000) +
        ' seconds.');
}
</script>

This is useful when comparing one technique to another. But for larger projects, only a profiler will do. Mozilla includes the Venkman profiler in its browser distribution to help you optimize your JavaScript.
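To make such comparisons repeatable, the timing pattern above can be wrapped in a small helper. This is a sketch in the spirit of DoBench, not code from the chapter; `buildWithConcat` and `buildWithJoin` are hypothetical techniques to compare.

```javascript
// Sketch: timing two techniques with the Date object (hypothetical names).
// Runs fn() reps times and returns the elapsed milliseconds.
function timeIt(fn, reps) {
  var start = new Date().getTime();
  for (var i = 0; i < reps; i++) {
    fn();
  }
  return new Date().getTime() - start;
}

// Technique A: build a string with repeated concatenation.
function buildWithConcat(n) {
  var s = '';
  for (var i = 0; i < n; i++) s += 'x';
  return s;
}

// Technique B: collect pieces in an array, then join once.
function buildWithJoin(n) {
  var parts = [];
  for (var i = 0; i < n; i++) parts.push('x');
  return parts.join('');
}
```

Calling `timeIt(function () { buildWithConcat(10000); }, 100)` and the same for `buildWithJoin` gives two comparable numbers; always confirm both techniques produce identical results before trusting the timings.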

The Venkman JavaScript Profiler - For more information on the Venkman profiler, see the Venkman project pages on mozilla.org.

The Pareto Principle - Economist Vilfredo Pareto found in 1897 that about 80 percent of Italy's wealth was owned by about 20 percent of the population.[8] This has become the 80/20 rule or the Pareto principle, which is often applied to a variety of disciplines. Although some say it should be adjusted to a 90/10 rule, this rule of thumb applies to everything from employee productivity and quality control to programming.

Barry Boehm found that 20 percent of a program consumes 80 percent of the execution time.[9] He also found that 20 percent of software modules are responsible for 80 percent of the errors.[10] Donald Knuth found that more than 50 percent of a program's run time is usually due to less than 4 percent of the code.[11] Clearly, a small portion of code accounts for the majority of program execution time. Concentrate your efforts on these hot areas.


6. Jon Bentley, Programming Pearls, 2nd ed. (Boston, MA: Addison-Wesley, 2000).

7. Martin Fowler, Refactoring: Improving the Design of Existing Code (Boston, MA: Addison-Wesley, 1999).

8. Vilfredo Pareto, Cours d'économie politique professé à l'Université de Lausanne, 2 vols. (Lausanne, Switzerland: F. Rouge, 1896-97).

9. Barry W. Boehm, "Improving Software Productivity," IEEE Computer 20, no. 9 (1987): 43-57.

10. Barry W. Boehm and Philip N. Papaccio, "Understanding and Controlling Software Costs," IEEE Transactions on Software Engineering 14, no. 10 (1988): 1462-1477.

11. Donald E. Knuth, "An Empirical Study of FORTRAN Programs," Software: Practice and Experience 1, no. 2 (1971): 105-133. Knuth analyzed programs found by sifting through wastebaskets and directories on the computer center's machines.

Created: January 8, 2003
Revised: January 8, 2003