<!doctype html public "-//w3c//dtd html 4.0 transitional//en">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<meta name="Author" content="Marc Attinasi">
<meta name="GENERATOR" content="Mozilla/4.7 [en] (WinNT; U) [Netscape]">
<title>Performance Tools for Gecko</title>
<style>
BODY { margin: 1em 2em 1em 2em; background-color: bisque }
H1, H2, H3 { background-color: black; color: bisque; }
TABLE.boxed { border-width: 1px; border-style: dotted; }
</style>
</head>
<body>
<dl>&nbsp;
<table WIDTH="100%" >
<tr>
<td>
<center><img SRC="mozilla-banner.gif" height=58 width=600></center>
</td>
</tr>
</table>

<center><table COLS=1 WIDTH="80%" class="boxed" >
<tr>
<td>
<center>
<h2>
Performance Monitoring for Gecko</h2></center>

<center>
<dd>
<i>maintainer:&nbsp; marc attinasi&nbsp;</i></dd></center>

<center>
<dd>
<i><a href="mailto:attinasi@netscape.com">attinasi@netscape.com</a></i></dd></center>
</td>
</tr>
</table></center>

<center>
<dd>
</dd></center>
</dl>
<h3>
Brief Overview</h3>
Gecko should be <i>fast</i>. To help us make sure that it is, we monitor
the performance of the system, specifically in terms of Parsing, Content
Creation, Frame Creation and Style Resolution - the core aspects of layout.
Monitoring of performance across build cycles is facilitated by a small
set of tools that work in conjunction with program output from the
Mozilla or Viewer applications to produce tables of performance values
and historical comparisons of builds analysed in the past. The tools, their
dependencies, and their general care and feeding are the topics of this
document.
<h4>
Usage: A five-step plan to enlightenment</h4>
<ul>
<li>
First, the tools are all designed to run only on Windows. That is really
a bummer, but since most of what we are measuring is cross-platform (XP)
code it should not really matter. Get a Windows NT machine if you want to
run the tools.</li>

<li>
Next, you need a build that was created with performance monitoring enabled.
To create such a build you must compile the Mozilla source with a special
environment variable set. This environment variable turns on code that
accumulates and dumps performance metrics data. The environment variable
is: <b>MOZ_PERF=1</b>. Set this environment variable and then build all
of Mozilla. If you can obtain a build that was built with MOZ_PERF=1 set,
you can just use that build.</li>

<li>
Third, run the script <b>perf.pl</b> to execute Viewer and run through
the test sites gathering performance data (a command-line sketch follows
this list).</li>

<li>
Fourth, make sure the script completed, then open the resultant HTML
file, which is dropped in the Tables subdirectory.</li>

<li>
Lastly, stare at the table and the values in it and decide if performance
is getting better, worse, or staying the same.</li>
</ul>
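<p>Putting the steps together, a typical session looks something like the
following Windows command-prompt sketch (the build name and paths are just
examples, and the build steps themselves are elided):
<pre>
rem Build with performance metrics enabled; the variable must be set
rem before compiling, not just before running the tests.
set MOZ_PERF=1
rem ... build all of Mozilla ...

rem Run the harness from the perftools directory:
cd \mozilla\tools\performance\layout
perl perf.pl Daily-0215 s:\mozilla\0215 cpu

rem When the script completes, open the new HTML table from the
rem Tables subdirectory in a browser.
</pre>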
<h3>
The PerfTools</h3>
<blink>IMPORTANT: </blink>The tools created for monitoring performance
are very tightly coupled to output from the layout engine. As Viewer (or
Mozilla) is run, it spits out various timing values to the console. These
values are captured to files, parsed, and assembled into HTML tables showing
the amount of CPU time dedicated to parsing the document, creating the
content model, building the frame model, and resolving style during the
building of the frame model. All of the scripts that make up the perftool
are located in the directory <tt>\mozilla\tools\performance\layout.</tt>
Running them from another location <i>may</i> work, but it is best to run
from there.
<p>The perl script, <tt>perf.pl</tt>, is used to invoke Viewer and direct
it to load various URLs. The URLs to load are contained in a text file,
one per line. The file <tt>40-URL.txt</tt> is the baseline file and contains
a listing of file-URLs that are static, meaning they never change, because
they are snapshots of popular sites. As the script executes it does two
things:
<ol>
<li>
Invokes Viewer and feeds it the URL-file, capturing the output to another
file</li>

<li>
Invokes other perl scripts to process the Viewer output into HTML tables</li>
</ol>
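<p>In outline, the orchestration looks roughly like the Perl sketch below.
This is <i>not</i> the real perf.pl: the viewer.exe flag and the log-file
name are assumptions made for illustration; see the checked-in script for
the real details.
<pre>
use strict;
use warnings;

# Hypothetical condensation of what perf.pl does. The viewer.exe flag
# (-f) and the log file name are assumptions, not the real ones.
my ($build_name, $build_dir, $timing_mode) = @ARGV;   # e.g. Daily-0215 s:\mozilla\0215 cpu

# 1. Invoke Viewer on the URL file, capturing its console output.
my $log = "$build_name-timings.txt";
system("$build_dir\\bin\\viewer.exe -f 40-URL.txt > $log") == 0
    or die "Viewer run failed: $?\n";

# 2. Invoke the other scripts to turn the captured output into a table.
system("perl", "Header.pl", $build_name) == 0 or die "Header.pl failed\n";
system("perl", "AverageTable2.pl", $log) == 0 or die "AverageTable2.pl failed\n";
system("perl", "Footer.pl") == 0 or die "Footer.pl failed\n";
</pre>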
A set of perl scripts is used to parse the output of the Viewer application.
These scripts expect the format of the performance data to be stable;
in other words, if the output format ever changes, the scripts must be
updated to match. Here are the files involved in parsing the data and
generating the HTML table:
<ul>
<li>
<tt><b>perf.pl</b> : </tt>The main script that orchestrates the running
of Viewer and the invocation of other scripts, and finally copies files
to their correct final locations. An example of an invocation of the perf.pl
script is: '<b><tt><font color="#000000">perl perf.pl</font><font color="#000099">
Daily-0215 s:\mozilla\0215 cpu</font><font color="#000000">'</font></tt></b></li>

<ul>
<li>
<tt><b><font color="#000099">Daily-0215 </font></b><font color="#000000">is
the name of the build and can be anything you like.</font></tt></li>

<li>
<tt><b><font color="#000099">s:\mozilla\0215 </font></b><font color="#000000">is
the location of the build. There must be a bin directory under the directory
you specify, and it must contain the MOZ_PERF-enabled build.</font></tt></li>

<li>
<tt><b><font color="#000099">cpu </font></b><font color="#000000">indicates
that we are timing CPU time. The other option is clock, but that is not
currently functional because of the clock resolution.</font></tt></li>
</ul>

<li>
<b><tt>Header.pl</tt></b> : a simple script that generates the initial
portion of the HTML file that will show the performance data for the current
build.</li>

<li>
<tt><b>AverageTable2.pl</b> </tt>: a slightly more complicated script that
parses the output from Viewer, accumulates data for averaging, and generates
a row in the HTML table initialized by Header.pl. This file <b>must</b>
be modified if the performance data output format changes (see the parsing
sketch after this list).</li>

<li>
<tt><b>Footer.pl</b> </tt>: a simple script that inserts the last row in
the HTML table, the averages row. It also terminates the table and ends
the HTML tag.</li>

<li>
<tt><b>GenFromLogs.pl</b> </tt>: a script that generates the HTML table
from already-existing logs. This is used to regenerate a table after a
run, in case the table file is lost or otherwise needs to be recreated.
Also, if old logs are kept, they can be used to regenerate their
corresponding tables.</li>

<li>
<b><tt>Uncombine.pl</tt></b> : a script that breaks up a single text file
containing all of the timing data for all of the sites into a separate
file for each individual site.</li>

<li>
<b><tt>History.pl</tt></b> : a script that generates an HTML file showing
a historical comparison of average performance values for current and previous
builds.</li>
</ul>
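<p>To make the coupling concrete, here is a minimal sketch of the kind of
parsing and averaging AverageTable2.pl performs. The line shape matched
here (a phase name, a colon, and a millisecond value) is an assumption for
illustration only; the real MOZ_PERF output format is whatever the shipped
script encodes.
<pre>
use strict;
use warnings;

# Assumed line shape: "Parsing: 412" (phase name, colon, milliseconds).
# The real MOZ_PERF output format is what AverageTable2.pl encodes;
# the regex below must be kept in sync with it.
my (%total, %count);
while (my $line = &lt;&gt;) {
    if ($line =~ /^(Parsing|Content Creation|Frame Creation|Style Resolution|Reflow):\s*([\d.]+)/) {
        $total{$1} += $2;
        $count{$1}++;
    }
}
for my $phase (sort keys %total) {
    printf "%-20s average: %10.2f ms\n", $phase, $total{$phase} / $count{$phase};
}
</pre>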
<h3>
The URLs</h3>
It is critical that the URLs we load while measuring performance do
not change. This is because we want to compare performance characteristics
across builds, and if the URLs changed we could not really make valid comparisons.
Also, as URLs change, they exercise different parts of the application,
so we really want a consistent set of pages to measure performance against.
The builds change, the pages do not.
<p>On February 3, 2000 the top 40 sites were 'snaked' using the tool WebSnake.
These sites now reside in disk-files and are loaded from those files during
the load test. The file <tt>40-URL.txt</tt> contains a listing of the file-URLs
created from the web sites. The original web sites should be obvious from
the file-URLs.
<br>&nbsp;
<blockquote><i><b>NOTE</b>: There are some links to external images in
the local websites. These should have been resolved by WebSnake but were
not, for some reason. They should be made local at some point so we can
run without a connection to the internet (a small checker sketch follows).</i></blockquote>
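<p>Until that happens, a quick way to spot the offenders is to scan the
snapshot pages for absolute links. The sketch below is hypothetical (it is
not one of the checked-in perftools scripts) and only looks at <tt>src</tt>
and <tt>href</tt> attributes:
<pre>
use strict;
use warnings;

# Hypothetical checker: print any absolute http:// reference in the
# pages named on the command line, i.e. links WebSnake failed to
# localize.
local $/;                              # slurp whole files at once
for my $file (@ARGV) {
    open my $fh, '&lt;', $file or die "cannot open $file: $!\n";
    my $html = &lt;$fh&gt;;
    close $fh;
    while ($html =~ m{(?:src|href)\s*=\s*["']?(http://[^"'\s>]+)}gi) {
        print "$file: $1\n";
    }
}
</pre>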
<h3>
Historical Data and Trending</h3>
Historical data will be gathered and presented to make it easy for those
concerned to see how the relative performance of various parts of the product
changes over time. This historical data is kept in a flat file of comma-delimited
values where each record is indexed by the pull-date/milestone and buildID
(note that the buildID is not always reliable; however, the pull-date/milestone
is provided by the user when the performance package is run, so it can
be made to be unique). The Historical Data and Trending table will show
the averages for Parsing, Content Creation, Frame Creation, Style Resolution,
Reflow, Total Layout and Total Page Load time for each build, along with
a simple bar-graph representation of each record's weight relative to the
other records in the table. At a later date this can be extended to trend
individual sites; however, for most purposes the roll-up of overall averages
is sufficient to track the performance trends of the engine.
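<p>A minimal sketch of reading such a flat file and rendering the relative
bars follows. The field order shown is an assumption for illustration; the
real record layout is whatever History.pl writes to <tt>history.txt</tt>:
<pre>
use strict;
use warnings;

# Assumed record shape (illustrative only - History.pl defines the real
# one): build-id followed by the seven comma-delimited averages, with
# Total Page Load as the last field, e.g.
#   Daily-0215,1012.3,842.7,1310.0,560.2,901.4,3614.3,5120.8
my @records;
while (my $line = &lt;&gt;) {
    chomp $line;
    my ($build, @avgs) = split /,/, $line;
    push @records, [$build, @avgs];
}

# Scale each build's Total Page Load against the slowest build in the
# table and draw a crude text bar for it.
my ($max) = sort { $b &lt;=&gt; $a } map { $_->[-1] } @records;
for my $rec (@records) {
    my $width = int(40 * $rec->[-1] / $max);
    printf "%-12s %s\n", $rec->[0], '*' x ($width || 1);
}
</pre>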
<h3>
The Execution Plan</h3>
Performance monitoring will be run on a weekly basis, and against all Milestone
builds. The results of the runs will be published for all interested parties
to see. Interested and/or responsible individuals will review the performance
data to raise or lower developer awareness of performance problems and
issues as they arise.
<p>Currently, the results are published weekly at <a href="http://techno/users/attinasi/publish">http://techno/users/attinasi/publish</a>
<h3>
Revision Control and Archiving</h3>
The scripts are checked into cvs in the directory \mozilla\tools\performance\layout.
The history.txt file is also checked in to cvs after every run, as are
the tables produced by the run. Committing the files to cvs is a manual
operation and should be completed only when the data has been analysed
and appears valid. Be sure to do the following (example commands are
sketched after the list):
<ol>
<li>
Commit history.txt after each successful run.</li>

<li>
Add / commit the new table and new trend-table after each successful run
(in the Tables subdirectory).</li>

<li>
Commit any changes to the scripts or this document.</li>
</ol>
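<p>For example (the table file names below are placeholders; use whatever
files your run actually produced in the Tables subdirectory):
<pre>
cvs commit -m "Perf results for Daily-0215" history.txt
cvs add Tables/Daily-0215.html Tables/Daily-0215-trend.html
cvs commit -m "Perf tables for Daily-0215" Tables
</pre>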
<hr WIDTH="100%">
<h3>
History:</h3>

<table BORDER WIDTH="50%" >
<tr>
<td WIDTH="25%">02/04/2000</td>

<td>Created - attinasi</td>
</tr>

<tr>
<td>03/17/2000</td>

<td>Removed QA Partner stuff - no longer used</td>
</tr>
</table>

</body>
</html>