Tool Mentor: Finding Performance Bottlenecks Using Rational Quantify and Rational PurifyPlus (Windows)
Purpose
This tool mentor provides an overview of how to use Rational Quantify to quickly
pinpoint performance bottlenecks in Visual C/C++, Visual Basic, Java, and VS.Net
language applications. It applies to systems running Microsoft Windows.
PurifyPlus is a Rational product that includes Quantify
functionality.
To learn more about Quantify, including how to project performance
improvements, interpret source-code annotations, compare program runs, and
fine-tune data collection, read Getting Started with Rational Quantify.
For step-by-step information about using
Quantify, see the Quantify online Help.
Overview
Quantify provides a complete, accurate set of performance data for your
program and its components, in an understandable and usable
format so you can see exactly where your program spends most of its time.
Tool Steps
To profile a program's performance:
- Run a program using Quantify to collect performance data
- Use the Quantify data analysis windows and tools to analyze the performance data
- Run the program again and use the Quantify Compare Runs tool to find performance changes
1. Run a program using Quantify to collect performance data
The first step in profiling your program's performance is to collect
performance data.
For Visual C++, instrument and run a program, either directly from
Microsoft Visual Studio using the Quantify integration or from Quantify.
Quantify instruments copies of the executable and its associated modules,
inserting additional code that collects counted and timed performance data, and
shows its progress as it instruments each file.
For Java, run Java applets, class files, or code launched by container
programs from Quantify (using the Run Program dialog) or from the command line.
When you profile Java code, Quantify puts the Java virtual machine (JVM)
into a special mode that enables Quantify to monitor the JVM's operation and
directly collect counted and timed performance data as the applet, class file,
or code runs.
For Visual Basic, run Visual Basic projects or p-code programs (Visual
Basic 6.0) or Visual Basic native-code programs (Visual Basic 5.0 or later),
either directly from Microsoft Visual Basic using the Quantify integration or
from Quantify. When you profile projects or p-code programs, Quantify puts the
Visual Basic for Applications (VBA) interpreter engine into a special mode that
enables Quantify to monitor the engine's operation and directly collect timed
performance data as your code runs. For native-code programs, Quantify
instruments the program and then collects counted and timed performance data.
For VS.Net language applications, run the application from Quantify
(using the Run Program dialog and specifying that you are profiling a
managed-code application) or from the command line. Quantify puts the common
language runtime (CLR) into a special mode that enables Quantify to monitor the
runtime's operation and directly collect performance data as the application
runs.
When Quantify starts profiling, it displays the Run Summary window so you can
monitor the activity of threads and fibers, and check other information about
the run. As you exercise your code, Quantify records data about its performance.
You can pause and resume data recording at any time, enabling you to profile
specific portions of code. You can also take a snapshot of the current data,
enabling you to examine performance in stages.
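You can also control recording programmatically from Visual C/C++ code. The following is a minimal sketch, assuming the Quantify C/C++ API that PurifyPlus ships in pure.h and pure_api.c; the function names shown are taken from that API, but confirm the exact set available in your version in the online Help. The startup_work and interesting_work functions are hypothetical placeholders for your own code.

    /* Minimal sketch: pause and resume Quantify data recording around the code
       of interest, then save the data collected so far. Assumes the Quantify
       C/C++ API declared in pure.h (shipped with PurifyPlus); build pure_api.c
       into the program as well so these calls become harmless stubs when the
       program runs outside Quantify. */
    #include "pure.h"

    /* Hypothetical placeholders for real application code. */
    static void startup_work()     { for (volatile int i = 0; i < 1000000; ++i) {} }
    static void interesting_work() { for (volatile int i = 0; i < 5000000; ++i) {} }

    int main()
    {
        QuantifyStopRecordingData();   /* pause recording during startup */
        startup_work();

        QuantifyClearData();           /* discard data collected so far */
        QuantifyStartRecordingData();  /* profile only the code of interest */
        interesting_work();
        QuantifyStopRecordingData();

        QuantifySaveData();            /* save the data collected to this point */
        return 0;
    }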
When you exit your program, Quantify has a complete profile of its
performance. Because this base dataset can be very large, Quantify
automatically filters out non-critical data from system libraries and other
modules before it displays the performance profile. As you analyze the
performance data, you can display more or less data and detail from the original
dataset.
Tip: In addition to using Quantify interactively, you can use it with your test
scripts, makefiles, and batch files for automated testing. For more information,
look up scripts in the Quantify online Help index.
For more information, look up the following topics in the Quantify
online Help index:
- developer studio
- visual basic
- java
- run summary
- recording data
- vs.net
- scripts
2. Use the Quantify data analysis windows and tools to analyze the performance data
The second step in profiling your program's performance is to analyze the
performance data that Quantify has collected.
When you exit the program for which Quantify has been collecting data, it displays the Call Graph window, graphically depicting the calling
structure and performance of the functions, procedures, or methods (collectively
referred to here as functions) in the program. By default, the call graph
displays the top 20 functions in the current dataset by function + descendants
(F+D) time. Quantify's results include virtually no overhead from the profiling
process itself; the numbers you see reflect the time your program would take
without Quantify.
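To make the F+D metric concrete, consider a small hypothetical Visual C++ example (not taken from this tool mentor): process_orders does almost no work of its own, so its function (F) time is low, but all the time spent in lookup_price is charged to its function + descendants (F+D) time, which is why sorting the call graph by F+D time surfaces it as the root of an expensive path.

    // Hypothetical example showing why F and F+D time differ: process_orders
    // has little work of its own (low F time) but a large F+D time, because
    // it spends its time in lookup_price.
    #include <string>
    #include <utility>
    #include <vector>

    struct Order { std::string sku; double total; };

    static std::vector<std::pair<std::string, double> > price_list;

    // Linear search: cheap per call, expensive in aggregate.
    double lookup_price(const std::string& sku)
    {
        for (const auto& entry : price_list)
            if (entry.first == sku)
                return entry.second;
        return 0.0;
    }

    // Low F time (loop overhead only), high F+D time (all the lookups).
    void process_orders(std::vector<Order>& orders)
    {
        for (auto& order : orders)
            order.total = lookup_price(order.sku);
    }

    int main()
    {
        for (int i = 0; i < 10000; ++i)
            price_list.push_back(std::make_pair("sku" + std::to_string(i), i * 1.5));
        std::vector<Order> orders(5000, Order{"sku9999", 0.0});
        process_orders(orders);
        return 0;
    }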
The call graph also highlights the most expensive path; thicker lines
indicate more expensive paths. You can highlight other functions based on
various criteria, including performance, calling relationships, and possible
causes for bottlenecks. You can also show additional functions, hide functions,
and move functions around to better view the call graph.
You can use Quantify's other data analysis windows to further examine the
program's performance. To review all functions in the current dataset, and to
sort them by various criteria, use the Function List window. The Function Detail
window displays data for a specific function, and data about its callers and
descendants, in both tabular and graphical formats. If debug data was available
when you ran the program and you measured functions at line level, you can use
the Annotated Source window to analyze a specific function's performance line by
line.
Quantify provides several ways to reduce large datasets and display only the
data you're interested in. For example, you can specify filters to remove
functions based on module, pattern (for example, functions with CWnd
in their name), or measurement type (for example, all waiting and blocking
functions). You can also focus on a specific subtree.
You can easily analyze the performance of the program over several runs by
merging the separate runs to create a new dataset.
For more information, look up the following topics in the Quantify
online Help index:
- call graph window
- function list window
- function detail window
- annotated source window
- highlighting functions
- filtering data
- subtrees
3. Run the program again and use the Quantify Compare Runs tool to find performance changes
The third and final step in profiling your program's performance is to
compare performance data from two runs, to see whether code changes have caused
the performance to improve or regress.
After you make changes to your code, you can rerun the updated program and
compare the new results to a previous run. The Diff Call Graph highlights
performance improvements in green and regressions in red to help you pinpoint
performance changes quickly. The Diff Function List displays the differences
between the two runs, as well as original data from the two runs.
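Continuing the hypothetical example from step 2, a typical change you might compare is replacing the linear search in lookup_price with a hash lookup, as sketched below. Rerun the program under Quantify and, if the change helped, the Diff Call Graph should show the process_orders path in green.

    // Hypothetical optimized version for the Compare Runs step: the linear
    // search from the earlier sketch is replaced by a hash lookup, so the
    // F+D time of process_orders should drop in the second run.
    #include <string>
    #include <unordered_map>
    #include <vector>

    struct Order { std::string sku; double total; };

    static std::unordered_map<std::string, double> price_index;

    double lookup_price(const std::string& sku)
    {
        auto it = price_index.find(sku);   // O(1) on average instead of O(n)
        return it != price_index.end() ? it->second : 0.0;
    }

    void process_orders(std::vector<Order>& orders)
    {
        for (auto& order : orders)
            order.total = lookup_price(order.sku);
    }

    int main()
    {
        for (int i = 0; i < 10000; ++i)
            price_index["sku" + std::to_string(i)] = i * 1.5;
        std::vector<Order> orders(5000, Order{"sku9999", 0.0});
        process_orders(orders);
        return 0;
    }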
Use the Navigator window to keep track of all the runs you're working with.
You can save performance data as a Quantify data file (.qfy) to use for further
analysis or to share with other Quantify users. You can save data to a
tab-delimited ASCII text file (.txt) to use outside of Quantify, for example, in
test scripts or in Microsoft Excel. You can also copy data directly from the
Function List window to use in Excel.
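As an illustration of post-processing outside Quantify, the sketch below reads a tab-delimited export and echoes two columns. The column positions are assumptions (they depend on which columns you exported from the Function List window), so adjust the indices to match your file's header row.

    // Sketch: post-process a tab-delimited export (.txt) outside Quantify.
    // The column indices below (0 = function name, 2 = F+D time) are
    // assumptions; match them to the header row of your export.
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main(int argc, char* argv[])
    {
        if (argc < 2) {
            std::cerr << "usage: report <export.txt>\n";
            return 1;
        }
        std::ifstream in(argv[1]);
        std::string line;
        while (std::getline(in, line)) {
            std::vector<std::string> cols;
            std::istringstream row(line);
            std::string col;
            while (std::getline(row, col, '\t'))
                cols.push_back(col);
            if (cols.size() > 2)
                std::cout << cols[0] << '\t' << cols[2] << '\n';
        }
        return 0;
    }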
For more information, look up the following topics in the Quantify online Help index:
- comparing runs
- navigator window
- saving data
Copyright © 1987 - 2001 Rational Software Corporation