The Performance tab in Real User Monitoring (RUM) is the central place for understanding the front-end performance of your software. Real User Monitoring captures a range of metrics across both multi-page and single-page applications. These metrics help you identify the poorest-performing parts of your application and measure the performance impact of the changes you make.
For more information on the metrics captured by Raygun, and what they’re measuring, see the performance metrics documentation.
The line graph visualizes the Average, Median, P90, and P99 of each performance metric over the time range selected, enabling you to see the trend in overall front-end performance of your software.
Use the dropdown in the top-right corner of the chart to switch between metrics. The graph displays lines for Average, Median, P90, and P99; you can show or hide each line by clicking its label in the key at the bottom of the graph.
For more information on the metrics displayed in the line graph, and what they’re measuring, see the performance metrics documentation.
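To make the P90 and P99 lines concrete: a P90 of 900 ms means 90% of sampled page loads completed in 900 ms or less. The sketch below is purely illustrative (it is not Raygun's implementation) and uses the nearest-rank method with a hypothetical set of load-time samples:

```javascript
// Illustrative sketch: deriving a percentile from load-time samples
// using the nearest-rank method. Not Raygun's actual implementation.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  // Nearest rank: the smallest value such that at least p% of samples
  // are less than or equal to it.
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank - 1, 0)];
}

// Hypothetical load times in milliseconds.
const loadTimesMs = [120, 140, 150, 180, 200, 240, 300, 450, 900, 2500];
const p90 = percentile(loadTimesMs, 90); // 900 for this sample set
const p99 = percentile(loadTimesMs, 99); // 2500 for this sample set
```

Note how a single slow outlier (2,500 ms here) dominates the P99 while barely moving the median, which is why the graph shows all four lines.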
Understanding performance for specific pages & XHRs
You can find performance diagnostics for a particular page, virtual page, or XHR by clicking its hyperlinked route in any of the tables on the Performance page. The following tables offer different ways to identify slow-performing parts of your application:
- Most requested pages: Lists pages from most to least viewed within the selected time range.
- Most viewed virtual pages: Lists virtual pages from most to least viewed within the selected time range.
- Slowest & most requested pages: Combines request counts with load times to surface the biggest performance wins in your application.
- Slowest pages by client time: Client time is the portion of load time spent on the client side (Transfer, Render, and Children). This table shows the pages with the slowest client load times.
- Slowest pages by server time: Server time is the time taken to execute the server-side component of a page request. This table shows the pages with the slowest server load times.
- Most requested XHRs: Lists XHRs from most to least requested within the selected time range.
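The client/server split in the tables above can be pictured with the browser's standard Navigation Timing entries. This is a simplified approximation, not Raygun's exact metric definitions (see the performance metrics documentation for those):

```javascript
// Rough sketch: approximating server vs. client time from a
// PerformanceNavigationTiming entry. Field names come from the
// W3C Navigation Timing spec; Raygun's own definitions may differ.
function splitLoadTime(nav) {
  return {
    // Server time: waiting for the first byte of the response.
    server: nav.responseStart - nav.requestStart,
    // Client time: everything after the first byte arrives
    // (Transfer, Render, and child asset loading).
    client: nav.loadEventEnd - nav.responseStart,
  };
}

// In a browser, pass the real navigation entry:
// const [nav] = performance.getEntriesByType('navigation');
// console.log(splitLoadTime(nav));
```

A page can have a fast server time but a slow client time (heavy rendering or many child assets), which is why the tables let you sort by each independently.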
You can also use the URI search (below the line graph on the performance page) to search for a particular URI. You can search with either full routes or keywords.
You can find more information on the page performance diagnostics captured in our documentation.
The filter bar at the top of the performance page can be used to refine the dataset, helping you identify performance issues based on particular parameters such as geography, device, browser, custom tags and more.
For more information, see the documentation on using filters in Real User Monitoring.
You can track custom performance measurements across your website or application using custom timings in Real User Monitoring. For example, you might track the time it takes for a specific video to play, or for the search bar to load.
For more information on custom timings, and how to configure them, see the custom timings documentation.
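As a sketch of what a custom measurement looks like in practice, the standard User Timing API can capture a duration such as the search bar example above. The mark and measure names here ('search:start', 'search:end', 'search-load') are illustrative; the resulting duration is the kind of value you would then report through Raygun's custom timings API, using the exact call shown in the custom timings documentation:

```javascript
// Minimal sketch using the standard User Timing API.
// Names are hypothetical examples, not Raygun-defined identifiers.
performance.mark('search:start');
// ... the work being measured (e.g. initialising the search bar) ...
performance.mark('search:end');
performance.measure('search-load', 'search:start', 'search:end');

const [measure] = performance.getEntriesByName('search-load');
const durationMs = measure.duration;
```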
Exporting to CSV
Performance data can be downloaded in CSV format: click the settings icon on the module you wish to export data for and select Export CSV. Some important points to note are:
- Export CSV exports data within the selected date range.
- For performance reasons, exports are limited to 1,000 records.
- Export CSV does not apply top-level filters, so exported data will not match any filter criteria you have applied.