Angular is Google’s flagship open-source web application framework, designed for building efficient and sophisticated single-page applications (SPAs). It is a cross-platform development framework capable of producing modern progressive web apps as well as mobile and desktop applications for Mac, Windows and Linux. Beyond its powerful development ecosystem, Angular provides a clean structure out of the box that encourages developers to follow design patterns when building, scaling and maintaining projects. This lets developers mix and match decoupled components with ease, resulting in a clean, modular application design.

However, simply writing code to build an Angular application can lead to performance issues and slow loading times down the road. Working with the latest features of the framework and diligently reorganizing project structure, files and code are just a few of the actions developers can take to optimize overall performance. Today’s expectations of lightning-fast load times demand that developers pay attention to other areas as well, such as build and deploy times, code enhancement techniques, and operational strategies for monitoring an app’s metrics at runtime in order to improve the application’s performance.

 

Issues when Determining Application Performance and Load Times

Since Angular is a modern TypeScript-based framework that renders dynamic web pages, it presents some challenges in monitoring performance and load time. It is difficult to accurately measure when new content is rendered on the page because SPAs do not trigger new navigations in the browser once the page has loaded. HTTP-monitoring tools therefore provide little useful data for optimizing load times, since Angular does not trigger new browser requests to the server.

Additionally, although HTTP responses report the result of a request sent by a web page, they fall short of capturing true load times: the embedded JavaScript files and associated resources still have to be parsed, executed and rendered before users can fully interact with the page. A different approach to testing and monitoring JavaScript events in the browser is needed in order to obtain accurate load times from the client side.

 

Tools to Optimize Load Times

Angular offers a number of tools and techniques that can help reduce an application’s load time and monitor its performance over time, especially once an application has scaled enough to handle multiple heavy computations. Techniques that can decrease an application’s initial load time and speed up page navigation include Ahead-of-Time (AoT) compilation, code-splitting and preloading modules. We’ll discuss each of these in more detail.

Ahead-of-Time Compilation

There are two main ways to compile an Angular application: Just-in-Time (JiT) compilation, which compiles the application in the web browser at runtime, and Ahead-of-Time (AoT) compilation, which, as the name states, compiles the application at build time. The AoT compiler converts the HTML and TypeScript code during the build process, before the web browser downloads it.

AoT speeds up rendering by considerably reducing the time the application takes to bootstrap: the web browser downloads executable code and can render the application immediately, without waiting for it to compile first. Moreover, the pre-compiled code reduces the number of asynchronous requests to external resources by inlining HTML templates and external CSS into the application, eliminating separate AJAX requests for those files. The result is a smoother, faster user experience.
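Since Angular 9, AoT compilation is the default for both ng serve and ng build. In older CLI projects it can be enabled explicitly in angular.json; the project name below is illustrative:

```json
{
  "projects": {
    "my-app": {
      "architect": {
        "build": {
          "options": {
            "aot": true
          }
        }
      }
    }
  }
}
```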

 

Code-splitting

In short, code-splitting separates the application’s JavaScript into smaller bundles in a way that does not put the application’s features at risk, keeping the main JavaScript bundle small during initial load. Code-splitting can be applied at different levels within the application, such as through entry points, dynamically loaded modules, and shared code extracted with webpack’s SplitChunksPlugin, which also prevents code duplication.

There are two main approaches to code-splitting in an Angular app: component-level code-splitting and route-level code-splitting. The main difference between the two is that component-level code-splitting loads individual components lazily, even without a route navigation, while route-level code-splitting loads individual routes lazily. In either case, both approaches can be evaluated against the app’s Time to Interactive (TTI). TTI is a great performance indicator to compare against since it measures how long an application takes to become responsive, that is, how long it takes to load to the point where the user can interact with it.
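Route-level code-splitting is typically configured with the router’s loadChildren and a dynamic import(), so each feature gets its own bundle. The module path and class names below are hypothetical:

```typescript
import { Routes } from '@angular/router';
import { HomeComponent } from './home/home.component'; // hypothetical eager component

const routes: Routes = [
  // Eagerly loaded: compiled into the main bundle.
  { path: '', component: HomeComponent },
  // Lazily loaded: the bundler emits a separate chunk that the browser
  // fetches only when the user navigates to /reports.
  {
    path: 'reports',
    loadChildren: () =>
      import('./reports/reports.module').then(m => m.ReportsModule),
  },
];
```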

 

Preloading Modules

Preloading modules is a technique in Angular applications that allows modules to be loaded as soon as possible according to established rules. Modules can be preloaded all at once, when a particular event occurs, or selectively, depending on the circumstances. Developers can also measure how long a module takes to load and weigh the value of a given preloading strategy. Preloading in Angular is quite similar to lazy loading, except that the application’s lazy-loadable modules are loaded right after all the eagerly loaded modules have finished loading. This eliminates the potential latency when the user navigates to a lazy-loaded module, while still benefiting from a quicker initial load because the initial modules are loaded first.

Angular’s default preloading strategies are PreloadAllModules and NoPreloading. The former preloads all lazy-loadable modules, while the latter disables preloading entirely. With PreloadAllModules, an application with a large number of modules could face a bottleneck. That is when a custom preloading strategy can be beneficial.
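Enabling a built-in strategy is a one-line change when registering the router. A sketch, assuming the routes are defined in a separate file:

```typescript
import { NgModule } from '@angular/core';
import { RouterModule, PreloadAllModules } from '@angular/router';
import { routes } from './app.routes'; // hypothetical routes file

@NgModule({
  imports: [
    // PreloadAllModules fetches every lazy-loadable bundle in the background
    // after the initial navigation; swap in NoPreloading to disable preloading.
    RouterModule.forRoot(routes, { preloadingStrategy: PreloadAllModules }),
  ],
  exports: [RouterModule],
})
export class AppRoutingModule {}
```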

A custom preloading strategy makes the most sense in an enterprise scenario. For example, developers could preload the most resource-intensive modules first, ahead of those that are cheaper to load. The moment at which modules are preloaded also plays an important role in reducing load times.
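A minimal, dependency-free sketch of a custom strategy is shown below. In a real app the class would implement PreloadingStrategy from '@angular/router' and preload() would return an Observable; here a Promise stands in to keep the sketch self-contained, and the data.preload flag is a hypothetical convention for opting routes in:

```typescript
// Stand-in for Angular's Route type; real routes carry arbitrary
// metadata in `data`, which we use to flag routes for preloading.
interface RouteLike {
  path?: string;
  data?: { preload?: boolean };
}

class SelectivePreloadStrategy {
  preloaded: string[] = [];

  // `load` is the callback the router hands the strategy; invoking it
  // triggers the download of the lazy module's bundle.
  preload(route: RouteLike, load: () => Promise<unknown>): Promise<unknown> {
    if (route.data?.preload) {
      this.preloaded.push(route.path ?? '');
      return load();
    }
    return Promise.resolve(null); // skip: the bundle stays lazy
  }
}

// Demo: only the route flagged with data.preload is fetched ahead of time.
const strategy = new SelectivePreloadStrategy();
strategy.preload({ path: 'reports', data: { preload: true } }, () =>
  Promise.resolve('reports bundle'));
strategy.preload({ path: 'admin' }, () => Promise.resolve('admin bundle'));
console.log(strategy.preloaded); // → ['reports']
```

The same idea extends naturally to the "most expensive first" approach: the route metadata could carry a cost or priority value instead of a boolean, and the strategy could order or delay the load() calls accordingly.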

 

Load Testing your Angular Applications with LoadView

LoadView offers a holistic way to work around the limitations of HTTP-monitoring tools and strengthens the toolset Angular developers have today to control, monitor and optimize their applications on the client side. LoadView is a cloud-based load-testing platform that stress tests websites, web applications, and APIs by simulating thousands of concurrent connections in real time, helping to identify bottlenecks and verify overall performance.

After creating an account, developers can test their websites and web applications by creating a device, which stores the website or application to be tested. By choosing the Website option, Angular developers can test the initial load time of their application’s landing or login page by configuring a scenario in which thousands of users concurrently try to access the page. By choosing the Web Application option, they can instead script and load test specific use cases of their application: filling out a form, navigating through in-application routes, sorting data loaded from the server and, in general, measuring the TTI of their app. LoadView lets users customize the load test type in three different ways, along with an execution plan that defines how many connections to establish over a given period. In addition, LoadView goes a step further by letting users arrange the geographical distribution of the virtual users connecting to the website.

Finally, LoadView can produce full, in-depth reports of a simulation’s results. It can show a graphical representation of the scenario’s execution plan for establishing virtual user connections, the average response time per user, and the number of errors per session that occurred while the scenario ran. These charts and performance data make it possible to drill into any particular moment of the simulation and gain insight into the load time of every element rendered on the page. This is extremely beneficial for Angular applications, since it allows developers to act on specific elements that may be delaying an app’s TTI. In this sense, LoadView fills the gap in accurately testing and monitoring JavaScript events to measure client-side load time, making it a powerful asset for front-end developers to have under their belt.

 

Conclusion: Load Testing Web Applications Written in Angular

Current demands raise the performance bar for modern web apps. Today’s DevOps teams must keep in mind that application response times and TTI are pivotal factors in whether new applications can compete in the market. More often than not, Angular developers must continuously evaluate load-time-reducing techniques such as AoT compilation, code-splitting and preloading strategies when designing applications, and then go a step further: continuously test and monitor client-side operations and metrics with LoadView to guarantee the best user experience and application performance.

For more information, visit the LoadView website and sign up for the free trial.  You will receive $20 in load testing credits to help you get started.