At some point every application encounters performance problems, and you will have to think about scaling techniques. The focus here is on improving performance on the backend. We will give examples with a NodeJS application, but the principles are similar in all languages.
So first we should explain what application performance is. In simple words, it is a measurement of the application's abilities, for example how much time the server takes to give a response to the client. The two main factors we should have in mind are, as mentioned, the response time the server needs to complete a specific set of operations, such as logging in a user or searching data, and the resources the application consumes to do these tasks. Here are some things that, done the correct way, can improve performance.
Caching is the first thing that comes to mind when we talk about improving response time. The process is simple: we store data in a temporary storage called a cache, which is smaller but has faster access times. The application logic first checks the cache before hitting the database. If the data is there, it is returned to the client; if not, the application reads it from the database and then saves a copy in the cache. Caching can be done at different levels. We can use application caching with in-memory stores like Redis. Web servers can also cache responses and return them without contacting the application server a second time. To cache static files such as images, HTML, and CSS, we can use CDN caching. Databases and ORMs also include some level of caching by default in order to boost performance.
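The check-the-cache-first flow described above can be sketched in a few lines. This is a minimal cache-aside example using an in-memory Map standing in for a store like Redis; `fetchUserFromDb` is a hypothetical placeholder for a real database query.

```javascript
// In-memory cache standing in for Redis or a similar store.
const cache = new Map();

// Hypothetical stand-in for a slow database read.
async function fetchUserFromDb(id) {
  return { id, name: `user-${id}` };
}

async function getUser(id) {
  if (cache.has(id)) {
    return cache.get(id); // cache hit: no database round trip
  }
  const user = await fetchUserFromDb(id); // cache miss: read from the database...
  cache.set(id, user);                    // ...and save a copy in the cache
  return user;
}
```

The first call for a given id pays the database cost; every later call is served straight from memory. A real setup would also evict stale entries (for example with a TTL), which is omitted here.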
If our application handles a large amount of incoming requests, we may want to distribute the traffic to balance the connections. This approach is called load balancing. For example, to scale a NodeJS app we can use the built-in cluster module, which spawns new processes called workers that run simultaneously and connect to a master process. In that way, the server behaves like one multithreaded server.
If our app makes a lot of internal calls that do not depend on each other, we can group them and run them in parallel. This can reduce the response time a lot. Most of the time we should avoid writing synchronous code; there are a lot of components that can potentially lock up our application. We should always prefer asynchronous APIs, especially in performance-critical functions.
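With async code, grouping independent calls is a one-liner using Promise.all. In this sketch the three lookups are hypothetical placeholders for real service or database calls:

```javascript
// Hypothetical independent lookups (e.g. separate services or queries).
async function getProfile(id)  { return { id }; }
async function getOrders(id)   { return [`order-for-${id}`]; }
async function getSettings(id) { return { theme: 'dark' }; }

async function getDashboard(id) {
  // All three calls start at once; total time is roughly the
  // slowest call, not the sum of all three.
  const [profile, orders, settings] = await Promise.all([
    getProfile(id),
    getOrders(id),
    getSettings(id),
  ]);
  return { profile, orders, settings };
}
```

Awaiting each call one after another would instead add the latencies together, which is exactly the sequential pattern we want to avoid.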
We should also try to decrease the number of HTTP requests where we can. One solution is to group several requests into one and cache the response in order to use it later.
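One way to group requests is a small batching helper: lookups made in the same tick are coalesced into a single backend call. Everything here is a sketch; `fetchUsersByIds` is a hypothetical bulk endpoint.

```javascript
let dbCalls = 0;           // counts backend calls, to show the batching effect
let pending = new Map();   // id -> resolve function for that id

// Hypothetical bulk endpoint: one call fetches many users at once.
async function fetchUsersByIds(ids) {
  dbCalls++;
  return ids.map((id) => ({ id }));
}

function getUserBatched(id) {
  return new Promise((resolve) => {
    if (pending.size === 0) {
      // First request of this batch: flush everything on the next
      // microtask with a single backend call.
      queueMicrotask(async () => {
        const batch = pending;
        pending = new Map();
        const users = await fetchUsersByIds([...batch.keys()]);
        for (const user of users) batch.get(user.id)(user);
      });
    }
    pending.set(id, resolve);
  });
}
```

Callers keep a simple one-item API, but however many lookups are issued together, only one request actually reaches the backend. (For simplicity this sketch assumes one caller per id per batch.)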
A bad query will make responding with data very slow, so we should always try to write well-optimized queries that fetch no unnecessary information. The real problem appears when there are millions of rows of data: without indexing, querying through such a database will be extremely slow.
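What an index actually buys us can be illustrated in plain JavaScript: an unindexed query is a full scan over every row, while an index (here a Map keyed by email) jumps straight to the matching row. The data is made up for the illustration.

```javascript
// A toy "table" of 100,000 rows.
const rows = [];
for (let i = 0; i < 100000; i++) {
  rows.push({ id: i, email: `user${i}@example.com` });
}

// Without an index: O(n) — every row may be touched.
function findByEmailScan(email) {
  return rows.find((r) => r.email === email);
}

// With an index: O(1) lookup, at the cost of extra memory and
// upkeep on every write (exactly the trade-off a database index makes).
const emailIndex = new Map(rows.map((r) => [r.email, r]));
function findByEmailIndexed(email) {
  return emailIndex.get(email);
}
```

In a real database the same idea would be something like `CREATE INDEX idx_users_email ON users (email);`, applied to the columns that appear in WHERE clauses.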
WebSockets are an alternative to HTTP communication on the web. They provide a long-lived channel between the client and the server. Once the connection is open, it is kept alive, offering a fast, persistent connection. This approach is great for real-time and long-lived communication.
To sum up, there are a lot of things we can do to increase the performance of our app. We have to choose depending on the specific case and on where exactly the bottleneck in the application is.