
Introduction
Express.js is one of the most widely used web frameworks in the Node.js ecosystem, powering everything from lightweight REST APIs to full-featured web applications and backend services. Its simplicity and flexibility are major strengths, but they also mean performance can vary dramatically depending on middleware choices, database access patterns, authentication design, and how your application handles concurrency.
That’s why load testing Express.js applications is essential. A service that feels fast with a few users can quickly degrade under real-world traffic if routes are blocking the event loop, database queries are inefficient, or authentication middleware adds too much overhead. With proper load testing, performance testing, and stress testing, you can identify these issues before they affect production users.
In this guide, you’ll learn how to build realistic Express.js load tests using LoadForge and Locust. We’ll cover basic route testing, authenticated API workflows, e-commerce style transactions, and file upload scenarios. Along the way, we’ll show how LoadForge’s cloud-based infrastructure, distributed testing, real-time reporting, global test locations, and CI/CD integration can help you validate Express.js performance at scale.
Prerequisites
Before you start load testing your Express.js application, make sure you have:
- A running Express.js app in a development, staging, or pre-production environment
- The base URL for the application or API, such as https://staging-api.example.com
- Test user accounts or a way to generate them
- Knowledge of your key Express.js routes, such as:
GET /health
POST /api/v1/auth/login
GET /api/v1/products
POST /api/v1/orders
POST /api/v1/uploads/avatar
- Sample request payloads and expected response codes
- Access to LoadForge to run distributed load tests and inspect real-time metrics
It also helps to know:
- Which routes are public versus authenticated
- Which endpoints are CPU-heavy, database-heavy, or involve external APIs
- Any rate limiting, caching, or session behavior in your Express.js stack
- Whether you use JWT, cookie-based sessions, OAuth, or API keys
Understanding Express.js Under Load
Express.js runs on Node.js, which uses a single-threaded event loop for handling I/O efficiently. This makes Express.js highly capable for many web workloads, but it also introduces some important performance characteristics during load testing.
Key Express.js performance behaviors
Event loop sensitivity
If your Express.js route handlers perform heavy synchronous work, such as:
- large JSON serialization
- image processing
- CPU-intensive validation
- synchronous filesystem access
then the event loop can get blocked. Under load, this causes rising response times across all routes, not just the affected endpoint.
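To make the failure mode concrete, here is a rough analogy in Python's asyncio (not Express code): one handler doing synchronous work stalls an unrelated handler that shares the same event loop, just as a blocking Express route delays every other request on the Node.js process.

```python
import asyncio
import time

# Not Express code: a Python asyncio analogy for the Node.js event loop.
# One handler doing synchronous work stalls every other request sharing
# the loop, which is the failure mode described above.
async def blocking_handler():
    time.sleep(0.5)  # synchronous work: blocks the whole event loop

async def fast_handler():
    await asyncio.sleep(0)  # should complete almost instantly

async def main():
    start = time.perf_counter()
    await asyncio.gather(blocking_handler(), fast_handler())
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"both handlers finished after {elapsed:.2f}s")
```

Even though fast_handler does no work, it cannot finish until the blocking handler releases the loop, so both take about half a second.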
Middleware overhead
Express applications often stack many middleware functions:
- request logging
- body parsing
- CORS
- authentication
- session handling
- rate limiting
- validation
- error handlers
Each layer adds processing time. A route that looks simple may actually pass through multiple expensive middleware chains before reaching your controller.
Database bottlenecks
In many Express.js apps, the framework itself is not the bottleneck. Instead, performance issues come from:
- slow SQL queries
- unindexed MongoDB lookups
- connection pool exhaustion
- N+1 query problems
- long-running transactions
Load testing helps reveal whether your Express.js app is waiting on downstream systems rather than struggling with HTTP handling alone.
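To see why patterns like N+1 queries matter at load-test scale, a quick back-of-the-envelope calculation helps (all numbers here are illustrative, not measurements):

```python
# Rough math on why N+1 queries hurt under load (numbers are illustrative):
page_size = 20
requests_per_second = 200

# One list query plus one query per returned row
n_plus_1_queries = (1 + page_size) * requests_per_second

# One list query plus one batched IN (...) lookup
batched_queries = 2 * requests_per_second

print(n_plus_1_queries, "vs", batched_queries, "queries/sec")
# 4200 vs 400 queries/sec
```

A tenfold difference in database traffic for the same user-facing throughput is often invisible at low concurrency and only shows up when a load test pushes requests per second high enough to saturate the connection pool.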
Session and authentication scaling
Express.js apps frequently use:
- JWT bearer tokens
- express-session with Redis or database-backed session stores
- Passport.js strategies
- CSRF protection for form-based apps
These can become bottlenecks under concurrent traffic, especially if every request triggers session store lookups or token verification overhead.
Payload parsing and uploads
Routes using express.json(), multer, or multipart parsing can consume significant memory and CPU under high concurrency. File uploads and large JSON bodies are especially important to test realistically.
Common bottlenecks in Express.js applications
When load testing Express.js, watch for:
- increasing p95 and p99 response times under moderate concurrency
- high latency on login or session endpoints
- route-specific slowdowns caused by database queries
- memory pressure from large request bodies or responses
- 429 responses from rate limiting middleware
- 502/504 errors behind reverse proxies like Nginx or API gateways
- Node.js worker saturation if using PM2 or cluster mode
This is exactly where LoadForge helps. You can simulate realistic user patterns from multiple global test locations, monitor failures in real time, and compare behavior across different traffic levels.
Writing Your First Load Test
Let’s start with a basic Express.js load test that checks public routes typically found in a web app or API.
This first script simulates users visiting a homepage, checking health status, and browsing a product listing endpoint.
from locust import HttpUser, task, between

class ExpressBasicUser(HttpUser):
    wait_time = between(1, 3)

    @task(3)
    def homepage(self):
        self.client.get(
            "/",
            headers={
                "Accept": "text/html,application/xhtml+xml"
            },
            name="GET /"
        )

    @task(2)
    def health_check(self):
        self.client.get(
            "/health",
            headers={
                "Accept": "application/json"
            },
            name="GET /health"
        )

    @task(5)
    def browse_products(self):
        self.client.get(
            "/api/v1/products?category=electronics&page=1&limit=20&sort=popular",
            headers={
                "Accept": "application/json"
            },
            name="GET /api/v1/products"
        )

What this test does
This script models a simple traffic mix:
- homepage visits
- health endpoint checks
- product browsing requests
The @task weights reflect relative frequency. Product browsing happens most often, which is common for many Express.js applications serving catalog or listing pages.
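The weights translate directly into a traffic mix. A quick way to sanity-check the proportions before running a test:

```python
# Locust picks each @task in proportion to its weight. For the weights
# used above (homepage=3, health=2, products=5), the expected mix is:
weights = {"GET /": 3, "GET /health": 2, "GET /api/v1/products": 5}

total = sum(weights.values())
mix = {name: w / total for name, w in weights.items()}

for name, share in mix.items():
    print(f"{name}: {share:.0%}")
# prints 30%, 20%, and 50% respectively
```

Adjust the weights until the mix matches what your access logs or analytics show for real traffic.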
Why this matters for Express.js
This basic load test helps establish baseline performance for:
- static or server-rendered pages
- lightweight JSON endpoints
- routing and middleware overhead
- reverse proxy and cache effectiveness
If these basic routes are already slow under moderate traffic, the problem may be in:
- too much middleware
- expensive template rendering
- missing response compression tuning
- poor cache headers
- inefficient database-backed listing queries
Running this in LoadForge
In LoadForge, paste this Locust script into a new test, configure your host (for example, https://staging.example.com), then set a user count and spawn rate. Start with a moderate test like:
- 50 users
- spawn rate of 5 users/sec
- duration of 5 to 10 minutes
Then scale upward using LoadForge’s distributed testing to see how the Express.js app behaves under heavier production-like load.
Advanced Load Testing Scenarios
Basic route testing is useful, but realistic Express.js load testing should include authentication flows, stateful API interactions, and high-cost operations. Below are several advanced scenarios.
Authenticated JWT workflow for an Express.js API
A common Express.js pattern is JWT-based authentication. Many apps expose a login route that returns an access token, which is then used for protected endpoints.
This script logs in once per simulated user and exercises authenticated profile and order history endpoints.
from locust import HttpUser, task, between
import random

class ExpressAuthenticatedUser(HttpUser):
    wait_time = between(1, 2)
    token = None

    def on_start(self):
        email = f"loadtest{random.randint(1, 50)}@example.com"
        password = "TestPassword123!"
        with self.client.post(
            "/api/v1/auth/login",
            json={
                "email": email,
                "password": password
            },
            headers={
                "Content-Type": "application/json",
                "Accept": "application/json"
            },
            name="POST /api/v1/auth/login",
            catch_response=True
        ) as response:
            if response.status_code == 200:
                data = response.json()
                self.token = data.get("accessToken")
                if not self.token:
                    response.failure("Login succeeded but no accessToken returned")
            else:
                response.failure(f"Login failed: {response.status_code}")

    def auth_headers(self):
        return {
            "Authorization": f"Bearer {self.token}",
            "Accept": "application/json"
        }

    @task(3)
    def get_profile(self):
        if self.token:
            self.client.get(
                "/api/v1/users/me",
                headers=self.auth_headers(),
                name="GET /api/v1/users/me"
            )

    @task(2)
    def get_order_history(self):
        if self.token:
            self.client.get(
                "/api/v1/orders?status=completed&page=1&limit=10",
                headers=self.auth_headers(),
                name="GET /api/v1/orders"
            )

    @task(1)
    def refresh_session_data(self):
        if self.token:
            self.client.get(
                "/api/v1/notifications?unreadOnly=true",
                headers=self.auth_headers(),
                name="GET /api/v1/notifications"
            )

What this reveals
This scenario is valuable for performance testing Express.js authentication and protected APIs because it measures:
- login throughput and latency
- JWT creation overhead
- authentication middleware cost
- database access for user profile and order history
- performance of personalized endpoints under concurrency
If POST /api/v1/auth/login slows down quickly, inspect:
- password hashing cost, such as bcrypt rounds
- database lookup speed
- rate-limiting middleware behavior
- token signing overhead
If protected routes degrade, check:
- per-request token validation
- repeated user/session lookups
- inefficient joins or document population
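One practical wrinkle in long-running authenticated tests: if access tokens expire mid-run, you start measuring 401 storms instead of real performance. Assuming your API issues standard JWTs with an exp claim, a small helper (names are ours) can decide when a simulated user should log in again:

```python
import base64
import json
import time

def jwt_expires_soon(token, margin_seconds=60):
    """Read the exp claim from a JWT payload without verifying the signature.

    Test-side bookkeeping only (deciding when to re-login in a load test);
    never use unverified decoding for real validation.
    """
    payload_b64 = token.split(".")[1]
    # JWTs use unpadded base64url; restore padding before decoding
    payload_b64 += "=" * (-len(payload_b64) % 4)
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    exp = payload.get("exp")
    if exp is None:
        return False  # no expiry claim; assume long-lived
    return exp - time.time() < margin_seconds
```

In a Locust user, call this at the top of each task and repeat the on_start login when it returns True.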
E-commerce checkout flow in Express.js
Now let’s simulate a more realistic multi-step transaction. This is especially useful for Express.js APIs backing online stores or SaaS billing flows.
The user will:
- browse products
- view product details
- add an item to cart
- review the cart
- submit an order
from locust import HttpUser, task, between, SequentialTaskSet
import random

class CheckoutFlow(SequentialTaskSet):
    def on_start(self):
        login_response = self.client.post(
            "/api/v1/auth/login",
            json={
                "email": "shopper@example.com",
                "password": "TestPassword123!"
            },
            headers={
                "Content-Type": "application/json",
                "Accept": "application/json"
            },
            name="POST /api/v1/auth/login"
        )
        self.token = login_response.json().get("accessToken")
        self.product_id = None
        self.cart_id = None

    def auth_headers(self):
        return {
            "Authorization": f"Bearer {self.token}",
            "Accept": "application/json",
            "Content-Type": "application/json"
        }

    @task
    def browse_catalog(self):
        response = self.client.get(
            "/api/v1/products?category=laptops&page=1&limit=12",
            headers={"Accept": "application/json"},
            name="GET /api/v1/products"
        )
        products = response.json().get("items", [])
        if products:
            self.product_id = random.choice(products)["id"]

    @task
    def view_product(self):
        if self.product_id:
            self.client.get(
                f"/api/v1/products/{self.product_id}",
                headers={"Accept": "application/json"},
                name="GET /api/v1/products/:id"
            )

    @task
    def add_to_cart(self):
        if self.product_id:
            response = self.client.post(
                "/api/v1/cart/items",
                json={
                    "productId": self.product_id,
                    "quantity": 1
                },
                headers=self.auth_headers(),
                name="POST /api/v1/cart/items"
            )
            cart = response.json()
            self.cart_id = cart.get("id")

    @task
    def view_cart(self):
        if self.cart_id:
            self.client.get(
                f"/api/v1/cart/{self.cart_id}",
                headers=self.auth_headers(),
                name="GET /api/v1/cart/:id"
            )

    @task
    def checkout(self):
        if self.cart_id:
            self.client.post(
                "/api/v1/orders",
                json={
                    "cartId": self.cart_id,
                    "shippingAddress": {
                        "fullName": "Load Test User",
                        "line1": "123 Test Street",
                        "city": "Austin",
                        "state": "TX",
                        "postalCode": "78701",
                        "country": "US"
                    },
                    "paymentMethod": {
                        "type": "card",
                        "token": "tok_visa_test_4242"
                    }
                },
                headers=self.auth_headers(),
                name="POST /api/v1/orders"
            )
        self.interrupt()

class ExpressCheckoutUser(HttpUser):
    wait_time = between(2, 5)
    tasks = [CheckoutFlow]

Why this scenario matters
This is a strong Express.js stress testing scenario because it exercises:
- authenticated traffic
- product lookup queries
- cart persistence
- transactional order creation
- middleware-heavy routes
- database writes and validations
This often exposes bottlenecks in:
- cart/session storage
- inventory checks
- order transaction locking
- external payment provider calls
- JSON validation libraries
For best results, stub or sandbox external payment systems so your load test focuses on the Express.js app rather than third-party limitations.
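It also helps to make each checkout unique and traceable so repeated runs don't collide on duplicate orders or hit real payment rails. The sketch below mirrors the order payload above; the idempotencyKey field and PAYMENT_TEST_TOKEN variable are assumptions about your API and environment, not LoadForge or Express features:

```python
import os
import uuid

# Keep the payment token sandboxed and make each order unique.
# PAYMENT_TEST_TOKEN and idempotencyKey are assumptions about your
# payment stub and API contract.
PAYMENT_TOKEN = os.getenv("PAYMENT_TEST_TOKEN", "tok_visa_test_4242")

def build_order_payload(cart_id):
    return {
        "cartId": cart_id,
        "idempotencyKey": str(uuid.uuid4()),  # avoids duplicate orders on retries
        "shippingAddress": {
            "fullName": "Load Test User",
            "line1": "123 Test Street",
            "city": "Austin",
            "state": "TX",
            "postalCode": "78701",
            "country": "US",
        },
        "paymentMethod": {"type": "card", "token": PAYMENT_TOKEN},
    }
```

In the checkout task, replace the inline json= dictionary with build_order_payload(self.cart_id) so every simulated order carries a fresh key.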
File upload testing for Express.js with multer
Many Express.js apps support uploads for avatars, documents, or media. These routes often use multer or similar middleware and can become memory- or CPU-intensive.
This example simulates authenticated avatar uploads.
from locust import HttpUser, task, between
from io import BytesIO

class ExpressFileUploadUser(HttpUser):
    wait_time = between(3, 6)
    token = None

    def on_start(self):
        response = self.client.post(
            "/api/v1/auth/login",
            json={
                "email": "uploader@example.com",
                "password": "TestPassword123!"
            },
            headers={
                "Content-Type": "application/json",
                "Accept": "application/json"
            },
            name="POST /api/v1/auth/login"
        )
        self.token = response.json().get("accessToken")

    @task
    def upload_avatar(self):
        if not self.token:
            return
        fake_image = BytesIO(b"\x89PNG\r\n\x1a\n" + b"0" * 2048)
        files = {
            "avatar": ("avatar.png", fake_image, "image/png")
        }
        self.client.post(
            "/api/v1/uploads/avatar",
            files=files,
            headers={
                "Authorization": f"Bearer {self.token}",
                "Accept": "application/json"
            },
            name="POST /api/v1/uploads/avatar"
        )

What this tests
This file upload scenario helps evaluate:
- multipart form parsing performance
- request body memory usage
- storage backend latency
- upload size limits
- authentication overhead on upload routes
For Express.js apps, upload endpoints can behave very differently from standard JSON APIs. If latency spikes or errors appear, review:
- multer storage strategy
- temporary disk I/O
- object storage latency
- file validation logic
- body size and timeout settings
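To probe size limits and memory behavior properly, vary the upload size rather than always sending the fixed 2 KB blob from the script above. A sketch (the size tiers are examples; a real image validator would reject these filler bytes, so adapt to your middleware):

```python
import random
from io import BytesIO

PNG_MAGIC = b"\x89PNG\r\n\x1a\n"

def fake_png(size_kb):
    """Build an in-memory blob with a PNG magic header plus filler bytes.

    Enough to pass naive content-type/extension checks; strict image
    validation middleware will reject it.
    """
    body = PNG_MAGIC + b"\x00" * (size_kb * 1024 - len(PNG_MAGIC))
    return BytesIO(body)

def random_avatar():
    # Mix small, typical, and near-limit sizes (the 2048 KB limit is an assumption)
    size_kb = random.choice([16, 128, 512, 2048])
    return ("avatar.png", fake_png(size_kb), "image/png")
```

In the upload task, swap the files dictionary value for random_avatar() so each request exercises a different point on the size curve.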
Analyzing Your Results
After running your Express.js load testing scenarios in LoadForge, focus on more than just average response time. Real performance issues often appear in tail latency, error rates, and throughput saturation.
Key metrics to review
Response time percentiles
Look at:
- p50 for median user experience
- p95 for typical slow requests
- p99 for worst-case latency under load
For Express.js, a route may look fine on average while p95 and p99 climb sharply due to event loop blocking or database contention.
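LoadForge reports these percentiles directly in its real-time dashboards. If you also export raw latency samples for offline analysis, the Python standard library can compute them:

```python
import statistics

def latency_percentiles(samples_ms):
    """Return p50/p95/p99 from a list of latency samples in milliseconds."""
    qs = statistics.quantiles(samples_ms, n=100)  # 99 percentile cut points
    return {"p50": qs[49], "p95": qs[94], "p99": qs[98]}

# Example: mostly-fast samples with a slow tail hiding behind a fine average
samples = [20] * 90 + [200] * 9 + [1500]
print(latency_percentiles(samples))
```

Note how a single 1500 ms outlier barely moves the median but dominates p99, which is exactly the event-loop-blocking signature described above.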
Requests per second
Track total throughput and per-endpoint throughput. If requests per second flatten while response times rise, your Express.js app may be reaching capacity.
Error rate
Watch for:
- 401 or 403 from auth misconfiguration
- 429 from rate limiting middleware
- 500 from route exceptions
- 502/504 from upstream proxy failures
A small but increasing 500 rate under load often indicates hidden concurrency or resource issues.
Endpoint-specific behavior
Compare routes like:
POST /api/v1/auth/login
GET /api/v1/products
POST /api/v1/orders
POST /api/v1/uploads/avatar
This helps isolate whether the bottleneck is in authentication, reads, writes, or upload handling.
How to interpret common Express.js patterns
Fast reads, slow writes
This usually points to database write contention, transaction overhead, or validation costs.
Login degrades before other routes
Likely causes include:
- expensive password hashing
- session store bottlenecks
- token signing overhead
- user table query inefficiency
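Password hashing deserves emphasis because it is CPU-bound by design. bcrypt is not in the Python standard library, so this sketch uses PBKDF2 to illustrate the same trade-off: higher work factors mean proportionally more CPU per login, which directly caps login throughput per Node.js process.

```python
import hashlib
import time

def time_hash(iterations):
    """Time one PBKDF2 derivation; stands in for bcrypt's cost factor."""
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"TestPassword123!", b"salt" * 4, iterations)
    return time.perf_counter() - start

# More iterations mean proportionally more CPU per login attempt
for n in (10_000, 100_000, 1_000_000):
    print(f"{n:>9} iterations: {time_hash(n) * 1000:.1f} ms")
```

If your load test shows login latency scaling with concurrency while other routes stay flat, the hash work factor (bcrypt rounds, in most Express stacks) is a prime suspect; tune it against your security requirements rather than simply lowering it.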
All routes slow down together
This often suggests:
- event loop blocking
- CPU saturation
- overloaded reverse proxy
- shared middleware overhead
Upload routes fail first
This may indicate:
- insufficient memory
- upload size limit issues
- slow disk or object storage
- worker timeout configuration problems
LoadForge’s real-time reporting makes it easier to spot these patterns as tests run, and its distributed testing model helps you validate whether performance differs by region or traffic source.
Performance Optimization Tips
Once your Express.js performance testing reveals bottlenecks, here are some practical optimizations to consider.
Minimize blocking work in route handlers
Avoid synchronous operations like:
- fs.readFileSync
- CPU-heavy loops
- large in-process transformations
Move expensive tasks to background workers or asynchronous services.
Review middleware order and necessity
Every Express.js request passes through middleware chains. Remove unnecessary middleware from hot paths and ensure expensive logic only runs where needed.
Optimize database access
For API routes under load:
- add missing indexes
- reduce query count per request
- paginate large result sets
- avoid over-fetching fields
- tune connection pools
In many cases, Express.js itself is fine and the database is the real bottleneck.
Cache where appropriate
Frequently requested routes like product listings, configuration endpoints, or public content may benefit from:
- in-memory caching
- Redis caching
- CDN edge caching
- proper Cache-Control headers
Tune JSON payload sizes
Large request and response bodies increase parsing and serialization overhead. Return only the fields clients need.
Scale Node.js correctly
If your Express.js app is CPU-constrained, consider:
- Node.js cluster mode
- PM2 with multiple workers
- container-based horizontal scaling
- autoscaling behind a load balancer
Then use LoadForge to validate that scaling actually improves throughput and latency.
Test from multiple regions
If your users are global, use LoadForge’s global test locations to understand whether latency is caused by application processing or geographic distance.
Common Pitfalls to Avoid
Load testing Express.js is straightforward, but several mistakes can make your results misleading.
Testing only /health
Health endpoints are useful, but they rarely reflect real application behavior. Include the routes that matter to users and revenue.
Ignoring authentication overhead
Many Express.js apps spend a significant amount of time in auth middleware. If you only test public endpoints, you’ll miss important bottlenecks.
Using unrealistic user behavior
Real users don’t hit the same endpoint in a tight loop with no delay. Use weighted tasks, think time, and multi-step flows.
Forgetting test data management
Order creation, uploads, and cart operations can pollute your environment. Use isolated test accounts and cleanup strategies.
Overloading downstream dependencies unintentionally
Your Express.js app may depend on:
- payment gateways
- email APIs
- search services
- cloud storage
Stub or sandbox these systems when needed so your load testing focuses on your application’s performance.
Not correlating app metrics with load test results
Load testing data is most useful when paired with server-side metrics such as:
- CPU usage
- memory usage
- event loop lag
- database query latency
- connection pool saturation
Running only one test size
A single 100-user test tells only part of the story. Run progressive tests:
- baseline load testing
- peak traffic performance testing
- stress testing beyond expected limits
This helps you identify both safe operating ranges and failure points.
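A progressive plan can also be scripted as explicit stages. Locust supports this through a custom LoadTestShape whose tick() returns the current target; the core logic is just a stage lookup, sketched here as a plain function with example values:

```python
# Stages: (duration_seconds, target_users). A function like this can back
# a custom Locust LoadTestShape's tick(), or simply document the plan you
# configure in LoadForge. All stage values are examples.
STAGES = [
    (300, 50),    # baseline: 5 minutes at 50 users
    (300, 200),   # peak: 5 minutes at 200 users
    (300, 500),   # stress: 5 minutes at 500 users
]

def target_users(elapsed_seconds):
    """Return the target user count for the elapsed test time, or None when done."""
    boundary = 0
    for duration, users in STAGES:
        boundary += duration
        if elapsed_seconds < boundary:
            return users
    return None  # all stages complete; stop the test
```

Comparing p95 latency and error rate across the three stages shows where throughput stops scaling and where outright failures begin.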
Skipping CI/CD performance validation
Express.js apps change quickly. Integrating LoadForge into CI/CD pipelines helps catch regressions before deployment, especially after middleware, ORM, or authentication changes.
Conclusion
Express.js is fast and flexible, but real-world performance depends on far more than the framework itself. Middleware, authentication, database access, file uploads, and transactional workflows all shape how your application behaves under load. With realistic load testing, performance testing, and stress testing, you can uncover bottlenecks early and improve both reliability and user experience.
Using LoadForge, you can run scalable Express.js load tests with Locust, simulate real user behavior, test from global locations, and analyze results with real-time reporting. Whether you’re validating a simple API, a session-heavy web app, or a high-traffic checkout flow, LoadForge gives you the tools to measure and improve performance with confidence.
If you’re ready to see how your Express.js app performs under real traffic, try LoadForge and start building your first test today.
LoadForge Team
LoadForge is a load and performance testing platform built on Locust. Our team has been shipping load tests against production systems since 2018, and we write these guides from real customer engagements.