TLDR: We've introduced full-size screenshot viewing for better test monitoring, drastically improved test launch speeds to 20-30 seconds, enhanced crawler speed and capability, and accelerated AI script analysis by 2.5-3x with an upgrade to o3-mini.
Sometimes it's the small things that make the load testing experience better!
We've rolled out an update that lets you click on screenshots in the live monitor and on the report page to view a full-size image - something that, to our surprise, was requested quite often!
We've also improved how quickly tests launch on our cloud (where no installation is required). Load runners now start in 20-30 seconds, so your test is fully running within half a minute.
We've also made the crawler faster and increased its crawl depth - so it'll find more pages, faster! This primarily improves the Wizard when creating tests.
And finally, AI analysis and AI-powered improvements to test scripts should be around 2.5-3x faster, as we've switched from GPT-4o to o3-mini.