How We Test

Almost every running gear review you'll find online was written by someone who had the product for a week, did a couple of runs, and wrote up their first impressions. First impressions mean something — but they capture only a fraction of what matters over a season of actual use.

At Stride Tested, we test longer. Here's what that actually looks like.


Minimum testing periods

  • Trail shoes: Minimum 60–80km on varied terrain before we write a word. We test across multiple surface types — grit, bog, scree, technical rock — because a shoe that grips on one surface can be useless on another. We document how the outsole and upper change over the test period.
  • Road shoes: At least 80km, including runs at different paces and distances. Daily trainers get used as daily trainers, not just for easy efforts. Race-day shoes get raced in, or at least run at race effort.
  • GPS watches: 6–8 weeks of daily use across a variety of conditions — good satellite coverage and poor. We compare GPS track accuracy against known routes and check heart rate data against a chest strap. We run the battery down fully rather than taking the manufacturer's claimed life at face value.
  • Waterproof jackets: Worn in actual rain, not a shower test. We assess breathability on uphill efforts, not just waterproofing in static conditions. Tested until we have a confident verdict on seal durability at the seams and zips.
  • Nutrition and gels: Tested on long runs and race efforts of at least 2 hours, because gut response at high intensity is completely different from gut response at rest. We note how products perform in both cold and warm conditions.
  • Running packs and vests: Loaded and tested on long runs. We assess fit both fully loaded and empty. We check bounce, chafe, access, and whether the reservoir or soft flask system actually works one-handed on the move.

What we score

We assign scores on a 1–5 scale. Each review covers performance, durability, value, and who should and shouldn't buy the product.

Performance — Does it do the main job well? A trail shoe needs grip on the terrain it's designed for. A GPS watch needs to hold signal where you need it. A waterproof needs to actually be waterproof. We test this in real conditions, not lab simulations.

Durability and long-term condition — What does it look like after 60km or six weeks of real use? Has anything changed — outsole wear, upper breakdown, delamination, seam failure? We document this with photos where relevant.

Value assessment — Is the price justified compared with similar products above and below it? We'll name the specific cheaper option that gets close to the same result, and explain what you actually get when you spend more.

Who it's for — No piece of kit is right for everyone. A minimal fell shoe is excellent for some runners and a disaster for others. We try to be honest about fit, use case, and who should look elsewhere.


How we acquire products

We buy almost everything we test. When a company sends us a product, we make that clear in the review — and we make no promises about coverage or tone when they do. No freebies get preferential treatment. We don't do paid reviews or advertising. We earn a commission if you buy something through a link — see the Affiliate Disclosure page for exactly how that works.


A note on testing in Northern England

We're not a commercial testing facility, and we don't have a lab. What we have is the Peak District, the Pennines, and the roads and canals of Sheffield and Leeds — and the kind of weather that tests gear properly whether you want it to or not.

Rob tests in the Dark Peak and on the fell race circuit. Dan tests on road and towpath, including GPS-challenging routes through valleys. Kat tests on fell terrain and winter roads. Between us we cover most of what UK runners actually encounter. That's the point.