A third of the online journeys investigated by UK web site testing specialist SciVisum experienced error rates of 3%. More than 10% were subject to wild inconsistencies in delivery speed. Yet these problems slipped below the radar of existing analytics.

A problem that affects, say, one in 100 random users on a particular journey is not reproducible by IT teams, and so frequently remains unresolved, said Deri Jones, CEO of SciVisum.

Invisible errors included session swaps, where two users saw each other's online sessions, and page-not-delivered errors, which don't show up in web logs.

Other faults that bypassed traditional analytics included jump-back errors, where users are forced back several pages but the fault goes undetected because the new page is valid, and cases where users find an empty shopping cart after adding items, which again would not appear in server or analytics logs.

Delivery speed also varied by more than 200% for almost a third of the web journeys tested, and one in 10 journeys varied by 300% over a seven-day period. Performance was most unreliable during the peak 8pm-10pm period.

SciVisum recommended that, as well as trying a spot of mystery shopping to back up their analytics, IT teams should judge web site performance on data collected from real user experience rather than simply on metrics gathered from internal servers and monitors.
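To make that recommendation concrete, the sketch below shows the kind of external, scripted "user journey" check the findings point to: it times each page as a visitor would see it and verifies from the user's side that an added item actually appears in the cart. The shop URL, paths and cart check are illustrative assumptions only; SciVisum's own tooling is not described in the report.

```python
import time
import requests

# Hypothetical journey: the base URL, the paths and the cart-content check
# below are illustrative assumptions, not details from the SciVisum study.
BASE = "https://shop.example.com"
JOURNEY = ["/", "/product/123", "/cart/add?item=123", "/cart"]


def run_journey(session: requests.Session) -> dict:
    """Walk the journey as a real user would, timing each page and
    flagging faults that never surface in server-side logs."""
    timings = {}
    resp = None
    for path in JOURNEY:
        start = time.perf_counter()
        resp = session.get(BASE + path, timeout=10)
        timings[path] = time.perf_counter() - start
        if resp.status_code != 200:
            # "Page not delivered" as experienced by the user.
            raise RuntimeError(f"{path}: page not delivered ({resp.status_code})")
    # Client-side check for the "empty cart after adding items" fault: the
    # server logged a successful 200 for every step, so only an external
    # scripted journey inspecting the final cart page can catch it.
    if resp is not None and "123" not in resp.text:
        raise RuntimeError("/cart: item was added but the cart renders empty")
    return timings


if __name__ == "__main__":
    with requests.Session() as s:
        for path, secs in run_journey(s).items():
            print(f"{path:<22} {secs:.2f}s")
```

Run repeatedly across the day (including the 8pm-10pm peak the report highlights), such a check yields per-page timings and fault reports from the visitor's perspective, which server logs and internal monitors cannot provide on their own.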