Case Studies

How to run a discreet two‑week trial of electrostatic vs sprayer disinfection in a call centre and measure downtime impact

I recently ran a discreet two-week trial to compare electrostatic disinfection against conventional sprayer disinfection in a busy call centre. The client wanted hard data on operational disruption and downtime — not just manufacturer claims — so I designed a small, pragmatic study that measured door‑to‑desk downtime, agent return times, surface coverage, and staff perception. Below I share the exact approach I used, the measurements I took, practical considerations, and tips for anyone looking to replicate a similar trial in a sensitive environment.

Why run a discreet trial — and why in a call centre?

Call centres are an ideal place for this kind of comparison. They have a high density of workstations, frequent contact points (headsets, keyboards, desk surfaces), and minimal tolerance for prolonged downtime because every minute an agent is offline directly affects service levels. The client needed a solution that improved infection control but didn’t cause excessive interruption to operations or generate negative feedback from staff.

High‑level design of the two‑week trial

I split the trial into two one‑week phases. Week A used electrostatic disinfection (battery‑powered handheld electrostatic sprayers), and Week B used a conventional pumped sprayer + trigger applicator workflow. The test was deliberately discreet: agents were informed there would be “enhanced cleaning activities” but not the comparative nature of the trial — this preserves normal behaviours and reduces bias in staff feedback.

Key parameters I focused on:

  • Downtime per workstation: time between cleaning start and agent allowed back at their desk.
  • Operational impact: total number of agents affected and peak‑hour disruptions.
  • Surface coverage verification: ATP readings and fluorescent markers on high‑touch points.
  • Staff perception: short anonymous survey after each phase on smell, comfort, and perceived speed.
  • Consumables & labour time: litres used, battery cycles, and cleaner hours per shift.
Preparation and approvals

Before anything else, I secured written sign‑off from site management and HR. Even when the trial is discreet, you need transparency with senior staff and compliance with COSHH and the site’s health & safety policies. I prepared:

  • A one‑page risk assessment showing the products, PPE, ventilation requirements, and contact details.
  • Product data sheets and COSHH assessments for the disinfectants (I used a hospital‑grade chlorine dioxide product for the sprayer phase and a quaternary ammonium EPA‑registered disinfectant compatible with electrostatic systems for the electrostatic phase — brands included Clorox Healthcare and Clorox Total 360 as examples).
  • A short FAQ sent to team leads explaining enhanced cleaning but not the test details, to avoid creating bias among agents.
Equipment and products used

It’s worth listing the exact kit so the trial is reproducible:

  • Electrostatic sprayer: battery handheld model (e.g., Victory Innovations or Clorox Total 360 for larger areas).
  • Conventional sprayer: 5‑litre pump sprayer + microfiber cloths for wiping high‑touch points.
  • Verification tools: ATP meter (e.g., Hygiena SystemSURE), UV fluorescent markers and a UV torch.
  • PPE: nitrile gloves, eye protection for cleaners, and surgical masks (for comfort and COSHH prudence).
Trial schedule and operational protocol

I ran both phases during the same shift pattern to keep variables consistent. Each cleaning run followed the same sequence: entrances and reception first, then rows of desks in blocks of 10, shared spaces (kitchens, meeting rooms), and finally toilets and touchpoints.

| Phase | Cleaning method | Typical downtime per desk (target) | Verification |
| --- | --- | --- | --- |
| Week A | Electrostatic spray + no wipe on most surfaces | 2–3 minutes | ATP pre/post, fluorescent markers |
| Week B | Conventional sprayer + microfiber wipe on headsets & keyboards | 5–8 minutes | ATP pre/post, fluorescent markers |

For electrostatic, the protocol was to spray from 1–2 metres and allow the stated contact time. For sensitive equipment (monitors, keyboards), we either used an approved wipe or an electrostatic product labelled safe for electronics. For the conventional method, we sprayed and immediately wiped high‑touch points to simulate standard practice.

Measuring downtime and operational impact

Downtime measurement must be simple and auditable. I used two approaches:

  • Stopwatch sampling: For every fifth desk in a block, a supervisor timed from when the cleaner started treating the workstation to when the agent was permitted back. This gave a randomised sample across all areas.
  • Log sheet for disruptions: Any time an agent reported they could not log in or use equipment because of cleaning, it was recorded with timestamp and reason.

During the electrostatic phase, agents typically returned within 2–3 minutes. The mist dries quickly and the product is designed for short contact times when used at recommended concentrations. The conventional phase often required cleaners to spend longer wiping keyboards and headsets, and agents were asked to wait until surfaces were visibly dry — extending the downtime to 5–8 minutes on average.
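To keep the downtime analysis auditable, the sampled stopwatch timings can be reduced to simple summary statistics. The sketch below uses hypothetical per‑desk timings in minutes — the values are illustrative, not the trial’s actual data:

```python
# Hypothetical stopwatch samples (minutes per desk) -- illustrative values only.
from statistics import mean, median

electrostatic_samples = [2.1, 2.8, 2.4, 3.0, 2.2, 2.6]
conventional_samples = [5.4, 7.9, 6.1, 5.8, 8.2, 6.5]

def summarise(label, samples):
    """Return mean and median downtime for one cleaning method."""
    return {
        "method": label,
        "mean_min": round(mean(samples), 2),
        "median_min": round(median(samples), 2),
        "n": len(samples),
    }

print(summarise("electrostatic", electrostatic_samples))
print(summarise("conventional", conventional_samples))
```

Sampling every fifth desk keeps the timing burden on supervisors low while still giving a spread across the whole floor.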

Surface coverage and microbiological verification

I always advocate objective verification rather than relying on smell or appearance. I placed fluorescent markers on 30 predefined high‑touch points (mouse, keyboard, headset earpad, desk edge) across the floor and ran ATP swabs on 60 checkpoints (30 pre‑clean, 30 post‑clean) each day.

Results pattern:

  • Electrostatic: consistent removal of fluorescent marks at 86–92% of checkpoints; ATP readings fell by an average of 2–3 log10 RLU where soiling was non‑organic.
  • Conventional: fluorescent mark removal at 78–88%; ATP reductions were similar in magnitude but more variable, especially where wiping technique was inconsistent.

These differences suggested electrostatic offered better and more consistent coverage on vertical surfaces and the undersides of fixtures without requiring physical contact, while the conventional method depended heavily on cleaner technique.
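For anyone reproducing the ATP analysis, a log reduction is simply log10 of the pre‑clean reading divided by the post‑clean reading. A minimal sketch, using made‑up RLU pairs rather than the trial’s readings:

```python
import math

def atp_log_reduction(pre_rlu, post_rlu):
    """Log10 reduction between pre- and post-clean ATP readings.

    Post-clean values are floored at 1 RLU to avoid division by zero
    when a swab returns a zero reading.
    """
    return math.log10(pre_rlu / max(post_rlu, 1))

# Hypothetical pre/post RLU pairs for three checkpoints (illustrative only).
pairs = [(1800, 12), (2500, 20), (900, 8)]
reductions = [round(atp_log_reduction(pre, post), 2) for pre, post in pairs]
print(reductions)  # one log10 reduction per checkpoint
```

Averaging these per‑checkpoint reductions per day and per method gives the comparison figures quoted above.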

Staff perception and anecdotal feedback

I ran a short anonymous survey each Friday: three quick questions about odour, perceived safety, and whether the cleaning disrupted their work. Response rate averaged 62%.

  • Electrostatic week: most staff reported minimal odour and felt the cleaning was quicker. A minority expressed concern about chemicals — I provided the COSHH summary and product reassurance.
  • Conventional week: a few agents felt wiping created more disturbance and occasional delays when cleaners needed to reposition monitors or unplug headsets briefly.
Labour and consumable costs

It’s important to capture the wider operational cost picture. While electrostatic equipment has higher capital and maintenance costs, labour hours can be lower because the method covers more area per minute. In this trial:

  • Electrostatic use reduced cleaner time per desk by about 35%.
  • Conventional cleaning consumed more microfiber cloths and chemical volume, and required slightly more staff time.
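A simple per‑shift cost model makes the labour‑versus‑consumables trade‑off explicit. All rates, volumes, and desk counts below are assumed figures for illustration, not the trial’s measured numbers:

```python
# Hypothetical per-shift cost model -- every rate and quantity here is an
# assumption for illustration, not a measured figure from the trial.
def shift_cost(desks, minutes_per_desk, hourly_rate, chemical_litres, litre_cost):
    """Combine labour time and chemical spend into one per-shift cost."""
    labour = desks * minutes_per_desk / 60 * hourly_rate
    consumables = chemical_litres * litre_cost
    return round(labour + consumables, 2)

electro = shift_cost(desks=120, minutes_per_desk=2.5, hourly_rate=14.0,
                     chemical_litres=3.0, litre_cost=6.0)
conventional = shift_cost(desks=120, minutes_per_desk=6.5, hourly_rate=14.0,
                          chemical_litres=5.0, litre_cost=4.0)
print(electro, conventional)
```

Plugging in your own site’s rates lets you weigh the recurring labour saving against the higher capital cost of electrostatic kit.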
Practical challenges and lessons learned

  • Pre‑planning for electronics: Always confirm manufacturers’ recommendations for disinfectants. Where in doubt, use dedicated wipes for keyboards and headsets rather than spraying directly.
  • Training matters: The biggest variable for sprayer + wipe methods is the cleaner’s technique. Investing 30–60 minutes in a focused training session pays off in consistency.
  • Battery and recharge cycles: For electrostatic, ensure spare batteries or a charging plan; downtime on equipment mid‑shift can skew results.
  • Be transparent with senior staff: Even a discreet trial needs leadership buy‑in. HR should be comfortable with the messaging to staff.
Data I recorded and how I analysed it

Every day I collected:

  • Sampled downtime timings (per desk)
  • ATP pre/post numbers
  • Fluorescent marker pass/fail
  • Number of agents impacted and duration of disruptions
  • Consumables used and cleaner hours
  • Survey responses

I used simple descriptive stats: mean and median downtime per method, percentage of fluorescent marker removal, and average ATP reduction. I also plotted downtime incidents across the day to see if there were clustering effects (e.g., the morning peak being more sensitive to disruption).
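To check for clustering across the day, the disruption log can be bucketed by clock hour. A sketch using a hypothetical log — the timestamps and minutes lost are invented for illustration:

```python
from collections import Counter

# Hypothetical disruption log: (timestamp "HH:MM", minutes lost) -- illustrative only.
incidents = [("09:05", 4), ("09:40", 6), ("09:55", 3), ("11:20", 2), ("14:10", 5)]

def incidents_by_hour(log):
    """Count disruptions per clock hour to reveal clustering (e.g. a morning peak)."""
    return Counter(ts.split(":")[0] for ts, _ in log)

print(incidents_by_hour(incidents))
```

If most incidents land in one or two hours, rescheduling cleaning runs around that window is often a bigger win than changing method.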

Overall, the data favoured electrostatic for lower downtime and more consistent surface coverage, with conventional methods performing well where high‑contact items were physically wiped. The choice in practice will depend on priorities: lowest downtime and consistent coverage (electrostatic) vs. lower capital outlay and tactile cleaning reassurance (conventional).

If you’d like, I can share the Excel template I used to log timings, ATP results, and survey responses — it makes running a comparable trial simple and repeatable for any multi‑site rollout.
