Public Health England has admitted that 16,000 confirmed coronavirus cases in the UK were missing from the daily case counts reported between September 25 and October 2. The missing cases were later added to the daily totals, but given the importance of these numbers for monitoring the outbreak and making key decisions, the consequences of the error are far-reaching.

Not only does it lead to underestimating the scale of coronavirus in the UK, but perhaps more important is the subsequent delay in entering the details of positive cases into the NHS Test and Trace system, which is used by a team of contact tracers. Although all those who tested positive had been notified of their results, other people in close contact with them and potentially at risk of exposure were not immediately followed up (ideally this should happen within 48 hours). This was a serious error. What could have caused it?

It emerged later that day that a “technical glitch” was to blame. To be more specific, the lab test results were being transferred to Excel templates. The templates hit a limit on the number of rows they could handle and then failed to update as more cases were added. The issue was fixed, with all new cases added to the totals reported over the weekend, by breaking the data down across smaller spreadsheets.

The issue may have been fixed, but people’s confidence in the testing system in place in England will surely take a knock. It’s also likely that politicians and the media will use this as political ammunition to argue the incompetence of the government and Public Health England. Is this the right response? What should we take away from this mistake?

An avoidable mistake

We should not forget that the government and public health workers are doing an incredibly difficult and demanding job dealing with a pandemic. But this kind of mistake was avoidable. We live in a world of big data, with artificial intelligence and machine learning permeating all aspects of our lives. We have smart factories and smart cities; we have self-driving cars and machines trained to display human intelligence. And yet Public Health England used Microsoft Excel as an intermediary to manage a large volume of sensitive data. And herein lies the problem.

Although Excel is popular and frequently used for analysis, it has several limitations that make it unsuitable for large amounts of data and more sophisticated analyses.

The companies that analysed the swab tests to identify who had the virus submitted their results as comma-separated text files to PHE. These were then ingested into Excel templates to be uploaded to a central system and made available to the Test and Trace team and the government. Although today’s Excel spreadsheets can handle 1,048,576 rows and 16,384 columns, developers at PHE used an older Excel file format (XLS instead of XLSX), resulting in each template being able to store only around 65,000 rows of data (or around 1,400 cases). When the limit was reached, any further cases were left off the template, and therefore positive cases of coronavirus were missed in the daily reporting.
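The danger of this kind of silent truncation is easy to illustrate. The sketch below (a hypothetical check, not PHE's actual code) shows how many rows would be dropped if a result set were squeezed into a single sheet in the legacy XLS format:

```python
# Row limits per worksheet in Excel's two file formats.
XLS_MAX_ROWS = 65_536       # legacy .xls format (the one PHE used)
XLSX_MAX_ROWS = 1_048_576   # modern .xlsx format

def rows_lost_to_xls(n_rows: int) -> int:
    """Return how many rows would be silently dropped if n_rows
    of data were written to a single legacy XLS worksheet."""
    return max(0, n_rows - XLS_MAX_ROWS)
```

A guard like this, run before export, would have turned a silent data loss into a loud failure.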

The bigger issue is, in light of the data-driven and technologically advanced age in which we live, that a system based on shipping around Excel templates was even deemed acceptable in the first place. Data engineers have for a long time been supporting businesses with managing, transforming and serving up data, and developing methods for building efficient, robust and accurate data pipelines. Data professionals have also developed approaches to information governance, including assessing data quality and developing appropriate data security protocols.
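To make the idea of a data-quality check concrete, here is a minimal sketch of the kind of validation a pipeline might run on each incoming lab record before loading it. The field names and allowed values are invented for illustration, not taken from any real PHE schema:

```python
def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality problems found in one lab record.
    An empty list means the record passed these basic checks."""
    errors = []
    # Every result must be traceable to a specimen.
    if not record.get("specimen_id"):
        errors.append("missing specimen_id")
    # Only recognised outcomes should reach the reporting system.
    if record.get("result") not in {"positive", "negative", "void"}:
        errors.append(f"unexpected result: {record.get('result')!r}")
    return errors
```

Records that fail such checks would be quarantined for review rather than silently dropped or passed through.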

For this kind of custom application there are plenty of data management technologies that could have been used, ranging from on-site to cloud-based solutions that can scale and provide managed data storage for subsequent reporting and analysis. The Public Health England developers no doubt had some reason to transform the text files into Excel templates, presumably to fit with legacy IT systems. But avoiding Excel altogether and shipping the data from source (with appropriate cleaning and checks) into the system would have been better and reduced the number of steps in the pipeline.
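As a sketch of what that shorter pipeline could look like, the example below loads the comma-separated results straight into a database, bypassing spreadsheet templates entirely. It uses SQLite only to keep the example self-contained; the table and column names are assumptions for illustration:

```python
import csv
import io
import sqlite3

def load_results(csv_text: str, conn: sqlite3.Connection) -> int:
    """Load lab results from CSV text into a database table,
    deduplicating on specimen_id. Returns the total row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS results "
                 "(specimen_id TEXT PRIMARY KEY, result TEXT)")
    rows = [(r["specimen_id"], r["result"])
            for r in csv.DictReader(io.StringIO(csv_text))]
    # A database has no 65,000-row ceiling, and duplicates are
    # handled explicitly rather than silently.
    conn.executemany("INSERT OR IGNORE INTO results VALUES (?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM results").fetchone()[0]
```

A production system would use a managed database and add the cleaning and checks mentioned above, but even this minimal route removes the fragile spreadsheet step.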

The blame game

Despite the benefits and widespread use of Excel, it is not always the right tool for the job, especially for a data-driven system with such an important function. You can’t accurately report, model or make decisions on inaccurate or poor-quality data.

During this pandemic we are all on a journey of discovery. Rather than point the finger and play the blame game, we need to reflect on and learn from our mistakes. From this incident, we need to work on getting the basics right – and that includes robust data management. Perhaps rather concerning are reports that Public Health England is now breaking the lab data into smaller batches to create a larger number of Excel templates. This seems a poor fix and doesn’t really get to the root of the problem – the need for a robust data management infrastructure.

It is also striking how quickly technology or the algorithm gets blamed (especially by politicians), but herein lies another fundamental issue – accountability and taking responsibility. In the face of a pandemic we need to work together, take responsibility, and handle data appropriately.
