Everyone boasts about reading Vargas Llosa or Conrad, but on the vast majority of bedside tables, you’ll find bestsellers and mass-market paperbacks. In my case, the influence of my father—who was enamored with detective fiction—led me to my favourite reading: Sherlock Holmes.
In one of his stories, “The Adventure of the Copper Beeches,” Conan Doyle puts the following words into Holmes’ mouth:
“Data! Data! Data! I can’t make bricks without clay.”
The fact is, when I use AI for research and analysis, the problem doesn’t really lie in the system’s capabilities, nor in the connectivity between databases, nor even in the budget. The key is data quality. Given the enormous variety of possible breakdowns in a machine, you ultimately have to trust that the classification of the failure, the description of the symptom, and the corrective actions are accurate and sufficiently informative. This is a task that only humans can perform—at least for now—and therein lies the obstacle.
We invest millions of euros in data acquisition and processing technology, yet we leave everything in the hands of the goodwill, aptitude, and attitude of an operator who often doesn’t even realize the importance of what is being asked of them. Consequently, we analysts get frustrated and blame the “human factor,” which drives us to spend even more money and resources on further sensing.
This is all well and good in critical environments or those with abundant resources, but in the real world, time and money usually restrict our capacity to automate, and that leads me to the following reflection.
This race to add sensors to replace senses reminds me of arms manufacturers in their eternal battle between better swords and better shields. Perhaps I am biased by my Psychology degree, or maybe I’ve been indoctrinated by the Japanese concept of individual responsibility that TPM training has instilled in me, but I wonder: why do we invest so much in the quality of sensors and so little in the quality of people? A company built on responsible, committed people does not need as many sensors or as much automation. When people treat workplace problems as their own, trust that the company cares for their physical, economic, and even psychological well-being, and see that toxic or uncommitted colleagues are not tolerated, the company already possesses the best data acquisition network there is: people.
But we must work with what is, not what should be, so here are a few tips to improve data quality:
- Analyze the nature of your breakdowns: If you have the resources, analyze past data to extract the major categories and subcategories of reported problems. Don’t get stuck in the classic hierarchy (electrical-hydraulic-pneumatic), because the person writing the report often cannot distinguish the technical cause. Imagine you are a doctor talking to patients: they don’t need to know the source of their pain; they only need to report the symptoms with honesty and clarity.
- Close Work Orders (WO) with mandatory confirmation of real vs. estimated downtime: Often, the actual intervention time is inflated by organizational issues (lack of a technician to restart the equipment, warm-up times, delays in responding to the call…). These causes are legitimately part of a breakdown at a production level, but for Maintenance—as faithful subjects of the god Pareto—they can distort the analysis, giving importance to secondary problems and sidelining more serious ones.
- Automate data capture: Try to have the system automatically populate as much data as possible: date, shift, time, equipment, tool, material, technician… If you could even include operating parameters in the data capture of your CMMS (Computerized Maintenance Management System), you would obtain valuable information while saving the operator time. For data that must be entered manually, set up mandatory fields and offer hierarchical lists based on your previous analysis.
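To make the last two tips concrete, here is a minimal sketch of how a WO-closing form might enforce mandatory fields, validate entries against a symptom-oriented hierarchy, and auto-populate what the system already knows. The category names, field list, and minimum description length are illustrative assumptions, not taken from any particular CMMS:

```python
# Sketch of symptom-first work-order validation.
# Categories, fields, and thresholds are illustrative assumptions.
from datetime import datetime

# A symptom hierarchy (what the operator sees or hears), not a
# technical-cause hierarchy (electrical / hydraulic / pneumatic).
SYMPTOM_CATEGORIES = {
    "noise": ["grinding", "squealing", "knocking"],
    "leak": ["oil", "coolant", "air"],
    "stoppage": ["won't start", "stops mid-cycle", "alarm trip"],
    "quality": ["dimensional drift", "surface defect"],
}

MANDATORY_FIELDS = ["equipment", "category", "subcategory", "description"]


def auto_populate(record: dict) -> dict:
    """Fill in what the system already knows, saving the operator time."""
    record.setdefault("timestamp", datetime.now().isoformat(timespec="minutes"))
    return record


def validate_wo(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the WO may be closed."""
    errors = [f"missing field: {f}" for f in MANDATORY_FIELDS if not record.get(f)]
    cat, sub = record.get("category"), record.get("subcategory")
    if cat and cat not in SYMPTOM_CATEGORIES:
        errors.append(f"unknown category: {cat}")
    elif cat and sub and sub not in SYMPTOM_CATEGORIES[cat]:
        errors.append(f"subcategory {sub!r} not under {cat!r}")
    if len(record.get("description", "")) < 15:
        errors.append("description too short to be informative")
    return errors


wo = auto_populate({
    "equipment": "PRESS-03",
    "category": "noise",
    "subcategory": "knocking",
    "description": "Knocking from the main ram on every downstroke",
})
print(validate_wo(wo))  # []
```

In a real CMMS these rules would live in the form configuration rather than in code; the point is that the symptom hierarchy should come from your own analysis of past breakdowns, not from a generic technical taxonomy.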
Last but not least, working on motivation is essential. When someone is vague in a description, it is generally because:
1. They don’t see the utility: No one has shown them that the data is actually used for anything.
2. They don’t know how to describe it: They lack a standard technical vocabulary.
3. They are in a hurry: The system penalizes them for time, but not for quality.
4. There are no incentives: Filling it out poorly or filling it out well yields the same result for them.
Here are some ideas to improve motivation:
- Data that goes in must come out visible to the person who generated it. Try to provide simple visual feedback to the operator and technician. Explain the “why” and the real-world utility of their report.
- Daily analysis. The supervisor’s review should be a key milestone, not a routine. The supervisor doesn’t validate the WO just to “approve” it, but to gather information, reflect on it, and think of actions to prevent or mitigate recurrence.
- Chase the data. Starting with the most significant breakdowns, try to follow up with the person in charge (in person or via email). The goal is to send a clear message to everyone: information is a priority.
- Light gamification by team. A weekly ranking visible on the plant floor regarding reporting quality by shift or line (not by individual, to avoid friction) can help foster healthy competition.
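As a rough illustration of that last idea, a weekly ranking by shift can be computed from nothing more than a per-WO quality score. The shift names, scores, and the simple averaging rule below are all made-up assumptions; the scoring rule itself (points per well-filled field, for instance) would come from your own criteria:

```python
# Toy sketch of a weekly reporting-quality ranking by shift.
# Shift names, scores, and the averaging rule are illustrative.
from collections import defaultdict

# Each closed WO gets a quality score (e.g. one point per mandatory
# field filled, plus one for an informative description).
week_wos = [
    {"shift": "morning", "score": 4},
    {"shift": "morning", "score": 5},
    {"shift": "night",   "score": 2},
    {"shift": "night",   "score": 3},
    {"shift": "evening", "score": 5},
]


def weekly_ranking(wos):
    """Average quality score per shift, best first."""
    totals, counts = defaultdict(int), defaultdict(int)
    for wo in wos:
        totals[wo["shift"]] += wo["score"]
        counts[wo["shift"]] += 1
    averages = {s: totals[s] / counts[s] for s in totals}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)


for shift, score in weekly_ranking(week_wos):
    print(f"{shift}: {score:.1f}")
# evening: 5.0
# morning: 4.5
# night: 2.5
```

Posting something this simple on the plant floor each week is enough to make reporting quality visible without singling out individuals.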
And at the end of the day, with work, luck, and time, perhaps you’ll make music where there was only sound before…

