The State of Things
Three machines running tissue product. Every hour, an operator walks the floor with a clipboard. Eighteen parameters per check — log width, carton weight, clip tension, case count, sheet dimensions, ply bond quality. Ten measurement sequences on two machines, fourteen on the third. Every value written by hand onto a paper form designated IF1, IF2, or IF3, depending on the line.
At shift-end, the forms go into a binder. The supervisor reviews them the next morning. Rejects logged at 2pm aren’t seen until 10pm. Quality compliance is calculated retroactively, once a week, by someone reading every page. The plant manager’s quality report arrives days after the period it describes. Nobody is failing — the system is just permanently looking backward.
The Mission
“Every measurement — every dimension, every weight, every count — captured once, visible everywhere, and impossible to lose. My team spends their time on quality decisions, not paperwork.”
What We Found
This wasn’t a quality problem. The operators were skilled, their measurements accurate. It was a visibility gap disguised as a process. Paper introduced delay at every stage. A reject on one parameter triggered no immediate response — it was just a circled number on a form that would be reviewed hours later. Supervisor approvals happened the following morning. The weekly compliance number was assembled by hand from binders of completed forms.
The Brief surfaced what paper couldn’t: patterns across machines, shifts, and time windows. Which parameters rejected most often. Whether certain hours or shift changes produced concentration effects. Quality data existed in abundance — it was just locked in paper, invisible to anyone who wasn’t physically holding the right binder at the right time.
The Personas
The Operator: On the floor every hour with a clipboard, measuring the same eighteen parameters in sequence. Skilled hands — but the system reduced their judgment to penmanship and timing.
The Supervisor: Reviews checks after the fact. Approves rejects from yesterday. Signs off on quality she could only verify by reading paper from the previous shift.
The Quality Manager: Assembles the weekly compliance report from binders of completed forms. Knows the plant’s quality story, but can only tell it in past tense.
The Plant Manager: Needs quality metrics for customer audits and continuous improvement. Gets them days late, hand-assembled, with no way to drill into the detail behind the number.
The Build
The guardrail framework came first — Target-Action-Reject assessment rules calibrated to each parameter’s specification limits. Every measurement evaluated the moment it’s entered: within target, in the action zone requiring attention, or a reject requiring supervisor sign-off. Not as an afterthought. As the first thing the system does.
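The Target-Action-Reject rule described above can be sketched as a simple classifier evaluated at entry time. The limit names, the `SpecLimits` type, and the zone labels here are illustrative assumptions, not the system's actual schema:

```python
from dataclasses import dataclass

@dataclass
class SpecLimits:
    """Specification limits for one parameter (names are hypothetical)."""
    target_low: float
    target_high: float
    action_low: float    # action zone extends beyond the target band
    action_high: float   # anything outside the action band is a reject

def assess(value: float, spec: SpecLimits) -> str:
    """Classify a measurement the moment it is entered."""
    if spec.target_low <= value <= spec.target_high:
        return "target"   # within target: green
    if spec.action_low <= value <= spec.action_high:
        return "action"   # action zone: amber, needs attention
    return "reject"       # red, routed for supervisor sign-off
```

Calibrating one `SpecLimits` per parameter per machine keeps the rule data-driven: changing a specification never means changing code.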
On that foundation: a dynamic sequence system matching each machine’s configuration — ten sequences on two lines, fourteen on the third. Automatic hourly scheduling. Real-time collaboration so two operators can work the same check simultaneously, each parameter locked to prevent overwrites. Supervisor approval workflows that route rejects instantly, not at shift-end. And quality shift reports generated digitally, matching the exact layout of the paper forms — because the operators needed to see something familiar before they could trust something new.
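The per-parameter locking that lets two operators share a check might follow a claim/release protocol like the sketch below. `CheckSession`, the operator ids, and the method names are hypothetical; the actual implementation would coordinate across clients, not threads in one process:

```python
import threading

class CheckSession:
    """One hourly check; each parameter is editable by one operator at a time."""

    def __init__(self, parameters):
        self._lock = threading.Lock()
        # parameter name -> operator id currently editing it (None if free)
        self._owners = {p: None for p in parameters}

    def claim(self, parameter: str, operator: str) -> bool:
        """Try to take a parameter for editing; False if someone else holds it."""
        with self._lock:
            if self._owners[parameter] in (None, operator):
                self._owners[parameter] = operator
                return True
            return False

    def release(self, parameter: str, operator: str) -> None:
        """Give the parameter back once the measurement is saved."""
        with self._lock:
            if self._owners[parameter] == operator:
                self._owners[parameter] = None
```

The point of the design is that the lock granularity is the parameter, not the whole check, so two operators can split the eighteen measurements without overwriting each other.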
The Portal
The operator’s view was there on day one — today’s quality checks organized by production line, each one showing which sequences are active, which are complete, where rejects are waiting. Enter a measurement and the status appears immediately — green for target, amber for action, red for reject. No more writing a number and hoping someone catches the problem eight hours later.
The supervisor’s queue changed shape entirely. Reject approvals arrived in real-time, with the measurement value, the specification limits, and the operator’s notes already attached. The weekly analytics dashboard showed what binders never could: first pass yield trending day by day, reject heatmaps revealing where and when problems concentrate, the top rejected parameters ranked across the entire week.
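A reject heatmap of the kind described reduces to counting reject events per (machine, hour-of-day) cell. The data shape below — a list of machine/timestamp pairs — is an assumption for illustration:

```python
from collections import Counter
from datetime import datetime

def reject_heatmap(reject_events):
    """reject_events: iterable of (machine, timestamp) pairs.

    Returns {(machine, hour_of_day): count}, i.e. where and when
    rejects concentrate across the week.
    """
    cells = Counter()
    for machine, ts in reject_events:
        cells[(machine, ts.hour)] += 1
    return cells
```

Rendering those counts as a grid (machines on one axis, hours on the other) is what makes a shift-change concentration visible at a glance.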
The Signal
First Pass Yield — the percentage of quality checks completed without a single reject. On paper, this number didn’t exist. It could only be computed when every measurement from every check was captured and aggregated. When it appeared, it told a story: not a quality crisis, but a pattern. The reject heatmap showed concentrations in the first hour after shift change — not a materials problem, a handover problem. The top five rejected parameters were identified in the first week. Improvement targets that once took quarters to formulate now had data behind them in days.
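First Pass Yield as defined above — a check passes only if none of its measurements rejected — is a one-pass aggregation once the data is captured. A minimal sketch, with the function name and input shape assumed:

```python
def first_pass_yield(checks) -> float:
    """checks: list of checks, each a list of statuses
    ('target' / 'action' / 'reject'), one per measured parameter.

    Returns the fraction of checks completed without a single reject.
    """
    if not checks:
        return 0.0
    passed = sum(1 for statuses in checks if "reject" not in statuses)
    return passed / len(checks)
```

On paper this number was uncomputable precisely because the denominator — every measurement of every check — was never in one place.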
What This Opened
In month two, the quality manager stopped assembling the weekly report manually. It assembled itself. In month three, the plant manager asked the question the team had been carrying: “Can we compare quality performance across shifts — not from memory, but from data?” The reject-by-shift analysis was already there. What followed was the first data-driven shift handover protocol the plant had ever implemented.
The engagement didn’t close. It expanded into analytics — weekly trend dashboards, compliance rate tracking, measurement volume as a proxy for operational discipline. What began as a replacement for paper forms became the foundation of a quality intelligence layer — not replacing the team’s expertise, but making it visible for the first time.
The Engagement Arc
Paper clipboards shifted to streaming inspection signals. The number that mattered — First Pass Yield — only existed once every measurement was captured. The shift handover protocol followed.
