Software · Food & Beverage · March 28, 2026

20 Hours of Weekly Compliance Reports. Automated.

Result

20 hrs/week recovered, zero missed audit deadlines


A food & beverage manufacturer's quality team was spending 20 hours per week manually compiling audit reports from three disconnected systems. We automated the entire pipeline and eliminated the manual work.

A regional food manufacturer with three production facilities was running their SQF audit reporting process almost entirely by hand. Each week, the quality manager and two technicians spent a combined 20 hours pulling records from an ERP system, a production historian, and a standalone quality management system — copying data into Excel, cross-referencing it manually, and formatting it into the required report structure. The process was error-prone, deadline-sensitive, and consuming a third of the quality team's working capacity on work that added no analytical value.

The Stakes of SQF Compliance in Regional Manufacturing

SQF — Safe Quality Food — is a GFSI-benchmarked food safety certification that most major retailers and distributors now require as a condition of doing business. For a regional manufacturer, SQF certification isn't optional. Losing it means losing shelf placement. An audit finding — particularly a major nonconformance — can trigger corrective action plans, re-audits, and customer notification requirements that consume management attention for months.

The audit cycle at this facility required weekly compliance reporting as part of their ongoing SQF maintenance program. The weekly report documented batch-level production records, ingredient traceability, inspection results, hold events and dispositions, and corrective actions — everything an auditor would expect to see as evidence that the food safety management system was operating in control. The report had to be filed by Monday morning to meet the quality manager's internal review schedule before being archived for the annual audit.

Missing the Monday deadline wasn't a regulatory violation on its own. But it created downstream pressure: without the weekly report, the quality manager couldn't close out the prior week's hold events, which meant open CAPAs accumulated, which meant the CAPA summary that fed into the annual audit preparation was perpetually behind. The whole system depended on the weekly report arriving on time.

How the Process Actually Broke Down

The three systems the quality team was working across — a legacy ERP, an OSIsoft PI historian, and a cloud-based QMS — each held a piece of the audit picture. None had native integration with each other, and the SQF report format required data from all three cross-referenced at the batch and lot level.

The weekly process looked like this: export production records from PI for the reporting period, pull corresponding lot traceability and ingredient records from ERP, pull inspection results and hold events from the QMS, reconcile the three datasets manually in Excel, flag discrepancies for follow-up, and format the final document to SQF specification.

Every step was manual, and every export arrived in a different format. Discrepancies between systems appeared regularly: timestamp offsets between PI and ERP created near-miss matches that had to be confirmed by hand; QMS inspection records occasionally referenced lot numbers that didn't exactly match ERP, due to manual transcription errors at the line; and hold events sometimes lacked resolution records because operators had closed them verbally without completing the QMS workflow.

Each discrepancy required investigation before the report could be filed. Some were resolved quickly by a phone call to the production supervisor. Others required pulling paper records from the shop floor. The quality manager estimated that 35 percent of the 20 hours was spent not on actual quality analysis, but on chasing data mismatches and formatting output — work that produced nothing except a report that should have been generated automatically.

There was a second, less visible cost. Because the process was so time-intensive, the quality team had developed informal workarounds under time pressure: small discrepancies were sometimes carried forward rather than fully resolved, the assumption being that they'd get cleaned up before the annual audit. By the time auditor season arrived, the team was spending significant prep time reconciling a year's worth of minor discrepancies that should have been addressed weekly.

What We Built

We built a workflow automation system that connects to all three data sources on a scheduled basis, normalizes and reconciles the data, detects discrepancies automatically, and generates the weekly SQF compliance report without manual intervention.

The integration layer pulls from PI via its REST API, from the ERP via a read-only database connection, and from the QMS via its export API. Data is normalized into a unified batch record model: each production batch is keyed by lot number, with production metrics, ingredient traceability, and inspection results joined at the lot level. The normalization handles the known format differences between systems explicitly. PI timestamps are aligned to the ERP's lot creation time, which serves as the authoritative reference, and QMS lot number formatting is standardized to match ERP conventions, including the known variations in the line operators' data entry patterns.
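As an illustration only, the lot-level normalization and join might look like the sketch below. The lot number format, field names, and record shapes are assumptions for the example, not the client's actual schema:

```python
import re
from dataclasses import dataclass, field

@dataclass
class BatchRecord:
    """Unified batch record keyed by ERP-format lot number (illustrative)."""
    lot_number: str
    production: dict = field(default_factory=dict)    # metrics from the PI historian
    traceability: dict = field(default_factory=dict)  # ingredient records from the ERP
    inspections: list = field(default_factory=list)   # inspection results from the QMS

def normalize_lot_number(raw: str) -> str:
    """Standardize a QMS lot number to an assumed ERP convention (LOT-YYYYMMDD-NNN).

    Handles hypothetical operator entry variations: lowercase prefixes,
    missing hyphens, stray whitespace, unpadded sequence numbers.
    """
    s = raw.strip().upper().replace(" ", "")
    m = re.match(r"LOT-?(\d{8})-?(\d{1,3})$", s)
    if m is None:
        raise ValueError(f"unrecognized lot number: {raw!r}")
    date, seq = m.groups()
    return f"LOT-{date}-{int(seq):03d}"

def build_batch_records(erp_lots, pi_metrics, qms_inspections):
    """Join the three sources into unified records, keyed by ERP lot number."""
    records = {lot["lot_number"]: BatchRecord(lot_number=lot["lot_number"],
                                              traceability=lot)
               for lot in erp_lots}
    for metric in pi_metrics:
        if metric["lot_number"] in records:
            records[metric["lot_number"]].production.update(metric)
    for insp in qms_inspections:
        lot = normalize_lot_number(insp["lot_number"])  # ERP format is authoritative
        if lot in records:
            records[lot].inspections.append(insp)
    return records
```

The key design choice, mirrored above, is that the ERP lot number is the single authoritative join key and every other source is normalized toward it rather than matched fuzzily.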

Discrepancy detection is automated and categorical. The system distinguishes between three types of exceptions: data mismatches (a QMS inspection record that can't be matched to an ERP lot), missing records (a production batch with no corresponding QMS inspection), and open holds (a hold event with no resolution record after the expected disposition window). Each exception type routes to the appropriate workflow for review — production mismatches go to the line supervisor queue, missing inspections go to the quality technician queue, and open holds escalate to the quality manager if they're more than 48 hours past the standard resolution window.
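A minimal sketch of this categorical exception check follows. The three exception types, the routing queues, and the 48-hour escalation figure come from the description above; the record shapes and threshold handling are illustrative assumptions:

```python
from datetime import datetime, timedelta

# Illustrative escalation threshold: holds escalate once they are past
# the expected disposition window (assumed here to be 48 hours).
DISPOSITION_WINDOW = timedelta(hours=48)

def classify_exceptions(batch_records, now):
    """Flag missing inspections and overdue holds, routed to review queues."""
    exceptions = []
    for lot, rec in batch_records.items():
        # Missing record: a production batch with no corresponding QMS inspection.
        if not rec["inspections"]:
            exceptions.append(("missing_record", "quality_technician", lot))
        # Open hold: no resolution record past the disposition window;
        # escalates to the quality manager.
        for hold in rec.get("holds", []):
            if hold.get("resolved_at") is None and \
               now - hold["opened_at"] > DISPOSITION_WINDOW:
                exceptions.append(("open_hold", "quality_manager", lot))
    return exceptions

def classify_unmatched(qms_records, erp_lots):
    """Data mismatch: QMS inspections that can't be matched to any ERP lot."""
    return [("data_mismatch", "line_supervisor", r["lot_number"])
            for r in qms_records if r["lot_number"] not in erp_lots]
```

Because every exception carries its type and destination queue, routing is a lookup rather than a judgment call, which is what keeps the weekly human review down to a handful of items.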

The quality team reviews only flagged exceptions — currently averaging two to three per week. Everything else is handled automatically. The final report is generated in SQF-compliant format and delivered to the quality manager's inbox by 6 AM Monday. A dashboard view shows real-time exception status and report readiness throughout the week so there are no surprises on Monday morning.

Implementation: The Data Problem Came First

The primary challenge was not the integrations — all three systems had accessible APIs or connectors — it was the historical data quality in the QMS. The system had been in place for several years with minimal governance on data entry standards. Inspection categories had been renamed at least twice without migrating historical records, legacy hold codes from an older workflow no longer mapped to current disposition states, and a partial data migration two years prior had left duplicate records for roughly a three-month window that neither the current QMS vendor nor the internal IT team had addressed.

We spent the first two weeks of the engagement mapping the data model exhaustively before writing any integration code. Every known inconsistency was documented. Every field mapping was validated against a sample of actual production records. The normalization logic handles the documented variations explicitly — the system doesn't assume the source data is clean, it assumes the known failure modes will recur and builds specific handling for each one.

This front-loaded data work is where most similar projects fail. It's not glamorous, and it delays the gratifying part of building the actual integration. But a normalization layer built on an incomplete understanding of the source data will surface exceptions that aren't real discrepancies and miss exceptions that are — making the automated report less reliable than the manual one it replaced.

We also ran the automated system in parallel with the manual process for four weeks before cutover, comparing outputs side by side each week. Three discrepancies surfaced during the parallel run, all traced to source data issues that had been silently passing through the manual process undetected: an ingredient traceability record mismatched between ERP and QMS due to a formatting inconsistency; a hold event from eight months prior that had been verbally resolved but never closed in the system; and a recurring duplicate lot number pattern from a specific line that had been corrected by habit in the Excel process.

"We found errors in the old manual process during parallel testing. That told us the automation wasn't just faster — it was more reliable." — Quality Manager

Finding those three issues during the parallel run was not a project setback. It was validation that the system was functioning correctly. The manual process hadn't been catching them either.
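The weekly side-by-side comparison reduces both reports to comparable row sets and diffs them lot by lot. A hedged sketch, with assumed field names:

```python
def compare_reports(automated_rows, manual_rows, key="lot_number"):
    """Return lots that appear in only one report or disagree on shared fields.

    Rows are dicts keyed by lot number; field names are illustrative.
    """
    auto = {r[key]: r for r in automated_rows}
    manual = {r[key]: r for r in manual_rows}
    findings = []
    for lot in sorted(auto.keys() | manual.keys()):
        if lot not in manual:
            findings.append((lot, "only in automated report"))
        elif lot not in auto:
            findings.append((lot, "only in manual report"))
        else:
            # Compare only the fields both reports carry for this lot.
            diffs = {f for f in auto[lot].keys() & manual[lot].keys()
                     if auto[lot][f] != manual[lot][f]}
            if diffs:
                findings.append((lot, f"field mismatch: {sorted(diffs)}"))
    return findings
```

Every finding from a run like this is ambiguous on its own: it could be an automation bug or a flaw in the manual baseline. In this engagement, all three turned out to be the latter.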

Results

The quality team has not produced a manual report since cutover. Weekly report generation runs in under four minutes of automated runtime. The quality manager reviews the exception queue and approves the final document before it's archived — a process that takes 15 to 25 minutes depending on the week's exception volume, compared to the prior 20-hour manual cycle.

  • Manual reporting hours eliminated: 20 hrs/week (quality manager + 2 technicians)
  • Missed SQF report deadlines since deployment: 0 (previously 2-3 per year due to capacity constraints)
  • Automated discrepancy detection: avg. 2-3 exceptions flagged per week that previously passed undetected
  • Legacy QMS data anomalies surfaced and corrected during implementation: 47 records
  • Quality team hours redirected to CAPA follow-through, supplier audits, and FSMA preparation

The system has been running for five months without a missed report. More importantly, the quality team's CAPA backlog — which had accumulated over two years of inconsistent exception follow-through — has been cleared. Open corrective actions are now current within the standard disposition window.

What Changed for the Quality Team

This project is representative of a pattern that shows up across food manufacturing and other regulated industries: a business-critical compliance process that runs manually not because automation is difficult, but because the quality team that owns the process has never had the bandwidth to build a better system. The data exists. The systems have APIs. The report format is defined by an external standard that isn't changing. The only missing piece was the integration layer.

The quality manager's time is now spent on supplier audits, pre-audit preparation for the annual SQF assessment, and process improvement initiatives that had been perpetually deferred due to weekly report load. The two technicians who contributed hours to the manual process each week are now available for floor-level inspection work and environmental monitoring documentation. No one was replaced. The 20 hours were redirected.

The upcoming FSMA Traceability Rule — which will require substantially more granular lot-level traceability documentation for covered food categories — is also now tractable. The data infrastructure built for the weekly SQF report is the same infrastructure that FSMA traceability reporting will require. The compliance team is ahead of that requirement rather than scrambling to meet it.

The FSMA Extension

The client is currently in scoping conversations with us about extending the system to generate FSMA Section 204 traceability reports on demand. The unified batch record model already captures the Critical Tracking Events and Key Data Elements required under the rule — it was designed with that extension in mind. Generating an on-demand FSMA traceability report for a specific lot is functionally an additional query against the same normalized data layer, not a new integration project.
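Under that design, an on-demand traceability report is a lookup plus a projection over the existing store. A sketch of the idea, noting that the CTE/KDE field names below are illustrative stand-ins, not the rule's official schema:

```python
def fsma_traceability_report(batch_records, lot_number):
    """Assemble Critical Tracking Events and Key Data Elements for one lot.

    batch_records is the existing normalized store (dict keyed by lot number);
    all field names here are hypothetical.
    """
    rec = batch_records.get(lot_number)
    if rec is None:
        raise KeyError(f"lot {lot_number} not found in batch record store")
    return {
        "lot_number": lot_number,
        # CTEs: e.g. receiving, transformation, shipping events for the lot.
        "critical_tracking_events": rec.get("events", []),
        # KDEs: the data elements attached to those events.
        "key_data_elements": {
            "ingredient_lots": rec.get("ingredient_lots", []),
            "production_date": rec.get("production_date"),
            "facility": rec.get("facility"),
        },
    }
```

No new connector, no new reconciliation logic: the function only reads data the weekly SQF pipeline already normalizes, which is the sense in which the extension is a query rather than a project.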

Building compliance infrastructure that can grow with regulatory requirements — rather than being rebuilt each time requirements change — is the design goal. The first version of this system was the foundation.

Ready to see results like these?

Tell us what you're working on. We'll scope it and tell you how we'd approach it.

Start a project