A bipartisan push for transparency and accountability in customer service
A new bipartisan bill, the Keep Call Centers in America Act of 2025, has just been introduced in both the House and Senate. If passed, it would fundamentally reshape how U.S. contact centers operate. Among other provisions, the bill would:
- Require agents to disclose their physical location and whether AI is involved in a customer interaction.
- Give customers the right to transfer to a U.S.-based human agent on request.
- Penalize companies that offshore more than 30% of their call center volume by making them ineligible for federal contracts, grants, and loans for up to five years.
- Mandate new reporting requirements on AI use and its impact on jobs.
This is not a fringe proposal. It has bipartisan sponsorship and reflects growing concern over data privacy, AI transparency, and the long-term future of U.S. service jobs.
Why this matters for BPO leaders
For BPO executives, the message is clear: regulators are paying closer attention to how and where customer service is delivered. Even if this bill does not become law, it underscores a trend that cannot be ignored. Compliance is no longer just about ticking boxes; it is about demonstrating accountability, transparency, and the ability to adapt to rising expectations from regulators and enterprise clients alike.
In this environment, manual processes and limited QA sampling are not enough. When only 1–3% of calls are reviewed, leaders cannot reliably prove that disclosures are being made, that AI involvement is being tracked, or that compliance obligations are being met. That gap translates into real risk: lost contracts, reputational damage, or even regulatory penalties.
The role of next-generation technology
The requirements outlined in the Keep Call Centers in America Act touch directly on disclosure, compliance, and monitoring: areas where legacy QA processes fall short. To meet these rising expectations, BPOs will need to adopt technology that automates compliance at scale and ensures nothing slips through the cracks.
Take the disclosure requirement: every interaction must begin with a statement of the agent’s location, and if AI is being used, that must also be disclosed. A manual QA process that samples a handful of calls cannot verify whether disclosures are consistently happening across tens of thousands of daily interactions. Technology solutions like MosaicVoice can automatically check 100% of calls, flag any disclosure failures, and generate compliance reports that stand up to regulator or client scrutiny.
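To make the idea concrete, here is a minimal sketch of what an automated disclosure check might look like. It assumes call transcripts are available as agent utterances in plain text; the regex patterns, field names, and the `check_disclosures` helper are illustrative only, not MosaicVoice's actual API or the bill's precise wording requirements.

```python
import re
from dataclasses import dataclass

# Illustrative patterns for the two disclosures the bill would require.
LOCATION_PATTERN = re.compile(r"\b(calling from|located in|based in)\b.+", re.IGNORECASE)
AI_PATTERN = re.compile(
    r"\b(automated|virtual assistant|AI[- ]assisted|artificial intelligence)\b",
    re.IGNORECASE,
)

@dataclass
class DisclosureResult:
    call_id: str
    location_disclosed: bool
    ai_disclosed: bool

def check_disclosures(call_id: str, agent_utterances: list[str],
                      ai_involved: bool, opening_window: int = 3) -> DisclosureResult:
    """Check the first few agent utterances of a call for the required disclosures."""
    opening = " ".join(agent_utterances[:opening_window])
    location_ok = bool(LOCATION_PATTERN.search(opening))
    # The AI disclosure is only required when AI actually participated in the call.
    ai_ok = (not ai_involved) or bool(AI_PATTERN.search(opening))
    return DisclosureResult(call_id, location_ok, ai_ok)

# Example: run the check over every call and flag any missing disclosure.
calls = [
    ("c-1001", ["Thank you for calling, I'm based in Austin, Texas."], False),
    ("c-1002", ["Hello, how can I help you today?"], True),
]
for call_id, utterances, ai_involved in calls:
    result = check_disclosures(call_id, utterances, ai_involved)
    if not (result.location_disclosed and result.ai_disclosed):
        print(f"{call_id}: disclosure failure "
              f"(location={result.location_disclosed}, ai={result.ai_disclosed})")
```

Run against every call rather than a sample, a check like this produces the per-interaction evidence that a compliance report can aggregate.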
Similarly, the bill gives consumers the right to transfer to a U.S.-based human agent on request. That creates a new operational risk: routing must work flawlessly, and BPOs must be able to prove it did. Next-generation platforms can track whether transfer requests occurred, confirm they were honored, and surface any failures for rapid remediation.
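As a rough illustration of that evidence trail, the sketch below pairs each detected transfer request with a routing event showing it was honored within a time window. The event schema and `transfer_honored` helper are hypothetical, not a description of any particular platform.

```python
from datetime import datetime, timedelta

def transfer_honored(events: list[dict], max_wait: timedelta = timedelta(minutes=2)) -> list[dict]:
    """Return transfer requests that were never followed by a transfer to a U.S. human agent."""
    unresolved = []
    for i, event in enumerate(events):
        if event["type"] != "transfer_requested":
            continue
        deadline = event["timestamp"] + max_wait
        honored = any(
            later["type"] == "transferred"
            and later.get("destination") == "us_human_agent"
            and event["timestamp"] <= later["timestamp"] <= deadline
            for later in events[i + 1:]
        )
        if not honored:
            unresolved.append(event)
    return unresolved

# Example call timeline: a request followed 40 seconds later by a qualifying transfer.
t0 = datetime(2025, 6, 1, 10, 15)
timeline = [
    {"type": "transfer_requested", "call_id": "c-2001", "timestamp": t0},
    {"type": "transferred", "call_id": "c-2001",
     "destination": "us_human_agent", "timestamp": t0 + timedelta(seconds=40)},
]
print(transfer_honored(timeline))  # [] -> the request was honored within the window
```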
Finally, the legislation mandates tracking and reporting on AI use and its impact on jobs. This requires a detailed audit trail of when AI was involved in an interaction and how it performed relative to human agents. Advanced QA platforms can log this information automatically, giving BPOs the visibility they need to comply while also turning that data into insights to optimize staffing, training, and AI-human collaboration.
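For a sense of what such an audit trail might contain, here is a hedged sketch of a per-call record capturing when AI participated and who resolved the interaction. The field names and the `ai_audit_record` helper are assumptions for illustration, not a mandated reporting schema.

```python
import json
from datetime import datetime, timezone

def ai_audit_record(call_id: str, ai_segments: list[dict], handled_by: str) -> dict:
    """Build a per-call audit entry recording when and how AI participated."""
    return {
        "call_id": call_id,
        "logged_at": datetime.now(timezone.utc).isoformat(),
        "ai_involved": bool(ai_segments),
        "ai_seconds": sum(seg["duration_seconds"] for seg in ai_segments),
        "ai_tasks": [seg["task"] for seg in ai_segments],
        "final_resolution_by": handled_by,  # "ai", "human", or "ai_then_human"
    }

# Example: AI handled authentication before handing off to a human agent.
record = ai_audit_record(
    "c-3001",
    ai_segments=[{"task": "authentication", "duration_seconds": 45}],
    handled_by="ai_then_human",
)
print(json.dumps(record, indent=2))
```

Aggregated over time, records like these support both the mandated reporting and the staffing and training analysis described above.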
In short, the provisions in this bill map directly to capabilities that only automated, AI-powered QA solutions can deliver. For BPO leaders, adopting these technologies is not just about future-proofing against regulation; it is about running a smarter, more transparent, and more competitive operation today.
The bigger picture
The Keep Call Centers in America Act may or may not move forward, but its introduction is a wake-up call. The rules of the game are changing. For BPOs, the choice is clear: continue relying on outdated processes and risk falling behind, or embrace technology that turns compliance into a competitive advantage.