Here’s a number that should keep every credit union CEO up at night: 95% of BSA alerts are false positives.
Your compliance team — the two or three people keeping your institution out of regulatory trouble — spends the vast majority of their working hours investigating transactions that turn out to be nothing. Meanwhile, the United Nations Office on Drugs and Crime estimates that less than 1% of global illicit financial flows are ever seized or frozen; the working figure it cites is 0.2%.
The system is broken at both ends. Too much noise on the front end. Too few catches on the back end. And the answer isn’t hiring more people to process more noise. It’s using AI to separate signal from noise so your people can do the work that actually requires human judgment.
The 95% Problem
The false positive rate isn’t a credit union problem. It’s an industry-wide structural failure.
McKinsey puts the number at over 90%. HSBC’s internal data shows 95% or higher. The Bank Policy Institute surveyed the largest US banks and found that in 2017, a sample of major institutions reviewed approximately 16 million alerts, filed over 640,000 SARs — and received feedback from law enforcement on a median of 4% of those filings. Ninety-six percent of all that work product went into a FinCEN database and was never acted on.
Why is the rate so high? Because current monitoring systems are overwhelmingly rules-based, not intelligence-based. They fire on patterns: cash transaction over $X, wire to country Y, account opened less than Z days ago. These rules catch everything that looks suspicious but can’t distinguish between a contractor who deposits cash every Friday and a money launderer structuring deposits to avoid reporting thresholds. Rules see patterns. They don’t see people.
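To make that limitation concrete, here is a minimal sketch of the kind of threshold rule these systems run. The field names and dollar figures are illustrative, not any vendor's actual engine, but the structural flaw is real: the rule sees only the transaction, never the member behind it.

```python
# A typical rules-based trigger sees only the transaction, never the member.
# Threshold and field names are illustrative.
CTR_THRESHOLD = 10_000  # cash reporting threshold

def rule_fires(txn: dict) -> bool:
    """Flag cash deposits sitting just under the reporting threshold."""
    return (txn["type"] == "cash_deposit"
            and 8_000 <= txn["amount"] < CTR_THRESHOLD)

# The rule cannot tell these two members apart:
contractor = {"type": "cash_deposit", "amount": 9_200}  # Friday payroll cash-out
structurer = {"type": "cash_deposit", "amount": 9_200}  # deliberate structuring

print(rule_fires(contractor), rule_fires(structurer))  # True True: same input, same verdict
```

Identical inputs produce identical verdicts, which is exactly why the false positives pile up: the legitimate pattern and the criminal one look the same at the transaction level.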
The downstream costs are staggering.
The Bank Policy Institute found that filing a single SAR takes an average of 21.4 hours — more than ten times FinCEN’s official estimate of 1.98 hours. The gap exists because FinCEN’s methodology excluded the actual investigative work: managing the monitoring system, reviewing alerts, and transforming alerts into cases for review. The 1.98-hour figure covers only the final paperwork.
A mid-size credit union filing 50-70 SARs per month dedicates 1,000-1,500 hours monthly to SAR preparation alone. That’s before CTR filings, before OFAC screening, before the examiner preparation, before the documentation that keeps every compliance action audit-ready. And 96% of that output is never acted on by anyone.
Total US and Canadian spending on financial crime compliance hit $56.7 billion in 2022, according to LexisNexis. The industry is spending billions and catching almost nothing. That’s not a people problem. That’s a system design problem.
A Day in Your BSA Analyst’s Life
I spent time embedded with the compliance team at a credit union partner. Here’s what I watched.
6:30 AM. The BSA Detail file from last night’s batch processing lands. Overnight, the core processor dumped every transaction from the previous day. Not real-time — T+1 data, meaning anything suspicious that happened yesterday is only visible today. If a member’s account was compromised at 2 PM on Tuesday, the earliest your analyst sees it is Wednesday morning.
7:00 AM. The analyst opens five or six separate systems to begin her day. The core processor for member data. Verafin for AML alerts and fraud alerts — which are on separate pages within Verafin, because AML and fraud monitoring aren’t unified. IDOC for check images. A separate tool for SSN aggregation, because the BSA Detail file aggregates by account, not by individual — joint account holders appear as a single entry, so she needs a second system to untangle who did what. And an internal spreadsheet tracker, because none of these systems talk to each other.
8:00 AM to noon. She works through the alert queue. Each alert requires pulling the member profile from the core, checking transaction history across all accounts, cross-referencing with previous alerts and filed SARs, checking for related accounts — joint holders, business accounts, authorized users — looking up negative news manually in a browser, and then making the determination: is this suspicious, or is this just Maria the florist making her weekly cash deposit?
For 95 out of 100 alerts, it’s Maria. But she has to check every single one.
1:00 PM. A SAR needs to be filed. The narrative — the story of why this activity is suspicious — is entirely manual. She synthesizes notes from multiple systems, transaction records, prior correspondence, and external research into a coherent document that must meet FinCEN’s filing requirements. A manual SAR takes one to three hours. Even the semi-automated ones, with some Verafin assistance, take 10-20 minutes of focused review.
This team — two or three people — handles 400-plus CTRs and 50-70 SARs per month. They’re operating at 125% capacity, averaging 60-hour weeks. And the transaction volume is growing.
These are smart, dedicated, experienced professionals. They’re not slow. They’re drowning. And the answer isn’t to hire a fourth person to drown alongside them.
Why More Staff Doesn’t Fix a Systems Problem
Credit unions can’t hire their way out of this.
Seventy percent of banks and non-bank financial institutions report capacity challenges in their compliance operations, according to a 2023 WorkFusion/Celent study. Sixty-three percent of firms say it takes four months or longer to fill an experienced compliance analyst role. Once you find someone, the training timeline is daunting — a new BSA analyst takes 12-18 months to become truly effective. Not because they’re slow, but because understanding your credit union’s specific membership patterns, risk profile, examiner relationship, and community economics requires time with the data that no certification can shortcut.
The salary competition makes it worse. A BSA analyst at a $500 million credit union makes $60,000-$80,000. The same analyst at a regional bank makes $80,000-$110,000. At a money center bank or fintech: $120,000 or more. Credit unions are fighting for scarce talent with one hand tied behind their back.
And the math never balances. If your transaction volume grows 15% per year — a sign of a healthy credit union — and your alert rules remain the same, your false positive volume grows 15% per year too. Hiring scales linearly. Alert volume scales with your success. You cannot staff your way to equilibrium.
The retirement cliff compounds everything. The most experienced BSA officers — the ones with 20-plus years of institutional knowledge about your membership patterns, your examiner relationships, your community’s economic rhythms — are retiring. When they leave, they take irreplaceable context with them. The analyst who knows that Maria’s Friday cash deposits are legitimate because she’s been the florist on Main Street for 22 years? That knowledge isn’t in any system. It walks out the door when she retires.
The real problem isn’t people. It’s that you’re asking humans to do work that machines should do — sorting signal from noise at scale — instead of work that only humans can do: exercising judgment about complex situations that require institutional knowledge, investigative instinct, and an understanding of your community.
The 95% false positive rate isn’t a staffing failure. It’s a technology failure that creates a staffing crisis.
What AI Actually Changes — And What It Doesn’t
Let me be precise about this, because the hype is real and so is the risk of overpromising.
What AI does:
It triages the 95%. An AI agent that understands your membership patterns — Maria’s weekly cash deposits, the construction company’s seasonal revenue cycles, the retired teacher’s pension schedule — can flag alerts as “routine pattern, consistent with member history” before a human ever sees them. The analyst reviews the AI’s reasoning and confirms the disposition, not the raw transaction. The investigation that took 20 minutes now takes 30 seconds of review.
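The triage step can be sketched in a few lines. This is a deliberately oversimplified illustration under hypothetical names (`triage_alert`, a single z-score against deposit history); a production agent would weigh far richer features, but the shape of the decision is the same: compare the alert to the member's own baseline before a human ever sees it.

```python
import statistics

def triage_alert(amount: float, history: list[float], z_cutoff: float = 3.0):
    """Hypothetical triage: dispose an alert as routine when the amount fits
    the member's own historical pattern; otherwise escalate to a human.
    A real agent would use far richer features than one z-score."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z = abs(amount - mean) / stdev if stdev else float("inf")
    if z <= z_cutoff:
        return "routine", f"within {z:.1f} sd of member baseline"
    return "escalate", f"{z:.1f} sd from member baseline"

# Maria's weekly cash deposits over the last quarter (illustrative numbers)
maria = [880, 910, 875, 940, 905, 890, 920, 900, 915, 885, 930, 895]

print(triage_alert(912, maria))    # consistent with her pattern -> routine
print(triage_alert(9_400, maria))  # wildly out of pattern -> escalate
```

The analyst then reviews the stated reasoning ("within 0.4 sd of member baseline") rather than re-deriving it from six systems.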
It drafts SAR narratives. The agent pulls member data, transaction history, prior alerts, account relationships, and external references — then drafts a coherent narrative following FinCEN’s requirements. The BSA officer reviews, edits, and approves. One to three hours of writing becomes 10-15 minutes of review and judgment.
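As an illustration of the assembly step, here is a sketch that builds the factual skeleton of a narrative from structured records. This is not FinCEN's format and not any vendor's template; a production agent would hand these same structured facts to a language model, but in either case the output is a draft for human review, not a filing.

```python
def draft_sar_skeleton(member: dict, txns: list[dict], prior_alerts: list) -> str:
    """Hypothetical sketch: assemble the factual skeleton of a SAR narrative
    from structured records. The BSA officer reviews, edits, and approves
    before anything is filed."""
    total = sum(t["amount"] for t in txns)
    return " ".join([
        f"Subject: {member['name']}, member since {member['since']}.",
        f"Activity: {len(txns)} cash deposits totaling ${total:,.2f} "
        f"between {txns[0]['date']} and {txns[-1]['date']}.",
        f"Prior alerts on this account: {len(prior_alerts)}.",
        "Each deposit fell just below the $10,000 reporting threshold, "
        "a pattern consistent with structuring.",
    ])

draft = draft_sar_skeleton(
    {"name": "J. Doe", "since": "2014"},
    [{"amount": 9_500, "date": "2025-03-03"},
     {"amount": 9_700, "date": "2025-03-05"},
     {"amount": 9_400, "date": "2025-03-07"}],
    prior_alerts=[],
)
print(draft)
```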
It monitors in real time. The CDC pipelines I described in Article 5 replace nightly batch processing with real-time data streams. Suspicious patterns are detected as they happen — not 24 hours later when the window for intervention has closed.
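To sketch the shape of that change: instead of parsing a nightly BSA Detail file, the detector runs on each change event as it streams off the core. The payload below is hypothetical (the `after` field follows the convention CDC tools like Debezium use for new row state), but it shows why the intervention window shrinks from a day to seconds.

```python
import json

def on_change_event(raw: str, detect) -> dict:
    """Hypothetical CDC consumer step: each committed core transaction
    arrives as a change event within seconds of posting, so detection
    runs on the event itself, not on tomorrow's batch file."""
    event = json.loads(raw)
    txn = event["after"]  # new row state (Debezium-style convention)
    return {"account": txn["account_id"], "alert": bool(detect(txn))}

# Illustrative detector plugged into the stream
near_threshold_cash = lambda t: t["type"] == "cash_deposit" and t["amount"] >= 9_000

evt = json.dumps({"after": {"account_id": "A-102",
                            "type": "cash_deposit", "amount": 9_500}})
print(on_change_event(evt, near_threshold_cash))  # fires the same day, not T+1
```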
It eliminates the spreadsheet. McKinsey found that 85% of financial crime compliance activities are administrative or non-analytical — manually collecting data from one system to import into another. AI collapses those six systems into a single intelligent interface where the analyst sees the complete picture on one screen.
The results at scale are already proven. HSBC deployed AI-powered AML monitoring and reported 60% fewer false positives while detecting two to four times more real financial crime — nearly 4x in retail banking specifically. They monitor 900 million transactions per month across 40 million customer accounts. JPMorgan reported a 95% reduction in false positives after deploying AI-driven transaction monitoring. These aren’t hypotheticals. These are deployed systems at the world’s largest banks.
What AI doesn’t do:
It doesn’t make the judgment call. Is this activity actually suspicious? That requires understanding context, exercising discretion, weighing factors that no algorithm can fully capture. FinCEN and the NCUA require human sign-off on all compliance actions. This isn’t a limitation of AI — it’s the right design. The human makes the decision. The AI makes sure the decision is informed by the full picture instead of a fraction of it.
It doesn’t replace your BSA officer. It replaces the noise that buries your BSA officer. The goal isn’t a smaller compliance team. It’s a compliance team that spends 80% of their time on the 5% of alerts that actually matter, instead of 80% of their time on the 95% that don’t.
The Examiner Conversation
Here’s the part that makes compliance officers nervous: “What will the examiner say?”
The answer might surprise you.
FinCEN has explicitly encouraged the use of AI and innovative technologies for BSA/AML compliance. Their Innovation Hours program invites financial institutions to discuss how technology can improve suspicious activity monitoring. The NCUA’s AI Compliance Plan, published September 2025, isn’t anti-AI — it’s pro-AI-with-guardrails. The requirements — monitoring, control, termination capability — are a design specification for doing AI right, not a prohibition on doing it at all.
The examiner question that should worry you isn’t “Why are you using AI?” It’s “Why are you still using the same rules-based system that generates 95% false positives while TD Bank just paid $3.09 billion for inadequate monitoring?”
TD Bank’s penalty — the largest BSA/AML fine in US history — wasn’t for lack of staff. It was for inadequate monitoring systems. They admitted to willfully failing to implement an AML program that met minimum BSA requirements. For eight years, from 2014 to 2022, they didn’t add a single new scenario to their transaction monitoring system — despite known deficiencies, emerging risks, and new products. Ninety-two percent of their total transaction volume went unmonitored. The regulators aren’t telling you to hire more analysts. They’re telling you to build better systems.
The FFIEC BSA/AML Examination Manual has been updated to acknowledge technology-assisted monitoring. The regulatory wind is blowing clearly: use better technology, document how it works, keep humans in the loop, and be able to demonstrate it to an examiner. That’s exactly the architecture I described in Article 2 — the SEC examination muscle memory that shapes how Runline builds. Every agent action logged. Every decision auditable. Every system stoppable in seconds.
The building blocks converge here. SEC-grade audit trails from Article 2. Core processor data unlocked via CDC from Article 5. AI agents that triage BSA alerts using real-time data, with full documentation your examiner can review. The BSA analyst’s six-system obstacle course from this article collapses into one intelligent interface — and every step the AI takes is logged, explainable, and reviewable.
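The "every action logged, every decision auditable" property can itself be sketched in a few lines. This is an illustrative pattern, not Runline's implementation: every agent action passes through a wrapper that records its inputs, output, and timestamp to an append-only store before the result goes anywhere else.

```python
import functools
import time

AUDIT_LOG: list[dict] = []  # in production: an append-only, examiner-reviewable store

def audited(fn):
    """Record an agent action's inputs, output, and timestamp
    before the result is returned anywhere else."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        AUDIT_LOG.append({
            "action": fn.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "result": result,
            "ts": time.time(),
        })
        return result
    return wrapper

@audited
def dispose_alert(alert_id: str, disposition: str, reason: str) -> dict:
    # Hypothetical agent action: record a triage disposition for human review
    return {"alert_id": alert_id, "disposition": disposition, "reason": reason}

dispose_alert("ALERT-471", "routine", "consistent with 22-year deposit pattern")
print(AUDIT_LOG[-1]["action"], AUDIT_LOG[-1]["result"]["disposition"])
```

The examiner conversation then becomes a query against the log, not a reconstruction from memory.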
Your BSA team isn’t failing. Your monitoring systems are failing your BSA team. The 95% false positive rate is a technology problem with a technology solution — one that doesn’t replace your best people but finally lets them do the work they were trained to do.
Sean Hsieh is the Founder & CEO of Runline, the secure agentic platform for credit unions. Previously, he co-founded Flowroute (acquired by Intrado, 2018) and Concreit, an SEC-regulated WealthTech platform managing real securities under dual federal regulatory frameworks.
Next in the series: “Stop Buying Chatbots. Start Building Infrastructure.” — why the real AI opportunity isn’t member-facing chatbots but internal agents that handle the back-office workflows drowning your staff.