Your Core Processor Is a Time Capsule — And That's Actually Your Biggest Asset

Everyone says rip and replace. The contrarian case — your decades-old core holds 30 years of irreplaceable member intelligence, and AI makes it more valuable than every modern SaaS tool in your stack combined.

By Sean Hsieh
11 min read
Published August 24, 2025

Everyone tells you your core processor is your biggest liability. It’s old. It’s slow. It’s written in a language nobody teaches anymore. Your vendor’s “modernization roadmap” has been three years away for the last fifteen years. Every conference you attend, someone is pitching you on ripping it out and replacing it with a shiny cloud-native alternative.

Here’s my contrarian take: your core isn’t a liability. It’s a time capsule. And the decades of data trapped inside it are about to become your most valuable strategic asset.

The question isn’t whether to replace your core. It’s whether to unlock the goldmine sitting inside it.


What’s Actually Inside the Time Capsule

Forty-three percent of banking systems worldwide still run COBOL. Three trillion dollars in daily commerce flows through these systems. This isn’t a bug — it’s a testament to how reliable they are. Your credit union’s core processor has been running continuously for decades, processing every transaction, every loan payment, every member interaction without interruption. That’s not a failure of modernization. That’s engineering durability.

Your core — whether it’s Jack Henry Symitar (built on IBM AS/400 architecture with RPG language, dating to the 1980s), Fiserv DNA, CU*Answers GOLD (founded in 1970, running continuously on IBM i-Series for more than five decades), or one of the newer entries like Corelation KeyStone — holds something no AI vendor can replicate: your institutional memory.

Here’s what’s in there:

- Twenty to thirty years of transaction history per member — every deposit, withdrawal, transfer, loan payment, fee, and reversal.
- Complete loan origination and servicing records from application through payoff.
- Member behavior patterns — channel usage, product adoption, seasonal habits, life event indicators like address changes, beneficiary updates, and new account types.
- Seven-plus years of compliance records — BSA/AML alerts, CTR filings, SAR history, OFAC screening results, audit trails.
- Communication history — every call, every dispute, every complaint, every resolution.

The problem isn’t the data. It’s the access.

I’ve been inside credit union core data centers. What you find is not uncommon for infrastructure that’s powered an entire vertical for this long — I saw similar patterns in telecom before the industry modernized. Massive IBM Power infrastructure running single logical partitions, thousands of programs accumulated over 30+ years, and schema metadata that one architect diplomatically described as “not strong.” Transaction categorization often uses a handful of generic buckets instead of granular merchant category codes — even though the MCC codes are captured on every card transaction. The spending intelligence is already in the data. Nobody built the bridge to access it.
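To make the missing bridge concrete, here is a minimal sketch of the kind of enrichment that paragraph describes: deriving a spending category from the MCC code that is already stamped on every card transaction. The category names and MCC ranges below are illustrative assumptions, not any core vendor’s actual mapping.

```python
# Illustrative sketch: deriving spending categories from MCC codes that
# are already present on card transactions. The ranges and category
# names below are assumptions for demonstration, not a real taxonomy.

MCC_RANGES = [
    ((3000, 3999), "travel"),     # airlines, car rental, lodging
    ((5411, 5411), "groceries"),  # grocery stores, supermarkets
    ((5812, 5814), "dining"),     # restaurants, bars, fast food
    ((5912, 5912), "pharmacy"),   # drug stores
]

def categorize(mcc: int) -> str:
    """Map an MCC to a spending category; fall back to 'uncategorized'."""
    for (lo, hi), category in MCC_RANGES:
        if lo <= mcc <= hi:
            return category
    return "uncategorized"

def enrich(txn: dict) -> dict:
    """Attach a derived category to a raw card transaction record."""
    return {**txn, "category": categorize(txn["mcc"])}

print(enrich({"member_id": "M-1001", "amount": 42.17, "mcc": 5812}))
```

The point is not the lookup table, which is trivial; it is that nothing in a typical core pipeline performs even this step, so the signal sits unused.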

This data sits in proprietary formats, batch-processing cycles, flat files, and schemas that evolved through decades of append-only additions. Nobody designed it to be queried by AI. Nobody designed it to be queried by anything modern. But the data itself — 30 years of member relationships, behavioral patterns, and institutional knowledge — is irreplaceable.


Why “Rip and Replace” Is the Wrong Answer

The credit union industry has been talking about core modernization for 20 years. The track record is not encouraging.

Core conversions take 12-24 months, cost millions of dollars, and carry catastrophic risk. One failed conversion can mean weeks of member-facing outages, regulatory scrutiny, and permanent trust damage. The migration process itself is a high-wire act — remapping decades of proprietary data structures into a new schema while maintaining data integrity, regulatory continuity, and zero downtime for members who rely on their accounts every day.

The economics of leaving are punitive by design. Jack Henry collected $16 million in deconversion fees in FY2025 alone — and that’s just the penalty for departing, before you’ve spent a dollar on the new system or a single hour on the conversion itself.

It’s no surprise, then, that 69% of financial institutions plan to stay on their current core, according to the American Bankers Association. They’re not staying because they love their core processor. They’re staying because the switching cost is existential.

The cloud-native alternatives exist. Thought Machine, Mambu, Temenos — they’re real and improving. But they were designed for neo-banks and fintechs starting from a blank slate, not for institutions with 30 years of member history, complex product configurations, regulatory obligations that require data continuity, and staff who’ve built their workflows around the existing system for a decade or more.

And here’s the dirty secret nobody mentions at the conference keynotes: even when a credit union does convert, they often lose historical data in the migration. Legacy formats don’t map cleanly to modern schemas. Fields that meant one thing in 1998 mean something different in 2026. Decades of institutional memory — the kind of deep member knowledge that no amount of marketing data can replace — gone.

So here’s the real question: if 69% of financial institutions aren’t replacing their core, and even the ones that do risk losing their historical data in the process, what if the answer isn’t replacing the core at all? What if it’s building an intelligence layer on top of it?


Lessons from the Giants

The most valuable companies in technology share one trait: they built an access and intelligence layer on top of data that already existed.

Bloomberg didn’t create financial markets. Michael Bloomberg built a terminal that normalized, indexed, and made accessible the data that was already flowing through trading desks — pricing feeds, news wires, economic indicators, all trapped in disconnected systems. Revenue today: over $12 billion per year. Bloomberg’s product is fundamentally a data normalization and access layer. The data existed. The ability to query it intelligently didn’t.

Palantir didn’t create government intelligence data. They built Gotham — a platform that integrated, normalized, and made queryable the data that was already sitting in disconnected government databases across the CIA, NSA, FBI, and military branches. The intelligence community didn’t need more data. It needed a way to connect the data it already had. Palantir’s 2024 revenue hit $2.87 billion on that thesis.

Plaid didn’t create bank accounts. They built an API layer that made consumer financial data accessible to applications — connecting 12,000+ financial institutions to fintech apps that needed transaction data, account balances, and identity verification. Plaid became the data layer for fintech by building access infrastructure, not new data.

Morgan Stanley gave their 16,000 financial advisors RAG-powered access to over 100,000 internal documents — research reports, product guides, regulatory filings, client communication templates. Adoption hit 98% within months. The documents had existed for years, sitting in SharePoint folders and email archives where nobody could find them. The AI access layer made them useful for the first time.

The pattern is always the same. The data already exists. The value is in making it accessible, normalized, and intelligent. Nobody needs to create new credit union data. Your 30 years of member history, transaction patterns, and institutional knowledge are already there. You need a way to unlock it.


Unlocking the Time Capsule Without Breaking It

The technology to do this exists today. It’s called Change Data Capture — CDC — and it’s how you build a real-time intelligence layer on top of your legacy core without touching, replacing, or risking it.

Here’s how it works. Your IBM i-Series core processor already logs every database change in an internal journal. This isn’t a feature you need to add — it’s how the system maintains its own integrity. It’s been doing it for decades. A CDC engine reads that journal, captures every change in real time, and streams it to a modern data platform.

The pipeline looks like this: your core processor’s journal feeds into a CDC engine (the open-source Debezium project maintains a community connector for IBM i), which publishes changes to an event bus like Apache Kafka, which feeds into modern storage — Parquet files, ClickHouse, a cloud data warehouse — where the data gets normalized, semantically indexed, and made accessible to AI agents.
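As a sketch of the normalization step in that pipeline: a Debezium-style change event carries the operation type, a timestamp, and the row state after the change. The physical column names below (`ACCT_NBR`, `SHARE_BAL`) are invented stand-ins for whatever a given core’s journal actually contains.

```python
import json
from datetime import datetime, timezone

def normalize(event: dict) -> dict:
    """Flatten a Debezium-style CDC change event into a modern record.

    Column names ACCT_NBR and SHARE_BAL are hypothetical stand-ins for
    a core's actual journal columns.
    """
    payload = event["payload"]
    after = payload["after"] or {}  # "after" is null on deletes
    return {
        "account_id": after.get("ACCT_NBR"),
        "balance": float(after.get("SHARE_BAL", 0)),
        "op": {"c": "insert", "u": "update", "d": "delete"}[payload["op"]],
        # Debezium timestamps are epoch milliseconds.
        "changed_at": datetime.fromtimestamp(
            payload["ts_ms"] / 1000, tz=timezone.utc
        ).isoformat(),
    }

# A simulated event as it might arrive off the bus:
raw = json.dumps({
    "payload": {
        "op": "u",
        "ts_ms": 1724457600000,
        "after": {"ACCT_NBR": "0012345", "SHARE_BAL": "1503.22"},
    }
})
print(normalize(json.loads(raw)))
```

The real work, of course, is in knowing what each journal column means; the transformation itself is deliberately thin so the core never notices it is being read.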

What this means practically:

- Every transaction, account change, and member update is captured in real time — sub-5 second latency — instead of waiting for last night’s batch run.
- The data gets normalized from proprietary formats into a modern, queryable schema that AI agents can reason about.
- Those agents can access the meaning of 30 years of member history without ever touching the core processor itself.
- The core keeps running exactly as it does today. Zero risk. Zero disruption. Zero downtime.

The cost comparison is stark. IBM’s proprietary CDC solutions run $600,000-$1.5 million over three years. Traditional consulting firms charge $500,000 to over $1 million for data warehouse implementations, with 9-12 month timelines. With modern open-source tooling and domain expertise, this can be done in 10-12 weeks at a fraction of those costs.

But here’s the part that makes this genuinely hard — and genuinely defensible. There is no cross-core data standard in the credit union industry. CUFX (the Credit Union Financial Exchange) has limited adoption. FDX (Financial Data Exchange) is consumer-focused, designed for Open Banking use cases, not operational data integration. This means every core integration is custom engineering work — understanding different APIs, data formats, journal structures, and decades-old schema decisions that were made for reasons nobody documented.

This is hard, unglamorous work. It requires someone who has been inside the data center, explored the schema, understood why field 47 in table 12 means something different at a Symitar shop than it does at a GOLD shop. It’s the technological equivalent of archaeology — careful, methodical, deeply technical work that no slide deck can fake.
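A toy illustration of why this is per-core custom work: the mappings below are entirely invented, but the shape of the problem is real. The same logical concept lives in a differently named, differently encoded column at each core, and no lookup table exists until someone does the archaeology to build it.

```python
# Invented per-core field mappings. Column names and encodings here are
# hypothetical examples, not actual Symitar or GOLD schema details.
CORE_MAPPINGS = {
    "symitar": {"member_id": "ACCT", "open_date": "OPENDATE"},  # e.g. YYYYMMDD int
    "gold":    {"member_id": "MBRNBR", "open_date": "OPN_DT"},  # e.g. day offset
}

def extract(core: str, row: dict, logical_field: str):
    """Read one logical field out of a core-specific physical row."""
    physical = CORE_MAPPINGS[core][logical_field]
    return row[physical]

# The same question, asked of two different cores:
symitar_row = {"ACCT": "0012345", "OPENDATE": 19980612}
gold_row = {"MBRNBR": "887721", "OPN_DT": 2163}
print(extract("symitar", symitar_row, "member_id"))  # "0012345"
print(extract("gold", gold_row, "member_id"))        # "887721"
```

Building and validating those mapping tables, core by core, field by field, is the moat the next paragraph describes.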

It’s also the most defensible moat in credit union technology. Once you’ve built the normalization layer for a core processor, every AI agent built on top of it gets better automatically. And every competitor who hasn’t done this work is starting from zero.


What Happens When the Time Capsule Opens

Once your core data is normalized, indexed, and AI-accessible, everything changes.

Your BSA analyst stops manually querying three systems to investigate an alert. An AI agent pulls the member’s full transaction history, account relationships, and prior alert patterns in seconds — not minutes, not hours. The next article in this series goes deep on what this means for BSA compliance specifically, but the preview is this: the reason 95% of BSA alerts are false positives isn’t bad analysts or bad systems. It’s that no human can hold the full context of a member’s 20-year transaction history in their head while triaging 200 alerts per week.

Your loan officers see AI-generated member insights — predicted needs, risk signals, cross-sell opportunities — drawn from decades of behavioral data, not a single credit score from a single moment in time. The member who’s been banking with you for 15 years has a story that a three-digit number can’t tell. Your data can.

Your MSRs get a complete member picture on every call — transaction context, communication history, product usage, life events — without toggling between seven systems. The obstacle course of vendor UIs from Article 4 collapses into one intelligent interface.

Your compliance team gets real-time monitoring instead of nightly batch reports. Suspicious patterns detected as they happen, not 24 hours later when the window for intervention has closed.

And your examiners see a credit union with a documented, auditable, real-time data infrastructure — the kind of institution they hold up as a model, not a risk.

Right now, credit unions are using less than half the value of the data they already own. Cornerstone Advisors’ data utilization index puts the industry average at roughly 250 out of 500. Half your data’s value is sitting in the time capsule, waiting.

The SaaSPocalypse — the subject of the previous article — wipes out execution-layer vendors. But the data layer doesn’t just survive. It appreciates. Your core processor data — the same data everyone tells you is a liability — is the one asset that gets more valuable as AI gets more capable. Every other vendor in your stack is replaceable. Your 30 years of member history is not.

Everyone’s selling you a new core. Nobody’s offering to unlock the one you already have. That tells you more about vendor incentives than it does about your technology.

Your core processor isn’t old. It’s seasoned. And in an AI era where data is the only durable competitive advantage, that 30-year time capsule is worth more than every modern SaaS tool in your stack combined. The question isn’t whether to replace it. It’s whether to finally unlock what’s inside.


Sean Hsieh is the Founder & CEO of Runline, the secure agentic platform for credit unions. Previously, he co-founded Flowroute (acquired by Intrado, 2018) and Concreit, an SEC-regulated WealthTech platform managing real securities under dual federal regulatory frameworks.

Next in the series: “Why 95% of BSA Alerts Being False Positives Is an AI Problem, Not a Staffing Problem” — the compliance crisis that’s burning out your best analysts, and what happens when AI handles the noise so your people can handle the judgment calls.

Get Started

Ready to see what stateful AI agents can do for your credit union?

Runline builds purpose-built AI agents for regulated financial institutions. Every interaction compounds institutional intelligence.

Schedule a Demo