Reg BI and AI: Best-Interest Disclosure in Automated Workflows
How AI changes Reg BI compliance for broker-dealers. Where automation helps, where it stays human, and what gets stuck in supervision.
AI workflows touch every part of this. Done well, they make Reg BI compliance more rigorous, not less. Done badly, they create new failure modes. This is the operator-level read on where AI fits and where it doesn't.
What Reg BI requires (in plain language)
Four obligations the firm must meet:
- Disclosure Obligation: disclose material facts about the recommendation and the relationship before or at the time of the recommendation
- Care Obligation: exercise reasonable diligence, care, and skill to (a) understand the recommendation, (b) have a reasonable basis to believe it's in the customer's best interest, and (c) (for a series of recommendations) act in the customer's best interest across the series
- Conflict of Interest Obligation: establish and maintain procedures to identify and address conflicts; eliminate certain conflicts (sales contests, sales quotas, bonuses, and non-cash compensation tied to sales of specific securities, or specific types of securities, within a limited time period) and mitigate or disclose others
- Compliance Obligation: establish and maintain written policies and procedures designed to ensure compliance with Reg BI
Where AI fits each obligation
Disclosure Obligation
Form CRS (the Customer Relationship Summary) is the baseline document. It must be delivered at or before the start of the firm-customer relationship and updated when its information becomes materially inaccurate. AI doesn't draft it (the format is regulator-prescribed), but AI can manage the workflow:
- Tracking which customers have received the current version
- Flagging customers due for re-disclosure on material changes
- Generating supplemental disclosures for recommendation-specific facts
Care Obligation
This is where AI adds the most value. The "reasonable basis" requirement for understanding the recommendation and believing it's in the customer's best interest requires:
- Understanding the customer's profile (investment objectives, financial situation, risk tolerance, time horizon)
- Understanding the recommendation (features, fees, costs, risks)
- Comparing the recommendation to reasonably-available alternatives
AI pipelines can:
- Surface the customer's profile in a structured way (pulling from CRM, planning software)
- Compare the recommendation against alternatives (lower-fee version, different fund family, different structure)
- Document the comparison so the reasonable-basis analysis is preserved
- Flag situations where the recommendation doesn't appear to be in the customer's best interest given the data
AI cannot:
- Make the recommendation. The registered rep does.
- Decide what "best interest" means for this customer. That's judgment.
- Bypass the supervisor review.
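The comparison-and-documentation step is where the pipeline earns its keep. A minimal sketch of an alternatives comparison that preserves the reasonable-basis record; the cost model is deliberately crude (no compounding), and all names and thresholds are illustrative assumptions, not a regulatory formula:

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    expense_ratio: float  # annual cost, as a fraction of assets
    load: float           # one-time sales charge, as a fraction

def ten_year_cost(p: Product, principal: float) -> float:
    """Rough all-in cost over ten years; a real model would compound."""
    return principal * (p.load + 10 * p.expense_ratio)

def compare_alternatives(recommended: Product, alternatives: list[Product],
                         principal: float) -> dict:
    """Build the comparison record the reasonable-basis file should preserve.
    A cheaper alternative is not automatically disqualifying; it means the rep
    must document why the recommendation is still in the best interest."""
    rec_cost = ten_year_cost(recommended, principal)
    rows = [{"product": a.name, "cost": ten_year_cost(a, principal)}
            for a in alternatives]
    cheaper = [r for r in rows if r["cost"] < rec_cost]
    return {
        "recommended": recommended.name,
        "recommended_cost": rec_cost,
        "alternatives": rows,
        "flag_cheaper_alternative": bool(cheaper),
    }
```

The output dict is the artifact: it goes into the recommendation file, the rep attaches the rationale, and the supervisor reviews both.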
Conflict of Interest Obligation
AI pipelines can monitor for conflict patterns at scale:
- Rep production concentrating in higher-commission products
- Rollover recommendations from low-fee plans to higher-fee accounts
- Mutual fund share class selection patterns
- Inactive accounts moving to higher-fee structures
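The first pattern on that list, rep production concentrating in higher-commission products, reduces to a per-rep ratio. A minimal sketch; how products get tagged high-commission and where the threshold sits are firm policy calls, and the field names are illustrative:

```python
from collections import defaultdict

def commission_concentration(trades: list[dict]) -> dict[str, float]:
    """Per rep, the share of gross commissions from products the firm has
    tagged as high-commission."""
    total = defaultdict(float)
    high = defaultdict(float)
    for t in trades:
        total[t["rep"]] += t["commission"]
        if t["high_commission_product"]:
            high[t["rep"]] += t["commission"]
    return {rep: high[rep] / total[rep] for rep in total}

def flag_reps(trades: list[dict], threshold: float = 0.5) -> list[str]:
    """Reps whose high-commission share exceeds the policy threshold,
    queued for supervisor review."""
    conc = commission_concentration(trades)
    return sorted(rep for rep, share in conc.items() if share > threshold)
```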
Compliance Obligation
Written policies and procedures (WSPs) need to specifically address AI use in recommendation workflows. The standard topics:
- What AI tools are used and how
- What oversight exists (sampling, review, validation)
- What records are kept (prompts, outputs, supervisor reviews)
- How the firm validates AI accuracy
- Who is responsible for AI-related supervisory functions
Specific AI workflows under Reg BI
Rollover recommendations
The Reg BI scrutiny on 401(k)-to-IRA rollovers is intense. Every rollover recommendation requires documented analysis of:
- Fees in the existing plan vs. the recommended IRA
- Investment options available
- Services available
- Distribution flexibility
- Other relevant factors
This is one of the highest-impact AI deployments at broker-dealers, because rollover documentation is among the most heavily cited Reg BI examination findings.
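The fee comparison at the heart of that documented analysis can be sketched in a few lines. This is a simplified non-compounding model under assumed inputs (fees expressed in basis points of assets per year); the function and field names are illustrative:

```python
def rollover_cost_delta(plan_fee_bps: float, ira_fee_bps: float,
                        balance: float, years: int = 10) -> float:
    """Simple (non-compounding) fee difference over `years` between staying
    in the plan and rolling to the recommended IRA. Positive = IRA costs more.
    Fees are in basis points of assets per year."""
    return balance * (ira_fee_bps - plan_fee_bps) / 10_000 * years

def rollover_record(plan_fee_bps: float, ira_fee_bps: float, balance: float,
                    services_gained: list, options_gained: int) -> dict:
    """Skeleton of the documented rollover analysis: the fee delta plus the
    non-fee factors (services, investment options) the rep must weigh."""
    return {
        "fee_delta_10yr": rollover_cost_delta(plan_fee_bps, ira_fee_bps, balance),
        "services_gained": services_gained,
        "investment_options_gained": options_gained,
    }
```

A positive fee delta doesn't kill the rollover; it obligates the rep to document the non-fee factors that justify it, which is exactly the record examiners ask for.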
Mutual fund share class selection
Examiners look at share class selection patterns. Are reps recommending higher-fee share classes when lower-fee classes were available? Are clients being kept in transaction-based accounts when advisory accounts would be cheaper given their pattern of activity?
AI surveillance pipelines can monitor share class and account-type patterns across the firm, flag outliers, and document the firm's response to flagged situations.
Series-of-recommendations analysis
Reg BI explicitly considers series of recommendations as a unit, not just individual recommendations. A rep who makes a series of high-cost recommendations to a client over time may violate Reg BI even if each individual recommendation looks defensible.
AI can detect series patterns at scale and surface them for supervisor review.
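One way to operationalize series detection is a trailing-window cumulative-cost check per client. A minimal sketch under assumed inputs; the window length, the cost-ratio field, and the threshold are illustrative policy parameters, not anything Reg BI prescribes:

```python
from datetime import date, timedelta

def series_cost_flags(recs: list[dict], window_days: int = 365,
                      cost_threshold: float = 0.03) -> list[date]:
    """For one client's recommendations (sorted by date, each with a
    per-trade cost ratio), flag dates where the cumulative cost of all
    recommendations in the trailing window exceeds the threshold,
    even if each trade looks defensible on its own."""
    flags = []
    for r in recs:
        start = r["date"] - timedelta(days=window_days)
        cum = sum(x["cost_ratio"] for x in recs
                  if start <= x["date"] <= r["date"])
        if cum > cost_threshold:
            flags.append(r["date"])
    return flags
```

Each flagged date becomes a supervisor-review item; the point is that the third cheap-looking trade can be the one that tips the series.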
What goes wrong
Three failure modes specific to AI in Reg BI contexts:
Black-box "AI-generated recommendations"
Some vendors market AI-generated recommendations. The Reg BI problem: the firm can't easily document why this recommendation is in the customer's best interest if the rationale is in a model's hidden weights. Examiners want documentation. Solution: AI surfaces options + rationale + comparison; the rep selects and documents. The recommendation is human; the analytical support is AI.
Skipping the comparison
Some AI tools generate a recommendation without comparing alternatives. Reg BI requires the comparison. The pipeline must include it.
Conflict signal ignored
If the AI pipeline flags a conflict pattern (rep concentrating in high-commission products) and the firm doesn't act on it, the examiner finding is worse than if there was no AI pipeline at all. Build the response process before turning on the surveillance.
The audit trail
Every AI-supported recommendation should generate:
- Customer profile snapshot used
- Recommendation analyzed
- Alternatives considered
- Rationale supporting the recommendation
- Rep approval / sign-off
- Supervisor review record
- Any client disclosures provided
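That checklist maps naturally onto a structured record. A minimal sketch; the field names mirror the bullets above and are illustrative, not a regulatory schema:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class RecommendationRecord:
    """One audit-trail entry per AI-supported recommendation."""
    customer_profile_snapshot: dict
    recommendation: str
    alternatives_considered: list
    rationale: str
    rep_signoff: str
    supervisor_review: str
    disclosures_provided: list = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize for the books-and-records store."""
        return json.dumps(asdict(self), sort_keys=True)
```

Storing these as immutable, timestamped records is what turns "we used AI" from an examination liability into a documentation advantage.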
Deployment realities
For a broker-dealer deploying AI under Reg BI:
Phase 1 (60 days): Build the data layer for one recommendation type (most firms start with rollovers because the examination scrutiny is highest). Document the workflow. Update WSPs.
Phase 2 (next 60 days): Shadow mode — AI runs in parallel with current recommendation workflow. Reps see AI analysis but make recommendations through normal channels. Compare outcomes.
Phase 3 (next 90 days): AI-augmented full workflow. Reps make recommendations with AI-supplied analysis. Supervisor reviews include AI documentation. Examiner can review the trail.
Expect the WSP update + CCO conversation to be the slowest part. Build that into the timeline.
Where Prometheus comes in
We've shipped Reg BI AI workflows at broker-dealers ranging from independent BDs to large hybrid platforms. The work isn't glamorous — it's documentation discipline at scale — but it's the difference between a clean examination finding and an enforcement referral.
If your firm is wrestling with Reg BI documentation at scale, that's a 30-minute conversation worth having.
Frequently asked questions
Can AI make Reg BI recommendations?
No. Reg BI recommendations require a registered representative's judgment and the firm's reasonable-basis determination. AI surfaces analysis and alternatives; the rep makes the recommendation. The best-interest obligation stays human.
Where do examiners focus on Reg BI?
Rollover recommendations (401(k)-to-IRA documentation), mutual fund share class selection, account type recommendations (advisory vs. brokerage), and series-of-recommendations patterns are the most-cited areas in recent examinations.
Do we need to update WSPs to address AI?
Yes. Written supervisory procedures should specifically address AI use in recommendation workflows: what tools, what oversight, what records, who is responsible. Expect examiners to ask about this directly.
What's the audit trail for AI-supported recommendations?
Customer profile used, recommendation analyzed, alternatives considered, rationale, rep approval, supervisor review, client disclosures provided. More comprehensive than typical pre-AI documentation, which is why doing this right is a net improvement to firm position.
What's the right phased deployment?
60 days build + WSP update, 60 days shadow mode, 90 days AI-augmented full workflow. About 7 months from kickoff to full production. Start with rollover recommendations since the examination scrutiny is highest there.