California Just Drew the Line on AI
After shaping national privacy norms with the CCPA and CPRA, the state has now finalized the country’s first comprehensive consumer-facing AI governance framework. The rules—known as Automated Decision-Making Technology (ADMT) regulations—set new transparency and accountability standards for any company using automation to make “significant decisions” about people.
While these rules were written with artificial intelligence in mind, their reach goes far beyond what most of us picture when we hear “AI.” The definition includes any system that replaces or substantially assists human decision-making—from algorithms and scoring models to fraud detection tools and risk engines.
In short, if your company uses automation to decide whether to hire, onboard, lend, or flag a transaction, California’s new regime has you in its sights.
What Exactly Is ADMT?
The term “automated decision-making technology” sounds futuristic, but the concept is practical: effectively, it’s any technology that uses computation and personal information to make—or meaningfully shape—decisions about individuals.
The California Privacy Protection Agency (CPPA), which finalized the rules in late 2025, lists examples such as:
- Screening and scoring job applicants or contractors
- Approving or denying loans or housing applications
- Determining access to education, healthcare, or financial products
- Evaluating transactions for fraud or suspicious activity
That scope means many fintech and crypto companies are already using ADMT today—whether they realize it or not.
Why Crypto and Fintech Firms Should Care
For digital financial asset businesses, this isn’t a “wait and see” moment.
Automated tools are everywhere in crypto compliance:
- KYC and AML systems that flag risky users or transactions
- Fraud-monitoring software scoring behavior across wallets
- Customer onboarding and verification workflows
- Trading and lending algorithms that influence access or pricing
These systems increasingly rely on AI models, decision trees, or data-driven scoring—exactly what ADMT targets.
The result: companies subject to California’s privacy law (for-profit entities collecting personal information from state residents) will soon need to meet new obligations around transparency, opt-outs, and human review.
What the Rules Require
The new framework combines privacy law principles with operational compliance expectations. Here’s what’s inside:
- Transparency and Notice
- Businesses must notify consumers before using ADMT to make a significant decision.
- The notice must explain what the system does, what information it uses, and how consumers can opt out or appeal.
- Think of it as a privacy notice crossed with an algorithmic accountability statement.
- Opt-Out and Appeal Rights
- Consumers can opt out of automated decisions and request human review.
- Appeals must be handled by a qualified human reviewer—not an automated process.
- Risk Assessments and Audits
- Before using ADMT, companies must conduct risk assessments documenting purpose, logic, data inputs, potential bias, and foreseeable harms.
- Ongoing assessments are required after material changes.
- Separate cybersecurity audits are phased in for larger firms starting in 2028.
- Vendor Oversight
- Businesses remain responsible for compliance even when relying on third-party AI or compliance software vendors.
- Contracts should require vendors to disclose model logic, data sources, and mitigation measures.
- Recordkeeping
- Companies must retain risk assessments and audit documentation for five years or the duration of the processing, whichever is longer.
For most organizations, that means privacy, compliance, HR, and technology teams will all need to coordinate.
The Deadlines
The timeline matters:
- Effective Date: October 1, 2025
- Compliance Deadline: January 1, 2027 for businesses already using ADMT
- Risk Assessment Submissions: Required beginning April 1, 2028 (with phased thresholds through 2030)
That may sound distant, but given the internal reviews, vendor coordination, and documentation required, early preparation is key.
Risks for Crypto Firms That Ignore It
The crypto industry already operates under a maze of overlapping regulations—from FinCEN to the SEC, CFTC, and various state money-transmitter laws. California’s new AI rulemaking adds yet another layer, and failing to comply won’t just invite scrutiny from Sacramento. It could ripple across every level of oversight.
- Regulatory overlap: Falling short of ADMT standards could raise flags with federal and state examiners already reviewing AML and consumer-protection controls.
- Operational complexity: The new requirements for human review and appeals may force companies to rethink existing workflows or redesign automated systems entirely.
- Reputational exposure: Using AI in opaque or biased ways can quickly undermine customer trust in an industry that already fights perception battles.
- Cost: Meeting new documentation, audit, and consent obligations will demand time and resources, particularly from smaller firms without deep compliance benches.
In short, ignoring California’s AI rules won’t make them go away—it’ll just make the reckoning more expensive when they arrive.
Opportunities Hidden in the Rulemaking
While most headlines focus on compliance burden, there’s strategic upside for early movers.
- Governance maturity: Building transparency into your AI systems improves examiner confidence and investor trust.
- Customer confidence: Showing how your tools make fair, explainable decisions can become a competitive differentiator.
- Regulatory alignment: ADMT compliance maps closely to existing AML and risk frameworks—use that overlap to streamline.
- Market leadership: California rules often set the tone for federal or multi-state adoption. Getting it right now means less scrambling later.
In other words: this is not just about risk avoidance. It’s about building the kind of AI governance that regulators—and customers—will expect everywhere.
What Crypto Businesses Should Do Now
- Inventory your automation. Map every tool or process that makes or influences customer decisions—from onboarding to monitoring to HR.
- Assess ADMT exposure. Identify where models use personal information and where outcomes affect consumer access, benefits, or treatment.
- Update policies. Draft internal policies explaining ADMT use, transparency, opt-out handling, and appeals.
- Engage vendors early. Ensure contracts include data-use disclosures, explainability obligations, and cooperation clauses.
- Train your teams. Compliance, legal, HR, and IT should understand how these new AI expectations intersect with existing AML and privacy requirements.
- Document everything. Keep records of risk assessments, review procedures, and consumer communications—these will be the first things an examiner asks for.
California Has Always Been the Trendsetter
From privacy to environmental standards, California’s rulemaking often becomes a template for other states across the U.S.
The state’s new AI rules are likely to inspire similar frameworks elsewhere—and possibly federal action.
For crypto and fintech companies, that means one thing: readiness now equals resilience later.
Bottom Line for Operating in the Golden State
California’s ADMT rules may feel abstract, but their implications are very real. They reshape how automation and AI intersect with compliance, customer rights, and governance.
For crypto and fintech companies, the message is clear:
Demonstrable transparency, accountability, and fairness in automated decision-making are no longer optional—they’re operational.
Bottom line: If your business touches customers in California, it’s time to start treating AI governance like AML—core, documented, and exam-ready.
Connect
Not sure where to start? BitAML can help you inventory your automation, assess exposure, and build an AI governance roadmap that meets both examiner and regulator expectations. Let’s get your systems compliant—and future-proof—before the clock runs out. Book a discovery call with BitAML