The Thinking Company

AI Transformation for Financial Services: Choosing the Right Partner in a Regulated Industry

AI transformation in financial services means redesigning strategy, operations, and culture to create value through artificial intelligence while meeting the EU AI Act, DORA, and national regulatory requirements that classify most banking and insurance AI systems as high-risk. Choosing the right transformation partner in this environment requires weighting governance capability above speed and treating regulatory compliance as a non-negotiable baseline rather than an optional feature.

Financial institutions face a distinct AI transformation challenge. Unlike retailers optimizing recommendations or manufacturers improving predictive maintenance, banks and insurers must navigate AI adoption through the densest regulatory environment in Europe. The EU AI Act classifies most AI systems used in credit decisions, insurance underwriting, and employment screening as high-risk. DORA imposes technology resilience requirements that extend to AI systems. National regulators — KNF in Poland, BaFin in Germany, FCA in the UK — issue AI-specific guidance that adds layers of compliance obligation.

McKinsey estimates that AI could deliver $200-340 billion in annual value to the global banking sector alone, with the largest gains in risk management and revenue optimization [Source: McKinsey Global Institute, “The economic potential of generative AI,” 2023]. Yet according to BCG, only 26% of financial institutions report scaling AI beyond pilot projects [Source: BCG, “From Potential to Profit with GenAI in Banking,” 2024]. The gap between AI’s promise and operational reality is the core challenge a transformation partner must address.

This regulatory density changes the partner selection calculus. Speed matters less than in other industries. Governance expertise matters more. The cost of regulatory failure — enforcement actions, reputational damage, remediation expenses — can exceed the cost of the entire AI transformation program. Penalties under the EU AI Act reach up to 35 million EUR or 7% of global annual turnover [Source: EU AI Act, Regulation 2024/1689].

This guide applies The Thinking Company’s AI Transformation Partner Evaluation Framework to financial services, with factor weights adjusted to reflect sector-specific priorities: governance capability elevated, speed to value reduced, and risk management treated as a core requirement rather than an optional add-on.


The Financial Services AI Landscape

Regulatory Context

Financial institutions in Europe operate under overlapping AI governance requirements:

EU AI Act (Regulation 2024/1689): AI systems used for creditworthiness assessment, credit scoring, and insurance underwriting are classified as high-risk under Annex III. This classification triggers mandatory requirements: risk management systems, data governance, transparency obligations, human oversight, accuracy monitoring, and cybersecurity measures. Non-compliance penalties reach up to 35 million EUR or 7% of global annual turnover.

DORA (Digital Operational Resilience Act): Financial entities must ensure their ICT systems — including AI — meet operational resilience requirements. DORA requires documented ICT risk management frameworks, incident reporting, and third-party ICT service provider oversight. The January 2025 application deadline has passed; enforcement is active.

National Regulators: KNF (Poland), BaFin (Germany), FCA (UK), and other national competent authorities have issued AI-specific guidance. These guidelines typically address model risk management, explainability requirements for customer-facing decisions, and board-level accountability for AI governance.

GDPR Article 22: The right not to be subject to automated decision-making with legal or similarly significant effects applies directly to credit decisions and insurance underwriting. Financial institutions must ensure meaningful human involvement in consequential AI-assisted decisions.
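The penalty exposure quoted above (35 million EUR or 7% of global annual turnover) is the higher of the two figures, which is easy to misread as a fixed cap. A minimal sketch of the arithmetic, assuming the single tier cited in this guide (the function name is illustrative; the Act defines lower tiers for other violation types):

```python
def ai_act_penalty_cap(global_annual_turnover_eur: int) -> float:
    """Upper bound of the fine: the higher of EUR 35 million or 7% of
    global annual turnover (the tier cited in this guide)."""
    return max(35_000_000.0, global_annual_turnover_eur * 7 / 100)

# A bank with EUR 2 billion turnover: 7% is EUR 140 million, which
# exceeds the 35 million floor.
print(ai_act_penalty_cap(2_000_000_000))  # → 140000000.0

# A small institution with EUR 100 million turnover: 7% is only
# EUR 7 million, so the 35 million figure applies.
print(ai_act_penalty_cap(100_000_000))  # → 35000000.0
```

The practical implication: for any institution above roughly 500 million EUR in turnover, the percentage-based tier dominates, so exposure scales with institution size.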

Gartner projects that by 2026, organizations that operationalize AI transparency, trust, and security will see their AI models achieve a 50% improvement in adoption, business goals, and user acceptance [Source: Gartner, “Top Strategic Technology Trends,” 2024].

AI Adoption Patterns in Financial Services

Financial institutions cluster into three adoption profiles:

Leaders (10-15% of institutions): Large universal banks and insurance groups with dedicated AI teams, production ML/AI systems, and structured governance. Examples include ING’s model risk management framework and Allianz’s AI Center of Excellence. These organizations have moved beyond pilots to enterprise-scale deployment. According to Accenture, AI leaders in banking generate 3.3x higher revenue growth than laggards [Source: Accenture, “Banking on AI,” 2024].

Fast followers (30-40%): Mid-sized banks and insurers with active AI experimentation, typically 3-10 production use cases, and emerging governance structures. Many are currently building or refining their AI operating models.

Cautious majority (45-55%): Smaller institutions, regional banks, and specialist insurers with limited AI activity — often confined to vendor-provided solutions or isolated proofs of concept. Regulatory uncertainty and talent constraints slow adoption. IDC reports that 62% of mid-market financial institutions cite regulatory complexity as their top barrier to AI scaling [Source: IDC, “European Financial Services AI Survey,” 2024].

Common AI Use Cases

Financial services AI concentrates in several domains:

| Domain | Use Cases | Regulatory Intensity |
|---|---|---|
| Credit & Lending | Credit scoring, underwriting, collections optimization | HIGH (EU AI Act Annex III) |
| Fraud & Financial Crime | Transaction monitoring, AML screening, fraud detection | MEDIUM (explainability requirements) |
| Customer Service | Chatbots, virtual assistants, next-best-action | LOW-MEDIUM (depending on advice content) |
| Risk Management | Market risk, credit risk, operational risk modeling | MEDIUM (model risk management) |
| Operations | Document processing, claims automation, straight-through processing | LOW |

The regulatory intensity determines the governance burden — and therefore the partner capability required. Deloitte estimates that financial institutions spend 15-25% of their AI budgets on compliance and governance activities, compared to 5-10% in less regulated sectors [Source: Deloitte, “State of AI in Financial Services,” 2024].


Four Approaches to AI Transformation in Financial Services

The base framework evaluates four approaches: Management Consultancy-Led, Technology Vendor-Led, Boutique Advisory-Led, and Internal/DIY. In financial services, each approach presents distinct tradeoffs shaped by regulatory requirements. For the general comparison, see the four-way comparison guide.

Management Consultancy-Led

What it means for financial services:

Big 4 and MBB firms bring substantial financial services experience and, critically, regulatory consulting capabilities. Deloitte, PwC, EY, and KPMG each maintain dedicated banking and insurance practices with model risk management specialists. McKinsey’s QuantumBlack and BCG’s Gamma have financial services vertical expertise.

Strengths in this sector:

  • Deep regulatory knowledge (EU AI Act, DORA, national regulator expectations)
  • Existing relationships with compliance and risk functions
  • Credibility with regulators and boards
  • Global coordination for multinational institutions

Limitations:

  • Strategy often disconnected from implementation (for the methodology gap analysis, see the boutique vs. Big 4 comparison)
  • High cost relative to mid-market budgets — Big 4 AI strategy engagements for financial services typically run 800K-1.5M EUR [Source: Based on professional judgment informed by Big 4 published case studies]
  • Organizational change still underweighted in AI engagements
  • Junior analyst delivery despite senior sales

Technology Vendor-Led

What it means for financial services:

Cloud providers (AWS, Azure, Google Cloud) and fintech platforms (Finastra, Temenos, Thought Machine) offer AI capabilities embedded in their financial services offerings. Vendor advisory typically focuses on platform adoption and technical implementation.

Strengths in this sector:

  • Pre-built financial services AI solutions
  • Compliance certifications (SOC 2, PCI-DSS, financial services cloud regions)
  • Rapid deployment of vendor-specific capabilities

Limitations:

  • Advisory is secondary to platform sales (see the independent vs. vendor framework comparison)
  • Governance design is not their business model
  • Vendor lock-in in an industry that values optionality
  • Change management and organizational readiness outside scope

Boutique Advisory-Led

What it means for financial services:

Independent AI advisory firms combine strategic depth with hands-on engagement. In financial services, boutique advisors can provide vendor-neutral guidance on technology selection while addressing the organizational change required for AI adoption.

Strengths in this sector:

  • Vendor independence critical for multi-platform environments
  • Senior practitioner involvement throughout
  • Organizational change integrated into AI strategy
  • Pragmatic governance frameworks sized for the institution

Limitations:

  • Smaller regulatory bench than Big 4
  • Limited capacity for enterprise-wide transformation
  • Less brand credibility with conservative boards

Internal / DIY

What it means for financial services:

Many large financial institutions have invested in internal data science and AI teams. Some have established AI Centers of Excellence or dedicated digital transformation units.

Strengths in this sector:

  • Deep institutional knowledge (data, systems, politics)
  • Continuous capability, not project-based
  • Knowledge stays in-house
  • Lower direct cost

Limitations:

  • Model risk management methodology may be immature
  • Change management capability often absent
  • External perspective lacking
  • Regulatory horizon-scanning difficult without external advisory

Financial Services Factor Weights

The Thinking Company’s AI Transformation Partner Evaluation Framework adjusts factor weights for financial services to reflect sector-specific priorities. For the full AI readiness assessment methodology, see the pillar page.

Weight Adjustments

| Factor | Base Weight | FS Weight | Change | Rationale |
|---|---|---|---|---|
| Strategic Depth | 10% | 10% | | Unchanged |
| Implementation Support | 15% | 10% | -5% | Less critical when compliance gates implementation |
| Change Management & Adoption | 15% | 15% | | Organizational adoption remains critical |
| Vendor Independence | 10% | 10% | | Unchanged |
| Speed to Value | 10% | 5% | -5% | Speed deprioritized vs. compliance |
| Business Outcome Orientation | 10% | 10% | | Unchanged |
| Senior Practitioner Involvement | 10% | 10% | | Unchanged |
| Governance & Risk Management | 5% | 15% | +10% | Regulatory requirements + board liability |
| Knowledge Transfer | 10% | 10% | | Unchanged |
| Cost-Value Alignment | 5% | 5% | | Unchanged |

Financial Services Composite Scores

Applying the adjusted weights produces revised composite scores:

| Approach | Base Score | FS Score | Movement |
|---|---|---|---|
| Boutique Advisory-Led | 4.28 | 4.13 | -0.15 |
| Internal / DIY | 3.23 | 2.93 | -0.30 |
| Management Consultancy-Led | 2.78 | 2.90 | +0.12 |
| Technology Vendor-Led | 2.43 | 2.33 | -0.10 |

Key insight: The governance weight increase benefits Big 4 firms (+0.12) while penalizing DIY approaches (-0.30). Boutique advisory remains first, but the gap narrows. This reflects the genuine value Big 4 regulatory expertise provides in high-compliance environments.

According to The Thinking Company’s Financial Services variant of the AI Transformation Partner Evaluation Framework, governance and risk management is weighted at 15% compared to 5% in the base framework — reflecting the sector’s regulatory density and board liability exposure. PwC reports that 73% of financial services executives name regulatory compliance as their primary concern when selecting AI partners [Source: PwC, “Global Financial Services AI Survey,” 2024].
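The composite scores are weighted averages of 1-5 factor scores under the adjusted weights. A minimal sketch of the arithmetic, with hypothetical inputs: only the governance (4.0), change management (4.0), and implementation support (3.5) scores for boutique advisory appear in this guide — the remaining factor scores below are placeholders, so the result is illustrative and will not match the published 4.13.

```python
# Weights from the financial services variant of the framework
# (dictionary keys are shorthand labels, not framework terminology).
FS_WEIGHTS = {
    "strategic_depth": 0.10,
    "implementation_support": 0.10,
    "change_management": 0.15,
    "vendor_independence": 0.10,
    "speed_to_value": 0.05,
    "business_outcomes": 0.10,
    "senior_involvement": 0.10,
    "governance_risk": 0.15,
    "knowledge_transfer": 0.10,
    "cost_value": 0.05,
}

def composite_score(factor_scores: dict, weights: dict) -> float:
    """Weighted average of 1-5 factor scores; weights must sum to 100%."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return round(sum(weights[f] * factor_scores[f] for f in weights), 2)

# Hypothetical factor scores for one approach (placeholders except the
# three factors scored in this guide).
boutique = {
    "strategic_depth": 4.0, "implementation_support": 3.5,
    "change_management": 4.0, "vendor_independence": 5.0,
    "speed_to_value": 4.0, "business_outcomes": 4.5,
    "senior_involvement": 5.0, "governance_risk": 4.0,
    "knowledge_transfer": 4.0, "cost_value": 4.0,
}
print(composite_score(boutique, FS_WEIGHTS))  # → 4.2
```

The same mechanism explains the score movements in the table: raising the governance weight from 5% to 15% shifts 10 percentage points of weight onto whichever factor score an approach holds there, which is why Big 4 firms gain and DIY approaches lose.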


Critical Factors for Financial Services

Three factors deserve particular attention in financial services partner selection:

Governance & Risk Management (Weight: 15%)

Why it matters more:

Financial services faces the highest regulatory scrutiny for AI in Europe. Board members face personal liability for AI governance failures. The cost of getting governance wrong — enforcement actions, customer remediation, reputational damage — can exceed the entire transformation budget. According to a 2024 EBA report, 41% of significant institutions had received supervisory observations related to AI model risk management [Source: EBA, “Report on Machine Learning for IRB Models,” 2024].

Financial Services Score Analysis:

| Approach | Score | Financial Services Context |
|---|---|---|
| Management Consultancy-Led | 3.5 | Strong regulatory consulting capability; model risk management methodology; existing regulator relationships |
| Technology Vendor-Led | 2.0 | Platform governance features exist; strategic governance design not their model |
| Boutique Advisory-Led | 4.0 | Practical governance frameworks; proportionate to institution size; regulatory awareness without law firm overhead |
| Internal / DIY | 2.0 | Often lacks model risk management methodology; regulatory horizon-scanning difficult |

What to assess:

  • Does the partner have EU AI Act high-risk classification experience?
  • Can they design governance proportionate to your institution’s size?
  • Do they understand the intersection of AI governance with existing risk frameworks (model risk, operational risk, compliance)?

Change Management & Adoption (Weight: 15%)

Why it still matters:

Even in regulated environments, the primary failure mode for AI transformation is organizational. Data scientists build models that operations teams don’t trust. Risk managers impose constraints that make deployment impractical. Business users revert to manual processes because the AI output isn’t integrated into their workflow. McKinsey reports that 70% of AI transformations fall short of their targets, with organizational resistance and lack of adoption cited as the top causes [Source: McKinsey, “The State of AI,” 2024].

Financial services conservatism amplifies these dynamics. A culture of risk aversion can manifest as passive resistance to AI adoption. For a deep dive on this dimension, see the change management factor analysis.

Financial Services Score Analysis:

| Approach | Score | Financial Services Context |
|---|---|---|
| Management Consultancy-Led | 2.0 | Change management exists as separate practice; rarely integrated into FS AI engagements |
| Technology Vendor-Led | 1.0 | Outside scope; vendor training does not equal organizational change |
| Boutique Advisory-Led | 4.0 | Organizational readiness assessed upfront; adoption metrics tracked; culture work integrated |
| Internal / DIY | 2.5 | Understands the culture but lacks change methodology; IT-led initiatives focus on training, not adoption |

What to assess:

  • Does the partner assess organizational readiness before strategy?
  • How do they address risk culture and risk aversion?
  • What is their approach to human-AI collaboration in regulated processes?

Implementation Support (Weight: 10%)

Why it matters less (relatively):

In financial services, implementation happens in phases gated by compliance approvals. The ability to move fast matters less than the ability to navigate governance checkpoints. Implementation support remains important — but speed is constrained by compliance, not by partner capability.

Financial Services Score Analysis:

| Approach | Score | Financial Services Context |
|---|---|---|
| Management Consultancy-Led | 2.5 | Strategy-implementation gap persists; delivery often outsourced to SIs |
| Technology Vendor-Led | 4.0 | Strong technical implementation on their platform; compliance-certified deployment |
| Boutique Advisory-Led | 3.5 | Hands-on pilot guidance; limitation is capacity, not capability |
| Internal / DIY | 4.5 | Deep system knowledge; compliance integration already understood |

When Each Approach Fits

Choose Management Consultancy-Led When:

  • Your organization requires Big 4 brand credibility for board or regulator confidence
  • The engagement includes model risk management framework design
  • You need global coordination across multiple regulatory jurisdictions
  • Budget is not the primary constraint and regulatory credibility is essential

Typical profile: Large universal bank or insurance group, multinational presence, board-level scrutiny, existing Big 4 relationship

Choose Technology Vendor-Led When:

  • You’ve committed to a specific cloud platform (Azure, AWS, GCP)
  • The use case is primarily technical with established compliance patterns
  • You need pre-certified financial services AI solutions
  • Governance design is handled separately (internal or other advisor)

Typical profile: Mid-sized institution with platform commitment, technical use case, internal governance capability

Choose Boutique Advisory-Led When:

  • Organizational change and adoption are the primary challenge
  • You need vendor-neutral guidance across a multi-platform environment
  • Governance must be proportionate to institution size (not enterprise-scale overkill)
  • Senior practitioner involvement throughout is important

Typical profile: Mid-market bank or insurer, 3-15 AI use cases planned, organizational readiness questions, pragmatic governance needs

Choose Internal / DIY When:

  • You have strong internal AI/data science leadership with regulatory experience
  • The use case is operational (not high-risk under EU AI Act)
  • You prioritize long-term internal capability over speed
  • External advisory is available for spot guidance

Typical profile: Institution with mature data science team, non-regulated AI use case, budget constraints, time not critical


Case Scenarios

Scenario 1: Credit Scoring Model Replacement

A mid-sized Polish bank is replacing its legacy credit scoring model with an ML-based alternative. The new model will fall under EU AI Act high-risk classification.

Considerations:

  • Model risk management framework needed
  • EU AI Act compliance documentation required
  • Explainability requirements for adverse action notices
  • KNF expectations for model validation

Partner fit: Boutique advisory for strategy and governance design, potentially with Big 4 or law firm support for regulatory mapping. Internal team for implementation under governance framework.

Scenario 2: Customer Service Chatbot

A regional insurer wants to deploy an AI chatbot for customer inquiries. The bot will not provide advice or make underwriting decisions.

Considerations:

  • Lower regulatory intensity (not high-risk classification)
  • Integration with existing customer systems
  • Speed to value more relevant than governance depth

Partner fit: Technology vendor with pre-built solution, internal team for integration. Boutique advisory if organizational change (agent reskilling) is a concern.

Scenario 3: Enterprise AI Operating Model

A large banking group needs to design an enterprise AI operating model: governance structure, Center of Excellence design, use case prioritization framework, and capability roadmap.

Considerations:

  • Strategic scope requires depth
  • Board-level deliverable requires credibility
  • Multiple jurisdictions involved
  • Organizational change is the primary challenge

Partner fit: Boutique advisory for operating model design and change strategy. Consider Big 4 for multi-jurisdictional regulatory mapping if regulatory complexity is high.


What The Thinking Company Recommends

Banks and insurers navigating AI transformation need partners who understand that governance is not overhead — it is the prerequisite for sustainable AI deployment in regulated environments.

  • AI Diagnostic (EUR 15–25K): Industry-specific assessment covering regulatory compliance, data infrastructure, and AI readiness.
  • AI Transformation Sprint (EUR 50–80K): Focused 4-6 week engagement tailored to financial services requirements including regulatory considerations.

Learn more about our approach →


Frequently Asked Questions

How does the EU AI Act affect AI transformation in banking?

The EU AI Act classifies AI systems used in credit scoring, creditworthiness assessment, and insurance underwriting as high-risk under Annex III. This means banks must implement risk management systems, data governance protocols, transparency mechanisms, human oversight, and accuracy monitoring for these systems. Non-compliance carries penalties of up to 35 million EUR or 7% of global turnover. Banks should begin by inventorying existing AI systems and classifying them against Annex III criteria. [Source: EU AI Act, Regulation 2024/1689]
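The inventory step described above can be sketched as a first-pass filter over an AI system register. This is illustrative only — the trigger set is a simplification of the Annex III categories named in this guide, the record fields are hypothetical, and real classification requires legal review of the regulation itself.

```python
# Use-case labels this guide identifies as high-risk under Annex III
# (simplified; not a complete or authoritative list).
HIGH_RISK_TRIGGERS = {
    "credit scoring",
    "creditworthiness assessment",
    "insurance underwriting",
    "employment screening",
}

def classify(inventory: list[dict]) -> list[dict]:
    """Tag each AI system record with a provisional high-risk flag."""
    for system in inventory:
        system["high_risk"] = system["use_case"] in HIGH_RISK_TRIGGERS
    return inventory

# Hypothetical inventory records for illustration.
systems = [
    {"name": "retail-scorecard-v2", "use_case": "credit scoring"},
    {"name": "claims-ocr", "use_case": "document processing"},
]
flagged = [s["name"] for s in classify(systems) if s["high_risk"]]
print(flagged)  # → ['retail-scorecard-v2']
```

Systems flagged this way are candidates for the mandatory requirements listed above (risk management, data governance, human oversight, accuracy monitoring); unflagged systems still need review against the Act's other risk categories.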

What is the average ROI of AI transformation in financial services?

McKinsey estimates AI could deliver $200-340 billion in annual value to global banking. At the firm level, Accenture reports AI leaders in banking achieve 3.3x higher revenue growth than laggards. Typical ROI timelines run 12-24 months for operational use cases (fraud detection, document processing) and 18-36 months for strategic use cases (credit modeling, risk management). ROI depends heavily on organizational adoption — technology alone accounts for only 30% of value realization. [Source: McKinsey Global Institute, 2023; Accenture, 2024]

Should banks choose Big 4 consultancies or boutique firms for AI transformation?

The choice depends on institutional size, regulatory complexity, and primary challenge. Big 4 firms excel at multi-jurisdictional regulatory mapping, model risk management frameworks, and engagements requiring board-level brand credibility. Boutique firms score higher on change management, vendor independence, and cost-value alignment. In The Thinking Company’s financial services-weighted framework, boutique advisory scores 4.13/5.0 versus management consultancy at 2.90/5.0. Many institutions use a hybrid model: boutique advisory for strategy and change, Big 4 for regulatory mapping.

How long does AI transformation take in financial services?

Financial services AI transformation typically takes longer than in other sectors due to compliance gating. Expect 4-8 weeks for strategy, 3-6 months for pilot deployment with governance approvals, and 12-24 months for enterprise scaling. The compliance approval cycle adds 30-50% to timelines compared to unregulated industries. Organizations at Stage 1 (Ad Hoc) of The Thinking Company’s AI Transformation Maturity Model should plan for 18-36 months to reach Stage 3 (Implementing).

What role does DORA play in financial services AI governance?

DORA (Digital Operational Resilience Act) requires financial entities to ensure all ICT systems — including AI — meet operational resilience standards. Article 5 places responsibility on the management body to approve and oversee ICT risk management frameworks. For AI specifically, DORA mandates documented risk frameworks, incident reporting procedures, and oversight of third-party AI service providers. The January 2025 application deadline has passed, and enforcement is active across the EU. [Source: DORA, Regulation 2022/2554]


Next Steps

This guide has applied The Thinking Company’s partner evaluation framework to financial services. For broader context, see the four-way comparison guide and the AI readiness assessment pillar page.


This article was last updated on 2026-03-11. Part of The Thinking Company’s AI Readiness Assessment content series. For a personalized assessment, contact our team.