The Power of Machine Learning: Aviva’s Success Story
This post is co-written with Dean Steel and Simon Gatie from Aviva.
With a presence in 16 countries and serving over 33 million customers, Aviva is a leading insurance company headquartered in London, UK. As one of the oldest financial services organizations, dating back to 1696, Aviva continues to innovate and protect what matters most to their customers: health, home, family, and financial future. To enhance their services, Aviva has embraced the power of machine learning (ML) across more than 70 use cases.
The Challenge: Deploying and Operating ML Models at Scale
Deploying and operating ML models at scale can be a daunting task for many organizations. According to Gartner, approximately 47% of ML projects never reach production. Aviva, which processes roughly 400,000 insurance claims annually with about £3 billion in settlement expenditure, recognized that automating claims handling with AI is key to providing a streamlined digital experience to their customers. Developing and deploying more ML models is essential to meeting growing claim volumes efficiently.
To showcase the platform’s capabilities, Aviva chose the Remedy use case as their pilot project. This use case takes a data-driven approach to categorizing car insurance claims as total loss or repair cases, combining 14 ML models with business rules and external data to produce a recommendation for each claim.
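As a rough illustration of how an ensemble of model scores might be blended with a business rule, the sketch below combines hypothetical model outputs with an assumed repair-cost threshold. The model names, thresholds, and rule are placeholders for illustration, not Aviva’s actual logic.

```python
# Minimal sketch: combine per-model scores with a business rule to produce a
# repair / total-loss recommendation. All names and thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class Recommendation:
    decision: str      # "total_loss" or "repair"
    confidence: float  # aggregate score the decision is based on
    rationale: str     # short explanation for the claims handler


def recommend(model_scores: dict[str, float], vehicle_market_value: float,
              estimated_repair_cost: float) -> Recommendation:
    """Blend ML ensemble scores with a simple (assumed) business rule."""
    # Average the probability of "total loss" across the model ensemble.
    avg_score = sum(model_scores.values()) / len(model_scores)

    # Business rule (illustrative): repairs costing more than ~70% of market
    # value push the claim towards total loss regardless of the ensemble.
    uneconomical = estimated_repair_cost > 0.7 * vehicle_market_value

    if avg_score > 0.6 or uneconomical:
        return Recommendation("total_loss", avg_score,
                              "High ensemble score or uneconomical repair")
    return Recommendation("repair", avg_score, "Repair likely economical")


if __name__ == "__main__":
    scores = {f"model_{i}": s for i, s in enumerate([0.42, 0.55, 0.61], start=1)}
    print(recommend(scores, vehicle_market_value=8_000, estimated_repair_cost=6_500))
```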
Solution Overview of the MLOps Platform
Aviva’s journey to building a robust MLOps platform began with the AWS Enterprise MLOps Framework, integrated with Amazon SageMaker. This platform streamlined model development, deployment, and monitoring, providing a standardized approach to the ML lifecycle. Aviva’s adoption of this framework enabled them to replicate processes and deploy multiple ML use cases efficiently.
Building Reusable ML Pipelines
The processing and training code for the Remedy use case was developed in SageMaker Studio, facilitating collaborative work and rapid experimentation. The CI/CD pipeline constructed in SageMaker handled data processing, model training, and model evaluation, culminating in the registration of models in the SageMaker Model Registry.
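The following is a minimal sketch of what such a pipeline can look like with the SageMaker Python SDK, chaining a processing step, a training step, and a model registration step. The script name, XGBoost container, instance types, and model package group are illustrative assumptions, not Aviva’s actual configuration.

```python
# Sketch of a SageMaker Pipeline: process data, train a model, register it.
# Role ARN, script names, and the model package group are placeholders.

import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput
from sagemaker.processing import ProcessingOutput
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.workflow.pipeline import Pipeline
from sagemaker.workflow.step_collections import RegisterModel
from sagemaker.workflow.steps import ProcessingStep, TrainingStep

session = sagemaker.Session()
role = "arn:aws:iam::111122223333:role/SageMakerExecutionRole"  # placeholder

# 1. Data processing step running a (hypothetical) preprocessing script.
processor = SKLearnProcessor(framework_version="1.2-1", role=role,
                             instance_type="ml.m5.xlarge", instance_count=1)
step_process = ProcessingStep(
    name="PreprocessClaims",
    processor=processor,
    code="preprocess.py",  # assumed script developed in SageMaker Studio
    outputs=[ProcessingOutput(output_name="train",
                              source="/opt/ml/processing/train")],
)

# 2. Training step using a built-in algorithm as a stand-in.
estimator = Estimator(
    image_uri=image_uris.retrieve("xgboost", session.boto_region_name,
                                  version="1.7-1"),
    role=role, instance_count=1, instance_type="ml.m5.xlarge",
    output_path=f"s3://{session.default_bucket()}/remedy/models",
)
step_train = TrainingStep(
    name="TrainClaimModel",
    estimator=estimator,
    inputs={"train": TrainingInput(
        s3_data=step_process.properties.ProcessingOutputConfig
                            .Outputs["train"].S3Output.S3Uri)},
)

# 3. Register the trained model in the SageMaker Model Registry.
step_register = RegisterModel(
    name="RegisterClaimModel",
    estimator=estimator,
    model_data=step_train.properties.ModelArtifacts.S3ModelArtifacts,
    content_types=["text/csv"], response_types=["text/csv"],
    inference_instances=["ml.m5.large"], transform_instances=["ml.m5.large"],
    model_package_group_name="remedy-claim-models",  # placeholder group name
)

pipeline = Pipeline(name="RemedyTrainingPipeline",
                    steps=[step_process, step_train, step_register],
                    sagemaker_session=session)
# pipeline.upsert(role_arn=role); pipeline.start()
```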
Serverless Workflow for Orchestrating ML Model Inference
Integrating the ML model inference with Aviva’s internal systems involved a serverless workflow, built on AWS Step Functions, that orchestrates model predictions and combines them with business logic to generate recommendations for claims handlers. This workflow ensures efficient decision-making based on model predictions and external data.
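As a sketch of one stage in such a workflow, the handler below invokes a deployed model endpoint and attaches a rule-based recommendation to the result. The endpoint name, response format, and threshold are hypothetical, not Aviva’s actual implementation.

```python
# Minimal sketch of a Lambda-style task a Step Functions workflow could call:
# score a claim against a SageMaker endpoint, then apply business logic.

import json

import boto3

runtime = boto3.client("sagemaker-runtime")

ENDPOINT_NAME = "remedy-claim-classifier"  # placeholder endpoint name


def lambda_handler(event, context):
    """Invoke the model, then combine the score with a simple business rule."""
    claim = event["claim"]  # claim features passed in by the workflow

    response = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=json.dumps(claim["features"]),
    )
    # Assumed response shape: {"total_loss_probability": <float>}
    score = float(json.loads(response["Body"].read())["total_loss_probability"])

    # Business logic (illustrative): high-confidence total-loss predictions are
    # surfaced directly to the claims handler; everything else stays as repair.
    recommendation = "total_loss" if score > 0.6 else "repair"

    return {
        "claim_id": claim["claim_id"],
        "score": score,
        "recommendation": recommendation,
    }
```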
Monitoring ML Model Decisions to Elevate User Confidence
Real-time monitoring of ML model decisions is crucial for oversight and continuous improvement of the system. Aviva uses Snowflake to consolidate the data emitted by the Step Functions workflow, giving business analysts the insights they need to build dashboards and reports that enhance decision-making.
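A minimal sketch of persisting workflow decisions to Snowflake for downstream reporting is shown below. The connection details, table, and columns are placeholders, and in practice credentials would come from a secrets store rather than being hard-coded.

```python
# Sketch: write a model decision emitted by the workflow into Snowflake so
# analysts can build dashboards over it. All identifiers are placeholders.

import snowflake.connector


def write_decision(claim_id: str, recommendation: str, score: float) -> None:
    conn = snowflake.connector.connect(
        account="my_account",    # placeholder account identifier
        user="MONITORING_SVC",   # placeholder service user
        password="***",          # placeholder; fetch from a secrets manager
        warehouse="ANALYTICS_WH",
        database="CLAIMS",
        schema="ML_MONITORING",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO model_decisions (claim_id, recommendation, score) "
                "VALUES (%s, %s, %s)",
                (claim_id, recommendation, score),
            )
    finally:
        conn.close()
```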
Security
To protect personally identifiable information (PII) and secure intellectual property, Aviva implemented robust security measures across the platform. Networking restrictions, data encryption, and access control mechanisms safeguard customer data and intellectual assets.
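The sketch below shows how controls of this kind can be applied to a SageMaker training job via the Python SDK: private VPC networking, KMS encryption of volumes and artifacts, and network isolation. The subnet, security group, and KMS key identifiers are placeholders.

```python
# Sketch: a training job locked down with VPC networking, KMS encryption, and
# network isolation. All resource identifiers below are placeholders.

import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::111122223333:role/SageMakerExecutionRole"  # placeholder

estimator = Estimator(
    image_uri=image_uris.retrieve("xgboost", session.boto_region_name,
                                  version="1.7-1"),
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    # Run inside a private VPC so training has no public internet exposure.
    subnets=["subnet-0123456789abcdef0"],
    security_group_ids=["sg-0123456789abcdef0"],
    # Encrypt training volumes and model artifacts with a customer-managed key.
    volume_kms_key="arn:aws:kms:eu-west-1:111122223333:key/example-key-id",
    output_kms_key="arn:aws:kms:eu-west-1:111122223333:key/example-key-id",
    # Block outbound network calls from the training container and encrypt
    # traffic between containers in distributed training.
    enable_network_isolation=True,
    encrypt_inter_container_traffic=True,
)
```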
Conclusion
Aviva’s partnership with AWS and adoption of the AWS Enterprise MLOps Framework have revolutionized their ML operations, enabling them to deploy ML use cases efficiently and reduce infrastructure costs significantly. The success of the platform showcases the importance of integrating DevOps best practices into the ML lifecycle for seamless operations and innovation.
For more information on the AWS Enterprise MLOps Framework and Amazon SageMaker, visit the GitHub repository and accelerate your organization’s MLOps journey today.
About the Authors
Dean Steel, Simon Gatie, Gabriel Rodriguez, Marco Geiger, and Andrew Odendaal are passionate professionals in the field of AI, ML, and data science, driving innovation and success in their respective roles.