
InformINS Modernizes FNOL Intake with Conversational Agentic AI on Amazon Bedrock

Customer Challenge

FNOL (First Notice of Loss) is a critical entry point in the insurance claims lifecycle. The speed and accuracy of FNOL intake directly impact:

  • customer satisfaction
  • claim processing timelines
  • operational efficiency
  • data quality for downstream workflows

InformINS' existing FNOL workflow relied on traditional, form-based data entry. The company sought to modernize this process by introducing AI-driven automation while ensuring the platform could operate reliably at scale.

InformINS needed to validate whether AI-driven conversational intake could replace forms while preserving accuracy, compliance, and operational control.

Key Challenges

Rigid FNOL Experience
Static forms could not adapt dynamically to claimant responses.
Inefficient Data Collection
Users were forced through fixed question paths regardless of relevance.
Limited Customer Experience
The process lacked conversational guidance and real-time clarification.
No AI Foundation for Future Channels
The legacy approach was not extensible to voice or intelligent automation.
Limited Scalability
The system struggled to handle spikes in claim volume, concurrent user interactions, and real-time processing requirements.
Lack of Operational Visibility
The platform lacked centralized monitoring for conversation success rates, system latency, and failure or fallback scenarios.

QyrosCloud Solution

QyrosCloud designed and implemented a containerized, AI-powered FNOL chatbot on AWS, focused on conversational accuracy, flexibility, and future extensibility.

The solution leveraged Amazon Bedrock, LangChain agents, and a web-based chat interface, while maintaining tight control over prompts, escalation logic, and data submission.

1. Conversational FNOL Intake with Generative AI

  • Implemented a text-based AI chatbot that guides users through FNOL intake.
  • Dynamically adjusted questions based on prior responses.
  • Followed InformINS’ existing FNOL question flow for accuracy and compliance.

2. Agentic Orchestration with LangChain

  • Used LangChain Agents with Tools to manage conversation flow and decision logic.
  • Determined when sufficient FNOL data had been collected.
  • Controlled escalation paths when human assistance was required.
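
The agent's decision logic can be illustrated as a pair of functions of the kind that would be registered as LangChain tools. Field names, the failure threshold, and action labels are assumptions for the sketch:

```python
# Illustrative decision logic behind the LangChain agent: decide whether to
# keep asking questions, submit the FNOL, or escalate to a human. The required
# fields and threshold are assumptions, not the production configuration.

REQUIRED_FIELDS = {"policy_number", "loss_date", "loss_type", "description"}

def is_fnol_complete(answers: dict) -> bool:
    """Tool: report whether all required FNOL fields have been collected."""
    return REQUIRED_FIELDS <= answers.keys()

def decide_next_action(answers: dict, failed_turns: int, max_failures: int = 3) -> str:
    """Route the conversation: keep asking, submit, or escalate to a human."""
    if failed_turns >= max_failures:
        return "escalate"  # predefined call-center handoff
    if is_fnol_complete(answers):
        return "submit"    # hand off to the API submission tool
    return "ask"           # agent selects the next relevant question
```

Keeping completeness and escalation checks in deterministic code, rather than leaving them to the model, is what preserves operational control over the conversation.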

3. Amazon Bedrock–Powered Intelligence

  • Leveraged Claude 3 Sonnet via Amazon Bedrock for conversational reasoning.
  • Centralized system prompts stored securely in AWS Systems Manager Parameter Store, enabling prompt updates without code changes.
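
A minimal sketch of this pattern, assuming a hypothetical parameter name and injected AWS clients (in production these would be boto3 `ssm` and `bedrock-runtime` clients):

```python
# Sketch: load the system prompt from Parameter Store, then call Claude 3
# Sonnet through Bedrock. The parameter name is an assumption; injecting the
# clients keeps the prompt source swappable and the code testable.
import json

def load_system_prompt(ssm_client, name: str = "/fnol/chatbot/system-prompt") -> str:
    """Fetch the centrally managed prompt; updating it needs no redeploy."""
    resp = ssm_client.get_parameter(Name=name, WithDecryption=True)
    return resp["Parameter"]["Value"]

def ask_claude(bedrock_client, system_prompt: str, user_message: str) -> str:
    """Invoke Claude 3 Sonnet with the Anthropic Messages request format."""
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "system": system_prompt,
        "messages": [{"role": "user", "content": user_message}],
    })
    resp = bedrock_client.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0", body=body)
    payload = json.loads(resp["body"].read())
    return payload["content"][0]["text"]
```

Because the prompt lives in Parameter Store rather than in code, prompt tuning becomes an operational change instead of a release.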

4. API Submission & Escalation Handling

  • Submitted completed FNOL data to a backend Mock API (AWS Lambda) for validation.
  • Confirmed successful transmission of FNOL details to downstream systems.
  • Provided predefined escalation messaging and call-center handoff when needed.
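
The submission-and-fallback flow can be sketched as follows; the endpoint URL and escalation wording are placeholders, and the HTTP `post` callable is injected (in production, a call to the Lambda-backed mock API):

```python
# Hedged sketch of FNOL submission with escalation fallback. The URL and
# messages are illustrative; `post(url, payload)` returns an HTTP status code.

ESCALATION_MESSAGE = (
    "I'm unable to complete your claim online. "
    "Please call our claims center at the number on your policy documents."
)

def submit_fnol(answers: dict, post) -> str:
    """Submit collected FNOL data; confirm success or hand off to a human."""
    try:
        status = post("https://example.com/fnol/submit", answers)  # mock endpoint
    except Exception:
        return ESCALATION_MESSAGE  # network failure triggers the handoff path
    if status == 200:
        return "Your FNOL report has been submitted. A claims adjuster will follow up."
    return ESCALATION_MESSAGE
```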

5. Secure, Containerized AWS Deployment

  • Deployed the solution as a Dockerized application on Amazon EC2.
  • Delivered a web-based chat interface using Chainlit.
  • Ensured consistent deployment and simplified environment management.
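
A Dockerfile of roughly this shape would package the Chainlit app; file names and the port are assumptions:

```dockerfile
# Illustrative Dockerfile for the Chainlit chat UI; app.py and port 8000
# are assumptions, not the production configuration.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["chainlit", "run", "app.py", "--host", "0.0.0.0", "--port", "8000"]
```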

6. Proactive Application Monitoring

A centralized observability layer was implemented using Amazon CloudWatch.

Monitored KPIs

  • FNOL submission latency
  • AI response time
  • conversation completion rate
  • fallback/error rate
  • system throughput
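
A sketch of how these KPIs could be emitted as custom CloudWatch metrics; the namespace and metric names are assumptions mirroring the list above, and the client is injected for testability:

```python
# Illustrative publisher for the monitored KPIs as CloudWatch custom metrics.
# Namespace and metric names are assumptions; `cloudwatch` is a boto3 client
# in production.

def publish_fnol_metrics(cloudwatch, latency_ms: float, completed: bool) -> list:
    """Build and publish metric data; return the payload for inspection."""
    metric_data = [
        {"MetricName": "FNOLSubmissionLatency", "Value": latency_ms,
         "Unit": "Milliseconds"},
        {"MetricName": "ConversationCompleted", "Value": 1.0 if completed else 0.0,
         "Unit": "Count"},
    ]
    cloudwatch.put_metric_data(Namespace="FNOLChatbot", MetricData=metric_data)
    return metric_data
```

Emitting completion as a 0/1 Count metric lets CloudWatch compute the conversation completion rate as a simple average over any time window.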

7. Governance and Compliance Controls

QyrosCloud implemented governance mechanisms using Bedrock and AWS-native services.

Controls

  • structured prompt templates
  • guardrails for safe and compliant responses
  • audit logging of interactions
  • controlled data capture flows
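
The audit-logging control can be illustrated with a minimal structured log record; the field names and the choice to log message length rather than raw content are assumptions for the sketch:

```python
# Minimal sketch of per-turn interaction audit logging. Field names are
# illustrative; logging character counts instead of raw text is one way to
# keep audit trails while limiting captured personal data.
import json
import time

def audit_record(session_id: str, role: str, text: str) -> str:
    """Serialize one conversation turn as a JSON audit log line."""
    return json.dumps({
        "ts": int(time.time()),   # epoch seconds of the turn
        "session_id": session_id,
        "role": role,             # "user" or "assistant"
        "chars": len(text),       # size only, not raw content
    })
```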

Results & Business Impact

Partnering with QyrosCloud enabled InformINS to validate conversational AI as a viable intake channel. By replacing static FNOL forms with an AI-driven conversational intake system, InformINS significantly improved intake efficiency, scalability, and operational visibility while maintaining structured data integrity.

⏱ 60–75% reduction in FNOL completion time

Dynamic questioning eliminated irrelevant form fields and reduced claimant friction.

📉 40–55% reduction in manual intake review effort

Structured AI extraction reduced downstream correction and clarification cycles.

💬 100% dynamic question routing

The conversational engine adapts in real time based on user responses, eliminating fixed-path logic constraints.

📈 Improved data completeness and consistency

LLM-guided extraction ensured required fields were collected before submission, reducing incomplete FNOL submissions.

🔄 Instant API submission and validation

Completed FNOL reports are transmitted in near real-time, accelerating claims initiation.

⚡ Scalable intake without infrastructure overhead

Containerized deployment on AWS enables horizontal scaling as FNOL volume increases.

🔐 Centralized prompt control and governance

Secure prompt management through AWS Systems Manager allows iterative improvements without code redeployment.

Technology Stack

AWS Services

  • Amazon EC2
  • AWS Lambda
  • Amazon Bedrock
  • AWS Systems Manager Parameter Store

AI & Orchestration

  • LangChain Agents
  • Claude 3 Sonnet

Frontend & Runtime

  • Chainlit (Web Chat UI)
  • Docker

Integration

Mock FNOL API (Lambda-based)

About InformINS

InformINS provides advanced, scalable software and analytics solutions for the property & casualty insurance industry. By adopting AI-driven FNOL workflows, InformINS is improving customer experience while laying the groundwork for intelligent, automated claims processing.

Visit InformINS