Veladzic, Arnel and Gomes Silva, Fabio (2025) Conversational AI Client for Bank Client Onboarding. Other thesis, OST Ostschweizer Fachhochschule.
HS 2025 2026-SA-EP-Veladzic-Gomes Silva-Conversational AI for Bank Client Onboarding.pdf - Supplemental Material
Abstract
Introduction:
Atfinity is a Swiss AI-powered no-code platform founded in 2016 that digitalizes and orchestrates client onboarding processes in banking. While these workflows are structured, Relationship Managers (RMs) currently handle the entire case creation manually, which is time-consuming. To improve efficiency, Atfinity plans to extend its platform with AI-assisted capabilities, enabling RMs to create and update onboarding cases automatically and retrieve structured information. This project develops a prototype where an RM can submit client details via a free-text prompt to a Large Language Model (LLM). The LLM extracts and maps the data to Atfinity’s internal data model and interacts with the Case Management System (CMS) through the Model Context Protocol (MCP).
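The data flow described above can be sketched end to end: a free-text prompt is turned by the LLM into structured JSON, which is then handed to the CMS as an MCP tool call. MCP frames tool invocations as JSON-RPC 2.0 `tools/call` requests; the tool name (`create_case`) and the field names below are illustrative assumptions, not Atfinity's actual API.

```python
import json

# Free-text input as an RM might type it.
prompt = "Open a private account for Jane Doe, total wealth 2 MCHF."

# Structured output the LLM extraction step might produce for the
# internal data model (field names are hypothetical).
extracted = {
    "account_type": "private",
    "name": "Jane Doe",
    "total_wealth": 2_000_000,
}

# An MCP tools/call request carrying the extracted data to the CMS.
mcp_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "create_case", "arguments": extracted},
}
print(json.dumps(mcp_request, indent=2))
```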
Problem:
Atfinity’s system uses a conditional field architecture, where the visibility of fields depends on prior inputs: for example, fields such as Total Wealth or Name appear only after the Account Type is specified. The underlying data model is a JSON structure containing definitions, dependencies and permitted values. Because the platform is highly configurable, banks can define their own fields and rules, so payloads vary widely and static mapping is infeasible. Additional complexity arises from the lack of established patterns for integrating AI-based extraction with MCP; limited guidance on architectures, data flow and interaction models necessitated exploratory development.
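A minimal sketch of such a conditional field model, assuming a simplified JSON structure of definitions, dependencies and permitted values (all names are illustrative, not Atfinity's schema):

```python
# Each field declares which prior inputs it depends on and which values
# of those inputs make it visible.
DATA_MODEL = {
    "account_type": {"depends_on": None, "allowed": ["private", "corporate"]},
    "total_wealth": {"depends_on": {"account_type": ["private", "corporate"]}},
    "name":         {"depends_on": {"account_type": ["private", "corporate"]}},
}

def visible_fields(case: dict) -> list[str]:
    """Return the fields currently visible given the inputs already in the case."""
    result = []
    for field, spec in DATA_MODEL.items():
        dep = spec.get("depends_on")
        if dep is None:
            result.append(field)  # unconditional field, always shown
        elif all(case.get(k) in allowed for k, allowed in dep.items()):
            result.append(field)  # all dependencies satisfied
    return result

# Before Account Type is set, only that field is visible:
print(visible_fields({}))  # ['account_type']
# Once it is set, the dependent fields appear:
print(visible_fields({"account_type": "private"}))
```

Because visibility changes as values arrive, a mapper cannot know the full field set up front, which is what makes static mapping infeasible and motivates the iterative and cache-based designs below.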
Result:
The project successfully extracted client information from free-text prompts and converted it into structured JSON compatible with Atfinity’s CMS. This enables automated onboarding, allowing RMs to submit information conversationally while the system handles extraction, mapping and case population via MCP. Several architectures were explored, of which three were selected and implemented, all sharing the same core architecture but differing in communication flow.
1 – Iterative LLM Mapping (v5) Adapts to Atfinity's conditional field architecture using iterative LLM calls. It achieved high accuracy in simpler cases (Precision 90.91%, Recall 100%, F1 95.24%) with 23–25s latency. Performance dropped in corporate onboardings (F1 80%, Recall 66.67%).
2 – Iterative Mapping with Preemptive Cache (v6) Builds a cache before case creation. It achieved 100% Recall but lower Precision (66.67–73.68%) and was the slowest of the three (32–44s).
3 – Preemptive Cache with Single-Pass Mapping (v7) Generates a cache once and maps fields without further LLM calls. It achieved the best speed (10–24s), but Recall decreased in complex cases (44.44%).
Summary: Prototype three provides the best speed–quality trade-off, while prototype one is the most reliable in terms of accuracy.
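The two ends of that trade-off can be contrasted in a short sketch, with a stubbed `llm_extract()` standing in for the real LLM/MCP round trips; everything here is a hypothetical illustration of the communication flows, not the thesis implementation.

```python
def llm_extract(prompt: str, fields: list[str]) -> dict:
    """Stub: pretend the LLM maps the prompt onto the requested fields."""
    known = {"account_type": "private", "name": "Jane Doe"}
    return {f: known[f] for f in fields if f in known}

def iterative_mapping(prompt, visible_fields, case=None):
    """Prototype-1 style (v5): call the LLM each round as newly visible
    fields appear, until no further values can be mapped. Accurate but
    slow, since every round is another LLM call."""
    case = dict(case or {})
    while True:
        pending = [f for f in visible_fields(case) if f not in case]
        extracted = llm_extract(prompt, pending)
        if not extracted:
            return case
        case.update(extracted)

def single_pass_mapping(prompt, cached_fields):
    """Prototype-3 style (v7): map against a precomputed field cache in a
    single LLM call. Fast, but any field missing from the cache is never
    extracted, which is one way Recall can drop in complex cases."""
    return llm_extract(prompt, cached_fields)
```

Running `iterative_mapping` with a visibility function that reveals `name` only after `account_type` is set recovers both values, while `single_pass_mapping` over a cache that predates the `name` field silently misses it, mirroring the Recall gap between prototypes one and three.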
| Item Type: | Thesis (Other) |
|---|---|
| Subjects: | Area of Application > Business oriented; Technologies > Programming Languages > Python; Technologies > Databases > mongoDB; Metatags > IFS (Institute for Software) |
| Divisions: | Bachelor of Science FHO in Informatik > Student Research Project |
| Depositing User: | OST Deposit User |
| Date Deposited: | 26 Feb 2026 09:02 |
| Last Modified: | 26 Feb 2026 09:02 |
| URI: | https://eprints.ost.ch/id/eprint/1361 |
