
How NLP is Helping a European Financial Institution Enhance Customer Experience - PowerPoint PPT Presentation



  1. How NLP is Helping a European Financial Institution Enhance Customer Experience. Tal Doron, Director, Technology Innovation

  2. Agenda: 1. Introduction; 2. Challenges; 3. Use Case; 4. Project Milestones; 5. What’s Next

  3. ABOUT ME: @taldoron | taldoron84 | tald@gigaspaces.com | Tal Doron, Director, Technology Innovation

  4. About GigaSpaces: We provide one of the leading in-memory computing platforms for real-time insight to action and extreme transactional processing. With GigaSpaces, enterprises can operationalize machine learning and transactional processing to gain real-time insights on their data and act upon them in the moment. 300+ direct customers, 50+ Fortune 500 companies and 500+ organizations, 5,000+ large installations in production (OEM), 25+ ISVs. InsightEdge is an in-memory real-time analytics platform for instant insights to action, analyzing data as it's born and enriching it with historical context for smarter, faster decisions. The In-Memory Computing Platform provides microsecond-scale transactional processing, data scalability, and powerful event-driven workflows.

  5. 74% want to be data-driven; only 23% are successful.

  6. How Can You Gain the Most Value from Your Data? Near real-time data is highly valuable if you act on it in time; historical + near real-time data is more valuable if you have the means to combine them. [Chart: value of a time-critical business decision vs. time, from preventive/predictive and actionable (real-time, seconds) through reactive (minutes, hours) to historical, traditional "batch" business intelligence (days, months).]

  7. The Velocity of Business (once upon a time): "A typical e-commerce website will experience 40% bounce if it loads in more than 3 seconds, including personalization offers" (ECOMMERCE). "A call center receives 450,000 calls/min across 200 phone numbers; each call needs to be routed in less than 60 milliseconds" (TELCO). "To prevent fraud, anomaly detection needs to happen against 500,000 txn/sec in less than 200 milliseconds" (FINANCIAL SERVICES).

  8. ABOUT THE CUSTOMER: This financial IT service provider serves the leading banks in Germany with core solutions and services. Business goals: enhance customer experience with quicker First Call Resolution; reduce Average Handle Time for optimized efficiency.

  9. BUSINESS CHALLENGES. Keeping up with empowered customers: customers are smarter and have more insight into competitive products and services, raising expectations to a new standard. An omnichannel experience: customers want a consistent experience across all channels and agents, demanding faster resolution times. Disjointed customer interactions: disparate data sources and systems led to inefficient juggling between screens and systems, poor data quality, and poor customer experience.

  10. TECHNICAL CHALLENGES. High performance: ingestion of millions of CRM cases and data from other repositories into a unified analytics platform. Millisecond latency: customers demand an immediate response time, requiring high-performance solutions that leverage ML models in real time. Continuous ML training: insights constantly need to adapt to changing conditions for the smartest insights.

  11. PROPOSED SOLUTION: If a live agent is needed during a call, the NLP-based solution automatically supplies the agent with articles and knowledge documents based on the conversation.

  12. [Screenshot: agent search UI. The open ticket (#54367, payment type International, support level Bronze, last contact date 20.12.18, case description "International payment to supplier declined") is shown next to a ranked list of similar cases with confidence scores: "Credit limit exceeded" (95.32%), "Authentication required" (93.05%), "Beneficiary account unknown" (86.16%), "Beneficiary account dormant" (77.98%), "Intermediary bank changes" (71.53%), referencing cases #56409, #33487, #180762, and #60975. Each match links to a case resolution such as "Check that credit limit is not exceeded" and an option to email instructions to the customer.]

  13. Training the model: time to results of ~50 ms, based on a model trained over 2M CRM records in ~27 minutes.
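The deck does not name the NLP technique behind the similarity model, so the following is only a minimal sketch, assuming a TF-IDF bag-of-words pipeline in Spark MLlib (tokenization, stop-word removal, hashed term frequencies, IDF, L2 normalization) fitted over the case descriptions. The input path, column names, and output location are hypothetical.

```scala
// Hypothetical sketch: training a TF-IDF similarity model over CRM case descriptions.
// The deck does not name the algorithm; TF-IDF + cosine similarity is assumed here.
import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.feature.{Tokenizer, StopWordsRemover, HashingTF, IDF, Normalizer}

object TrainSimilarityModel {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("train-similarity-model").getOrCreate()

    // Assumed input: ~2M CRM tickets with (ticketId, caseDescription), e.g. exported from the grid or DB.
    val tickets = spark.read.parquet("/data/crm_tickets")   // hypothetical path

    val pipeline = new Pipeline().setStages(Array(
      new Tokenizer().setInputCol("caseDescription").setOutputCol("tokens"),
      new StopWordsRemover().setInputCol("tokens").setOutputCol("filtered"),
      new HashingTF().setInputCol("filtered").setOutputCol("tf").setNumFeatures(1 << 18),
      new IDF().setInputCol("tf").setOutputCol("tfidf"),
      new Normalizer().setInputCol("tfidf").setOutputCol("features").setP(2.0) // unit vectors: cosine = dot product
    ))

    val model = pipeline.fit(tickets)   // the ~27-minute background training step

    // Persisted to disk here for simplicity; the deck stores the model in the data grid instead.
    model.write.overwrite().save("/models/ticket-similarity")

    spark.stop()
  }
}
```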

  14. General Architecture & Data Flow. [Diagram: Server 1 hosts the application web server, REST service, integration server service, platform, and message broker (Broker 1–3); Server 2 is a cluster partner used only for failover. Core components: Find Similarities, Model & Tickets API, Initial Load service, and the source DB.]

  15. General Architecture & Data Flow, step 1: Initial Load. [Diagram: the Initial Load service reads from the DB via Hibernate and populates the object store on Server 1.]

  16. General Architecture & Data Flow, step 2: training/building the model. The Model & Tickets API exposes: train, stopTrainModel, getTrainModelStatus, checkModelInSpace, destroyModel.
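A minimal Scala sketch of the model-lifecycle API surface named on this slide. Only the operation names come from the deck; the status type, signatures, and the REST transport around them are assumptions.

```scala
// Model-lifecycle operations from the slide, expressed as a service interface.
// The actual training submission and grid calls behind each method are not shown in the deck.
sealed trait TrainStatus
case object NotStarted extends TrainStatus
case object Running    extends TrainStatus
case object Completed  extends TrainStatus
case object Stopped    extends TrainStatus

trait ModelLifecycleService {
  def train(modelId: String): Unit                  // submit the Spark training job
  def stopTrainModel(modelId: String): Unit         // cancel a running training job
  def getTrainModelStatus(modelId: String): TrainStatus
  def checkModelInSpace(modelId: String): Boolean   // is a trained model persisted in the grid?
  def destroyModel(modelId: String): Unit           // remove the model from the grid
}
```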

  17. General Architecture & Data Flow, step 3: long-running Spark job. API: startModel, stopModel, checkModelIsRunning, getFindSimilartiesStatus.

  18. General Architecture & Data Flow, step 4: findSimilarities. Write a findSimilaritiesRequest object to the space using a task; the long-running Spark job takes the object, performs the find-similarities action, and sets the object status to processed = true.
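A hypothetical sketch of the handshake this slide describes: the web tier writes a request object into the space, and the long-running job picks it up, scores it, and marks it processed. The GridClient trait and the polling loop are stand-ins for the real data-grid task API, which the deck does not show.

```scala
// Request/response handshake via the space, as described on the slide.
case class FindSimilaritiesRequest(
  requestId: String,
  modelId: String,
  queryText: String,
  var processed: Boolean = false,
  var results: Seq[(Long, Double)] = Seq.empty   // (ticketId, similarity score)
)

// Hypothetical stand-in for the real grid/space API.
trait GridClient {
  def takeUnprocessedRequest(): Option[FindSimilaritiesRequest]
  def write(req: FindSimilaritiesRequest): Unit
}

class FindSimilaritiesWorker(grid: GridClient, score: String => Seq[(Long, Double)]) {
  // Long-running loop inside the Spark driver: poll the space, score, write the result back.
  def run(): Unit = while (true) {
    grid.takeUnprocessedRequest() match {
      case Some(req) =>
        req.results = score(req.queryText)   // run the query through the already-loaded model
        req.processed = true                 // "set the object status to processed = true"
        grid.write(req)
      case None => Thread.sleep(50)          // back off briefly when no request is waiting
    }
  }
}
```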

  19. General Architecture & Data Flow, step 4 (continued): findSimilarities example. For ticketId > 72018, gs.exec(modelId, “my search”) returns the following similar cases: 70534 (0.823432215), 70874 (0.726937532), 70110 (0.719002341), 70998 (0.528010191).
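Behind a call like gs.exec(modelId, “my search”), the scoring step presumably vectorizes the query with the same pipeline used at training time and ranks tickets by cosine similarity. The sketch below assumes the TF-IDF pipeline from the earlier training sketch and a DataFrame of pre-computed ticket feature vectors (columns ticketId and features); it is an illustration, not the deck's actual implementation.

```scala
// Rank stored tickets against a free-text query by cosine similarity (assumed scoring logic).
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.ml.PipelineModel
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.sql.functions.{col, udf}

object FindSimilarTickets {
  def topK(spark: SparkSession, model: PipelineModel, tickets: DataFrame,
           query: String, k: Int = 5): Array[(Long, Double)] = {
    import spark.implicits._

    // Vectorize the query with the same pipeline used at training time.
    val queryVec = model.transform(Seq(query).toDF("caseDescription"))
      .select("features").head.getAs[Vector]("features")
    val q = queryVec.toArray

    // Vectors are L2-normalized, so cosine similarity reduces to a dot product.
    val cosine = udf((v: Vector) => v.toArray.zip(q).map { case (a, b) => a * b }.sum)

    tickets.withColumn("score", cosine(col("features")))
      .orderBy(col("score").desc)
      .select($"ticketId".cast("long"), $"score")
      .as[(Long, Double)]
      .take(k)   // e.g. 70534 (0.82), 70874 (0.73), ... as on the slide
  }
}
```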

  20. General Architecture & Data Flow, step 5: Support Tickets (the data). Operations: incremental feed; delete.

  21. Unified Transactional & Analytical Processing for Operationalizing ML. [Diagram: various data sources and applications feed a unified real-time analytics, AI & transactional processing layer that delivers real-time insight to action and dashboards. A distributed in-memory multi-model store keeps hot data in RAM, warm data on SSD storage and persistent memory, and cold data in an S3-like batch layer. AnalyticsXtreme: no ETL, reduced complexity; built-in integration with external Hadoop/data lakes; fast access to historical data; automated life-cycle management.]

  22. RESULTS. Continuous ML training: an average of 27 minutes of background training for 2 million records. Real-time: 50 ms to search and find similar cases. Empower the agent: agents get an immediate response time, reducing mean time to resolution.

  23. Overcoming Challenges

  24. Step 1: Initial Load. Load 2 million records from a slow tier to a distributed in-memory data fabric (e.g. the multi-model store). [Diagram: unified real-time analytics, AI & transactional processing; database and batch layer feeding the real-time layer; in-memory multi-model store with hot data in RAM/storage-class memory and warm data on SSD storage.]
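The deck performs this load through Hibernate into the object store. As an assumed stand-in, the sketch below shows a partitioned Spark JDBC pull of the 2M tickets from a hypothetical relational source, with a Parquet sink as a placeholder for the in-memory store; the connection details, table, and bounds are made up for illustration.

```scala
// Initial load sketch: pull CRM tickets from the slow (relational) tier in parallel.
import org.apache.spark.sql.SparkSession

object InitialLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("initial-load").getOrCreate()

    val tickets = spark.read.format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/crm")   // hypothetical connection details
      .option("dbtable", "support_tickets")
      .option("user", sys.env.getOrElse("DB_USER", "crm"))
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .option("numPartitions", "16")        // parallelize the 2M-row pull
      .option("partitionColumn", "ticket_id")
      .option("lowerBound", "0")
      .option("upperBound", "2000000")
      .load()

    // In the actual deployment this is written into the distributed in-memory
    // multi-model store; a Parquet sink is used here only as a placeholder.
    tickets.write.mode("overwrite").parquet("/data/crm_tickets")

    spark.stop()
  }
}
```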

  25. Distributed Multi-Model Object Store. [Diagram: a client and DB in front of a dynamically scaling cluster; Node 1, Node 2, and Node 3 each host a mix of primary and backup partitions.]

  26. Step 2: Create Model and Save to… Submit a Spark job to read from the space and create an RDD, then create a model (or “customized model”) and save it to: 1. Spark – can lose the model; 2. Disk – too slow & no HA; 3. the distributed data grid. Challenge #1: this is not a built-in Spark MLlib algorithm, so a workaround was needed to persist it to the grid.
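One way the Challenge #1 workaround could look, as a sketch only: serialize the custom model by hand and store it in the grid as a plain object, instead of relying on MLlib's built-in save/load. The StoredModel class and GridModelStore trait are hypothetical stand-ins for the real grid API.

```scala
// Hand-rolled persistence of a custom (non-MLlib) model into the data grid.
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

case class StoredModel(modelId: String, payload: Array[Byte])

// Hypothetical stand-in for the real grid API.
trait GridModelStore {
  def write(model: StoredModel): Unit
  def readById(modelId: String): Option[StoredModel]
}

object ModelGridPersistence {
  // Serialize any java.io.Serializable model into a grid-friendly byte payload.
  def save(store: GridModelStore, modelId: String, model: Serializable): Unit = {
    val bytes = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(bytes)
    try out.writeObject(model) finally out.close()
    store.write(StoredModel(modelId, bytes.toByteArray))
  }

  // Load it back once, when the long-running Spark job starts.
  def load[T <: Serializable](store: GridModelStore, modelId: String): Option[T] =
    store.readById(modelId).map { stored =>
      val in = new ObjectInputStream(new ByteArrayInputStream(stored.payload))
      try in.readObject().asInstanceOf[T] finally in.close()
    }
}
```

Keeping the serialized model in the replicated grid avoids the two rejected options on the slide: it survives a Spark restart and is far faster to reload than disk.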

  27. Step 3: Request/Response via Message Broker. A long-running Spark job consumes the find-similarity request stream from Kafka, runs each request through the model to get a response (the model is loaded into Spark once), and writes the response back to Kafka. Challenge #2: the message broker adds too much latency.
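A sketch of the Kafka-based request/response loop this slide describes, using Spark Structured Streaming. Topic names, the broker address, and the placeholder scorer are assumptions; the round trip through the broker plus micro-batching is exactly the latency called out in Challenge #2.

```scala
// Long-running streaming job: Kafka requests in, model scoring, Kafka responses out.
// Requires the spark-sql-kafka connector on the classpath.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

object KafkaFindSimilarities {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("find-similarities-kafka").getOrCreate()

    // The trained model would be loaded once here and reused for every micro-batch;
    // a placeholder scorer stands in for it in this sketch.
    val score: String => String = query => s"""{"query":"$query","matches":[]}"""
    val scoreUdf = udf(score)

    val requests = spark.readStream.format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")        // hypothetical broker address
      .option("subscribe", "find-similarities-requests")        // hypothetical request topic
      .load()

    val responses = requests
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS query")
      .withColumn("value", scoreUdf(col("query")))
      .select("key", "value")

    // Each request pays the broker round trip plus micro-batch scheduling,
    // which is the latency problem named as Challenge #2.
    responses.writeStream.format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("topic", "find-similarities-responses")           // hypothetical response topic
      .option("checkpointLocation", "/tmp/checkpoints/find-similarities")
      .start()
      .awaitTermination()
  }
}
```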
