APPROACH

                    The approach to addressing the client’s challenge included:

                    • Building reusable data pipelines to migrate from the existing Hadoop-based system to GCP (a minimal sketch of one such ingestion step follows this list)
                    • Leveraging the GCP environment to build a cost-optimized ecosystem that serves both business users and the data-science community
                    • Merging demographic, customer, geolocation, credit-model, clickstream, and marketing-campaign data to create a single source of truth

                    • Using model management on GCP to track and maintain 150+ ML models
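
                    As an illustration only, the sketch below shows what one reusable ingestion step in such a pipeline can look like: Parquet files staged in Cloud Storage are loaded into BigQuery. The bucket, project, dataset, and table names are hypothetical and are not taken from the engagement.

# Minimal sketch of a reusable BigQuery load step, assuming data from the
# Hadoop system has already been staged as Parquet files in a Cloud Storage
# bucket. All names (bucket, project, dataset, table) are hypothetical.
from google.cloud import bigquery


def load_parquet_to_bigquery(gcs_uri: str, table_id: str) -> None:
    """Load staged Parquet files from Cloud Storage into a BigQuery table."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # Block until the load job finishes.


if __name__ == "__main__":
    # Hypothetical example: one source feed, staged under a dated prefix.
    load_parquet_to_bigquery(
        "gs://example-staging-bucket/clickstream/2024-01-01/*.parquet",
        "example-project.analytics.clickstream",
    )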

                    KEY BENEFITS

                    Our solution helped the client:

                    • Develop auto-scaling infrastructure on GCP to manage variable workloads, thereby reducing cost
                    • Use BigQuery and Dataproc in tandem, choosing the more cost-effective compute option on a case-by-case basis
                    • Leverage Kubeflow and Kubernetes for model management and for deploying model endpoints for downstream consumption (see the sketch after this list)
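
                    To make the model-management layer concrete, the sketch below shows the shape of a train-then-deploy pipeline written against the Kubeflow Pipelines SDK (kfp v2). The component bodies and the model name are hypothetical placeholders, not the client’s actual training or deployment logic.

# Minimal sketch of a Kubeflow pipeline, assuming the Kubeflow Pipelines
# SDK v2 (kfp). Components and names below are hypothetical placeholders.
from kfp import compiler, dsl


@dsl.component(base_image="python:3.10")
def train_model(model_name: str) -> str:
    # Placeholder training step; a real component would fit and persist a model.
    print(f"training {model_name}")
    return model_name


@dsl.component(base_image="python:3.10")
def deploy_endpoint(model_name: str) -> None:
    # Placeholder deployment step; a real component would create a serving
    # endpoint for downstream consumers.
    print(f"deploying endpoint for {model_name}")


@dsl.pipeline(name="train-and-deploy")
def train_and_deploy(model_name: str = "example-credit-model"):
    trained = train_model(model_name=model_name)
    deploy_endpoint(model_name=trained.output)


if __name__ == "__main__":
    # Compile to a pipeline spec that can be uploaded to a Kubeflow Pipelines
    # instance running on the cluster.
    compiler.Compiler().compile(train_and_deploy, "train_and_deploy.yaml")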

                    RESULTS

                    • Since its inception, the UDP has processed 250 TB of data weekly
                    • Processing time for computation-intensive jobs reduced by 70 percent
                    • Overall costs reduced by 30-35 percent
