Client: Leading game developer
Tech stack: Google Cloud
Solution: MLOps
Service: AI + Machine Learning
Our client is a leading interactive entertainment company in the mobile gaming industry, with over 200 titles and more than 250 million monthly players worldwide. At this scale, data and AI are critical to optimising operations and enhancing gaming experiences. To help our client do this effectively, Datatonic developed a Command Line Interface that embeds MLOps best practices and simplifies training ML models on the cloud. This increased the efficiency of its Data Scientists and allowed them to focus on developing innovative ML use cases.
In gaming, data and AI play an important role in driving business value. This can be broken down into three main areas:
The client's team of over 150 Data Scientists uses ML to predict user churn, recommend bundles of in-app purchases, and predict customer lifetime value, among other key use cases. To do this effectively, they need the right platform and MLOps tooling to deploy Machine Learning use cases at scale.
Before working with Datatonic, model development and training were performed on local machines or JupyterHub Virtual Machines. Hardware limitations meant that training jobs could be tediously slow and that working with larger datasets was challenging, while training on truly massive datasets was out of the question.
“Our client had a strong level of existing maturity with ML but they had variance across the organisation within their skillsets, tech stack and processes for using it successfully. Their challenge was staying ahead.” – Jamie Curtis, Senior Business Development Manager
Datatonic worked closely with the Data Science team to develop a Command Line Interface (CLI) tool that lets users seamlessly train ML models on their local machines, on existing JupyterHub Virtual Machines, and now with the massive compute available through Google’s Vertex AI Pipelines. This included:
“Development of this CLI tool was one of many initiatives to create seamlessness with the evolving ways of working amongst Data Science teams, to accelerate and simplify model training. It’s a great example of increasing the productivity of data science teams and outputs.” – Jamie Curtis, Senior Business Development Manager
With this new tool, the Data Science team can move seamlessly from developing their ML code locally, or in JupyterHub, to training on enormous datasets with the powerful serverless compute available in Vertex AI, without needing to alter their code between local and cloud-based training.
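As an illustration of this pattern, the minimal sketch below shows how a CLI along these lines might dispatch an unchanged training script either to the local machine or to the cloud. It uses a Vertex AI custom training job via the google-cloud-aiplatform SDK as a stand-in for the pipeline-based setup described above; every command name, flag, region, and container image is an assumption made for the example, not a detail of the client's actual tool.

```python
"""Illustrative CLI: run the same training script locally or on Vertex AI.

All names, flags, and values here are assumptions made for this sketch;
they do not describe the client's actual tool or configuration.
"""
import argparse
import subprocess
import sys

from google.cloud import aiplatform


def run_locally(script: str) -> None:
    # Run the unchanged training script on the local machine or JupyterHub VM.
    subprocess.run([sys.executable, script], check=True)


def run_on_vertex(script: str, project: str, region: str, bucket: str, machine_type: str) -> None:
    # Package the same script and submit it as a Vertex AI custom training job.
    aiplatform.init(project=project, location=region, staging_bucket=bucket)
    job = aiplatform.CustomTrainingJob(
        display_name="cli-training-job",
        script_path=script,  # identical code to the local run
        # Example pre-built training image; swap for the framework/version you need.
        container_uri="us-docker.pkg.dev/vertex-ai/training/scikit-learn-cpu.0-23:latest",
        requirements=["pandas", "scikit-learn"],
    )
    job.run(machine_type=machine_type, replica_count=1)


def main() -> None:
    parser = argparse.ArgumentParser(description="Train an ML model locally or on Vertex AI.")
    parser.add_argument("script", help="Path to the training script")
    parser.add_argument("--target", choices=["local", "vertex"], default="local")
    parser.add_argument("--project", help="GCP project ID (vertex only)")
    parser.add_argument("--region", default="europe-west2", help="Vertex AI region (vertex only)")
    parser.add_argument("--bucket", help="GCS staging bucket (vertex only)")
    parser.add_argument("--machine-type", default="n1-standard-8")
    args = parser.parse_args()

    if args.target == "vertex":
        run_on_vertex(args.script, args.project, args.region, args.bucket, args.machine_type)
    else:
        run_locally(args.script)


if __name__ == "__main__":
    main()
```

In a setup like this, a Data Scientist could iterate with something like `python train_cli.py train.py --target local` and later scale up with `--target vertex --project my-project --bucket gs://my-staging-bucket`, without modifying train.py itself.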
This greatly reduces the manual effort required from the client's Data Scientists, allowing them to focus on building innovative ML models.