Taking Gaming to the Next Level with MLOps


Leading game developer

Tech stack

Google Cloud
AI + Machine Learning

Our client is a leading interactive entertainment company in the mobile gaming industry, with over 200 titles and more than 250 million monthly players worldwide. At this scale, data + AI are critical to optimising operations and enhancing gaming experiences. To help our client do this effectively, Datatonic developed a Command Line Interface with MLOps best practices to simplify training ML models on the cloud. This increased the efficiency of its Data Scientists and allowed them to focus on developing innovative ML use cases.

Our impact

  • Enhanced efficiency by developing a Command Line Interface (CLI) tool that accelerates ML model training while minimising the learning curve for Data Scientists
  • Upskilled + enabled 150+ Data Scientists to use Google Cloud seamlessly and scale AI + ML models across large datasets with MLOps
  • Implemented experiment tracking out-of-the-box, for easier collaboration among Data Scientists and greater reproducibility of ML models and their results


The challenge

In gaming, data and AI play an important role in driving business value. This can be broken down into three main areas:

  1. Augmenting player experience
  2. Improving the game development process
  3. Growing the user base + reducing churn

The large team of over 150 Data Scientists uses ML to predict user churn, recommend bundles of in-app purchases, and predict the lifetime value of customers, among other key use cases. To do this effectively, they need the right platform and MLOps tooling to deploy Machine Learning use cases at scale. 

Before working with Datatonic, model development and training were performed on local machines or JupyterHub Virtual Machines. Hardware limitations made training jobs tediously slow and larger datasets challenging to work with, while training on truly massive datasets was out of the question.

“Our client had a strong level of existing maturity with ML but they had variance across the organisation within their skillsets, tech stack and processes for using it successfully. Their challenge was staying ahead.” – Jamie Curtis, Senior Business Development Manager


Our solution

Datatonic worked closely with the Data Science team to develop a Command Line Interface (CLI) tool so that users can seamlessly train ML models on their local machine, on existing JupyterHub Virtual Machines, and now with the massive compute available in Google’s Vertex AI Pipelines. This included:

  • Creating templated ML codebases, giving Data Scientists a strong foundation to build upon, with best practices already established.
  • Implementing out-of-the-box experiment tracking, making it much easier for the Data Science team to collaborate on ML use cases. 
  • Abstracting the use of containers, reducing the barrier to entry for Data Science teams while maintaining portability and cloud-native best practices.
  • Automating pre-commit checks, unit testing, and end-to-end testing to ensure high code quality.
  • Using CI/CD pipelines to automate the testing, building, and publishing of new versions of the tool to Artifactory, allowing for rapid iteration and improvement.
  • Auto-generating Markdown documentation, as well as a thorough user guide to get Data Scientists up to speed quickly with the new tool.
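The out-of-the-box experiment tracking mentioned above can be pictured as a thin wrapper around each training run that records its parameters and results automatically. The sketch below is a minimal illustration using hypothetical names (`track_experiment`, `EXPERIMENT_LOG`, an in-memory store), not the client's actual implementation, which would log to a managed backend such as Vertex AI Experiments.

```python
import functools
import time

# Hypothetical in-memory store; a real tool would log to a managed
# experiment-tracking backend instead.
EXPERIMENT_LOG = []

def track_experiment(func):
    """Record the parameters, metrics, and timing of each training run."""
    @functools.wraps(func)
    def wrapper(**params):
        start = time.time()
        metrics = func(**params)
        EXPERIMENT_LOG.append({
            "run": func.__name__,
            "params": params,
            "metrics": metrics,
            "seconds": round(time.time() - start, 3),
        })
        return metrics
    return wrapper

@track_experiment
def train(learning_rate=0.01, epochs=3):
    # Stand-in for a real training loop; returns toy metrics.
    return {"loss": 1.0 / (learning_rate * epochs * 100)}
```

Because every run is logged with its full parameter set, any Data Scientist on the team can reproduce a colleague's result by replaying the recorded parameters.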

“Development of this CLI tool was one of many initiatives to create seamlessness with the evolving ways of working amongst Data Science teams, to accelerate and simplify model training. It’s a great example of increasing the productivity of data science teams and outputs.” – Jamie Curtis, Senior Business Development Manager

With this new tool, the Data Science team can move seamlessly from developing their ML code locally, or in JupyterHub, to training on enormous datasets with the serverless compute resources available in Vertex AI, without needing to alter their code between local and cloud-based training.
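This local-to-cloud journey can be sketched as a CLI that dispatches the same training code to different backends based on a single flag. Everything here is hypothetical (the `mltrain` name, the backend functions, the flags) and is not the real tool's interface; in the actual CLI, the cloud backend would package the code into a container and submit it to Vertex AI.

```python
import argparse

def train_locally(config_path: str) -> str:
    # Hypothetical local backend: runs the training loop in-process.
    return f"local run with {config_path}"

def train_on_vertex(config_path: str) -> str:
    # Hypothetical cloud backend: in a real tool this would containerise
    # the code and submit a Vertex AI training job.
    return f"vertex run with {config_path}"

BACKENDS = {"local": train_locally, "vertex": train_on_vertex}

def main(argv=None) -> str:
    parser = argparse.ArgumentParser(prog="mltrain")  # hypothetical CLI name
    parser.add_argument("--target", choices=BACKENDS, default="local",
                        help="Where to run training; the model code is unchanged.")
    parser.add_argument("--config", default="train.yaml",
                        help="Training configuration shared by all backends.")
    args = parser.parse_args(argv)
    return BACKENDS[args.target](args.config)
```

The key design point is that only the `--target` flag changes between a laptop experiment and a cloud-scale run; the model code and configuration stay identical.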

This greatly reduces the manual effort required of the client's Data Scientists, allowing them to focus on developing innovative ML models.