Cloud Architect (GCP)

  • Engineering
  • London, United Kingdom
  • Full-time

Description

As Google Cloud's premier partner in data and analytics, we provide world-class businesses with cutting-edge data solutions in the cloud.

We help clients push leading technology to its limits by combining our expertise in machine learning, data engineering, and analytics. With Google Cloud Platform as our foundation, we help businesses future-proof their solutions, deepen their understanding of consumers, increase competitive advantage, and unlock operational efficiencies.

Our team consists of experts in machine learning, data science, software engineering, mathematics, and design. We share a passion for data & analysis, operate at the cutting edge, and believe in a pragmatic approach to solving hard problems.

The Role

As Cloud Architect at Datatonic, you will shape the architecture for key projects, engaging with our customers and prospects to provide pre- and post-sales architectural advice and thought leadership for machine learning and analytics projects. You will also help build out the architectural side of our next-generation machine learning products.

This is an excellent opportunity for an experienced cloud technologies professional who would like to work as the resident subject matter expert in solution/cloud architecture within a team of experts in AI, machine learning, big data analytics, and data engineering. The role is customer-facing, working closely with business and technical influencers, and requires a background in Computer Science. The right candidate will have several years' experience with cloud platforms such as GCP or AWS, will have served as an SME in designing and migrating applications to the cloud, will have worked with complex cloud deployments, and will be adept at designing and implementing complex platforms.


As Cloud Architect at Datatonic, you will:

  • Work with the most innovative and scalable data processing and cloud technologies
  • Build innovative state-of-the-art solutions with our customers
  • Support our sales teams when engaging with new customers and projects, providing technical input to help shape and scope solutions to our clients' most challenging needs
  • Conduct architecture reviews for key customers, identifying and sharing recommendations with senior client stakeholders
  • Contribute to our internal knowledge base, helping us develop collateral and thought leadership
  • Work in an agile and dynamic environment alongside a small team of data scientists, machine learning experts, data analysts, and data engineers
  • Be a leader and mentor to other team members
  • Work closely with our tech partners: Google Cloud Platform, Tableau, Looker
  • Play a key role in shaping the architecture team at Datatonic

Requirements

  • BSc or MSc degree in Computer Science or a related technical field
  • 5+ years of experience building big data cloud architecture
  • The ability to take ownership from end-to-end, finding creative solutions
  • Demonstrated strong analytical and technical capabilities, with an innovative edge
  • Exceptional written and verbal communication skills, with great attention to detail
  • Able to present concepts in an authoritative and clear manner to customers through white-boarding, presentations, and proposals
  • Ability to develop and maintain relationships with key external stakeholders at various business levels
  • Experience in technical pre-sales, able to evangelise disruptive proposals
  • Expertise in cloud technologies and their design and usage across all areas (e.g. data, security, networking)
  • Knowledge and experience with container technologies
  • Programming experience, ideally in Python, Java and SQL
  • Experience building scalable and high-performance code
  • GCP Professional Cloud Architect or AWS Certified Solutions Architect certification

Bonus Points:

  • Experience with ETL tools, Hadoop-based technologies (e.g. Spark) and/or batch/streaming data pipelines (e.g. Beam, Flink)
  • Experience designing and building data lake and data warehouse solutions using technologies such as BigQuery, Azure Synapse, Redshift, Oracle, or Teradata
  • Experience designing and building analytical products using technologies such as Looker, Tableau, Data Studio, Power BI, or Qlik
  • Experience with Agile methodologies such as Scrum
  • Basic knowledge of, and ideally some experience with, data science topics such as machine learning, data mining, statistics, and visualisation
  • Contributions to open source projects

Benefits

  • 25 days holiday plus bank holidays
  • Pension scheme
  • Situated in the innovation hub of Canary Wharf
  • Laptop of your choice
  • Monthly social events and team offsites
  • Free fruit, cookies, tea/coffee throughout the week
  • Regular networking events, mentoring events and conferences
  • Exposure to experts from a number of industries
  • Freedom to explore the latest tools and technologies
  • Knowledge-sharing activities