Job Type: Permanent

Position Description

As a small but expanding company, we are looking to build out our data architecture to meet the additional requirements of data science and other business stakeholders. This involves the development and rollout of a new cloud-native (GCP-based) data architecture and ecosystem to drive the future direction of the business – including new product and capability development. As a SQL developer in the data team, you will:

● Gather and analyse requirements

● Develop ETL routines to ingest, transform and store data.

● Deploy data scoring routines and make data available to customers through APIs and web-based portals

● Manage the steady-state operation of the data ecosystem, ensuring it meets required service levels, is cost-efficient and is well maintained (including provision of documentation)

The ideal candidate will bring the required technical capabilities, be able to learn and work independently, collaborate and communicate well with co-workers, and share our passion for advancing Company Watch’s mission to provide outstanding analytics to our customers. In return, Company Watch offers an excellent working environment within a small but highly effective team where leading-edge data analytics is at the heart of what we do – a real opportunity for personal and professional growth.

Key Responsibilities

● Build, track and maintain the flow of data within ETL (extract, transform, load) and analysis pipelines, ensuring successful processing and data validity

● Work with the software and information technology teams to specify, design, and implement the infrastructure for storing, searching and integrating new datasets

● Manage the data ecosystem, ensuring it meets required business service levels, is well maintained and is cost-effective

● Work with data scientists to identify optimal ways to prepare, store and navigate their datasets
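To give a flavour of the pipeline work described above, here is a minimal ETL sketch in Python. It uses the standard library's sqlite3 as a stand-in for a real warehouse such as BigQuery, and all table and column names (`companies_raw`, `companies_clean`, `score`) are hypothetical, not part of the role:

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform them, load into a clean table.
# An in-memory SQLite database stands in for the real data warehouse.

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw company records, normalise names, drop invalid rows, load the rest."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS companies_clean (name TEXT, score REAL)")
    # Extract: pull the raw records
    rows = cur.execute("SELECT name, score FROM companies_raw").fetchall()
    # Transform: trim and upper-case names, drop rows with a missing score
    clean = [(name.strip().upper(), score) for name, score in rows if score is not None]
    # Load: insert the validated rows and report how many made it through
    cur.executemany("INSERT INTO companies_clean VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE companies_raw (name TEXT, score REAL)")
    conn.executemany(
        "INSERT INTO companies_raw VALUES (?, ?)",
        [("  Acme Ltd ", 72.5), ("Globex", None), ("Initech", 55.0)],
    )
    print(run_etl(conn))  # → 2 (the row with a missing score is dropped)
```

In a production pipeline the transform step would typically live in SQL (e.g. as a dbt model) rather than in Python, but the extract → transform → load shape is the same.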



Requirements

● Experience writing SQL code (preferably using dbt or equivalent)

● Database/data warehouse: familiarity with BigQuery, MS SQL Server or Postgres

● Proficient programming skills in a scripting language – preferably Python

● Experience of data engineering best practices, e.g. CI/CD and version control

● Cloud experience with GCP or AWS (preferably GCP – object storage, serverless functions, etc.)

● Experience working in an Agile development environment

● Experience of data quality assessment and validation
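As an illustration of the data quality assessment mentioned above, here is a small hedged sketch in Python. The check names, fields and thresholds are illustrative only, not part of the role specification:

```python
from typing import Callable

# Simple data-quality validation: each named check is a predicate over a record,
# and validate() reports how many records fail each check.

def validate(rows: list[dict], checks: dict[str, Callable[[dict], bool]]) -> dict[str, int]:
    """Return a count of failing rows per named check."""
    return {name: sum(1 for row in rows if not ok(row)) for name, ok in checks.items()}

if __name__ == "__main__":
    rows = [
        {"company_id": "C1", "score": 72.5},
        {"company_id": "", "score": 55.0},     # missing identifier
        {"company_id": "C3", "score": 140.0},  # outside the expected 0-100 range
    ]
    checks = {
        "non_empty_id": lambda r: bool(r["company_id"]),
        "score_in_range": lambda r: 0 <= r["score"] <= 100,
    }
    print(validate(rows, checks))  # → {'non_empty_id': 1, 'score_in_range': 1}
```

Frameworks such as dbt express the same idea as declarative tests on models; the point here is only the shape of the task: named, countable checks that gate data before it reaches customers.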

Nice to have

● Deployment/use of GCP components including Cloud Run, Dataflow, Dataproc, Cloud Composer and Cloud Monitoring

● Experience in the Business or Credit Information industry

● Experience supporting/implementing data pipelines using Apache Airflow


Personal Attributes

● Excellent English communication (written and verbal)

● Works well independently and as part of a team

● Self-motivated

● Detail-oriented with strong organisational and analytical skills

● Strong problem solver

● Keen learner

If you are interested in this role, please send your CV and cover letter to