Data Engineer - GCP
We are currently recruiting for a Data Engineer with experience in a GCP environment to join a global media and analytics business in central London. The business operates a flexible working model, and in future you can expect to be in the office around two days a week.
Data Engineer Role Objectives
The Data Engineer will be expected to work with stakeholders in the business, as well as external suppliers and technology providers, to build out and maintain a cohesive strategy for data management. The successful candidate will work within the Global Platform team. You must be able to work effectively across all areas of data engineering and management, helping establish and build a data centre of excellence.
- Set and maintain a strategy for the deployment of an optimal data infrastructure, including data streams, integrations, transformations, databases, and data lakes
- Establish and maintain a data schema for the standardisation and aggregation of all relevant data streams
- Support application development and deployment of advanced statistical models with the required infrastructure, capabilities, and datasets
- Create and maintain processes to ensure that data is valid, up to date, reliable and of a high quality
- Ensure that the requisite security policies are maintained
- Design, develop and maintain a process for the identification and ingestion of new data streams
- Ensure that processes are optimised and automated to maximise efficiency
- Support the BI and MI capabilities
- Maintain a working knowledge of relevant technologies and tools to support data management
- Translate business requirements into technical solutions and documentation
- Build out solutions for new business requirements
- Act as the primary point of liaison with third-party developers and contractors on subjects that relate to data architecture and management
Data Engineer Skills and Experience
- Relevant degree (or equivalent) in Computer Science, Mathematics, Physics, or Engineering
- Relevant experience (3+ years) in data engineering or data operations roles
- Data architecture & infrastructure
- Machine Learning, AI, Algorithms
- Platform APIs; process automation
- Extensive experience setting up ETL architectures and dataflows
- Third-party integrations via APIs or streaming feeds
- Relational database concepts: normalisation, denormalisation, joins, etc.
- Understanding of database development and reporting dashboard design
- Security best practices
- Cloud computing, especially GCP, including Compute Engine
- Understanding of RESTful applications
- Data pipeline tools: Airflow, Stitch, Google Dataflow
- Infrastructure provisioning tools, such as Terraform with GCP
- CI/CD systems
If you have the desired skills and experience and would be interested in finding out more, please respond to this advert by following the link below and attaching a copy of your CV. If successful, we will be in touch to discuss the role in more detail.
Lorien Plc is acting as an Employment Agency in relation to this vacancy.