My banking client, based in Edinburgh, requires Data Engineers to support them on a permanent basis!
This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences.
You'll be simplifying the bank by developing innovative data-driven solutions, aspiring to be commercially successful through insight, and keeping our customers' and the bank's data safe and secure.
Participating actively in the data engineering community, you'll deliver opportunities to support the bank's strategic direction while building your network across the bank.
What you'll do
As a Data Engineer, you'll play a key role in delivering value for our customers by building data solutions. You'll be carrying out data engineering tasks to build a scalable data architecture, including performing data extractions, transforming data to make it usable by analysts and data scientists, and loading data into data platforms.
We'll also expect you to be:
Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development
Building automated data engineering pipelines through the removal of manual stages
Working closely with core technology and architecture teams in the bank to build data knowledge and data solutions
Developing a clear understanding of data platform cost levers to build cost-effective and strategic solutions
The skills you'll need
To be successful in this role, you'll need to be an entry-level programmer and Data Engineer with a qualification in Computer Science or Software Engineering. You'll need ETL experience, ideally with a solid grasp of StreamSets or Informatica BDM. Along with experience of data modelling and of building and maintaining data pipelines, you'll have strong coding experience in two or more of the following: SQL, Python, PySpark and Scala. You'll also need a good understanding of data usage and dependencies with wider teams and the end customer, as well as a proven track record in extracting value and features from large-scale data. Familiarity with MongoDB and AWS cloud would be desirable, as would Snowflake experience.
You'll also demonstrate:
Good critical thinking and proven problem-solving capabilities
Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
A good understanding of modern code development practices
We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation, gender reassignment, marriage and civil partnership, pregnancy or maternity, or age.