Senior Data Engineer at Renmoney


Renmoney - We are a passionate team determined to challenge the status quo and make financial inclusion count for the millions of under-banked individuals and small business owners in Nigeria. We provide loans, savings, and fixed deposit solutions to our customers. Our vision is to be the most convenient lending company, delivering outstanding service experiences.

We are recruiting to fill the position below:

 

Job Title: Senior Data Engineer

Location: Ikoyi, Lagos
Employment Type: Full Time

Job Description

  • We are seeking a skilled Senior Data Engineer to join our team. The ideal candidate will have experience designing, building, and maintaining scalable, efficient data pipelines and data storage solutions.
  • In this role, you will develop, deploy, and maintain ETL pipelines, data models, and data governance policies to ensure data consistency, accuracy, and availability.

Key Responsibilities

  • Design and implement data processing and storage solutions using AWS services such as S3, Glue, Athena, Redshift, and EMR.
  • Develop and maintain ETL (Extract, Transform, Load) pipelines to ingest, process, and transform large volumes of data from various sources (see the illustrative sketch after this list).
  • Monitor and optimize data processing and storage performance to ensure data availability, reliability, and security.
  • Collaborate with cross-functional teams to understand their data requirements and provide solutions that meet their needs.
  • Stay up-to-date with emerging trends and technologies in the data engineering field and evaluate their potential impact on the organization's data strategy.
  • Build and manage data warehouses and data lakes that can support complex analytics and reporting requirements.
  • Develop and deploy data models, data dictionaries, and data governance policies to ensure data consistency and accuracy.
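
To make the ETL responsibility above concrete, here is a minimal PySpark sketch of the kind of pipeline step this role involves. The bucket paths, dataset, and column names (transaction_id, amount) are hypothetical examples, not details from the posting; the pattern shown (read raw CSV from S3, cleanse, write partitioned Parquet for querying via Athena or Redshift Spectrum) is only one common approach.

    # Illustrative only: hypothetical paths and column names, assuming a
    # daily batch of loan transactions landing in S3 as CSV.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("loan-transactions-etl").getOrCreate()

    # Extract: read the raw daily files from a landing bucket
    raw = spark.read.csv(
        "s3://example-landing-bucket/loans/",  # hypothetical path
        header=True,
        inferSchema=True,
    )

    # Transform: deduplicate, drop bad rows, normalise types, stamp ingestion date
    cleaned = (
        raw.dropDuplicates(["transaction_id"])            # hypothetical key column
           .filter(F.col("amount").isNotNull())
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .withColumn("ingest_date", F.current_date())
    )

    # Load: write partitioned Parquet to a curated zone queryable from Athena/Redshift
    (
        cleaned.write.mode("append")
               .partitionBy("ingest_date")
               .parquet("s3://example-curated-bucket/loans/")
    )

    spark.stop()

Partitioning by an ingestion date keeps downstream Athena scans cheap; in a Glue job the same logic would typically run through a GlueContext and the Glue Data Catalog rather than a bare SparkSession.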

Requirements
Must Have:

  • Experience building data pipelines for Financial Services or similar projects in the AWS ecosystem.
  • Minimum of 3 years of experience in data engineering or related roles.
  • Experience measuring data quality.
  • Experience with data modeling, data warehousing, and data governance.
  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration skills.
  • Experience with AWS services such as S3, Glue, Athena, Redshift, and EMR.
  • Expert knowledge of Python, PySpark, SQL, and code management (Git).
  • Strong ETL skills and experience with data management systems and SQL tuning.

Good to have:

  • AWS certifications such as AWS Certified Big Data - Specialty or AWS Certified Data Analytics - Specialty.
  • Experience with NoSQL databases such as MongoDB and Cassandra.
  • Knowledge of Data Security and Privacy.
  • Experience with containerization technologies such as Docker and Kubernetes.
  • Experience working with Airflow, Prefect, dbt, and other related tools.
  • Experience with stream processing technologies such as Kafka and Kinesis.

Benefits
This job is perfect for you if you:

  • Are creative and an out-of-the-box thinker
  • Have excellent execution skills and are passionate about achieving excellence
  • Enjoy analytical thinking and have problem-solving capabilities
  • Enjoy collaborating with others and building relationships

You will not enjoy this job if you:

  • Work best in structured, hierarchical settings
  • Require clear, pre-set deliverables and constant direction
  • Are used to working in/with a large team

What is in it for you:

  • You’ll work on solutions to complex, real-world challenges with tangible social and economic impact. You will receive competitive compensation and work with passionate teammates in a flat, performance-driven culture.

 

How to Apply
Interested and qualified candidates should:
Click here to apply