Data Engineer job at DFCU Bank | Apply Now
Kampala, Uganda
About the Organisation
DFCU Bank is a leading financial institution in Uganda dedicated to transforming lives and businesses through innovative financial solutions that drive economic growth and financial inclusion. Established in 1964 as the Development Finance Company of Uganda, the bank has grown from a development finance institution into a full-fledged commercial bank, with a significant milestone being its acquisition of Crane Bank in 2017. Recognized for its strong financial performance, customer service excellence, and digital banking innovations, dfcu Bank has received multiple industry awards and remains a trusted partner for individuals, SMEs, and corporate clients.
The bank fosters a dynamic and inclusive work culture that emphasizes integrity, customer focus, innovation, teamwork, and excellence, offering employees professional growth, career advancement opportunities, and competitive benefits. With a strong presence across Uganda through an extensive network of branches, ATMs, and digital banking services, dfcu Bank continuously invests in technology to enhance accessibility and efficiency.
Committed to corporate social responsibility, the bank actively supports financial literacy programs, women's empowerment initiatives, environmental sustainability efforts, and youth entrepreneurship. Headquartered at 26 Kyadondo Road, Kampala, Uganda, dfcu Bank remains a key driver of financial empowerment and economic development in the country.
Are you looking for Information Technology jobs in Uganda in 2025? Then you might be interested in the Data Engineer job at DFCU Bank.
Full Time
Deadline: 29 Oct 2025
Job Title
Data Engineer job at DFCU Bank
DFCU Bank
Job Description
Job Title: Data Engineer
Organisation: DFCU Bank
Duty Station: Kampala, Uganda
Job Summary:
Reporting to the Head of Data & Insights, the role holder will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and infrastructure across global data platforms. The role ensures that data from various systems is efficiently ingested, transformed, stored, and made available for advanced analytics, reporting, and machine learning use cases in compliance with global data governance and privacy standards.
Duties, Roles and Responsibilities
Design, build, and maintain data pipelines to ingest data from structured and unstructured sources (internal and external).
Develop and optimize ETL/ELT processes to ensure reliability, scalability, and performance across large datasets.
Implement data warehousing and data lake architectures using cloud and on-prem technologies (e.g., Snowflake, Azure Synapse, BigQuery, AWS Redshift, Databricks, SSMS).
Create reusable data assets and frameworks for repeatable and standardized data integration.
Implement data validation, cleansing, and quality monitoring frameworks.
Integrate and support Master Data Management (MDM) and Metadata Management practices.
Partner with the Data Governance and Data Protection Officers to ensure compliance with data protection and privacy laws in Uganda and other global data protection laws.
Manage data lineage, cataloging, and access control using enterprise tools such as Azure Purview, Collibra, or Alation.
Build scalable data pipelines using tools such as Azure Data Factory, Apache Airflow, NiFi, or AWS Glue.
Develop real-time and batch data streaming solutions using Kafka, Event Hubs, or Kinesis.
Support API-based integrations and data sharing across systems and geographies.
Work closely with Data Scientists and Analysts to provision and prepare data for predictive and prescriptive modelling.
Collaborate with BI and reporting teams to ensure data consistency across dashboards and analytical layers.
Partner with cross-functional teams to define and implement data standards and reusable assets.
Research and implement best-in-class tools and frameworks for data engineering.
Lead or contribute to cloud modernization and data platform migration initiatives.
Ensure cost optimization and performance tuning of data workloads.
Stay updated on emerging technologies (AI-driven data management, Data Mesh, Data Fabric, GenAI-enhanced data tools).
Qualifications, Education and Competencies
Bachelor’s Degree in Computer Science, Software Engineering, Statistics, Mathematics, Data Science, Information Systems, or other quantitative fields.
Preferred: Master’s degree or equivalent experience in Data Engineering, Cloud Computing, or Analytics.
Certifications in one or more of the following:
Azure Data Engineer Associate / AWS Certified Data Analytics / Google Professional Data Engineer;
Databricks Certified Data Engineer;
Snowflake SnowPro Core / Advanced Architect.
Minimum 3–5 years’ experience in Data Engineering or Data Platform Development.
Proficiency in SQL, Python, PySpark, or Scala for data transformation.
Experience with cloud data platforms (Azure, AWS).
Strong understanding of data modelling, data warehousing, and ETL orchestration.
Hands-on experience with data versioning, CI/CD for data pipelines, and Infrastructure as Code (IaC) using Terraform or ARM templates.
Familiarity with data governance frameworks and data privacy principles.
Experience with modern architecture patterns such as Data Mesh or Data Fabric is a plus.
Excellent communication, collaboration, and problem-solving skills in cross-functional, multicultural environments.
How to Apply
If you believe you meet the requirements noted above, please submit your application via the APPLY button below.
Deadline: 29th October 2025
NB: Only short-listed candidates will be contacted.
Applications are managed via the AfriCareers Jobs Portal:
- Click the Apply button below.
- New users: select Create Profile and complete the Profile Creation Wizard.
- Existing users: log in and update your profile if needed.
- Go to the "Jobs" tab.
- Read the detailed job description, roles, and qualifications.
- Submit your application via the jobs portal.
- Track progress under the "My Applications" tab.
Important Note: Employers now hire directly on the AfriCareers Jobs Portal, so keep your profile updated so employers can easily view your CV and hire you instantly.