
BI Data Engineer (Remote)
- Cape Town, Western Cape
- Permanent
- Full-time
Responsibilities:
- Manage, monitor, and optimize BI/data infrastructure costs on GCP.
- Lead and improve the end-to-end data pipeline process, from data ingestion and transformation to reporting systems, automating as much as possible.
- Introduce AI tools and methodologies across the data pipeline to improve efficiencies, data monitoring, data quality and overall business value delivered.
- Collaborate with peers and stakeholders to understand and translate business requirements into robust data solutions.
- Design, develop, and maintain data models, including machine learning models, ensuring data accuracy, reliability, availability, and evolution.
- Use reporting tools such as PowerBI to create and automate comprehensive data visualizations and reports.
- Manage Google Analytics and Google Tag Manager accounts to drive decisions for web-based platforms.
- Communicate actionable insights with proactive recommendations using statistical techniques.
- Collaborate with stakeholders to identify new opportunities for data-driven insights and develop strategies that meet business goals and drive more value for their clients.
- Create and maintain reporting templates and dashboards for internal and external audiences.
- Independently identify, troubleshoot, and resolve data-related issues, ensuring a seamless data flow and maintaining high data quality and accuracy standards.
- Take ownership of data-related decisions and provide guidance on future data-specific hiring needs and tooling.
Requirements:
- Bachelor's degree in Computer Science, Data Science, AI/Machine Learning, Information Systems, or a related field.
- Proven experience (5+ years) as a BI Developer, Data Engineer, AI Engineer or similar role, with hands-on experience across the entire data pipeline.
- Experience with cloud platforms like Google Cloud Platform (GCP), AWS, or Azure for deploying AI models and managing data infrastructure.
- Proficient in Microsoft PowerBI or a similar tool.
- Experience in developing, training, and deploying machine learning models would be beneficial.
- Experience handling and processing large datasets exceeding 8 terabytes in size.
- Coding skills in languages commonly used in AI & BI development, such as Python and Java.
- Experience in designing and implementing algorithms for predictive analytics, recommendation systems, or other AI applications.
- Analytical, with the ability to independently collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy.
- Demonstrated ability to design and implement ETL processes and reporting solutions.
- Prior experience influencing decision-making and providing guidance on data-related initiatives.
- Solid understanding of data warehousing concepts, data integration, and data quality best practices.
- Excellent problem-solving skills and ability to work independently to meet project deadlines.
- Exceptional communication skills to collaborate effectively with cross-functional teams and business stakeholders, translating complex concepts into simple business language.
- Business acumen, with the ability to translate business requirements into technical solutions.