[Remote] DataOps and Build Engineer

Remote Full-time
Note: This is a remote position open to candidates in the USA.

GENNTE Technologies is seeking an experienced DataOps & Build Engineer to lead the architecture and optimization of a next-generation data platform. The role involves driving technical direction, mentoring teams, and automating complex CI/CD pipelines to enhance enterprise-level decision-making.

Responsibilities

• Establish DataOps Framework: Define, document, and champion the organizational framework and guidelines for DataOps, including release management processes, environment promotion strategy, and data quality standards
• Best Practice Dissemination: Create and enforce standard operating procedures (SOPs) for data pipeline development, CI/CD, and testing across the engineering teams, ensuring consistency and adherence to architectural standards
• Data Pipeline Automation: Design and implement robust continuous integration and continuous delivery (CI/CD) pipelines for data code and infrastructure
• Workflow Orchestration Implementation: Configure, optimize, and manage the deployment of data workflows using orchestrators such as Dagster or Talend, focusing on automated testing and deployment steps
• Version Control & Repository Management: Enforce best practices for source code management (e.g., Gitflow), branching strategies, and repository organization across all data projects
• Infrastructure as Code (IaC): Work with Infrastructure teams to automate provisioning and management of data platform resources efficiently within AWS
• Resilience and Failure Recovery: Design and implement automated rollback and self-healing mechanisms within pipelines to recover quickly from transient failures
• Monitoring and Logging: Set up comprehensive monitoring, logging, and alerting using cloud-native or other tools to ensure visibility into pipeline performance and to identify and resolve issues quickly
• Security and Compliance: Ensure data security and compliance by implementing IAM policies, encryption, and other security measures in AWS, adhering to best practices for handling sensitive data
• Testing Frameworks: Implement automated testing strategies across the data lifecycle, including unit tests, integration tests, and data quality validation checks (e.g., column integrity, schema drift) to ensure data reliability before deployment; a brief sketch of this kind of check follows this list
• Resource and Cost Optimization: Implement automated policies and monitoring to track and control cloud resource consumption, ensuring that pipelines run efficiently and cost-effectively
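For illustration, the sketch below shows the kind of work the orchestration and testing responsibilities describe: a minimal Dagster asset paired with an automated data quality check that gates on column integrity and basic schema drift. The asset name, columns, and check logic are hypothetical placeholders chosen for the example, not details taken from this posting.

```python
import pandas as pd
from dagster import AssetCheckResult, Definitions, asset, asset_check


@asset
def orders() -> pd.DataFrame:
    # Hypothetical extract step; a real pipeline would read from a warehouse, API, or file drop.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 25.5, 7.25]})


@asset_check(asset=orders)
def orders_column_integrity(orders: pd.DataFrame) -> AssetCheckResult:
    # Data quality gate: required columns must be present (guards against schema drift)
    # and the key column must be free of nulls (column integrity).
    required = {"order_id", "amount"}
    missing = required - set(orders.columns)
    null_keys = int(orders["order_id"].isna().sum()) if "order_id" in orders.columns else -1
    return AssetCheckResult(
        passed=(not missing and null_keys == 0),
        metadata={
            "missing_columns": ", ".join(sorted(missing)) or "none",
            "null_order_ids": null_keys,
        },
    )


defs = Definitions(assets=[orders], asset_checks=[orders_column_integrity])
```

A CI job could run checks like this through Dagster's Python APIs or a pytest harness before promoting code between environments; the exact wiring would depend on the team's chosen pipeline tooling.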
Skills

• 8+ years of hands-on experience in Data Engineering, DevOps, or a dedicated DataOps role, with a strong focus on automation and operational excellence
• Proven experience implementing CI/CD practices specifically for data pipelines and data infrastructure
• Strong conceptual understanding of data warehousing, ETL/ELT methodologies, and cloud-native architecture
• Automation-First Mindset: A strong drive to automate repetitive tasks and eliminate manual intervention in the data lifecycle
• Collaboration: Excellent communication skills, capable of working effectively with Data Engineers, Data Scientists, and Infrastructure teams
• Insurance industry experience preferred but not mandatory

Company Overview

At GENNTE Technologies, we are dedicated to revolutionizing the recruitment and staffing industry with a focus on personalized, high-impact solutions. The company was founded in 2024, is headquartered in Austin, Texas, US, and has a workforce of 11-50 employees.
Apply Now
