Career Center

Sr Staff Data Engineer

Location: Remote
Posted On: 08/03/2022
Requirement Code: 59932
Requirement Detail

Required:

The Staff Data Engineer will lead data teams in understanding and implementing the enterprise data strategy, leveraging an understanding of complex data management technologies. This role is responsible for maintaining data quality through data governance, metadata management, and master data management, while ensuring data integrity for data consumers ranging from data analysts to data scientists. The role works with cyber security and architecture teams to identify and implement best data practices that stay aligned with constantly changing innovations in technology. This position will present and contribute data solutions, including investigation of ML and AI enablement, to technical leadership at all levels, with roadmaps reflecting current and future state. The Staff Data Engineer works closely with data architects and security teams in designing enterprise-class data pipelines, while constantly investigating and learning about changes in data management.


Perform all responsibilities in accordance with BECU Competencies, compliance, regulatory and Information Protection requirements.

Organize and manage complex and enterprise-wide data solutions including analysis, design, coding, testing, debugging, and documentation.

Lead data engineering teams in optimizing the data processes that support business and technical workflows.

Present and communicate technical topics to the larger engineering and technology community.

Lead design and development efforts for complex workflows, implementing technical requirements into data workflows and pipelines within the data team.

Design and implement prototypes, proofs of concept, and data solutions by combining technical expertise with a deep understanding of data workflow design. Design and deliver highly available and scalable data services in a production environment.

Design and implement metadata management and master data policies into data workflows and data pipelines.

Develop guidelines for implementing and managing data curation policies on data workflows and data pipelines in partnership with manager and other senior leaders.

Lead and mentor data engineering teams in the development and testing of data workflow system components/services, code, and design reviews.

Partner with Architects and Product Owners to design and document the team's technology roadmap and vision.

Partner with the leadership team in developing a training plan for expanding data engineering competencies in alignment with current and future industry standards.

Contribute to BECU code quality and extensibility by aligning with other engineering teams to create and define coding standards for data modules and components.

Perform additional duties as assigned.


Bachelor's degree in Computer Science or related discipline, or equivalent work experience required. Advanced degree preferred.

Minimum 7 years of experience in data engineering required.

Minimum 5 years of experience in coding languages such as Python, Scala, Java, or SQL required.

Previous experience working with cloud platforms such as Azure, AWS, or GCP required.

Advanced knowledge of one or more modern data platforms and experience in distributed parallel processing (Azure Synapse, Databricks, Redshift, BigQuery, Snowflake) required.

Proven ability to apply best practices in ETL/ELT tools, the use of SQL in stored procedures, and the use of advanced cloud-based integration tools, as well as an understanding of security considerations for handling data in flight and data at rest, required.

Demonstrated experience with SQL, data modeling, large datasets, data warehousing, data analysis, and analytics engines required. Knowledge of cloud-hosted SQL-based datastores and NoSQL systems required.

Demonstrated experience developing use cases and roadmaps for implementing master data management (MDM), metadata, and data governance tools (Alation, Erwin, Collibra, Azure Purview, InfoSphere, Informatica MDM, Profisee, SAP Master Data Governance, Ataccama, or other governance tools) required.

Previous experience working with Continuous Integration and Continuous Delivery systems and tools such as Azure DevOps Services, GitHub Actions, Jenkins, or TeamCity required. Proficiency in building build/deployment pipelines in YAML required.

Experience using Git (or comparable tools), performing code reviews and pull requests, and following branching standards such as Git Flow or Trunk-Based Development required.

Proven ability to set guidance and standards for Test-Driven Development concepts, methods, and tools used in unit, integration, or performance/load testing required.

Proven ability to deliver highly scalable data solutions, using a programming language compatible with the current technical environment, over the entire data lifecycle (from ideation to retirement), and to design scalable data solutions implementing newer technologies that align with the data strategy and future data management roadmaps, required.

Proven ability to stay current with emerging data technologies, through work or continuing industry or education involvement, required.

Proven experience leading and collaborating with multiple teams, including business unit teams, to deliver solutions through all aspects of the SDLC required.

Experience presenting to both technically adept and functional audiences required.

Proficient verbal and written communication skills required.

Full time hours required.