Sr Data Engineer
Location: Remote
Posted On: 08/03/2022
Requirement Code: 59931
Primary Skills: Minimum 5 years of functional experience in data engineering required. Minimum 4 years of experience in coding languages such as Python, Scala, Java, or SQL required.
Secondary Skills: 100% Remote in: Georgia, Texas, Idaho, Washington, South Carolina, Oregon
The Sr Data Engineer is responsible for writing data workflows according to design specifications and completing complex development work. This role is also responsible for designing and coding features, automated tests, and scripts; designing data models; and using data management and governance tools. This role will recommend and implement technical data management solutions for business problems, contributing to system and service design and architecture. This position will present and contribute data solutions to technical leadership. The Sr Data Engineer will work closely with and mentor other data engineers and coordinate with business and systems analysts to build enterprise-class data pipelines.
Perform all responsibilities in accordance with BECU Competencies, compliance, regulatory and Information Protection requirements.
Design and develop complex, multi-tier systems, including analysis, coding, testing, debugging, and documentation.
Support and mentor data engineering teams in automating and improving business data processes and interactions with limited business guidance.
Present and communicate technical topics to the larger engineering community.
Coordinate, support, and execute design and development efforts of complex workflows for implementing technical requirements into data workflows and pipelines.
Design and support implementation of prototypes, proofs of concept, and data solutions by combining technical expertise with a deep understanding of data workflow design.
Support and conduct implementation of metadata management and master data policies into data workflows and pipelines.
Organize and guide data curation policies on data workflows and data pipelines as outlined in the technical requirements.
Mentor, organize, and coordinate small-to-medium-sized teams in the development and testing of data workflow system components/services, code, and design reviews.
Work with Architects and Product Owners to design and document the team's technology roadmap and vision.
Work with leadership team in developing a training plan for advancing data engineering skillsets.
Contribute to BECU code quality and extensibility by modeling and enforcing existing coding standards within delivery teams. Assist senior staff in creating and defining those standards.
Perform additional duties as assigned.
Bachelor's degree in Computer Science or related discipline, or equivalent work experience required.
Minimum 5 years of functional experience in data engineering required.
Minimum 4 years of experience in coding languages such as Python, Scala, Java, or SQL required.
Demonstrated experience using cloud technologies (platforms like Azure, AWS, GCP) required.
Demonstrated experience using a modern data platform (Azure Synapse, Databricks, Redshift, BigQuery, Snowflake) required.
Demonstrated experience using ETL/ELT, SQL, and cloud data integration tools (Azure Data Factory, AWS Glue, Informatica, etc.) required.
Demonstrated experience using SQL, data modeling, data warehousing, data analysis required.
Previous experience in master data (MDM), metadata and data governance tools (Alation, Erwin, Collibra, Azure Purview, Infosphere, Informatica MDM, Profisee, SAP Master Data Governance, Ataccama or other governance tools) preferred.
Previous experience using Continuous Integration and Continuous Delivery (CI/CD) systems and tools such as Azure DevOps or GitHub required.
Experience using Git, performing code reviews and pull requests, and following branching standards such as Git Flow or Trunk-Based Development required.
Experience in Test-Driven Development concepts, methods, and tools, along with demonstrated experience in unit testing, integration testing, or performance/load testing, required.
Proven ability to deliver highly scalable data solutions using a programming language compatible with the current technical environments over the entire data lifecycle (from ideation to retirement) required.
Proven ability to stay current with emerging technologies and new applications of existing data technologies, through work or continuing industry or education involvement required.
Proven experience leading and collaborating within teams, including business unit teams, to deliver solutions through all aspects of the SDLC required.
Experience presenting in front of technically adept audiences required.
Proficient verbal and written communication skills required.
Full time hours required.