Data Engineer & Database Administrator
Location: Bellevue, WA
Posted On: 05/03/2026
Requirement Code: 73631
Requirement Detail
· Design, build, and maintain ETL/ELT pipelines across multiple systems
· Design and optimize data models in PostgreSQL, MongoDB, and Amazon Redshift
· Develop and manage data workflows using AWS (S3, Glue, Lambda, Step Functions, Kinesis)
· Administer and maintain database environments across development, staging, and production
· Monitor data pipelines and databases, troubleshoot issues, and implement alerts
· Optimize query performance, indexes, and configurations across relational and NoSQL systems
· Manage database provisioning, upgrades, backups, and disaster recovery (RDS, MongoDB, Redshift)
· Ensure database security, including access control, encryption, and role management
· Plan capacity and scale systems to support growing data needs
· Define and enforce data retention and archival policies
· Collaborate with analytics and product teams to support reporting and data needs
· Document data pipelines, database processes, and operational procedures
· Participate in code reviews and follow engineering best practices
Required Education and Experience:
Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent experience.
3–5 years of data engineering experience, or equivalent.
Qualifications:
· 3–5 years of experience in data engineering, database administration, or similar roles
· Strong experience with PostgreSQL, MongoDB, and Amazon Redshift
· Solid SQL skills for both transactional and analytical workloads
· Experience with AWS data and database services (S3, Glue, Lambda, RDS, Redshift, etc.)
· Proficiency in Python or another scripting language
· Experience with workflow orchestration tools (Airflow, Step Functions, etc.)
· Hands-on database administration experience, including:
  · MongoDB (replica sets, sharding, indexing, backups)
  · Redshift (cluster management, query tuning, WLM, snapshots)
  · PostgreSQL (replication, performance tuning, connection pooling)
· Familiarity with monitoring tools (CloudWatch, pgBadger, MongoDB Atlas, etc.)
· Understanding of database security (encryption, auditing, least-privilege access)
· Strong problem-solving and analytical skills
· Ability to translate business needs into data solutions
· Comfortable working in a fast-paced, collaborative environment
· Clear communicator with both technical and non-technical audiences
· Self-motivated with a focus on clean, maintainable code

Nice to Have:
· Experience with Kafka or Kinesis (streaming data)
· Familiarity with dbt for data transformation
· Knowledge of data lake/lakehouse architectures (Delta Lake, AWS Lake Formation)
· Experience with Terraform or CloudFormation
· CI/CD experience for data pipelines
· Basic DevOps skills