
Data Engineer 4 (Hybrid)

Work Type: Contract/Temp
Positions: 1
  • AWS
  • Python
  • Scala
  • Databricks
  • Data Engineering
  • Airflow
Onsite: Mon-Thurs
Category: Data and Analytics
  • Innovative Technology; High Quality Products, Self-Empowerment
  • Globally Responsible; Sustainable Products, Diversity of Thought
  • Celebration of Sports; If You Have a Body, You are an Athlete

Title: Data Engineer 4

Location: Salem, OR

Duration: 9 Month Contract

NIKE, Inc. does more than outfit the world's best athletes. It is a place to explore potential, obliterate boundaries and push out the edges of what can be. The company looks for people who can grow, think, dream and create. Its culture thrives by embracing diversity and rewarding imagination. The brand seeks achievers, leaders and visionaries. At Nike, it’s about each person bringing skills and passion to a challenging and constantly evolving game.

WHAT YOU WILL DO

  • Design and implement data products and features in collaboration with product owners, data analysts, and business partners using Agile/Scrum methodology
  • Contribute to overall architecture, frameworks and patterns for processing and storing large data volumes
  • Contribute to evaluation of new technologies/tools/frameworks centered around high-volume data processing
  • Translate product backlog items into logical units of work in engineering
  • Implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem
  • Build utilities, user defined functions, libraries, and frameworks to better enable data flow patterns
  • Work with engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and followed
  • Build and incorporate automated unit tests and participate in integration testing efforts
  • Utilize and advance continuous integration and deployment frameworks
  • Troubleshoot data issues and perform root cause analysis
  • Work across teams to resolve operational & performance issues 

WHAT YOU WILL NEED

  • Bachelor’s degree in Computer Science, or related technical discipline
  • 10+ years of experience in large-scale software development and 5+ years of big data experience
  • Programming experience, Python or Scala preferred
  • Experience working with Hadoop and related processing frameworks such as Spark, Hive, etc.
  • Experience with messaging/streaming/complex event processing tooling and frameworks
  • Experience with data warehousing concepts, SQL and SQL Analytical functions
  • Experience with workflow orchestration tools like Apache Airflow
  • Experience with source code control tools like GitHub or Bitbucket
  • Ability to communicate effectively with team members, both verbally and in writing
  • Interest in and ability to quickly pick up new languages, technologies, and frameworks
  • Experience in Agile/Scrum application development 
  • Experience with Java
  • Experience working in a public cloud environment, particularly AWS and Databricks
  • Experience with cloud warehouse tools like Snowflake
  • Experience working with NoSQL data stores such as HBase, DynamoDB, etc.
  • Experience building RESTful APIs to enable data consumption
  • Experience with infrastructure-as-code tools such as Terraform or CloudFormation and CI/CD automation tools such as Jenkins or CircleCI
  • Experience with practices like Continuous Development, Continuous Integration and Automated Testing 

WHO WE ARE LOOKING FOR

These are the characteristics that we strive for in our own work. We would love to hear from candidates who embody the same:

  • Desire to work collaboratively with your teammates to come up with the best solution to a problem
  • Demonstrated experience and ability to deliver results on multiple projects in a fast-paced, agile environment
  • Excellent problem-solving and interpersonal communication skills
  • Strong desire to learn and share knowledge with others 
Published on 31 Oct 2025, 10:34 PM