Manager, Data Engineering
Our Opportunity:
Chewy's Supply Chain Team is seeking a Manager, Data Engineering to join the growing Supply Chain Long Range Forecasting Analytics team in Bellevue, WA. Combining a background in data architecture, data engineering, data analysis, and reporting with a basic understanding of data science techniques and workflows, you will be part of a team responsible for strategic forecasting and research initiatives centered on our operational and financial planning. This includes building high-quality data pipelines that drive analytic solutions and creating data products that improve the productivity of analytics and data science team members. The Supply Chain team operates in a fast-paced environment where every day brings new challenges and new opportunities.
You will report to the Director of Long Range Forecasting and will be responsible for building and implementing data products and technologies that handle growing business needs, playing a key role in redefining what it means to be a world-class ecommerce organization.
This role will start as a hands-on data engineering position; you will then rapidly recruit and grow a team of data engineers to support all relevant forecasting activities.
What You’ll Do:
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals
- Solve complex data problems to deliver insights that help our business achieve its goals
- Create data products for analytics and data scientist team members to improve their productivity
- Design and implement a data governance framework for long range forecasting activities
- Assist in creating proofs of concept, and advise, consult, mentor, and coach other data and analytics professionals on data standards and practices
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering in order to improve our productivity as a team
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes
- Code, test, and document new or modified data systems to create robust and scalable applications for data analytics
- Work with Data Science leads and developers to ensure that all data solutions are consistent
- Partner with Business Analysts and Solutions Architects to develop technical architectures for strategic enterprise projects and initiatives
- Ensure all automated processes preserve data by managing the alignment of data availability and integration processes
- Document technical details of your work
What You’ll Need:
- Bachelor of Science or Master’s degree in Computer Science, Engineering, Information Systems or related field
- 5+ years of experience in Data Engineering or Business Intelligence roles working with ETL, Data Modeling, and Data Architecture, developing modern data pipelines and applications for analytics (e.g., BI, reporting, dashboards) and advanced analytics (e.g., machine learning, deep learning, mathematical optimization) use cases
- 3+ years managing a team in a formal or informal capacity
- Experience setting up end-to-end data pipelines in an enterprise environment
- Experience with software development processes such as building, unit testing, code analysis, release processes, and code coverage
- Demonstrated analytical and problem-solving skills, particularly those that apply to a ‘big data’ environment
- Expert ETL and SQL skills, knowledge of industry best practices, and a deep understanding of how data is extracted, transformed, scrubbed, and loaded in a large data warehouse environment
- Proficiency in Python and Jupyter; JSON experience a plus
- Expertise managing time series data sets with varying native frequencies
- Experience with developing solutions on cloud computing services and infrastructure with AWS (preferred) or similar (such as GCP or Azure)
- Experience with database development using a variety of relational (Oracle), NoSQL, and cloud database technologies
- Expertise designing and implementing data pipelines using modern data engineering approaches and tools: Spark, PySpark, Scala, Docker, Databricks, Glue, cloud-native data warehouses (Snowflake, Redshift), Kafka/Confluent, Presto/Dremio/Athena
- Experience with CI/CD processes and orchestration platforms (e.g., Airflow)
- Excellent written and oral communication skills and the ability to work with development teams
- Position may require travel
Bonus:
- AWS Certified Developer/AWS Machine Learning
- Tableau Dashboarding
- Ability to effectively operate both independently and as part of a team
- Self-motivated with strong problem-solving and self-learning skills
- Flexible with respect to changes in work direction as the project develops
- Strong advocate of a culture of process and data quality across all development teams
- E-commerce, retail, or startup experience is a plus
Chewy is committed to equal opportunity. We value and embrace diversity and inclusion of all Team Members.
If you have a disability under the Americans with Disabilities Act or similar law, or you require a religious accommodation, and you wish to discuss potential accommodations related to applying for employment at Chewy, please contact [email protected].
To access Chewy’s Privacy Policy, which contains information regarding information collected from job applicants and how we use it, please click here: https://www.chewy.com/app/content/privacy