Location: San Francisco, CA
Description:
At Levi Strauss & Co, we are revolutionizing the apparel business and redefining the way denim is made.
We are taking one of the world’s most iconic brands into the next century:
from creating machine learning-powered denim finishes to using blockchain for our factory workers’ wellbeing, to building algorithms to better meet the needs of our consumers and optimize our supply chain.
Be a pioneer in the fashion industry by joining our global Data, Analytics & AI “startup with assets,” where you will have the chance to build exciting solutions that impact our global business while being part of a bigger, across-continents data community.
As a Senior Big Data Engineer, you will bring architecture blueprints to life and help our teams by mapping out solutions to some of their complex technical challenges. You’ll provide technical expertise, mitigate risk, and offer solutions tailored to their business needs. From migrating existing workloads to building advanced cloud solutions, you’ll help shape and execute plans that increase agility, improve security, reduce costs, and meet utilization targets.
This role will work closely with the Data Engineering, Data Science, and Infrastructure & Platform teams, and will focus on creating blueprints that nourish a culture of engineering excellence. We need someone who will bring a thoughtful perspective, empathy, creativity, and a positive attitude to solving problems at scale.
Responsibilities:
- Design, build, and deploy at-scale infrastructure with a focus on distributed systems.
- Build and maintain architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and cloud technologies.
- Apply retail domain knowledge and experience to create features in one or more specific areas: supply chain, customer centricity, operations, and planning.
- Evaluate and translate technical designs into workable technical solutions/code and technical specifications on par with industry standards. Drive the creation of reusable artifacts.
- Actively scan and evaluate relevant new technologies that drive standardization and reduce complexity within the enterprise.
- Contribute to and promote good software engineering practices across the team.
- Work on the collaborative Enterprise team and deliver work products that support digital transformation by leading assigned product and solution teams.
- Embody the values and passions that characterize Levi Strauss & Co., with the empathy to engage with colleagues from a wide range of backgrounds.
- Perform technical design reviews and code reviews, including cloud-based architecture and integrations.
- Communicate clearly and effectively with technical and non-technical audiences.
Qualifications:
- Bachelor’s Degree or higher in Computer Science or related discipline
- Minimum of 7 years in a hands-on technical role as an engineer and/or in a data engineering role.
- Experience with at least one cloud provider solution (AWS, GCP, Azure)
- Strong experience working with Big Data technologies (e.g., Spark, Spark SQL, PySpark Structured Streaming, Kafka).
- Expertise in the AI/ML lifecycle and hands-on experience with cloud and big data technologies, including Spark, containers, and Kubernetes.
- Demonstrated capability to deploy feature pipelines to production.
- Ability to work with data engineers and AI/ML scientists, with a demonstrated capability to create advanced features.
- Experience designing a variety of consumption patterns on top of a data lake/lakehouse to cater to different personas within the organization.
- Strong experience with at least one object-oriented or functional language: Python, Java, Scala, etc.
- Strong knowledge of data pipeline and workflow management tools such as Airflow.
- Hands-on experience across the data engineering spectrum, e.g., developing metadata-driven framework solutions for ingestion and processing, and building data lake/lakehouse solutions.
- Strong experience working with a variety of relational SQL and NoSQL databases, and the ability to choose a database based on the need.
- Working knowledge of GitHub and the Git toolkit.
- Strong experience with MPP database systems such as Teradata, Oracle, or SAP HANA.
- Experience with data modeling, data warehousing, and KPI generation.
- Proficiency in documenting use cases, requirements, and functional specifications.
- Ability to effectively prioritize and execute tasks in a high-pressure environment is crucial.
- Ability to work in a collaborative environment and interact effectively with technical and non-technical team members alike.