
Foot Locker: Data Platform Engineer


Seeking a candidate to join us on the journey of building a brand-new data lake platform on top of a cloud platform, utilizing the latest tech stacks.



At: Foot Locker
Location: Bradenton, FL
Web: www.footlocker.com
Position: Data Platform Engineer

Contact:
Apply online.

 

OVERVIEW

 
Foot Locker, Inc. is seeking an innovative individual with a proven track record of building enterprise-level platform components to support product development across multiple teams and lines of business. This role is expected to drive innovation through collaboration with our data science teams and the business to help push Foot Locker, Inc. to the next level. The team is embarking on a journey of building a brand-new data lake platform on top of a cloud platform, utilizing the latest tech stacks. Ideal technologies for this individual include Scala/Python/R, Spark, streaming libraries (Spark Structured Streaming, Flink, etc.), Kafka, and Azure (AWS is fine too).
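As a rough illustration of that stack, the sketch below shows a minimal Spark Structured Streaming job in Scala that reads events from a Kafka topic and lands them in a data lake as Parquet. The broker address, topic name, and paths are hypothetical placeholders, not details from the posting.

    import org.apache.spark.sql.SparkSession

    object ClickstreamIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("clickstream-ingest")
          .getOrCreate()

        // Subscribe to a Kafka topic (broker and topic are placeholders).
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "clickstream")
          .load()
          .selectExpr("CAST(value AS STRING) AS json")

        // Land the raw events in the lake as Parquet, checkpointing
        // offsets so the job can recover from failures.
        events.writeStream
          .format("parquet")
          .option("path", "/lake/raw/clickstream")
          .option("checkpointLocation", "/lake/checkpoints/clickstream")
          .start()
          .awaitTermination()
      }
    }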

 

RESPONSIBILITIES

 
This role will include, but will not be limited to, the following responsibilities:

  • Build and operate our cloud-based data platform
  • Utilize container-based technologies to help support a microservice-based architecture
  • Apply DevOps/automation tools to help minimize operational overhead for our platform
  • Build new data sets and products that support Foot Locker business initiatives
  • Help grow our data catalog through ingestion of a variety of data sources, both internal and third-party
  • Work within an Agile/Scrum model
  • Build production-quality ingestion pipelines with automated quality checks to help enable the business to access all of our data sets in one place (see the sketch after this list)
  • Participate in the continuous evolution of our schema/data model as we find more data sources to pull into the platform
  • Support our Data Scientists by helping make their modeling jobs more scalable when run across the entire data set
  • Participate in a collaborative, peer-review-based environment fostering new ideas via cross-team guilds/specialty groups
  • Maintain comprehensive documentation around our processes and decision making
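
As a minimal sketch of the automated quality checks mentioned above, the Scala/Spark snippet below validates records and quarantines failures rather than silently dropping them. The column names and lake paths are hypothetical, chosen only to illustrate the pattern.

    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.functions.col

    // Hypothetical rule: a record is valid when it carries a non-null
    // id and a non-negative amount.
    def splitByQuality(batch: DataFrame): (DataFrame, DataFrame) = {
      val isValid = col("id").isNotNull && col("amount") >= 0
      (batch.filter(isValid), batch.filter(!isValid))
    }

    // Valid rows go to the curated zone; invalid rows are quarantined
    // for later inspection so nothing is lost.
    def ingest(batch: DataFrame): Unit = {
      val (good, bad) = splitByQuality(batch)
      good.write.mode("append").parquet("/lake/curated/orders")
      bad.write.mode("append").parquet("/lake/quarantine/orders")
    }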

 

QUALIFICATIONS

 
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. The successful candidate will have a high degree of motivation, exceptional attention to detail, and the ability to derive answers from communications with multiple parts of the business. This person will also need the ability to work well within a team atmosphere and the flexibility to help with a variety of department goals.
 

  • Bachelor's degree in Computer Science or a related field
  • 3+ years of related information technology experience
  • 2-5 years of strong experience directly related to big data technologies
  • Industry-recognized certifications (similar to the below):
    • Azure / AWS Solutions Architect
    • TOGAF certified architect
    • J2EE Architect
  • Experience enabling data science and Spark/streaming-based environments (Scala/Python/R, Spark, streaming libraries such as Spark Structured Streaming and Flink)
  • Deep familiarity with PaaS services, containers, and orchestration, specifically Docker and Kubernetes
  • Demonstrated experience with the Scrum Agile methodology
  • Ability to influence others using reasoning, persuasion, and negotiation; excellent interpersonal skills.
  • Strong ability to learn new technologies in a short time.
  • Must possess well-developed verbal and written communication skills.

Desired Characteristics:

  • Strong technical presentation skills, including presenting to senior executives
  • Previous experience at a Fortune 500 company
