Data Engineer III

  • Full-time

Company Description

Les Schwab Tire Centers bring you the best selection, quality, and service every time on tires, brakes, wheels, batteries, shocks, and alignment services.

Job Description

POSITION SUMMARY:
Data Engineers provide data services to stakeholders within portfolios and throughout the
enterprise by building and managing data resources that support reporting, business
intelligence, analytics, data science projects, and operational applications. They engage with
data-centric product managers and developers, technology teams, analysts, and business
partners to understand capability requirements, then define and construct data structures and
solutions based on priorities.
The Data Engineer III has advanced proficiency in the design, development, and operational
tasks required to deliver the accurate, timely, high-quality data that feeds data products (i.e.,
reports, analyses, and data science models) and supports downstream applications. The Data
Engineer III has a deep understanding of available business data resources and how to craft
robust, practical data solutions, and works with business stakeholders and technical
counterparts on best practices for stewardship and for building and maintaining data resources.
This role is within the Information and Digital Services organization at Les Schwab
headquarters in Bend, OR.

PRIMARY RESPONSIBILITIES/FUNCTIONS (GENERAL % OF WORK TIME)

Operational Support (20%)
● Build and deliver ad hoc data sets to support business analysis, data
analysis, analytics, proofs-of-concept, and other use cases
● Create and review complex SQL scripts and queries in support of
reporting and analytics applications
● Evaluate and implement appropriate technologies and methods to
automate data preparation and data movement with Les Schwab
standard data stores, tools, and platforms
● Monitor and troubleshoot manual and automated data preparation and
data movement processes
● Solve complex technical problems and mentor/support other technical
staff in developing workarounds and resolving operational issues
● Create, maintain and review operational procedures and related
documentation

Data Structure and Solution Design (30%)
● Collaborate with business systems analysts, data analysts, business
stakeholders, and analytics practitioners to understand data product
and downstream system data requirements
● Collaborate with data stewards and data source managers to
understand data definitions and business rules relevant to data
structure design
● Perform detailed design of data structures from inception through
production support
● Create data models for relational and dimensional database schemas
for a range of use cases from targeted reporting solutions to support
of downstream applications
● Conduct information modeling, including conceptual models, relational
database designs and message models
● Perform and review designs and solutions for data manipulation, data
preparation, and data movement processes for a variety of scenarios
from simple file-based export/import to enterprise-grade ETL
workflows connecting multiple structured and unstructured endpoints

Data Structure and Solution Development (20%)
● Catalog existing data sources and enable access to resident and
external data sources
● Develop programs, data models, and data integrations that support task
automation
● Create physical database tables, views, and flat files for analytics
research projects, reporting, analytics applications, and for publication
to downstream subscribers
● Solve complex technical problems and mentor/support other technical
staff in developing data models, data structures and ETL solutions

February 2016 Headquarters and Prineville Job Description

● Create and maintain data dictionaries, data model diagrams, data
mapping documents, data security and quality requirements and
related data platform documentation
● Work in an agile team environment to deliver timely analytics solutions
and insights within a dynamic learning organization

Data Platform Quality and Governance (15%)
● Develop a quality assurance framework to ensure the delivery of high-
quality data and data structure analyses to stakeholders
● Collaborate with Digital Services colleagues to select and adopt best
practices within a culture of data management excellence
● Implement and continuously improve development best practices,
version control and deployment strategies to ensure product quality,
agility and recoverability
● Implement and test data access roles and permissions to ensure
“least privileged” access to enterprise data and reduce the enterprise
risk of data exposure
● Proactively identify opportunities to improve data workflows and
query performance

Stakeholder Relationship and Vendor Management (10%)
● Support Data Stewards to establish and enforce guidelines for data
collection, integration and maintenance
● Provide expert advice to empower Information and Digital Services
colleagues to understand and utilize Enterprise Data Platform
Services and Data resources
● Collaborate with Portfolios across the enterprise to ensure that roles
for data access follow “least privilege” principles, yet there is high data
literacy and awareness of enterprise data available for use
● Make all stakeholders aware of the ethical and unintended
consequences of data use, and identify risks and opportunities
to be communicated throughout the enterprise

Resource Development: Mentoring and Best Practices (5%)
● Provide mentoring and expert advice for the development of complex
and high-performance SQL
● Promote best practices for building quality into data structure, ETL
and data solutions to improve efficiency and lower probabilities of
defects within the data platform and downstream
● Move beyond the scope of individual projects and promote, guard and
guide the organization toward common semantics and proper use of
metadata
● Take responsibility for streamlining data pipelines across the
enterprise, ensuring they are coordinated, consistent, efficient and
production ready

Qualifications

Required Technical Skills/Knowledge:
● Expert knowledge of data modeling
● Expert-level development of advanced SQL, including analytical (window) functions
● Deep knowledge of query performance tuning
● Deep knowledge of data analysis techniques for testing and troubleshooting
● Deep knowledge of ETL process development
● Expert proficiency in writing and maintaining data management documentation such as
data dictionaries, data catalogs, and integration data maps
● Understanding of stakeholder processes for reporting and data analytics to serve
business decision-making
● Understanding of data stewardship concepts
● Proficiency and demonstrated experience with a programming language such as
Python, R, JavaScript, Java, C#, Go, or similar. Advanced procedure or function
development in T-SQL, Oracle PL/SQL, or equivalent is also acceptable
● Proficiency and demonstrated professional experience working with flat file data
formats including delimited files, XML, and JSON
● Practical experience using solution delivery collaboration software such as
ServiceNow, Jira, TFS, or similar

Additional Information

Educational/Experience Requirements:
● Bachelor’s degree (BS or BA) in a STEM-related discipline with a major in Computer
Science, Information Management, or Database Development and Analysis, or an
equivalent discipline; or equivalent experience with appropriate time-in-role
● 6+ years of experience with data warehouse technical architectures, ETL/ELT,
reporting/analytic tools and scripting
● Experience with AWS services, including S3, AWS Data Pipeline, and cloud-based
data warehouses
● Experience with data visualization tools such as Tableau or Birst is a plus