Enterprise Data Architect

  • Full-time

Company Description

QuantumWork was founded on a culture that is passionate about transforming the way the world acquires talent by delivering innovative solutions that make a difference for businesses worldwide. Our mission is to intelligently organize talent data to match all forms of work to all forms of talent while providing data-driven insights. Our integrated solutions drive business results.

As an industry leader, we draw upon decades of experience to design innovative technology, products, and services. We develop competitive practices that position organizations for growth, and we deliver the insight needed to succeed in today's global marketplace.

Job Description

The Enterprise Data Architect plays a key role in providing essential data architecture, integration delivery and support, and contributions as part of the QuantumWork team. The position requires innovative thinking and a high focus on the quality and timeliness of deliverables, working closely with application architects, data science and architecture teams, as well as clients and business partners. The Enterprise Data Architect serves as a critical liaison between clients and business partners, working closely with clients and business groups to understand needs and define requirements and solutions for the core QuantumWork data structure, integration layer, and overall analytics. The Enterprise Data Architect leads requirements gathering, systems analysis, and cost/benefit analysis to align data architecture and solutions across all initiatives. This position is critical to driving the strategic value of QuantumWork.

Responsibilities

  • Architect, design, and develop data models, and create physical structures from those models as required.
  • Take a hands-on approach to solving problems; when required, lead by example by developing code and reference implementations the team can use and follow.
  • Apply in-depth knowledge of data architectures, data warehouses, and methodologies in AWS cloud implementations.
  • Maintain a good working knowledge of data models and use industry-leading data modeling tools to create and update models as needed.
  • Map the systems and interfaces used to manage data sets, and set standards for data management and the integration layer.
  • Analyze the current state, conceive the desired future state, and formulate the projects needed to close the gap between the two.
  • Operate astutely within the organization, emphasizing methodology and governance.
  • Work closely with users, application architects, and developers on projects in both Agile and traditional SDLC methodologies.
  • Maintain an end-to-end vision: see how a logical design will translate into one or more physical databases and how data will flow through the successive stages involved.
  • Address data migration issues (validation, clean-up, and mapping) and understand the importance of data dictionaries.
  • Coordinate with business partners, technology analysts, and developers to identify and define specifications, indicate areas of system impact, and continuously communicate project status and needs.
  • Manage change control processes and ensure program/project communications.
  • Demonstrate strong leadership, communication, and presentation skills, including the ability to communicate status to the senior management team.

Qualifications

  • 7+ years of experience as a data architect, integration developer, or data warehouse developer, or in data-related operations.
  • Graduation from a 4-year college or university with major course work in a discipline related to the requirements of the position is preferred.
  • Data modeling: strong grasp of concepts, hands-on modeling experience, and the ability to create models from scratch.
  • 2+ years of experience designing and implementing large-scale data pipelines for data curation and analysis in production environments, using Spark, PySpark, and SparkSQL with Java or Scala, on-premises or in the cloud (AWS).
  • Mid-level to expert knowledge of data integration concepts, source-to-target mapping techniques, and requirements documentation for ETL teams to follow.
  • Working knowledge of Big Data concepts for organizing both structured and unstructured data is a plus.
  • Highly skilled in managing multiple projects of varying size and complexity simultaneously across all phases of the software development life cycle (SDLC).
  • Strong track record of interviewing business partners and setting and managing expectations concerning deliverables and requirements.
  • Understanding of Agile development methodologies and the software project life cycle.
  • Ability to prioritize and handle multiple projects.
  • Excellent problem-solving and analytical skills.
  • Excellent verbal and written communication skills, with the ability to establish a deep understanding of clients' business issues.

Additional Information

All your information will be kept confidential according to EEO guidelines.