Senior Software Engineer, Data Services

  • San Francisco, CA, USA
  • Full-time

Company Description

Optimizely is the world’s leading experience optimization platform, providing website and mobile A/B testing and personalization for leading brands. The platform’s ease of use and speed of deployment empower organizations to conceive of and run experiments that help them make better data-inspired decisions. Optimizely meets the diverse needs of thousands of customers worldwide looking to deliver connected experiences to their audiences across channels. To date, those customers have created and delivered more than 700 billion optimized visitor experiences.

Job Description

The Data Services team is the backbone of Optimizely, providing data and analytics infrastructure to the entire company. Our customers are the departments within Optimizely: Finance/Accounting, Sales, Marketing, Success, Product/Engineering, and Human Resources. We are laser-focused on enhancing operational efficiency across the organization by providing tools, services, and data to our employees, empowering them to accelerate delivery in their respective job functions. Today, solutions from Data Services feed into critical business processes, positively impacting and influencing initiatives worth tens of millions of dollars.

To realize our team’s ambition, we are looking for a seasoned Software Engineer with a focus on data-oriented applications. Our future colleague must be passionate about building data services solutions that contribute significantly to Optimizely’s revenue growth. This is a high-visibility role, and the solutions delivered by the team will have an organization-wide impact. The job is technologically challenging, with a varied tech stack, and involves working with datastores ranging from NoSQL stores and relational databases to hundreds of terabytes of Optimizely events. The role works closely with engineers from other teams, product managers, data analysts, and data scientists on data initiatives.

Some of Our Impactful Past Accomplishments:

  • Evangelizing a data-driven culture at Optimizely around customer and product information

  • Sales lead curation: Optimizing the sales process to maximize deals

  • Insights into renewal processes: Customer retention and expansion are critical to the business, and data plays a big role in the effectiveness of this process

  • The backbone of pricing innovation: Pricing innovation and plan structure are key to helping customers understand the value of Optimizely products to their business

  • MRR Engine: MRR is the #1 business metric for SaaS companies; this engine helps the finance department speed up operations and supports the forecasting process

  • Influence product investments: Monitoring product/feature usage by our customers greatly helps the Product organization make high-ROI investments

Responsibilities:

  • Create and maintain optimal data pipeline architectures, with a strong focus on efficiency when working with large datasets

  • Assemble and aggregate large, complex data sets that meet functional and non-functional business requirements.

  • Design and implement services and solutions for internal process improvements that maximize business efficiency, such as automating manual processes

  • Build the infrastructure required to work with various systems, such as AWS and Google Data Store, and integrate with several other systems via APIs.

  • Build analytics tools that utilize the data to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

  • Work closely with data scientists to understand model output and potentially productize it into applications

Qualifications

  • Strong communication and organizational skills; prior experience working with go-to-market (GTM) teams would be a plus

  • Strong analytic skills related to working with unstructured datasets.

  • Strong ability to perform root-cause analysis at both the data and process levels in order to articulate responses to specific business questions.

  • Experience building and optimizing data pipelines and architectures that involve large datasets would be a plus

  • 5+ years of engineering experience with a focus on data-intensive applications; graduate degree in Computer Science, Statistics, Information Systems, or another quantitative field.

  • Tech Stack:

    • Good programming and algorithmic skills; we are language-agnostic.

    • Advanced working knowledge of SQL and experience with both relational and non-relational databases.

    • Experience with AWS cloud services: EC2, RDS, Redshift

    • Experience with or exposure to workflow management tools such as Airflow

    • Experience with or exposure to big data ecosystem tools such as Hadoop, Spark, and Kafka

Additional Information

Perks:

  • Commuter and transportation benefits
  • Catered in-office lunch and dinner on weekdays
  • Full medical insurance with very low co-pay and deductible. HMO, PPO, and HSA options available
  • Full dental coverage including orthodontics
  • Full vision coverage including contacts
  • Dependents 100% covered for medical, dental, and vision
  • Wellness Grant
  • Unlimited vacation policy and seventeen weeks of paid parental leave
  • 401k benefit
  • Working with a great team and having a huge impact!

All your information will be kept confidential according to EEO guidelines.
