Sr. Data Pipeline Engineer
- Downtown Seattle, WA, United States
LiveStories is a venture-backed company building modern tools to make civic data actionable and usable. Governments, the world’s largest sector, depend on LiveStories to be more data-driven, transparent, and productive. Businesses, schools, and researchers depend on LiveStories to streamline their data operations. Our customers range from as far away as Kenya to as close as Washington and California.
You can find the latest coverage about LiveStories here: https://www.livestories.com/news/
We love data and believe in its power to transform how we live, work, and play. We obsess over simplifying access to data so that anyone can gather insights. We look for people who can demonstrate versatility and creativity in working with data. If you obsess over changing the world and are passionate about open data, cool data visualizations, data-driven businesses, high-velocity sales, and Sim City, we would love to talk to you.
What we want to build doesn't exist in the world. Like all great inventions, it requires visionaries who can strike the delicate balance between technical innovation and shipping products on time. As a Sr. Data Pipeline Engineer at LiveStories, you will build and scale our data ingestion and processing pipelines. You will be responsible for building a pipeline that aggregates hundreds of billions of data points and performs well for web-based data apps doing deep analysis. You will work closely with the data ingestion, data science, and UI teams, and will make sure all requirements are adequately met. You will mentor other engineers and help provide technical leadership for the team.
- You have 5+ years of experience building, scaling, and maintaining software and data infrastructure at a large-scale SaaS company.
- You take pride in building simple, effective solutions that improve engineering productivity.
- You have hands-on experience implementing backend services, reactive pipelines, and databases.
- You understand the DevOps mindset and are experienced with container technologies.
- Ultimately, you want to be part of something bigger than software and make a durable change.
Technologies you master:
- You are comfortable writing data processors in distributed systems. Threads, asynchronous messaging, and data consistency patterns are standard tools.
- You have implemented scalable Web APIs and understand caching layers and controls.
- You write code for scalability and efficiency, and you understand that simplicity is ultimately the most important quality to achieve.
- You are familiar with cloud providers and container automation (AWS and Kubernetes in particular).
- You have designed and maintained databases, both relational and NoSQL. Postgres and Cassandra are primary stores in our tech stack.
- Golang, YAML, and CloudFormation are good general-purpose tools for the job.
Our benefits include:
- Competitive, performance-based compensation in a growing, disruptive startup
- Health, dental, and vision benefits
- Flexible vacation policy, including paid vacation
- 3 day weekends
- An incredible team of diverse, smart, dedicated, and supportive people