
Backend Data Analytics Engineer job at beehiiv
Full Time
Are you looking for remote Data Science and Analytics jobs in 2025? If so, you might be interested in the Backend Data Analytics Engineer role at beehiiv.
About the Organisation
beehiiv is a fast-growing startup revolutionizing how creators, publishers, and companies build and monetize their audiences through powerful newsletter platforms. Trusted by tens of thousands of top newsletters globally, beehiiv has scaled rapidly, achieving $7M in ARR by early 2024 with a projection of $40M by the end of 2025. Built by newsletter people for newsletter people, the company is known for its action-driven culture, ownership mentality, and deep commitment to user success.
Job Title
Backend Data Analytics Engineer job at beehiiv
Job Description
As a Backend Data Analytics Engineer at beehiiv, you’ll play a key role in developing and maintaining a robust and scalable data platform. This position is ideal for someone with a strong background in SQL, data modeling, and backend engineering who thrives in a startup environment.
You’ll be responsible for creating and maintaining APIs that expose business-centric metrics, refactoring schemas, building testing frameworks, and supporting real-time analytics services. The role requires close collaboration with application engineers and the broader data team. You’ll also have opportunities to work on segmentation architecture and event-driven systems.
You’ll work in an agile and fast-paced environment where your contributions directly impact product development and user experience. beehiiv values initiative, continuous learning, and collaboration across functions.
Duties, Roles and Responsibilities
Maintain and refine data model definitions and APIs for internal and external consumption
Translate business metrics into structured SQL queries and data models (see the sketch after this list)
Build RESTful APIs and services to abstract database complexity
Work with OLAP databases to power real-time analytics
Contribute to real-time event-based services and segmentation architecture
Refactor and consolidate data schema migrations
Create unit and integration testing frameworks
Collaborate with engineers across product and data teams
Enable customizable environments for testing and deployment
Assist in performance tuning, debugging, and production issue resolution
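To give a feel for the first few duties, here is a minimal, hypothetical sketch of a business metric expressed as a SQL query over an OLAP store and exposed through a small REST endpoint. The subscriptions table, the "daily new subscribers" metric, and the FastAPI plus clickhouse-connect stack are assumptions made for illustration, not beehiiv's actual schema or API.

# Hypothetical example only: table, columns, and metric definition are
# assumptions for illustration, not beehiiv's actual schema or API.
from fastapi import FastAPI
import clickhouse_connect  # ClickHouse is one OLAP option named in this posting

app = FastAPI()
client = clickhouse_connect.get_client(host="localhost")  # assumed local OLAP instance

# A business metric ("daily new subscribers") expressed as a structured SQL query.
DAILY_NEW_SUBSCRIBERS_SQL = """
    SELECT toDate(created_at) AS day, count() AS new_subscribers
    FROM subscriptions
    WHERE created_at >= now() - INTERVAL {days} DAY
    GROUP BY day
    ORDER BY day
"""

@app.get("/metrics/new-subscribers")
def new_subscribers(days: int = 30):
    # The REST endpoint hides the database and query details from API consumers.
    # int() keeps the interpolated value numeric; a production service would use
    # the driver's parameter binding instead of string formatting.
    result = client.query(DAILY_NEW_SUBSCRIBERS_SQL.format(days=int(days)))
    return [
        {"day": str(day), "new_subscribers": count}
        for day, count in result.result_rows
    ]

In practice the schema refactoring, testing frameworks, and segmentation work described above would sit around endpoints like this one.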
Qualifications, Education and Competencies
Required Skills:
Advanced SQL knowledge and experience with OLAP databases
Experience building and deploying RESTful APIs
Proficiency in Python or Ruby (or willingness to learn Ruby)
Familiarity with production-grade backend systems and debugging practices
Strong problem-solving and communication skills
Ability to work independently and in a team within a fast-paced environment
Nice to Have (Preferred):
Experience with ClickHouse, Kafka, Kafka Streams, Avro, and schema registries (a rough streaming sketch follows this list)
Background in real-time streaming technologies
CLI development experience for data retrieval
Knowledge of Go and Bash scripting
Familiarity with CI/CD, Kubernetes, Helm, and AWS (EC2, IAM, ALB, networking)
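As a rough illustration of the streaming side of this stack, the sketch below consumes JSON events from Kafka and batch-inserts them into ClickHouse. The topic, table, and field names are assumptions for the example; this is a simplified outline, not a description of beehiiv's pipeline.

# Hypothetical sketch: consume newsletter events from Kafka and batch-insert
# them into ClickHouse. Topic, table, and field names are assumptions.
import json

import clickhouse_connect
from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "newsletter-events",                      # assumed topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw),
    auto_offset_reset="earliest",
)
client = clickhouse_connect.get_client(host="localhost")

batch = []
for message in consumer:
    event = message.value
    batch.append([event["event_type"], event["publication_id"], event["occurred_at"]])
    if len(batch) >= 1000:  # insert in batches; OLAP stores favor fewer, larger writes
        client.insert(
            "events",
            batch,
            column_names=["event_type", "publication_id", "occurred_at"],
        )
        batch.clear()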
Soft Skills:
Ownership mindset and a proactive attitude
Comfort with feedback and collaboration
Strong work ethic and alignment with startup culture
Passion for innovation and user success
How to Apply
ONLINE APPLICATION ONLY!
Interested candidates are advised that applications for this position must be submitted online. To apply, please click the “Apply” button below.