Noetica is solving critical NLP problems at the heart of the trillion-dollar capital markets ecosystem with an in-house machine learning platform and decades of ML PhD expertise. Founded in 2022 and based in New York, our high-caliber team is dedicated to bringing innovation and efficiency to legal and financial industries.
We count many of the top law firms in the world among our customers.
To date, we have raised ~$30M, with our last round (Series A) led by Lightspeed.
We're fortunate to have been covered by Matt Levine, Bloomberg Law, and Business Insider, among others, and named a top AI company in 2024 and 2025 by Business Insider, CB Insights, and others.
The global capital markets are among the largest markets in the world, valued at $50T+ and growing. Transactions in these markets are complex: critical, nuanced legal terms are woven into lengthy documents. These documents must be digested and analyzed by many different parties over the course of a single transaction, and multiple decisions are made based on the terms laid out and the quantitative and qualitative attributes within. This fundamental need, plus a booming market, drives enormous demand for precise extraction and benchmarking of legal terms.
Why We Need You
As a Senior/Staff Data Platform Engineer at Noetica, you will be responsible for developing and maintaining the core infrastructure that processes and enriches complex data at scale. The Data Platform team is at the heart of our cutting-edge AI/ML research, connecting it directly to the data product that delivers market-moving insights to our users. You will also play a vital role in scaling up the platform to meet the needs and expectations of our rapidly growing user base. The team is also the efficiency engine that removes the burden of productionization from our researchers and provides readily accessible, high-quality structured data that enables product teams to build exceptional end-user experiences.
Here’s What You’ll Be Doing
As a member of the Data Platform team, you will:
Build and Maintain the Core Data Platform: Design, develop, and operate the shared data platform infrastructure that powers our data products. Ensure high availability and high performance at scale. Ensure data security and compliance in a highly regulated industry.
Empower AI/ML Research: Productionize cutting-edge research results into data pipelines. Design, develop, and operate internal systems that enable research experimentation and rapid iteration on our AI/ML technology.
Enable Product Team Velocity: Design, develop, and operate performant, highly available data serving systems that accelerate end-user product development.
Solve Complex Data Processing Challenges: Implement elegant, robust, and highly scalable solutions to address cutting-edge data processing requirements and performance/efficiency bottlenecks.
Reliability and Observability: Design and develop full observability into the entire data processing platform. Design and develop internal systems for monitoring the quality of our foundational datasets. Ensure that our foundational datasets are reproducible and traceable.
Scale Infrastructure for Growth: Proactively scale the infrastructure to meet the increasing demands of our rapidly expanding client-facing products and user base.
Our Ideal Candidate
Has Proven Experience in Microservices: You possess hands-on experience developing backend applications with microservice architectures, ideally deployed on Kubernetes. You are familiar with API/library development and have good taste in designing durable interfaces.
Is Familiar with Data Processing Workflows: You have practical exposure to stateful data processing workflows and tools like Airflow, Prefect, Dagster, or Flyte. You are generally familiar with the developments and trends in the data ecosystem and have opinions on the best tool for the job.
Is Passionate about Fintech/Legaltech: You have a genuine interest in the intersection of finance and law, recognizing the immense potential for technological innovation in these fields. You're excited by the opportunity to bring cutting-edge solutions to an industry ripe for transformation.
Is a Driven and Agile Learner: You are a quick learner, eager to explore new technologies and adapt to a fast-paced environment. You like to challenge yourself to go deeper into the stack.
Is a Collaborative Communicator: You are diligent, highly motivated, dynamic, and possess excellent communication skills, thriving in a team setting.
You Need These Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
3+ years of industry experience as a Software Engineer.
3+ years working with Python.
Strong understanding of AWS services and architectures, with hands-on experience in managing cloud infrastructure.
The reasonably estimated yearly salary for this role is $180,000-$220,000 USD. You will also be entitled to receive a significant early-stage equity package and benefits. Individual pay decisions are based on a number of factors, including qualifications for the role, experience level, and skillset.
We offer numerous employee benefits, including:
Hybrid in-office schedule
Amazing office location next to Bryant Park/Grand Central
401(k) retirement plan
Wellhub (Gympass) fitness membership
Unlimited PTO
Unlimited sick days
Medical, dental, and vision insurance
Company offsites
Commuter benefits
We are an equal opportunity employer. We search for amazing people of diverse backgrounds, experiences, abilities, and perspectives. We take care of each other to create an inclusive work environment where we love to come to work every day. We'd be happy to provide reasonable accommodations to help you apply—just email us at [email protected]. We hope you can join us.