Description
In this role, you’ll contribute across the stack: developing ingest pipelines, building scalable REST APIs, and facilitating data exploration and understanding. The platform supports large-scale data ingestion, complex queries, and interactive analysis. While your primary focus will be on the data-pipeline layer, you’ll collaborate closely with other sub-teams to ensure end-to-end functionality and performance. We’re looking for someone excited to work across the system and to improve team processes and tooling, especially for faster integration of new data sources.
Responsibilities
Lead the design and implementation of data-processing workflows
Manage all aspects of the data-processing lifecycle (collection, discovery, analysis, cleaning, modeling, transformation, enrichment, validation)
Develop and maintain data models and JSON Schemas to ensure integrity and consistency
Collaborate with analysts and engineers to meet data requirements
Manage and optimize data storage/retrieval in Elasticsearch and Dgraph (plus MongoDB and Redis)
Orchestrate dataflow using Apache NiFi
Mentor teammates on best practices for data processing and software engineering
Use AI platforms to support hybrid automated/manual data transformation, code generation, and schema management
Work with analysts, product owners, and engineers to ensure solutions meet operational needs
Propose and implement process improvements for faster delivery of new data sources
Required Skills & Experience
Strong data-wrangling and dataflow background (discovery, mining, cleaning, exploration, enrichment, validation)
Proficiency in JSON and JSON Schemas (or similar)
Solid data-modeling experience
Experience with NoSQL databases (Elasticsearch, MongoDB, Redis, graph DBs)
Familiarity with dataflow tools such as Apache NiFi
Extensive experience in Python or Java (both preferred)
Experience using generative AI for code and data transformation
Git for version control; Maven for build automation
Comfortable in a Linux development environment
Familiarity with Atlassian tools (Jira, Confluence)
Strong communication and teamwork skills
Nice to Have
Experience with various corporate data formats
Knowledge of Kafka or RabbitMQ
Proficiency in Java/Spring (Boot, MVC/REST, Security, Data)
AWS (EC2, S3, Lambda) experience
API design for data services
Frontend experience (modern JS + Vue.js or similar)
CI/CD (e.g., Jenkins), automated testing (JUnit)
Docker, Kubernetes, and other containerization tech
DevOps tools (Packer, Terraform, Ansible)
Qualifications
12+ years of relevant experience and a B.S. in a technical discipline
(Four additional years of experience may substitute for a degree)