
Lead Specialist, Azure Data Solutions - 25050

Overview

World Wildlife Fund (WWF), one of the world’s leading conservation organizations, seeks a Lead Specialist, Azure Data Solutions.

 

The Lead Specialist, Azure Data Solutions is a hands-on data engineering role responsible for designing, implementing, and maintaining end-to-end solutions using Microsoft Fabric and Azure services. This role involves configuring and managing Fabric workspaces, developing data models and pipelines, integrating data sources into OneLake and Blob Storage, and ensuring data quality and integrity. The specialist also creates and manages Power BI reports and dashboards, optimizes solutions for performance and reliability, and stays updated with the latest Fabric and Azure best practices.

 

The role also serves as a trusted proxy for the Director of Data Solutions in meetings, representing the team’s work and providing guidance on data best practices.

 

Salary Range: $126,600 to $181,900

 

Please note: Applicants must be legally authorized to work in the U.S. This position is not eligible for employment visa sponsorship. In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire.

*All applicants are strongly encouraged to submit a cover letter

**This position is NOT eligible for relocation

***This position is hybrid - 2 days per week at our DC office

Responsibilities

  • Designs, implements, and maintains data solutions using Microsoft Fabric and Azure services.
  • Configures and manages Fabric workspaces, semantic models, dataflows, and pipelines, applying best practices such as medallion architecture.
  • Designs Fabric architectures and environments to ensure scalability, reliability, and governance.
  • Supports the integration of data sources into OneLake, Azure SQL Database, and Azure Blob Storage.
  • Develops and maintains ETL/ELT processes using Python and PySpark with Fabric Data Engineering and Azure Data Factory.
  • Builds semantic models (e.g., star schema) and delivers both simple and advanced Power BI reports.
  • Evaluates and migrates legacy workflows and systems into Microsoft Fabric, modernizing pipelines and solutions.
  • Ensures data quality and integrity through data validation and cleansing processes.
  • Understands Azure permissions and access requirements (e.g., service principals, security groups, least-privilege) to coordinate requests with infrastructure/security teams.
  • Monitors and optimizes Fabric data solutions for performance, scalability, and cost efficiency.
  • Provides technical support and troubleshooting for data-related issues.
  • Collaborates with other WWF teams building their own pipelines, offering best practices guidance or stepping in to support delivery.
  • Identifies opportunities to enhance Microsoft Fabric and Azure data services and stays current with the latest updates and best practices.
  • Performs other duties as assigned.
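
Several of the responsibilities above reference medallion architecture and data validation/cleansing. As a minimal sketch of that pattern, in Fabric these stages would typically be implemented as PySpark notebooks or dataflows; plain Python is used here purely for illustration, and all record and field names are hypothetical:

```python
# Medallion-style pipeline sketch: bronze (raw) -> silver (validated) -> gold (aggregated).
# Plain Python stands in for PySpark / Fabric Data Engineering; data is illustrative.

bronze = [  # raw ingested records, quality not yet enforced
    {"donor": "A", "amount": "100.0", "region": "NA"},
    {"donor": "B", "amount": "oops", "region": "NA"},  # bad amount, dropped in silver
    {"donor": "C", "amount": "50.5", "region": "EU"},
]

def to_silver(rows):
    """Validate and cleanse: coerce types, drop rows that fail validation."""
    silver = []
    for row in rows:
        try:
            silver.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # in production, route rejects to a quarantine table instead
    return silver

def to_gold(rows):
    """Aggregate curated data into a reporting-ready shape (e.g., for Power BI)."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'NA': 100.0, 'EU': 50.5}
```

The same bronze/silver/gold separation applies whether the layers live in OneLake lakehouse tables or Azure SQL Database; only the storage and compute surfaces change.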

 

Key Competencies

Core capabilities for this role include analytical thinking, problem solving, communication skills, and business knowledge. Specifically, the following competencies are essential:

  • Interpersonal Communication and Collaboration - Collaborates effectively with colleagues, and receives and incorporates feedback well.

  • Technical Adaptability - Embraces new technologies, adjusts to workflow modifications, and thrives in dynamic environments.

  • Data Analysis/Analytical Thinking - Investigates problems and finds effective solutions in a timely, efficient manner; reads, understands, and interprets data to solve business challenges or build a compelling narrative that informs business decisions.

  • Big Picture Thinking - Solves complex problems with a “big picture” perspective and a sensitivity to many moving parts.

Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • MUST HAVE:
    • Minimum 6 years of relevant data experience.
    • Strong, hands-on expertise in Microsoft Fabric (workspaces, semantic modeling, dataflows, pipelines, governance).
    • Proficiency with SQL; SQL DBA experience is a plus.
    • Experience with Python and PySpark for data engineering and workflow automation.
    • Experience integrating and managing data in OneLake, Azure SQL Database, and Azure Blob Storage.
    • Experience with Power BI (data modeling, DAX, visualization best practices).
    • Knowledge of data architecture best practices in Fabric (e.g., medallion design, workspace setup).
    • Experience migrating data pipelines or systems from legacy platforms into Microsoft Fabric.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities; able to communicate effectively at all levels of the organization, including difficult concepts, new technology, and new ways of working.
  • Willingness to learn and adapt to new technologies and methodologies.
  • A self-starter who brings the energy and resilience to continually drive improvement and change.
  • Committed to building and strengthening a culture of inclusion within and across teams.
  • Preferred:
    • Experience with other Azure services such as Azure Synapse Analytics and Azure Databricks.
    • Familiarity with Azure AI tools (Azure AI Foundry, Cognitive Services, ML frameworks).
    • Experience advising or supporting SQL DBA activities in Azure managed environments (e.g., dedicated SQL databases, VMs).
    • Familiarity with data governance, metadata management, and security best practices.
    • Experience guiding other teams in data engineering best practices and solution alignment.
    • Experience working in Agile or iterative project environments.
    • Awareness of cloud cost optimization practices in Azure and Fabric.
    • Relevant Azure and/or Microsoft Fabric certifications.
  • Identifies and aligns with WWF’s core values:  
    • COURAGE – We demonstrate courage through our actions, we work for change where it’s needed, and we inspire people and institutions to tackle the greatest threats to nature and the future of the planet, which is our home.
    • INTEGRITY – We live the principles we call on others to meet. We act with integrity, accountability, and transparency, and we rely on facts and science to guide us and to ensure that we learn and evolve. 
    • RESPECT – We honor the voices and knowledge of people and communities that we serve, and we work to secure their rights to a sustainable future. 
    • COLLABORATION – We deliver impact at the scale of the challenges we face through the power of collective action and innovation.  
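
The qualifications above call for star-schema semantic modeling experience. As a minimal illustration of that structure, the sketch below uses SQLite in place of a Fabric SQL endpoint (an assumption made for portability), with a fact table keyed to dimension tables; all table and column names are hypothetical:

```python
import sqlite3

# Star-schema sketch: one fact table joined to dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_program (program_id INTEGER PRIMARY KEY, program_name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_donation (
    donation_id INTEGER PRIMARY KEY,
    program_id  INTEGER REFERENCES dim_program(program_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    amount      REAL
);
INSERT INTO dim_program VALUES (1, 'Forests'), (2, 'Oceans');
INSERT INTO dim_date    VALUES (1, 2024), (2, 2025);
INSERT INTO fact_donation VALUES
    (1, 1, 1, 100.0), (2, 1, 2, 50.0), (3, 2, 2, 75.0);
""")

# The grouped rollup a Power BI measure would surface over this model.
rows = con.execute("""
    SELECT p.program_name, SUM(f.amount) AS total
    FROM fact_donation f
    JOIN dim_program p USING (program_id)
    GROUP BY p.program_name
    ORDER BY p.program_name
""").fetchall()
print(rows)  # [('Forests', 150.0), ('Oceans', 75.0)]
```

In a Power BI semantic model, the same shape would be expressed as relationships from the fact table to each dimension, with the aggregation written as a DAX measure rather than SQL.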

To Apply: 

  • Submit cover letter and resume through our Careers Page, Requisition #25050 
  • Due to the high volume of applications, we are not able to respond to inquiries via phone 

  

 

World Wildlife Fund (WWF) promotes equal employment opportunities for all qualified individuals regardless of age, race, color, sex, religion, national origin, disability, or veteran status, or any other characteristic protected under applicable law. 

