The essential functions include, but are not limited to, the following:
· Business Requirements Analysis & Stakeholder Engagement
o Work directly with business stakeholders to gather reporting needs and translate them into effective technical solutions.
o Conduct requirement analysis, prototyping, and validation to ensure accuracy and usability.
o Provide user training and ongoing support to ensure adoption and success.
· Develop and Maintain Power BI Reports & Dashboards
o Design, build, and optimize dashboards, reports, and analytics using Power BI.
o Integrate data from SQL Server, ERPs, and other business systems.
o Ensure reports are intuitive, visually engaging, and easily understood by business users.
· Support Data Infrastructure & Implementation of Data Lake & Databricks
o Assist in the design and implementation of tools like Data Lake and Databricks for enterprise-wide data management.
o Collaborate with internal and third-party data engineers and other technical resources to ensure seamless data integration and transformation.
o Ensure data governance, security, and best practices are followed.
· Data Modeling & ETL Development
o Design and develop data models to support analytics and reporting needs.
o Create and optimize ETL processes for data ingestion, transformation, and reporting.
o Ensure data accuracy, consistency, and performance optimization.
· Collaboration & Continuous Improvement
o Partner with cross-functional teams to align data solutions with business objectives.
o Stay updated with emerging BI and data technologies to drive continuous improvement.
o Advocate for best practices in data visualization, analytics, and user experience.
Minimum Qualifications (Knowledge, Skills, and Abilities)
Technical:
· Power BI Expertise: advanced skills in DAX, Power Query (M), and report design.
· SQL Server & Database Knowledge: ability to query, manipulate, and model data.
· Data Integration: experience integrating data from ERPs, CRMs, and other business applications.
· Cloud Data Services: familiarity with Data Lake, Databricks, and related toolsets on Azure and/or AWS.
· ETL & Data Transformation: experience with tools like Azure Data Factory, SSIS, or similar.
· Data Modeling: strong understanding of star schema, snowflake schema, and OLAP concepts.
· Python or Spark: working knowledge for use with Databricks and large-scale data processing.
· Technical Documentation Skills: ability to document technical processes, designs, and user instructions clearly and effectively.