Cloud ETL Developer Job at Compunnel, Richmond, VA

  • Compunnel
  • Richmond, VA

Job Description

The Client Information Technology Division (ITD) is seeking an experienced Master Data Analyst to join the Enterprise Data Asset team. This role involves supporting Agile teams in analyzing datasets for cloud-based data management platforms, with a focus on master data governance. The ideal candidate will have strong expertise in ETL development, spatial data, and data warehousing, with experience in languages and formats such as Python, Java, SQL, and XML. The role requires developing and testing data pipelines, understanding data life cycles, and collaborating with various teams to implement efficient data solutions.

Key Responsibilities:

  1. Collaborate with business stakeholders and project teams to understand business processes and pain points.
  2. Develop expertise in source system datasets, including those with spatial components, and review the structure and content of these datasets.
  3. Conduct entity resolution to resolve matching and merging conflicts.
  4. Elicit, document, and manage metadata, and create process flow diagrams and data flow diagrams to visualize current and proposed data processes.
  5. Break down requirements into Epics and Features, writing clear and concise user stories for technical teams.
  6. Work with the team to map user stories to data models and architecture to facilitate master data management.
  7. Support the Product Owner in maintaining the product backlog, including creating prototypes and mock-ups.
  8. Perform Quality Analyst functions, including test case creation, execution, and facilitating User Acceptance Testing (UAT).
  9. Design and develop ETL processes for business and spatial data, contributing to the Data Hub and data warehouse development.
  10. Develop new data engineering processes leveraging cloud architecture (Azure) and migrate existing data pipelines accordingly.
  11. Design and support data warehouse schema for new and existing data sources, ensuring optimization for reporting.
  12. Collaborate with data analysts, scientists, and other stakeholders to populate the data hub and warehouse with optimized structures.
  13. Partner with Data Modelers and Data Architects to refine business data requirements.

Required Qualifications:

  1. Minimum of 10 years of experience delivering business data analysis artifacts.
  2. 5+ years of experience as an Agile Business Analyst, with a strong understanding of Scrum concepts and methodology.
  3. Proven experience organizing and maintaining Product and Sprint backlogs.
  4. Strong ability to translate client and product strategy into dataset requirements and user stories.
  5. Extensive experience writing complex SQL queries for SQL Server and Oracle databases.
  6. Expertise with Azure Databricks, Azure Data Factory, and Snowflake.
  7. Familiarity with ESRI ArcGIS and enterprise data management.
  8. Experience with reporting systems, including operational data stores, data warehouses, data lakes, and data marts.
  9. Strong communication skills with the ability to collaborate effectively with diverse teams and customers.

Preferred Qualifications:

  1. Advanced understanding of data integrations.
  2. In-depth knowledge of database architectures and problem-solving skills.
  3. Experience with various data warehousing architectures, including Kimball and Inmon.
  4. Strong analytical and relational skills, with the ability to build internal and external relationships.
  5. Ability to negotiate and resolve conflicts effectively.
  6. Desire to learn, innovate, and evolve technology solutions.

Technical Skills:

  1. Experience with Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Synapse, Azure Analysis Services.
  2. Proficiency with tools like IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, and Azure SQL Data Warehouse.
  3. Knowledge of Windows and Unix operating system environments.
  4. Scripting experience with Python, Linux Shell, and Windows scripting.

Technologies and Tools:

  1. ETL tools: Data Factory, DataStage
  2. Programming languages: SQL, T-SQL, Python, Shell Scripting
  3. Business Intelligence Tools: Power BI, Tableau
