Data Pipeline/ETL Developer with AWS, Glue, Kinesis, Redshift Spectrum, S3, data modeling, PCI/SOX/SOC/ISO, SQL, micro-services, cloud, Dell Boomi and DBA experience

Job ID: NC-632020 (910091011)

Location: Raleigh, NC (NCDHHS)
Duration: 12 months
Interview: Webcam Interview Only
Positions: 1 (1/4)

Skills:
Experience with AWS services, especially Data Pipeline, Glue, Kinesis, and Redshift Spectrum (a short sketch follows this list) Required 5 Years
In-depth experience with Amazon Redshift and working knowledge of an S3 data lake environment; AWS Certified Database certification preferred Required 3 Years
Experience in schema design and data modeling Required 3 Years
Knowledge of the compliance frameworks PCI, SOX, SOC 2, and ISO 27001, and the ability to apply their requirements and concepts to a complex environment Required 5 Years
Knowledge of and hands-on expertise with Structured Query Language (SQL) Required
Strong interpersonal communication and the ability to solve complex problems Required
Experience in debugging Required
Strong technical experience, knowledge, and understanding of micro-services architecture Required
Experience with enterprise-level cloud-based development, deployment, and auditing, including PaaS, IaaS, and SaaS (preferred) Required 5 Years
Excellent knowledge of data backup, recovery, and integrity for database solutions Required
Experience working with Dell Boomi Required 1 Year
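
For candidates gauging the expected depth with these services, here is a minimal, hedged sketch of a routine task in this stack: registering a Redshift Spectrum external table over an S3 data-lake prefix and then triggering a Glue job. This is an illustration only, not part of the official requirements; every resource name below (bucket, schema, cluster, job) is a hypothetical placeholder, and it assumes the bidp_spectrum external schema already exists.

```python
# Minimal sketch: expose S3 data to Redshift Spectrum, then run a Glue job.
# All resource names below are hypothetical placeholders.
import boto3
import psycopg2  # any Redshift-compatible PostgreSQL driver works here

# 1. Register an external (Spectrum) table over an S3 prefix.
#    Assumes the bidp_spectrum external schema has already been created.
DDL = """
CREATE EXTERNAL TABLE bidp_spectrum.claims_raw (
    claim_id   VARCHAR(32),
    member_id  VARCHAR(32),
    amount     DECIMAL(12, 2),
    claim_date DATE
)
STORED AS PARQUET
LOCATION 's3://example-bidp-lake/claims/raw/';
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="bidp", user="etl_user", password="...",
)
conn.autocommit = True  # external-table DDL cannot run inside a transaction
with conn.cursor() as cur:
    cur.execute(DDL)

# 2. Trigger the Glue job that moves raw claims into the curated zone.
glue = boto3.client("glue", region_name="us-east-1")
run = glue.start_job_run(JobName="bidp-claims-etl")
print("Started Glue job run:", run["JobRunId"])
```

In practice the external schema would be created once with CREATE EXTERNAL SCHEMA pointing at the Glue Data Catalog; after that, tables registered this way are immediately queryable from Redshift alongside local tables.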

The Data Pipeline Developer is responsible for partnering with key data stakeholders, IT leadership, project managers, and architecture leads on the development, management, performance, and optimization of the data pipelines that feed the Department’s Business Intelligence Data Platform (BIDP) solution. The position is accountable for providing the technical expertise to ensure the quality and accuracy of that data, and for processing, designing, and presenting it in ways that help those leveraging the BIDP.

The ETL Developer is responsible for:
• Working with Data Architects, Analysts, and Scientists to aid in BIDP data pipeline efforts
• Configuring ingestion and format validation for new workstreams (a sketch follows this list)
• Verifying initial and incremental data uploads and maintaining the workstreams
• Implementing data pipeline-based technologies within and associated with the BIDP environment
• Performing data pipeline testing
• Conducting data pipeline optimization, troubleshooting, and debugging, and reporting regularly on the health and performance of jobs
• Maintaining ownership of release activities that interact with pipeline projects
• Supporting and improving data pipeline automation
• Assisting in discovering, evaluating, and qualifying new technologies around BIDP data pipeline functions
• Partnering with BIDP stakeholders to establish and maintain policies, procedures, and operational standards around the data pipeline functions
• Helping the business leverage the BIDP through the data pipeline process
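
As one hedged illustration of the ingestion and format-validation duties above, the sketch below checks that a newly landed S3 object is well-formed newline-delimited JSON with the expected fields before it is admitted to a workstream. The bucket, key, and required field names are hypothetical.

```python
# Minimal sketch: validate the format of a newly landed S3 object before
# admitting it to a BIDP ingestion workstream. Names are hypothetical.
import json
import boto3

REQUIRED_FIELDS = {"record_id", "source_system", "event_date"}

def validate_upload(bucket: str, key: str) -> list[str]:
    """Return a list of format errors; an empty list means the file is valid."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    errors = []
    for lineno, line in enumerate(body.decode("utf-8").splitlines(), start=1):
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            errors.append(f"line {lineno}: not valid JSON")
            continue
        if not isinstance(record, dict):
            errors.append(f"line {lineno}: expected a JSON object")
            continue
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            errors.append(f"line {lineno}: missing fields {sorted(missing)}")
    return errors

if __name__ == "__main__":
    problems = validate_upload("example-bidp-landing", "claims/2024/06/upload.jsonl")
    print("OK" if not problems else "\n".join(problems))
```

A check like this would typically run from an S3 event notification or as a Glue workflow step, quarantining files that fail validation before they reach the lake.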

15%
Perform data pipeline process management and lifecycle support by outlining the process and setting the boundaries for data consumption and processing within the BIDP. Provide architecture and data flows for the data pipelines. Document the requirements of the data pipeline process and tools, and manage their development. Work with stakeholders on the development and maintenance of all data pipeline documentation. Take part in the development and implementation of data consumption tools within the BIDP environment. Ensure that data pipelines comply with all regulatory standards. Prepare and report on activities and utilization around data pipelines.
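
The compliance responsibility in this duty area ("Ensure that data pipelines comply with all regulatory standards") is often partially automated. As a hedged illustration, the sketch below verifies that the S3 buckets a pipeline touches have default server-side encryption configured, a baseline control under frameworks such as PCI, SOC 2, and ISO 27001; the bucket names are hypothetical placeholders.

```python
# Minimal sketch: verify server-side encryption is configured on pipeline
# buckets, a baseline control under PCI/SOC 2/ISO 27001. Names hypothetical.
import boto3
from botocore.exceptions import ClientError

PIPELINE_BUCKETS = ["example-bidp-landing", "example-bidp-lake"]

def bucket_is_encrypted(bucket: str) -> bool:
    s3 = boto3.client("s3")
    try:
        cfg = s3.get_bucket_encryption(Bucket=bucket)
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            return False  # no default encryption configured on this bucket
        raise
    return len(cfg["ServerSideEncryptionConfiguration"]["Rules"]) > 0

for name in PIPELINE_BUCKETS:
    status = "ENCRYPTED" if bucket_is_encrypted(name) else "NOT ENCRYPTED"
    print(f"{name}: {status}")
```

A real compliance program layers many more controls (access logging, bucket policies, key rotation), but automated checks of this shape are a common starting point.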

65%
Develop data pipelines, processes, and designs for data consumed within the BIDP. Identify trends and opportunities for growth through analysis of complex data sets. Evaluate methods and provide source-to-target mappings and information-model specification documents for data sets. Work directly with stakeholders to gather requirements for BIDP data pipelines. Work closely with the business to understand and maintain focus on their analytical needs, including identifying critical metrics and KPIs, and deliver actionable insights to relevant stakeholders. Define and implement data acquisition and integration logic, selecting the appropriate combination of methods and tools within the defined technology stack to ensure optimal scalability and performance of the solution.
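
To make the source-to-target mapping work above concrete, here is a minimal sketch of a declarative mapping applied to one incoming record in plain Python. The field names and transforms are hypothetical illustrations; in a real BIDP workstream this logic would more likely live in a Glue/PySpark job generated from the mapping specification document.

```python
# Minimal sketch: a declarative source-to-target mapping for one workstream.
# Field names and transforms are hypothetical illustrations.
from datetime import datetime

# target_column -> (source_field, transform)
MAPPING = {
    "claim_id":   ("ClaimNbr",  str.strip),
    "member_id":  ("MbrId",     str.strip),
    "amount_usd": ("ChargeAmt", lambda v: round(float(v), 2)),
    "claim_date": ("SvcDate",   lambda v: datetime.strptime(v, "%m/%d/%Y").date()),
}

def map_record(source: dict) -> dict:
    """Apply the source-to-target mapping to one source record."""
    return {target: transform(source[field])
            for target, (field, transform) in MAPPING.items()}

raw = {"ClaimNbr": " C-1001 ", "MbrId": "M-77",
       "ChargeAmt": "125.5", "SvcDate": "06/01/2024"}
print(map_record(raw))
# {'claim_id': 'C-1001', 'member_id': 'M-77',
#  'amount_usd': 125.5, 'claim_date': datetime.date(2024, 6, 1)}
```

Keeping the mapping declarative makes the source-to-target specification document and the code easy to diff against each other.
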
20%
Conduct testing of tools and data pipelines. Perform root cause analysis on all processes, resolve all production issues, validate all data, perform routine tests on data sets, and provide support for all data pipeline connections. Document all test procedures for BIDP data pipeline tools and processes, and coordinate with stakeholders and exchange partners to resolve issues and maintain quality.
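
Routine data-set tests like those described above often begin with simple source-to-target reconciliation. The sketch below compares row counts and null-key counts between a source and a target table; the table names and connection details are hypothetical placeholders.

```python
# Minimal sketch: reconcile a pipeline's source and target after a load.
# Table names and connection details are hypothetical placeholders.
import psycopg2

CHECKS = [
    ("row count", "SELECT COUNT(*) FROM {t}"),
    ("null keys", "SELECT COUNT(*) FROM {t} WHERE claim_id IS NULL"),
]

def run_checks(conn, source_table: str, target_table: str) -> bool:
    ok = True
    with conn.cursor() as cur:
        for label, template in CHECKS:
            counts = []
            for table in (source_table, target_table):
                cur.execute(template.format(t=table))
                counts.append(cur.fetchone()[0])
            if counts[0] != counts[1]:
                ok = False
                print(f"FAIL {label}: source={counts[0]} target={counts[1]}")
            else:
                print(f"PASS {label}: {counts[0]}")
    return ok

conn = psycopg2.connect(host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
                        port=5439, dbname="bidp", user="etl_user", password="...")
run_checks(conn, "bidp_spectrum.claims_raw", "bidp.claims_curated")
```

Checks like these are easy to extend with aggregate comparisons (for example, a SUM over amounts) when exact row parity is not the right invariant.
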
Competencies, Knowledge, Skills and Abilities Required in this Position

• Strong knowledge and hands-on experience with data pipeline tools
• Experience in data modeling
• Knowledge of the compliance frameworks PCI, SOX, SOC 2, and ISO 27001, and the ability to apply their requirements and concepts to a complex environment
• Experience in supporting a data warehouse in identifying and revising reporting requirements
• Knowledge and hands-on expertise with processing confidential data and information according to guidelines
• Strong interpersonal communication and ability to solve complex problems
• Hands-on experience with databases, e.g., SQL Server, MySQL, Oracle, Redshift (preferred), etc.
• Experience in model design, segmentation techniques, and ETL frameworks
• Knowledge and understanding of micro-services architecture
• Strong analytical skills, including data mining, evaluation, analysis, and visualization
• AWS, Azure, or GCP background (preferably all three)
• Experience with enterprise-level cloud-based development, deployment, and auditing, including PaaS, IaaS, and SaaS (preferred)
• Proficiency in scripting and markup languages, e.g., JavaScript, XML, etc.

Education and Experience Required
Bachelor’s degree in Computer Science, Computer Information Systems, Computer Engineering, or another related technical field from an appropriately accredited institution and three years of progressive experience as a data analyst/scientist; or a bachelor’s degree from an appropriately accredited institution and eight years of progressive experience in the field of information technology; or an equivalent combination of education and experience.

RTR-632020.docx

NC_Resume_Template-632020
