Job ID: TX-529501194R2 (911591111)
Hybrid/Local Snowflake Admin (15+) with SQL, Python, data warehouse, data lakes, Visio/Erwin, MS Project, agile, JIRA, visualization, BI, PII, PHI, HL7, Azure Blob, Oracle, Epics/User Stories/Tasks, CI/CD, GitHub, Jenkins, Informatica/IICS, AWS S3, Linux/Unix, and healthcare experience
Location: Austin, TX (HHSC)
Duration: 9 Months
The position is 4 days remote, with 1 day (Tuesdays) required onsite at the location listed above. The program will accept candidates located anywhere within the State of Texas.
Skills:
8 Required Experience as an ETL Administrator with a focus on data warehousing and business intelligence solutions.
8 Required Experience with Informatica IICS Administration, including installation, upgrades, and maintenance of the Informatica IICS environment.
8 Required Knowledge of PowerExchange Change Data Capture (CDC) on both PowerCenter and IICS.
8 Required Cloud experience and knowledge of Continuous Integration and Continuous Deployment (CI/CD) tools such as GitHub and Jenkins.
8 Required Ability to think innovatively and automate administrative tasks using Python and shell scripting.
8 Required Excellent analytical skills to triage and resolve production issues and outages.
8 Required Proven expertise in designing, developing, and deploying ETL pipelines using industry-standard tools like Informatica and IICS.
8 Required Familiarity with a wide range of data sources: relational databases (e.g., Oracle, SQL Server, MySQL, Snowflake), flat files, and cloud platforms (e.g., AWS S3, Azure Blob).
8 Required Experience in data quality, validation, and data migration projects.
8 Required Linux/Unix experience.
8 Required Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), the Microsoft Office Suite (Word, Excel, and PowerPoint), and MS Project.
8 Required Experience on an agile sprint team.
8 Required Experience with JIRA software.
8 Required Experience working with multiple teams concurrently, with the ability to prioritize and complete work on time with high quality.
8 Required Knowledge of relational databases and data warehousing, including platforms like Oracle, Snowflake, SQL Server, and MySQL.
8 Required Proficiency in SQL, Python and Bash.
8 Required Familiarity with cloud ecosystems like AWS, Azure, or GCP and their respective data services.
8 Required Data Quality & Modeling: Knowledge of data quality frameworks and data modeling techniques.
6 Preferred Proven ability to write well-designed, testable, efficient code using software development best practices.
4 Preferred Understanding of security principles and how they apply to healthcare data.
4 Preferred Experience with state-of-the-art software components for a performance metrics data visualization or business intelligence environment.
4 Preferred Excellent oral and written communication skills.
4 Preferred Ability to effectively manage multiple responsibilities, prioritize conflicting assignments, and switch quickly between assignments as required.
4 Preferred Bachelor’s degree in Computer Science, Information Systems, or Business or equivalent experience.
3 Preferred Prior experience in the healthcare industry.
2 Preferred Prior experience with an HHS agency.
2 Preferred Prior experience working with PII or PHI data.
2 Preferred Prior experience with Azure.
Description:
The Department of Information Resources (DIR) requires the services of one (1) Informatica administrator/data engineer, hereafter referred to as Worker, who meets the general qualifications of Systems Analyst 3, Emerging Technologies, and the specifications outlined in this document for Health and Human Services Commission (HHSC) Information Technology.
All work products resulting from the project shall be considered “works made for hire” and are the property of the HHSC. HHSC may include pre-selection requirements that potential Vendors (and their Workers) submit to and satisfy criminal background checks as authorized by Texas law. HHSC will pay no fees for interviews or discussions that occur during the process of selecting a Worker(s).
HHSC IT is continuing to develop an HHS data integration hub with a goal to accomplish the following:
• Implementation and configuration of the infrastructure for the data integration hub
• Design, development, and implementation (DD&I) of the data integration hub using an agile methodology for all standard SDLC phases that includes, but is not limited to:
• Validation of performance metric requirements
• Creation of Epics/User Stories/Tasks
• Automation of data acquisition from a variety of data sources
• Development of complex SQL scripts
• Testing: integration, load, and stress
• Deployment / publication internally and externally
• Operations support and enhancement of the data integration hub
This development effort will utilize an agile methodology based upon the approach currently in use at HHSC for the Texas Integrated Eligibility Redesign System (TIERS). As a member of the agile development team, the Worker's responsibilities may include:
o ETL administration with a focus on data warehousing and business intelligence solutions.
o Informatica IICS administration, including installation, upgrades, and maintenance of the Informatica IICS environment.
o Administration of PowerExchange Change Data Capture (CDC) on both PowerCenter and IICS.
o Use of Continuous Integration and Continuous Deployment (CI/CD) tools such as GitHub and Jenkins in a cloud environment.
o Automation of administrative tasks using Python and shell scripting.
o Triage and resolution of production issues and outages.
o Design, development, and deployment of ETL pipelines using industry-standard tools like Informatica and IICS.
o Integration of a wide range of data sources: relational databases (e.g., Oracle, SQL Server, MySQL, Snowflake), flat files, and cloud platforms (e.g., AWS S3, Azure Blob).
o Data quality, validation, and data migration work.
o Operations in Linux/Unix environments.
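As an illustration of the administrative-automation responsibility listed above (Python and shell scripting for triaging production issues), the following is a minimal sketch only. The log format, workflow names, and file contents are hypothetical assumptions for illustration and are not part of this posting or of any Informatica product API.

```python
# Hypothetical sketch: summarize severities in an ETL log to speed up triage.
# Log format and workflow/session names are invented for illustration.
import re
from collections import Counter

def summarize_etl_log(lines):
    """Count log lines per severity (DEBUG/INFO/WARN/ERROR)."""
    pattern = re.compile(r"\b(DEBUG|INFO|WARN|ERROR)\b")
    counts = Counter()
    for line in lines:
        match = pattern.search(line)
        if match:
            counts[match.group(1)] += 1
    return dict(counts)

# Sample (invented) log lines, e.g. read from a file with open(...).readlines()
sample = [
    "2024-05-01 02:00:01 INFO  workflow wf_claims_load started",
    "2024-05-01 02:03:17 ERROR session s_claims_stage failed: ORA-00001",
    "2024-05-01 02:03:18 WARN  retrying session s_claims_stage",
]
print(summarize_etl_log(sample))
```

In practice a script like this would be scheduled via cron on the Linux/Unix host and would alert (e.g., by email) when the ERROR count is nonzero.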