Job ID: TX-529501270 (913091227)
Remote/Local Snowflake Data Warehouse Developer with Oracle, PostgreSQL, HA, cloning, RBAC, SSO, SCIM, Informatica, ADF, ETL/ELT, Python/Shell scripting, Erwin/Visio, MS Project, agile, JIRA, Cognos, CI/CD, PII/PHI/HL7, Azure experience
Location: Austin, TX (HHSC)
Duration: 8 Months
Position: 1
Any location within the State of Texas
Skills:
Years	Requirement	Skill/Experience
8	Required	Experience with data modeling, data integration, data warehousing, data governance, and data security
8	Required	Experience with Oracle and/or PostgreSQL in high-availability (HA) deployments, and expertise in data storage
8	Required	Proficiency in Snowflake architecture and its components.
8	Required	Hands-on experience with Snowflake objects such as Databases, Procedures, Tasks, and Streams.
8	Required	Expertise in using Snowflake’s cloning capabilities for databases and schemas.
8	Required	Proven experience in managing Snowflake Warehouses and optimizing performance for efficient query execution.
8	Required	Proficiency in Snowflake RBAC (Role-Based Access Control), including implementation of roles and privileges.
8	Required	Experience with integrating Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management.
8	Required	Experience working with data integration tools like Informatica and ADF for seamless ETL/ELT processes.
8	Required	Ability to automate administrative tasks using Snowflake SQL and scripting languages like Python or Shell scripting.
8	Required	Expertise in monitoring and troubleshooting Snowflake environments, including usage tracking and query profiling.
8	Required	Strong understanding of Snowflake’s security features such as data masking, encryption, and network policies.
8	Required	Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), and the Microsoft Office Suite (Word, Excel, and PowerPoint) and MS Project.
8	Required	Experience on an agile sprint team
8	Required	Experience with JIRA software
8	Required	Experience working with multiple teams concurrently, being able to prioritize and complete work on time with high quality
8	Required	Knowledge of Informatica 10.5
8	Required	Experience developing reports in Cognos Analytics 11.1
5	Preferred	Familiarity with CI/CD pipelines and version control for managing Snowflake code deployments.
5	Preferred	Prior experience in the Healthcare Industry
5	Preferred	Prior experience with an HHS agency
5	Preferred	Prior experience working with PII or PHI data
5	Preferred	Prior experience working with HL7 data
5	Preferred	Prior experience with Azure
4	Preferred	Bachelor’s degree in Computer Science, Information Systems, or Business, or equivalent experience.
Description:
The Department of Information Resources (DIR) requires the services of one Developer, hereafter referred to as Worker, who meets the general qualifications of Developer 3 Emerging and the specifications outlined in this document for the Health & Human Services Commission – Social Services Applications team.
All work products resulting from the project shall be considered “works made for hire” and are the property of HHSC. HHSC may include pre-selection requirements that potential Vendors (and their Workers) submit to and satisfy criminal background checks as authorized by Texas law. HHSC will pay no fees for interviews or discussions that occur during the process of selecting a Worker(s).
· Design the overall data structure, ensuring that Snowflake’s features (e.g., data sharing, scalability, secure data exchange) are fully utilized to meet the business requirements.
· Create a blueprint for how data will be stored, processed, and accessed within the Snowflake platform.
· Perform optimization of data pipelines and workflows for performance, scalability, and cost-efficiency.
· Design ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes, and optimize queries and data storage strategies.
· Integrate with other cloud services (e.g., AWS, Azure, GCP), third-party tools, and on-premises data systems.
· Design and implement strategies to control access to sensitive data, applying encryption, role-based access control, and data masking as necessary (illustrative masking sketch below).
· Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand their requirements and ensure the Snowflake environment meets those needs.
· Monitor the performance of the Snowflake environment, identifying bottlenecks and ensuring optimal query performance.
· Automate administrative tasks using Snowflake SQL and scripting languages such as Python or Shell (illustrative stream/task sketch below).
· Perform data loading using bulk loading with COPY INTO, Snowpipe for continuous ingestion, and external tables (illustrative loading sketch below).
· Use Snowflake’s zero-copy cloning capabilities for databases and schemas (illustrative cloning sketch below).
· Configure and manage Snowflake virtual warehouses, including scaling, resizing, and auto-suspend/auto-resume settings (illustrative warehouse sketch below).
· Implement roles and privileges for managing secure access using Snowflake RBAC (Role-Based Access Control), as illustrated in the role-grant sketch below.
· Integrate Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management (illustrative SSO/SCIM sketch below).
· Configure alerts and monitor data pipeline failures, resource spikes, and cost thresholds (illustrative monitoring sketch below).
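
The Snowflake SQL sketches below illustrate several of the responsibilities above. They are minimal, non-authoritative examples only; every object name used (databases, schemas, tables, columns, stages, roles, users, warehouses, monitors) is a hypothetical placeholder, not part of the HHSC environment.

Masking and network-policy sketch (hypothetical EDW database, MEMBER table, SSN column, and PHI_ADMIN_ROLE), showing column-level dynamic data masking gated on the querying role:

    -- Column-level masking policy: unmask only for an authorized role
    CREATE MASKING POLICY EDW.SEC.MASK_SSN AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PHI_ADMIN_ROLE') THEN val
           ELSE 'XXX-XX-XXXX'
      END;

    -- Attach the policy to a sensitive column
    ALTER TABLE EDW.MARTS.MEMBER
      MODIFY COLUMN SSN SET MASKING POLICY EDW.SEC.MASK_SSN;

    -- Restrict account access to approved network ranges
    CREATE NETWORK POLICY HHSC_NET_POLICY ALLOWED_IP_LIST = ('10.0.0.0/8');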
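Automation sketch using a stream and a scheduled task (hypothetical RAW.CLAIMS source table, CLAIM_ID/MEMBER_ID/CLAIM_AMOUNT columns, and ETL_WH warehouse):

    -- Stream captures changes on the source table
    CREATE STREAM IF NOT EXISTS RAW.CLAIMS_STRM ON TABLE RAW.CLAIMS;

    -- Task runs on a schedule, only when the stream has data
    CREATE TASK RAW.LOAD_CLAIMS_TASK
      WAREHOUSE = ETL_WH
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.CLAIMS_STRM')
    AS
      INSERT INTO EDW.STAGE.CLAIMS_DELTA (CLAIM_ID, MEMBER_ID, CLAIM_AMOUNT)
      SELECT CLAIM_ID, MEMBER_ID, CLAIM_AMOUNT
      FROM RAW.CLAIMS_STRM
      WHERE METADATA$ACTION = 'INSERT';

    ALTER TASK RAW.LOAD_CLAIMS_TASK RESUME;  -- tasks are created suspended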
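Data loading sketch covering bulk COPY INTO and Snowpipe continuous ingestion (hypothetical EXT_STAGE external stage):

    -- Bulk load from an external stage
    COPY INTO RAW.CLAIMS
      FROM @RAW.EXT_STAGE/claims/
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';

    -- Continuous ingestion: Snowpipe picks up new files as they arrive
    -- (on Azure, auto-ingest also requires a notification integration, not shown)
    CREATE PIPE RAW.CLAIMS_PIPE AUTO_INGEST = TRUE AS
      COPY INTO RAW.CLAIMS
      FROM @RAW.EXT_STAGE/claims/
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);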
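Zero-copy cloning sketch (hypothetical PROD_DB and DEV_DB databases):

    -- Clone an entire database without physically copying data
    CREATE DATABASE DEV_DB CLONE PROD_DB;

    -- Clone a single schema as of one hour ago (Time Travel)
    CREATE SCHEMA DEV_DB.RAW_QA CLONE PROD_DB.RAW
      AT (OFFSET => -3600);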
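Virtual warehouse configuration sketch (hypothetical ETL_WH warehouse):

    -- Auto-suspend/auto-resume keeps idle compute from accruing credits
    CREATE WAREHOUSE IF NOT EXISTS ETL_WH
      WITH WAREHOUSE_SIZE = 'XSMALL'
           AUTO_SUSPEND = 60            -- suspend after 60 seconds idle
           AUTO_RESUME = TRUE
           INITIALLY_SUSPENDED = TRUE;

    -- Resize for a heavy batch window, then scale back down
    ALTER WAREHOUSE ETL_WH SET WAREHOUSE_SIZE = 'LARGE';
    ALTER WAREHOUSE ETL_WH SET WAREHOUSE_SIZE = 'XSMALL';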
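Role-grant (RBAC) sketch (hypothetical ANALYST_ROLE, REPORTING_WH warehouse, EDW.MARTS schema, and user JDOE):

    CREATE ROLE IF NOT EXISTS ANALYST_ROLE;
    GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE ANALYST_ROLE;
    GRANT USAGE ON DATABASE EDW TO ROLE ANALYST_ROLE;
    GRANT USAGE ON SCHEMA EDW.MARTS TO ROLE ANALYST_ROLE;
    GRANT SELECT ON ALL TABLES IN SCHEMA EDW.MARTS TO ROLE ANALYST_ROLE;
    GRANT SELECT ON FUTURE TABLES IN SCHEMA EDW.MARTS TO ROLE ANALYST_ROLE;

    -- Place the role in the hierarchy and assign it to a user
    GRANT ROLE ANALYST_ROLE TO ROLE SYSADMIN;
    GRANT ROLE ANALYST_ROLE TO USER JDOE;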
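SSO/SCIM integration sketch (placeholder issuer, URL, and certificate values; Azure AD assumed as the identity provider):

    -- SAML2 single sign-on
    CREATE SECURITY INTEGRATION AZURE_AD_SSO
      TYPE = SAML2
      ENABLED = TRUE
      SAML2_ISSUER = '<issuer>'
      SAML2_SSO_URL = '<sso-url>'
      SAML2_PROVIDER = 'CUSTOM'
      SAML2_X509_CERT = '<base64-certificate>';

    -- SCIM provisioning of users and roles from Azure AD
    CREATE ROLE IF NOT EXISTS AAD_PROVISIONER;
    GRANT CREATE USER ON ACCOUNT TO ROLE AAD_PROVISIONER;
    GRANT CREATE ROLE ON ACCOUNT TO ROLE AAD_PROVISIONER;
    CREATE SECURITY INTEGRATION AAD_SCIM
      TYPE = SCIM
      SCIM_CLIENT = 'AZURE'
      RUN_AS_ROLE = 'AAD_PROVISIONER';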
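Monitoring and cost-threshold sketch (hypothetical ETL_WH_MONITOR resource monitor; QUERY_HISTORY is the standard ACCOUNT_USAGE view):

    -- Credit quota with notify/suspend triggers
    CREATE RESOURCE MONITOR ETL_WH_MONITOR
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;
    ALTER WAREHOUSE ETL_WH SET RESOURCE_MONITOR = ETL_WH_MONITOR;

    -- Usage tracking / query profiling: slowest queries in the last day
    SELECT query_id, user_name, warehouse_name, total_elapsed_time
    FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
    WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20;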
