Bigdata/ AB Initio Developer with Hadoop, Spark SQL, Hive, Impala, Pig, Kafka, ETL, Informatica and DataStage experience

Request ID: 07722050 (97590301)

Location: Delaware
Duration: 12 Months

Job description
We are looking for Big Data candidates with experience in big data technologies such as Hadoop, Hive, and Spark, and in ETL solutions such as Ab Initio (strongly desired), Informatica, and DataStage.

• BS/BA degree or equivalent experience
• Advanced knowledge of application, data and infrastructure architecture disciplines
• Understanding of architecture and design across all systems
• Working proficiency in developmental toolsets
• Ability to collaborate with high-performing teams and individuals throughout the firm to accomplish common goals
• Proficiency in one or more big data technologies (Hadoop, Spark SQL, Hive, Impala, Pig, Kafka)
• Understanding of software skills such as business analysis, development, maintenance and software improvement
• 3-5 years of experience in Big Data technologies, machine learning, and utilities
• Experience utilizing and extending ETL solutions (e.g., Informatica, Talend, Pentaho, Ab Initio) in a complex, high-volume data environment
• Experience with scheduling and data integration tools such as Control-M and NiFi is highly desired
• Strong exposure in Data Management, Governance and Controls functions

Skills
APPS NICHE SKILLS – ANALYTICS-BI/DW/ETL

Full Name:
Contact:
Email:
Rate:
Skype Id:
Date of Birth:
Current Location:
Open to Relocate:
Currently in Project:
Availability to Start:
Visa Status with Validity:
Last 5 Digits of SSN:
Total Years of IT Experience:
Experience Working in US:
Available Interview Time Slots:
Passport Number:
Bachelor's Degree:
Education (Passing Year of Bachelors/Masters / University):
