Big Data Developer
On-site in Phoenix, AZ
· 6+ years of data experience developing, deploying, and supporting high-quality, fault-tolerant data pipelines (leveraging distributed data-movement technologies and approaches, including but not limited to ETL and streaming ingestion and processing)
· 2+ years of experience with cloud-based data offerings (AWS, Google Cloud Platform, Microsoft Azure)
· 3+ years of programming experience in Java, Python, Scala, or C++
· 3+ years of experience with ETL tools (Informatica or DataStage preferred)
· 2+ years of experience with the Hadoop ecosystem
· 4+ years of experience with distributed NoSQL databases and event brokers (e.g., Apache Cassandra, Apache Kafka, graph databases, document-store databases)
· 4+ years of advanced distributed schema and SQL development experience, including partitioning to optimize the performance of ingestion and consumption patterns
· Experience building and implementing cloud data warehouses (AWS); ability to stream data from an on-premises system to S3 using technologies such as Kafka
· Extensive third-party software integration experience
· Experience with core systems modernization
· Experience migrating data capabilities to the cloud
· Experience working in Agile environments
Info to keep in mind:
· US citizen or Green Card holder status required; no sponsorship provided by the Fortune 100 client at this time
· Fortune 100 client (AZ) benefits after conversion to full-time: 16% bonus, 8% 401(k) match, tuition/certification reimbursement, on-site daycare, on-site pharmacy and healthcare services, personal concierge service, on-site fitness center, covered parking, on-site food courts, and Starbucks
· Hiring 10 new DataStage, Informatica, ETL, and AWS data developers
· Also hiring PeopleSoft developers and 60-70+ Java developers for the Phoenix, AZ office specifically
· Bank client requires 100% on-site work, Monday-Friday, at the Phoenix office location