Description:
Designs, develops, and implements Hadoop ecosystem-based applications to support business requirements. Follows approved life-cycle methodologies, creates design documents, and performs program coding and testing. Resolves technical issues through debugging, research, and investigation.
Experience/Skills Required:
1. Bachelor's degree in Computer Science, Information Technology, or a related field, and 5 years of experience in computer programming, software development, or a related area.
2. 3+ years of solid Java experience and 2+ years of experience designing, implementing, and supporting big data solutions in Hadoop using Hive, Spark, Drill, Impala, and HBase.
3. Hands-on experience with Unix, Teradata, and other relational databases. Experience with @Scale is a plus.
4. Strong communication and problem-solving skills.
What project or initiative will they be working on?
My Health My Data privacy regulations project
Top 3 Skills Needed or Required
Airflow
Scala
Google Cloud Storage
SQL analysis