Manage the customer's priorities across projects and requests
Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and costs
Design and implement Big Data software products, including data models and visualizations
Participate actively in the teams you work with
Deliver high-quality solutions against tight timescales
Be proactive, suggest new approaches, and develop your capabilities
Share your strengths while learning from others to improve the team as a whole
Demonstrate a solid understanding of a range of technical skills, attitudes, and behaviors
Deliver great solutions
Stay focused on driving value back into the business
Requirements:
6 years' experience designing and developing enterprise application solutions for distributed systems
Understanding of Hadoop ecosystem components for Big Data (Sqoop, Hive, Pig, Flume)
Additional experience working with:
Hadoop, HDFS, cluster management
Hive, Pig, MapReduce, and related Hadoop ecosystem frameworks
HBase, Talend, NoSQL databases
Apache Spark or other streaming Big Data processing frameworks (preferred)