
Big Data Architect / Hadoop Architect

Experience: 8 to 13 Years | Location: Noida


Job Description:

  1. Experience building large-scale distributed data processing applications, or large-scale distributed applications in general.
  2. Hands-on engineering skills.
  3. Experience with a range of big data architectures and frameworks, including Cloudera, Hadoop, Pig, Hive, MapReduce, and HBase.
  4. Real-time data processing using Apache Storm, Spark Streaming, etc.
  5. Overall understanding of and experience with real-time analytics, data warehousing, data modeling, and data management.
  6. Understanding of analytical tools, languages, or libraries (e.g., SAS, Open R, Mahout).
  7. Experience with NoSQL databases such as MongoDB.
  8. Experience in enterprise data and analytics disciplines such as Enterprise Architecture, Enterprise Data Management, and Enterprise Data Warehousing.
  9. Experience in pre-sales activities; must be able to hold a conversation with executive-level client stakeholders.
  10. Domain experience in the Banking/Financial Services industry will be an added advantage.
  11. Successful background as an architect on EDW/Data Lake projects.
  12. Deep understanding of relational databases and data integration technologies.
  13. 4+ years' experience with the Hadoop ecosystem (HDFS, Sqoop, Hive, Pig, Spark, Scala).
  14. Experience with ETL tools (Informatica, Talend, etc.) preferred.
  15. 4+ years of data warehouse architecture experience.
  16. Experience with data visualization technologies (Tableau, Qlik, etc.) is a plus.

Educational Qualifications:

Bachelor's or Master's degree

Key Skills:

Cloudera, Big Data, Hadoop, Spark, Storm, MapReduce, MapR, HDFS, Sqoop, Scala
