
Bigdata/Hadoop Architect

8 to 13 Years | Noida


Job Description :

The selected candidate will join a talented team in a challenging role, implementing and delivering solutions for various business processes while meeting deliverables on time.

Roles and Responsibilities :

  1. Experience building large-scale distributed data processing applications.
  2. Hands-on engineering skills.
  3. Experience with a range of big data architectures, including Cloudera, Hadoop, Pig, Hive, MapReduce, HBase, and other big data frameworks.
  4. Real-time data processing using Apache Storm, Spark Streaming, etc.
  5. Overall understanding of, and experience with, real-time analytics, data warehousing, data modeling, and data management.
  6. Understanding of analytical tools, languages, or libraries (e.g., SAS, R, Mahout).
  7. Experience with NoSQL databases such as MongoDB.
  8. Experience in enterprise data & analytics disciplines such as Enterprise Architecture, Enterprise Data Management, and Enterprise Data Warehousing.
  9. Experience in pre-sales activities; must be able to hold conversations with executive-level client stakeholders.
  10. Domain experience in the Banking/Financial Services industry is an added advantage.
  11. Successful background as an architect on EDW/Data Lake projects.
  12. Deep understanding of relational databases and data integration technologies.
  13. 4+ years of experience with the Hadoop ecosystem (HDFS, Sqoop, Hive, Pig, Spark, Scala).
  14. Experience with ETL tools (Informatica, Talend, etc.) preferred.
  15. 4+ years of data warehouse architecture experience.
  16. Experience with data visualization technologies (Tableau, Qlik, etc.) is a plus.

Educational Qualifications :

Bachelor's / Master's degree

Key Skills :

Cloudera, Big Data, Hadoop, Spark, Storm, MapReduce, MapR, HDFS, Sqoop, Scala

Contact Details :

