Qualification: BCA/B.Sc./B.Tech/B.E. in any specialization
Experience: 10 – 14 years
Skills (Knowledge, Skills & Abilities):
- Experience with enterprise data management, Business Intelligence, data integration, and SQL database implementations
- Experience with major big data solutions such as Hadoop, MapReduce, Hive, Spark, Scala, HBase, MongoDB, and Cassandra.
- Proficiency in programming/scripting languages such as Java, Linux shell scripting, PHP, Ruby, Python, and/or R, along with experience working with ETL tools such as Informatica, Talend, and Pentaho.
- Experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architectures and high-scale or distributed RDBMSs, and/or knowledge of NoSQL platforms.
- Experience in data migration from relational databases to Hadoop HDFS
- Ability to propose best practices and standards
- Ability to translate, load, and present disparate datasets from multiple sources and formats, including JSON, XML, etc.
- Demonstrated experience independently proposing architecture, design, and data ingestion concepts in a consultative role.
Role & Responsibilities:
- Work closely with the customer and the solutions architect to translate the customer’s business requirements into a Big Data solution.
- Understand the customer’s data requirements and drive platform selection, technical architecture design, application and interface design, and the development, testing, and deployment of the proposed solution.
- Design enterprise-grade, large-scale data processing systems and help identify the best architecture options.
- Understand the complexity of data and design systems and models that handle a wide variety of data with varying levels of volume, velocity, and veracity.
- Lead client assessments, preparing current-state and future-state architectures along with go-forward recommendations.
- Work with the practice leads and account management team to develop statements of work, implementation plans, resource plans, and project estimates.