
Unison Consulting - Big Data Engineer - Hadoop & Kafka (2-4 yrs) Bangalore

Unison Consulting Pte Ltd

This listing was posted on hirist.


Location:
Bangalore
Description:

We are seeking a talented Big Data Engineer to join our team and play a key role in designing, developing, and maintaining our big data infrastructure. You will leverage your expertise in Java, Hadoop, and Kafka to build efficient data pipelines, handle real-time data streams, and ensure high-quality data processing for our data analytics needs.

Responsibilities:
- Design, develop, and maintain big data processing applications, workflows, and pipelines using Hadoop technologies.
- Use Java to build and enhance data processing components and services within the Hadoop ecosystem.
- Implement Kafka-based data ingestion and streaming solutions to efficiently manage real-time data flows.
- Leverage Hadoop technologies (HDFS, MapReduce, Spark, Hive, HBase) for batch and real-time data processing tasks.
- Develop and optimize data ingestion processes to ensure efficient data collection, transformation, and loading into the Hadoop cluster.
- Monitor and optimize the performance of Hadoop jobs and data pipelines for scalability and efficiency.
- Clean and transform data for analysis, ensuring data quality, integrity, and compatibility with downstream applications.
- Design and implement Kafka-based streaming solutions to handle large volumes of real-time data.
- Troubleshoot data processing and Kafka-related issues promptly.
- Collaborate with data engineers, data scientists, and other cross-functional teams to understand data requirements and deliver solutions.
- Create and maintain comprehensive documentation of Hadoop configurations, Java code, Kafka setups, and data processing workflows.
- Implement and maintain security measures to safeguard sensitive data within the Hadoop ecosystem and Kafka clusters.

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (preferred).
- Proven experience as a Hadoop developer with strong Java and Kafka skills.
- Proficiency in Java programming for building data processing applications.
- Hands-on experience with core Hadoop ecosystem components (HDFS, MapReduce, Spark, Hive, HBase).
- Familiarity with Kafka for data streaming and ingestion.
- Strong understanding of data transformation and ETL processes.
- Excellent problem-solving skills and a keen eye for detail.
- Effective communication skills for collaborating with team members and stakeholders.
- Experience with data security and compliance is a plus.
- Ability to thrive in a fast-paced environment and adapt to evolving technologies and project requirements.

(ref:hirist.tech)
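To give a flavor of the "clean and transform data for analysis" duty above, here is a minimal Java sketch of a record-cleaning transform. The field names (`id`, `name`) and quality rules are hypothetical illustrations, not part of the listing; in the actual role this logic would typically live inside a Spark or MapReduce job rather than a standalone class.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RecordCleaner {
    // Hypothetical quality rules: drop records that are missing an "id",
    // and normalize the "name" field (trim whitespace, lower-case) so
    // downstream consumers see a consistent representation.
    public static List<Map<String, String>> clean(List<Map<String, String>> records) {
        List<Map<String, String>> cleaned = new ArrayList<>();
        for (Map<String, String> record : records) {
            String id = record.get("id");
            if (id == null || id.isEmpty()) {
                continue; // reject incomplete records to protect data quality
            }
            Map<String, String> copy = new HashMap<>(record);
            copy.put("name", record.getOrDefault("name", "").trim().toLowerCase());
            cleaned.add(copy);
        }
        return cleaned;
    }
}
```

The same pattern (validate, normalize, pass through) scales to a distributed setting: the per-record function stays pure and stateless, so it can be applied as a map step over any partitioned input.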
Education/experience:
2 to 5 years
Company:
Unison Consulting PTE
Posted:
April 25 on hirist
