This position is responsible for designing, developing, and maintaining data analytics platforms for IT operational and security analytics. The role involves interfacing with stakeholders and teams to design, set up, and monitor the environment, and offers exposure to different phases of software development and architecture as well as the opportunity to work on data processing pipelines and big data platforms.
Design, implement, test, and deploy data processing infrastructure based on an understanding of the use cases
Research and assess the viability of new processing and data storage technologies
Understand business objectives and goals, and design services that couple business logic with reusable components for future expansion
Monitor performance and advise on any necessary infrastructure changes
Communicate effectively with stakeholders as well as engineers
Deliver on assigned projects
Typical Role definition:
A seasoned, experienced professional with a full understanding of the area of specialization. Resolves a wide range of issues in creative ways. General knowledge of related disciplines. Strong competence with the various tools, procedures, and programming languages used to accomplish the job. Usually works with minimal supervision, conferring with a supervisor on unusual matters. May be assisted by (and at times direct) less senior employees. Assignments are broad in nature and require ingenuity and originality to solve. Contributes to moderately complex aspects of a project. May assist more junior staff members with aspects of their job. Works on problems of diverse scope where analysis of data requires evaluation of identifiable factors. May play a role in high-level projects that have an impact on the company's future direction.
Bachelor's degree in Computer Science, Statistics, Engineering, or a related field, or equivalent work experience.
3–5 years of professional experience
Experience with integration of data from multiple data sources, including processing a diversity of structured and unstructured data
Strong experience with Python, Scala, or Java
Has worked with SQL or NoSQL databases (such as MySQL, Cassandra, or MongoDB)
Experience with Hadoop-based big data lakes and ecosystem tools such as Spark, Scalding, and MapReduce
Experience in designing, deploying, and administering enterprise-class big data clusters (Kafka, NiFi, Storm, Spark, HDFS)
Ability to reason about performance tradeoffs
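The multi-source integration requirement above can be illustrated with a minimal, self-contained Python sketch that merges a structured source (CSV) with an unstructured one (raw log lines) into a common record shape. The data, field names, and severity convention here are hypothetical examples, not part of the role description:

```python
import csv
import io
import json
import re

# Hypothetical structured source: a CSV export of security events.
CSV_DATA = """host,severity,message
web01,high,port scan detected
db01,low,routine backup
"""

# Hypothetical unstructured source: raw syslog-style lines.
LOG_DATA = """Mar 01 10:02:11 web02 sshd: Failed password for root
Mar 01 10:05:43 db02 cron: job finished
"""

# Named groups pull host, process, and message out of each log line.
LOG_PATTERN = re.compile(r"^\w+ \d+ [\d:]+ (?P<host>\S+) (?P<proc>\S+): (?P<message>.*)$")

def from_csv(text):
    """Parse structured CSV rows into a common record shape."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"host": row["host"], "severity": row["severity"], "message": row["message"]}

def from_logs(text):
    """Extract records from unstructured log lines; severity is unknown here."""
    for line in text.strip().splitlines():
        m = LOG_PATTERN.match(line)
        if m:
            yield {"host": m.group("host"), "severity": "unknown", "message": m.group("message")}

def integrate(*sources):
    """Merge records from heterogeneous sources into one normalized list."""
    records = []
    for source in sources:
        records.extend(source)
    return records

records = integrate(from_csv(CSV_DATA), from_logs(LOG_DATA))
print(json.dumps(records, indent=2))
```

In production this normalization step would typically run inside a framework such as Spark rather than plain Python, but the design choice is the same: map each source into one shared schema before downstream analytics.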
Skills & Competencies:
Coaching and mentoring
In-depth knowledge of big data platforms and practices
Attention to detail
Decision making / self-starter
Salary: Not Disclosed by Recruiter
Industry: IT-Software / Software Services
Functional Area: IT Software – Other
Role Category: Programming & Design
Employment Type: Permanent Job, Full Time
Keywords: NoSQL, Cassandra, MySQL, Java, MongoDB, Scala, Spark, Hadoop, Python, MapReduce
Desired Candidate Profile
Please refer to the job description above
UG: B.Tech/B.E. – Any Specialization, Computers
HEAT Software India Pvt. Ltd
HEAT Software is leading the unification of Service Management and Unified Endpoint Management (UEM). We empower IT, HR, Facilities, Customer Service and other enterprise functions to simplify and automate their business processes to improve service quality, while managing and securing endpoints to proactively detect and protect against threats to business continuity. HEAT Software delivers the world’s most powerful fusion of truly flexible, scalable, secure Service Management and UEM solutions. Forged by HEAT. Our Core Values: OPTIC – One Team, Passion, Trustworthy, Innovative, Customer Focused