Role Overview: This is a Software Engineering Development position that includes the architecture, design, and implementation of Big Data solutions.
Job Description – Responsibilities
1) Create complex data processing pipelines using Apache Spark/Hadoop. Maintain and expand batch aggregations/reports for the existing data mart using Hive/Oozie.
2) Maintain and expand real-time aggregations using Spark Streaming, Cassandra, Graphite, and Elasticsearch. Write unit tests and integration tests, and help define and execute performance tests for Datamart components. Deploy data pipelines to production following Continuous Delivery practices.
3) Create dashboards and alerts that surface critical metrics and KPIs (key performance indicators) for monitoring, tuning, and scaling data processing pipelines.
4) Work with senior architects to define the next-generation architecture for the Datamart. Work closely with the Operations team to size, scale, and tune existing and new infrastructure.
5) Clearly communicate ideas, thought processes, and design and architecture alternatives, both in discussions and in written form (wiki pages, design documents, internal blogs).
Salary: Not Disclosed by Recruiter
Industry: IT-Software / Software Services
Functional Area: IT Software – Application Programming, Maintenance
Role Category: Programming & Design
Role: Team Lead/Technical Lead
Employment Type: Permanent Job, Full Time
Desired Candidate Profile
1) 7-10+ years of experience with a BSCS, or 5-7+ years with an MSCS; 3+ years of experience building and deploying large-scale data processing pipelines in a production environment.
2) Experience with Cloudera Hadoop is a plus. Production-level, hands-on experience with HDFS, Java MapReduce, Hive, Apache Spark, Oozie, and other tools in the Big Data stack.
3) Hands-on experience with one or more NoSQL databases such as Cassandra, Elasticsearch, or Redis. Knowledge of software best practices such as Test-Driven Development (TDD) and Continuous Integration (CI). Solid knowledge of multi-threaded design, concurrency, and distributed systems.
4) Experience working on a SaaS (Software as a Service) product in the cloud is a plus. Knowledge of software development methodologies, including Agile, TDD, and CI/CD. Knowledge of Unix scripting languages such as Bash, Perl, PHP, or Python.
TIBCO Software India Pvt Ltd
TIBCO Software Inc. is a provider of infrastructure software for companies to use on-premise or as part of cloud computing environments. Whether it's efficient claims or trade processing, cross-selling products based on real-time customer behaviour, or averting a crisis before it happens, TIBCO provides companies the two-second advantage™: the ability to capture the right information, at the right time, and act on it pre-emptively for a competitive advantage. More than 4,000 customers worldwide rely on TIBCO to manage information, decisions, processes, people and data in real time.
Visit www.tibco.com to learn more about the exciting work we do.