
Job Details

Senior Software Engineer (8261_R-594658)

Dallas, Texas, United States

Position Summary...

What you'll do...


  • Strong programming experience in Core Java, with a solid grasp of algorithms, data structures, and design patterns
  • Develop and maintain the Apache Druid and Presto platforms/services
  • Work with customers to optimize their Apache Druid and Presto queries and data-ingestion performance
  • Use standard tools to tune, profile, and debug Java Virtual Machines (JVMs)
  • Contribute to the design, implementation, and maintenance of systems across data platforms and tools
  • Troubleshoot and triage issues related to the data ETL framework, metadata, workflow management tools, data quality, and open-source database solutions such as Presto and MySQL, as well as cloud-native components, to ensure SLA adherence
  • Own production incidents/issues and respond to infrastructure incidents and alerts
  • Package and deploy application updates and patches
  • Build tools to continuously monitor and alert on platform components
  • Continually improve CI/CD tools, processes, and procedures
  • Exposure to building microservices using the Spring Boot framework (optional)
  • Write and maintain infrastructure documentation
  • Work with distributed teams in a collaborative and productive manner
  • Identify the right open-source tools to improve the product by performing research, POCs/pilots, and/or interacting with open-source forums
  • Support an on-call 12x7 rotation when needed
  • Promote and support company policies, procedures, mission, values, and standards of ethics and integrity

  • Good academic record; graduated with a B.E., B.Tech, M.E., or M.Tech
  • 6+ years of industry experience focused on solving data challenges
  • Deep exposure to Apache Druid and Presto internals and troubleshooting
  • Strong-to-expert skills in Apache Druid, with multiple examples of implementing or supporting the software
  • Experience managing data platforms and ETL pipelines, with the ability to debug at the code level
  • Strong understanding of the internals of at least one distributed processing framework, such as MapReduce, Hive, or Spark (Java/Python/SQL)
  • Understanding of Linux/Unix operating systems and network protocols (TCP/IP, DNS, HTTP), and usage of system diagnostic tools
  • Experience in one or more programming languages, such as Java or Python
  • Experience building monitoring tools and automation for managing production systems
  • Hands-on experience performing proofs of concept based on business and technology needs, and communicating technology direction to senior management
  • Experience with source code repositories (Git), CI tools (Jenkins, Maven), and software provisioning and deployment automation tools (Ansible)
  • Exposure to Azure or Google Cloud, and to performing orchestration, deployments, and CI/CD using Ansible and Terraform
  • Exposure to deploying and managing Kubernetes and container technologies such as Docker is an added advantage
  • Exposure to at least one database such as MongoDB, MySQL, Teradata, or a big data store is required
  • Experience building scalable, highly available distributed systems in a production environment is desirable

Minimum Qualifications...

Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.

Bachelor's degree in Computer Science and 3 years' experience in software engineering or related field OR 5 years' experience in software
engineering or related field.

Preferred Qualifications...

Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.

Master's degree in Computer Science or related field and 2 years' experience in software engineering or related field

Primary Location...
603 MUNGER AVE STE 400, DALLAS, TX 75202, United States of America