Company: Leidos
Location: Scott Air Force Base
Closing Date: 01/12/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
Description
The Leidos Digital Modernization Group seeks an Integration Engineer with a strong Software Development background to support the Enterprise Situational Awareness/Common Operational Picture (SA/COP) Team. The candidate will be responsible for integrating various data sources into Confluent (Kafka) and Elastic platforms, developing robust integration solutions, and contributing to data governance practices. The role requires expertise in Kafka, cloud platforms, and full software lifecycle automation, with experience in deploying and managing systems in a multi-site, multi-cluster environment.
The Leidos Digital Modernization Sector provides a diverse portfolio of systems, solutions, and services covering land, sea, air, space, and cyberspace for customers worldwide. Solutions for Defense include enterprise and mission IT, large-scale intelligence systems, command and control, geospatial and data analytics, cybersecurity, logistics, training, and intelligence analysis and operations support. Our team is solving the world’s toughest security challenges for customers with “can’t fail” missions.
The Joint Management Tool (JMT) supports the effective planning, deployment, trouble management, and decommissioning of deployed and non-deployed Satellite Communications (SATCOM) resources, specifically those supporting Department of Defense (DoD), mission partner (other U.S. federal agency), and international partner missions. Through automated workflows and dynamic user interfaces, the JMT system streamlines business processes and significantly reduces the recurrence and impact of human error by functioning as a homogeneous platform through which SATCOM planners, provisioners, operators, and customers can seamlessly coordinate SATCOM requests.
As a key SA/COP team member, you will work as part of a fast-paced, Agile development and implementation team to architect, design, and develop an integration solution that delivers a unified, integrated data platform, expanding the foundational Integrated Data Architecture platform (Confluent and ELK). You will work alongside others in a dedicated scrum team supporting operational end-user and support-team requirements.
Primary Responsibilities:
Integration Solutions: Develop and implement integration solutions for the JMT project using Kafka and Elastic as the primary data architecture platforms.
Data Integration: Integrate data sources (ServiceNow, Terminal Cert DB, Modem Cert DB, Baseband Mission Workup, TRS, MRS, UDL) into Confluent (Kafka) and Elastic platforms. Develop Kafka system integrations between Elasticsearch/Logstash and other systems.
Kafka Integration & Development: Develop Kafka system integrations, custom connectors, and work with ksqlDB and Kafka Streams for data processing based on the design solution.
Kafka Cluster Management: Deploy and manage Kafka clusters on Kubernetes in multi-site environments (both on-premises and cloud).
Software Lifecycle Automation: Automate the full software lifecycle, from design and development to testing and deployment, including production environments.
DevOps Pipelines: Design and build application deployment pipelines, including containerized environments using Kubernetes and Docker, and automated testing pipelines.
Basic Qualifications:
B.S. in Computer Science, Mathematics, Physics, Electrical Engineering, or Computer Engineering and 5+ years of combined experience in Kafka, Java, RESTful services, AWS, and full stack development.
Must hold an active Interim Secret DoD Security clearance.
Ability to obtain Security+ certification or equivalent DoD 8570 IAT II certification within 14 days of the start date.
Software development experience with Python, Java, and SQL. Working knowledge of HTML and JavaScript.
Advanced understanding of event streaming and Kafka integration.
Experience in application integration design and strong communication skills for collaboration with virtual teams.
Experience in developing detailed software designs, particularly with ksqlDB or Kafka Streams.
Proficiency in following a software development lifecycle and maintaining production-quality code.
Experience with distributed version control software such as Git and Bitbucket.
Knowledge of and ability to apply principles, theories, and concepts of Software Engineering.
Experience developing software on a UNIX command line platform.
Experience developing DoD requirements, traceability, and detailed plans/schedules, and writing software systems engineering documents and interface documents (IDDs/ICDs).
Preferred Qualifications:
Text Mining & ELK Stack: Experience with text mining tools and techniques, including ELK Stack for summarization, search, and entity extraction.
CI/CD & DevOps: Familiarity with CI/CD techniques, containerized pipelines, and DevOps practices.
Search & Analytics Applications: Experience with BI tools like Kibana and Splunk, and technologies like Elasticsearch, Logstash, Kafka, and NiFi.
Kubernetes & Agile: Familiarity with Kubernetes deployment, Agile methodologies, and tools.
Cloud Expertise: Familiarity with AWS GovCloud and cloud infrastructure, including networking and security policies.
Cloud Platform Expertise: Expert knowledge of cloud-integrated platforms for integration and deployment tasks.
Cross-Team Collaboration: Experience working within a matrixed organization, collaborating with project leadership and core GMS teams to combine software and integration practices with data engineering.
System Architecture & Operational Stability: Knowledge of system architecture, networks, and centralized logging (ELK) to support data transformation initiatives.
Cloud & DoD Environments: Experience developing and deploying software in a DoD environment (DISA experience is a plus), including building applications that meet DoD security standards, remediating security-scan findings, and complying with security implementation guidelines (e.g., STIGs).
Agile Processes: Experience with Agile methodologies and related tools. Experience with Atlassian tools, including JIRA and Confluence.
Certifications: Certified Confluent Developer and Certified Elastic Engineer.
Remote Teamwork: Experience working remotely with a geographically dispersed team.
This role offers the opportunity to work on advanced integration projects, combining software engineering with data integration in a dynamic environment. If you meet the qualifications and are passionate about integration engineering, we encourage you to apply.
Original Posting Date:
2024-10-02
While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range:
Pay Range $(phone number removed) - $(phone number removed)
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.