Education/Experience: 8 years of experience and a B.S. in Engineering, Computer Science, Mathematics, Statistics, Physics, Electrical Engineering, Computer Engineering, Data Science, or Data Analytics. Additional experience may be accepted in lieu of a degree.
Data Management & Collaboration: Proficiency with data management platforms and strong communication skills for effective collaboration with virtual teams of data engineers and DevOps engineers.
Software Development Lifecycle: Experience following a software development lifecycle, with the ability to develop and maintain production-quality code.
Security Clearance: Ability to obtain an interim Secret DoD security clearance before the start date.
Preferred Qualifications:
Data Automation: Experience automating data cleansing, formatting, staging, and transformation processes.
Text Mining & ELK Stack: Proficiency with text mining tools, summarization, search (ELK Stack), entity extraction, training set generation, and anomaly detection.
CI/CD & Containerization: Familiarity with CI/CD techniques for developing and releasing software through containerized pipelines.
BI Tools & Search Analytics: Knowledge of BI tools (e.g., Kibana, Splunk) and experience developing search and analytics applications.
Big Data Technologies: Experience with Elasticsearch, Logstash, Kibana, Kafka, ksqlDB, NiFi, Apache Spark, and ServiceNow.
Elastic Engineer Certification: Certified Elastic Engineer with experience developing Logstash and ingest pipelines; experience developing in Confluent ksqlDB and Kafka Streams for data ETL purposes.
Kubernetes Expertise: Familiarity with Kubernetes and container deployment.
Agile Processes: Experience with Agile methodologies and related tools.
Essential Requirements:
US Citizenship is required. Active Secret clearance.
Job Duties:
Data Analysis & Problem Solving: Analyze quantitative and qualitative data to solve stakeholder problems and improve business efficiency. Design, implement, and document solutions as repeatable processes.
ETL Pipeline Development: Perform extraction, transformation, and load (ETL) tasks. Develop and integrate data sets from diverse environments to support use cases involving network performance, application, and configuration data.
Data Modeling & Management: Develop, test, and maintain both physical and logical data models. Ensure consistency, quality, accuracy, and security of data by managing relevant metadata in support of the project. Identify and resolve Elasticsearch issues, including slow queries and indexing problems.
Adherence to Governance & SecDevOps: Follow GMS Data Governance and SecDevOps policies to develop, test, deploy, and maintain data engineering pipelines.
Cross-Team Collaboration: Work within a matrixed organization to collaborate with primary project leadership while maintaining standard practices with GMS core teams. Combine software and data engineering practices to strengthen enterprise data governance.
System Architecture & Data Transformation: Apply knowledge of system architecture, networks, and Centralized Logging (ELK) to support data transformation efforts.
Data Analytics & Visualization: Secure, maintain, optimize, and document analytics and visualization solutions, including some design and build responsibilities.
Agile Practices: Follow Agile scrum practices in daily operations.
Elastic Cluster Management: Deploy and manage Elastic clusters on Kubernetes in both on-premises and cloud environments.
Platform Expansion: Expand data platforms and analytics solutions using the Elastic and Confluent platforms, focusing on SATCOM metadata for dashboard and reporting visualizations.
Customer Visualization Support: Support customer-driven visualization requirements and collaborate on data integration and Kibana dashboard development.