Title: Security Engineer
Location: Remote
Duration: 1 year
Core Responsibilities
- Responsible for the design, implementation, and documentation of log ETL and configuration management
- Performs deployment, testing, and validation of data ingest pipelines
- Recommends alterations and additions to existing designs to improve product quality
- Works with operational teams to develop and improve monitoring solutions, plan and schedule maintenance, and implement changes
- Develops standards and procedures for managing, monitoring, and updating ETL pipelines
- Communicates progress on in-flight work and key initiatives, and provides walkthroughs of complex designs and architecture
- Works under immediate supervision; typically reports to a supervisor or manager
Required Experience and Skills
- Bachelor's degree and at least 4 years of experience in the field or a related area
- Experience developing and deploying data ingestion pipelines on open-source tools such as Logstash, Kafka, and Vector, and on cloud-based platforms such as Databricks or Snowflake
- Experience deploying monitoring, ETL, and configuration management solutions
- Experience connecting to and working in a variety of database environments
- Working knowledge of storage, compute, and transformation functions using cloud compute services
- Ability to work with a wide variety of open-source technologies and tools
- Experience with network and cybersecurity systems, tools, and environments
- Working knowledge of developing and deploying CI/CD pipelines
- Strong communication skills: an appreciation for good documentation, the ability to describe complex systems in plain language, and the ability to raise design and architecture concerns along with appropriate solutions
Technology experience and skill sets:
- Cloud: Demonstrated competency working with compute and storage services on AWS, GCP, Azure, or other cloud-based deployments
- Database: Experience developing on cloud-based data platforms such as Snowflake and Databricks, or with open-source table formats such as Apache Iceberg, preferred
- Open Source: Python, Apache Spark, HashiCorp Terraform, Vector, Apache Iceberg, Logstash, Kafka, Kubernetes
- Monitoring, Visibility, and Alerting: Datadog, Prometheus, Grafana
- Networking: TCP/IP stack, routing, VPNs, firewalls
Job Specification:
- 4 years working on scalable cloud compute/storage platforms such as AWS, GCP, or Azure
- 4 years working in database environments, programming with Python/Spark/SQL
- 4 years working with Snowflake, Databricks, or similar large-scale data platforms
- Optional certification(s): AWS, Azure, GCP, Snowflake, Databricks, Apache, or other data platform certifications preferred
Additional Details
Summary
The Security Data Developer will serve as a key member of the Security Development and Engineering team, transforming Commercial network and security event logging data to support a multitude of data-driven business use cases. The Developer will work with the architecture and engineering teams to deploy, document, and test a multi-component infrastructure designed to ETL the raw logging data into a structured database environment. They will work closely with Operations teams to deploy the logging solutions, along with procedures and runbooks for troubleshooting and managing the health of the system. They will have a positive impact on the security organization and shape how security event monitoring is provided across the Comcast Enterprise.
Tech Category: Technology