
9190-1 Telecom - Data Architect


Job Location

others - USA

Monthly Salary

Not Disclosed
Job Description

We are looking for an experienced software engineer with a focus on data engineering and ETL processes, preferably with exposure to both batch and streaming data. The candidate should be familiar with databases and data lake infrastructure, along with the associated tools for ingestion, transformation, and efficient querying across distributed data frameworks, including an understanding of performance, scalability, and query optimization.
Required:
- 2-4 years of experience in data engineering and in ad-hoc transformation of unstructured raw data
- Experience with orchestration tools
- Design, build, and maintain workflows/pipelines that process continuous streams of data, including end-to-end design and build of near-real-time and batch data pipelines (see the sketch after this list)
- Work closely with other data engineers and business intelligence engineers across teams to create data integrations and ETL pipelines, driving projects from initial concept to production deployment
- Maintain and support incoming data feeds into the pipeline from multiple sources, ranging from external customer feeds in CSV or XML format to automatic publisher/subscriber feeds
- Knowledge of database structures, theories, principles, and practices (both SQL and NoSQL)
- Active development of ETL processes and data pipelines using Python, PySpark, Spark, or other highly parallel technologies
- Experience with data engineering technologies and tools such as Spark, Kafka, Hive, Ookla, NiFi, Impala, SQL, and NoSQL
- Understanding of MapReduce and other data query processing and aggregation models
- Understanding of the challenges of transforming data across a distributed, clustered environment
- Experience with techniques for consuming, holding, and aging out continuous data streams
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Ability to provide quick ingestion tools and corresponding access APIs for continuously changing data schemas, working closely with data engineers on specific transformation and access needs
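
To illustrate the kind of batch and near-real-time pipeline work listed above, here is a minimal PySpark sketch; the file path, column names, Kafka broker address, and topic name are assumptions for illustration only, not requirements from this posting.

```python
# Minimal sketch, assuming PySpark is installed; the paths, column names,
# broker address, and topic below are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-feed-etl").getOrCreate()

# Batch ingestion of an external customer feed delivered as CSV.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("feeds/customers.csv")
)

# Ad-hoc transformation of the raw feed: normalize a column name,
# drop rows missing the key, and stamp the ingestion time.
cleaned = (
    raw.withColumnRenamed("Customer ID", "customer_id")
       .dropna(subset=["customer_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Simple aggregation before loading into the data lake as Parquet.
daily_counts = cleaned.groupBy("customer_id").agg(F.count("*").alias("events"))
daily_counts.write.mode("overwrite").parquet("lake/daily_counts")

# Near-real-time variant: the same engine can read a continuous stream
# from Kafka (requires the spark-sql-kafka package on the classpath).
stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "customer-events")
    .load()
)
```

In practice the streaming read would be followed by the same transformations and a writeStream sink; the point of the sketch is only that batch and streaming pipelines can share one API.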

Preferred:
- 1-2 years of experience developing applications with relational databases, preferably SQL Server and/or MySQL
- Some exposure to database optimization techniques for speed, normalization, and query complexity (see the sketch below)
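
As a small illustration of the kind of optimization referred to above, the following uses Python's built-in sqlite3 module; the table and column names are assumptions, and the posting itself references SQL Server and MySQL rather than SQLite.

```python
# Minimal sketch of index-based query optimization using Python's built-in
# sqlite3 module; table and column names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(100_000)],
)

query = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"

# Without an index the filter requires a full table scan.
print(conn.execute(query).fetchall())

# With an index on the filtered column, the engine can seek directly
# to the matching rows instead of scanning the whole table.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute(query).fetchall())
```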

Skills and Attributes:
1. Ability to have effective working relationships with all functional units of the organization
2. Excellent written, verbal and presentation skills
3. Excellent interpersonal skills
4. Ability to work as part of a cross-cultural team
5. Self-starter and self-motivated
6. Ability to work with minimal supervision
7. Ability to work under pressure and manage competing priorities

Technical qualifications and experience level:
1. 3-7 years of development experience using Java, Python, PySpark, Spark, and Scala, applying object-oriented approaches to designing, coding, testing, and debugging programs
2. Ability to create simple scripts and tools using Linux, Perl, and Bash
3. Development of cloud-based, distributed applications
4. Understanding of clustering and cloud orchestration tools
5. Working knowledge of database standards and end-user applications
6. Working knowledge of data backup, recovery, security, integrity, and SQL
7. Familiarity with database design, documentation, and coding
8. Previous experience with DBA CASE tools (frontend/backend) and third-party tools
9. Understanding of distributed file systems and their optimal use in the commercial cloud (HDFS, S3, Google File System, Databricks); see the sketch after this list
10. Familiarity with programming language APIs
11. Problem-solving skills and the ability to think algorithmically
12. Working knowledge of RDBMS/ORDBMS such as MariaDB, Oracle, and PostgreSQL
13. Knowledge of SDLC methodologies (Waterfall, Agile, and Scrum)
14. BS degree in a computer discipline or relevant certification
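
As a small illustration of item 9, the same PySpark read API works against an on-cluster HDFS path or a cloud object store such as S3 by changing only the URI scheme; the bucket and path names below are assumptions, and the S3 read additionally needs the hadoop-aws package and credentials configured.

```python
# Minimal sketch: reading the same Parquet dataset from HDFS and from S3.
# Paths and bucket names are illustrative assumptions; S3 access requires
# the hadoop-aws package and credentials to be configured on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dfs-read").getOrCreate()

hdfs_df = spark.read.parquet("hdfs:///data/events/")         # on-cluster HDFS
s3_df = spark.read.parquet("s3a://example-bucket/events/")   # cloud object store
```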

Employment Type

Full Time

Company Industry

About Company

100 employees