This is a remote position.
We are seeking a skilled and experienced Mid-Level IT Specialist to join our dynamic team. As a crucial member of our IT department, you will be responsible for providing technical support, maintaining hardware and software systems, and assisting in the implementation of IT projects. The ideal candidate will possess strong knowledge of IT systems and infrastructure, excellent problem-solving abilities, and a passion for staying up to date with the latest technologies.
Requirements
- Bachelor's degree in Information Technology, Computer Science, or a related field.
- 3 years of experience as a System Administrator or in a similar role.
- A minimum of 3 years of technical support experience (1st, 2nd, and 3rd level).
- Location: Tijuana; work on-site.
- Proficiency in operating systems (Windows, Linux, macOS) and office productivity software.
- Knowledge of networking concepts, protocols, and technologies (TCP/IP, DNS, DHCP, VPN, etc.).
- Microsoft 365 administration (Teams, Exchange, SharePoint, OneDrive, Intune).
- Experience with virtualization technologies (VMware, Hyper-V) and cloud platforms (AWS, Azure, Google Cloud).
- Familiarity with IT security best practices and tools, such as firewalls (Cisco Meraki), antivirus software, and intrusion detection systems.
- Manage the installation, configuration, and maintenance of the physical infrastructure.
- 2 years of experience with remote support tools (TeamViewer, AnyDesk).
- Inventory management experience.
- Relevant certifications (Microsoft Cloud and others).
- Scripting experience (PowerShell).
- Experience working with document repositories (wikis).
- Zoho suite experience (People, Recruit, Desk, etc.).
Responsibilities
- Provide technical support to end-users, addressing hardware and software issues promptly and efficiently.
- Install, configure, and maintain computer systems, networks, and peripherals.
- Troubleshoot and resolve network connectivity issues, including LAN, WAN, and wireless networks.
- Perform routine maintenance tasks such as system updates, backups, and patches.
- Monitor system performance and security, proactively identifying and addressing potential vulnerabilities.
- Assist in the development and implementation of IT policies and procedures to ensure compliance with industry standards and regulations.
- Provide training and guidance to end-users on IT systems and best practices.
- Evaluate and recommend new technologies, solutions, and tools to improve efficiency and effectiveness.
- Configure and manage the cloud infrastructure.
- Maintain accurate documentation of IT systems, configurations, and procedures.
- Excellent troubleshooting and problem-solving skills.
- Strong communication and interpersonal abilities, with a customer service-oriented approach.
Key Responsibilities
- Design, build, and maintain scalable data pipelines on Azure Databricks using PySpark and Spark SQL.
- Develop and optimize ETL processes for handling large data sets.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and implement solutions.
- Ensure data quality, performance, and scalability by implementing best practices for coding and data architecture.
- Create and manage data models, databases, and data lakes on Azure.
- Monitor and troubleshoot data pipelines, ensuring high availability and reliability.
- Implement security and compliance best practices in the data pipeline, following Azure standards.
- Work with Azure services like Azure Data Factory, Azure Synapse, and Azure Blob Storage to orchestrate and manage data workflows.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of hands-on experience with Azure Databricks.
- Proficiency in PySpark and Spark SQL for building scalable data pipelines.
- Solid understanding of the Azure data ecosystem, including Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics.
- Experience with ETL processes, data modeling, and data architecture.
- Familiarity with cloud security best practices and data governance.
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills.

Nice-to-Have
- Experience with CI/CD pipelines using tools like Azure DevOps.
- Knowledge of Data Warehousing and Big Data technologies.
- Experience with other programming languages like Python or Scala.
- Familiarity with Machine Learning workflows on Databricks.