Saturday, February 28, 2026

Looking for Databricks Data Engineer with DevOps Skills - Los Angeles, CA (Hybrid)

Hi Folks,

 

I am Mahesh Kumar, Senior Recruiter at Kodeva LLC. I saw your profile on LinkedIn, and we are looking for a candidate with skills in Databricks, Lakehouse Architecture, AWS, PySpark/Spark, and DevOps & CI/CD.

 

Note: Please avoid GC consultants for C2C

 

Job Description:

 

Job Title: Databricks Data Engineer with DevOps Skills

Location: Los Angeles, CA (Hybrid)

Employment Type: Long term contract

 

Job Summary

We are looking for an experienced Databricks Data Engineer with strong DevOps expertise to join our data engineering team. The ideal candidate will design, build, and optimize large-scale pipelines on the Databricks Lakehouse Platform on AWS, while driving automated CI/CD and deployment practices. This role requires strong skills in PySpark, SQL, AWS cloud services, and modern DevOps tooling. You will collaborate closely with cross-functional teams to deliver scalable, secure, and high-performance data solutions.

 

Must Demonstrate (Critical Skills & Architectural Competencies)

  • Designing and implementing Databricks-based Lakehouse architectures on AWS
  • Clear separation of compute vs. serving layers
  • Ability to design low-latency data/API access strategies (beyond Spark-only patterns)
  • Strong understanding of caching strategies for performance and cost optimization
  • Data partitioning, storage optimization, and file layout strategy
  • Ability to handle multi-terabyte structured or time-series datasets
  • Skill in requirement probing, identifying what matters architecturally
  • A player-coach mindset: hands-on engineering + technical leadership
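The caching competency above can be illustrated with a minimal time-to-live cache. This is a sketch only (the class name, TTL, and loader are illustrative, not from the posting); it shows the basic idea of serving repeated lookups from memory instead of re-running an expensive query:

```python
import time

class TTLCache:
    """Tiny time-based cache: serve repeated lookups from memory
    instead of re-running an expensive query within the TTL window."""

    def __init__(self, ttl_seconds: float, loader):
        self.ttl = ttl_seconds
        self.loader = loader   # function that fetches the value on a miss
        self._store = {}       # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]    # cache hit: no recompute
        value = self.loader(key)  # miss: recompute and cache
        self._store[key] = (time.monotonic() + self.ttl, value)
        return value
```

In a Lakehouse setting the same pattern shows up as Databricks disk/query-result caching or an external store (e.g. Redis) in front of the serving layer, trading freshness for latency and compute cost.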

 

Key Responsibilities

1. Data Pipeline Development

  • Design, build, and maintain scalable ETL/ELT pipelines using Databricks on AWS.
  • Develop high-performance data processing workflows using PySpark/Spark and SQL.
  • Integrate data from Amazon S3, relational databases, and semi-structured/unstructured sources.
  • Implement Delta Lake best practices including schema evolution, ACID, OPTIMIZE, ZORDER, partitioning, and file-size tuning.
  • Ensure architectures support high-volume, multi-terabyte workloads.
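The file-size tuning bullet above can be sketched as a small helper. The 128 MB default reflects common Delta Lake guidance rather than a requirement from this posting, and the helper and table/column names in the comments are illustrative:

```python
def target_file_count(dataset_bytes: int,
                      target_file_bytes: int = 128 * 1024**2) -> int:
    """How many output files to aim for so each lands near the target size.
    Too many small files slow down scans; too few limit read parallelism."""
    return max(1, -(-dataset_bytes // target_file_bytes))  # ceiling division

# In PySpark one would then, roughly:
#   df.repartition(target_file_count(estimated_bytes)) \
#     .write.format("delta").save(path)
# and periodically compact and cluster the table:
#   spark.sql("OPTIMIZE my_table ZORDER BY (event_date)")
```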

 

2. DevOps & CI/CD

  • Implement CI/CD pipelines for Databricks using Git, GitLab, GitHub Actions, or AWS-native tools.
  • Build and manage automated deployments using Databricks Asset Bundles.
  • Manage version control for notebooks, workflows, libraries, and environment configuration.
  • Automate cluster policies, job creation, environment provisioning, and configuration management.
  • Support infrastructure-as-code via Terraform (preferred) or CloudFormation.

 

3. Collaboration & Business Support

  • Work with data analysts and BI teams to prepare curated datasets for reporting and analytics.
  • Collaborate closely with product owners, engineering teams, and business partners to translate requirements into scalable implementations.
  • Document data flows, technical architecture, and DevOps/deployment workflows.

 

4. Performance & Optimization

  • Tune Spark clusters, workflows, and queries for cost efficiency and compute performance.
  • Monitor pipelines, troubleshoot failures, and maintain high reliability.
  • Implement logging, monitoring, and observability across workflows and jobs.
  • Apply caching strategies and workload optimization techniques to support low-latency consumption patterns.

 

5. Governance & Security

  • Implement and maintain data governance using Unity Catalog.
  • Enforce access controls, security policies, and data compliance requirements.
  • Ensure lineage, quality checks, and auditability across data flows.

 

Technical Skills

  • Strong hands-on experience with Databricks, including:
    • Delta Lake
    • Unity Catalog
    • Lakehouse Architecture
    • Delta Live Tables (DLT) pipelines
    • Databricks Runtime
    • Table triggers
    • Databricks Workflows
  • Proficiency in PySpark, Spark, and advanced SQL.
  • Expertise with AWS cloud services, including:
    • S3
    • IAM
    • Glue / Glue Catalog
    • Lambda
    • Kinesis (optional but beneficial)
    • Secrets Manager
  • Strong understanding of DevOps tools:
    • Git / GitLab
    • CI/CD pipelines
    • Databricks Asset Bundles
  • Familiarity with Terraform is a plus.
  • Experience with relational databases and data warehouse concepts.

 

Preferred Experience

  • Knowledge of streaming technologies like Structured Streaming/Spark Streaming.
  • Experience building real-time or near real-time pipelines.
  • Exposure to advanced Databricks runtime configurations and performance tuning.

 

Certifications (Optional)

  • Databricks Certified Data Engineer Associate / Professional
  • AWS Data Engineer or AWS Solutions Architect certification

 

 

Thanks & Regards,

 

Mahesh Kumar

Senior Recruiter

mahesh.kumar@kodeva.com 

Kodeva LLC

Address: 20755 Williamsport Pl #320, Ashburn VA 20147

IT Consulting | Staffing Solutions

USA | CANADA | INDIA

 

 

Python Developer (Azure / Terraform) for Chicago, IL (In-person interview required) - Hybrid

Hi

My name is Rohit Chauhan, and I am a Staffing Specialist at Novia Infotech LLC. I am reaching out to you about an exciting job opportunity with one of our clients.

Job Title: Python Developer - Azure
Location: Chicago, IL (In-person interview required) - Hybrid


Job Summary

We are seeking an experienced API Developer with strong expertise in Microsoft Azure, Python, and Terraform. The ideal candidate will have extensive experience in API development, infrastructure automation, and cloud-native solutions. This role requires hands-on experience with Azure services, API security, and CI/CD automation using GitHub Actions. The candidate should be capable of designing, developing, and managing scalable APIs and cloud infrastructure solutions.


Key Responsibilities

API Development

  • Design and develop APIs using REST, SOAP, and GraphQL standards.
  • Implement secure APIs using OAuth 2.0 and JWT authentication.
  • Maintain and optimize API performance.
  • Develop reusable API components and services.
  • Ensure API scalability and reliability.
  • Document API specifications and integrations.

Azure Cloud Development

  • Design and implement cloud-based solutions using Microsoft Azure.
  • Deploy and manage Azure services and resources.
  • Configure Azure networking components including:
    • Virtual Networks (VNets)
    • Subnets
    • Network Security Groups (NSGs)
    • Route Tables
    • Private Endpoints
  • Implement secure cloud architectures.
  • Monitor and troubleshoot Azure environments.

Infrastructure as Code (Terraform)

  • Author and maintain Terraform modules.
  • Automate Azure resource provisioning.
  • Refactor and optimize Terraform code.
  • Manage infrastructure deployments.
  • Implement Infrastructure as Code best practices.
  • Maintain Terraform state and configurations.

CI/CD & Automation

  • Develop and maintain CI/CD pipelines using GitHub Actions.
  • Integrate Terraform deployments into CI/CD pipelines.
  • Configure workflow automation.
  • Manage secrets and environment variables.
  • Implement automated testing in pipelines.
  • Support continuous integration and deployment processes.

Python Development

  • Develop automation scripts using Python.
  • Integrate Python scripts with Azure services.
  • Use Azure SDKs for resource automation.
  • Develop REST API integrations.
  • Automate operational tasks.
  • Maintain and optimize Python scripts.

Security & Networking

  • Implement API security standards.
  • Configure OAuth 2.0 authentication.
  • Implement JWT token validation.
  • Secure API endpoints.
  • Implement network security best practices.
  • Configure firewalls and access controls.
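To illustrate the JWT token validation requirement above, here is a minimal stdlib-only HS256 sketch. It is illustrative only: production services should use a vetted library such as PyJWT, and typically validate RS256 tokens against keys fetched from the identity provider's JWKS endpoint rather than a shared secret.

```python
import base64
import hashlib
import hmac
import json

def b64url_decode(s: str) -> bytes:
    # JWT uses unpadded base64url; restore padding before decoding.
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def b64url_encode(b: bytes) -> str:
    return base64.urlsafe_b64encode(b).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: bytes) -> str:
    """Build a compact JWS: base64url(header).base64url(payload).base64url(sig)."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (b64url_encode(json.dumps(header, separators=(",", ":")).encode())
                     + "."
                     + b64url_encode(json.dumps(payload, separators=(",", ":")).encode()))
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url_encode(sig)

def verify_hs256(token: str, secret: bytes) -> dict:
    """Check the HMAC signature in constant time, then return the claims."""
    signing_input, _, sig_b64 = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("invalid signature")
    return json.loads(b64url_decode(signing_input.split(".")[1]))
```

A real validator would also check the `exp`, `iss`, and `aud` claims before trusting the token.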

Required Skills

API Technologies

  • REST APIs
  • SOAP APIs
  • GraphQL
  • API Design
  • API Documentation

API Security

  • OAuth 2.0
  • JWT Authentication
  • API Security Best Practices

Cloud Technologies

  • Microsoft Azure
  • Azure Cloud Services
  • Cloud Networking
  • Cloud Security

Azure Services

  • Azure API Management
  • Application Gateway (WAF)
  • Azure Front Door
  • Azure Key Vault
  • Azure Monitoring

Infrastructure as Code

  • Terraform
  • Infrastructure Automation
  • Cloud Provisioning

Programming

  • Python
  • REST API Integration
  • Automation Scripting

DevOps Tools

  • GitHub Actions
  • CI/CD Pipelines
  • Version Control
  • Automated Testing

Networking

  • VNets
  • Subnets
  • NSGs
  • Route Tables
  • Private Endpoints

Preferred Skills

  • Azure SDK Experience
  • Cloud Automation
  • Microservices Architecture
  • API Gateway Configuration
  • Infrastructure Security
  • Performance Optimization

 

 

Rohit Chauhan

IT Recruiter

E: rohit.c@noviainfotech.com

www.noviainfotech.com

A: 4421 Avenida Ln, McKinney, TX, 75070

 

 

 

 

--
You received this message because you are subscribed to the Google Groups "NoviaJobs" group.
To unsubscribe from this group and stop receiving emails from it, send an email to noviajobs+unsubscribe@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/noviajobs/CAJ0-OE8rQq6LSL-kbB-Ag5nsJRUHY6tedyVgZ_jsp54V6XXuKQ%40mail.gmail.com.

Infrastructure Engineer || Pittsburgh, PA (Onsite) - PA local candidates only


Hiring for an API Developer for Chicago, IL (Locals Only - F2F Interviews)

Hi,

Local to Chicago candidates only (face-to-face interviews required).

Job Title: API Developer (Azure / Terraform / Python)

Location: Chicago, IL (Locals Only - F2F Interviews)
Duration: 12+ Months
Interview Process:

  • In-person Client Evaluation (Preferred)
  • If not feasible: 1st Round – Phone | 2nd Round – In-person

Role Overview

We are seeking a skilled API Developer with strong expertise in Microsoft Azure, Terraform (Infrastructure as Code), and Python automation. The ideal candidate will have hands-on experience designing secure APIs, provisioning Azure infrastructure, and implementing CI/CD pipelines using GitHub Actions.

This role requires strong cloud engineering knowledge, infrastructure automation skills, and experience with API security standards.


Key Responsibilities

  • Design, develop, and maintain APIs using REST, SOAP, and GraphQL standards.
  • Implement secure API authentication and authorization mechanisms (OAuth 2.0, JWT).
  • Develop and maintain Terraform modules for Azure infrastructure provisioning.
  • Deploy and manage Azure cloud resources using Infrastructure as Code (IaC).
  • Build and manage CI/CD workflows using GitHub Actions.
  • Integrate Terraform deployments into automated CI/CD pipelines.
  • Develop and maintain Python scripts for cloud automation.
  • Work with Azure SDKs and REST APIs for resource management.
  • Ensure best practices in security, networking, and monitoring.
  • Collaborate with DevOps and cloud engineering teams.

Required Skills

API Development

  • Strong understanding of REST, SOAP, and GraphQL
  • Knowledge of API security concepts:
    • OAuth 2.0
    • JWT Token Validation

Terraform (Infrastructure as Code)

  • Experience authoring, maintaining, and refactoring Terraform modules.
  • Strong understanding of Azure resource provisioning using Terraform.

Microsoft Azure Cloud

  • Hands-on experience with Azure Cloud Services.
  • Experience with:
    • Azure API Management
    • Application Gateway (including WAF)
    • Azure Front Door
    • Azure Key Vault
    • Azure Monitoring solutions
  • Strong networking knowledge:
    • VNets
    • Subnets
    • NSGs
    • Route Tables
    • Private Endpoints

GitHub CI/CD Automation

  • Proficient in GitHub Actions (workflow authoring, secrets management, environment configuration).
  • Experience integrating Terraform into CI/CD pipelines.
  • Familiarity with automated testing within pipelines.

Python Scripting

  • Strong Python scripting skills for automation.
  • Experience using Azure SDKs and REST APIs for resource management.

Experience Required

  • Strong experience in Microsoft Azure
  • Python development & automation
  • Terraform (Infrastructure as Code)

Core Competencies

  • Cloud Infrastructure Automation
  • API Security & Governance
  • DevOps & CI/CD
  • Cloud Networking
  • Infrastructure Scalability & Monitoring

 


Need: Embedded Systems Engineer for Mountain View, CA

Embedded Systems Engineer – Automotive Infotainment

Location: Mountain View, CA 94043
Duration: 12+ Months
Work Mode: 100% Onsite
Experience Required: 10+ Years
Competency: Embedded Software / Firmware Development


Position Overview

We are seeking an experienced Systems Engineer – Automotive Infotainment with strong expertise in embedded systems and vehicle architecture. The ideal candidate will support requirement elicitation, system bring-up, validation, and integration activities across bench and vehicle environments.

This role requires deep technical understanding of automotive embedded development processes, infotainment system architecture, and cross-functional coordination with global stakeholders.


Key Responsibilities

Requirement Management & System Engineering

  • Support requirement elicitation and clarification (SYS.1).
  • Gather, analyze, and mature system requirements.
  • Act as an interface between the direct customer and the end customer.
  • Coordinate infotainment-related requirements across:
    • Software
    • Hardware
    • System
    • Function Control
  • Use tools such as:
    • DOORS
    • Polarion
  • Create and mature System Test Plans.
  • Develop UML state and sequence diagrams.


Infotainment System Bring-Up & Validation

  • Perform infotainment system bring-up:
    • Bench-level
    • Vehicle-level
  • Support hardware validation and industrialization.
  • Verify system functionality and report defects.
  • Conduct evaluation drives from a system integration perspective.
  • Analyze software performance using logs from vehicle/bench testing.
  • Lead and conduct pre-verification events.


Embedded & Automotive Technical Expertise

  • Strong understanding of:
    • Automotive embedded development processes
    • ECU hardware circuits
    • Microcontroller architecture (peripherals, RAM, NVRAM, flash)
    • OS fundamentals
    • Low-level driver development (ADC, I2C, timers, RTC, LED drivers, stepper-motor drivers)
  • Experience with tools:
    • IDEs
    • Vector CAN tools (CANoe, GENy)
  • Knowledge of communication protocols:
    • CAN
    • LIN
    • Ethernet
    • SPI

Stakeholder & Cross-Functional Collaboration

  • Serve as central contact for infotainment system topics.
  • Prepare and independently handle customer demos.
  • Coordinate milestone plans and prototype needs.
  • Support the Infotainment System Architect in customer discussions.
  • Collaborate with R&D teams (SW, HW, Component Design, Application).


Required Skills & Experience

  • 10+ years of experience in automotive systems and infotainment product development.
  • Strong knowledge of vehicle system architecture.
  • Hands-on experience in infotainment bring-up and validation.
  • Strong experience with requirement management tools (DOORS, Polarion).
  • Experience with embedded C and low-level driver understanding.
  • Strong knowledge of automotive protocols and tools.
  • Experience in hardware validation and system testing.


Preferred Skills

  • Experience with global automotive OEM projects.
  • Strong embedded firmware debugging experience.
  • Exposure to performance analysis and log diagnostics.
  • Knowledge of system integration best practices.

---
Thanks & Regards
Ishita Bali
Novia Infotech LLC
4421 Avenida Ln, McKinney, TX 75070
Email: ishita.b@noviainfotech.com


Salesforce Data Cloud Developer for Chicago, IL - Hybrid

Hi

My name is Rohit Chauhan, and I am a Staffing Specialist at Novia Infotech LLC. I am reaching out to you about an exciting job opportunity with one of our clients.

 

Job Title: Salesforce Data Cloud Developer

Location: Chicago, IL (3 Days Onsite Mandatory)


Job Summary

We are seeking an experienced Salesforce Data Cloud Developer with strong expertise in data integration, enterprise CRM architecture, and Salesforce development. The ideal candidate will have hands-on experience implementing Salesforce Data Cloud in a production environment and integrating it with Databricks. This role requires strong development skills in Apex, Lightning Web Components, and Lightning Flow, along with experience building scalable data solutions.


Key Responsibilities

Salesforce Data Cloud Implementation

  • Implement and configure Salesforce Data Cloud for enterprise use cases.
  • Enable and configure Data Cloud features, objects, and permissions.
  • Set up Data Spaces, roles, and permission sets.
  • Create and manage Data Streams for data ingestion.
  • Configure Data Lake Objects (DLOs).
  • Map data to the Customer 360 data model.
  • Configure and optimize Identity Resolution.
  • Implement activation and data push to Salesforce CRM.
  • Support production deployments and enhancements.
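Conceptually, the Identity Resolution step above merges source records that share a match key. In Data Cloud this is configured declaratively through match and reconciliation rules rather than written as code, but a deterministic email match rule can be sketched as follows (field names and the "first non-null wins" reconciliation are illustrative assumptions):

```python
def resolve_identities(records):
    """Merge records that share a normalized match key (here: email),
    mimicking a deterministic identity-resolution ruleset."""
    profiles = {}
    for rec in records:
        key = rec.get("email", "").strip().lower()
        if not key:
            continue  # no match key: skip (or route to fuzzy rules)
        profile = profiles.setdefault(key, {})
        for field, value in rec.items():
            profile.setdefault(field, value)  # first non-null value wins
    return list(profiles.values())
```

Real rulesets typically combine several keys (email, phone, party identifiers) and define per-field reconciliation (most recent, source priority) instead of first-wins.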

Data Integration

  • Design and implement integrations between Databricks and Salesforce Data Cloud.
  • Configure Databricks connectors in Data Cloud.
  • Implement batch ingestion and support zero-copy federation use cases.
  • Deploy and maintain Data Streams.
  • Ensure reliable data ingestion pipelines.
  • Monitor and troubleshoot data integration issues.

Salesforce Development

  • Develop and maintain custom Salesforce applications.
  • Implement automation using Lightning Flow.
  • Develop custom logic using Apex.
  • Build integrations using Apex and APIs.
  • Develop interactive UI components using Lightning Web Components (LWC).
  • Enhance applications using JavaScript.
  • Maintain reusable and scalable components.

Data Architecture

  • Design enterprise CRM data architecture.
  • Support Customer 360 data models.
  • Optimize data pipelines and workflows.
  • Ensure data consistency and quality.
  • Support enterprise data platforms.
  • Maintain data mappings and transformations.

Platform Management

  • Configure Data Cloud environments.
  • Manage access and permissions.
  • Monitor data ingestion processes.
  • Support deployments and releases.
  • Troubleshoot platform issues.
  • Maintain documentation.

Collaboration

  • Work with business and technical teams.
  • Gather requirements and translate into solutions.
  • Support enterprise CRM initiatives.
  • Provide technical guidance.
  • Participate in Agile ceremonies.
  • Support production environments.

Required Skills

Salesforce Data Cloud

  • Data Cloud Configuration
  • Data Streams
  • Data Lake Objects (DLO)
  • Identity Resolution
  • Customer 360 Data Model
  • Data Activation
  • Data Spaces
  • Permissions & Security

 

 

Rohit Chauhan

IT Recruiter

E: rohit.c@noviainfotech.com

www.noviainfotech.com

A: 4421 Avenida Ln, McKinney, TX, 75070

 

 

 

 


Urgent requirement of BI Analyst Engineer for Phoenix, AZ (Local Candidate Only)

Hi, this is Diksha Chaudhary, working with Novia Infotech. We have the below contract job opportunity with one of our direct clients a...