Apply Now
Location: Phoenix, Arizona (AZ)
Contract Type: C2C
Posted: 1 day ago
Closed Date: 04/30/2025
Skills: SQL analytics
Visa Type: Any Visa

Job Position: GCP Data Engineer

Location: Phoenix, AZ (Hybrid)

Job Type: Contract


Job Summary:

Core Data Engineering:

Proficiency with GCP's big data tools, such as:

BigQuery: For data warehousing and SQL analytics.

Dataproc: For running Spark and Hadoop clusters.

GCP Dataflow: For stream and batch data processing (high-level understanding).

GCP Pub/Sub: For real-time messaging and event ingestion (high-level understanding).
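To illustrate the kind of SQL analytics the BigQuery role involves, here is a minimal sketch of a warehouse-style aggregation. It uses SQLite (Python standard library) purely as a runnable stand-in; BigQuery uses GoogleSQL, and the `events` table, its columns, and the data are hypothetical.

```python
import sqlite3

# Hypothetical events table standing in for a BigQuery dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "purchase", 30.0), ("u1", "purchase", 20.0), ("u2", "purchase", 5.0)],
)

# A typical warehouse-style aggregation: total spend per user.
rows = conn.execute(
    """
    SELECT user_id, SUM(amount) AS total_spend
    FROM events
    WHERE event = 'purchase'
    GROUP BY user_id
    ORDER BY total_spend DESC
    """
).fetchall()

for user_id, total in rows:
    print(user_id, total)
```

The query itself (filter, group, aggregate, order) is the part that carries over to BigQuery; only the connection and dialect details change.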

Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Functions.
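A custom pipeline of the kind described above is usually structured as composable extract/transform/load steps. The sketch below shows that shape in plain Python under stated assumptions: the record schema, step names, and data are hypothetical, and a production version would run on Dataflow/Beam or Cloud Functions rather than in-process.

```python
from typing import Callable, Iterable

Record = dict

def extract() -> Iterable[Record]:
    # Stand-in for reading source rows from Cloud Storage or BigQuery.
    yield {"id": 1, "value": " 10 "}
    yield {"id": 2, "value": "oops"}
    yield {"id": 3, "value": "25"}

def clean(records: Iterable[Record]) -> Iterable[Record]:
    # Drop rows whose value field is not numeric; normalize the rest.
    for r in records:
        v = r["value"].strip()
        if v.isdigit():
            yield {**r, "value": int(v)}

def load(records: Iterable[Record]) -> list:
    # Stand-in for writing to a warehouse table; here we just collect.
    return list(records)

def run_pipeline(steps: list) -> list:
    # Chain the steps: each consumes the previous step's output.
    data = steps[0]()
    for step in steps[1:]:
        data = step(data)
    return data

result = run_pipeline([extract, clean, load])
print(result)  # only the rows that survived cleaning
```

Keeping each step a small, testable function is what makes the pipeline "automated and reliable": steps can be unit-tested in isolation and swapped for managed equivalents later.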

Programming and Scripting:

  • Strong coding skills in SQL and Java.
  • Familiarity with APIs and SDKs for GCP services to build custom data solutions.

Cloud Infrastructure:

  • Understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions.
  • Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional, but good to have).
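As a sketch of what "familiarity with APIs for GCP services" tends to mean in practice, many GCP REST list endpoints page their results with a `nextPageToken`. The helper below is a hypothetical illustration of that pagination-plus-retry pattern using a fake in-memory API, not any real SDK call.

```python
import time
from typing import Callable, Optional

def list_all_pages(fetch_page: Callable[[Optional[str]], dict],
                   max_retries: int = 3) -> list:
    """Collect items across pages from a hypothetical paginated API."""
    items, token = [], None
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(token)
                break
            except ConnectionError:
                # Simple exponential backoff before retrying a transient failure.
                time.sleep(2 ** attempt * 0.01)
        else:
            raise RuntimeError("page fetch failed after retries")
        items.extend(page["items"])
        token = page.get("nextPageToken")
        if not token:
            return items

# Fake two-page API response for demonstration.
pages = {None: {"items": [1, 2], "nextPageToken": "p2"},
         "p2": {"items": [3]}}
print(list_all_pages(lambda t: pages[t]))  # [1, 2, 3]
```

The official client libraries hide this loop behind iterators, but recognizing the underlying token-driven pattern helps when debugging or calling the REST surface directly.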

Backend Development (Spring Boot & Java):

  • Design and develop RESTful APIs and microservices using Spring Boot.
  • Implement business logic, security, authentication (JWT/OAuth), and database operations.
  • Work with relational databases (MySQL, PostgreSQL, Cloud SQL) and NoSQL databases (MongoDB).
  • Optimize backend performance, scalability, and maintainability.
  • Implement unit testing and integration testing.
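On the JWT side of the authentication work above, it helps to know the token layout: three base64url segments (`header.payload.signature`). The sketch below only decodes the payload of a toy, unsigned token for illustration; real services must verify the signature with a proper library (e.g., Spring Security on the Java side) before trusting any claims.

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode (NOT verify) the payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    # base64url decoding requires padding to a multiple of 4 characters.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a toy unsigned token for illustration (hypothetical claims).
header = base64.urlsafe_b64encode(b'{"alg":"none"}').rstrip(b"=").decode()
claims = base64.urlsafe_b64encode(b'{"sub":"user1"}').rstrip(b"=").decode()
token = f"{header}.{claims}."
print(decode_jwt_payload(token))  # {'sub': 'user1'}
```

Decoding without verification is only useful for inspection and debugging; authorization decisions must come after signature and expiry checks.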

We are seeking a highly skilled engineer with solid experience building big data, GCP cloud-based real-time data pipelines and REST APIs with Java frameworks. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.

  • More than five years of experience with data tools in GCP; BigQuery and Dataproc are must-haves, while GCP Dataflow and Pub/Sub are nice to have.
  • Programming and scripting: Java and SQL (must have).
  • Familiarity with APIs and SDKs for GCP services (nice to have).
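Since Pub/Sub appears above as a nice-to-have at the conceptual level, the toy class below sketches the core idea it provides: publishers and subscribers are decoupled through a topic that fans messages out to every subscription. This is a hypothetical in-memory model, not the `google-cloud-pubsub` API.

```python
import queue

class InMemoryTopic:
    """Toy topic illustrating publish/subscribe fan-out.

    Real GCP Pub/Sub adds durability, ordering options, and
    acknowledgement semantics on top of this basic shape.
    """

    def __init__(self) -> None:
        self._subscriptions = []

    def subscribe(self) -> queue.Queue:
        # Each subscription gets its own queue of messages.
        q = queue.Queue()
        self._subscriptions.append(q)
        return q

    def publish(self, message: str) -> None:
        # Fan out: every subscription receives its own copy.
        for q in self._subscriptions:
            q.put(message)

topic = InMemoryTopic()
sub_a = topic.subscribe()
sub_b = topic.subscribe()
topic.publish("order-created")
print(sub_a.get(), sub_b.get())  # both subscribers receive the event
```

The key takeaway is the decoupling: the publisher never knows who consumes the event, which is what makes Pub/Sub useful for real-time event ingestion between independent services.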

Mandatory Skills:

  • GCP: BigQuery, Dataproc
  • Java and SQL
  • GCP Dataflow and Pub/Sub
  • Familiarity with APIs and SDKs for GCP services