Job Position: GCP Data Engineer
Location: Phoenix, AZ - Hybrid
Job Type: Contract
Job Summary:
We are seeking a highly skilled engineer with solid experience building big data and GCP cloud-based real-time data pipelines, as well as REST APIs with Java frameworks. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.
Core Data Engineering:
Proficiency with GCP's big data tools, including:
BigQuery: for data warehousing and SQL analytics.
Dataproc: for running Spark and Hadoop clusters.
Dataflow: for stream and batch data processing (high-level understanding).
Pub/Sub: for real-time messaging and event ingestion (high-level understanding).
Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Functions; a brief BigQuery sketch follows.
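For illustration only, a minimal sketch of the kind of BigQuery access such pipelines perform, using the google-cloud-bigquery Java client. The table (my_dataset.events) and its columns are hypothetical, and credentials are assumed to come from Application Default Credentials:

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.QueryJobConfiguration;
    import com.google.cloud.bigquery.TableResult;

    public class EventCounts {
        public static void main(String[] args) throws Exception {
            // Client bound to the default project via Application Default Credentials.
            BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

            // Hypothetical table `my_dataset.events`; standard SQL aggregation.
            QueryJobConfiguration query = QueryJobConfiguration.newBuilder(
                    "SELECT event_type, COUNT(*) AS cnt "
                            + "FROM `my_dataset.events` "
                            + "GROUP BY event_type").build();

            // Runs the query synchronously and prints one line per group.
            TableResult result = bigquery.query(query);
            result.iterateAll().forEach(row -> System.out.println(
                    row.get("event_type").getStringValue() + ": "
                            + row.get("cnt").getLongValue()));
        }
    }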
Programming and Scripting:
Backend Development (Spring Boot & Java):
Design and develop RESTful APIs and microservices using Spring Boot.
Implement business logic, security, authentication (JWT/OAuth), and database operations.
Work with relational and NoSQL databases (MySQL, PostgreSQL, MongoDB, Cloud SQL).
Optimize backend performance, scalability, and maintainability.
Implement unit testing and integration testing.
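As a minimal, hypothetical sketch of such a service (the class name and /orders route are illustrative only; JWT/OAuth security, persistence, and tests are omitted):

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    @RestController
    public class OrderServiceApplication {

        public static void main(String[] args) {
            SpringApplication.run(OrderServiceApplication.class, args);
        }

        // Hypothetical read endpoint; a real service would call a repository
        // (e.g. Cloud SQL via Spring Data) and enforce authentication.
        @GetMapping("/orders/{id}")
        public String getOrder(@PathVariable String id) {
            return "order " + id;
        }
    }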
Mandatory Skills: