Position title
Sr. Data Engineer with Strong GCP Background
Description
Experience:
- 7–10 years as a Data Warehouse Engineer/Architect
- Experience in startup environments
Database & Data Modeling Expertise:
- Mastery of transactional databases, dimensional data modeling, and data marts
- Deep understanding of SQL-based Big Data systems
ETL & Data Pipelines:
- Experience with modern ETL tools
- Skilled in building data pipelines using Python
Google Cloud & BigQuery:
- Hands-on experience designing data warehouses in BigQuery
- Strong GCP knowledge, including compute, storage, and security services
High Availability Systems:
- Experience in 24x7x365 high-availability digital marketing/e-commerce systems
Batch & Real-Time Processing:
- Proficient in both real-time streaming and high-volume batch processing
Tools: Apache Kafka, data lakes, BigQuery
Big Data Tech Stack:
- Hands-on with Hadoop, Hive, Spark
- Experience with Customer Data Platforms (CDP)
Programming & DevOps:
- Proficient in Python (preferred) or another programming language
- Familiar with CI/CD (GitHub Actions) and Infrastructure as Code (Terraform)
BI & Orchestration Tools:
- Tools: Domo, Tableau, Looker
- Orchestration: Airflow
Additional Skills:
- Basics of Data Science & ML (modeling, training, data preparation)
- Version control (Git)
- Strong data governance & security understanding
- Analytical, problem-solving mindset
- Fast learner, self-guided, thrives in dynamic environments
Skills
Primary Skills:
- PySpark, Spark, Python
- Big Data Technologies
- Google Cloud Platform (GCP)
- Apache Beam, Dataflow, Airflow
- Kafka, BigQuery
Good to Have:
- GFO
- Google Analytics
Job Location
Remote work possible
Experience
8+ Years