Position: Database Architect (Google Cloud)
Location: Remote
Duration: 24+ months contract
Minimum of 8-10 years of technical solutions implementation, architecture design, evaluation, and investigation in a cloud environment.
Minimum of 5 years of architecting, developing, and deploying scalable enterprise data analytics solutions (Enterprise Data Warehouses, Data Marts, etc.).
Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments (such as Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume).
Minimum of 1 year of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
Hands-on experience analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on Google Cloud using GCP and third-party services.
Designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
Architecting and implementing next-generation data and analytics platforms on GCP.
Designing and implementing data engineering, ingestion, and curation functions on GCP using GCP-native or custom programming.
Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP.
Please note: USC/Green Card candidates are preferred; H-1B candidates are also workable.