Google Cloud / Big Data Engineer – 100% Remote (USA)

United States

SUMMARY:

Fortune 500 client in the Financial Services/FinTech sector is looking to hire a Big Data Engineer with strong GCP skills to join their team! This team develops customer-centric applications using emerging technology stacks that support a critical event-driven, microservices platform, which is moving exclusively to Google Cloud (GCP).

DESIRED SKILLS & EXPERIENCE:

  • 4-5+ years of Big Data ecosystem and software engineering experience.
  • 3-5+ years’ experience with Google Cloud Platform (GCP) and its services.
  • 3+ years’ experience with GCP BigQuery, GCP Compute Engine, and Google Cloud Dataproc.
  • 2-3+ years of hands-on experience with Hadoop, MapReduce, Hive, and Spark (core, SQL, and PySpark).
  • Hands-on experience writing complex SQL (e.g., Hive/PySpark DataFrames and optimizing joins when processing large volumes of data).
  • Ability to design and develop optimized data pipelines for batch and real-time data processing.

NICE-TO-HAVE (PLUS):

  • Experience with Kafka streams or queues.
  • Experience with GitHub and leveraging CI/CD pipelines.
  • Experience with NoSQL databases (e.g., HBase, Couchbase, MongoDB).
  • Experience with data visualization tools such as Tableau, Sisense, or Looker.
  • UNIX shell scripting skills.