Senior Data Engineer with GCP experience to implement a data hub using Google Cloud components with one of our major banking clients - 36461
S.i. Systèmes - Toronto
Location Address: Toronto - hybrid - 3 days per week in the office (Monday, Wednesday and Thursday)
Contract Duration: ASAP to October 31st, 2025.
Possibility of extension: Depending on performance and funding. This is a long-term project that will require multiple extensions.
Schedule Hours: 9 am-5 pm Monday-Friday (No OT)
Story Behind the Need
Business group:
- The Wealth Data Engineering team within Global Wealth Engineering (GWE) is the key team in meeting the operational data needs of the various stakeholders within Wealth Management.
- Project: Wealth Data Hub. The GCP Lead Data Engineer will play a key role in setting up the Wealth Data Hub using GCP technology, designing and implementing the Data Hub with Google Cloud components while working closely with the enterprise data team, data architects, solution architects, business systems analysts and data engineers.
- Designing, building and operationalizing the Wealth Data Hub (WDH) using Google Cloud Platform (GCP) data services such as Dataproc, Dataflow, Cloud SQL, BigQuery and Cloud Spanner, in combination with tools such as Spark, Apache Beam, Cloud Composer, dbt, Cloud Pub/Sub, Confluent Kafka, Cloud Storage, Cloud Functions and GitHub
- Designing and implementing data ingestion patterns that support batch, streaming and API interfaces on both ingress and egress
- Guide a team of data engineers and work hands-on developing frameworks and custom code, following best practices to meet demanding performance requirements
- Take the lead in designing and building production data pipelines, from data ingestion to consumption, using GCP services, Java, Python, Scala, BigQuery, dbt, SQL, etc.
- Use Cloud Dataflow with Java/Python to deploy streaming jobs in GCP, as well as batch jobs that read text/JSON files and write them to BigQuery (a minimal sketch of this pattern follows this list)
- Building and managing data pipelines with a deep understanding of workflow orchestration, task scheduling and dependency management
- Conduct proofs of technology using GCP services, working with data architects and solution architects to achieve the desired results and performance
- Provide end-to-end technical guidance and expertise on how to use Google Cloud effectively to build solutions, creatively applying cloud infrastructure and platform services to solve business problems, and communicating these approaches to different business users
- Provide guidance on implementing application logging, notifications, job monitoring and performance monitoring
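For illustration only, here is a minimal Apache Beam (Python) sketch of the batch pattern described above: reading newline-delimited JSON from Cloud Storage and writing it to BigQuery via Dataflow. All project, bucket, dataset and table names are hypothetical placeholders, not details from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical options; swap in "DirectRunner" to test locally.
options = PipelineOptions(
    runner="DataflowRunner",
    project="example-project",                # hypothetical project ID
    region="northamerica-northeast1",
    temp_location="gs://example-bucket/tmp",  # hypothetical bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        # Read newline-delimited JSON files from Cloud Storage.
        | "ReadJsonLines" >> beam.io.ReadFromText("gs://example-bucket/input/*.json")
        # Parse each line into a dict matching the target table's columns.
        | "ParseJson" >> beam.Map(json.loads)
        # Append rows to an existing BigQuery table.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:wealth_hub.landing",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )

The same pipeline shape extends to streaming by swapping the source for a Cloud Pub/Sub read and enabling streaming mode in the pipeline options.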
Must Have Skills:
- 8-10 years of experience in data engineering and performance optimization for large OLTP applications, with a minimum of 3 years of working experience as a Google Cloud Platform (GCP) developer
- 5+ years of experience working with relational/NoSQL databases
- 2-3 years of experience with the primary managed data services within GCP, including Dataproc, Dataflow, BigQuery/dbt, Cloud Spanner, Cloud SQL, Cloud Pub/Sub, etc.
- 2-3 years of experience with Google Cloud Platform databases (Cloud SQL, Spanner, PostgreSQL)
- 1-2 years of experience with data streaming technologies such as Kafka, Spark Streaming, etc.
Nice-To-Have Skills:
- Working knowledge of developing and scaling Java REST services, using frameworks such as Spring
- Understanding of the Wealth business line and the various data domains required for building an end-to-end solution
- Experience with Infrastructure as Code (IaC) practices and frameworks like Terraform
- Knowledge of Java microservices and Spring Boot
- Strong architecture knowledge with experience in providing technical solutions for cloud infrastructure.
- Active Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred
Education:
- Degree in Computer Science or related field
- The best candidate has experience in the financial industry, preferably wealth management; previous experience working for Google or Amazon is also an asset.
Candidate Review & Selection
2 Rounds:
- 1st round - Hiring Manager (30 mins) - MS Teams video. Role overview, what we are looking for, and high-level technical questions.
- 2nd round - Panel interview with developers (45 mins) - video call. Ensures the right candidate is selected and fits the team culture.