Google Cloud Data Architect

Posted 17 May 2023
Salary Negotiable
Job type Contract
Discipline Data & Analytics, Software Engineering
Contact Jamie Hine

Job description

Conexus are currently partnered with a global client looking for a Google Cloud Data Architect to join them remotely on an initial 6-month contract.

Your position:

Your role will involve working closely with stakeholders to understand business requirements, defining data architectures, and developing ETL strategies to enable efficient data integration and transformation.

  • Collaborate with business stakeholders, data scientists, and engineers to understand data requirements and design optimal data architectures on the Google Cloud Platform.
  • Develop end-to-end data strategies and roadmaps that align with business goals and ensure scalability, performance, and data quality.
  • Design and implement robust ETL pipelines to extract, transform, and load data from various sources into GCP, ensuring data integrity and consistency.
  • Evaluate existing data infrastructure and propose improvements or new solutions to enhance data management, storage, and processing capabilities.
  • Work with cross-functional teams to define data governance frameworks, standards, and best practices, ensuring data security, privacy, and compliance.
  • Provide technical guidance and mentorship to data engineers and other team members, fostering a collaborative and innovative work environment.
  • Stay up to date with industry trends, emerging technologies, and advancements in Google Cloud services to continuously improve data architectures and ETL processes.
  • Conduct performance tuning, optimization, and troubleshooting activities to maintain high data processing efficiency and reliability.
  • Collaborate with other architects and stakeholders to define data integration strategies, data warehousing solutions, and data analytics frameworks.

Your profile:

  • Extensive experience as a data architect, data engineer, or a similar role, with a focus on designing and implementing data solutions on the Google Cloud Platform.
  • In-depth knowledge of GCP services such as BigQuery, Dataflow, Cloud Storage, Pub/Sub, Data Catalog, and Cloud Composer.
  • Strong understanding of ETL concepts, data integration patterns, and data transformation techniques.
  • Familiarity with data modeling techniques, including conceptual, logical, and physical data models.
  • Experience in designing and implementing scalable, reliable, and performant ETL pipelines using tools like Apache Beam, Apache Airflow, or similar.
  • Knowledge of data governance, data security, and compliance standards.
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.

If this is of interest to you, please apply or contact me directly for more details.