Lead Solutions Architect
London
Overview:
• We are thrilled to be partnered with a boutique data consultancy searching for a Technical Solutions Architect to join its thriving data architecture team
• You will create, advise on, and implement systems architecture for the client's complex data architecture ecosystems.
• Data Mesh, Data Fabric, Kubernetes, DevOps, Data Engineering, Kafka, MLOps, Cyber Security
• Salary up to £125,000 + package
The Company:
An exciting boutique technology consultancy pairing strategic vision with deep architecture and engineering capabilities. You will work with a leading London Lloyds institution as part of the core data architecture modernisation team.
Key Responsibilities:
• Working with internal and client teams, you will interact with developers, architects, security teams, and engineers to oversee quality of work and reach key milestones
• You will create, advise on, and implement systems architecture for the client's complex data architecture ecosystems, from design through to cutting code, managing a diverse and challenging portfolio of work.
• Designing and implementing the overall data architecture runtime within the organisation, including all workloads of its architecture on both private cloud and public cloud.
Required Skills and Experience:
• Experience designing Data Mesh Architectures and Data Fabric solutions for Analytics use cases.
• Solution Architecture for Infrastructure Solutions using Kubernetes.
• Knowledge of DevOps solutions, containers, IaC, and GitHub.
• Expertise in security and data governance principles.
• Experience with cybersecurity best practices and secure data architectures.
• MLOps experience, including ML model design, deployment, and management.
• Familiarity with machine learning concepts and frameworks (TensorFlow, PyTorch, scikit-learn).
• Data archiving, backup, and disaster recovery solutions implementation.
• Experience with JVM languages (Java, Scala) and functional programming.
• Delivery of end-to-end large-scale programs.
• Proficiency with Apache Kafka and Confluent Cloud (Kafka Connect, KSQL, Kafka Streams, Schema Registry).
• Familiarity with Apache Spark and Databricks.
• Utilization of Cloud Data Services in Azure, AWS, and GCP.
If your experience matches the role, click apply and let's catch up!
McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.