Data Architect Responsibilities
The responsibilities are broadly grouped into architecture design, compliance, business enablement, and leadership.
1. Architect the Future
- Design and Deliver Solutions: Design and deliver enterprise-scale cloud data solutions for high-stakes transformations in sectors like Banking and Telco across ASEAN & Australia.
- Platform Design: Design secure, scalable, and modern cloud data platforms that serve as the backbone for digital transformation programs.
- End-to-End Leadership: Lead end-to-end architecture, from pre-sales solutioning to final delivery.
2. Drive Compliance & Trust
- Governance and Privacy: Ensure that data platforms meet the highest standards of governance, privacy, and regulatory compliance.
- Regulatory Adherence: Apply strong knowledge of regulatory & compliance frameworks (such as PDPA, MAS TRM, APRA, and GDPR) to secure data architecture design.
3. Enable Business Growth
- Business Translation: Translate complex technical architectures into tangible business value.
- Stakeholder Management: Empower CxOs and stakeholders to accelerate innovation, reduce risk, and improve decision-making.
- Strategic Partnership: Work directly with hyperscalers (AWS, GCP, Azure), clients, and partners to co-create cutting-edge solutions and accelerators.
4. Elevate Standards & Leadership
- Mentorship: Mentor engineers and architects across geographies, helping them embed best practices for performance, cost optimization, and long-term sustainability.
- Delivery Ownership: Take end-to-end ownership from pre-sales solutioning (RFI/RFP, SOWs, HLD/LLD) to delivery governance and business value realization.
- Team Leadership: Mentor and lead distributed technical teams across the region (ASEAN & Australia).
- Travel: Be willing to travel across the region for client, partner, and delivery engagements.
Technical Skills Required for This Role:
1. Cloud & Data Architecture Mastery
- Experience: 5+ years designing and implementing AWS-based enterprise data platforms.
- Platform Design: Expertise in building modern data architectures like data lakes, real-time streaming, analytics, and ML integration.
- Architecture Scope: Experience with hybrid/multi-cloud architectures and distributed systems.
- Core AWS Services: Hands-on experience with the following AWS services:
  - Storage & Data Lake: S3, Lake Formation.
  - Data Warehouse & Query: Redshift, Athena, Redshift Spectrum.
  - ETL/Data Processing: Glue, EMR, Lambda, Spark/Hadoop ecosystem.
  - NoSQL: DynamoDB.
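To illustrate one pattern common to these services, here is a minimal sketch of building Hive-style partition prefixes for an S3 data lake, the layout Athena, Glue, and Redshift Spectrum prune partitions by. The bucket and table names are hypothetical:

```python
from datetime import date

def partition_prefix(bucket: str, table: str, d: date) -> str:
    """Build a Hive-style partition prefix (year=/month=/day=) as used by
    S3 data lakes queried through Athena, Glue, or Redshift Spectrum."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={d.year}/month={d.month:02d}/day={d.day:02d}/"
    )

# Example: a daily landing prefix for a hypothetical 'events' table.
print(partition_prefix("corp-data-lake", "events", date(2024, 1, 5)))
# → s3://corp-data-lake/events/year=2024/month=01/day=05/
```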
2. Data Engineering & Governance
- Data Pipelining: Proficiency with ETL/ELT tools and frameworks, including AWS Glue, Apache Kafka, and Apache Airflow.
- Optimization: Ability to design, optimize, and tune data pipelines for scalability, performance, and cost efficiency.
- Data Management: Strong understanding of enterprise data concepts, including data security, governance, lineage, quality, cataloging, and compliance.
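The pipeline and data-quality concerns above can be sketched as a toy extract/transform/load flow. This is a minimal, library-free illustration, not a production design; the in-memory sink stands in for a warehouse such as Redshift, and the null-key check stands in for a real data-quality gate:

```python
from typing import Iterable

def extract(rows: Iterable[dict]) -> list[dict]:
    """Extract step: a real pipeline would read from Kafka, S3, or a source
    database; here we just materialize the input."""
    return list(rows)

def transform(rows: list[dict]) -> list[dict]:
    """Transform step: drop records failing a basic quality rule and
    normalize a field (a stand-in for Glue/Spark transformations)."""
    return [
        {**r, "country": r["country"].upper()}
        for r in rows
        if r.get("user_id") is not None  # data-quality gate: reject null keys
    ]

def load(rows: list[dict], sink: list) -> int:
    """Load step: append to an in-memory sink (a stand-in for a warehouse)."""
    sink.extend(rows)
    return len(rows)

warehouse: list[dict] = []
raw = [
    {"user_id": 1, "country": "sg"},
    {"user_id": None, "country": "au"},  # fails the quality gate
]
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # → 1
```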
3. Programming & Database Technologies
- Programming: Experience in Python, Java, or Scala for data processing and architecture automation.
- Database Knowledge: Deep knowledge of MPP (Massively Parallel Processing) and NoSQL databases.
- Analytics: Strong skills in SQL-on-Hadoop/AWS technologies (Hive, Spark SQL) and expertise in data warehouse design and BI/reporting solutions.
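The warehouse-design skill above centers on star schemas: fact tables joined to dimensions. A minimal sketch using Python's built-in sqlite3 (the table names and data are illustrative; an MPP warehouse such as Redshift would run the same query shape at scale):

```python
import sqlite3

# A minimal star-schema sketch: one fact table (fact_sales) joined to one
# dimension (dim_region) -- the shape MPP warehouses are optimized for.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (region_id INTEGER, amount REAL);
    INSERT INTO dim_region VALUES (1, 'ASEAN'), (2, 'Australia');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")
# Aggregate fact rows by a dimension attribute: the canonical BI query shape.
rows = conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_region d USING (region_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # → [('ASEAN', 150.0), ('Australia', 75.0)]
```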
Nice-to-Have Skills
- Generative AI: Experience with GenAI data-integration projects (vector databases, embeddings, chunking, and multi-cloud platform/tool integration).
- Multi-Cloud: Familiarity with Azure, GCP, or multi-hyperscaler architectures for co-solutioning.
- Resilience: Experience designing disaster recovery & backup strategies for large-scale data platforms.