VP, Data Analyst, GTO in Singapore at UOB Group

Date Posted: 5/25/2018

Job Snapshot

Job Description

Functional area: Business Technology Services
Employment type: Full-time
Job Type: Permanent

  • Proven working experience as a data analyst, with experience in large data warehousing and Hadoop implementation projects in the financial services industry
  • Good functional knowledge of Retail, Wholesale & Private Banking products & business processes
  • Hands-on experience executing projects in the Finance domain - expertise in financial reconciliation, GL unification, profitability, fund transfer pricing, budgeting, forecasting
  • Experience with the Credit Risk domain - computation of risk-weighted assets, economic capital, S29, MAS, BASEL reporting, cross-border exposure
  • Knowledge & experience with products such as SAP GL, OFSA (profitability, FTP), FITAS, ARF (Trade Finance & Accounts Receivables), Moody’s RaY, Murex, Cash Management & Remittance would be an added advantage
  • Good expertise in designing financial services data models & using data modeling tools
  • Knowledge & experience of having used industry standards data models such as FSDM
  • Knowledge & experience in designing normalized & dimensional models to support different analytical users
  • Hands-on experience performing data mapping across different layers of the data architecture – from source to target state
  • Good experience in implementing data governance processes
  • Experience performing data profiling & implementing data quality rules using ETL tools such as Informatica Data Quality
  • Knowledge & experience with reference data management, data standards, expertise with Teradata MDM
  • Experience building a business glossary & providing end-to-end data lineage using tools such as Informatica Metadata Manager
  • Proven problem-solving & analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Exposure to Data Warehouse & Big data tools for Information Management
  • ETL & Data governance tools - Informatica PC, DQ, MM, Teradata GCFR, MDM
  • RDBMS & NoSQL - Teradata, Oracle, NoSQL stores
  • Reporting - Qlik, OBIEE, Tableau, BO
  • Cloudera administration suite
  • Hadoop languages & tools – Spark, Python, R, Pig, Hue, Impala, Hive, Hbase, Informatica IDL, BDM, Kafka, Flume, Machine Learning Algorithms
  • Experience & knowledge of building a security framework involving data classification & access controls in Hadoop and Teradata
  • Good experience implementing large-scale, multi-track projects involving a mix of waterfall and agile approaches
  • Knowledge of defect management – leading defect triage, resolution & reporting
  • Good attitude; a team player who is results-driven, self-motivated and keen to learn
  • Experience managing large teams; demonstrated ability to learn quickly and apply that learning to project execution
  • Good communication & interpersonal skills, with the ability to engage stakeholders across business, operations & technology
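To illustrate the data profiling & data quality requirements above, here is a minimal sketch of the kind of checks involved. The column names, sample records, and helper functions are hypothetical; in practice this work would be done in a tool such as Informatica Data Quality rather than hand-written code.

```python
# Minimal data-profiling sketch over an in-memory sample of hypothetical
# account records (names and values are illustrative only).

def profile(records, column):
    """Return basic profile stats for one column: null rate & distinct count."""
    values = [r.get(column) for r in records]
    nulls = sum(1 for v in values if v is None)
    distinct = len({v for v in values if v is not None})
    return {"null_rate": nulls / len(values), "distinct": distinct}

def check_non_negative(records, column):
    """Data quality rule: flag row indices where the column is negative or missing."""
    return [i for i, r in enumerate(records)
            if r.get(column) is None or r[column] < 0]

# Hypothetical sample data
sample = [
    {"account_id": "A1", "balance": 120.0},
    {"account_id": "A2", "balance": -35.5},
    {"account_id": "A3", "balance": None},
]

stats = profile(sample, "balance")          # null_rate 1/3, 2 distinct values
violations = check_non_negative(sample, "balance")  # rows 1 and 2 fail the rule
```

Profiling results like these would typically feed a data quality dashboard, with failing rows routed back to the source system owners under the data governance process.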