user@ayomos:~

$ cat /home/ayo/profile.txt

Loading professional profile...

AYO MOSANYA

Austin / NYC / Global • mosanyaayo@gmail.com

Data Engineering Leader with 9 years of experience designing scalable analytics infrastructure and leading cross-functional data initiatives in highly regulated financial services environments. Currently at Charles Schwab, I architect enterprise ETL platforms, implement Python-based data pipelines serving C-suite decision-making, and provide technical leadership for Federal Reserve examination readiness.

Expert in SQL optimization, cloud data warehousing, and data governance frameworks (Collibra, Informatica). I build self-service analytics platforms that empower business teams, mentor technical talent, and stay current with the field by building projects in my free time.

Cross-Domain Thinker • AI-Augmented Builder • Systems Designer • Self-Starter • Lifelong Learner • Human-First Design • Accessibility Advocate
$620K+ Est. Annual Value • 85% Time Reduction • 50+ Stakeholders • 3M+ Rows Scaled
SYS_INFO.exe
LOCATION: Austin / NYC / Global
FOCUS: Data Analytics
COMPANY: Charles Schwab
LANGUAGES: Python, Elixir
STATUS: ● BUILDING

EXPERIENCE.log

DATA ANALYTICS MANAGER Mar 2022 - Present
Charles Schwab
  • [01] Architect and own enterprise-wide ETL platform delivering critical risk intelligence to 50+ stakeholders across C-suite committees (Operational Risk Oversight Committee, Bank Operational Risk Oversight Committee, Aggregate Bank Risk Committee), driving estimated $620K+ annual value through 85% processing time reduction, automated compliance reporting, and risk mitigation that prevented regulatory findings
  • [02] Design Python-based ETL toolkit leveraging concurrent query processing and DuckDB/Parquet architecture to handle 20+ parallel SQL queries, scaling data operations from 700K to a projected 3M+ rows while maintaining sub-second query performance and complete audit trail integrity through Git/GitHub version control—eliminating manual bottlenecks and enabling real-time risk KPI tracking
  • [03] Lead technical design discussions and data pipeline architecture for Critical Data Elements (CDE) regulatory reporting integrating with Collibra and Informatica data governance platforms; partner with GRC data engineers to refactor SQL code ensuring data quality, lineage tracking, and compliance standards across enterprise risk management systems supporting Federal Reserve examination requirements
  • [04] Serve as technical authority for Federal Reserve Board examination readiness, collaborating directly with Managing Director and senior leadership to analyze complex regulatory inquiries, identify critical data elements across disparate systems, validate data accuracy under tight deadlines, and deliver compliant responses—work that prevented formal findings and saved an estimated $500K-$2M in remediation costs and regulatory exposure
  • [05] Provide technical leadership across Internal Audit, Financial Crimes Risk Management (FCRM), and business unit Risk Managers, translating complex regulatory requirements into scalable analytics solutions and influencing strategic decisions on data architecture, tool selection, and process automation that reduced enterprise reporting costs by an estimated $200K annually
  • [06] Mentor Risk Specialist to Senior Specialist promotion through structured technical guidance in SQL optimization, Python automation, and data visualization; volunteer for intern mentorship and interview panels, and provide ongoing career guidance to early-career analysts across the organization on technical skills development, understanding business impact, and navigating career growth in data analytics
  • [07] Drive enterprise adoption of self-service Tableau dashboards by 50+ stakeholders, delivering comprehensive user training and technical documentation that reduced manual reporting requests by 60%, accelerated risk KPI delivery from weeks to hours, and enabled non-technical teams to independently explore RCSA metrics, data anomalies, and issue trends for board-level decision-making
  • [08] Implement automated data validation frameworks with built-in aggregate reconciliation and anomaly detection for Risk Control Self-Assessment (RCSA) programs, reducing manual validation effort by 70% and preventing data quality incidents that typically cost financial institutions $100K-$500K per occurrence through early detection and automated alerting
DATA ANALYST Jun 2021 - Jan 2022
2ndWatch (AWS Premier Consulting Partner)
  • Optimized AWS Redshift data warehouse infrastructure for Fortune 500 financial services clients, conducting cost-benefit analyses that identified multi-million dollar cloud spend efficiencies; developed unified Excel modeling framework integrated with Redshift SQL queries to accelerate cloud migration decision-making
M&A ANALYST Jul 2019 - Jun 2021
Salesforce
  • Led analytics integration as technical SME for technology acquisitions, consolidating disparate data sources for due diligence and post-close activities; automated executive reporting pipelines using Tableau and Google Sheets, reducing manual effort 60% and enabling real-time integration status tracking across Corporate Development, IT, and Operations teams
TECHNICAL ADVISOR Jun 2017 - Jul 2019
Apple Inc.

Top 5% global customer satisfaction ranking; analyzed support data for product improvements

SALES CONSULTANT Dec 2015 - Jun 2017
Sprint Inc.

Achieved top sales performance through data-driven customer analytics

ERM Horizontal Reporting

Designed an interconnected reporting framework that visualizes relationships between business processes, risks, controls, and issues—enabling executives to see the full risk story through meaningful aggregations.

ERM_FRAMEWORK.flow
PROCESS: N Business Processes
RISK: N Identified Risks
CONTROL: N Mitigating Controls
ISSUE: N Open Issues

1:N Process-to-Risk Ratio • 1:N Risk-to-Control Ratio • High Control Coverage • Issues with Risk Impact Tracked

The Challenge

  • Large datasets across multiple interconnected entity types
  • Traditional single-query approaches hit performance limits
  • Siloed reporting obscured cross-entity relationships
  • Stakeholders needed holistic views showing cascading impacts

The Solution

  • Parallel Processing: Concurrent extraction across multiple data sources
  • In-Memory Joining: Fast data integration after parallel extraction
  • Optimized Analytics: Efficient aggregations on combined datasets
  • Result: 85%+ reduction in report generation time
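
The fan-out-then-join pattern above can be sketched in a few lines. This is a minimal illustration using stdlib sqlite3 and ThreadPoolExecutor as stand-ins for the production Oracle sources and DuckDB join layer; the query names, columns, and helper functions are all illustrative, not the production code.

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

# Stand-in queries; the production pipeline runs 20+ .sql files against Oracle.
QUERIES = {
    "process": "SELECT 1 AS process_id, 'Onboarding' AS name",
    "risk":    "SELECT 10 AS risk_id, 1 AS process_id, 'KYC gap' AS title",
}

def run_query(sql: str) -> list[tuple]:
    # Each worker opens its own connection: sqlite3 connections must not be shared across threads.
    con = sqlite3.connect(":memory:")
    try:
        return con.execute(sql).fetchall()
    finally:
        con.close()

def extract_parallel(queries: dict[str, str]) -> dict[str, list[tuple]]:
    # Fan the queries out across a thread pool, then collect results by name.
    with ThreadPoolExecutor(max_workers=8) as pool:
        futures = {name: pool.submit(run_query, sql) for name, sql in queries.items()}
        return {name: fut.result() for name, fut in futures.items()}

results = extract_parallel(QUERIES)
# In-memory join step: link each risk back to its parent process.
processes = {row[0]: row[1] for row in results["process"]}
risks_by_process = [(title, processes[pid]) for (_rid, pid, title) in results["risk"]]
print(risks_by_process)  # → [('KYC gap', 'Onboarding')]
```

In the real system each worker would execute one SQL file and the joined frames would land in DuckDB/Parquet for downstream aggregation.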
PYTHON • PARALLEL PROCESSING • ETL PIPELINES • DATA INTEGRATION • ANALYTICS

SKILLS.conf

AI TOOLS & AUGMENTATION
GitHub Copilot / Copilot Chat
Claude (Anthropic)
ChatGPT / GPT-4
Cursor AI IDE
AI-Assisted Code Review
Prompt Engineering
METHODOLOGY & APPROACH
Systems Thinking & Design
Human-Centered Design
WCAG Accessibility Standards
Inclusive Data Visualization
Strategic Problem Solving
Continuous Learning Mindset
PYTHON STACK
oracledb / async
polars / pandas
DuckDB / Parquet
io / pytz
openpyxl / xlsxwriter
concurrent processing
ELIXIR & WEB
Phoenix Framework
LiveView
Tailwind CSS
Advanced DNS Setup
Fly.io Deployment
Git / GitHub
AUTOMATION & DOCS
Excel Report Gen
PDF Generation
Pandoc / Tectonic
Doc Automation
ANALYTICS & VIZ
Tableau
Low Cognitive Load Design
SQL (Oracle, Redshift)
Excel / Sheets
EDUCATION

B.S., Computer Information Systems

University of Texas at Tyler

PERSONAL_DEV.log

● MODULAR FRAMEWORK
ETL_TOOLKIT — MODULAR DATA DEVELOPMENT FRAMEWORK (Python)
Reusable Analytics Infrastructure • PRODUCTION • 8+ HRS/WEEK SAVED

A modular directory of reusable Python scripts that forms an auditable, recreatable development framework. The system is a shareable foundation that lowers the barrier to custom code development: anyone who can read documentation and choose the right libraries can solve business problems efficiently.

The Framework Philosophy

1. Understand the business problem — What are we actually trying to solve?

2. Identify impacts & executive outcomes — What does success look like to leadership?

3. Ask the right discovery questions — And understand why you're asking them

4. Think through various solutions — Don't commit to the first idea

5. Measure solution impacts — Which approach best fits the constraints?

6. Estimate downstream impact — Notify involved parties before changes

7. Automate repeated work — Eliminate time sinks systematically

MODULES.tree

etl_toolkit/

├── connections/ — OracleDB connectors, connection pooling

├── extraction/ — Multi-SQL file execution, batch queries

├── analysis/ — Aggregate counts, data profiling, analytics

├── excel/ — Creation, formatting, export utilities

├── validation/ — Extract comparison, count checks, auditing

├── visualization/ — D3.js mockups, Plotly charts, AI iteration

├── datetime/ — Date organization, fiscal calendars, scheduling

└── cli/ — One-command report runner

Data Connectivity

  • OracleDB connection management
  • Multi-.sql file batch execution
  • Parameterized query templates
  • Connection pooling & retry logic
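
The retry logic above can be sketched with a small stdlib-only wrapper. This is an illustration, not the toolkit's code: `with_retry` and `flaky_connect` are made-up names, and the real module would wrap oracledb session acquisition the same way.

```python
import time

def with_retry(fn, attempts=3, base_delay=0.05):
    # Retry transient connection failures with exponential backoff;
    # re-raise the last error once attempts are exhausted.
    last_err = None
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError as err:
            last_err = err
            time.sleep(base_delay * (2 ** attempt))
    raise last_err

calls = {"n": 0}
def flaky_connect():
    # Fails twice, then succeeds, to exercise the retry path.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("listener refused connection")
    return "session"

session = with_retry(flaky_connect)
print(session, calls["n"])  # → session 3
```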

Analytics & Reporting

  • Aggregate count compilation
  • Data extract profiling
  • Excel creation & formatting
  • Automated export pipelines
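
A stdlib-only sketch of the aggregate count compilation step, assuming extract rows arrive as dicts (the rows and field names here are illustrative). In the toolkit these rollups would feed the Excel creation and export utilities.

```python
from collections import Counter

# Illustrative extract rows; the toolkit profiles real Oracle extracts.
rows = [
    {"entity": "RISK", "status": "Open"},
    {"entity": "RISK", "status": "Closed"},
    {"entity": "ISSUE", "status": "Open"},
]

def compile_counts(rows: list[dict]) -> Counter:
    # Roll the extract up into (entity, status) counts for a summary tab.
    return Counter((r["entity"], r["status"]) for r in rows)

counts = compile_counts(rows)
print(counts[("RISK", "Open")])  # → 1
```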

Validation & Audit

  • Extract-to-code aggregation comparison
  • Count validation checks
  • Discrepancy flagging
  • Audit trail generation
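
The extract-to-code comparison above reduces to a count reconciliation. The sketch below is illustrative: `reconcile`, the count keys, and the tolerance default are assumptions, not the production implementation.

```python
def reconcile(extract_counts: dict, recomputed: dict, tolerance: int = 0) -> list:
    # Compare counts taken from the raw extract against counts recomputed
    # in code; any mismatch beyond tolerance is flagged for the audit trail.
    flags = []
    for key in sorted(set(extract_counts) | set(recomputed)):
        a, b = extract_counts.get(key, 0), recomputed.get(key, 0)
        if abs(a - b) > tolerance:
            flags.append((key, a, b))
    return flags

extract = {"open_issues": 128, "closed_issues": 412}
recomputed = {"open_issues": 128, "closed_issues": 410}
print(reconcile(extract, recomputed))  # → [('closed_issues', 412, 410)]
```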

Visualization & Prototyping

  • D3.js chart mockups
  • Plotly interactive dashboards
  • AI-assisted rapid iteration
  • Stakeholder preview generation
CLI — ONE COMMAND REPORTING

# Run any configured report with a single command

$ etl run weekly-account-summary

→ Connects to Oracle

→ Executes 12 SQL files

→ Compiles aggregates

→ Validates against prior extract

→ Generates formatted Excel

→ Exports to shared drive

✓ Complete in 3 minutes (was 45 min manual)
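
The command above might be wired up with stdlib argparse roughly as follows. This is a hedged sketch: the report registry, step names, and `build_parser` helper are hypothetical stand-ins for the toolkit's real configuration.

```python
import argparse

# Hypothetical registry mapping report names to their pipeline steps;
# the real CLI would load these from the toolkit's config.
REPORTS = {"weekly-account-summary": ["extract", "validate", "export"]}

def build_parser() -> argparse.ArgumentParser:
    # etl run <report>: one subcommand, one positional argument,
    # with choices restricted to the configured reports.
    parser = argparse.ArgumentParser(prog="etl")
    sub = parser.add_subparsers(dest="command", required=True)
    run = sub.add_parser("run", help="run a configured report end to end")
    run.add_argument("report", choices=REPORTS)
    return parser

args = build_parser().parse_args(["run", "weekly-account-summary"])
print(REPORTS[args.report])  # → ['extract', 'validate', 'export']
```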

Business Impact

Time Savings

8+ hours saved weekly through automation of repeated extraction, validation, and reporting workflows.

Auditability

Every process is documented, version-controlled, and recreatable. No more "how did we calculate this last quarter?"

Lower Barrier

Team members can solve new problems by composing existing modules rather than starting from scratch.

Python
oracledb
pandas
openpyxl
Plotly
D3.js

Skills Demonstrated

Framework Design • Modular Architecture • Database Connectivity • ETL Pipeline Development • Data Validation • Excel Automation • CLI Tool Development • Process Documentation • Team Enablement • Visualization Prototyping • AI-Assisted Development

8+ Hours Saved/Week • 15x Faster Reports • 100% Audit Trail • 1 Command to Run
PYTHON • ETL PIPELINES • ORACLE DB • DATA VALIDATION • EXCEL AUTOMATION • CLI TOOLS • D3.JS • PLOTLY • MODULAR DESIGN • PROCESS AUTOMATION
BUILDER_PHILOSOPHY.md Core Principles

Research-Driven Development

I study frameworks and methodologies from diverse domains—functional programming, fault-tolerant systems, regulatory compliance, quantitative finance—and apply them to solve data analytics problems in novel ways.

Build to Learn

Every project is a learning laboratory. TabMine taught me hybrid architectures. FRAP taught me regulatory compliance engineering. TradingAlgo taught me iterative version evolution. I ship to understand.

AI as Force Multiplier

I use AI tools (Copilot, Claude, ChatGPT) not as crutches but as accelerators. I understand what I'm building and use AI to move faster, not to think for me.

Polyglot by Design

Python for ML/analytics. Elixir for concurrency and fault tolerance. Each language for its strengths. I build bridges between ecosystems rather than forcing one tool to do everything.

Continuous Learning

I read development news daily, study new frameworks, and build side projects to explore technologies before I need them professionally. Learning is not separate from work—it is the work.

Ship & Iterate

I believe in shipping imperfect systems and improving them based on real usage, not endless planning. Each version teaches what the next one needs.

PYTHON • ETL DEVELOPMENT • ORACLE DB • DATA PIPELINES • EXCEL AUTOMATION • VALIDATION • CLI TOOLS • D3.JS • PLOTLY • MODULAR ARCHITECTURE • PROCESS AUTOMATION • AI-ASSISTED DEV

READ MY INSIGHTS

Exploring data analytics, enterprise risk management, and the future of financial technology.
