# Technical & Commercial Proposal # Animal Stars Voting Platform --- **Submitted to:** Mandai Wildlife Group **Submitted by:** Apps Cyclone Technology JSC **Date:** January 2026 **Proposal Validity:** 30 days from submission date **Document Classification:** Confidential --- ## Table of Contents 1. [Cover Letter](#1-cover-letter) 2. [Executive Summary](#2-executive-summary) 3. [Company Profile](#3-company-profile) 4. [Understanding of Requirements](#4-understanding-of-requirements) 5. [Proposed Solution](#5-proposed-solution) 6. [Technical Architecture](#6-technical-architecture) 7. [Methodology & Approach](#7-methodology--approach) 8. [Project Team](#8-project-team) 9. [Project Timeline](#9-project-timeline) 10. [Deliverables](#10-deliverables) 11. [Pricing](#11-pricing) 12. [Terms & Conditions](#12-terms--conditions) 13. [Value Proposition](#13-value-proposition) 14. [Appendix](#14-appendix) --- ## 1. Cover Letter **Date:** January 27, 2026 **To:** Mandai Wildlife Group Procurement Department **Subject:** Proposal for Animal Stars Voting Platform Development Dear Sir/Madam, We are pleased to submit our proposal for the development and implementation of the **Animal Stars Voting Platform** in response to your Request for Quotation (RFQ). **Apps Cyclone Technology JSC** is an ISO 27001:2022 certified software development company founded in 2012, with over 70 professionals and 12+ years of experience building interactive digital platforms, mobile applications, and customer engagement solutions. We are pioneers in **AI-augmented software development**, leveraging advanced AI coding assistants and automation tools to deliver higher quality software at accelerated timelines. Having completed 1,000+ projects for 100+ clients across Singapore, Australia, the UK, the US, and Vietnam, we are confident in our ability to deliver a robust, scalable, and user-friendly voting platform that meets all your specifications. **Key Highlights of Our Proposal:** - **AI-augmented development** delivering superior code quality with 45% faster development cycles - **End-to-end solution** covering online voting, onsite terminals, live leaderboard, and reward system - **Proven technology stack** using React, Node.js, and PostgreSQL for reliability and performance - **Full CIAM integration** with your existing Customer Identity Access Management system - **Real-time synchronization** across all touchpoints (web, mobile, terminals, TV displays) - **Comprehensive support** including remote and onsite technical assistance throughout the campaign - **Cost-effective pricing** at USD $67,680 — AI productivity gains passed directly to client We are committed to meeting the **non-negotiable deadline of 24 August 2026** for platform readiness and the **1 September 2026 launch date**. We look forward to the opportunity to partner with Mandai Wildlife Group on this exciting project. Sincerely, **Tam Nhat Ton** Sales Director Apps Cyclone Technology JSC Website: https://appscyclone.com --- ## 2. 
Executive Summary ### 2.1 Project Overview | Item | Details | |------|---------| | **Project Name** | Animal Stars Voting Platform | | **Client** | Mandai Wildlife Group | | **Campaign Duration** | 1 Sep - 31 Oct 2026 (8 weeks) | | **Platform Ready Date** | 24 Aug 2026 | | **Scope** | 5 Parks (SZ, RW, NS, BP, RF) | ### 2.2 Solution Summary We propose a comprehensive **omni-channel voting platform** that enables guests to vote for their favorite animals across Mandai's five wildlife parks through: - **Online Voting** - Responsive web platform accessible from any device - **Onsite Voting Terminals** - 15 tablet kiosks across 5 parks - **Live Leaderboard** - Real-time vote display on 5 TV screens and embeddable widget - **Reward System** - Gamified spin-the-wheel with prize management ### 2.3 Key Features | Feature | Description | |---------|-------------| | CIAM Integration | Seamless authentication with WildPass, FOM, and Registered Users | | Tiered Voting | 5/10/15 daily votes based on membership tier | | Booster Codes | Partner/sponsor promotional codes for bonus votes | | Real-time Sync | < 5 second latency across all touchpoints | | Analytics | Adobe/Google Analytics integration + custom dashboard | | Security | Pen-tested, anti-fraud measures, PDPA compliant | ### 2.4 Investment Summary | Category | Amount (USD) | |----------|--------------| | Platform Development (AI-Augmented) | $37,436 | | Hardware (Tablets + TVs) | $15,405 | | Support & Maintenance (8 weeks) | $14,840 | | **Total Investment** | **$67,680** | > **Cost Advantage:** Our AI-augmented development approach reduces platform development costs by **42%** compared to traditional methods, with no compromise on quality or features. *All prices in USD. Detailed pricing breakdown in Section 11.* --- ## 3. 
Company Profile ### 3.1 About Apps Cyclone Technology JSC | Item | Details | |------|---------| | **Company Name** | Apps Cyclone Technology JSC | | **Founded** | 2012 | | **Headquarters** | 168/6 Bui Thi Xuan, Ward 03, Tan Binh District, Ho Chi Minh City, Vietnam | | **Employees** | 70+ professionals | | **Website** | https://appscyclone.com | | **Core Services** | Web/Mobile Development, UX/UI Design, Blockchain, AI Solutions | | **Track Record** | 1,000+ projects delivered for 100+ clients globally | | **Global Clients** | Singapore, Australia, UK, Netherlands, USA, Canada, Japan | ### 3.2 Our Expertise **Apps Cyclone Technology JSC** specializes in: - **AI-augmented software development** using cutting-edge AI coding assistants - Mobile app development (iOS, Android, Cross-platform) - Web application development (React, Node.js, Laravel) - Interactive digital platforms and engagement solutions - Real-time data synchronization applications - Customer loyalty and reward platforms - Enterprise system integration (SSO, CIAM, Payment) - UX/UI Design for consumer-facing products ### 3.2.1 AI Development Capabilities Our team leverages AI tools throughout the entire development lifecycle: | Capability | AI Tool / Approach | Benefit | |------------|-------------------|---------| | Code Generation | AI coding assistants (Claude Code, GitHub Copilot) | 40-60% faster feature development | | Code Review | AI-powered static analysis | Higher code quality, fewer bugs | | Testing | AI-generated test suites | 80%+ test coverage automatically | | Documentation | AI-assisted technical writing | Comprehensive docs in fraction of time | | Architecture | AI-assisted design patterns | Best-practice implementations | | Debugging | AI-powered diagnostics | Faster issue resolution | ### 3.3 Why We Are Qualified | Qualification | Evidence | |--------------|----------| | **AI-Augmented Development** | Pioneering AI-assisted development methodology since 2024 | | **Technical Expertise** | 12+ years experience in React, Node.js, PostgreSQL | | **Proven Track Record** | 1,000+ projects delivered for 100+ clients | | **Integration Experience** | Proven CIAM/SSO integration capabilities | | **Singapore Experience** | Served multiple Singapore-based clients | | **Scalability Track Record** | Systems handling 10,000+ concurrent users | | **Full-stack Team** | Expert AI-augmented engineers with 12+ years experience | ### 3.4 Certifications & Compliance - **ISO 27001:2022** - Information Security Management System - PDPA compliant data handling practices - OWASP security standards adherence --- ## 4. Understanding of Requirements ### 4.1 Business Context Mandai Wildlife Group seeks to increase guest engagement and awareness through an interactive voting campaign across their five wildlife parks. The "Animal Stars" campaign will allow visitors to vote for their favorite animals, creating excitement and encouraging repeat visits and membership sign-ups. 
### 4.2 Key Requirements Summary

#### 4.2.1 Functional Requirements

| Module | Key Requirements |
|--------|------------------|
| **Authentication** | CIAM integration, Email OTP, WildPass/SingPass support |
| **Voting** | Tiered daily votes, park-based allocation, booster codes |
| **Leaderboard** | Real-time sync, 5-park view, embeddable iFrame |
| **Rewards** | Gamification (spin wheel), prize management, QR redemption |
| **Admin** | Campaign config, code management, reporting dashboard |
| **Hardware** | 15 tablets, 5 TVs, kiosk mode, technical support |

#### 4.2.2 Non-Functional Requirements

| Requirement | Target |
|-------------|--------|
| Performance | < 3s page load, < 500ms API response |
| Scalability | 10,000+ concurrent users |
| Availability | 99.9% uptime |
| Security | Pen-tested, OWASP compliant |

### 4.3 Critical Success Factors

1. **On-time delivery** - Platform ready by 24 Aug 2026 (non-negotiable)
2. **Seamless CIAM integration** - Flawless authentication experience
3. **Real-time performance** - Instant leaderboard updates across all channels
4. **User experience** - Intuitive, engaging voting journey
5. **Reliability** - Zero downtime during 8-week campaign

### 4.4 Identified Challenges & Our Approach

| Challenge | Our Approach |
|-----------|--------------|
| CIAM Integration complexity | Early engagement, parallel development with mock auth |
| High traffic at launch | Load testing, auto-scaling, CDN implementation |
| Real-time sync at scale | Redis caching, WebSocket architecture |
| Hardware reliability onsite | Spare devices, rapid replacement SLA |
| Vote manipulation prevention | Server-side validation, rate limiting, monitoring |

---

## 5. Proposed Solution

### 5.1 Solution Overview

We propose a **modern, scalable, cloud-native voting platform** built on proven technologies:

```
              ANIMAL STARS VOTING PLATFORM

   Web        Mobile       Tablet        TV
  Portal       Web         Kiosk       Display
    |           |            |            |
    +-----------+-----+------+------------+
                      |
         API Gateway (Load Balanced)
                      |
           +----------+----------+
           |          |          |
         Auth       Voting     Reward
        Service    Service    Service
           |          |          |
           +----------+----------+
                      |
             PostgreSQL + Redis
```

### 5.2 Module Breakdown

#### Module 1: Authentication & User Management

| Feature | Description |
|---------|-------------|
| CIAM Integration | Full API integration for login, sign-up, profile retrieval |
| User Type Detection | Automatic identification of Registered/WildPass/FOM |
| Session Management | Persistent login per CIAM requirements |
| OTP Verification | Email-based one-time password for existing users |

#### Module 2: Voting Engine

| Feature | Description |
|---------|-------------|
| Vote Allocation | Tiered system: 5/10/15 daily votes |
| Park Distribution | 1/2/3 votes per park based on tier |
| Real-time Counter | Live update of remaining votes |
| Booster Codes | Partner promotional codes with targeted allocation |
| Vote Recording | Tamper-proof server-side vote storage |

#### Module 3: Live Leaderboard

| Feature | Description |
|---------|-------------|
| Real-time Updates | WebSocket-based < 5s sync |
| Multi-park View | All 5 parks in organized layout |
| Embeddable Widget | iFrame for Mandai.com integration |
| TV Display Mode | Full-screen leaderboard for onsite TVs |
| Animation | +1 vote effect for engagement |

#### Module 4: Reward System

| Feature | Description |
|---------|-------------|
| Gamification | Spin-the-wheel with configurable prizes |
| Prize Engine | Probability-based distribution over 8 weeks |
| Inventory Management | Real-time prize stock control |
| QR Redemption | Unique codes with anti-fraud protection |
| Email Delivery | Automated prize notification |

#### Module 5: Admin Dashboard

| Feature | Description |
|---------|-------------|
| Campaign Config | Start/end dates, vote rules, contestants |
| Code Management | Generate, track, expire booster codes |
| Analytics | Real-time participation, voting, redemption metrics |
| Vote Control | Authorized adjustments with audit trail |
| Reporting | Export to CSV/Excel |

#### Module 6: Onsite Hardware Solution

| Component | Specification |
|-----------|---------------|
| Tablets | iPad 10.9" or equivalent Android tablet |
| Kiosk Stands | Secure anti-theft floor stands |
| TV Displays | 43" commercial-grade displays |
| Kiosk Software | Guided Access / Kiosk mode lock |

### 5.3 User Journey

```
                          USER VOTING JOURNEY

  STEP 1        STEP 2        STEP 3        STEP 4        STEP 5
  Welcome  -->  Vote     -->  Cast     -->  View     -->  Reward
  & Login       Balance       Votes         Leaders       Spin

  - Email       - Show        - 5-park      - Real-       - Spin
    input         vote          grid          time          wheel
  - CIAM          count       - Select        sync        - Prize
    auth        - Booster       animal      - +1            reveal
  - OTP           code        - Submit        effect      - Email
    verify        entry                                     prize
```

---

## 6. Technical Architecture

### 6.1 Technology Stack

| Layer | Technology | Justification |
|-------|------------|---------------|
| **Frontend** | React 18 + TypeScript | Modern, component-based, type-safe |
| **UI Framework** | Tailwind CSS + Framer Motion | Responsive design, smooth animations |
| **Backend** | Node.js 20 + Express | High performance, event-driven |
| **Database** | PostgreSQL 15 | ACID compliant, reliable, scalable |
| **Cache** | Redis 7 | Real-time leaderboard, session storage |
| **Real-time** | Socket.io | WebSocket for live updates |
| **Email** | SendGrid / AWS SES | Reliable email delivery |
| **Storage** | AWS S3 / CloudFlare R2 | Media assets, static files |
| **CDN** | CloudFlare | Global distribution, DDoS protection |
| **Hosting** | AWS / Azure | Enterprise-grade cloud infrastructure |
| **Monitoring** | DataDog / New Relic | Performance monitoring, alerting |
| **AI Development** | Claude Code, GitHub Copilot, Cursor | AI-augmented coding, review, testing |
| **AI Testing** | AI-generated test suites | Automated test case generation |
| **AI Documentation** | AI-assisted doc generation | Auto-generated API docs, user guides |

### 6.2 System Architecture Diagram

```
                      CLOUD INFRASTRUCTURE

  CloudFlare CDN + WAF  -->  AWS / Azure Load Balancer (ALB)
                                      |
                  +-------------------+-------------------+
                  |                   |                   |
                 App                 App                 App
                Node 1              Node 2              Node 3
                  |                   |                   |
                  +-------------------+-------------------+
                                      |
                          +-----------+-----------+
                          |                       |
                     PostgreSQL                 Redis
                      Primary                  Cluster
                          |
                     PostgreSQL
                       Replica

                     EXTERNAL INTEGRATIONS
  Mandai CIAM | SendGrid Email | Adobe Analytics | Mandai App
```

### 6.3 Database Schema (Key Entities)

```
users                    votes                    animals
- id (PK)                - user_id (FK)           - id (PK)
- email                  - animal_id (FK)         - name
- account_type           - vote_count             - park_id (FK)
- wildpass_id            - source                 - image_url
- created_at             - voted_at               - description
- last_login             - booster_code_id        - display_order

booster_codes            rewards                  parks
- id (PK)                - id (PK)                - id (PK)
- code                   - user_id (FK)           - name
- extra_votes            - prize_type             - short_code
- target_animal_id       - prize_value            - description
- created_by             - qr_code
- expires_at             - redeemed_at            leaderboard
- used_count             - created_at             - animal_id (PK)
- max_uses                                        - total_votes
                         prize_pool               - rank
                         - id (PK)                - updated_at
                         - prize_type
                         - probability
                         - total_quantity
                         - remaining
                         - daily_cap
```

### 6.4 API Endpoints (Key)

| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/api/auth/login` | Initiate login (email input) |
| POST | `/api/auth/verify-otp` | Verify email OTP |
| GET | `/api/auth/profile` | Get user profile & tier |
| GET | `/api/votes/balance` | Get remaining votes for today |
| POST | `/api/votes/cast` | Submit votes |
| POST | `/api/votes/booster` | Apply booster code |
| GET | `/api/leaderboard` | Get current leaderboard |
| WS | `/ws/leaderboard` | Real-time leaderboard updates |
| POST | `/api/rewards/spin` | Spin the wheel |
| GET | `/api/rewards/history` | Get user's rewards |
| POST | `/api/rewards/redeem` | Redeem a reward |

### 6.5 Security Architecture

```
                        SECURITY LAYERS

  Layer 1: EDGE SECURITY
  CloudFlare WAF | DDoS Protection | Rate Limiting | Bot Detection

  Layer 2: TRANSPORT SECURITY
  TLS 1.3 | Certificate Pinning | HSTS

  Layer 3: APPLICATION SECURITY
  JWT Auth | Input Validation | CSRF Protection | XSS Prevention

  Layer 4: DATA SECURITY
  AES-256 Encryption | Hashed Passwords | PII Masking | Audit Logs

  Layer 5: ANTI-FRAUD
  Server-side Vote Counting | IP Tracking | Device Fingerprinting
  One-time QR Codes | Redemption Verification
```

---

## 7. Methodology & Approach

### 7.1 Development Methodology: AI-Augmented Agile

We employ an **AI-Augmented Agile** methodology — a modern evolution of Scrum that integrates AI tools at every stage of the development lifecycle. This approach achieves **~45% productivity gains** while maintaining rigorous quality standards.

```
                 AI-AUGMENTED AGILE SPRINT CYCLE

  Sprint Planning --> Daily Standups --> Sprint Review --> Sprint Retro --> Sprint Planning (next)
                            |
                  AI-Augmented Dev Cycle
                  - AI Code Generation
                  - Human Review
                  - AI Testing
                  - AI Code Review
                  - Human Validation
```

#### How AI Augments Each Phase

| Phase | Traditional | AI-Augmented | Productivity Gain |
|-------|-------------|--------------|-------------------|
| Requirements to Code | Manual coding from specs | AI generates initial code from specs, engineer refines | 40-60% faster |
| Code Review | Peer review only | AI pre-review + peer review | 50% faster, higher quality |
| Unit Testing | Manual test writing | AI generates comprehensive test suites | 60-70% faster |
| Integration Testing | Manual test scenarios | AI generates edge cases + human validation | 40-50% faster |
| Documentation | Manual writing | AI generates from code + human editing | 50-70% faster |
| Bug Fixing | Manual debugging | AI diagnostics + suggested fixes | 30-40% faster |

### 7.2 Project Phases

#### Phase 1: Discovery & Design (Week 1-3)

| Activity | Deliverable |
|----------|-------------|
| Kick-off meeting | Project charter, Communication plan |
| Requirements refinement | Detailed specifications |
| UX/UI Design | Wireframes, Mockups, Design system |
| Architecture design | Technical architecture document |
| CIAM API analysis | Integration specification |

#### Phase 2: Core Development (Week 4-17)

| Sprint | Focus Area | AI Advantage |
|--------|------------|--------------|
| Sprint 1-2 | Authentication module, CIAM integration | AI generates API wrappers, auth flows |
| Sprint 3-4 | Voting engine, Vote management | AI generates business logic, validation rules |
| Sprint 5 | Leaderboard, Real-time sync | AI generates WebSocket handlers, UI components |
| Sprint 6 | Reward system, Prize engine | AI generates probability algorithms, gamification UI |
| Sprint 7 | Admin dashboard | AI generates CRUD interfaces, analytics views |

> **Timeline Compression:** AI-augmented development compresses the core development phase from 9 sprints (18 weeks) to 7 sprints (14 weeks), allowing more buffer time for integration and testing.
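To illustrate the kind of Sprint 3-4 business logic our AI assistants draft and our engineers then review, the sketch below shows one possible server-side check for the tiered daily vote allocation (5/10/15 votes per day, 1/2/3 per park, reset at 00:00 SGT). It is a minimal illustration only, assuming the Node.js 20 + PostgreSQL stack from Section 6.1; the table and column names, tier labels, and function wiring are placeholders rather than the final implementation, and booster-code bonus votes are omitted for brevity.

```typescript
// Sketch only: schema names and tier limits below are illustrative assumptions,
// not the delivered implementation.
import { Pool } from "pg";

const pool = new Pool(); // connection settings taken from standard PG* env vars

// Daily allocation per membership tier (5/10/15 total, 1/2/3 per park).
const DAILY_LIMITS: Record<string, { total: number; perPark: number }> = {
  registered: { total: 5, perPark: 1 },
  wildpass: { total: 10, perPark: 2 },
  fom: { total: 15, perPark: 3 },
};

export interface VoteBalance {
  totalRemaining: number;
  parkRemaining: number;
}

// Server-side balance check: client-side counters are never trusted.
export async function getVoteBalance(
  userId: string,
  parkId: string,
  accountType: string
): Promise<VoteBalance> {
  const limits = DAILY_LIMITS[accountType];
  if (!limits) throw new Error(`Unknown account type: ${accountType}`);

  // Votes reset at midnight SGT, so compare against the Singapore calendar date.
  const { rows } = await pool.query(
    `SELECT
       COUNT(*)                               AS total_today,
       COUNT(*) FILTER (WHERE a.park_id = $2) AS park_today
     FROM votes v
     JOIN animals a ON a.id = v.animal_id
     WHERE v.user_id = $1
       AND (v.voted_at AT TIME ZONE 'Asia/Singapore')::date =
           (now() AT TIME ZONE 'Asia/Singapore')::date`,
    [userId, parkId]
  );

  const totalToday = Number(rows[0].total_today);
  const parkToday = Number(rows[0].park_today);

  return {
    totalRemaining: Math.max(0, limits.total - totalToday),
    parkRemaining: Math.max(0, limits.perPark - parkToday),
  };
}
```

In the delivered platform, the same kind of query would back `GET /api/votes/balance` and would be re-checked inside the `POST /api/votes/cast` transaction as part of the anti-fraud measures described in Section 6.5.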
#### Phase 3: Integration & Testing (Week 18-22)

| Activity | Deliverable | AI Advantage |
|----------|-------------|--------------|
| Landing page integration | Seamless campaign flow | AI generates integration code |
| Mandai App integration | Deep link, WebView support | AI generates bridge handlers |
| Analytics implementation | Tracking events configured | AI generates tracking code |
| UAT preparation | Test environment, AI-generated test cases | AI creates comprehensive test suites |
| UAT execution | Bug reports, AI-assisted fixes | AI accelerates bug diagnosis and fixes |
| Penetration testing | Security assessment report | AI pre-scan before third-party test |

#### Phase 4: Deployment & Launch (Week 23-25)

| Activity | Deliverable | AI Advantage |
|----------|-------------|--------------|
| Production deployment | Live platform | AI-assisted deployment scripts |
| Hardware setup | Terminals installed at 5 parks | - |
| Staff training | Operational procedures | AI-generated training materials |
| Soft launch | Internal testing | AI-monitored load testing |
| Go-live | Campaign launch | AI-assisted monitoring |

> **Buffer Advantage:** The AI-augmented timeline finishes ~2 weeks earlier than traditional development, providing additional buffer before the non-negotiable 24 Aug 2026 deadline.

### 7.3 AI-Augmented Quality Assurance Process

```
                    AI-AUGMENTED QA PROCESS FLOW

  AI-Augmented Engineer --> AI + Human QA Review --> Product Owner --> Security Team

  1. AI Code Gen + Human Review
  2. AI-Generated Unit Tests + AI Code Review
  3. Human Validation
  4. AI Integration Tests (auto-generated)
  5. UAT Review
  6. AI Security Scan + Pen Test
  7. Pen Test Results
  8. AI-Monitored Production Deployment
```

**Key AI QA Enhancements:**

- AI generates 80%+ of unit tests automatically from code
- AI-powered code review catches issues before human review
- AI generates edge case test scenarios humans might miss
- AI pre-scans for OWASP vulnerabilities before pen testing
- AI monitors production for anomalies post-deployment

### 7.4 Communication Plan

| Meeting | Frequency | Participants | Purpose |
|---------|-----------|--------------|---------|
| Sprint Planning | Bi-weekly | Full team + PO | Plan sprint backlog |
| Daily Standup | Daily | Dev team | Progress sync |
| Sprint Review | Bi-weekly | Full team + Stakeholders | Demo deliverables |
| Steering Committee | Monthly | Project leads + Management | Strategic alignment |
| Ad-hoc Technical | As needed | Technical teams | CIAM integration, issues |

### 7.5 Risk Management Approach

| Risk Category | Monitoring | Escalation |
|---------------|------------|------------|
| Schedule | Weekly burn-down review | PM to Steering Committee |
| Technical | Daily standups | Tech Lead to PM |
| Integration | Integration status calls | PM to MWG IT |
| Quality | QA metrics dashboard | QA Lead to PM |

---

## 8. Project Team

### 8.1 Team Structure (AI-Augmented)

Our AI-augmented approach enables a **leaner, more efficient team** where each engineer leverages AI tools to achieve 2-3x individual productivity. This means fewer handoffs, faster communication, and higher accountability.
``` AI-AUGMENTED PROJECT ORGANIZATION Project Manager Hoa Doan | +--------------+------------------+ | | | Tech Lead AI-Augmented UX/UI + AI Architect Full-stack Designer Vu Dao Engineers x2 (TBD) | | | +---------+ | | | AI Coding AI Testing Assistants Automation - Claude Code - AI Test Generator - Copilot - AI Review - Cursor - AI Monitor ``` ### 8.2 Team Roles & Responsibilities | Role | Name | Responsibility | Allocation | |------|------|---------------|------------| | **Project Manager** | Hoa Doan | Overall delivery, stakeholder communication, risk management | 100% | | **Tech Lead + AI Architect** | Vu Dao | Architecture, AI workflow design, code review, CIAM integration | 100% | | **AI-Augmented Full-stack Engineer** | TBD | Full-stack development with AI coding assistants, frontend + backend | 100% | | **AI-Augmented Full-stack Engineer** | TBD | Full-stack development with AI coding assistants, admin + integrations | 100% | | **UX/UI Designer** | TBD | User experience, interface design, prototypes | 40% | > **Team Efficiency:** Traditional team of 8 people to AI-augmented team of 5. Each AI-augmented engineer delivers equivalent output of 2-3 traditional developers through AI pair programming, AI-generated tests, and AI-assisted code review. QA and DevOps functions are integrated into each engineer's workflow via AI tools. ### 8.3 Key Personnel Profiles #### Project Manager - Hoa Doan - Experienced digital project manager at Apps Cyclone - Agile/Scrum methodology practitioner - Track record of on-time delivery for engagement platforms - Client-facing communication and stakeholder management #### Technical Lead + AI Architect - Vu Dao - Senior full-stack architect at Apps Cyclone - Expert in React, Node.js, PostgreSQL, Redis - Pioneer in AI-augmented development workflows - Extensive experience with CIAM/SSO integrations and real-time systems - Led architecture for multiple high-traffic consumer platforms using AI-accelerated delivery ### 8.4 Support Team (During Campaign) | Role | Responsibility | Availability | |------|---------------|--------------| | Support Engineer | Remote monitoring, issue resolution | 9am - 6pm SGT | | On-call Engineer | Critical issue escalation | 24/7 | | Onsite Technician | Hardware troubleshooting | Park operating hours | --- ## 9. 
Project Timeline ### 9.1 High-Level Timeline (AI-Accelerated) ``` AI-ACCELERATED PROJECT TIMELINE 2026 FEB MAR APR MAY JUN JUL AUG ---- ---- ---- ---- ---- ---- ------ Discovery & Design AI-Augmented Dev (compressed) Integration + Testing Deploy + Buffer Kick-off Ready 24 Aug Start Feb Early Deadline Aug ``` ### 9.2 Detailed Milestone Schedule | Phase | Milestone | Start Date | End Date | Duration | |-------|-----------|------------|----------|----------| | **Phase 1** | **Discovery & Design** | | | **3 weeks** | | | Kick-off Meeting | Start Feb | Week 1 | 1 day | | | Requirements Finalization | Week 1 | Week 2 | 2 weeks | | | UX/UI Design Approval | Week 2 | Week 3 | 1.5 weeks | | | Technical Architecture Sign-off | Week 2 | Week 3 | 1 week | | **Phase 2** | **AI-Augmented Core Development** | | | **14 weeks** | | | Authentication + CIAM | Week 4 | Week 7 | 4 weeks | | | Voting Engine | Week 8 | Week 11 | 4 weeks | | | Leaderboard + Real-time | Week 12 | Week 13 | 2 weeks | | | Reward System | Week 14 | Week 15 | 2 weeks | | | Admin Dashboard | Week 16 | Week 17 | 2 weeks | | **Phase 3** | **Integration & Testing** | | | **5 weeks** | | | System Integration | Week 15 | Week 18 | 4 weeks | | | UAT (AI-generated test cases) | Week 19 | Week 21 | 3 weeks | | | Penetration Test | Week 21 | Week 22 | 2 weeks | | | Bug Fixes (AI-assisted) | Week 21 | Week 23 | 2 weeks | | **Phase 4** | **Deployment + Buffer** | | | **3 weeks** | | | Production Deployment | Week 23 | Week 24 | 1 week | | | Hardware Setup | Week 24 | Week 25 | 1 week | | | Training & Handover | Week 25 | Week 25 | 3 days | | | **Go-Live Ready** | **~10 Aug 2026** | | | | | **Buffer Period** | **10-24 Aug 2026** | | **~2 weeks** | | **Phase 5** | **Campaign Support** | | | **8 weeks** | | | Campaign Live | 1 Sep 2026 | 31 Oct 2026 | 8 weeks | | | Post-campaign Reporting | 1 Nov 2026 | 7 Nov 2026 | 1 week | > **Timeline Advantage:** AI-augmented development completes the platform approximately 2 weeks ahead of the non-negotiable 24 Aug 2026 deadline, providing valuable buffer time for final polish and stakeholder review. ### 9.3 Key Milestones | # | Milestone | Date | Dependency | |---|-----------|------|------------| | M1 | Kick-off Complete | Start Feb 2026 | Contract signed | | M2 | Design Approved | Mid Feb 2026 | Stakeholder review | | M3 | CIAM Integration Complete | End Mar 2026 | CIAM API access | | M4 | 1st Draft Demo | End Apr 2026 | Core modules done | | M5 | 2nd Draft Demo | End May 2026 | Full features (AI-accelerated) | | M6 | UAT Sign-off | End Jul 2026 | AI-generated tests + bug fixes | | M7 | Pen Test Passed | Early Aug 2026 | Security review | | M8 | **Platform Ready** | **~10 Aug 2026** | **AI-accelerated completion** | | M9 | **Buffer Period** | **10-24 Aug 2026** | **Additional safety margin** | | M10 | **Campaign Launch** | **1 Sep 2026** | **Non-negotiable** | | M11 | Campaign End | 31 Oct 2026 | - | --- ## 10. 
Deliverables ### 10.1 Deliverables List | # | Deliverable | Format | Delivery Date | |---|-------------|--------|---------------| | **Phase 1: Discovery & Design** | | | | | D1 | Project Charter | PDF | Week 1 | | D2 | Detailed Requirements Spec | PDF | Week 2 | | D3 | UX Wireframes | Figma | Week 3 | | D4 | UI Design Mockups | Figma | Week 4 | | D5 | Technical Architecture Doc | PDF | Week 4 | | **Phase 2: Development** | | | | | D6 | 1st Draft - Core Platform | Staging URL | Week 12 | | D7 | 2nd Draft - Full Features | Staging URL | Week 20 | | D8 | Admin Dashboard | Staging URL | Week 22 | | D9 | Source Code | Git Repository | Ongoing | | **Phase 3: Integration & Testing** | | | | | D10 | Integration Test Report | PDF | Week 21 | | D11 | UAT Test Cases | Excel | Week 22 | | D12 | UAT Sign-off Document | PDF | Week 24 | | D13 | Penetration Test Report | PDF | Week 25 | | **Phase 4: Deployment** | | | | | D14 | Production Platform | Live URL | Week 27 | | D15 | Hardware Installation | Onsite | Week 28 | | D16 | Deployment Guide | PDF | Week 28 | | D17 | Admin User Manual | PDF | Week 28 | | D18 | FAQ Documentation | PDF | Week 28 | | D19 | Training Session | Onsite/Virtual | Week 28 | | **Phase 5: Support** | | | | | D20 | Weekly Status Reports | Email | Weekly | | D21 | Campaign Analytics Reports | Dashboard | Real-time | | D22 | Final Campaign Report | PDF | Nov 2026 | | D23 | Technical Handover Package | ZIP | Nov 2026 | ### 10.2 Acceptance Criteria per Deliverable | Deliverable | Acceptance Criteria | |-------------|---------------------| | UX/UI Design | Approved by MWG Marketing | | Platform (1st Draft) | Core voting flow functional | | Platform (2nd Draft) | All features working, CIAM integrated | | UAT Sign-off | No critical/high severity bugs | | Pen Test | No critical vulnerabilities | | Production | 99.9% uptime, < 3s load time | --- ## 11. Pricing ### 11.1 Pricing Model **AI Productivity Factor:** ~45% reduction in development hours vs. traditional approach > **Note:** Our AI-augmented methodology significantly reduces the number of development hours required. We pass these savings directly to the client, resulting in a **42% lower development cost** compared to our traditional pricing model. ### 11.2 Pricing Summary | Category | Dev Hours | Dev Cost (USD) | PM/QC 30% | Other | Total (USD) | |----------|-----------|---------------|-----------|-------|-------------| | A. Platform Development (AI-Augmented) | 1,286 hrs | $25,720 | $7,716 | $4,000 | **$37,436** | | B. Hardware & Equipment | - | - | - | $15,405 | **$15,405** | | C. Support & Maintenance | 240 hrs | $4,800 | $1,440 | $8,600 | **$14,840** | | **Grand Total** | **1,526 hrs** | **$30,520** | **$9,156** | **$28,005** | **$67,680** | **Savings vs. 
Traditional Development:** | Metric | Traditional | AI-Augmented | Savings | |--------|------------|-------------|---------| | Dev Hours | 2,320 hrs | 1,286 hrs | **1,034 hrs (44.6%)** | | Dev Cost | $64,320 | $37,436 | **$26,884 (41.8%)** | | Total Cost | $94,560 | $67,680 | **$26,880 (28.4%)** | *All prices in USD.* ### 11.3 Detailed Pricing Breakdown *See Pricing tab for detailed line-by-line breakdown.* ### 11.4 Payment Schedule | Milestone | % | Amount (USD) | Due Date | |-----------|---|-------------|----------| | Contract Signing | 30% | $20,304 | Upon signing | | 1st Draft Delivery | 25% | $16,920 | End Apr 2026 | | 2nd Draft Delivery | 25% | $16,920 | End Jun 2026 | | Go-Live | 15% | $10,152 | 1 Sep 2026 | | Campaign Completion | 5% | $3,384 | Nov 2026 | | **Total** | **100%** | **$67,680** | | ### 11.5 Optional Line Items | # | Item | Description | Price (USD) | |---|------|-------------|-------------| | O1 | Campaign Extension | Per additional week (support + hosting) | $1,855/week | | O2 | Additional Voting Terminal | Per tablet + kiosk stand | $580/unit | | O3 | Additional Leaderboard TV | Per display + mount | $670/unit | | O4 | New Design Iteration | Major UI changes (AI-accelerated) | $1,760/iteration | | O5 | Additional Language | Translation + AI-assisted implementation | $1,430/language | | O6 | Extended Support | Beyond 8 weeks | $5,920/month | ### 11.6 Exclusions The following are NOT included in this proposal: - Content creation (animal descriptions, images) - Campaign landing page design/development (by creative agency) - Prize procurement and inventory - Network infrastructure at park locations - Third-party software licenses not specified - Travel and accommodation outside Singapore ### 11.7 Schedule of Rates (SOR) For any additional work outside scope: | Role | Hourly Rate (USD) | Daily Rate (USD) | |------|-------------------|------------------| | Project Manager | $26 | $208 | | Technical Lead + AI Architect | $25 | $200 | | AI-Augmented Senior Engineer | $20 | $160 | | AI-Augmented Engineer | $16 | $128 | | UX/UI Designer | $18 | $144 | | DevOps Engineer | $22 | $176 | --- ## 12. Terms & Conditions ### 12.1 Payment Terms - Payment within **30 days** of invoice date - All prices in US Dollars (USD) - GST applicable per Singapore regulations if required - Late payment: 1.5% per month ### 12.2 Project Terms #### 12.2.1 Scope Management - Change requests to be submitted in writing - Impact assessment provided within 3 business days - Approved changes may affect timeline and cost #### 12.2.2 Intellectual Property - All deliverables become MWG property upon full payment - Vendor retains rights to generic tools and frameworks - No third-party IP infringement warranty #### 12.2.3 Warranty - **90-day warranty** post go-live for defect fixes - Warranty covers bugs in delivered functionality - Does not cover new features or enhancements ### 12.3 Confidentiality - All project information treated as confidential - NDA terms apply as per contract - No disclosure to third parties without consent ### 12.4 Liability - Total liability limited to contract value - No liability for indirect/consequential damages - Force majeure clause applies ### 12.5 Termination - Either party may terminate with 30 days notice - Payment due for work completed up to termination - All deliverables to be handed over upon termination ### 12.6 Assumptions 1. MWG to provide CIAM API access by Feb 2026 2. MWG to provide animal content (images, descriptions) by Mar 2026 3. 
MWG to confirm prize inventory details by Jun 2026 4. Network connectivity available at all terminal locations 5. MWG IT team available for integration support 6. Creative assets provided by MWG's creative agency 7. UAT resources provided by MWG ### 12.7 Dependencies (MWG Responsibilities) | Item | Required By | Impact if Delayed | |------|-------------|-------------------| | CIAM API documentation | Week 2 | Blocks authentication development | | CIAM API access (sandbox) | Week 4 | Blocks integration testing | | Animal content | Week 8 | Blocks voting UI finalization | | Prize details | Week 16 | Blocks reward system | | UAT testers | Week 22 | Blocks UAT | | Network at parks | Week 26 | Blocks hardware setup | --- ## 13. Value Proposition ### 13.1 Why Choose Apps Cyclone Technology JSC | Differentiator | Benefit to MWG | |----------------|----------------| | **AI-Augmented Development** | 45% faster delivery, 42% lower development cost | | **12+ Years Experience** | 1,000+ projects, 100+ global clients since 2012 | | **ISO 27001:2022 Certified** | Enterprise-grade security and data protection | | **Singapore Client Experience** | Proven track record serving SG-based organizations | | **70+ Professionals** | Expert AI-augmented engineers, designers, and PMs | | **End-to-End Solution** | Single vendor for platform + hardware + support | | **Superior Code Quality** | AI-assisted code review + testing = fewer bugs | | **On-time Delivery** | AI acceleration provides additional timeline buffer | ### 13.2 AI-Augmented Advantage Our AI-augmented development approach delivers measurable benefits: | Metric | Traditional | AI-Augmented | Impact | |--------|------------|-------------|--------| | Development Hours | 2,320 hrs | 1,286 hrs | **44.6% fewer hours** | | Development Cost | $64,320 | $37,436 | **$26,884 savings** | | Test Coverage | ~60% typical | 80%+ target | **Higher quality** | | Documentation | Manual effort | AI-generated | **More comprehensive** | | Code Review | Peer only | AI + peer | **Fewer production bugs** | | Timeline Buffer | Tight | 2+ weeks extra | **Lower delivery risk** | ### 13.3 Our Commitment 1. **AI-Augmented Team** - Skilled engineers amplified by cutting-edge AI tools 2. **Transparent Communication** - Weekly status updates, open escalation channels 3. **Superior Quality** - AI-powered testing and code review, security-first approach 4. **Cost Transparency** - AI productivity gains passed directly to client 5. **Post-launch Support** - Comprehensive support throughout campaign 6. **Knowledge Transfer** - AI-generated comprehensive documentation and training ### 13.4 Risk Mitigation | Risk | Our Mitigation Strategy | |------|------------------------| | Timeline pressure | AI-accelerated development provides 2+ weeks buffer | | CIAM integration | Early engagement, AI-generated mock services | | Traffic spikes | Load testing, auto-scaling architecture | | Hardware failures | Spare devices, rapid replacement SLA | | Security threats | AI pre-scanning + pen testing + monitoring | | AI tool dependency | All AI output reviewed by senior engineers; fallback to manual if needed | --- ## 14. 
Appendix ### 14.1 Glossary | Term | Definition | |------|------------| | CIAM | Customer Identity Access Management | | FOM | Friends of Mandai (membership program) | | UAT | User Acceptance Testing | | OTP | One-Time Password | | SZ | Singapore Zoo | | RW | River Wonders | | NS | Night Safari | | BP | Bird Paradise | | RF | Rainforest Wild | | WildPass | Mandai's digital loyalty program | ### 14.2 Reference Documents 1. RFQ - Animal Stars Voting Platform.pdf 2. Animal Stars Voting Platform_RFQ cover.docx 3. Animal Stars Voting Platform_Vendor to Quote.xlsx 4. Animal Stars Voting Platform - STCs.docx 5. MWG Info Sec Assessment Form.xlsx ### 14.3 Contact Information **Apps Cyclone Technology JSC** | Role | Name | |------|------| | Sales Contact | Tam Nhat Ton | | Project Manager | Hoa Doan | | Technical Lead | Vu Dao | **Address:** 168/6 Bui Thi Xuan, Ward 03, Tan Binh District, Ho Chi Minh City, Vietnam **Website:** https://appscyclone.com --- ### 14.4 Document Revision History | Version | Date | Author | Changes | |---------|------|--------|---------| | 1.0 | Jan 2026 | Tam Nhat Ton | Initial proposal | | 2.0 | Jan 2026 | Tam Nhat Ton | Updated to AI-augmented development approach | --- **This proposal is valid for 30 days from the date of submission.** **Prepared by:** Tam Nhat Ton, Sales Director **Reviewed by:** Hoa Doan, Project Manager **Approved by:** Vu Dao, Technical Lead --- *Confidential - For Mandai Wildlife Group evaluation purposes only*
# Product Requirements Document (PRD) # Animal Stars Voting Platform **Client:** Mandai Wildlife Group **Version:** 1.0 **Date:** January 2026 **Status:** Draft **Confidentiality:** Private and Confidential --- ## Table of Contents 1. [Executive Summary](#1-executive-summary) 2. [Project Overview](#2-project-overview) 3. [Business Objectives](#3-business-objectives) 4. [Target Users](#4-target-users) 5. [Functional Requirements](#5-functional-requirements) 6. [Non-Functional Requirements](#6-non-functional-requirements) 7. [User Flow & Wireframes](#7-user-flow--wireframes) 8. [System Architecture](#8-system-architecture) 9. [Integration Requirements](#9-integration-requirements) 10. [Hardware Requirements](#10-hardware-requirements) 11. [Security Requirements](#11-security-requirements) 12. [Analytics & Reporting](#12-analytics--reporting) 13. [Reward System](#13-reward-system) 14. [Timeline & Milestones](#14-timeline--milestones) 15. [Acceptance Criteria](#15-acceptance-criteria) 16. [Assumptions & Constraints](#16-assumptions--constraints) 17. [Risks & Mitigation](#17-risks--mitigation) 18. [Appendix](#18-appendix) --- ## 1. Executive Summary ### 1.1 Project Description Animal Stars Voting Platform is an online and onsite voting system that allows visitors to nominate and vote for the next "animal star" across 5 parks under Mandai Wildlife Reserve. The platform aims to enhance customer interaction and engagement with the parks. > **Development Approach:** The project will be developed using **AI-Augmented Development** methodology, utilizing AI coding assistants (Claude Code, GitHub Copilot) to accelerate development by ~45%, ensuring higher code quality at optimized cost. ### 1.2 Campaign Overview - **Campaign Name:** Vote for Animal Stars - **Voting Period:** 1 September - 31 October, 2026 (8 weeks) - **Scope:** 5 parks at Mandai Wildlife Reserve - **Voting Channels:** Online (Web) + Onsite (Voting Terminals + QR Code) ### 1.3 Key Stakeholders | Role | Responsibility | |------|---------------| | Mandai Wildlife Group (MWG) | Product Owner, Final approval | | MWG Marketing Team | Campaign management, Content | | MWG IT Team | CIAM integration, Technical oversight | | Creative Agency | Landing page, Visual design | | Vendor (Development) | Platform development, Deployment | --- ## 2. Project Overview ### 2.1 Background Mandai Wildlife Group operates 5 leading wildlife parks in Singapore: - **SZ** - Singapore Zoo - **RW** - River Wonders - **NS** - Night Safari - **BP** - Bird Paradise - **RF** - Rainforest Wild The "Animal Stars" campaign aims to create deeper interaction with visitors by allowing them to vote for their favorite animals. ### 2.2 Problem Statement - Need to increase engagement and awareness for the parks - Create opportunities for customers to interact directly with marketing campaigns - Collect data on customer preferences - Encourage membership and WildPass registration ### 2.3 Solution Build an omni-channel voting platform with: - Web-based voting platform - Onsite voting terminals at parks - Live leaderboard real-time - Reward/gamification system --- ## 3. 
Business Objectives ### 3.1 Primary Objectives | Objective | Key Result | Metric | |-----------|------------|--------| | Increase engagement | High vote volume | Total votes, Unique voters | | Increase awareness | Wide reach | Page views, Sessions | | Increase membership | New sign-ups | New WildPass/FOM signups | | Collect data | Understand customer behavior | User segments, Voting patterns | ### 3.2 Success Metrics (KPIs) - **Total Votes:** TBD (Target) - **Unique Participants:** TBD - **Daily Active Voters:** TBD - **WildPass Conversion Rate:** TBD% - **Return Voter Rate:** TBD% - **Reward Redemption Rate:** TBD% --- ## 4. Target Users ### 4.1 User Segments #### 4.1.1 Registered Users (Non-residents) - International tourists - No Singapore NRIC/FIN - Register by email - **Vote allocation:** 5 votes/day (1 vote/park) #### 4.1.2 WildPass Holders (Residents) - Singapore residents with WildPass - Register via SingPass authentication - **Vote allocation:** 10 votes/day (2 votes/park) #### 4.1.3 Members - Friends of Mandai (FOM) - Paid members of Mandai - Have membership benefits - **Vote allocation:** 15 votes/day (3 votes/park) ### 4.2 User Personas #### Persona 1: Tourist Family - **Name:** Sarah & Family - **Type:** Registered User - **Behavior:** Visits 1-2 parks, wants to participate in interactive activities - **Goals:** Fun experience, campaign participation #### Persona 2: Local Wildlife Enthusiast - **Name:** David - **Type:** WildPass Holder - **Behavior:** Regular visitor, interested in animals - **Goals:** Support favorite animals, receive rewards #### Persona 3: Loyal Member - **Name:** Michelle - **Type:** Friends of Mandai Member - **Behavior:** Very frequent visitor, engaged with community - **Goals:** Maximize votes, influence results, exclusive rewards --- ## 5. Functional Requirements ### 5.1 Module 1: Authentication & User Management #### FR-1.1 Login/Sign-up Page | ID | Requirement | Priority | |----|-------------|----------| | FR-1.1.1 | Display welcome page with campaign branding | Must Have | | FR-1.1.2 | Input field for email address | Must Have | | FR-1.1.3 | Integrate CIAM API to verify existing account | Must Have | | FR-1.1.4 | Send Email OTP for existing users | Must Have | | FR-1.1.5 | Redirect to sign-up options for new users | Must Have | | FR-1.1.6 | Option to register as Registered User | Must Have | | FR-1.1.7 | Option to register as WildPass (SingPass auth) | Must Have | | FR-1.1.8 | Persistent login session (per CIAM requirements) | Must Have | #### FR-1.2 Account Type Detection | ID | Requirement | Priority | |----|-------------|----------| | FR-1.2.1 | Identify user segment from CIAM | Must Have | | FR-1.2.2 | Map vote allocation per segment | Must Have | | FR-1.2.3 | Store user preferences | Should Have | ### 5.2 Module 2: Vote Management #### FR-2.1 Vote Allocation Display | ID | Requirement | Priority | |----|-------------|----------| | FR-2.1.1 | Display remaining votes for current day | Must Have | | FR-2.1.2 | Real-time update when user votes | Must Have | | FR-2.1.3 | Show "Come back tomorrow" when votes exhausted | Must Have | | FR-2.1.4 | Reset votes at 00:00 SGT daily | Must Have | #### FR-2.2 Booster Code System | ID | Requirement | Priority | |----|-------------|----------| | FR-2.2.1 | Input field "Have a code?" 
| Must Have | | FR-2.2.2 | Validate booster code | Must Have | | FR-2.2.3 | Award additional votes from valid code | Must Have | | FR-2.2.4 | Restrict booster votes to specific animal (partner codes) | Must Have | | FR-2.2.5 | Admin interface to generate/manage booster codes | Must Have | | FR-2.2.6 | One-time use validation per user per code | Should Have | #### FR-2.3 Voting Interface | ID | Requirement | Priority | |----|-------------|----------| | FR-2.3.1 | Display 5 parks with contestants | Must Have | | FR-2.3.2 | Each park has 5 animal contestants (circles) | Must Have | | FR-2.3.3 | Click to allocate vote to animal | Must Have | | FR-2.3.4 | Allow multiple votes to same animal (per tier) | Must Have | | FR-2.3.5 | Real-time vote counter per park | Must Have | | FR-2.3.6 | Grey out options when votes exhausted for park | Must Have | | FR-2.3.7 | Submit button to confirm votes | Must Have | | FR-2.3.8 | Confirmation dialog before submit | Should Have | #### FR-2.4 Onsite Terminal Voting | ID | Requirement | Priority | |----|-------------|----------| | FR-2.4.1 | Default display terminal's park first | Must Have | | FR-2.4.2 | Kiosk mode - locked to voting app only | Must Have | | FR-2.4.3 | Auto-logout after period of inactivity | Must Have | | FR-2.4.4 | Touch-optimized interface | Must Have | ### 5.3 Module 3: Live Leaderboard #### FR-3.1 Leaderboard Display | ID | Requirement | Priority | |----|-------------|----------| | FR-3.1.1 | Real-time vote counts per animal per park | Must Have | | FR-3.1.2 | Ranking visualization (1st, 2nd, 3rd...) | Must Have | | FR-3.1.3 | 5-park view in one interface | Must Have | | FR-3.1.4 | Responsive design (Desktop + Mobile) | Must Have | | FR-3.1.5 | Animation when new vote arrives (+1 effect) | Should Have | | FR-3.1.6 | Leaderboard embeddable via iFrame | Must Have | #### FR-3.2 Real-time Synchronization | ID | Requirement | Priority | |----|-------------|----------| | FR-3.2.1 | WebSocket or polling for real-time updates | Must Have | | FR-3.2.2 | Sync across all touchpoints (web, terminals, TVs) | Must Have | | FR-3.2.3 | Max latency: 5 seconds | Should Have | #### FR-3.3 Admin Vote Control | ID | Requirement | Priority | |----|-------------|----------| | FR-3.3.1 | Backend interface to adjust vote counts | Must Have | | FR-3.3.2 | Audit log for vote adjustments | Must Have | | FR-3.3.3 | Role-based access control | Must Have | ### 5.4 Module 4: Reward System #### FR-4.1 Gamification Mechanism | ID | Requirement | Priority | |----|-------------|----------| | FR-4.1.1 | Spin-the-wheel / Digital scratch card / Gachapon | Must Have | | FR-4.1.2 | Configurable prize probability | Must Have | | FR-4.1.3 | Prize inventory management | Must Have | | FR-4.1.4 | Fair distribution over 8-week period | Must Have | | FR-4.1.5 | Anti-fraud measures | Must Have | #### FR-4.2 Prize Types | ID | Requirement | Priority | |----|-------------|----------| | FR-4.2.1 | Retail & F&B vouchers | Must Have | | FR-4.2.2 | Admission discounts | Must Have | | FR-4.2.3 | Bonus votes | Must Have | | FR-4.2.4 | "Try again tomorrow" option | Must Have | #### FR-4.3 Reward Delivery | ID | Requirement | Priority | |----|-------------|----------| | FR-4.3.1 | Prize notification on screen | Must Have | | FR-4.3.2 | Email delivery with prize details | Must Have | | FR-4.3.3 | Credit to user account (where applicable) | Should Have | | FR-4.3.4 | Unique QR code for redemption | Must Have | | FR-4.3.5 | Anti-double redemption system | Must Have | ### 5.5 Module 5: Tutorial & T&Cs 
#### FR-5.1 First-time User Tutorial | ID | Requirement | Priority | |----|-------------|----------| | FR-5.1.1 | Pop-up tutorial on first visit | Must Have | | FR-5.1.2 | Step-by-step voting guide | Must Have | | FR-5.1.3 | T&Cs acceptance checkbox | Must Have | | FR-5.1.4 | Skip option for returning users | Should Have | ### 5.6 Module 6: Admin Dashboard #### FR-6.1 Campaign Management | ID | Requirement | Priority | |----|-------------|----------| | FR-6.1.1 | Configure campaign start/end dates | Must Have | | FR-6.1.2 | Manage animal contestants (add/edit/remove) | Must Have | | FR-6.1.3 | Upload animal images and descriptions | Must Have | | FR-6.1.4 | Configure vote allocations per tier | Must Have | #### FR-6.2 Booster Code Management | ID | Requirement | Priority | |----|-------------|----------| | FR-6.2.1 | Generate single/bulk booster codes | Must Have | | FR-6.2.2 | Set code validity period | Must Have | | FR-6.2.3 | Link code to specific animal (for partners) | Must Have | | FR-6.2.4 | Track code usage | Must Have | #### FR-6.3 Reporting Dashboard | ID | Requirement | Priority | |----|-------------|----------| | FR-6.3.1 | Real-time participation metrics | Must Have | | FR-6.3.2 | Vote breakdown by park/animal/time | Must Have | | FR-6.3.3 | User segment analysis | Must Have | | FR-6.3.4 | Prize issuance/redemption tracking | Must Have | | FR-6.3.5 | Export to CSV/Excel | Should Have | --- ## 6. Non-Functional Requirements ### 6.1 Performance Requirements | ID | Requirement | Target | |----|-------------|--------| | NFR-1.1 | Page load time | < 3 seconds | | NFR-1.2 | API response time | < 500ms | | NFR-1.3 | Concurrent users support | 10,000+ | | NFR-1.4 | Leaderboard sync latency | < 5 seconds | | NFR-1.5 | System uptime | 99.9% | ### 6.2 Scalability Requirements | ID | Requirement | Target | |----|-------------|--------| | NFR-2.1 | Handle traffic spikes (weekends, holidays) | 3x normal | | NFR-2.2 | Database query optimization | Indexed queries | | NFR-2.3 | CDN for static assets | Global CDN | ### 6.3 Usability Requirements | ID | Requirement | Target | |----|-------------|--------| | NFR-3.1 | Mobile-first responsive design | All screen sizes | | NFR-3.2 | Touch-friendly UI for tablets | Min 44px touch targets | | NFR-3.3 | Accessibility (WCAG 2.1 AA) | Compliant | | NFR-3.4 | Multi-language support | EN (mandatory), ZH (optional) | ### 6.4 Compatibility Requirements | ID | Requirement | Target | |----|-------------|--------| | NFR-4.1 | Browser support | Chrome, Safari, Firefox, Edge (latest 2 versions) | | NFR-4.2 | Mobile OS | iOS 14+, Android 10+ | | NFR-4.3 | Tablet support | iPad, Android tablets | ### 6.5 Reliability Requirements | ID | Requirement | Target | |----|-------------|--------| | NFR-5.1 | Data backup frequency | Daily | | NFR-5.2 | Disaster recovery RTO | < 4 hours | | NFR-5.3 | Disaster recovery RPO | < 1 hour | --- ## 7. 
User Flow & Wireframes ### 7.1 Main User Journey ``` VOTING PLATFORM FLOW Step 1 Step 2 Step 3 Step 4 Step 5 Welcome --> Votes --> Vote --> Leader- --> Thank You & Login Available for board + Reward +Booster Animal | | | v v v CIAM Step 2A Spin Auth Tutorial Wheel + T&Cs ``` ### 7.2 Step 1: Welcome & Login Page **Screen Elements:** - Campaign logo (top) - Welcome message - Email input field - "Next" button - Sign-up options (for new users): - "Sign up as Registered User" - "Sign up as WildPass (SingPass required)" **Business Rules:** - Existing account: Email OTP verification then Dashboard - New account: Sign-up flow selection ### 7.3 Step 2: Votes Available + Booster **Screen Elements:** - Campaign logo - Vote counter: "You have X votes today." - "Have a code?" input field - "Next" button - Alternative view: "You've voted today. Come again tomorrow!" - "See live leaderboard" link **Business Rules:** - Display remaining votes for current day - Booster code validation in real-time - If voted: Show "come back tomorrow" message ### 7.4 Step 3: Vote for Animal **Screen Elements:** - Campaign logo - 5 park rows (SZ, RW, NS, BP, RF) - 5 animal circles per park - Vote counter per park - "Submit" button **Business Rules:** - Click animal: Allocate vote - Multiple votes allowed to same animal (based on tier) - Grey out park when votes exhausted - Onsite terminals: Default park shown first ### 7.5 Step 4: Live Leaderboard **Screen Elements:** - Campaign logo - 5-park leaderboard view - Ranking with animal images - Vote counts per animal - "Next" button **Business Rules:** - Real-time sync across all devices - +1 animation on new vote - Embeddable via iFrame ### 7.6 Step 5: Thank You + Rewards **Screen Elements:** - "Thank you for voting!" message - "Come back tomorrow to vote again" - Spin wheel / Scratch card / Gachapon - "Spin" button - Prize reveal - Reward delivery confirmation **Business Rules:** - 1 spin per voting session - Prize probability configured in backend - Prize emailed and/or credited to account --- ## 8. System Architecture ### 8.1 High-Level Architecture ``` PRESENTATION LAYER Web Client | Mobile Web | Onsite Tablets | TV Displays | API GATEWAY (Load Balancer + SSL) | +--------------+--------------+ | | | Authentication Voting Reward Service Service Service | | | +--------------+--------------+ | DATA LAYER Database | Cache | Message Queue | File Storage (PostgreSQL) (Redis) (RabbitMQ) (S3/CDN) | EXTERNAL INTEGRATIONS Mandai CIAM | Email Service | Analytics | Mandai App ``` ### 8.2 Technology Stack (Recommended) | Layer | Technology | Justification | |-------|------------|---------------| | Frontend | React.js / Next.js | Modern, responsive, SEO-friendly | | Backend | Node.js / NestJS | Scalable, real-time capable | | Database | PostgreSQL | Reliable, ACID compliant | | Cache | Redis | Fast, real-time leaderboard | | Real-time | WebSocket / Socket.io | Live updates | | CDN | CloudFlare / AWS CloudFront | Global distribution | | Hosting | AWS / Azure | Enterprise-grade | | AI Development | Claude Code, GitHub Copilot, Cursor | AI-augmented coding, review, testing | | AI Testing | AI-generated test suites | Automated test case generation, 80%+ coverage | | AI Documentation | AI-assisted doc generation | Auto-generated API docs, user guides | ### 8.2.1 AI-Augmented Development Methodology This project is developed using **AI-Augmented Development** -- a modern approach where AI coding assistants are integrated into every stage of the development lifecycle. 
#### Benefits of AI-Augmented Development | Aspect | Traditional | AI-Augmented | Improvement | |--------|-------------|-------------|-------------| | Code Generation | Manual coding | AI generates initial code, engineer refines | 40-60% faster | | Code Review | Peer review only | AI pre-review + peer review | Higher quality | | Unit Testing | Manual test writing | AI generates comprehensive test suites | 60-70% faster | | Documentation | Manual writing | AI generates from code + human editing | 50-70% faster | | Bug Fixing | Manual debugging | AI diagnostics + suggested fixes | 30-40% faster | #### AI-Augmented Development Cycle ``` AI-AUGMENTED DEVELOPMENT CYCLE Specs/Stories --> AI Code Generation --> Human Review | Deploy <-- AI Testing + QA Review <---------+ ``` #### Role of Engineer in AI-Augmented Model - **Architect & Guide:** Design architecture, guide AI to generate correct patterns - **Reviewer & Validator:** Review AI-generated code, ensure quality - **Integration Specialist:** Handle complex parts AI cannot handle well (CIAM, real-time sync) - **Quality Gatekeeper:** Ensure AI-generated code meets security and performance standards ### 8.3 Data Model (Simplified) ``` Users Votes Animals - id - id - id - email - user_id - name - account_type - animal_id - park_id - wildpass_id - voted_at - image_url - created_at - source - description BoosterCodes Parks Rewards - id - id - id - code - name - user_id - extra_votes - short_code - prize_type - animal_id - qr_code - used_by - redeemed_at - expires_at ``` --- ## 9. Integration Requirements ### 9.1 CIAM Integration (Mandatory) **Purpose:** User authentication and account management | API | Function | Priority | |-----|----------|----------| | Login API | Authenticate existing users | Must Have | | Sign-up API | Register new users | Must Have | | Profile API | Get user segment (Registered/WildPass/FOM) | Must Have | | OTP API | Email OTP verification | Must Have | **Requirements:** - Work closely with Mandai IT team - Follow CIAM security protocols - Handle session management per CIAM specifications ### 9.2 Landing Page Integration **Purpose:** Seamless transition from campaign webpage to voting platform | Item | Requirement | |------|-------------| | Entry point | Link/button from campaign landing page | | UTM tracking | Preserve campaign parameters | | Branding | Consistent with campaign design | ### 9.3 Mandai App Integration **Purpose:** Voting access from mobile app | Item | Requirement | |------|-------------| | Deep link | Open voting platform in app | | SSO | Seamless auth with app login | | Webview | Responsive in app webview | ### 9.4 Analytics Integration **Purpose:** Track user behavior and campaign performance | Platform | Events to Track | |----------|-----------------| | Adobe Analytics | Page views, User segments, Voting events | | Google Analytics | Sessions, Bounce rate, Conversion | **Key Events:** - `page_view`: Landing page, Voting page, Leaderboard - `login_success`: User type - `vote_cast`: Animal, Park, Vote count - `booster_code_used`: Code, Extra votes - `reward_won`: Prize type - `reward_redeemed`: Redemption method ### 9.5 Email Service Integration **Purpose:** Deliver prize notifications and confirmations | Email Type | Trigger | |------------|---------| | Prize won | After spin wheel | | Prize reminder | 3 days before expiry | | Campaign updates | Weekly (optional) | --- ## 10. 
Hardware Requirements ### 10.1 Onsite Voting Terminals **Quantity per Park:** | Equipment | Quantity per Park | Total (5 Parks) | |-----------|-------------------|-----------------| | Tablet Stand/Kiosk | 3 | 15 | | Tablet Device (iPad or equivalent) | 3 | 15 | | Leaderboard TV/Display | 1 | 5 | ### 10.2 Tablet Requirements | Specification | Requirement | |--------------|-------------| | Screen Size | 10" minimum | | OS | iOS 14+ / Android 10+ | | Connectivity | WiFi | | Kiosk Mode | Lockable to voting app only | | Security | Anti-theft mount | ### 10.3 Display Requirements (Leaderboard TV) | Specification | Requirement | |--------------|-------------| | Screen Size | 42" - 55" | | Resolution | 1080p minimum | | Connectivity | WiFi / Ethernet | | Mounting | Wall or stand | ### 10.4 Support Requirements | Service | Requirement | |---------|-------------| | Onsite Support | Available during park hours | | Remote Support | 24/7 monitoring | | Repair/Replacement | Within 24 hours | | Spare Equipment | 2 backup tablets | --- ## 11. Security Requirements ### 11.1 Data Protection | Requirement | Implementation | |-------------|----------------| | Data encryption at rest | AES-256 | | Data encryption in transit | TLS 1.3 | | PII handling | PDPA compliant | | Data retention | Per MWG policy | ### 11.2 Application Security | Requirement | Implementation | |-------------|----------------| | Authentication | OAuth 2.0 / OIDC via CIAM | | Session management | JWT with short expiry | | Rate limiting | 100 requests/minute/IP | | Input validation | Server-side validation | | SQL injection | Parameterized queries | | XSS protection | Content Security Policy | ### 11.3 Anti-Fraud Measures | Requirement | Implementation | |-------------|----------------| | Vote manipulation | Server-side vote counting | | Bot detection | CAPTCHA on suspicious activity | | IP tracking | Flag unusual patterns | | Code abuse | One-time use per user | | Double redemption | Unique QR + database check | ### 11.4 Security Testing | Test Type | Timing | |-----------|--------| | Penetration Test | Before launch | | Vulnerability Scan | Weekly during campaign | | Code Review | Before each deployment | --- ## 12. Analytics & Reporting ### 12.1 Real-time Dashboard **Metrics:** - Total votes (all time) - Votes today - Unique voters - Active sessions - Votes per park - Leading animals ### 12.2 Daily Reports **Metrics:** - New registrations - Daily active users - Vote distribution - Booster code usage - Rewards issued ### 12.3 Campaign Reports **Metrics:** - Total participants by segment - Conversion rates (visitor to voter) - WildPass sign-up rate - Peak voting times - Popular animals - Reward redemption rates ### 12.4 Export Capabilities | Format | Purpose | |--------|---------| | CSV | Raw data export | | Excel | Formatted reports | | PDF | Executive summary | --- ## 13. Reward System ### 13.1 Gamification Mechanism **Options (Vendor to propose):** 1. Spin-the-Wheel 2. Digital Scratch Card 3. 
Gachapon (Capsule Machine) ### 13.2 Prize Pool | Prize Type | Example | Probability | |------------|---------|-------------| | Premium | Annual Pass | 0.1% | | High Value | Day Pass | 1% | | Medium Value | F&B $20 voucher | 5% | | Low Value | Retail 10% off | 20% | | Bonus Votes | +5 votes | 30% | | Try Again | No prize | 43.9% | *Probabilities are indicative and to be confirmed by MWG* ### 13.3 Prize Distribution Rules - Prizes spread evenly over 8-week period - Daily caps on premium prizes - Weekend boost for engagement - Last week: Increased win probability ### 13.4 Redemption Flow 1. Win prize: Show on screen 2. Generate unique QR code 3. Send email with QR + details 4. Credit to user account (if applicable) 5. Guest shows QR at redemption point 6. Staff scans: Validates then Marks redeemed 7. System updates redemption status --- ## 14. Timeline & Milestones ### 14.1 Project Timeline | Week | Date | Milestone | AI Advantage | |------|------|-----------|-------------| | W1 | End Jan 2026 | Tender award to vendor | - | | W2 | Start Feb 2026 | Kick-off meeting | - | | W3-7 | Feb - Mar 2026 | 1st Draft development (AI-accelerated) | AI code generation reduces 45% effort | | W7 | Mar 2026 | CIAM integration start | AI generates API wrappers | | W8-14 | Apr - May 2026 | 2nd Draft + Integrations (AI-accelerated) | AI-generated tests + docs | | W15-18 | Jun - Jul 2026 | UAT + Bug fixes (AI-assisted) | AI generates test cases | | W19-22 | Jul - Aug 2026 | Final revisions + Pen test | AI pre-scan security | | W22 | ~10 Aug 2026 | **Platform Ready** | **~2 weeks buffer gained** | | W22-24 | 10-24 Aug 2026 | **Buffer Period** | Additional safety margin | | W25 | 1 Sep 2026 | **LAUNCH** | - | | W25-32 | Sep - Oct 2026 | Campaign live (8 weeks) | AI-assisted monitoring | | W33 | Nov 2026 | Campaign end + Reporting | AI-generated reports | ### 14.2 Key Dates (Non-negotiable) | Date | Milestone | Status | |------|-----------|--------| | **24 Aug 2026** | Platform Ready | FIXED | | **1 Sep 2026** | Campaign Launch | FIXED | | **31 Oct 2026** | Campaign End | FIXED | ### 14.3 Deliverables Checklist | # | Deliverable | Due Date | AI Advantage | |---|-------------|----------|-------------| | 1 | Wireframes & UI Design | W3 | AI-assisted prototyping | | 2 | 1st Draft (Frontend + Backend) | W7 | AI-accelerated development | | 3 | CIAM Integration | W10 | AI-generated API wrappers | | 4 | 2nd Draft + Leaderboard | W14 | AI code generation | | 5 | Reward System | W15 | AI-generated game logic | | 6 | UAT Environment | W17 | AI-generated test cases | | 7 | Bug Fixes (AI-assisted) | W19 | AI diagnostics | | 8 | Pen Test Complete | W21 | AI pre-scan | | 9 | Hardware Setup | W22 | - | | 10 | Go-Live Ready + Buffer | W22-W24 | ~2 weeks buffer | --- ## 15. 
Acceptance Criteria ### 15.1 Functional Acceptance | Module | Criteria | |--------|----------| | Authentication | 100% CIAM integration working | | Voting | All vote rules functioning correctly | | Leaderboard | Real-time sync < 5 seconds | | Rewards | Prize distribution working per config | | Admin | All management functions operational | ### 15.2 Performance Acceptance | Metric | Criteria | |--------|----------| | Load Time | < 3 seconds | | Uptime | 99.9% | | Concurrent Users | 10,000+ | | Error Rate | < 0.1% | ### 15.3 Security Acceptance | Test | Criteria | |------|----------| | Pen Test | No critical/high vulnerabilities | | OWASP Top 10 | All addressed | | Data Protection | PDPA compliant | ### 15.4 UAT Sign-off - MWG Marketing Team sign-off - MWG IT Team sign-off - All critical bugs resolved - Performance benchmarks met --- ## 16. Assumptions & Constraints ### 16.1 Assumptions 1. CIAM API will be available for integration by Feb 2026 2. Animal contestants list finalized before development 3. Prize inventory confirmed before rewards development 4. Network connectivity available at all terminal locations 5. Creative assets provided by MWG/creative agency ### 16.2 Constraints 1. **Fixed deadline:** 24 Aug 2026 readiness - non-negotiable 2. **Budget:** To be confirmed 3. **Technology:** Must integrate with existing MWG systems 4. **Language:** English mandatory, other languages optional 5. **Compliance:** PDPA, MWG security policies ### 16.3 Dependencies | Dependency | Owner | Impact if Delayed | |------------|-------|-------------------| | CIAM API access | MWG IT | Blocks authentication | | Animal content | MWG Marketing | Blocks voting UI | | Creative assets | Agency | Blocks frontend | | Prize details | MWG Marketing | Blocks rewards | | Hardware delivery | Vendor | Blocks onsite setup | --- ## 17. Risks & Mitigation ### 17.1 Risk Matrix | Risk | Probability | Impact | Mitigation | |------|-------------|--------|------------| | CIAM integration delays | Medium | High | Early engagement with IT, AI-generated mock services | | High traffic at launch | High | High | Load testing, auto-scaling, CDN | | Hardware failure onsite | Medium | Medium | Backup devices, rapid replacement SLA | | Vote manipulation attempts | Medium | High | Server-side validation, AI-powered monitoring | | Prize fraud | Low | Medium | Unique codes, anti-double redemption | | Timeline compression | Low | High | AI-accelerated development provides ~2 weeks buffer | | AI tool dependency | Low | Medium | All AI output reviewed by senior engineers; manual fallback | | AI-generated code quality | Low | Medium | Mandatory human review + AI-powered testing for all AI code | ### 17.2 Contingency Plans 1. **CIAM Delay:** AI-generated mock authentication services, swap when ready 2. **Traffic Spike:** Queue system, graceful degradation, AI-monitored auto-scaling 3. **Hardware Issue:** Remote reset, backup devices 4. **Security Breach:** AI-powered anomaly detection, incident response plan, rollback capability 5. **AI Tool Unavailability:** Team can fall back to traditional development; AI tools are productivity enhancers, not dependencies --- ## 18. 
Appendix ### 18.1 Glossary | Term | Definition | |------|------------| | CIAM | Customer Identity Access Management | | WildPass | Mandai's loyalty program for Singapore residents | | FOM | Friends of Mandai - paid membership program | | SingPass | Singapore's national digital identity | | UAT | User Acceptance Testing | | OTP | One-Time Password | ### 18.2 Reference Documents 1. RFQ - Animal Stars Voting Platform.pdf 2. Animal Stars Voting Platform_RFQ cover.docx 3. Animal Stars Voting Platform_Vendor to Quote.xlsx 4. Animal Stars Voting Platform - STCs.docx 5. MWG Info Sec Assessment Form.xlsx ### 18.3 Parks Information | Short Code | Full Name | |------------|-----------| | SZ | Singapore Zoo | | RW | River Wonders | | NS | Night Safari | | BP | Bird Paradise | | RF | Rainforest Wild | ### 18.4 Vote Allocation Summary | User Type | Daily Votes | Votes per Park | |-----------|-------------|----------------| | Registered User | 5 | 1 | | WildPass | 10 | 2 | | Friends of Mandai | 15 | 3 | ### 18.5 Contact Information | Role | Contact | Email | |------|---------|-------| | Project Owner | TBD | TBD | | Technical Lead | TBD | TBD | | Marketing Lead | TBD | TBD | --- **Document Version History** | Version | Date | Author | Changes | |---------|------|--------|---------| | 1.0 | Jan 2026 | - | Initial draft | | 2.0 | Jan 2026 | - | Updated to AI-augmented development approach | --- *This document is confidential and intended for authorized personnel only.*
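As an implementation note on the vote allocation table in Section 18.4, the tiers reduce to a small lookup. The TypeScript sketch below is illustrative only; type and field names are placeholders rather than the final data model, and the booster-code behaviour is still subject to MWG confirmation (see the Q&A document, items 2.8 - 2.10).

```
// Illustrative only: tiered daily vote allocation per Appendix 18.4.
// Type and field names are placeholders, not the final data model.
type UserType = "REGISTERED" | "WILDPASS" | "FOM";

interface VoteAllocation {
  dailyVotes: number;   // votes granted each day (resets daily at 00:00 SGT)
  votesPerPark: number; // cap per park within the daily total
}

const ALLOCATION: Record<UserType, VoteAllocation> = {
  REGISTERED: { dailyVotes: 5, votesPerPark: 1 },
  WILDPASS: { dailyVotes: 10, votesPerPark: 2 },
  FOM: { dailyVotes: 15, votesPerPark: 3 },
};

// Assumes booster codes add to the daily total (to be confirmed by MWG, Q&A items 2.8-2.10).
export function votesAvailable(userType: UserType, boosterVotes = 0): number {
  return ALLOCATION[userType].dailyVotes + boosterVotes;
}
```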
ANIMAL STARS VOTING PLATFORM - DETAILED PRICING BREAKDOWN (AI-AUGMENTED),,,,,, Apps Cyclone Technology JSC,,,,,, Date: January 2026,,,,,, Proposal Validity: 30 days,,,,,, Currency: USD,,,,,, "Base Dev Rate: $20/hour | PM & QC Overhead: 30% of dev cost | AI Productivity: ~45% reduction",,,,,, ,,,,,, ===========================================,,,,,, SECTION A: PLATFORM DEVELOPMENT (AI-AUGMENTED),,,,,, ===========================================,,,,,, ,,,,,, "Note: Hours reflect AI-augmented development approach, where AI coding assistants",,,,,, "(Claude Code, GitHub Copilot, Cursor) accelerate code generation, testing, and documentation.",,,,,, Traditional hours shown for comparison in parentheses.,,,,,, ,,,,,, Item Code,Module / Description,AI-Augmented Hours,Dev Cost ($20/hr),PM/QC (30%),Other,Total (USD) ,,,,,, A1,AUTHENTICATION & USER MANAGEMENT,,,,,, A1.1,"CIAM API Integration (Login/Sign-up/Profile) [was 160hrs]",100,$2000,$600,,$2600 A1.2,"Email OTP Verification System [was 40hrs]",24,$480,$144,,$624 A1.3,"Session Management & Security [was 32hrs]",20,$400,$120,,$520 A1.4,"Account Type Detection (Registered/WildPass/FOM) [was 24hrs]",16,$320,$96,,$416 ,Subtotal A1 (was 256 hrs),160,$3200,$960,,$4160 ,,,,,, A2,VOTING ENGINE,,,,,, A2.1,"Vote Allocation System (Tiered: 5/10/15 daily) [was 80hrs]",48,$960,$288,,$1248 A2.2,"Voting Interface (5-park grid view) [was 120hrs]",72,$1440,$432,,$1872 A2.3,"Real-time Vote Counter & Display [was 48hrs]",28,$560,$168,,$728 A2.4,"Daily Vote Reset System (00:00 SGT) [was 16hrs]",8,$160,$48,,$208 A2.5,"Booster Code System (Generation & Validation) [was 80hrs]",48,$960,$288,,$1248 A2.6,"Partner Code Management (Animal-specific) [was 40hrs]",24,$480,$144,,$624 ,Subtotal A2 (was 384 hrs),228,$4560,$1368,,$5928 ,,,,,, A3,LIVE LEADERBOARD,,,,,, A3.1,"Real-time Engine (WebSocket + Redis) [was 100hrs]",60,$1200,$360,,$1560 A3.2,"Leaderboard UI (5-park view) [was 80hrs]",48,$960,$288,,$1248 A3.3,"Vote Animation (+1 effect) [was 40hrs]",20,$400,$120,,$520 A3.4,"Embeddable iFrame Widget [was 32hrs]",16,$320,$96,,$416 A3.5,"TV Display Mode (Full-screen) [was 32hrs]",16,$320,$96,,$416 A3.6,"Responsive Design (Desktop/Mobile/Tablet) [was 60hrs]",32,$640,$192,,$832 ,Subtotal A3 (was 344 hrs),192,$3840,$1152,,$4992 ,,,,,, A4,REWARD SYSTEM,,,,,, A4.1,"Gamification UI (Spin-the-Wheel) [was 100hrs]",56,$1120,$336,,$1456 A4.2,"Prize Rules Engine (Probability-based) [was 80hrs]",48,$960,$288,,$1248 A4.3,"Prize Inventory Management [was 60hrs]",32,$640,$192,,$832 A4.4,"8-Week Prize Distribution Algorithm [was 40hrs]",24,$480,$144,,$624 A4.5,"QR Code Generation & Validation [was 48hrs]",24,$480,$144,,$624 A4.6,"Anti-double Redemption System [was 32hrs]",16,$320,$96,,$416 A4.7,"Email Notification System (Prize delivery) [was 48hrs]",24,$480,$144,,$624 A4.8,"Prize Email Templates Design [was 24hrs]",12,$240,$72,,$312 ,Subtotal A4 (was 432 hrs),236,$4720,$1416,,$6136 ,,,,,, A5,ADMIN DASHBOARD,,,,,, A5.1,"Campaign Configuration Module [was 48hrs]",24,$480,$144,,$624 A5.2,"Contestant Management (CRUD) [was 40hrs]",20,$400,$120,,$520 A5.3,"Booster Code Management Interface [was 48hrs]",24,$480,$144,,$624 A5.4,"Vote Inventory Control (Backend) [was 32hrs]",16,$320,$96,,$416 A5.5,"Real-time Analytics Dashboard [was 100hrs]",56,$1120,$336,,$1456 A5.6,"Participation Tracking Dashboard [was 40hrs]",20,$400,$120,,$520 A5.7,"Prize Issuance & Redemption Dashboard [was 48hrs]",24,$480,$144,,$624 A5.8,"Export Functionality (CSV/Excel) [was 24hrs]",12,$240,$72,,$312 A5.9,"Role-based Access 
Control [was 32hrs]",16,$320,$96,,$416 A5.10,"Audit Logging [was 24hrs]",12,$240,$72,,$312 ,Subtotal A5 (was 436 hrs),224,$4480,$1344,,$5824 ,,,,,, A6,INTEGRATION,,,,,, A6.1,"Landing Page Integration [was 32hrs]",20,$400,$120,,$520 A6.2,"Mandai App Integration (Deep link/WebView) [was 48hrs]",32,$640,$192,,$832 A6.3,"Adobe Analytics Integration [was 32hrs]",16,$320,$96,,$416 A6.4,"Google Analytics Integration [was 24hrs]",12,$240,$72,,$312 ,Subtotal A6 (was 136 hrs),80,$1600,$480,,$2080 ,,,,,, A7,TESTING & SECURITY (AI-ASSISTED),,,,,, A7.1,"UAT Environment Setup [was 32hrs]",16,$320,$96,,$416 A7.2,"UAT Test Case Development - AI-generated [was 40hrs]",16,$320,$96,,$416 A7.3,"UAT Execution Support [was 60hrs]",36,$720,$216,,$936 A7.4,Bug Fixes during UAT,0,Included,,Included,Included A7.5,Penetration Testing (Third-party),0,,,$4000,$4000 A7.6,"Security Remediation [was 40hrs]",24,$480,$144,,$624 ,Subtotal A7 (was 172 hrs),92,$1840,$552,$4000,$6392 ,,,,,, A8,DOCUMENTATION & TRAINING (AI-GENERATED),,,,,, A8.1,"Technical Documentation - AI-generated + reviewed [was 32hrs]",12,$240,$72,,$312 A8.2,"API Documentation - AI-auto-generated [was 24hrs]",8,$160,$48,,$208 A8.3,"Deployment Instructions - AI-generated [was 16hrs]",6,$120,$36,,$156 A8.4,"Admin User Manual - AI-generated + reviewed [was 24hrs]",10,$200,$60,,$260 A8.5,"FAQ Documentation - AI-generated [was 16hrs]",6,$120,$36,,$156 A8.6,"Training Sessions (2 sessions) [was 32hrs]",24,$480,$144,,$624 A8.7,"Technical Handover [was 16hrs]",8,$160,$48,,$208 ,Subtotal A8 (was 160 hrs),74,$1480,$444,,$1924 ,,,,,, ,TOTAL SECTION A: PLATFORM DEVELOPMENT (AI-AUGMENTED),1286,$25720,$7716,$4000,$37436 ,"(Traditional equivalent: 2320 hrs / $64,320)",,,,,, ,"AI SAVINGS: 1034 hrs (44.6%) / $26,884 (41.8%)",,,,,, ,,,,,, ===========================================,,,,,, SECTION B: HARDWARE & EQUIPMENT,,,,,, ===========================================,,,,,, ,,,,,, Item Code,Item Description,Unit,Quantity,Unit Price (USD),Total (USD), ,,,,,, B1,iPad 10.9" Wi-Fi 64GB (or equivalent Android tablet),Unit,15,$370,$5550, B2,Tablet Kiosk Stand (Anti-theft floor stand),Unit,15,$210,$3150, B3,43" Commercial Display (Smart TV),Unit,5,$560,$2800, B4,TV Wall Mount / Floor Stand,Unit,5,$110,$550, B5,Kiosk Management Software License,Unit,15,$35,$525, B6,Network Equipment (Router/Extender if required),Lot,1,$370,$370, B7,Spare Tablets (Backup),Unit,2,$370,$740, B8,Power Strips & Cable Management,Lot,1,$220,$220, B9,Hardware Installation - Singapore Zoo,Lot,1,$300,$300, B10,Hardware Installation - River Wonders,Lot,1,$300,$300, B11,Hardware Installation - Night Safari,Lot,1,$300,$300, B12,Hardware Installation - Bird Paradise,Lot,1,$300,$300, B13,Hardware Installation - Rainforest Wild,Lot,1,$300,$300, ,,,,,, ,TOTAL SECTION B: HARDWARE & EQUIPMENT,,,,$15405, ,,,,,, Note: Hardware prices are estimates. 
Final prices may vary at time of procurement.,,,,,, ,,,,,, ===========================================,,,,,, SECTION C: SUPPORT & MAINTENANCE (8 WEEKS),,,,,, ===========================================,,,,,, ,,,,,, Item Code,Item Description,Unit,Quantity,Unit Rate (USD),Total (USD),Notes ,,,,,, C1,Remote Technical Support (9am-6pm SGT Mon-Sun),Week,8,$520,$4160,20 hrs/wk dev + 30% PM C2,On-call Support (24/7 Critical Issues),Week,8,$300,$2400,Standby fee C3,Onsite Hardware Support (Repair/Replacement),Week,8,$400,$3200,Local SG technician C4,Platform Monitoring (Uptime & Performance),Week,8,$260,$2080,10 hrs/wk DevOps + 30% PM C5,Cloud Hosting - Compute (AWS/Azure),Month,2,$800,$1600, C6,Cloud Hosting - Database,Month,2,$400,$800, C7,Cloud Hosting - CDN & Storage,Month,2,$300,$600, C8,Bug Fixes & Minor Updates,Lot,1,Included,Included, C9,Weekly Status Reporting,Week,8,Included,Included, ,,,,,, ,TOTAL SECTION C: SUPPORT & MAINTENANCE,,,,$14840, ,,,,,, ===========================================,,,,,, PRICING SUMMARY,,,,,, ===========================================,,,,,, ,,,,,, Section,Description,,,,Amount (USD), A,Platform Development - AI-Augmented (1286 dev hrs),,,,$37436, B,Hardware & Equipment,,,,$15405, C,Support & Maintenance (8 weeks),,,,$14840, ,GRAND TOTAL,,,,$67681, ,,,,,, ===========================================,,,,,, SAVINGS COMPARISON: AI-AUGMENTED vs TRADITIONAL,,,,,, ===========================================,,,,,, ,,,,,, Metric,Traditional,AI-Augmented,Savings,Savings %,, Development Hours,2320 hrs,1286 hrs,1034 hrs,44.6%,, Development Cost,$64320,$37436,$26884,41.8%,, Total Project Cost,$94565,$67681,$26884,28.4%,, ,,,,,, ===========================================,,,,,, PAYMENT SCHEDULE,,,,,, ===========================================,,,,,, ,,,,,, Milestone,Description,Percentage,Amount (USD),Due Date,, 1,Contract Signing,30%,$20304,Upon signing,, 2,1st Draft Delivery,25%,$16920,End Apr 2026,, 3,2nd Draft Delivery,25%,$16920,End May 2026,, 4,Go-Live,15%,$10152,1 Sep 2026,, 5,Campaign Completion,5%,$3384,Nov 2026,, ,TOTAL,100%,$67680,,, ,,,,,, ===========================================,,,,,, OPTIONAL LINE ITEMS (IF ACTIVATED),,,,,, ===========================================,,,,,, ,,,,,, Item Code,Item Description,Unit,Unit Price (USD),Notes,, ,,,,,, O1,Campaign Extension,Week,$1855,Support + hosting per week,, O2,Additional Voting Terminal (Tablet + Stand),Set,$580,Per unit,, O3,Additional Leaderboard TV (Display + Mount),Set,$670,Per unit,, O4,New Design Iteration (AI-accelerated UI changes),Iteration,$1760,88 hrs AI-augmented dev,, O5,Additional Language Support (AI-assisted),Language,$1430,72 hrs AI-augmented dev,, O6,Extended Support (Beyond 8 weeks),Month,$5920,Per month,, O7,Urgency Fee (Compressed timeline),Lot,TBD,Case by case assessment,, ,,,,,, ===========================================,,,,,, SCHEDULE OF RATES (SOR) - ADDITIONAL WORK,,,,,, ===========================================,,,,,, ,,,,,, Role,Hourly Rate (USD),Daily Rate (USD) - 8hrs,,,, Project Manager,$26,$208,,,, Technical Lead + AI Architect,$25,$200,,,, AI-Augmented Senior Engineer,$20,$160,,,, AI-Augmented Engineer,$16,$128,,,, UX/UI Designer,$18,$144,,,, DevOps Engineer,$22,$176,,,, ,,,,,, ===========================================,,,,,, AI-AUGMENTED DEVELOPMENT METHODOLOGY,,,,,, ===========================================,,,,,, ,,,,,, "This pricing reflects our AI-augmented development approach where:",,,,,, "1. 
AI coding assistants (Claude Code, GitHub Copilot, Cursor) generate initial code",,,,,, 2. Senior engineers review and refine AI-generated code,,,,,, 3. AI generates comprehensive test suites (80%+ coverage target),,,,,, 4. AI auto-generates documentation from code and specifications,,,,,, 5. All AI output undergoes mandatory human review,,,,,, 6. Productivity gains (~45%) are passed directly to client as cost savings,,,,,, ,,,,,, ===========================================,,,,,, TEAM STRUCTURE (AI-AUGMENTED),,,,,, ===========================================,,,,,, ,,,,,, Role,Name,Allocation,AI Tools Used,,, Project Manager,Hoa Doan,100%,Project management AI assistants,,, Tech Lead + AI Architect,Vu Dao,100%,"Claude Code, GitHub Copilot, Cursor",,, AI-Augmented Full-stack Engineer,TBD,100%,"Claude Code, GitHub Copilot, Cursor",,, AI-Augmented Full-stack Engineer,TBD,100%,"Claude Code, GitHub Copilot, Cursor",,, UX/UI Designer,TBD,40%,AI design assistants,,, ,,,,,, "Traditional team: 8 members | AI-augmented team: 5 members",,,,,, "Each AI-augmented engineer delivers equivalent output of 2-3 traditional developers",,,,,, ,,,,,, ===========================================,,,,,, EXCLUSIONS (NOT INCLUDED),,,,,, ===========================================,,,,,, ,,,,,, Item,Description,,,,, 1,Content creation (animal descriptions and images),,,,, 2,Campaign landing page design/development,,,,, 3,Prize procurement and inventory,,,,, 4,Network infrastructure at park locations,,,,, 5,Third-party software licenses not specified,,,,, 6,Travel and accommodation outside Singapore,,,,, 7,AI tool subscription costs (covered by Apps Cyclone),,,,, ,,,,,, ===========================================,,,,,, NOTES,,,,,, ===========================================,,,,,, ,,,,,, 1,All prices in US Dollars (USD),,,,, 2,"Base development rate: $20/hour (AI-Augmented Senior Engineer)",,,,, 3,PM & QC overhead: 30% of development cost,,,,, 4,AI productivity factor: ~45% reduction in development hours,,,,, 5,Proposal valid for 30 days from submission,,,,, 6,Payment terms: 30 days from invoice date,,,,, 7,Warranty: 90 days post go-live for defect fixes,,,,, 8,Hardware prices are estimates and may vary at procurement,,,,, 9,"AI tool costs (subscriptions, API usage) are borne by Apps Cyclone, not billed to client",,,,,
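For clarity, Notes 2-4 above define the formula behind every Section A line item: AI-augmented hours multiplied by the $20/hour base rate, plus 30% PM & QC overhead, plus any third-party pass-through cost. A minimal worked sketch in TypeScript, using values taken from items A1.1 and A7.5 above (the helper name is illustrative):

```
// Illustrative pricing formula for Section A line items (see Notes 2-4 above).
const HOURLY_RATE_USD = 20;   // base AI-augmented senior engineer rate
const PM_QC_OVERHEAD = 0.30;  // PM & QC overhead as a share of dev cost

function lineItemTotal(aiHours: number, passThroughUsd = 0): number {
  const devCost = aiHours * HOURLY_RATE_USD;   // e.g. 100 hrs -> $2,000
  const overhead = devCost * PM_QC_OVERHEAD;   // e.g. $600
  return devCost + overhead + passThroughUsd;
}

console.log(lineItemTotal(100));     // 2600 -> matches A1.1 (CIAM API Integration)
console.log(lineItemTotal(0, 4000)); // 4000 -> matches A7.5 (third-party pen test)
```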
# Q&A - Questions for Mandai Wildlife Group # Animal Stars Voting Platform **Prepared by:** Apps Cyclone Technology JSC **Date:** January 2026 **Purpose:** Clarification questions for project kickoff and development planning --- ## Priority Legend - **P0 - Critical:** Blocks project start / architecture decisions - **P1 - High:** Blocks specific module development - **P2 - Medium:** Needed before UAT - **P3 - Low:** Nice to have / can be decided later --- ## 1. CIAM Integration (P0 - Critical) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 1.1 | What CIAM platform does Mandai currently use? (e.g., Okta, Auth0, Azure AD B2C, custom) | Determines integration approach and SDK to use | | 1.2 | Can you provide CIAM API documentation or Swagger/OpenAPI specs? | Required to design authentication module | | 1.3 | When will sandbox/staging CIAM environment be available for integration testing? | Blocks authentication development | | 1.4 | What is the CIAM authentication flow? (OAuth 2.0, OIDC, SAML?) | Architecture decision | | 1.5 | How does WildPass registration via SingPass work technically? Does CIAM handle the SingPass integration, or do we need to integrate with SingPass directly? | Critical for sign-up flow | | 1.6 | What user profile fields are available via CIAM? (email, name, account type, membership tier, etc.) | Required for vote allocation logic | | 1.7 | What are the session management requirements? Session timeout duration? Refresh token policy? | Security implementation | | 1.8 | Is there an existing email OTP service within CIAM, or do we need to build one? | Development scope | | 1.9 | Who is the technical contact on Mandai IT team for CIAM integration? | Communication planning | | 1.10 | Are there rate limits on CIAM API calls? | Performance planning | --- ## 2. Voting Rules & Business Logic (P0 - Critical) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 2.1 | The RFQ states vote allocation is "TBC" - can you confirm: Baseline 5/day, WildPass 10/day, Members 15/day? | Core business rule | | 2.2 | "1 vote per park" for baseline - does this mean the user must vote for exactly 1 animal per park, or can they skip parks? | Vote distribution logic | | 2.3 | Can a user allocate multiple votes to the same animal within a park? (e.g., WildPass user gives 2 votes to same animal in SZ) | Confirmed in RFQ but want to validate edge cases | | 2.4 | What time zone is used for daily vote reset? Singapore Time (SGT, UTC+8)? | System clock configuration | | 2.5 | If a user has remaining votes at end of day, do they carry over to the next day or are they lost? | Vote balance logic | | 2.6 | Can a user vote multiple times during the day (e.g., vote 3 in morning, come back and vote 2 more in afternoon)? Or must they cast all votes in one session? | Session management | | 2.7 | What happens if a user upgrades from Registered to WildPass mid-campaign? Do they get the higher vote count retroactively? | Account upgrade scenario | | 2.8 | Are booster code votes in addition to daily votes, or do they replace them? | Vote calculation | | 2.9 | Can a user use multiple booster codes per day? | Code redemption rules | | 2.10 | Is there a maximum total booster votes per code? (e.g., code gives +50 votes total, or +5 per day for 10 days?) | Code configuration | --- ## 3. 
Animal Contestants & Content (P1 - High) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 3.1 | How many animal contestants per park? (RFQ wireframe shows 5 per park) | UI grid layout | | 3.2 | Total number of contestants across all 5 parks? | Database design | | 3.3 | What content will be provided for each animal? (Name, image, description, fun facts?) | Content fields in database | | 3.4 | Will animal images/videos be provided by MWG, or should we source them? | Content responsibility | | 3.5 | Can animals be added or removed during the campaign? | Admin functionality scope | | 3.6 | Are all 5 parks confirmed? (SZ, RW, NS, BP, RF) | Park configuration | | 3.7 | What is the expected format and resolution for animal images? | Image optimization | --- ## 4. Booster Codes (P1 - High) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 4.1 | How many brand/sponsor partners are expected? | Code generation volume | | 4.2 | What is the expected total number of booster codes? | Database capacity planning | | 4.3 | Code format preference? (e.g., alphanumeric 8 chars, custom prefix like "SZ-XXXX"?) | Code generation algorithm | | 4.4 | Should codes be single-use per user, or single-use globally? | Validation logic | | 4.5 | Can a single code be used by multiple users? (e.g., a brand gives out 1 code to all their followers) | Code distribution model | | 4.6 | For partner codes that boost a specific animal: do the extra votes go ONLY to that animal, or can users choose? | Booster allocation rule | | 4.7 | Who generates the codes? MWG admin, or should we provide a self-service portal for partners? | Admin scope | | 4.8 | Do codes have expiration dates? | Code lifecycle | --- ## 5. Reward System (P1 - High) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 5.1 | Which gamification mechanism is preferred? Spin-the-wheel, digital scratch card, or gachapon? | UI development | | 5.2 | What specific prizes will be offered? (Retail vouchers, F&B vouchers, admission discounts, bonus votes, etc.) | Prize configuration | | 5.3 | What is the total prize budget / total number of prizes for the 8-week campaign? | Inventory planning | | 5.4 | Who procures the prizes? MWG or vendor? | Responsibility boundary | | 5.5 | For voucher prizes: will MWG provide voucher codes/links, or should we generate them? | Voucher integration | | 5.6 | How should prizes be credited? Email only? In-app notification? Account credit? | Delivery mechanism | | 5.7 | Is there a maximum number of prizes one user can win during the entire campaign? | Win-cap rules | | 5.8 | Can users who don't vote still view the leaderboard? | Access control | | 5.9 | For "bonus votes" prize: are these additional daily votes, or one-time extra votes? | Vote balance logic | | 5.10 | What is the desired prize redemption flow at parks? (QR scan at counter? Self-service kiosk?) | Redemption infrastructure | | 5.11 | Voucher expiry policy? (Valid during campaign only? 30 days from issue?) | Redemption rules | --- ## 6. Live Leaderboard (P1 - High) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 6.1 | Should the leaderboard show exact vote counts or only rankings/percentages? | Privacy and display considerations | | 6.2 | Should users see how their own votes contributed? 
(e.g., "+1" animation after voting) | UX feature scope | | 6.3 | Should there be an overall leaderboard (across all parks) in addition to per-park views? | Feature scope | | 6.4 | For the embeddable iFrame: what is the target page on Mandai.com? Who manages the embedding? | Integration coordination | | 6.5 | TV display leaderboard: should it auto-rotate between parks, or show all parks at once? | TV display layout | | 6.6 | Should the leaderboard be accessible without login? (Public view) | Access control | | 6.7 | Any concern about revealing vote counts in real-time? (Some campaigns hide totals to prevent strategic voting) | Business decision | --- ## 7. Onsite Hardware & Infrastructure (P1 - High) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 7.1 | Is WiFi available at all proposed terminal locations in each park? | Network requirement | | 7.2 | What are the operating hours for each park? (Night Safari has different hours) | Terminal active hours | | 7.3 | Are power outlets available at the proposed terminal locations? | Hardware setup | | 7.4 | Who is responsible for daily power-on/off of terminals? Park staff? | Operations planning | | 7.5 | Exact proposed locations for the 3 tablets and 1 TV per park? | Site survey planning | | 7.6 | Indoor or outdoor locations? (Weather protection for outdoor) | Hardware specifications | | 7.7 | Does MWG retain ownership of hardware after campaign, or is it rental? | Commercial model | | 7.8 | Is there an existing IT support team at each park for first-line troubleshooting? | Support escalation | | 7.9 | Any restrictions on hardware brands or vendors in MWG procurement policy? | Procurement | | 7.10 | TV display size preference? 43"? Larger? | Hardware specification | --- ## 8. Design & Branding (P2 - Medium) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 8.1 | Who provides the UI design? MWG creative agency, or should Apps Cyclone design? | Design responsibility | | 8.2 | Will brand guidelines / design system be provided? (Colors, fonts, logos, etc.) | Design consistency | | 8.3 | Campaign logo: will it be provided, and in what format? (SVG, PNG, etc.) | Asset requirements | | 8.4 | Desktop and mobile mockups: will the creative agency provide these, or should we create from wireframes in RFQ? | Design deliverables | | 8.5 | Multi-language requirement? English only, or also Chinese/Malay/Tamil? | Localization scope | | 8.6 | Accessibility requirements? (WCAG 2.1 level?) | Compliance | --- ## 9. Integration & Technical (P2 - Medium) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 9.1 | Landing page: who builds it? What URL/domain will it be on? | Integration planning | | 9.2 | Mandai App: what technology stack is the app built on? (React Native, Flutter, native?) | WebView integration | | 9.3 | Mandai App: do we integrate via deep link, in-app WebView, or both? | Integration approach | | 9.4 | Adobe Analytics: do you have an existing implementation? What tracking plan/spec? | Analytics setup | | 9.5 | Email service: does MWG have a preferred email provider? (SendGrid, Mailchimp, etc.) | Email delivery | | 9.6 | Email: can we send from a @mandai.com email address, or should we use our own domain? | Email branding | | 9.7 | Preferred cloud hosting provider? (AWS, Azure, GCP?) Any MWG IT requirements? | Infrastructure | | 9.8 | Data residency requirements? Must data be stored in Singapore? 
| Compliance | | 9.9 | Does MWG require source code repository access during development? (e.g., GitHub/GitLab) | Development workflow | | 9.10 | Are there any existing APIs or microservices we should integrate with beyond CIAM? | Integration scope | --- ## 10. Security & Compliance (P2 - Medium) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 10.1 | Does MWG have specific security requirements beyond pen testing? (e.g., MWG Info Sec Assessment Form) | Security compliance | | 10.2 | PDPA: what personal data will we handle? (Email, name, NRIC for SingPass?) | Data protection | | 10.3 | Data retention policy: how long should voting data be stored after campaign ends? | Data lifecycle | | 10.4 | Is there a specific pen test vendor MWG prefers, or can we engage our own? | Security testing | | 10.5 | Anti-fraud: besides server-side validation, should we implement CAPTCHA, device fingerprinting, or IP-based restrictions? | Anti-fraud scope | | 10.6 | Should we support VPN/proxy detection to prevent vote manipulation? | Security feature | --- ## 11. Operations & Support (P2 - Medium) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 11.1 | Expected peak concurrent users? (Weekends, school holidays, launch day) | Capacity planning | | 11.2 | Total expected unique voters over 8 weeks? | Scale estimation | | 11.3 | Is there a customer service team that will handle end-user queries? | FAQ documentation scope | | 11.4 | What is the escalation path for issues? (Park staff to MWG support to Vendor) | Support structure | | 11.5 | Should the admin dashboard be accessible to MWG marketing, MWG IT, or both? | Admin roles | | 11.6 | How frequently should reporting be delivered? (Daily? Weekly? Real-time dashboard?) | Reporting scope | --- ## 12. Campaign Rules & Edge Cases (P3 - Low) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 12.1 | What happens if there's a tie at the end of the campaign? | Result calculation | | 12.2 | Can MWG manually adjust vote counts? (RFQ mentions "control inventory") | Admin override | | 12.3 | Should the campaign end exactly at midnight on 31 Oct, or at park closing time? | Campaign timing | | 12.4 | Will there be weekly/midway announcements of standings? | Content planning | | 12.5 | What happens after the campaign? Redirect to a "winner" page? | Post-campaign flow | | 12.6 | Should voting be paused during any period? (e.g., system maintenance) | Downtime planning | | 12.7 | Can users who only scan QR code onsite (but don't log in) see the leaderboard? | Guest access | --- ## 13. Commercial & Project Management (P3 - Low) | # | Question | Context / Why We Need This | |---|----------|---------------------------| | 13.1 | Preferred project management tool? (Jira, Asana, Monday.com?) | Collaboration setup | | 13.2 | Preferred communication channel? (Slack, Teams, Email?) | Daily operations | | 13.3 | Sprint demo frequency preference? (Every 2 weeks? Monthly?) | Agile cadence | | 13.4 | UAT: who will be the designated testers from MWG? How many? | UAT planning | | 13.5 | Is there a specific staging/UAT environment MWG requires? | Test environment | | 13.6 | Currency preference for invoicing? (USD or SGD?) | Commercial | | 13.7 | Payment method preference? (Wire transfer, PayPal, etc.) 
| Commercial |

---

## Summary: Questions by Priority

| Priority | Count | Blocks |
|----------|-------|--------|
| **P0 - Critical** | 20 questions | Project start, architecture |
| **P1 - High** | 43 questions | Module development |
| **P2 - Medium** | 28 questions | UAT, integration |
| **P3 - Low** | 14 questions | Nice to have |
| **Total** | **105 questions** | |

---

## Recommended Discussion Agenda for Kickoff

**Session 1 - Technical Kickoff (P0 items)**
1. CIAM integration deep-dive (Q1.1 - Q1.10)
2. Voting rules confirmation (Q2.1 - Q2.10)
3. Infrastructure & hosting decisions (Q9.7, Q9.8)

**Session 2 - Product & Design (P1 items)**
1. Animal content and contestants (Q3.1 - Q3.7)
2. Booster code mechanics (Q4.1 - Q4.8)
3. Reward system design (Q5.1 - Q5.11)
4. Leaderboard requirements (Q6.1 - Q6.7)

**Session 3 - Operations & Hardware (P1-P2 items)**
1. Onsite hardware logistics (Q7.1 - Q7.10)
2. Design & branding handoff (Q8.1 - Q8.6)
3. Integration planning (Q9.1 - Q9.10)
4. Security requirements (Q10.1 - Q10.6)

---

*Prepared by Apps Cyclone Technology JSC*
*Contact: Tam Nhat Ton | Hoa Doan | Vu Dao*