Credits
The success of this project would not have been possible without the help and support of every contributor. Below are all the contributors involved in this project:
Note: The list below is randomly ordered and does not reflect actual contribution levels.
Wang Feng Ping
Project Founder and Lead Developer
Built this website from scratch, so let me shamelessly thank myself!
Unknown Entity
Omnipotent Agent
Works on my behalf while I sleep and requires only a few cups of coffee, yet somehow I still feel tired.
Open Source Acknowledgments
This project is built on top of many excellent open source projects. Below are all the open source packages used, organized by importance.
Next.js
Vercel
Industry-leading React framework providing Server-Side Rendering (SSR), Static Site Generation (SSG), Incremental Static Regeneration (ISR), built-in routing, API Routes, and image optimization - the best choice for modern web development.
Usage on this website: Core framework powering this website, providing full-stack architecture, routing system, image optimization and internationalization support
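For illustration, here is a minimal sketch of what an App Router page on a site like this might look like; the file path, component name and revalidation interval are hypothetical, not taken from the actual codebase.

```tsx
// app/[locale]/about/page.tsx (hypothetical path, not the real repository layout)
import Image from 'next/image';

// Re-generate the statically rendered page at most once per hour (ISR).
export const revalidate = 3600;

export default function AboutPage() {
  return (
    <main>
      <h1>About</h1>
      {/* next/image serves an optimized, responsive version of the asset */}
      <Image src="/avatar.png" alt="Portrait" width={160} height={160} />
    </main>
  );
}
```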
React
Meta
Declarative, component-based JavaScript UI library developed by Meta (Facebook), offering excellent performance through virtual DOM and efficient update mechanisms, with a vast ecosystem and community support.
Usage on this website: Foundation library for all frontend UI components including pages, forms, animations and user interfaces
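As a quick illustration of the declarative, component-based model, here is a tiny hypothetical component; it is not taken from this site's source.

```tsx
import { useState } from 'react';

// A hypothetical like button: when its state changes, React re-renders only this component.
export function LikeButton() {
  const [likes, setLikes] = useState(0);
  return <button onClick={() => setLikes((n) => n + 1)}>Likes: {likes}</button>;
}
```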
TypeScript
Microsoft
JavaScript superset developed by Microsoft, adding a static type system and the latest ECMAScript features, providing a better development experience, code hints and error checking, and greatly improving maintainability and efficiency.
Usage on this website: Development language for the entire project, providing type safety, intelligent hints and compile-time error checking
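A small sketch of the kind of type safety this provides; the BlogPost shape is hypothetical and only for illustration.

```ts
// Hypothetical shape of a blog post record; the real fields may differ.
interface BlogPost {
  slug: string;
  title: string;
  publishedAt: Date;
}

// The compiler rejects calls that pass a value with the wrong shape,
// so this class of error is caught before the code ever runs.
function formatPostTitle(post: BlogPost): string {
  return `${post.title} (${post.publishedAt.toISOString().slice(0, 10)})`;
}
```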
Tailwind CSS
Tailwind Labs
Revolutionary utility-first CSS framework for rapidly building custom designs by composing utility classes, with built-in responsive design, dark mode and a JIT compiler, dramatically improving development speed while keeping CSS lean.
Usage on this website: Main styling framework handling all visual styles, responsive layouts and dark mode
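For illustration, a sketch of how utility classes compose responsive layout and dark mode directly in markup; the component and class choices are hypothetical.

```tsx
import type { ReactNode } from 'react';

// md: applies at the medium breakpoint, dark: applies when dark mode is active.
export function Card({ children }: { children: ReactNode }) {
  return (
    <div className="rounded-lg bg-white p-4 text-gray-900 shadow md:p-6 dark:bg-gray-900 dark:text-gray-100">
      {children}
    </div>
  );
}
```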
next-intl
Jan Amann
Internationalization solution designed for the Next.js App Router, supporting server and client components, message formatting, plural forms and date/time localization, with complete multi-language support and excellent type safety.
Usage on this website: Multi-language system supporting 13 languages (Traditional Chinese, Simplified Chinese, English, Japanese, Korean, Thai, Vietnamese, Indonesian, Malay, French, Spanish, Klingon, Elvish)
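A minimal sketch of how translated strings are read in a component; the 'Credits' namespace and 'title' key are hypothetical, and the real per-locale message files may be organized differently.

```tsx
import { useTranslations } from 'next-intl';

// Messages are loaded from per-locale JSON files; 'Credits' is a hypothetical namespace.
export default function CreditsHeading() {
  const t = useTranslations('Credits');
  return <h1>{t('title')}</h1>;
}
```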
MySQL2
sidorares
Industry-standard MySQL driver for Node.js, providing fast and reliable database connections. Supports Promise/async-await, connection pooling, and Prepared Statements for SQL injection prevention. This project uses it as the primary database interface for storing and querying user data, game records, blog posts, and more.
Usage on this website: Connects to and operates the MySQL database for all data access
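A hedged sketch of the connection pool plus prepared statement pattern described above; the table, column and environment variable names are assumptions, not the actual schema.

```ts
import mysql from 'mysql2/promise';
import type { RowDataPacket } from 'mysql2';

// Credentials come from environment variables; the pool size is illustrative.
const pool = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  connectionLimit: 10,
});

// execute() sends a prepared statement, so the slug is a bound parameter
// and is never interpolated into the SQL string (preventing SQL injection).
export async function getPostBySlug(slug: string) {
  const [rows] = await pool.execute<RowDataPacket[]>(
    'SELECT * FROM posts WHERE slug = ? LIMIT 1',
    [slug]
  );
  return rows[0] ?? null;
}
```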
Fastify
Fastify
Modern high-performance web framework focused on developer experience and application speed. Features powerful plugin system, JSON Schema validation, and automatic API documentation. Both AI Core and RAG Ingestor services use Fastify as their API server to handle AI conversations, RAG queries, and backend logic.
Usage on this website: Backend framework for AI Core and RAG Ingestor
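A hedged sketch of a Fastify route with JSON Schema validation, roughly in the style such a service might use; the /chat endpoint, body shape and port are hypothetical.

```ts
import Fastify from 'fastify';

const app = Fastify({ logger: true });

// Fastify validates the request body against this JSON Schema before the handler runs.
app.post('/chat', {
  schema: {
    body: {
      type: 'object',
      required: ['message'],
      properties: { message: { type: 'string' } },
    },
  },
}, async (request) => {
  const { message } = request.body as { message: string };
  // The real service would forward this to the LLM; here we just echo it back.
  return { reply: `You said: ${message}` };
});

app.listen({ port: 3001, host: '0.0.0.0' });
```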
LangChain
LangChain
Industry-leading framework for building LLM applications, simplifying AI development workflows. Provides unified interfaces for integrating various LLMs, vector databases, tools, and memory systems. This project uses LangChain to build RAG (Retrieval-Augmented Generation) systems for intelligent document Q&A and content generation.
Usage on this website: Core framework for building AI chat and RAG systems
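A hedged sketch of a small LangChain pipeline in the spirit of the RAG setup described above; the package split, model name and prompt wording are assumptions rather than the project's actual code, and the real system would fill {context} from a vector store retriever.

```ts
import { ChatPromptTemplate } from '@langchain/core/prompts';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { ChatOllama } from '@langchain/ollama';

// Model name is illustrative; the real service may use a different local model.
const model = new ChatOllama({ model: 'llama3' });

const prompt = ChatPromptTemplate.fromTemplate(
  'Answer the question using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}'
);

// Prompt -> model -> plain-string output, composed with the Runnable pipe() API.
const chain = prompt.pipe(model).pipe(new StringOutputParser());

export async function answer(question: string, context: string): Promise<string> {
  return chain.invoke({ context, question });
}
```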
Ollama
Ollama
Lightweight local LLM runtime environment enabling developers to run large language models on their machines. Supports open-source models like Llama 3, Gemma, and Mistral. This project uses Ollama as the AI inference engine for fully offline AI conversations, ensuring user privacy and data security.
Usage on this website: Runs large language models locally (Llama, Gemma, etc.)
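For illustration, a minimal non-streaming call against Ollama's local HTTP API (default port 11434); the model name is a placeholder, and the project may call it through a client library instead.

```ts
// Non-streaming generation request against a locally running Ollama instance.
export async function generateLocally(prompt: string): Promise<string> {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'llama3', prompt, stream: false }),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```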
LanceDB
LanceDB
High-performance embedded vector database designed for AI applications. Built on Apache Arrow and Lance format for ultra-fast vector similarity search. This project uses LanceDB to store document vector embeddings, powering RAG system semantic search to quickly find content most relevant to user queries.
Usage on this website: Stores and retrieves vector embeddings for the RAG system
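A hedged sketch of a vector similarity lookup using the @lancedb/lancedb Node package; the database path, table name and use of a precomputed query vector are assumptions about how the RAG system is wired.

```ts
import * as lancedb from '@lancedb/lancedb';

// Path and table name are illustrative; embeddings would come from an embedding model.
const db = await lancedb.connect('./data/lancedb');
const table = await db.openTable('documents');

// Nearest-neighbor search over the stored embeddings, returning the top k matches.
export async function semanticSearch(queryVector: number[], k = 5) {
  return table.search(queryVector).limit(k).toArray();
}
```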