~/portfolio $ whoami
Full-stack engineer with a background in Biochemistry and 5+ years building production systems, specializing in LLM orchestration and compliance-grade workflows — seeking to apply domain knowledge in medical AI, drug discovery platforms, or healthcare management systems.
Currently deep into AI tooling: local LLM fine-tuning pipelines, browser extensions, and the intersection where backend systems meet language models.
| Stack | Proficiency |
| --- | --- |
| Laravel / PHP | Primary |
| React | Primary |
| MySQL | Primary |
| LLM / MLX | Growing |
| Node / JS | Fluent |
| Python | Fluent |
Designed an AI Orchestration Layer managing the full LLM workflow — data preparation, prompt construction, model routing (local vs. API), output parsing, validation guardrails, versioning, and human-in-the-loop review — projecting a ~33% reduction in clinician documentation time per reporting cycle.
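The routing step can be sketched roughly as below — a minimal illustration, not the production implementation. The PHI flag, token budget, and threshold values are all hypothetical stand-ins for whatever guardrails the real layer applies:

```typescript
// Minimal routing sketch (illustrative only): requests flagged as containing
// PHI, or small enough for the local model, stay on-device; everything else
// goes to a hosted API. Field names and thresholds are hypothetical.
type Route = "local" | "api";

interface LlmRequest {
  prompt: string;
  containsPhi: boolean; // set upstream by the data-preparation step
  maxTokens: number;
}

const LOCAL_TOKEN_BUDGET = 2048; // hypothetical local-model budget

function routeRequest(req: LlmRequest): Route {
  // Compliance guardrail: PHI never leaves the machine.
  if (req.containsPhi) return "local";
  // Cost guardrail: short requests run locally, long ones go to the API.
  return req.maxTokens <= LOCAL_TOKEN_BUDGET ? "local" : "api";
}
```

The point of isolating routing as a pure function is that the compliance rule (PHI stays local) is testable without touching either model backend.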
End-to-end pipeline: export conversation data from multiple AI platforms, score, filter, clean, desensitize, convert, and fine-tune. Architecture insight: scoring before desensitization prevents placeholder leakage. Working adapter: local_claude_final2.
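The ordering insight above can be shown in a small sketch: scoring runs on raw text, desensitization runs last, so placeholder tokens never influence the score or leak into the kept samples' quality signal. The scoring heuristic and placeholder format here are toy stand-ins, not the real pipeline's:

```typescript
// Sketch of score-before-desensitize ordering. The length-based scorer and
// email regex are hypothetical placeholders for the real scoring and
// desensitization steps.
interface Sample {
  text: string;
  score?: number;
}

function scoreSample(s: Sample): Sample {
  // Toy heuristic: longer answers score higher (stand-in for a real scorer).
  return { ...s, score: Math.min(1, s.text.length / 100) };
}

function desensitize(s: Sample): Sample {
  // Replace anything that looks like an email with a placeholder.
  return { ...s, text: s.text.replace(/\S+@\S+/g, "[EMAIL]") };
}

function prepare(samples: Sample[], minScore: number): Sample[] {
  return samples
    .map(scoreSample)                        // 1. score raw text
    .filter((s) => (s.score ?? 0) >= minScore) // 2. filter low quality
    .map(desensitize);                       // 3. desensitize last
}
```

Running desensitization first would let `[EMAIL]`-style placeholders inflate or distort the scores — the leakage the architecture note calls out.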
Browser extension injecting branch-marking UI into Claude, Gemini, ChatGPT, and Grok simultaneously, enabling precise branch marking, collapsing, and navigation across hundreds of messages. Platform-specific adapters handle DOM differences across four distinct interfaces.
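The adapter idea can be sketched as a common interface that hides each platform's DOM shape from the branch-marking UI. The selectors and dataset keys below are hypothetical placeholders, not the extension's actual ones:

```typescript
// Platform-adapter sketch: the UI only ever talks to PlatformAdapter;
// each adapter encapsulates one chat interface's DOM conventions.
// All selectors/attribute names here are made up for illustration.
interface MessageEl {
  dataset: Record<string, string>;
}

interface PlatformAdapter {
  platform: string;
  messageSelector: string; // CSS selector matching one message node
  extractRole(el: MessageEl): "user" | "assistant";
}

const claudeAdapter: PlatformAdapter = {
  platform: "claude",
  messageSelector: "[data-author]", // hypothetical
  extractRole: (el) => (el.dataset.author === "human" ? "user" : "assistant"),
};

const chatgptAdapter: PlatformAdapter = {
  platform: "chatgpt",
  messageSelector: "[data-role]", // hypothetical
  extractRole: (el) => (el.dataset.role === "user" ? "user" : "assistant"),
};

// Registry keyed by hostname match; the rest of the extension is
// platform-agnostic.
const adapters: Record<string, PlatformAdapter> = {
  claude: claudeAdapter,
  chatgpt: chatgptAdapter,
};
```

Adding a fourth or fifth platform then means writing one adapter object, not touching the branch-marking logic.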
Two-stage classification: Claude API (Haiku) for topic judgment, GPT for format summarization. Each model used where it outperforms the other. Designed to run before data desensitization to prevent placeholder contamination in training sets.
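The two-stage flow can be sketched with stubbed model calls — the cheap topic gate plays the Haiku role, the summarization pass plays the GPT role. Both stub signatures are assumptions, placeholders for real API clients:

```typescript
// Two-stage classification sketch: an inexpensive model judges topic first,
// and only samples that pass the gate reach the more expensive summarizer.
// judgeTopic/summarize are hypothetical stand-ins for real API calls.
type TopicJudge = (text: string) => Promise<string>;
type Summarizer = (text: string, topic: string) => Promise<string>;

async function classify(
  text: string,
  judgeTopic: TopicJudge,
  summarize: Summarizer,
  keepTopics: Set<string>,
): Promise<string | null> {
  const topic = await judgeTopic(text); // cheap model first
  if (!keepTopics.has(topic)) return null; // drop off-topic samples early
  return summarize(text, topic); // expensive model only on keepers
}
```

The gate-then-summarize shape is what keeps cost down: the expensive model never sees samples the cheap one rejects.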
Break tasks by complexity before touching code. Know the blast radius before you dig.
File by file. Line by line. Fresh context when things go in circles. No spaghetti.
Production first, polish second. Things that run matter more than things that look good in dev.
CHENGHONG_MENG_UNSTOPPABLE_FOUNDATION
LLAMA_3_PERSONALITY_SYNC_COMPLETE
TARGET_LOCKED: NEXT_OPPORTUNITY
You found it. Now hire me.