YVIC Research Lab

Independent research on representation geometry, compact language models, and interface-level control in human–LLM interaction.

Contact: victor@yvic.dev · Taipei (UTC+8)
Links: GitHub · Preprints

About

YVIC Research Lab investigates how human-facing inputs constrain and organize internal representation states in large language models. Through empirical studies across architectures and scales, the lab examines interface-level control, representation geometry, and the stability of model behavior under deployment and efficiency constraints.

Research interests

  • Representation geometry in language models
  • Prefix-based control and interface-level analysis
  • Compact and on-device LLMs (distillation, efficiency)
  • Robustness under competing directives / interference

L (on-device system)

L is an offline, on-device language model system built as a research artifact. It is used to study how representation structure and interface-level control behave under compression, deployment constraints, and real-world human interaction.

Offline on-device inference

Quantized LLMs (GGUF) + local workflows for privacy-preserving use
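
As an illustration of the local workflow, here is a minimal sketch of offline inference against a quantized GGUF model, assuming the llama-cpp-python bindings; the model path and prompt are placeholders, not L's actual configuration:

    # Minimal offline inference sketch using llama-cpp-python (assumed dependency).
    # The model path is a placeholder for any locally stored GGUF file.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/compact-llm.Q4_K_M.gguf", n_ctx=2048)

    out = llm(
        "Summarize the note in one sentence:\n<note text here>\n",
        max_tokens=64,
    )
    print(out["choices"][0]["text"])  # runs entirely on-device; no network calls

The point of the sketch is the deployment shape: a quantized model file on local storage, loaded and queried without any network dependency.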

Interface evidence

Real interaction traces used to motivate and validate interface-level analysis
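
A minimal sketch of how such traces might be recorded for later analysis; the field names are hypothetical and do not reflect the lab's actual schema:

    # Hypothetical interaction-trace record; field names are illustrative only.
    from dataclasses import dataclass, asdict
    import json
    import time

    @dataclass
    class InteractionTrace:
        timestamp: float   # when the exchange occurred
        prefix: str        # interface-level control prefix applied, if any
        prompt: str        # user-facing input
        response: str      # model output
        model_id: str      # which compact model produced the response

    def append_trace(path: str, trace: InteractionTrace) -> None:
        # Append one record as a JSON line, suitable for offline analysis.
        with open(path, "a", encoding="utf-8") as f:
            f.write(json.dumps(asdict(trace)) + "\n")

    append_trace(
        "traces.jsonl",
        InteractionTrace(time.time(), "", "example prompt", "example reply", "L-compact-q4"),
    )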

Note: L is presented here as a research artifact, not as a commercial product.

Current projects

Distillation for Compact LLMs

Manuscript — under review / revision
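
For orientation, a minimal sketch of the standard soft-target distillation objective (temperature-scaled KL divergence, after Hinton et al., 2015); this is the textbook formulation, not necessarily the method used in the manuscript:

    # Generic knowledge-distillation loss sketch (PyTorch assumed); not the manuscript's method.
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits: torch.Tensor,
                          teacher_logits: torch.Tensor,
                          temperature: float = 2.0) -> torch.Tensor:
        # KL divergence between temperature-softened teacher and student
        # distributions, rescaled by T^2 to keep gradient magnitudes comparable.
        t = temperature
        log_p_student = F.log_softmax(student_logits / t, dim=-1)
        p_teacher = F.softmax(teacher_logits / t, dim=-1)
        return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (t * t)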