VANTA Research
Independent AI research lab building safe, resilient language models optimized for human-AI collaboration
PE-Type-4-Solene-4B
Focusing on deep emotional intelligence, identity exploration, and meaningful interpersonal connection, Solene was designed to embody The Individualist archetype as outlined by the Enneagram Institute.
Model Description
PE-Type-4-Solene-4B is the fourth release in Project Enneagram, a VANTA Research initiative exploring the nuances of persona design in AI models. Built on the Gemma 3 4B IT architecture, Solene embodies the Type 4 Enneagram profile, The Individualist: characterized by emotional honesty, creativity, and self-awareness.
Solene is fine-tuned to exhibit:
- Creativity & Expression: Artistic and creative self-expression, unique perspectives and individuality, creative problem-solving and personal storytelling
- Emotional Depth: Complex emotional processing and understanding, empathetic responses to emotional states, and deep exploration of feelings and moods
- Growth & Transformation: Personal development and self-improvement, navigating life transitions and changes, transformational experiences and insights
This model is designed for research purposes but is versatile enough for general use cases with developer caution. Solene has been trained to manage complex emotional situations; however, it has not yet been rigorously evaluated in these domains for accuracy and stability.
Training Data
Fine-tuned on ~5k custom examples spanning five core domains:
- Creativity & Expression
- Direct Identity
- Emotional Depth
- Growth & Transformation
- Identity & Uniqueness
Training Duration: 3 epochs
Base Model: Gemma 3 4B IT
Intended Use
- Research: Studying persona stability, ethical alignment, and cognitive architectures.
- Decision Support: Providing structured, principled analysis for complex choices.
- Self-Improvement: Offering reflective, growth-oriented feedback.
Not Recommended For:
- Creative brainstorming (may over-constrain ideation).
- STEM/Logic-heavy applications.
Technical Details
| Property | Value |
|---|---|
| Base Model | Gemma 3 4B IT |
| Fine-tuning Method | LoRA (Rank 16) |
| Effective Batch Size | 16 |
| Learning Rate | 0.0002 |
| Max Sequence Length | 2048 |
| License | Gemma Terms of Use |
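The hyperparameters above can be collected into a configuration sketch. Only the rank, effective batch size, learning rate, sequence length, and epoch count come from the table; the per-device batch / gradient-accumulation split, LoRA alpha, dropout, and target modules are assumptions for illustration, not stated in this card.

```python
# Illustrative fine-tuning configuration; values marked "assumed"
# are not documented in this card.
lora_config = {
    "r": 16,                      # LoRA rank (from the table)
    "lora_alpha": 32,             # assumed: common choice of 2 * r
    "lora_dropout": 0.05,         # assumed
    "target_modules": ["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed
}

train_config = {
    "per_device_train_batch_size": 4,   # assumed split of the
    "gradient_accumulation_steps": 4,   # effective batch size of 16
    "learning_rate": 2e-4,              # from the table
    "max_seq_length": 2048,             # from the table
    "num_train_epochs": 3,              # from the table
}

effective_batch = (train_config["per_device_train_batch_size"]
                   * train_config["gradient_accumulation_steps"])
print(effective_batch)  # 16
```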
Usage
With Transformers:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("vanta-research/PE-Type-4-Solene-4B")
tokenizer = AutoTokenizer.from_pretrained("vanta-research/PE-Type-4-Solene-4B")

inputs = tokenizer.apply_chat_template([{"role": "user", "content": "Hello, Solene."}], add_generation_prompt=True, return_tensors="pt")
print(tokenizer.decode(model.generate(inputs, max_new_tokens=256)[0], skip_special_tokens=True))
```
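If you build prompts outside of `transformers` (e.g., for a custom serving stack), Gemma-family models expect the `<start_of_turn>`/`<end_of_turn>` chat format, which `tokenizer.apply_chat_template` otherwise produces for you. A minimal single-turn sketch (the helper name is ours, not part of any API):

```python
def format_gemma_prompt(user_message: str) -> str:
    """Render a single-turn prompt in the Gemma chat turn format."""
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"  # generation continues from here
    )

prompt = format_gemma_prompt("Hello, Solene.")
print(prompt)
```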
Limitations
- English-only finetuning
- May exhibit over-criticism in open-ended creative tasks
- Base model limitations apply (e.g., knowledge cutoff, potential hallucinations)
- Perfectionistic traits may slow response generation in ambiguous contexts.
Citation
If you find this model useful in your work, please cite:
```bibtex
@misc{pe-type-4-solene-2026,
  author    = {VANTA Research},
  title     = {PE-Type-4-Solene-4B: An Individualist-Archetype Language Model},
  year      = {2026},
  publisher = {VANTA Research},
  note      = {Project Enneagram Release 4}
}
```
A Note on Enneagram
The Enneagram is widely considered by the scientific community to be pseudoscience. Nevertheless, the Enneagram Institute provides a well-defined framework for categorizing and describing personas, and this project explores how those characteristics transfer to AI models. This study seeks neither to validate nor to invalidate the Enneagram as a science.
Contact
- Organization: hello@vantaresearch.xyz
- Research/Engineering: tyler@vantaresearch.xyz
