Show HN: Adaptive-classifier – text classification with continuous learning

Hi HN! I've built a Python library that lets you create text classifiers that can continuously learn and adapt to new classes without retraining from scratch.

What makes this different from typical text classifiers:

- Dynamically add new classes at any time without full retraining
- Combines neural networks with prototype learning for better few-shot performance
- Uses Elastic Weight Consolidation (EWC) to prevent catastrophic forgetting
- Works with any HuggingFace transformer as the base model
- Memory-efficient through prototype-based storage

You can try it out in under a minute:

pip install adaptive-classifier

from adaptive_classifier import AdaptiveClassifier

# Initialize with any HuggingFace model
classifier = AdaptiveClassifier("bert-base-uncased")

# Add initial examples
texts = ["Great product!", "Terrible experience", "Average performance"]
labels = ["positive", "negative", "neutral"]
classifier.add_examples(texts, labels)

# Make predictions
print(classifier.predict("This is amazing!"))
# [('positive', 0.85), ('neutral', 0.10), ('negative', 0.05)]

# Add a completely new class later
classifier.add_examples(
    ["Error 404", "System crashed"],
    ["technical", "technical"]
)
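If you're curious how prototype learning makes that last step cheap, here is a minimal toy sketch (plain numpy, not the library's actual code; the 2-D vectors stand in for transformer embeddings, and the function names are illustrative). Each class is summarized by the mean of its example embeddings, and prediction is just similarity to each prototype, so adding a class means adding one more prototype rather than retraining anything:

```python
import numpy as np

def build_prototypes(embeddings, labels):
    """Average each class's example embeddings into one prototype vector."""
    protos = {}
    for lab in set(labels):
        members = [e for e, l in zip(embeddings, labels) if l == lab]
        protos[lab] = np.mean(members, axis=0)
    return protos

def predict(embedding, protos):
    """Rank classes by cosine similarity between the input and each prototype."""
    scores = {
        lab: float(np.dot(embedding, p) /
                   (np.linalg.norm(embedding) * np.linalg.norm(p)))
        for lab, p in protos.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Toy 2-D "embeddings" standing in for real transformer outputs
emb = [np.array([1.0, 0.1]), np.array([0.9, 0.0]), np.array([0.0, 1.0])]
labs = ["positive", "positive", "negative"]
protos = build_prototypes(emb, labs)

print(predict(np.array([1.0, 0.0]), protos)[0][0])  # prints "positive"

# A "new class" is just one more entry in the prototype dict -- no retraining
protos["technical"] = np.array([-1.0, 0.2])
```

This is also why the storage stays memory-efficient: per class you keep one prototype vector (plus a bounded set of examples), not the full training set.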

The library came out of my work building a model router for optillm, where the set of approaches and model types keeps changing. Traditional classifiers require full retraining whenever a new class is added, which becomes impractical with large datasets.
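For those unfamiliar with Elastic Weight Consolidation: the standard EWC trick (from the Kirkpatrick et al. continual-learning literature, sketched here generically rather than as this library's implementation) adds a quadratic penalty that makes it expensive to move parameters that were important for previously learned classes, while leaving unimportant parameters free to adapt:

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=1.0):
    """EWC regularizer: 0.5 * lam * sum_i F_i * (theta_i - theta*_i)^2.

    `fisher` is a diagonal Fisher-information estimate of how much each
    parameter mattered for the old task; big entries anchor that weight."""
    return 0.5 * lam * np.sum(fisher * (params - old_params) ** 2)

# Parameters after learning the old classes, with per-weight importance
old = np.array([1.0, -2.0, 0.5])
fisher = np.array([10.0, 0.01, 5.0])

safe = np.array([1.0, 3.0, 0.5])    # only the low-importance weight moved
risky = np.array([3.0, -2.0, 0.5])  # a high-importance weight moved

print(ewc_penalty(safe, old, fisher))   # 0.125 -- cheap, little forgetting risk
print(ewc_penalty(risky, old, fisher))  # 20.0  -- heavily penalized
```

During training on new classes this penalty is added to the task loss, so gradient descent prefers updates like `safe` over updates like `risky`, which is what keeps the old classes from being forgotten.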

Source code: https://github.com/codelion/adaptive-classifier

I'd love to hear your thoughts and feedback! Happy to answer any questions about the implementation details, use cases, or future plans.


Comments URL: https://news.ycombinator.com/item?id=42786817
