BenChaliah / Superposition-Transformer

A novel architecture that leverages autoencoders to superimpose the hidden representations of a base model and a fine-tuned model within a shared parameter space, using B-spline-based blending coefficients and autoencoders that adaptively reconstruct the original hidden states based on the input data distribution.
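The repository itself contains the authoritative implementation; as a rough illustration of the blending idea only, the sketch below shows how a scalar B-spline coefficient alpha(t) (evaluated by Cox–de Boor recursion over an open uniform knot vector) could interpolate between a base model's and a fine-tuned model's hidden states. All names (`blend_coefficient`, `superpose`, the control-point parameterisation) are hypothetical, not taken from the project, and the autoencoder reconstruction step is omitted.

```python
import numpy as np

def bspline_basis(t, knots, degree, i):
    # Cox-de Boor recursion: i-th B-spline basis function of `degree` at t.
    if degree == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    denom = knots[i + degree] - knots[i]
    if denom > 0:
        left = (t - knots[i]) / denom * bspline_basis(t, knots, degree - 1, i)
    right = 0.0
    denom = knots[i + degree + 1] - knots[i + 1]
    if denom > 0:
        right = (knots[i + degree + 1] - t) / denom * bspline_basis(t, knots, degree - 1, i + 1)
    return left + right

def blend_coefficient(t, control, degree=3):
    # alpha(t): scalar blending coefficient parameterised by spline control
    # points (these would be the learned parameters in a real model).
    n = len(control)
    knots = np.concatenate([np.zeros(degree),          # open uniform knot
                            np.linspace(0.0, 1.0, n - degree + 1),
                            np.ones(degree)])          # vector, length n+degree+1
    t = min(t, 1.0 - 1e-9)  # basis support is half-open, so clamp t=1
    return sum(c * bspline_basis(t, knots, degree, i) for i, c in enumerate(control))

def superpose(h_base, h_ft, t, control):
    # Convex combination of the two models' hidden states at position t.
    a = blend_coefficient(t, control)
    return a * h_base + (1.0 - a) * h_ft
```

Because B-spline bases form a partition of unity, setting all control points to the same value `c` yields a constant alpha(t) = c; varying the control points lets the blend shift smoothly across positions (or layers) without per-position parameters.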

Date Created 2024-10-22 (2 months ago)
Commits 13 (last one 3 days ago)
Stargazers 41 (41 this week)
Watchers 2 (0 this week)
Forks 0
License unknown
Ranking

RepositoryStats indexes 601,131 repositories; of these, BenChaliah/Superposition-Transformer is ranked #524,439 (13th percentile) for total stargazers and #488,516 for total watchers. GitHub reports the primary language for this repository as Jupyter Notebook; among repositories using this language it is ranked #14,476/17,763.

BenChaliah/Superposition-Transformer is also tagged with popular topics; for these it is ranked: deep-learning (#7,786/8,567), llm (#2,441/2,989), artificial-intelligence (#1,882/2,131), language-model (#455/520)

Star History

GitHub stargazers over time

Watcher History

GitHub watchers over time, collection started in '23

Recent Commit History

13 commits on the default branch (main) since Jan '22

Yearly Commits

Commits to the default branch (main) per year

Issue History

No issues have been posted

Languages

The only known language in this repository is Jupyter Notebook

updated: 2025-01-06 @ 09:15pm, id: 876802124 / R_kgDONELwTA