We share our work. By spreading knowledge, we learn and advance at a faster pace.
A state-of-the-art model for zero-shot multilingual text classification. Over 250,000 downloads/month.
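The specific checkpoint is not named here, so the snippet below is only a minimal sketch of how a zero-shot multilingual classifier of this kind is typically used through the Hugging Face transformers pipeline; the model identifier is a placeholder, not the actual published name.

```python
from transformers import pipeline

# Placeholder model identifier for illustration; substitute the actual
# checkpoint published on the Hugging Face Hub.
classifier = pipeline(
    "zero-shot-classification",
    model="org/multilingual-zero-shot-model",  # hypothetical name
)

# The model scores arbitrary candidate labels without task-specific training,
# and works across languages (here, a Spanish input with English labels).
result = classifier(
    "El nuevo teléfono tiene una batería que dura dos días.",
    candidate_labels=["technology", "sports", "politics"],
)
print(result["labels"][0], result["scores"][0])
```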
A GPT-2 model fine-tuned on the Alpaca dataset for fast and easy experimentation with instruction tuning. Over 3,000 downloads/month.
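As a usage sketch, assuming the model is loaded like any other causal language model from the Hub and follows the standard Alpaca prompt template (both assumptions, since the checkpoint name and exact template are not given here):

```python
from transformers import pipeline

# Hypothetical checkpoint name; replace with the published GPT-2/Alpaca model.
generator = pipeline("text-generation", model="org/gpt2-alpaca")

# Standard Alpaca-style prompt: an instruction followed by a response marker.
prompt = (
    "### Instruction:\nWrite a haiku about open-source models.\n\n"
    "### Response:\n"
)
output = generator(prompt, max_new_tokens=64, do_sample=True, top_p=0.9)
print(output[0]["generated_text"])
```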
A Python library for personalizing the Stable Diffusion generative model with the user's aesthetic preferences. 600+ stars.
Nowcasting (short-term prediction) of solar irradiance, combined with wind data (direction and speed), to optimize solar panel positioning.
Improvements of up to 30% (depending on the sensor) over the persistence baseline model.
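For context, a persistence baseline simply forecasts the next value as the last observed one. The sketch below (with made-up irradiance values and a relative-error metric chosen for illustration; the actual evaluation setup is not described here) shows how improvement over persistence is typically measured:

```python
import numpy as np

def persistence_forecast(series: np.ndarray, horizon: int = 1) -> np.ndarray:
    """Forecast irradiance at t + horizon as the value observed at t."""
    return series[:-horizon]

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.mean(np.abs(y_true - y_pred)))

# Hypothetical irradiance measurements (W/m^2), for illustration only.
irradiance = np.array([610.0, 620.0, 640.0, 600.0, 580.0, 590.0])
horizon = 1

baseline_pred = persistence_forecast(irradiance, horizon)
targets = irradiance[horizon:]
baseline_mae = mae(targets, baseline_pred)

# A trained nowcasting model would be scored the same way, and the reported
# gain is the relative error reduction: improvement = 1 - model_mae / baseline_mae
print(f"Persistence MAE: {baseline_mae:.1f} W/m^2")
```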