
Commit 66576d2: fix readme (parent: 93a91c5)
2 files changed: +2 additions, -0 deletions

README.md

1 addition, 0 deletions

@@ -8,6 +8,7 @@ Welcome to the official repository for **LLM2CLIP**! This project leverages larg
 ---
 
 ## News 🚀🚀🚀
+- **[2024-11-18]** Our Caption-Contrastive fine-tuned Llama3-8B-CC is now available on [HuggingFace](https://huggingface.co/microsoft/LLM2CLIP-Llama-3-8B-Instruct-CC-Finetuned); we will try to release more versions.
 - **[2024-11-08]** We are currently training a **scaled-up** version with ten times the training dataset, along with upcoming updates: EVA ViT-E, InternVL-300M, SigCLIP-SO-400M, and more VLLM results trained with LLM2CLIP. Stay tuned for the most powerful CLIP models, and thank you for your star!
 - **[2024-11-06]** OpenAI's CLIP and EVA02's ViT base and large models are now available on [HuggingFace](https://huggingface.co/collections/microsoft/llm2clip-672323a266173cfa40b32d4c).
 - **[2024-11-01]** Our paper was accepted to the NeurIPS 2024 SSL Workshop!

docs/index.html

1 addition, 0 deletions

@@ -77,6 +77,7 @@ <h2 class="subtitle is-4">Weiquan Huang<sup>1*</sup>, Aoqi Wu<sup>1*</sup>, Yifa
 <h2 class="title is-3 has-text-centered">News <span class="icon"><i class="fas fa-rocket"></i></span></h2>
 <div class="content has-text-left">
 <ul>
+<li><strong>[2024-11-18]</strong> Our Caption-Contrastive fine-tuned Llama3-8B-CC is now available on <a href="https://huggingface.co/microsoft/LLM2CLIP-Llama-3-8B-Instruct-CC-Finetuned" target="_blank">HuggingFace</a>; we will try to release more versions.</li>
 <li><strong>[2024-11-08]</strong> We are training a scaled-up version with ten times the dataset. Updates: EVA ViT-E, InternVL-300M, SigCLIP-SO-400M, and more VLLM results. Stay tuned for the most powerful CLIP models. Thanks for your star!</li>
 <li><strong>[2024-11-06]</strong> OpenAI's CLIP and EVA02's ViT models are now available on <a href="https://huggingface.co/collections/microsoft/llm2clip-672323a266173cfa40b32d4c" target="_blank">HuggingFace</a>.</li>
 <li><strong>[2024-11-01]</strong> Our paper was accepted at the NeurIPS 2024 SSL Workshop!</li>
