Labels: Feature Request (extra attention is needed)
Description
Since we now support the multi-turn benchmark MMDU, we would like to implement the chat_inner function for existing VLMs in VLMEvalKit to add support for multi-turn chatting.
Currently, chat_inner is already supported for: the GPT series, Claude series, QwenVL APIs, qwen_chat, MiniCPM-v2.5, Idefics2_8b, llava-v1.5, and the deepseek-vl series. We need help from the community to support more models.
If you would like to help, you can refer to the development doc: https://github.com/open-compass/VLMEvalKit/blob/main/docs/en/Development.md.
Implementing the chat_inner API for three models can be viewed as a major contribution. However, only implement this function if you are sure the VLM is capable of handling multi-turn input.
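For contributors unfamiliar with the multi-turn format, the sketch below illustrates the general shape of what a chat_inner implementation has to do: take a list of role-tagged turns and flatten them into whatever the underlying model's chat API expects. The message structure shown here (a list of `{'role', 'content'}` dicts whose `content` is a list of `{'type', 'value'}` items) is an assumption for illustration; check the development doc linked above for the exact interface your model wrapper should follow.

```python
# Hypothetical helper illustrating the core of a chat_inner-style method.
# The message format is assumed, not taken from VLMEvalKit's source:
#   messages = [{'role': 'user'|'assistant',
#                'content': [{'type': 'text'|'image', 'value': ...}, ...]}, ...]

def build_multi_turn_prompt(messages):
    """Flatten a multi-turn conversation into a single text prompt,
    collecting image paths separately (a common pattern for VLMs whose
    underlying chat call takes text plus a list of images)."""
    lines, images = [], []
    for msg in messages:
        parts = []
        for item in msg['content']:
            if item['type'] == 'text':
                parts.append(item['value'])
            elif item['type'] == 'image':
                # Keep a placeholder token in the text and record the path.
                images.append(item['value'])
                parts.append('<image>')
        lines.append(f"{msg['role'].upper()}: {' '.join(parts)}")
    return '\n'.join(lines), images


if __name__ == '__main__':
    history = [
        {'role': 'user', 'content': [
            {'type': 'image', 'value': 'img1.jpg'},
            {'type': 'text', 'value': 'Describe this.'}]},
        {'role': 'assistant', 'content': [
            {'type': 'text', 'value': 'A cat.'}]},
    ]
    prompt, imgs = build_multi_turn_prompt(history)
    print(prompt)   # USER: <image> Describe this. / ASSISTANT: A cat.
    print(imgs)     # ['img1.jpg']
```

A real chat_inner would pass the flattened prompt and image list to the model's generation call and return only the new assistant reply, but that part is model-specific.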