
[Help Wanted] Supporting the chat_inner API for existing VLMs. #323

@kennymckormick

Description


Since we now support the multi-turn benchmark MMDU, we would like to implement the chat_inner function for existing VLMs in VLMEvalKit to add support for multi-turn chatting.
Currently, we have already supported the method for the GPT series, Claude series, QwenVL APIs, qwen_chat, MiniCPM-v2.5, Idefics2_8b, llava-v1.5, and the deepseek-vl series. We need help from the community to support more models.

If you would like to help, you can refer to the development doc: https://github.com/open-compass/VLMEvalKit/blob/main/docs/en/Development.md.

Implementing the chat_inner API for three VLMs can be viewed as a major contribution. However, only implement this function if you are sure the VLM is capable of handling multi-turn input.
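As a rough illustration of what a multi-turn chat_inner might look like, here is a minimal, self-contained sketch. The DemoVLM class, its prompt-building logic, and the exact message schema (a list of role-tagged turns, each with a content list of typed items) are assumptions for illustration, not the authoritative VLMEvalKit interface; please follow the development doc linked above for the real signatures.

```python
class DemoVLM:
    """Hypothetical VLM wrapper sketching a multi-turn chat_inner.

    Assumed message format (illustrative only):
        [{'role': 'user' | 'assistant',
          'content': [{'type': 'text' | 'image', 'value': ...}, ...]},
         ...]
    """

    def chat_inner(self, message, dataset=None):
        # Flatten the multi-turn history into one role-prefixed prompt.
        lines = []
        for turn in message:
            texts = [c['value'] for c in turn['content'] if c['type'] == 'text']
            lines.append(f"{turn['role']}: {' '.join(texts)}")
        lines.append('assistant:')
        prompt = '\n'.join(lines)
        # A real model would run inference on `prompt` here; this stand-in
        # simply echoes the last user utterance back as the "response".
        last_user = [ln for ln in prompt.splitlines() if ln.startswith('user: ')][-1]
        return last_user[len('user: '):]


# Example usage with a three-turn history:
history = [
    {'role': 'user', 'content': [{'type': 'text', 'value': 'Describe the image.'}]},
    {'role': 'assistant', 'content': [{'type': 'text', 'value': 'It shows a cat.'}]},
    {'role': 'user', 'content': [{'type': 'text', 'value': 'What color is it?'}]},
]
reply = DemoVLM().chat_inner(history)
```

The key point the sketch tries to capture is that, unlike a single-turn generate path, chat_inner receives the full conversation history and must serialize every prior turn (including the model's own earlier answers) into the prompt before generating.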
