Description:
I'm encountering an issue after updating the LlamaIndex version in our project (SEC Insights): sub-questions are no longer displayed on the frontend.
Steps to Reproduce:
1. Update LlamaIndex to the latest version.
2. Use an OpenAIAgent with a SubQuestionQueryEngine as one of its tools.
3. Trigger sub-questions during a query (see the reproduction sketch below).
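For reference, here is a minimal reproduction sketch of the wiring. It is not our exact SEC Insights setup: the document text and tool names are placeholders, the LlamaDebugHandler is only a stand-in used to check which event types fire, and it assumes OPENAI_API_KEY is set in the environment.

```python
from llama_index.agent.openai import OpenAIAgent
from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.core.callbacks import CallbackManager, CBEventType, LlamaDebugHandler
from llama_index.core.query_engine import SubQuestionQueryEngine
from llama_index.core.tools import QueryEngineTool, ToolMetadata

# LlamaDebugHandler is a stand-in for our ChatCallbackHandler: it records every
# callback event by type, so it shows whether SUB_QUESTION events fire at all.
debug_handler = LlamaDebugHandler(print_trace_on_end=True)
Settings.callback_manager = CallbackManager([debug_handler])

# Placeholder document; the real app indexes SEC filings.
index = VectorStoreIndex.from_documents(
    [Document(text="Revenue grew 10% in FY2023. Key risks include ...")]
)
doc_tool = QueryEngineTool(
    query_engine=index.as_query_engine(),
    metadata=ToolMetadata(
        name="filing",
        description="Answers questions about a single SEC filing.",
    ),
)

# The SubQuestionQueryEngine decomposes a query into sub-questions and is
# expected to emit CBEventType.SUB_QUESTION start/end events for each of them.
sub_question_engine = SubQuestionQueryEngine.from_defaults(
    query_engine_tools=[doc_tool],
    use_async=True,
)

agent = OpenAIAgent.from_tools(
    tools=[
        QueryEngineTool(
            query_engine=sub_question_engine,
            metadata=ToolMetadata(
                name="compare_filings",
                description="Use for questions that need multiple sub-questions.",
            ),
        )
    ],
    callback_manager=Settings.callback_manager,
    verbose=True,
)

agent.chat("What were the revenue trends and the main risk factors?")

# Expected: one start/end pair per generated sub-question.
print(debug_handler.get_event_pairs(CBEventType.SUB_QUESTION))
```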
Expected Behavior: A CBEventType.SUB_QUESTION event should be fired for each sub-question and received by ChatCallbackHandler, allowing the sub-questions and their responses to be streamed and displayed on the frontend.
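For context, here is a simplified, hypothetical sketch of how such a handler hooks into these events. The class name and the plain list collection are stand-ins for SEC Insights' ChatCallbackHandler, which streams the pairs to the frontend instead of storing them.

```python
from typing import Any, Dict, List, Optional

from llama_index.core.callbacks.base_handler import BaseCallbackHandler
from llama_index.core.callbacks.schema import CBEventType, EventPayload


class SubQuestionStreamingHandler(BaseCallbackHandler):
    """Hypothetical stand-in for ChatCallbackHandler: collects sub-question
    question/answer pairs instead of streaming them to the frontend."""

    def __init__(self) -> None:
        super().__init__(event_starts_to_ignore=[], event_ends_to_ignore=[])
        self.sub_question_pairs: List[Any] = []

    def on_event_start(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        parent_id: str = "",
        **kwargs: Any,
    ) -> str:
        return event_id

    def on_event_end(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        **kwargs: Any,
    ) -> None:
        # This is the hook that stopped firing: each answered sub-question
        # should arrive as a SUB_QUESTION event whose payload holds the
        # question/answer pair under EventPayload.SUB_QUESTION.
        if event_type == CBEventType.SUB_QUESTION and payload is not None:
            self.sub_question_pairs.append(payload.get(EventPayload.SUB_QUESTION))
            # The real handler serializes the pair and streams it to the client.

    def start_trace(self, trace_id: Optional[str] = None) -> None:
        pass

    def end_trace(
        self,
        trace_id: Optional[str] = None,
        trace_map: Optional[Dict[str, List[str]]] = None,
    ) -> None:
        pass
```

The on_event_end hook for CBEventType.SUB_QUESTION is the piece that no longer runs after the update.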
Observed Behavior: After the update, the CBEventType.SUB_QUESTION event is never triggered, so sub-question answers are not streamed to the frontend.
Additional Context: This issue affects our ability to handle sub-questions in a live environment. Has anyone else encountered this problem, or is there a workaround?
Environment:
llama-index: "^0.11.13"
llama-index-agent-openai: "^0.3.4"