13B chat model outputs only dots (.) in the example space

#1 opened by wvangils

I think there is something wrong with the config of the space where you showcase this new model. For all of the examples on the left, the output of the model is a single dot '.'.

NVIDIA-Eagle org

Hi,

Thank you for your interest! Could you please provide us with more details, such as the input image and the prompt you used? Additionally, are you using your own inference code, or are you using the Hugging Face space we created?

I'm using your space on Hugging Face. The issue seems to appear when running another example in the Gradio widget. For example:

[Screenshot attachment: Screenshot 2024-08-29 163149.png]

NVIDIA-Eagle org

Yeah, we have tested these cases and the output is normal. Have you cleared the history when switching to a new image?

No, after one response from the model I clicked on a new example. In that case the empty or dot-only response occurs. So users need to clear the interaction after every reply before they can select a new example?

NVIDIA-Eagle org

Yeah, the code of this demo space is still very simple. If you directly switch to a new sample, the last image and conversation will stay in the context. Since our model doesn't support multi-image input, this will cause some problems.

Please clear the history before switching to another example or uploading your own image.

Thank you for your feedback; we will try to make it better!
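
For reference, here is a minimal sketch (not the space's actual code) of how a Gradio demo could reset the chat history automatically whenever a new image is selected, so stale context never reaches a single-image model. The component names and the `clear_history` helper are assumptions for illustration only.

```python
# Minimal sketch, assuming a gr.Image input and a gr.Chatbot display.
# Not the actual demo code: it only shows the history-reset wiring.
import gradio as gr

def clear_history():
    # Return an empty chat display and an empty server-side history.
    return [], []

with gr.Blocks() as demo:
    state = gr.State([])                      # conversation history kept server-side
    image = gr.Image(type="pil", label="Input image")
    chatbot = gr.Chatbot(label="Conversation")
    prompt = gr.Textbox(label="Prompt")

    # Whenever the image changes (new upload or example), wipe the history
    # so the model never sees a previous image's conversation.
    image.change(fn=clear_history, inputs=None, outputs=[chatbot, state])

demo.launch()
```

With wiring like this, clicking a new example would clear the previous conversation automatically instead of requiring a manual "Clear" click.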
