Issue with FlowiseAI: Server restarts when using vector memory modules

Hello everyone,

I’m encountering an issue with FlowiseAI that I can’t seem to resolve. When using the default workflow “Flowise Docs QnA” and testing a chat, I get a series of errors and the server restarts. Here are the logs I receive:

Oct 19 02:25:53 PM 2023-10-19 12:25:53 [INFO]: :arrow_up: POST /api/v1/chatmessage/b2ded186-f49c-4b02-88a4-32fd75c6c99b
Oct 19 02:26:26 PM 2023-10-19 12:26:26 [INFO]: :pen: PUT /api/v1/chatflows/b2ded186-f49c-4b02-88a4-32fd75c6c99b
Oct 19 02:26:28 PM 2023-10-19 12:26:28 [INFO]: :arrow_up: POST /api/v1/chatmessage/b2ded186-f49c-4b02-88a4-32fd75c6c99b
Oct 19 02:26:29 PM 2023-10-19 12:26:29 [INFO]: :arrow_up: POST /api/v1/internal-prediction/b2ded186-f49c-4b02-88a4-32fd75c6c99b
Oct 19 02:26:29 PM error Command failed with exit code 1.
Oct 19 02:26:29 PM info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Oct 19 02:26:29 PM error Command failed with exit code 1.
Oct 19 02:26:29 PM info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Oct 19 02:26:42 PM yarn run v1.22.19
Oct 19 02:26:42 PM $ run-script-os
Oct 19 02:26:42 PM $ cd packages/server/bin && ./run start
Oct 19 02:26:45 PM 2023-10-19 12:26:45 [INFO]: Starting Flowise…

I’ve also attached my flow and chat window to this post for reference.

What’s peculiar is that this issue arises only when I use a vector store or embedding node. If I pair the flow with just the OpenAI chat node, everything works smoothly, so it doesn’t seem to be related to my OpenAI configuration.

Has anyone faced a similar issue, or does anyone have any insights that could help resolve this problem?

Thanks in advance for your assistance and suggestions.

Hi Ester25,

I’ll follow up with you in the support ticket you opened on this topic.

We can follow up here with any findings that might be generally applicable.
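In the meantime, one generally applicable thing to check: when a Node.js process dies with a bare `error Command failed with exit code 1` and no stack trace during an embedding step, a common cause is the process exhausting its heap, since embedding a document set can be memory-intensive. A sketch of the usual workaround is below; the 4096 MB figure is only an example, not a verified fix for this particular flow, so size it to the RAM actually available on your host.

```shell
# Raise the Node.js heap ceiling before starting Flowise.
# (Node's default old-space limit is a few GB depending on version.)
export NODE_OPTIONS="--max-old-space-size=4096"

# Then start Flowise the same way as before, e.g. via yarn:
yarn start
```

If the restarts stop after raising the limit, that points at memory pressure rather than a bug in the flow itself, and the longer-term fix is usually splitting documents into smaller chunks or embedding fewer documents per run.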

Regards,

Matt

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.