like-gold · 10mo ago

Error With MUI Inferencer

When running the MUI Inferencer, I am getting an error stating:

Exception:

RangeError: Maximum call stack size exceeded

My API did properly send back the requested data from the endpoint, and when I press "Show code" the generated code seems correct, as it identifies the proper attributes. I am not sure why it throws this error and fails to render the screen.
6 Replies
kapa.ai · 10mo ago
Hi @nick_montoya. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:

Hmm, I don't know enough to give you a confident answer yet. The information extracts provided do not contain specific details about the error you're encountering. However, a "Maximum call stack size exceeded" error typically occurs when there's a recursive call with no exit condition, causing the stack to overflow. You might want to check your code for any potential infinite loops or recursive calls. For more detailed help, you could refer to the official Refine documentation (https://refine.dev/docs) or the API reference for the MUI Inferencer (https://refine.dev/docs/api-reference/mui/components/inferencer).

Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
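The stack-overflow behavior kapa.ai describes can be reproduced with a minimal sketch. The `recurse` function below is purely illustrative and is not anything from Refine or MUI:

```typescript
// Minimal illustration of how unbounded recursion produces the
// "Maximum call stack size exceeded" error in a JS/TS runtime.
function recurse(): never {
  return recurse(); // no base case, so the call stack keeps growing
}

try {
  recurse();
} catch (e) {
  // In V8-based runtimes (Node, Chrome) this is a RangeError.
  console.log((e as Error).name);
}
```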
like-gold · 10mo ago
This is also part of the error in the console
extended-salmon · 10mo ago
Hello @nick_montoya, thanks for the issue. We will fix this as soon as possible
old-apricot · 10mo ago
Can you also share the generated code, if it's something you can share with us? 🙏 We run the generated code directly; if the error isn't coming from the runner, then it's coming from the generated code 😅
like-gold · 10mo ago
Thank you @alicanerdurmaz and @aliemirs. I ended up fixing the issue by updating my backend to send `id` instead of `orgId` and the like, and now it seems to work well. I noticed the inferencer can't detect objects too well, but I will post that as a separate issue.
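For anyone hitting the same problem: Refine identifies records by an `id` field, which is why renaming the backend key fixed this. If changing the backend is not an option, the records can also be remapped client-side. A minimal sketch, assuming the `orgId` key mentioned in this thread; the `BackendOrg` type and its other fields are hypothetical:

```typescript
// Sketch: remap a backend record so it exposes the `id` field Refine expects.
// `BackendOrg` and the `name` field are illustrative assumptions; only
// `orgId` comes from this thread.
type BackendOrg = { orgId: number; name: string };

const normalize = (record: BackendOrg) => ({
  ...record,
  id: record.orgId, // Refine identifies records by `id`
});

console.log(normalize({ orgId: 1, name: "Acme" }).id); // 1
```

In a real setup this remapping would typically live in a custom data provider, applied to the responses of methods like `getList` and `getOne`, so components and the inferencer receive normalized records.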