Mastering OpenAPI Schemas: Troubleshooting 400 Errors

Published On Wed Sep 18 2024

Unexpected 400 errors with Generated Output Schema - Google ...


Introduction

Hi there folks! I'm hoping to find an answer to a problem that's been frustrating me to no end. I'm using OpenAPI schema definitions for structured output from Gemini Flash, via both direct API calls and batch predictions. It generally works well; however, as my schemas grow larger and more complex, I start getting 400 errors with no useful context when running generative prompts. The responses are 400 errors with 'INVALID_ARGUMENT' and errorDetails=undefined, which is unbelievably hard to debug.
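For context, this is roughly the shape of the request involved. This is a minimal sketch, not my exact payload: the field names (`generationConfig.responseMimeType`, `responseSchema`) follow the public Gemini REST API for structured output, but the schema contents and prompt here are placeholders.

```python
import json

# Placeholder schema -- the real one is far larger and more deeply nested.
response_schema = {
    "type": "OBJECT",
    "properties": {
        "title": {"type": "STRING"},
        "tags": {"type": "ARRAY", "items": {"type": "STRING"}},
    },
    "required": ["title"],
}

# Minimal generateContent-style request body with a structured-output schema.
request_body = {
    "contents": [{"role": "user", "parts": [{"text": "Summarize this article."}]}],
    "generationConfig": {
        "responseMimeType": "application/json",
        "responseSchema": response_schema,
    },
}

# Serializing the body before sending makes it easy to log the exact payload
# that triggered a 400, since the error response itself carries no detail.
payload = json.dumps(request_body)
print(len(payload))
```

Logging the serialized payload on every failed call is about the only debugging handle available when errorDetails comes back undefined.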


Issue with Schema Size

Through a lot of anecdotal trial and error, it appears there may be some kind of unpublished hard limit on the size or depth of the schema you can supply. I've dug through the docs and can't find anything documenting such a limit. The input token count is still relatively small: if I copy the schema and ask for the exact same output with the schema inline in Vertex AI freeform, it works perfectly, at roughly 4,000 input tokens. The JSON-serialized schema is only about 8k characters long.
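Since the suspected limits are on size and depth, a small local probe helps correlate which schemas fail. This is a hypothetical helper of my own, not anything from the SDK; the thresholds it measures against are unknown, which is the whole problem.

```python
import json

def schema_stats(schema):
    """Return (serialized_length, max_nesting_depth, property_count) for a
    JSON schema dict -- a quick local probe for the suspected unpublished
    size/depth limits, since the 400 response itself carries no detail."""
    def depth(node):
        # Count nesting levels across dicts and lists.
        if isinstance(node, dict):
            return 1 + max((depth(v) for v in node.values()), default=0)
        if isinstance(node, list):
            return 1 + max((depth(v) for v in node), default=0)
        return 0

    def count_props(node):
        # Total number of declared property names, at any nesting level.
        total = 0
        if isinstance(node, dict):
            total += len(node.get("properties", {}))
            total += sum(count_props(v) for v in node.values())
        elif isinstance(node, list):
            total += sum(count_props(v) for v in node)
        return total

    return len(json.dumps(schema)), depth(schema), count_props(schema)

# Tiny example schema; the real failing schemas are much larger.
example = {
    "type": "OBJECT",
    "properties": {
        "items": {
            "type": "ARRAY",
            "items": {
                "type": "OBJECT",
                "properties": {"name": {"type": "STRING"}},
            },
        },
    },
}
print(schema_stats(example))
```

Running this over a set of working and failing schemas makes it possible to bisect: strip branches from a failing schema until it succeeds, and record the stats at the boundary.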

Batch Predictions

Conclusion

Thanks in advance to anybody who can help!