Idea in 2-3 sentences

Currently, most companies use a single model provider and rarely switch, because changing providers means changing a fair amount of code. Often you don't need the best model for every request, and some models perform better at one task (e.g. math) while others are better at other tasks (e.g. coding). So what if you could route each incoming request to the best-fitting LLM for an optimal response? A minimal sketch of the idea follows below.
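
For illustration only, here is a minimal sketch of what such a router could look like. The keyword-based classifier, the model names, and the task categories are all assumptions made up for this example; a real router would use a trained classifier (or an LLM-as-judge) and actual provider model IDs.

```python
# Hypothetical sketch of request routing: pick a model per task category.
# Model names and the classifier below are illustrative, not real endpoints.

TASK_TO_MODEL = {
    "math": "provider-a/math-tuned-model",
    "coding": "provider-b/code-tuned-model",
    "general": "provider-c/general-model",
}


def classify_task(prompt: str) -> str:
    """Naive keyword-based task detection; a real router would use a trained classifier."""
    lowered = prompt.lower()
    if any(kw in lowered for kw in ("equation", "integral", "solve", "prove")):
        return "math"
    if any(kw in lowered for kw in ("python", "function", "bug", "compile")):
        return "coding"
    return "general"


def route(prompt: str) -> str:
    """Return the model that should handle this prompt."""
    return TASK_TO_MODEL[classify_task(prompt)]


if __name__ == "__main__":
    print(route("Solve this equation for x: 3x + 2 = 11"))      # math-tuned model
    print(route("Why does my Python function raise KeyError?"))  # code-tuned model
    print(route("Summarize this article for me"))                # general model
```

The point of the sketch is that the routing layer sits in front of all providers, so the application code calls one interface and the router decides which model actually serves the request.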

The current problems are:

Why are/were we excited about it

End user

Why we killed it