Tech Stack for Building, Evaluating, and Deploying your LLM Application
Updated Nov 14, 2024 - TypeScript
Build reliable, secure, and production-ready AI apps easily.
A lightweight library that extends the instructor library for LLM-based tasks.
Kubernetes Configs for Portkey Gateway deployment
This repo contains the code, images, report, and slides for the project of the course `MTH535A: An Introduction To Bayesian Analysis` at IIT Kanpur during the academic year 2022-2023.
The application uses GPT-3.5 Turbo, making calls to the large language model (LLM) through Portkey for efficient and accurate translations. The project also uses Vite to substitute environment variables and to avoid Cross-Origin Resource Sharing (CORS) issues by means of the Vite server proxy.
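The repo's actual Vite configuration is not shown here, but a minimal `vite.config.ts` along these lines would cover both points (env-variable substitution and a dev-server proxy). The `/api` proxy path, the gateway URL, and the variable names are assumptions for illustration, not the project's real values.

```ts
// vite.config.ts — a minimal sketch, assuming a Portkey-style gateway endpoint.
import { defineConfig, loadEnv } from "vite";

export default defineConfig(({ mode }) => {
  // Load variables from .env files so they can be substituted at build time.
  const env = loadEnv(mode, process.cwd(), "");

  return {
    server: {
      proxy: {
        // Forward /api/* requests through the dev server to the gateway,
        // so the browser never issues a cross-origin request itself.
        "/api": {
          target: "https://api.portkey.ai", // assumed gateway URL
          changeOrigin: true,
          rewrite: (path) => path.replace(/^\/api/, ""),
        },
      },
    },
    define: {
      // Hypothetical compile-time substitution of an environment variable.
      __PORTKEY_API_KEY__: JSON.stringify(env.PORTKEY_API_KEY),
    },
  };
});
```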