Overview
Below you'll find a collection of end-to-end tutorials and guides to help you get started running Infernet applications locally.

Hello, World!
Infernet's version of a "Hello, World!" program.

Running An ONNX Model
Use ONNX Runtime to run a pre-trained model with Infernet.

Running A Torch Model
Load a PyTorch model and run it with Infernet.

TGI with Mistral-7B
Integrate Infernet with a Text Generation Inference (TGI) service running Mistral-7B.

Prompt to NFT
Run a diffusion model to generate an NFT from a prompt.

Running GPT-4
Learn how to integrate Infernet with OpenAI's GPT-4.