Speed up open source LLM-serving with llama-cpp-python | PyCon Lithuania 2024

The talk

Abstract

Prerequisites

Description

Speakers

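The abstract and description are not reproduced in this listing. As a hedged illustration only (not material from the talk itself), the sketch below shows the minimal llama-cpp-python call path for local inference with a GGUF model; the model path is a placeholder, and GPU offloading applies only when the package was built with a GPU backend.

from llama_cpp import Llama

# Placeholder path: any locally downloaded GGUF model file works here.
llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",
    n_ctx=4096,        # context window size in tokens
    n_gpu_layers=-1,   # offload all layers to the GPU if a GPU backend is available
)

# OpenAI-style chat completion, served entirely in-process.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarise llama.cpp in one sentence."}],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])

For serving over HTTP, the package also ships an OpenAI-compatible server (python -m llama_cpp.server --model <path-to-gguf>), which is a common starting point for measuring serving throughput.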