I am checking out ollama (the llama2 model for now) on my laptop, and it's taking way too much time to process a simple prompt, around 3 minutes. Is this because I don't have a GPU?
Would it be possible to deploy this on an OpalStack VPS (2 GB) and still get fast response times?