Cool, I guess I now have hands-on experience getting LLMs to interact with the internet and system resources, in case folks want to hire me to do stuff like that.
Fully offline and local, with open-source models, and without high-end hardware or GPUs.
@mauve so cool!