shiny.ollama
R Shiny Interface for Chatting with LLMs Offline via Ollama
Experience seamless, private, and offline AI conversations right
on your machine! shiny.ollama
provides a user-friendly R
Shiny interface to interact with LLMs locally, powered by Ollama.
Important: shiny.ollama
requires Ollama
to be installed on your system. Without it, this package will not
function. Follow the Installation
Guide below to set up Ollama first.
install.packages("shiny.ollama")
Or install the development version from GitHub:
# Install devtools if not already installed
install.packages("devtools")
devtools::install_github("ineelhere/shiny.ollama")
Launch the Shiny app in R with:
library(shiny.ollama)
# Start the application
shiny.ollama::run_app()
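The app talks to the local Ollama server. If it launches but cannot find any models, you can check that the server is reachable directly from R. A minimal sketch, assuming Ollama's default local endpoint (http://localhost:11434) and that the jsonlite package is installed:
library(jsonlite)
# Query Ollama's /api/tags endpoint, which lists the models available locally
models <- tryCatch(
  fromJSON("http://localhost:11434/api/tags"),
  error = function(e) NULL
)
if (is.null(models)) {
  message("Ollama server not reachable - is Ollama installed and running?")
} else {
  print(models$models$name)
}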
To use this package, install Ollama first (installers for each operating system are available from the official Ollama website). Then verify the installation by running the following in a terminal:
ollama --version
If the installation was successful, the version number will be displayed.
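The app can only chat with models that are already available locally. As an illustration only (the model name below is an example, not a project requirement), a model can be downloaded and the local collection listed with the Ollama CLI:
ollama pull llama3
ollama list
ollama list shows every model currently stored on your machine.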
This R package is an independent, passion-driven open source
initiative, released under the Apache License 2.0. It is not affiliated with, owned by, funded by, or influenced by any external organization.
The project is dedicated to fostering a community of developers who
share a love for coding and collaborative innovation.
Contributions, feedback, and feature requests are always welcome!
Stay tuned for more updates. 🚀