Question Details

No question body available.

Tags

r shiny large-language-model

Answers (1)

Accepted Answer
September 18, 2025 · Score: 6 · Rep: 21,710

You could check the streaming markdown interface from {shinychat}: output_markdown_stream() + markdown_stream().

There's a complete example of this in {ellmer}'s "Streaming and async APIs" vignette (the 2nd code block under the Shiny example, at the time of writing), which also makes use of shiny::ExtendedTask to avoid blocking other users and the rest of the app.

Here's a slightly modified version of that example for reference. For {ellmer} experiments we can use models from the GitHub Marketplace for free (API rate limits do apply) through ellmer::chat_github(). It JustWorks(tm) as long as gitcreds::gitcreds_get() is able to get your GitHub personal access token, either from the GITHUB_PAT environment variable or from the credential store; see https://usethis.r-lib.org/articles/git-credentials.html for details.

library(shiny)
library(bslib)
library(ellmer)
library(shinychat)

[app ui/server code not recovered]

#> ─ Session info ───────────────────────────────────────────────────────────────
#>  ui       RTerm
#>  language (EN)
#>  collate  English_United Kingdom.utf8
#>  ctype    English_United Kingdom.utf8
#>  tz       Europe/Tallinn
#>  date     2025-09-18
#>  pandoc   3.6.3 @ C:/Program Files/RStudio/resources/app/bin/quarto/bin/tools/ (via rmarkdown)
#>
#> ─ Packages ───────────────────────────────────────────────────────────────────
#>  ! package   version date (UTC) lib source
#>  P coro      1.1.0   2024-11-05 [?] RSPM
#>  P ellmer    0.3.2   2025-09-03 [?] RSPM
#>  P shiny     1.11.1  2025-07-03 [?] RSPM
#>    shinychat 0.2.0   2025-05-16 [1] RSPM
#> ──────────────────────────────────────────────────────────────────────────────
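Since the body of the app did not survive here, below is a minimal sketch of what such a vignette-style app looks like, assembled from the pieces described above. It is an assumption-laden reconstruction, not the answer's original code: the model name, the input/button ids, and the exact ExtendedTask wiring are all illustrative.

```r
library(shiny)
library(bslib)
library(ellmer)
library(shinychat)

ui <- page_fillable(
  textInput("prompt", "Prompt"),
  # task button stays disabled while the ExtendedTask below is running
  input_task_button("ask", "Ask"),
  # streaming markdown output target from {shinychat}
  output_markdown_stream("response")
)

server <- function(input, output, session) {
  # chat_github() finds your PAT via gitcreds; the model name is an assumption
  client <- chat_github(model = "gpt-4o-mini")

  # ExtendedTask runs the streaming LLM call asynchronously, so it
  # doesn't block this session or other users of the app
  stream_task <- ExtendedTask$new(function(prompt) {
    # stream_async() yields chunks; markdown_stream() renders them as they arrive
    stream <- client$stream_async(prompt)
    markdown_stream("response", stream)
  }) |> bind_task_button("ask")

  observeEvent(input$ask, {
    stream_task$invoke(input$prompt)
  })
}

shinyApp(ui, server)
```

bslib::input_task_button() and bind_task_button() are optional, but they give the user feedback (and prevent double-submits) while the task runs.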