Hosting quickly 2.5 - Discovering and calling a backend from Dioxus
- Hosting quickly 1 - Setting up Terraform, Github, and 1Password
- Hosting quickly 2 - Dioxus to the web on Fly.io
We should make the backend serve some API that we can hit from the frontend.
We'll turn the root endpoint in the backend into a JSON-returning endpoint:
use axum::Json;
use axum_macros::debug_handler;
use serde::Serialize;

#[derive(Serialize)]
struct Response {
    name: String,
}

// Serve a small JSON payload from the root endpoint.
#[debug_handler]
async fn home() -> Json<Response> {
    Json(Response {
        name: "Hello, World!".to_string(),
    })
}
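For context, here's a minimal sketch of how that handler could be wired into an axum router; the bind address, port, and axum 0.6-style serving are my assumptions here, the real setup comes from the earlier parts of this series:

use axum::{routing::get, Router};

#[tokio::main]
async fn main() {
    // Mount the JSON handler at the root; the address and port are placeholders
    // for local testing.
    let app = Router::new().route("/", get(home));
    axum::Server::bind(&"0.0.0.0:8081".parse().unwrap())
        .serve(app.into_make_service())
        .await
        .unwrap();
}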
and then use that endpoint from the frontend:
use serde::Deserialize;

#[derive(Deserialize)]
struct Response {
    name: String,
}

pub async fn get_endpoint() -> Result<String, reqwest::Error> {
    // Backend address hardcoded for now (adjust to wherever your backend runs);
    // discovering this URL properly is what the rest of this article is about.
    let url = "http://localhost:8081/";
    let res = reqwest::get(url).await?.json::<Response>().await?;
    Ok(res.name)
}
use dioxus::prelude::*;

fn app(cx: Scope) -> Element {
    // Kick off the request once; the component re-renders when it resolves.
    let response = use_future(cx, (), |_| get_endpoint());

    match response.value() {
        Some(Ok(name)) => cx.render(rsx! {
            h1 { "{name}" }
        }),
        Some(Err(_)) => cx.render(rsx! {
            h1 { "Error!" }
        }),
        // Still waiting for the response.
        None => cx.render(rsx! {
            h1 { "Loading..." }
        }),
    }
}
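If you're not following along from the previous post, the app also needs a main to launch it. A minimal sketch, assuming the web renderer (dioxus-web):

fn main() {
    // Launch the app in the browser with dioxus-web; for the desktop runs in the
    // Justfile below, dioxus_desktop::launch(app) is the equivalent.
    dioxus_web::launch(app);
}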
Let the frontend know
There's a very simple reason that this is a separate article: I need to figure out how to make the frontend discover the base URI of the backend API. There are a few options:
- Compile it in, either from an environment variable or as a hardcoded string.
- Serve it in configuration, like an /env.json file that can be loaded from the frontend or inserted into a script tag in the HTML.
- Ensure it's on the same domain, so that the frontend can just call /api.
I'm not a huge fan of compiling it into the code: I'd like to be able to use the same compiled code across different environments. Additional requests, like having an env.json file or service discovery, will only slow down the page load. Routing is something I'd like to avoid depending on.
So instead, let's insert any environment variables into the index.html at startup! We'll start by loading it in Dioxus. I'll leave defining the EnvSettings as an exercise to the reader:
fn get_dioxus_env(_cx: &Scope) -> EnvSettings {
    // Parse the raw DIOXUS_ENV variable into our settings struct; EnvSettings
    // (and its hypothetical parse helper, sketched below) is the reader's exercise.
    let raw = std::env::var("DIOXUS_ENV").unwrap_or_default();
    EnvSettings::parse(&raw)
}
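If you want a concrete starting point for that exercise, here's one possible shape for EnvSettings and the EnvSettings::parse helper used above. This is very much my own sketch, assuming DIOXUS_ENV holds KEY="value" pairs as set in the Justfile below:

struct EnvSettings {
    backend_url: url::Url,
}

impl EnvSettings {
    // Rough parsing sketch: pull BACKEND_API_URL="..." out of the raw string.
    fn parse(raw: &str) -> Self {
        let backend_url = raw
            .split_whitespace()
            .find_map(|pair| pair.strip_prefix("BACKEND_API_URL="))
            .map(|value| value.trim_matches('"'))
            .and_then(|value| url::Url::parse(value).ok())
            .expect("BACKEND_API_URL must be set to a valid URL");
        Self { backend_url }
    }
}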
For running the frontend locally and through Docker, we'll set it through the Justfile. For the docker-run command, we'll also make sure to forward the backend's port:
run $DIOXUS_ENV='BACKEND_API_URL="https://localhost:3000"': build
    dx serve --features ssr --hot-reload --platform desktop

docker-run: docker-build
    docker run -it -p 8081:8081 -p 8080:8080 -e BACKEND_API_URL=https://localhost:8081 cochrane-frontend:dev
Call the API
To make this as simple as possible, let's create a special component that calls the API and returns either an error or the text response:
#[derive(PartialEq, Props)]
struct QueryingTextProps {
    backend_url: url::Url,
}

#[allow(non_snake_case)]
fn QueryingText(cx: Scope<QueryingTextProps>) -> Element {
    // Re-run the request whenever the backend URL prop changes.
    let result = use_future(cx, &cx.props.backend_url, |val| async move {
        call_api(val).await
    });

    cx.render(match result.value() {
        Some(Ok(s)) => rsx! { p { "{s}" } },
        Some(Err(e)) => rsx! { p { "{e}" } },
        None => rsx! { p { "Loading..." } },
    })
}
Note the call_api function that's still undefined; we'll get there. There's some setup code here that's necessary for a Dioxus component: the QueryingTextProps struct to allow passing our API endpoint, and the use_future call to actually run the async fn that we'll be defining. Otherwise it's fairly standard!
In fact, the call_api function is super simple. It's just a plain reqwest call. Of course, we could flesh this out by sprinkling in some serde_json and better error types, but this'll do for now:
async fn call_api(url: url::Url) -> Result<String, String> {
    // Stringify errors so the component can just display them.
    let res = reqwest::get(url).await.map_err(|e| e.to_string())?;
    res.text().await.map_err(|e| e.to_string())
}
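If you did want the JSON-and-proper-errors version, a sketch (reusing the backend's response shape from earlier, and assuming reqwest's json feature is enabled) could look like this:

use serde::Deserialize;

#[derive(Deserialize)]
struct ApiResponse {
    name: String,
}

// Hypothetical fleshed-out variant: deserialize the JSON body and keep the
// original reqwest::Error instead of stringifying it.
async fn call_api_json(url: url::Url) -> Result<String, reqwest::Error> {
    let res = reqwest::get(url).await?.json::<ApiResponse>().await?;
    Ok(res.name)
}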
All that's left is actually using the component now:
let EnvSettings { backend_url } = get_dioxus_env(&cx);
And then add this to the returned rsx:
rsx! { QueryingText { backend_url: backend_url } },
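Put together, the app component ends up looking roughly like this (a sketch; your surrounding layout will differ):

fn app(cx: Scope) -> Element {
    // Pull the backend URL from the injected environment and hand it to the component.
    let EnvSettings { backend_url } = get_dioxus_env(&cx);

    cx.render(rsx! {
        QueryingText { backend_url: backend_url }
    })
}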
Voila! It's all hooked up.
Results
I've intentionally left certain elements for the reader to implement. Specifically, the exact definition of EnvSettings and the finer details of error handling and JSON processing within the call_api function are not covered. Also, getting the environment info from a dotenv file or other configuration would probably improve this.
Anyway, I think it works out to be a nice way of configuring your frontend Docker image without having to rebuild your Rust code for each environment.
Enjoy!