Custom domains (task 14.9): DNS validation, Caddy on-demand TLS, re-rendering

Three main components:

1. custom_domain.rs, a new module in maskinrommet:
   - GET /internal/verify-domain?domain=, the Caddy on-demand TLS callback.
     Returns 200 if the domain is registered in a publishing trait, 404 otherwise.
   - DNS validation (validate_dns): checks that the domain points to the
     server's IP via the system DNS resolver. Called when a publishing
     trait is updated.
   - Domain-based serving: /custom-domain/index, /custom-domain/{article_id},
     /custom-domain/feed.xml. Caddy rewrites custom-domain requests to these
     routes; the Host header is used to look up the collection.
   - Re-rendering: rerender_collection_articles() enqueues render jobs
     for all articles plus the front page when custom_domain changes.

2. Caddy on-demand TLS (Caddyfile):
   - Catch-all :443 block with an on_demand ask callback to maskinrommet.
   - Rewrite rules: / → /custom-domain/index, /feed.xml → /custom-domain/feed.xml,
     /* → /custom-domain/{uri}. The Host header is preserved for the domain lookup.

3. intentions.rs, extended update_node:
   - DNS validation when custom_domain is set in the publishing trait.
   - Detects changes to custom_domain and triggers re-rendering of all
     articles (the canonical URL changes).
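The rewrite rules from component 2 can be restated as a pure function. This is an illustrative sketch of the path mapping only; the name `rewrite_path` is ours, and the real rewriting happens in Caddy, not in Rust:

```rust
/// Path rewriting the Caddyfile applies to custom-domain requests,
/// restated in plain Rust for clarity. Illustrative sketch only.
fn rewrite_path(path: &str) -> String {
    match path {
        // Front page
        "/" => "/custom-domain/index".to_string(),
        // RSS/Atom feed
        "/feed.xml" => "/custom-domain/feed.xml".to_string(),
        // Articles (everything else): prefix the original URI
        other => format!("/custom-domain{other}"),
    }
}

fn main() {
    assert_eq!(rewrite_path("/"), "/custom-domain/index");
    assert_eq!(rewrite_path("/feed.xml"), "/custom-domain/feed.xml");
    assert_eq!(rewrite_path("/my-article"), "/custom-domain/my-article");
    println!("ok");
}
```

Because the path alone no longer identifies the collection after rewriting, the Host header has to travel with the request; that is what the domain lookup keys on.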
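The DNS check in component 1 boils down to a system-resolver lookup. A minimal plain-std sketch (the helper name `points_to` is ours, and the `:443` suffix exists only to satisfy `ToSocketAddrs`; the real validate_dns also builds user-facing error messages):

```rust
use std::net::ToSocketAddrs;

/// Resolve the domain via the system resolver and check whether any
/// returned address matches the expected server IP. Illustrative sketch
/// of the check validate_dns performs.
fn points_to(domain: &str, expected_ip: &str) -> bool {
    format!("{domain}:443")
        .to_socket_addrs()
        .map(|mut addrs| addrs.any(|a| a.ip().to_string() == expected_ip))
        .unwrap_or(false)
}

fn main() {
    // localhost resolves to a loopback address on effectively every system.
    assert!(points_to("localhost", "127.0.0.1") || points_to("localhost", "::1"));
    // A reserved .invalid name never resolves.
    assert!(!points_to("nope.invalid", "127.0.0.1"));
    println!("ok");
}
```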

Existing code (publishing.rs, rss.rs) already uses custom_domain for
base_url/canonical_url; no changes needed there.
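The base-URL choice that publishing.rs and rss.rs make can be sketched as follows. The fallback shape `synops.no/pub/{slug}` is taken from the task list in this commit; the function itself is a hypothetical restatement, not the actual implementation:

```rust
/// Sketch: pick the canonical base URL for a collection. Hypothetical
/// helper; the real logic lives in publishing.rs/rss.rs.
fn base_url(custom_domain: Option<&str>, slug: &str) -> String {
    match custom_domain {
        // A non-empty custom domain wins
        Some(d) if !d.is_empty() => format!("https://{d}"),
        // Otherwise fall back to the shared /pub/{slug} namespace
        _ => format!("https://synops.no/pub/{slug}"),
    }
}

fn main() {
    assert_eq!(base_url(Some("mittmagasin.no"), "mag"), "https://mittmagasin.no");
    assert_eq!(base_url(None, "mag"), "https://synops.no/pub/mag");
    println!("ok");
}
```

This is also why a domain change forces re-rendering: the canonical URL is baked into the rendered HTML.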

Ref: docs/concepts/publisering.md § "Custom domain-mekanisme"

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
vegard 2026-03-18 01:51:35 +00:00
parent 9486480ebe
commit 66ebe58ff8
5 changed files with 433 additions and 2 deletions


@@ -77,3 +77,34 @@ vegard.info {
    header Content-Type text/html
    respond `<!DOCTYPE html><html><head><meta charset="utf-8"><title>vegard.info</title><link rel="icon" href="/favicon.ico" sizes="32x32"><link rel="icon" href="/icon-192.png" type="image/png" sizes="192x192"><link rel="apple-touch-icon" href="/apple-touch-icon.png"></head><body><p>vegard.info — underveis!</p></body></html>` 200
}
# === Custom domains for publishing collections ===
# On-demand TLS: Caddy fetches a certificate only for domains that
# maskinrommet confirms via /internal/verify-domain. Requests are routed
# to maskinrommet's /custom-domain/ routes with the Host header preserved.
# Ref: docs/concepts/publisering.md § "Custom domain-mekanisme"
:443 {
    tls {
        on_demand {
            ask http://host.docker.internal:3100/internal/verify-domain
        }
    }
    # RSS/Atom feed
    handle /feed.xml {
        rewrite * /custom-domain/feed.xml
        reverse_proxy host.docker.internal:3100
    }
    # Front page
    handle / {
        rewrite * /custom-domain/index
        reverse_proxy host.docker.internal:3100
    }
    # Articles (everything else)
    handle {
        rewrite * /custom-domain{uri}
        reverse_proxy host.docker.internal:3100
    }
}


@@ -0,0 +1,346 @@
//! Custom domain handling for publishing collections.
//!
//! Three areas of responsibility:
//! 1. Verify-domain callback for Caddy on-demand TLS (internal, unauthenticated)
//! 2. DNS validation when a custom domain is registered
//! 3. Routing: domain → collection → serve article/front page/feed
//!
//! Ref: docs/concepts/publisering.md § "Custom domain-mekanisme"

use axum::{
    extract::{Query, State},
    http::{HeaderMap, StatusCode},
    response::Response,
};
use serde::Deserialize;
use sqlx::PgPool;
use std::net::ToSocketAddrs;
use uuid::Uuid;

use crate::publishing::{self, PublishingConfig};
use crate::rss;
use crate::AppState;
// =============================================================================
// Verify-domain (Caddy on-demand TLS callback)
// =============================================================================

#[derive(Deserialize)]
pub struct VerifyDomainQuery {
    domain: String,
}

/// GET /internal/verify-domain?domain=mittmagasin.no
///
/// Caddy calls this endpoint before it fetches a TLS certificate.
/// Returns 200 if the domain belongs to a collection with a publishing
/// trait, 404 otherwise. Unauthenticated: only reachable internally.
pub async fn verify_domain(
    State(state): State<AppState>,
    Query(query): Query<VerifyDomainQuery>,
) -> StatusCode {
    let domain = query.domain.trim().to_lowercase();
    if domain.is_empty() {
        return StatusCode::BAD_REQUEST;
    }

    // Check that the domain is registered in a publishing trait
    match find_collection_by_domain(&state.db, &domain).await {
        Ok(Some(_)) => {
            tracing::info!(domain = %domain, "Domain verified for on-demand TLS");
            StatusCode::OK
        }
        Ok(None) => {
            tracing::debug!(domain = %domain, "Unknown domain rejected");
            StatusCode::NOT_FOUND
        }
        Err(e) => {
            tracing::error!(domain = %domain, error = %e, "DB error during domain verification");
            StatusCode::INTERNAL_SERVER_ERROR
        }
    }
}
// =============================================================================
// DNS validation
// =============================================================================

/// The server's expected IP (from an environment variable, or the default).
fn expected_server_ip() -> String {
    std::env::var("SERVER_IP").unwrap_or_else(|_| "157.180.81.26".to_string())
}

/// Validates that the domain has a DNS record pointing to the server.
///
/// Checks A records via the system DNS resolver. Returns Ok(()) if at
/// least one A record points to the server's IP, otherwise Err with an
/// explanation.
pub fn validate_dns(domain: &str) -> Result<(), String> {
    let expected_ip = expected_server_ip();

    // Use the system DNS resolver
    let lookup = format!("{domain}:443")
        .to_socket_addrs()
        .map_err(|e| format!("DNS lookup failed for {domain}: {e}"))?;

    let resolved_ips: Vec<String> = lookup.map(|addr| addr.ip().to_string()).collect();

    if resolved_ips.is_empty() {
        return Err(format!(
            "No DNS records found for {domain}. \
             Create an A record pointing to {expected_ip}."
        ));
    }

    if resolved_ips.iter().any(|ip| ip == &expected_ip) {
        Ok(())
    } else {
        Err(format!(
            "DNS for {domain} points to {resolved_ips:?}, but {expected_ip} was expected. \
             Create an A record pointing to {expected_ip}."
        ))
    }
}
// =============================================================================
// Domain → collection lookup
// =============================================================================

#[allow(dead_code)]
struct DomainCollection {
    id: Uuid,
    title: Option<String>,
    publishing_config: PublishingConfig,
    has_rss: bool,
    slug: String,
}

/// Find a collection based on custom_domain in its publishing trait.
async fn find_collection_by_domain(
    db: &PgPool,
    domain: &str,
) -> Result<Option<DomainCollection>, sqlx::Error> {
    let row: Option<(Uuid, Option<String>, serde_json::Value)> = sqlx::query_as(
        r#"
        SELECT id, title, metadata
        FROM nodes
        WHERE node_kind = 'collection'
          AND metadata->'traits'->'publishing'->>'custom_domain' = $1
        LIMIT 1
        "#,
    )
    .bind(domain)
    .fetch_optional(db)
    .await?;

    let Some((id, title, metadata)) = row else {
        return Ok(None);
    };

    let traits = metadata.get("traits");
    let publishing_config: PublishingConfig = traits
        .and_then(|t| t.get("publishing"))
        .cloned()
        .map(|v| serde_json::from_value(v).unwrap_or_default())
        .unwrap_or_default();
    let slug = publishing_config
        .slug
        .clone()
        .unwrap_or_else(|| "unknown".to_string());
    let has_rss = traits.and_then(|t| t.get("rss")).is_some();

    Ok(Some(DomainCollection {
        id,
        title,
        publishing_config,
        has_rss,
        slug,
    }))
}
// =============================================================================
// Custom domain serving: front page, article, feed
// =============================================================================

/// Extract the domain from the Host header.
fn extract_domain(headers: &HeaderMap) -> Result<String, StatusCode> {
    let host = headers
        .get("host")
        .and_then(|v| v.to_str().ok())
        .ok_or(StatusCode::BAD_REQUEST)?;
    Ok(normalize_host(host))
}

/// GET /custom-domain/index: front page for a custom domain.
///
/// Caddy rewrites requests from custom domains to this route.
/// We use the Host header to find the collection.
pub async fn serve_custom_domain_index(
    State(state): State<AppState>,
    headers: HeaderMap,
) -> Result<Response, StatusCode> {
    let domain = extract_domain(&headers)?;
    let collection = find_collection_by_domain(&state.db, &domain)
        .await
        .map_err(|e| {
            tracing::error!(domain = %domain, error = %e, "DB error during domain lookup");
            StatusCode::INTERNAL_SERVER_ERROR
        })?
        .ok_or(StatusCode::NOT_FOUND)?;

    publishing::serve_index(State(state), axum::extract::Path(collection.slug)).await
}
/// GET /custom-domain/{article_id}: single article for a custom domain.
pub async fn serve_custom_domain_article(
    State(state): State<AppState>,
    headers: HeaderMap,
    axum::extract::Path(article_id): axum::extract::Path<String>,
) -> Result<Response, StatusCode> {
    let domain = extract_domain(&headers)?;
    let collection = find_collection_by_domain(&state.db, &domain)
        .await
        .map_err(|e| {
            tracing::error!(domain = %domain, error = %e, "DB error during domain lookup");
            StatusCode::INTERNAL_SERVER_ERROR
        })?
        .ok_or(StatusCode::NOT_FOUND)?;

    publishing::serve_article(State(state), axum::extract::Path((collection.slug, article_id))).await
}

/// GET /custom-domain/feed.xml: RSS/Atom for a custom domain.
pub async fn serve_custom_domain_feed(
    State(state): State<AppState>,
    headers: HeaderMap,
) -> Result<Response, StatusCode> {
    let domain = extract_domain(&headers)?;
    let collection = find_collection_by_domain(&state.db, &domain)
        .await
        .map_err(|e| {
            tracing::error!(domain = %domain, error = %e, "DB error during domain lookup");
            StatusCode::INTERNAL_SERVER_ERROR
        })?
        .ok_or(StatusCode::NOT_FOUND)?;

    rss::generate_feed(State(state), axum::extract::Path(collection.slug)).await
}

/// Normalize the Host header: strip the port and lowercase.
fn normalize_host(host: &str) -> String {
    host.split(':').next().unwrap_or(host).to_lowercase()
}
// =============================================================================
// Re-rendering on domain change
// =============================================================================

/// Enqueue re-rendering of all articles plus the front page for a collection.
///
/// Called when custom_domain changes: the canonical URL in the rendered
/// HTML must be updated. Articles are re-rendered via the job queue
/// (non-blocking).
pub async fn rerender_collection_articles(
    db: &PgPool,
    collection_id: Uuid,
) -> Result<usize, sqlx::Error> {
    // Find all articles that belong to the collection
    let article_ids: Vec<(Uuid,)> = sqlx::query_as(
        r#"
        SELECT e.source_id
        FROM edges e
        WHERE e.target_id = $1
          AND e.edge_type = 'belongs_to'
        "#,
    )
    .bind(collection_id)
    .fetch_all(db)
    .await?;

    let count = article_ids.len();

    // Enqueue a render job for each article
    for (article_id,) in &article_ids {
        let payload = serde_json::json!({
            "node_id": article_id.to_string(),
            "collection_id": collection_id.to_string(),
        });
        if let Err(e) =
            crate::jobs::enqueue(db, "render_article", payload, Some(collection_id), 3).await
        {
            tracing::error!(
                article_id = %article_id,
                collection_id = %collection_id,
                error = %e,
                "Failed to enqueue render_article on domain change"
            );
        }
    }

    // Enqueue a render of the front page
    let index_payload = serde_json::json!({
        "collection_id": collection_id.to_string(),
    });
    if let Err(e) =
        crate::jobs::enqueue(db, "render_index", index_payload, Some(collection_id), 4).await
    {
        tracing::error!(
            collection_id = %collection_id,
            error = %e,
            "Failed to enqueue render_index on domain change"
        );
    }

    tracing::info!(
        collection_id = %collection_id,
        article_count = count,
        "Re-rendering enqueued for domain change"
    );

    Ok(count)
}
// =============================================================================
// Tests
// =============================================================================

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn normalize_host_strips_port() {
        assert_eq!(normalize_host("mittmagasin.no:443"), "mittmagasin.no");
        assert_eq!(normalize_host("mittmagasin.no"), "mittmagasin.no");
        assert_eq!(normalize_host("MITTMAGASIN.NO:8080"), "mittmagasin.no");
    }

    #[test]
    fn dns_validation_rejects_empty() {
        assert!(validate_dns("").is_err());
    }

    #[test]
    fn dns_validation_rejects_nonexistent() {
        assert!(validate_dns("this-domain-does-not-exist-synops-test.invalid").is_err());
    }
}


@@ -90,6 +90,15 @@ fn validate_collection_traits(
        ));
    }

    // Validate custom_domain DNS if set in the publishing trait
    if let Some(publishing) = traits_obj.get("publishing") {
        if let Some(domain) = publishing.get("custom_domain").and_then(|v| v.as_str()) {
            if !domain.is_empty() {
                crate::custom_domain::validate_dns(domain)?;
            }
        }
    }

    Ok(())
}
@@ -723,11 +732,30 @@ pub async fn update_node(
    let title = req.title.unwrap_or(existing.title.unwrap_or_default());
    let content = req.content.unwrap_or(existing.content.unwrap_or_default());

    // Fetch the old custom_domain before existing.metadata is moved
    let old_domain = existing
        .metadata
        .get("traits")
        .and_then(|t| t.get("publishing"))
        .and_then(|p| p.get("custom_domain"))
        .and_then(|d| d.as_str())
        .unwrap_or("")
        .to_string();

    let metadata = req.metadata.unwrap_or(existing.metadata);

    // -- Validate traits for collection nodes (task 13.1) --
    validate_collection_traits(&node_kind, &metadata).map_err(|e| bad_request(&e))?;

    // -- Check whether custom_domain has changed (for re-rendering) --
    let new_domain = metadata
        .get("traits")
        .and_then(|t| t.get("publishing"))
        .and_then(|p| p.get("custom_domain"))
        .and_then(|d| d.as_str())
        .unwrap_or("");
    let domain_changed = old_domain != new_domain && node_kind == "collection";

    let metadata_str = metadata.to_string();
    let node_id_str = req.node_id.to_string();
@@ -763,6 +791,26 @@ pub async fn update_node(
        metadata,
    );

    // -- Re-render all articles if custom_domain changed (canonical URL) --
    if domain_changed {
        let db = state.db.clone();
        let collection_id = req.node_id;
        tokio::spawn(async move {
            match crate::custom_domain::rerender_collection_articles(&db, collection_id).await {
                Ok(count) => tracing::info!(
                    collection_id = %collection_id,
                    articles = count,
                    "Re-rendering triggered after domain change"
                ),
                Err(e) => tracing::error!(
                    collection_id = %collection_id,
                    error = %e,
                    "Error re-rendering after domain change"
                ),
            }
        });
    }

    Ok(Json(UpdateNodeResponse { node_id: req.node_id }))
}


@@ -3,6 +3,7 @@ pub mod ai_edges;
pub mod audio;
mod auth;
pub mod cas;
mod custom_domain;
mod intentions;
pub mod jobs;
pub mod livekit;
@@ -180,6 +181,12 @@ async fn main() {
        .route("/pub/{slug}", get(publishing::serve_index))
        .route("/pub/{slug}/{article_id}", get(publishing::serve_article))
        .route("/pub/{slug}/preview/{theme}", get(publishing::preview_theme))
        // Custom domains: Caddy on-demand TLS callback
        .route("/internal/verify-domain", get(custom_domain::verify_domain))
        // Custom domains: domain-based serving (Caddy proxies here)
        .route("/custom-domain/index", get(custom_domain::serve_custom_domain_index))
        .route("/custom-domain/feed.xml", get(custom_domain::serve_custom_domain_feed))
        .route("/custom-domain/{article_id}", get(custom_domain::serve_custom_domain_article))
        .layer(TraceLayer::new_for_http())
        .with_state(state);


@@ -150,8 +150,7 @@ Independent phases can still be picked up.
- [x] 14.6 Front-page admin in the frontend: visual editor for hero/featured/stream. Drag-and-drop between slots. Pin button. Preview. Updates edge metadata via maskinrommet.
- [x] 14.7 Publishing flow in the frontend (personal): publish button on nodes in collections with a `publishing` trait where `require_approval: false`. Preview, slug editor, confirmation. Unpublishing when the edge is removed.
- [x] 14.8 RSS/Atom feed: a collection with an `rss` trait generates a feed automatically on publish/unpublish. `synops.no/pub/{slug}/feed.xml`. At most `rss_max_items` entries (default 50).
- [~] 14.9 Custom domains: the user registers a domain in the `publishing` trait. Maskinrommet validates DNS; Caddy handles on-demand TLS with a validation callback. Re-rendering with the correct canonical URL.
  > Started: 2026-03-18T01:43
- [x] 14.9 Custom domains: the user registers a domain in the `publishing` trait. Maskinrommet validates DNS; Caddy handles on-demand TLS with a validation callback. Re-rendering with the correct canonical URL.
- [ ] 14.10 Editorial submission: `submitted_to` edge with status metadata (`pending`, `in_review`, `revision_requested`, `rejected`, `approved`). Maskinrommet validates that only roles in `submission_roles` can create `submitted_to`, and that only owner/admin can change status or create `belongs_to`. Ref: `docs/concepts/publisering.md` § "Innsending".
- [ ] 14.11 Editor's workspace: frontend view of nodes with a `submitted_to` edge to a collection, grouped by status. Kanban-style drag-and-drop for status changes. The last column ("Planned") sets `publish_at` in the edge metadata.
- [ ] 14.12 Scheduled publishing: maskinrommet periodically checks (cron/interval) for `belongs_to` edges with `publish_at` in the past that have not been rendered. On a hit: render HTML → CAS → update RSS.