synops-video: CLI for video transcode, thumbnails and duration (task 29.8)

New CLI tool that processes video from CAS:
- Transcode to H.264/AAC MP4 with faststart (web-optimized)
- Thumbnail generation (JPEG, 480px wide)
- Duration extraction via ffprobe

Input: --cas-hash or --payload-json (job-queue mode)
Output: JSON with transcoded_hash, thumbnail_hash, duration_ms
With --write: updates the media node's metadata in PG

Follows the same pattern as synops-audio: CAS in/out, --write for
DB persistence, --payload-json for dispatch from maskinrommet.
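
For illustration, the JSON written to stdout might look like the sample below. The field names follow the `VideoProcessResult` struct in this commit; the hash placeholders, sizes, and node ID are hypothetical values, not real output:

```json
{
  "original_hash": "<sha-256 of the source file>",
  "transcoded_hash": "<sha-256 of the H.264/AAC MP4>",
  "thumbnail_hash": "<sha-256 of the JPEG thumbnail>",
  "duration_ms": 12345,
  "transcoded_size_bytes": 1048576,
  "thumbnail_size_bytes": 24576,
  "node_id": "0195c8a0-0000-7000-8000-000000000001"
}
```

The `node_id` field is omitted when the tool runs without --write.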
vegard 2026-03-18 22:25:42 +00:00
parent 10a4dd8059
commit 3b6bab5092
5 changed files with 2935 additions and 2 deletions


@@ -403,8 +403,7 @@ the node is what lives on.
 ### Video
 - [x] 29.7 Video recording in the frontend: webcam/screen recording via the MediaRecorder API → upload to CAS → media node. Start/stop button in the input component. Max duration configurable.
-- [~] 29.8 Video processing: `synops-video` CLI for transcode (H.264), thumbnail generation, and duration extraction. Input: `--cas-hash <hash>`. Output: new CAS hash (transcoded) + thumbnail CAS hash.
-> Started: 2026-03-18T22:20
+- [x] 29.8 Video processing: `synops-video` CLI for transcode (H.264), thumbnail generation, and duration extraction. Input: `--cas-hash <hash>`. Output: new CAS hash (transcoded) + thumbnail CAS hash.
 ### Geolocation
 - [ ] 29.9 Location input: "Share position" button in the input component → Geolocation API → node with `metadata.location: { "lat": 59.91, "lon": 10.75 }`. Map view in node details (Leaflet/OpenStreetMap). Optional: reverse geocoding via Nominatim for the address.


@@ -29,6 +29,7 @@ or the maskinrommet API. Available in PATH via symlink or direct invocation.
 | `synops-backup` | PG dump + CAS file list + metadata snapshot (`--full` / `--incremental`) | Done |
 | `synops-health` | Check the status of all services (PG, Caddy, Maskinrommet, LiteLLM, Whisper, LiveKit, Authentik) | Done |
 | `synops-feed` | Subscribe to RSS/Atom feeds, create content nodes with deduplication and paywall detection | Done |
+| `synops-video` | Video transcode (H.264), thumbnail generation, duration extraction from a CAS hash | Done |
 ## Shared library

tools/synops-video/Cargo.lock (generated new file, 2448 lines)
File diff suppressed because it is too large.

tools/synops-video/Cargo.toml (new file)

@@ -0,0 +1,18 @@
[package]
name = "synops-video"
version = "0.1.0"
edition = "2024"

[[bin]]
name = "synops-video"
path = "src/main.rs"

[dependencies]
clap = { version = "4", features = ["derive"] }
tokio = { version = "1", features = ["full"] }
sqlx = { version = "0.8", features = ["runtime-tokio", "tls-rustls", "postgres", "uuid", "chrono", "json"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
uuid = { version = "1", features = ["v7", "serde"] }
tracing = "0.1"
synops-common = { path = "../synops-common" }

tools/synops-video/src/main.rs (new file)

@@ -0,0 +1,467 @@
// synops-video: video processing (transcode to H.264, thumbnail, duration).
//
// Input: CAS hash of the source file.
// Output: JSON with the new CAS hash (transcoded), the thumbnail CAS hash and the duration.
// With --write: updates the media node's metadata in PG.
//
// Environment variables:
//   DATABASE_URL - PostgreSQL connection (required with --write)
//   CAS_ROOT     - Root of the content-addressable store (default: /srv/synops/media/cas)
//
// Ref: docs/retninger/unix_filosofi.md, docs/features/universell_input.md
use clap::Parser;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use std::process;
use uuid::Uuid;
/// Process a video file from CAS: transcode to H.264, generate a thumbnail, extract duration.
#[derive(Parser)]
#[command(name = "synops-video", about = "Video-transcode (H.264), thumbnail og varighet")]
struct Cli {
    /// SHA-256 CAS hash of the source file
#[arg(long)]
cas_hash: Option<String>,
    /// Media node ID (required with --write)
#[arg(long)]
node_id: Option<Uuid>,
    /// User ID that triggered the processing (for resource logging)
#[arg(long)]
requested_by: Option<Uuid>,
    /// Write the results to the database (without this flag: stdout only)
#[arg(long)]
write: bool,
    /// JSON payload from the job queue (alternative to individual arguments)
#[arg(long)]
payload_json: Option<String>,
}
#[derive(Debug, Deserialize)]
struct Payload {
cas_hash: String,
#[serde(default)]
node_id: Option<Uuid>,
#[serde(default)]
requested_by: Option<Uuid>,
#[serde(default)]
write: bool,
}
#[derive(Serialize)]
struct VideoProcessResult {
original_hash: String,
transcoded_hash: String,
thumbnail_hash: String,
duration_ms: i64,
transcoded_size_bytes: u64,
thumbnail_size_bytes: u64,
#[serde(skip_serializing_if = "Option::is_none")]
node_id: Option<String>,
}
// ─── Entrypoint ───────────────────────────────────────────────────
#[tokio::main]
async fn main() {
synops_common::logging::init("synops_video");
let cli = Cli::parse();
if let Err(e) = run(cli).await {
eprintln!("Feil: {e}");
process::exit(1);
}
}
async fn run(cli: Cli) -> Result<(), String> {
    // Resolve parameters: payload_json overrides the individual args
let (cas_hash, node_id, requested_by, write) = if let Some(ref json) = cli.payload_json {
let p: Payload =
serde_json::from_str(json).map_err(|e| format!("Ugyldig payload JSON: {e}"))?;
(
p.cas_hash,
p.node_id.or(cli.node_id),
p.requested_by.or(cli.requested_by),
p.write || cli.write,
)
} else {
let hash = cli
.cas_hash
.ok_or("--cas-hash er påkrevd (eller bruk --payload-json)")?;
(hash, cli.node_id, cli.requested_by, cli.write)
};
if write && node_id.is_none() {
return Err("--node-id er påkrevd sammen med --write".into());
}
let cas_root = synops_common::cas::root();
    // 1. Check that the source file exists in CAS
let source_path = synops_common::cas::path(&cas_root, &cas_hash);
if !source_path.exists() {
return Err(format!("Kildefil finnes ikke i CAS: {cas_hash}"));
}
tracing::info!(cas_hash = %cas_hash, "Starter videoprosessering");
    // 2. Extract the duration via ffprobe
let duration_ms = get_duration(&source_path).await?;
tracing::info!(duration_ms, "Varighet hentet");
    // 3. Transcode to H.264/AAC MP4
let (transcoded_hash, transcoded_size) =
transcode_h264(&source_path, &cas_root).await?;
tracing::info!(transcoded_hash = %transcoded_hash, size = transcoded_size, "Transkoding fullført");
    // 4. Generate a thumbnail
let thumbnail_time = pick_thumbnail_time(duration_ms);
let (thumbnail_hash, thumbnail_size) =
generate_thumbnail(&source_path, &cas_root, thumbnail_time).await?;
tracing::info!(thumbnail_hash = %thumbnail_hash, size = thumbnail_size, "Thumbnail generert");
    // 5. Optional: update the database
let mut result_node_id = None;
if write {
let nid = node_id.unwrap();
let uid = requested_by.ok_or("--requested-by er påkrevd sammen med --write")?;
let db = synops_common::db::connect().await?;
update_node_metadata(
&db,
nid,
uid,
&cas_hash,
&transcoded_hash,
&thumbnail_hash,
duration_ms,
transcoded_size,
)
.await?;
result_node_id = Some(nid.to_string());
tracing::info!(node_id = %nid, "Database oppdatert");
}
    // 6. Write the JSON result to stdout
let result = VideoProcessResult {
original_hash: cas_hash,
transcoded_hash,
thumbnail_hash,
duration_ms,
transcoded_size_bytes: transcoded_size,
thumbnail_size_bytes: thumbnail_size,
node_id: result_node_id,
};
println!(
"{}",
serde_json::to_string_pretty(&result)
.map_err(|e| format!("JSON-serialisering feilet: {e}"))?
);
Ok(())
}
// ─── FFmpeg commands ──────────────────────────────────────────────
/// Extract the video duration via ffprobe. Returns milliseconds.
async fn get_duration(path: &PathBuf) -> Result<i64, String> {
let output = tokio::process::Command::new("ffprobe")
.args([
"-v", "quiet",
"-print_format", "json",
"-show_format",
])
.arg(path)
.output()
.await
.map_err(|e| format!("Kunne ikke kjøre ffprobe: {e}"))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
return Err(format!("ffprobe feilet: {stderr}"));
}
let json: serde_json::Value = serde_json::from_slice(&output.stdout)
.map_err(|e| format!("Kunne ikke parse ffprobe-output: {e}"))?;
let duration_secs: f64 = json["format"]["duration"]
.as_str()
.and_then(|s| s.parse().ok())
.ok_or("Kunne ikke lese varighet fra ffprobe")?;
Ok((duration_secs * 1000.0) as i64)
}
/// Transcode the video to H.264/AAC in an MP4 container.
/// Returns (cas_hash, size_bytes).
async fn transcode_h264(source: &PathBuf, cas_root: &str) -> Result<(String, u64), String> {
let tmp_dir = PathBuf::from(cas_root).join("tmp");
tokio::fs::create_dir_all(&tmp_dir)
.await
.map_err(|e| format!("Kunne ikke opprette tmp-katalog: {e}"))?;
let tmp_output = tmp_dir.join(format!("video_transcode_{}.mp4", Uuid::now_v7()));
let output = tokio::process::Command::new("ffmpeg")
.args(["-i"])
.arg(source)
.args([
"-c:v", "libx264",
"-preset", "fast",
"-crf", "23",
"-c:a", "aac",
"-b:a", "128k",
        "-movflags", "+faststart", // Web-optimized: moov atom first
"-y",
])
.arg(&tmp_output)
.output()
.await
.map_err(|e| format!("Kunne ikke kjøre ffmpeg transcode: {e}"))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let _ = tokio::fs::remove_file(&tmp_output).await;
return Err(format!("ffmpeg transcode feilet: {stderr}"));
}
let result_bytes = tokio::fs::read(&tmp_output)
.await
.map_err(|e| format!("Kunne ikke lese transkodert fil: {e}"))?;
let _ = tokio::fs::remove_file(&tmp_output).await;
let size = result_bytes.len() as u64;
let hash = store_in_cas(cas_root, &result_bytes).await?;
Ok((hash, size))
}
/// Pick the thumbnail timestamp: 1 second in, or the midpoint for short videos.
fn pick_thumbnail_time(duration_ms: i64) -> f64 {
if duration_ms <= 2000 {
0.0
} else if duration_ms <= 4000 {
(duration_ms as f64 / 2000.0).min(1.0)
} else {
1.0
}
}
/// Generate a JPEG thumbnail from the video. Returns (cas_hash, size_bytes).
async fn generate_thumbnail(
source: &PathBuf,
cas_root: &str,
time_secs: f64,
) -> Result<(String, u64), String> {
let tmp_dir = PathBuf::from(cas_root).join("tmp");
tokio::fs::create_dir_all(&tmp_dir)
.await
.map_err(|e| format!("Kunne ikke opprette tmp-katalog: {e}"))?;
let tmp_output = tmp_dir.join(format!("video_thumb_{}.jpg", Uuid::now_v7()));
let output = tokio::process::Command::new("ffmpeg")
.args([
"-ss", &format!("{time_secs:.3}"),
"-i",
])
.arg(source)
.args([
"-vframes", "1",
        "-vf", "scale=480:-2", // 480px wide, keep aspect ratio (even number)
        "-q:v", "3", // JPEG quality (2-5 is good, 3 is a solid balance)
"-y",
])
.arg(&tmp_output)
.output()
.await
.map_err(|e| format!("Kunne ikke kjøre ffmpeg thumbnail: {e}"))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let _ = tokio::fs::remove_file(&tmp_output).await;
return Err(format!("ffmpeg thumbnail feilet: {stderr}"));
}
let result_bytes = tokio::fs::read(&tmp_output)
.await
.map_err(|e| format!("Kunne ikke lese thumbnail: {e}"))?;
let _ = tokio::fs::remove_file(&tmp_output).await;
if result_bytes.is_empty() {
return Err("Thumbnail-generering produserte tom fil".into());
}
let size = result_bytes.len() as u64;
let hash = store_in_cas(cas_root, &result_bytes).await?;
Ok((hash, size))
}
// ─── CAS operations ──────────────────────────────────────────────
/// Store bytes in CAS with an atomic rename. Returns the hash.
async fn store_in_cas(cas_root: &str, data: &[u8]) -> Result<String, String> {
let hash = synops_common::cas::hash_bytes(data);
let dest = synops_common::cas::path(cas_root, &hash);
if dest.exists() {
return Ok(hash);
}
if let Some(parent) = dest.parent() {
tokio::fs::create_dir_all(parent)
.await
.map_err(|e| format!("Kunne ikke opprette CAS-katalog: {e}"))?;
}
let tmp_path = PathBuf::from(cas_root)
.join("tmp")
.join(format!("{}.tmp", hash));
tokio::fs::write(&tmp_path, data)
.await
.map_err(|e| format!("Kunne ikke skrive CAS temp-fil: {e}"))?;
tokio::fs::rename(&tmp_path, &dest)
.await
.map_err(|e| format!("Kunne ikke flytte til CAS: {e}"))?;
Ok(hash)
}
// ─── Database operations (only with --write) ──────────────────────
async fn update_node_metadata(
db: &sqlx::PgPool,
node_id: Uuid,
requested_by: Uuid,
original_hash: &str,
transcoded_hash: &str,
thumbnail_hash: &str,
duration_ms: i64,
transcoded_size: u64,
) -> Result<(), String> {
    // Update the node's metadata with the transcoded values
sqlx::query(
r#"UPDATE nodes
SET metadata = COALESCE(metadata, '{}'::jsonb)
|| jsonb_build_object(
'transcoded_hash', $2::text,
'thumbnail_hash', $3::text,
'duration_ms', $4::bigint,
'transcoded_mime', 'video/mp4'::text,
'thumbnail_mime', 'image/jpeg'::text
),
updated_at = now()
WHERE id = $1"#,
)
.bind(node_id)
.bind(transcoded_hash)
.bind(thumbnail_hash)
.bind(duration_ms)
.execute(db)
.await
.map_err(|e| format!("Kunne ikke oppdatere node metadata: {e}"))?;
    // Log resource usage
let collection_id: Option<Uuid> = sqlx::query_scalar(
"SELECT e.target_id FROM edges e
JOIN nodes n ON n.id = e.target_id
WHERE e.source_id = $1 AND e.edge_type = 'belongs_to' AND n.node_kind = 'collection'
LIMIT 1",
)
.bind(node_id)
.fetch_optional(db)
.await
.ok()
.flatten();
let detail = serde_json::json!({
"original_hash": original_hash,
"transcoded_hash": transcoded_hash,
"thumbnail_hash": thumbnail_hash,
"duration_ms": duration_ms,
"transcoded_size_bytes": transcoded_size,
});
if let Err(e) = sqlx::query(
"INSERT INTO resource_usage_log (target_node_id, triggered_by, collection_id, resource_type, detail)
VALUES ($1, $2, $3, $4, $5)",
)
.bind(node_id)
.bind(requested_by)
.bind(collection_id)
.bind("ffmpeg_video")
.bind(&detail)
.execute(db)
.await
{
tracing::warn!(error = %e, "Kunne ikke logge ressursforbruk");
}
Ok(())
}
// ─── Tests ───────────────────────────────────────────────────────
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn thumbnail_time_short_video() {
assert_eq!(pick_thumbnail_time(500), 0.0); // < 2s → 0
assert_eq!(pick_thumbnail_time(2000), 0.0); // = 2s → 0
}
#[test]
fn thumbnail_time_medium_video() {
let t = pick_thumbnail_time(3000);
assert!(t > 0.0 && t <= 1.0);
}
#[test]
fn thumbnail_time_long_video() {
assert_eq!(pick_thumbnail_time(60000), 1.0); // 60s → 1s
assert_eq!(pick_thumbnail_time(5000), 1.0); // 5s → 1s
}
#[test]
fn cas_path_format() {
let p = synops_common::cas::path("/srv/synops/media/cas", "b94d27b9934d3e08");
assert_eq!(
p,
PathBuf::from("/srv/synops/media/cas/b9/4d/b94d27b9934d3e08")
);
}
#[test]
fn payload_json_deserialization() {
let json = r#"{"cas_hash":"abc123"}"#;
let p: Payload = serde_json::from_str(json).unwrap();
assert_eq!(p.cas_hash, "abc123");
assert!(p.node_id.is_none());
assert!(!p.write);
}
#[test]
fn payload_json_full() {
let json = r#"{
"cas_hash": "abc123",
"node_id": "0195c8a0-0000-7000-8000-000000000001",
"requested_by": "0195c8a0-0000-7000-8000-000000000002",
"write": true
}"#;
let p: Payload = serde_json::from_str(json).unwrap();
assert_eq!(p.cas_hash, "abc123");
assert!(p.node_id.is_some());
assert!(p.write);
}
}
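
A side note on `get_duration` above: with `-print_format json -show_format`, ffprobe reports `format.duration` as a decimal string of seconds, which the tool converts to whole milliseconds. A minimal, dependency-free sketch of that conversion (the function name `duration_str_to_ms` is mine for illustration, not part of the crate):

```rust
// Convert ffprobe's "duration" field (a decimal-seconds string such as
// "12.345000") to whole milliseconds, mirroring the logic in get_duration.
// Returns None when the string is not a parseable number (ffprobe can emit "N/A").
fn duration_str_to_ms(s: &str) -> Option<i64> {
    let secs: f64 = s.parse().ok()?;
    Some((secs * 1000.0) as i64)
}

fn main() {
    assert_eq!(duration_str_to_ms("12.345000"), Some(12345));
    assert_eq!(duration_str_to_ms("0.5"), Some(500));
    assert_eq!(duration_str_to_ms("N/A"), None);
    println!("ok");
}
```

Note the truncation: `(secs * 1000.0) as i64` rounds toward zero, matching the cast in `get_duration`.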
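
The `cas_path_format` test above implies a two-level fan-out layout for the CAS: the first two hex byte pairs of the hash become subdirectories. The sketch below shows what `synops_common::cas::path` presumably computes under that layout; it is an inference from the test expectation, not the crate's actual source:

```rust
use std::path::PathBuf;

// Two-level fan-out: <root>/<hash[0..2]>/<hash[2..4]>/<hash>.
// Sharding on the first two byte pairs keeps any single directory
// from accumulating an unbounded number of entries.
fn cas_path(cas_root: &str, hash: &str) -> PathBuf {
    PathBuf::from(cas_root)
        .join(&hash[0..2])
        .join(&hash[2..4])
        .join(hash)
}

fn main() {
    let p = cas_path("/srv/synops/media/cas", "b94d27b9934d3e08");
    assert_eq!(
        p,
        PathBuf::from("/srv/synops/media/cas/b9/4d/b94d27b9934d3e08")
    );
    println!("{}", p.display());
}
```

With 256 × 256 shard directories, even tens of millions of stored blobs stay at manageable per-directory counts, which is the usual motivation for this layout.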