2024-11-22
read time: 3 min
cat ~/posts/rust/building-terminal-velocity.md

In this post, I'll walk through the design and implementation of Terminal Velocity, a modern static site generator built in Rust. Unlike many existing static site generators, Terminal Velocity focuses on developer experience with features like hot reloading, AI-assisted content generation, and syntax highlighting.

Architecture Overview

The generator follows a modular architecture with clear separation of concerns. Here's the high-level flow:

flowchart TD
    A[Markdown Files] --> B[Parser]
    B --> C[Generator]
    C --> D[HTML Output]
    
    subgraph Parser
        E[Frontmatter] --> F[Markdown]
        F --> G[Code Highlighting]
    end
    
    subgraph Generator
        H[Templates] --> I[Asset Pipeline]
        I --> J[Output Generation]
    end
    
    K[Static Files] --> I
    L[LLM API] --> M[Content Generation]
    M --> A
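
In code, that flow reduces to a thin top-level pipeline. Here's a simplified sketch of a full build; the `Parser` and `Generator` names come from the diagram above, but the method names are illustrative rather than the exact API:

pub fn build_site(site_dir: &Path, output_dir: &Path) -> Result<(), Error> {
    // Parse frontmatter, Markdown, and code blocks into in-memory posts.
    let posts = Parser::new(site_dir).parse_posts()?;

    // Render templates and run the asset pipeline into the output directory.
    let generator = Generator::new(site_dir, output_dir)?;
    generator.render_posts(&posts)?;
    generator.copy_static_files()?;

    Ok(())
}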

Key Features

Hot Reloading Implementation

One of the most developer-friendly features is the live preview with hot reloading. Here's how we implemented it using notify:

use std::path::PathBuf;
use std::sync::mpsc;
use std::time::{Duration, Instant};

use notify::{RecursiveMode, Watcher};

pub struct Server {
    site_dir: PathBuf,
    output_dir: PathBuf,
    hot_reload: bool,
    port: u16,
}

impl Server {
    pub async fn run(self) -> std::io::Result<()> {
        // Keep the watcher alive for the lifetime of the server: dropping it
        // stops file watching, so it is held outside the `if` block below.
        let mut _watcher = None;
        if self.hot_reload {
            let (tx, rx) = mpsc::channel();
            let mut watcher = notify::recommended_watcher(move |res| {
                if let Ok(event) = res {
                    let _ = tx.send(event);
                }
            }).unwrap();

            // Watch directories
            for dir in ["posts", "templates", "static"].iter() {
                let path = self.site_dir.join(dir);
                if path.exists() {
                    watcher.watch(&path, RecursiveMode::Recursive).unwrap();
                }
            }
            _watcher = Some(watcher);

            std::thread::spawn(move || {
                let mut last_build = Instant::now();
                let debounce_duration = Duration::from_millis(500);

                while let Ok(_event) = rx.recv() {
                    if last_build.elapsed() >= debounce_duration {
                        // Trigger rebuild
                        println!("🔄 Rebuilding site...");
                        last_build = Instant::now();
                    }
                }
            });
        }
        // Server implementation...
    }
}
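
For context, the CLI's serve command would construct and run the server along these lines (the values below are placeholders, not the project's real defaults):

// Hypothetical invocation from the CLI layer; field values are illustrative.
let server = Server {
    site_dir: PathBuf::from("."),
    output_dir: PathBuf::from("dist"),
    hot_reload: true,
    port: 8000,
};
server.run().await?;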

Syntax Highlighting

We implemented a robust syntax highlighting system using syntect and pulldown-cmark. Here's how we process code blocks:

stateDiagram-v2
    [*] --> Parsing: Markdown Input
    Parsing --> CodeBlock: Code Fence Detected
    Parsing --> RegularText: Regular Text
    
    RegularText --> HTMLOutput: Standard MD Rendering
    CodeBlock --> LanguageDetection
    LanguageDetection --> SyntaxHighlighting
    SyntaxHighlighting --> HTMLOutput
    
    HTMLOutput --> [*]

The processor intercepts fenced code blocks and routes them through the highlighter; inline code and everything else falls through to pulldown-cmark's standard HTML renderer:

use pulldown_cmark::{html, CodeBlockKind, Event, Options, Parser as MarkdownParser, Tag, TagEnd};
use syntect::highlighting::Theme;
use syntect::parsing::SyntaxSet;

pub struct MarkdownProcessor {
    syntax_set: SyntaxSet,
    theme: Theme,
    options: Options,
}

impl MarkdownProcessor {
    pub fn render(&self, content: &str) -> String {
        let parser = MarkdownParser::new_ext(content, self.options);
        let mut events = Vec::new();
        let mut code_buffer = String::new();
        let mut in_code_block = false;
        let mut current_lang = None;

        for event in parser {
            match event {
                Event::Start(Tag::CodeBlock(CodeBlockKind::Fenced(lang))) => {
                    in_code_block = true;
                    current_lang = Some(lang.as_ref().to_string());
                    continue;
                }
                Event::End(TagEnd::CodeBlock) => {
                    let highlighted = self.highlight_code(
                        &code_buffer,
                        current_lang.as_deref(),
                    );
                    events.push(Event::Html(highlighted.into()));
                    code_buffer.clear();
                    in_code_block = false;
                    current_lang = None;
                    continue;
                }
                Event::Text(text) => {
                    if in_code_block {
                        code_buffer.push_str(&text);
                    } else {
                        events.push(Event::Text(text));
                    }
                    continue;
                }
                _ => {}
            }
            
            if !in_code_block {
                events.push(event);
            }
        }

        let mut html_output = String::new();
        html::push_html(&mut html_output, events.into_iter());
        html_output
    }
}
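
The `highlight_code` helper is elided above. A minimal sketch of what it might look like on top of syntect's built-in HTML generator is shown below; the exact markup the real implementation emits (for example the `language-*` class the test later asserts on) is an assumption.

use syntect::html::highlighted_html_for_string;

impl MarkdownProcessor {
    // Sketch only: map the fence's language token to a syntect syntax,
    // falling back to plain text, then emit highlighted HTML.
    fn highlight_code(&self, code: &str, lang: Option<&str>) -> String {
        let syntax = lang
            .and_then(|token| self.syntax_set.find_syntax_by_token(token))
            .unwrap_or_else(|| self.syntax_set.find_syntax_plain_text());
        highlighted_html_for_string(code, &self.syntax_set, syntax, &self.theme)
            // On error, fall back to an unstyled block. A real implementation
            // should HTML-escape `code` before interpolating it here.
            .unwrap_or_else(|_| format!("<pre><code>{}</code></pre>", code))
    }
}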

AI-Assisted Content Generation

One unique feature is the integration with Claude for content generation. Here's the flow of our AI integration:

sequenceDiagram
    participant User
    participant CLI
    participant API
    participant Claude
    participant Editor

    User->>CLI: termv new "Post Title" --prompt "Topic"
    CLI->>API: Generate outline request
    API->>Claude: API call with prompt
    Claude->>API: Returns outline
    API->>CLI: Outline response
    CLI->>Editor: Opens new post with outline
    Editor->>User: Edit post

The API integration is handled through a clean interface:

use reqwest::header::{HeaderMap, HeaderValue, ACCEPT, CONTENT_TYPE};

pub async fn generate_outline(prompt: &str, api_key: Option<&str>) -> Result<String, Error> {
    let api_key = api_key.ok_or(Error::MissingApiKey)?;
    let client = reqwest::Client::new();

    let mut headers = HeaderMap::new();
    headers.insert(
        "x-api-key",
        HeaderValue::from_str(api_key).map_err(|e| Error::Api(e.to_string()))?,
    );
    headers.insert("anthropic-version", HeaderValue::from_static("2023-06-01"));
    headers.insert(ACCEPT, HeaderValue::from_static("application/json"));
    headers.insert(CONTENT_TYPE, HeaderValue::from_static("application/json"));

    let msg = format!(
        "Generate a detailed blog post outline for: {}",
        prompt
    );

    let request = MessageRequest {
        model: "claude-3-haiku-20240307".to_string(),
        max_tokens: 1000,
        messages: vec![Message {
            role: "user".to_string(),
            content: msg,
        }],
    };

    // API call implementation...
}
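
The `MessageRequest` and `Message` types are plain serde structs mirroring the Anthropic Messages API. Their assumed shape is below; the project's actual definitions may differ slightly:

use serde::Serialize;

// Assumed request types for the Messages API call above.
#[derive(Serialize)]
struct MessageRequest {
    model: String,
    max_tokens: u32,
    messages: Vec<Message>,
}

#[derive(Serialize)]
struct Message {
    role: String,
    content: String,
}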

Error Handling Strategy

We use a single comprehensive error type, built with thiserror, that covers the generator's failure modes:

#[derive(thiserror::Error, Debug)]
pub enum Error {
    #[error("Directory not found: {0}")]
    DirectoryNotFound(PathBuf),

    #[error("Required directory missing: {0}")]
    MissingDirectory(String),

    #[error("IO error: {0}")]
    Io(#[from] std::io::Error),

    #[error("Template error: {0}")]
    Template(#[from] tera::Error),

    #[error("Git error: {0}")]
    Git(#[from] git2::Error),

    #[error("API error: {0}")]
    Api(String),

    #[error("Missing API key")]
    MissingApiKey,

    #[error(transparent)]
    Other(#[from] Box<dyn std::error::Error + Send + Sync>),
}
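
The `#[from]` conversions are what keep call sites terse: any function returning `Result<_, Error>` can use `?` on IO, template, or git operations directly. A small illustrative example (`render_index` is hypothetical, not part of the real codebase):

use std::path::Path;

// Hypothetical helper showing the automatic error conversions in action.
fn render_index(tera: &tera::Tera, output_dir: &Path) -> Result<(), Error> {
    let html = tera.render("index.html", &tera::Context::new())?; // tera::Error -> Error::Template
    std::fs::write(output_dir.join("index.html"), html)?;         // io::Error  -> Error::Io
    Ok(())
}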

Performance Considerations

The generator is designed to be fast and memory-efficient, and the incremental builds and parallel processing described under Future Improvements below are aimed at pushing build times down further.

Testing Strategy

We use a combination of unit tests and integration tests. Here's an example test for our Markdown processor:

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_mixed_content() {
        let processor = MarkdownProcessor::new();
        let input = r#"# Test

Some text

```python
def hello():
    print("Hello")
```

More text"#;
        let output = processor.render(input);
        assert!(output.contains("<h1>"));
        assert!(output.contains("language-python"));
        assert!(output.contains("<p>Some text</p>"));
        assert!(output.contains("<p>More text</p>"));
    }
}

Future Improvements

While Terminal Velocity is already a capable static site generator, there are several areas where we plan to enhance its functionality:

Incremental Builds

Currently, the generator rebuilds all files on every change. We could significantly improve build times by implementing incremental builds:

use std::collections::HashMap;
use std::fs;
use std::path::{Path, PathBuf};

pub struct BuildCache {
    file_hashes: HashMap<PathBuf, String>,
    template_hash: String,
}

impl BuildCache {
    pub fn needs_rebuild(&self, path: &Path) -> bool {
        if let Ok(content) = fs::read_to_string(path) {
            let hash = blake3::hash(content.as_bytes()).to_string();
            self.file_hashes.get(path).map_or(true, |old_hash| *old_hash != hash)
        } else {
            true
        }
    }
}
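
Wired into the build loop, the cache would gate each post and record the new hash after a successful rebuild. A sketch only; `rebuild_post` is a placeholder:

impl BuildCache {
    // Sketch: skip unchanged posts, then remember the hash of anything rebuilt.
    pub fn rebuild_changed(&mut self, posts: &[PathBuf]) -> Result<(), Error> {
        for path in posts {
            if !self.needs_rebuild(path) {
                continue;
            }
            // rebuild_post(path)?; // elided
            if let Ok(content) = fs::read_to_string(path) {
                let hash = blake3::hash(content.as_bytes()).to_string();
                self.file_hashes.insert(path.clone(), hash);
            }
        }
        Ok(())
    }
}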

Parallel Processing

We can leverage Rayon to parallelize markdown processing:

use rayon::prelude::*;

impl SiteGenerator {
    fn process_posts(&self) -> Result<Vec<Post>, Error> {
        // par_iter() fans the posts out across Rayon's thread pool; collecting
        // into Result short-circuits on the first error.
        let posts: Result<Vec<_>, _> = self.collect_post_files()?
            .par_iter()
            .map(|path| self.process_post(path))
            .collect();
        posts
    }
}

Plugin System

A flexible plugin system would allow users to extend the generator's functionality:

pub trait Plugin: Send + Sync {
    fn name(&self) -> &'static str;
    fn pre_process(&self, content: &str) -> Result<String, Error>;
    fn post_process(&self, html: &str) -> Result<String, Error>;
}

pub struct PluginRegistry {
    plugins: Vec<Box<dyn Plugin>>,
}

impl PluginRegistry {
    pub fn register(&mut self, plugin: Box<dyn Plugin>) {
        self.plugins.push(plugin);
    }
}
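
As a usage sketch, a hypothetical plugin that injects an estimated reading time could implement the trait and be registered like this (names are illustrative):

// Hypothetical plugin: prepends an estimated reading time to each post.
struct ReadingTimePlugin;

impl Plugin for ReadingTimePlugin {
    fn name(&self) -> &'static str {
        "reading-time"
    }

    fn pre_process(&self, content: &str) -> Result<String, Error> {
        let minutes = (content.split_whitespace().count() / 200).max(1);
        Ok(format!("*{} min read*\n\n{}", minutes, content))
    }

    fn post_process(&self, html: &str) -> Result<String, Error> {
        Ok(html.to_string())
    }
}

// Registration at startup:
let mut registry = PluginRegistry { plugins: Vec::new() };
registry.register(Box::new(ReadingTimePlugin));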

Asset Optimization

We also plan to optimize images, CSS, and JavaScript automatically as part of the build:

pub struct AssetPipeline {
    css_minifier: Box<dyn CssMinifier>,
    js_minifier: Box<dyn JsMinifier>,
    image_optimizer: Box<dyn ImageOptimizer>,
}

impl AssetPipeline {
    pub async fn process_assets(&self, dir: &Path) -> Result<(), Error> {
        // The three optimizers return distinct future types, so join them
        // with try_join! instead of collecting them into a Vec.
        futures::try_join!(
            self.optimize_css(dir),
            self.optimize_js(dir),
            self.optimize_images(dir)
        )?;
        Ok(())
    }
}

SSR Components

We'd also like to support server-side rendering of React/Vue components:

pub struct ComponentRenderer {
    runtime: deno_core::JsRuntime,
    cache: HashMap<PathBuf, CompiledComponent>,
}

impl ComponentRenderer {
    pub async fn render_component(&self, path: &Path, props: Value) -> Result<String, Error> {
        let component = self.get_or_compile_component(path).await?;
        let html = component.render(props).await?;
        Ok(html)
    }
}

Community and Contributions

Terminal Velocity is open source and we welcome contributions; the GitHub issue tracker is the best place to find areas where you can help.

Conclusion

Building a static site generator in Rust has been an enlightening journey that showcases the language's strengths in building reliable, high-performance tools. The type system helped prevent many potential bugs, while the rich ecosystem of crates made it possible to implement advanced features with reasonable effort.

Some key takeaways from this project:

  1. Rust's async ecosystem is mature enough for real-world applications
  2. The type system helps enforce clean architecture
  3. Error handling is verbose but comprehensive
  4. Performance comes naturally with Rust's zero-cost abstractions
  5. Integration with modern AI tools can enhance developer experience

We look forward to seeing how the community uses and improves Terminal Velocity. Whether you're building a personal blog or a documentation site, we hope this tool makes your content creation process more enjoyable and efficient.

Check out the GitHub repository to get started, and don't hesitate to open issues or submit pull requests!


This post was generated on November 20, 2024, using Terminal Velocity's AI-assisted content generation feature.
