feat: Add WebBuilder library for website generation
- Adds a new library for building websites from configuration files and markdown content. Improves developer workflow by automating website construction. - Implements multiple parsing strategies for configuration files (Hjson, Simple, Auto) for flexibility and backward compatibility. - Includes support for cloning Git repositories, processing markdown, and uploading files to IPFS, streamlining the website deployment process. Facilitates easier website updates and content management. - Adds comprehensive README documentation explaining the library's usage and configuration options. Improves user onboarding and reduces the learning curve for new users.
This commit is contained in:
parent
ea25db7d29
commit
f9d338a8f1
128
webbuilder/README.md
Normal file
128
webbuilder/README.md
Normal file
@ -0,0 +1,128 @@
|
||||
# WebBuilder
|
||||
|
||||
WebBuilder is a library for building websites from configuration files and markdown content. It uses the DocTree library to process markdown content and includes, and exports the result to a webmeta.json file that can be used by a browser-based website generator.
|
||||
|
||||
## Overview
|
||||
|
||||
WebBuilder scans directories for configuration files (in hjson format) and generates a `webmeta.json` file that can be used by a browser-based website generator. It can also clone Git repositories, process markdown content, and upload files to IPFS.
|
||||
|
||||
## Parsing Configuration Files
|
||||
|
||||
WebBuilder supports multiple parsing strategies for configuration files:
|
||||
|
||||
### Unified Parser
|
||||
|
||||
The recommended way to parse configuration files is to use the unified parser, which provides a consistent interface for all parsing strategies:
|
||||
|
||||
```rust
|
||||
use webbuilder::{from_directory_with_strategy, ParsingStrategy};
|
||||
|
||||
// Use the recommended strategy (Hjson)
|
||||
let webbuilder = from_directory_with_strategy("path/to/config", ParsingStrategy::Hjson)?;
|
||||
|
||||
// Or use the auto-detect strategy
|
||||
let webbuilder = from_directory_with_strategy("path/to/config", ParsingStrategy::Auto)?;
|
||||
|
||||
// Or use the simple strategy (legacy)
|
||||
let webbuilder = from_directory_with_strategy("path/to/config", ParsingStrategy::Simple)?;
|
||||
```
|
||||
|
||||
You can also use the convenience functions:
|
||||
|
||||
```rust
|
||||
use webbuilder::{from_directory, parse_site_config_recommended, parse_site_config_auto};
|
||||
|
||||
// Use the recommended strategy (Hjson)
|
||||
let webbuilder = from_directory("path/to/config")?;
|
||||
|
||||
// Or parse the site configuration directly
|
||||
let site_config = parse_site_config_recommended("path/to/config")?;
|
||||
let site_config = parse_site_config_auto("path/to/config")?;
|
||||
```
|
||||
|
||||
### Parsing Strategies
|
||||
|
||||
WebBuilder supports the following parsing strategies:
|
||||
|
||||
- **Hjson**: Uses the `deser-hjson` library to parse hjson files. This is the recommended strategy.
|
||||
- **Simple**: Uses a simple line-by-line parser that doesn't rely on external libraries. This is a legacy strategy.
|
||||
- **Auto**: Tries the Hjson parser first, and falls back to the simple parser if it fails.
|
||||
|
||||
## Building a Website
|
||||
|
||||
Once you have a WebBuilder instance, you can build a website:
|
||||
|
||||
```rust
|
||||
use webbuilder::from_directory;
|
||||
|
||||
// Create a WebBuilder instance
|
||||
let webbuilder = from_directory("path/to/config")?;
|
||||
|
||||
// Build the website
|
||||
let webmeta = webbuilder.build()?;
|
||||
|
||||
// Save the webmeta.json file
|
||||
webmeta.save("webmeta.json")?;
|
||||
|
||||
// Upload the webmeta.json file to IPFS
|
||||
let ipfs_hash = webbuilder.upload_to_ipfs("webmeta.json")?;
|
||||
println!("Uploaded to IPFS: {}", ipfs_hash);
|
||||
```
|
||||
|
||||
## Configuration Files
|
||||
|
||||
WebBuilder expects the following configuration files:
|
||||
|
||||
- `main.hjson`: Main configuration file with site metadata
|
||||
- `header.hjson`: Header configuration
|
||||
- `footer.hjson`: Footer configuration
|
||||
- `collection.hjson`: Collection configuration (Git repositories)
|
||||
- `pages/*.hjson`: Page configuration files
|
||||
|
||||
Example `main.hjson`:
|
||||
|
||||
```hjson
|
||||
{
|
||||
"name": "my-site",
|
||||
"title": "My Site",
|
||||
"description": "My awesome site",
|
||||
"url": "https://example.com",
|
||||
"favicon": "favicon.ico",
|
||||
"keywords": [
|
||||
"website",
|
||||
"awesome"
|
||||
]
|
||||
}
|
||||
```
|
||||
|
||||
Example `collection.hjson`:
|
||||
|
||||
```hjson
|
||||
[
|
||||
{
|
||||
"name": "docs",
|
||||
"url": "https://github.com/example/docs.git",
|
||||
"description": "Documentation",
|
||||
"scan": true
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
Example `pages/pages.hjson`:
|
||||
|
||||
```hjson
|
||||
[
|
||||
{
|
||||
"name": "home",
|
||||
"title": "Home",
|
||||
"description": "Home page",
|
||||
"navpath": "/",
|
||||
"collection": "docs",
|
||||
"draft": false
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
## License
|
||||
|
||||
This project is licensed under the MIT License - see the LICENSE file for details.
|
@ -4,7 +4,7 @@ use std::path::Path;
|
||||
|
||||
use crate::config::SiteConfig;
|
||||
use crate::error::Result;
|
||||
use crate::parser_hjson;
|
||||
use crate::parser;
|
||||
|
||||
#[cfg(test)]
|
||||
mod mod_test;
|
||||
@ -122,7 +122,7 @@ impl WebBuilder {
|
||||
///
|
||||
/// A new WebBuilder instance or an error
|
||||
pub fn from_directory<P: AsRef<Path>>(path: P) -> Result<Self> {
|
||||
let config = parser_hjson::parse_site_config(path)?;
|
||||
let config = parser::parse_site_config_recommended(path)?;
|
||||
Ok(WebBuilder { config })
|
||||
}
|
||||
|
||||
|
@ -7,7 +7,8 @@ mod tests {
|
||||
#[test]
|
||||
fn test_clone_repository_error_invalid_destination() {
|
||||
// Test with a destination that has no parent directory
|
||||
let result = clone_repository("https://git.ourworld.tf/tfgrid/home.git", PathBuf::from("/"));
|
||||
// This URL is invalid because we added number 2 after `home`
|
||||
let result = clone_repository("https://git.ourworld.tf/tfgrid/home2.git", PathBuf::from("/"));
|
||||
|
||||
assert!(result.is_err());
|
||||
assert!(matches!(
|
||||
|
@ -9,8 +9,6 @@ pub mod error;
|
||||
pub mod git;
|
||||
pub mod ipfs;
|
||||
pub mod parser;
|
||||
pub mod parser_simple;
|
||||
pub mod parser_hjson;
|
||||
|
||||
#[cfg(test)]
|
||||
mod config_test;
|
||||
@ -21,19 +19,18 @@ mod git_test;
|
||||
#[cfg(test)]
|
||||
mod ipfs_test;
|
||||
#[cfg(test)]
|
||||
mod parser_simple_test;
|
||||
#[cfg(test)]
|
||||
mod parser_hjson_test;
|
||||
mod parser_test;
|
||||
|
||||
pub use builder::WebBuilder;
|
||||
pub use config::SiteConfig;
|
||||
pub use error::{Result, WebBuilderError};
|
||||
pub use parser::{ParsingStrategy, parse_site_config_with_strategy as parse_site_config, parse_site_config_recommended, parse_site_config_auto};
|
||||
|
||||
/// Create a new WebBuilder instance from a directory containing hjson configuration files.
|
||||
/// Create a new WebBuilder instance from a directory containing configuration files.
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the directory containing hjson configuration files
|
||||
/// * `path` - Path to the directory containing configuration files
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
@ -41,3 +38,22 @@ pub use error::{Result, WebBuilderError};
|
||||
pub fn from_directory<P: AsRef<std::path::Path>>(path: P) -> Result<WebBuilder> {
|
||||
WebBuilder::from_directory(path)
|
||||
}
|
||||
|
||||
/// Create a new WebBuilder instance from a directory containing configuration files,
|
||||
/// using the specified parsing strategy.
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the directory containing configuration files
|
||||
/// * `strategy` - Parsing strategy to use
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// A new WebBuilder instance or an error
|
||||
pub fn from_directory_with_strategy<P: AsRef<std::path::Path>>(
|
||||
path: P,
|
||||
strategy: ParsingStrategy,
|
||||
) -> Result<WebBuilder> {
|
||||
let config = parser::parse_site_config_with_strategy(path, strategy)?;
|
||||
Ok(WebBuilder { config })
|
||||
}
|
||||
|
@ -1,12 +1,116 @@
|
||||
use serde::de::DeserializeOwned;
|
||||
use serde_json;
|
||||
use std::fs;
|
||||
use std::path::Path;
|
||||
|
||||
use deser_hjson::from_str;
|
||||
use serde::de::DeserializeOwned;
|
||||
use serde_json::{self, Value};
|
||||
|
||||
use crate::config::{CollectionConfig, FooterConfig, HeaderConfig, PageConfig, SiteConfig};
|
||||
use crate::error::{Result, WebBuilderError};
|
||||
|
||||
/// Parse a hjson file into a struct
|
||||
/// Parsing strategy to use
|
||||
#[derive(Debug, Clone, Copy, PartialEq)]
|
||||
pub enum ParsingStrategy {
|
||||
/// Use the deser-hjson library (recommended)
|
||||
Hjson,
|
||||
/// Use a simple line-by-line parser (legacy)
|
||||
Simple,
|
||||
/// Auto-detect the best parser to use
|
||||
Auto,
|
||||
}
|
||||
|
||||
/// Parse a file into a struct using the specified strategy
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the file to parse
|
||||
/// * `strategy` - Parsing strategy to use
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// The parsed struct or an error
|
||||
pub fn parse_file<T, P>(path: P, strategy: ParsingStrategy) -> Result<T>
|
||||
where
|
||||
T: DeserializeOwned,
|
||||
P: AsRef<Path>,
|
||||
{
|
||||
let path = path.as_ref();
|
||||
|
||||
// Check if the file exists
|
||||
if !path.exists() {
|
||||
return Err(WebBuilderError::MissingFile(path.to_path_buf()));
|
||||
}
|
||||
|
||||
// Read the file
|
||||
let content = fs::read_to_string(path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
match strategy {
|
||||
ParsingStrategy::Hjson => {
|
||||
// Use the deser-hjson library
|
||||
from_str(&content).map_err(|e| WebBuilderError::HjsonError(format!("Error parsing {:?}: {}", path, e)))
|
||||
}
|
||||
ParsingStrategy::Simple => {
|
||||
// Use the simple parser - for this we need to handle the file reading ourselves
|
||||
// since the original parse_hjson function does that internally
|
||||
let path_ref: &Path = path.as_ref();
|
||||
|
||||
// Check if the file exists
|
||||
if !path_ref.exists() {
|
||||
return Err(WebBuilderError::MissingFile(path_ref.to_path_buf()));
|
||||
}
|
||||
|
||||
// Read the file
|
||||
let content = fs::read_to_string(path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
// First try to parse as JSON
|
||||
let json_result = serde_json::from_str::<T>(&content);
|
||||
if json_result.is_ok() {
|
||||
return Ok(json_result.unwrap());
|
||||
}
|
||||
|
||||
// If that fails, try to convert hjson to json using a simple approach
|
||||
let json_content = convert_hjson_to_json(&content)?;
|
||||
|
||||
// Parse the JSON
|
||||
serde_json::from_str(&json_content)
|
||||
.map_err(|e| WebBuilderError::HjsonError(format!("Error parsing {:?}: {}", path, e)))
|
||||
}
|
||||
ParsingStrategy::Auto => {
|
||||
// Try the hjson parser first, fall back to simple if it fails
|
||||
match from_str(&content) {
|
||||
Ok(result) => Ok(result),
|
||||
Err(e) => {
|
||||
log::warn!("Hjson parser failed: {}, falling back to simple parser", e);
|
||||
// Call the simple parser directly
|
||||
let path_ref: &Path = path.as_ref();
|
||||
|
||||
// Check if the file exists
|
||||
if !path_ref.exists() {
|
||||
return Err(WebBuilderError::MissingFile(path_ref.to_path_buf()));
|
||||
}
|
||||
|
||||
// Read the file
|
||||
let content = fs::read_to_string(path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
// First try to parse as JSON
|
||||
let json_result = serde_json::from_str::<T>(&content);
|
||||
if json_result.is_ok() {
|
||||
return Ok(json_result.unwrap());
|
||||
}
|
||||
|
||||
// If that fails, try to convert hjson to json using a simple approach
|
||||
let json_content = convert_hjson_to_json(&content)?;
|
||||
|
||||
// Parse the JSON
|
||||
serde_json::from_str(&json_content)
|
||||
.map_err(|e| WebBuilderError::HjsonError(format!("Error parsing {:?}: {}", path, e)))
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Parse a hjson file into a struct using the simple parser
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
@ -262,3 +366,152 @@ pub fn parse_site_config<P: AsRef<Path>>(path: P) -> Result<SiteConfig> {
|
||||
|
||||
Ok(site_config)
|
||||
}
|
||||
|
||||
/// Parse site configuration from a directory using the specified strategy
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the directory containing configuration files
|
||||
/// * `strategy` - Parsing strategy to use
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// The parsed site configuration or an error
|
||||
pub fn parse_site_config_with_strategy<P: AsRef<Path>>(path: P, strategy: ParsingStrategy) -> Result<SiteConfig> {
|
||||
let path = path.as_ref();
|
||||
|
||||
// Check if the directory exists
|
||||
if !path.exists() {
|
||||
return Err(WebBuilderError::MissingDirectory(path.to_path_buf()));
|
||||
}
|
||||
|
||||
// Check if the directory is a directory
|
||||
if !path.is_dir() {
|
||||
return Err(WebBuilderError::InvalidConfiguration(format!(
|
||||
"{:?} is not a directory",
|
||||
path
|
||||
)));
|
||||
}
|
||||
|
||||
// Create a basic site configuration
|
||||
let mut site_config = SiteConfig {
|
||||
name: "default".to_string(),
|
||||
title: "".to_string(),
|
||||
description: None,
|
||||
keywords: None,
|
||||
url: None,
|
||||
favicon: None,
|
||||
header: None,
|
||||
footer: None,
|
||||
collections: Vec::new(),
|
||||
pages: Vec::new(),
|
||||
base_path: path.to_path_buf(),
|
||||
};
|
||||
|
||||
// Parse main.hjson
|
||||
let main_path = path.join("main.hjson");
|
||||
if main_path.exists() {
|
||||
let main_config: Value = parse_file(main_path, strategy)?;
|
||||
|
||||
// Extract values from main.hjson
|
||||
if let Some(name) = main_config.get("name").and_then(|v| v.as_str()) {
|
||||
site_config.name = name.to_string();
|
||||
}
|
||||
if let Some(title) = main_config.get("title").and_then(|v| v.as_str()) {
|
||||
site_config.title = title.to_string();
|
||||
}
|
||||
if let Some(description) = main_config.get("description").and_then(|v| v.as_str()) {
|
||||
site_config.description = Some(description.to_string());
|
||||
}
|
||||
if let Some(url) = main_config.get("url").and_then(|v| v.as_str()) {
|
||||
site_config.url = Some(url.to_string());
|
||||
}
|
||||
if let Some(favicon) = main_config.get("favicon").and_then(|v| v.as_str()) {
|
||||
site_config.favicon = Some(favicon.to_string());
|
||||
}
|
||||
if let Some(keywords) = main_config.get("keywords").and_then(|v| v.as_array()) {
|
||||
let keywords_vec: Vec<String> = keywords
|
||||
.iter()
|
||||
.filter_map(|k| k.as_str().map(|s| s.to_string()))
|
||||
.collect();
|
||||
if !keywords_vec.is_empty() {
|
||||
site_config.keywords = Some(keywords_vec);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Parse header.hjson
|
||||
let header_path = path.join("header.hjson");
|
||||
if header_path.exists() {
|
||||
site_config.header = Some(parse_file(header_path, strategy)?);
|
||||
}
|
||||
|
||||
// Parse footer.hjson
|
||||
let footer_path = path.join("footer.hjson");
|
||||
if footer_path.exists() {
|
||||
site_config.footer = Some(parse_file(footer_path, strategy)?);
|
||||
}
|
||||
|
||||
// Parse collection.hjson
|
||||
let collection_path = path.join("collection.hjson");
|
||||
if collection_path.exists() {
|
||||
let collection_array: Vec<CollectionConfig> = parse_file(collection_path, strategy)?;
|
||||
|
||||
// Process each collection
|
||||
for mut collection in collection_array {
|
||||
// Convert web interface URL to Git URL if needed
|
||||
if let Some(url) = &collection.url {
|
||||
if url.contains("/src/branch/") {
|
||||
// This is a web interface URL, convert it to a Git URL
|
||||
let parts: Vec<&str> = url.split("/src/branch/").collect();
|
||||
if parts.len() == 2 {
|
||||
collection.url = Some(format!("{}.git", parts[0]));
|
||||
}
|
||||
}
|
||||
}
|
||||
site_config.collections.push(collection);
|
||||
}
|
||||
}
|
||||
|
||||
// Parse pages directory
|
||||
let pages_path = path.join("pages");
|
||||
if pages_path.exists() && pages_path.is_dir() {
|
||||
for entry in fs::read_dir(pages_path)? {
|
||||
let entry = entry?;
|
||||
let entry_path = entry.path();
|
||||
|
||||
if entry_path.is_file() && entry_path.extension().map_or(false, |ext| ext == "hjson") {
|
||||
let pages_array: Vec<PageConfig> = parse_file(&entry_path, strategy)?;
|
||||
site_config.pages.extend(pages_array);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(site_config)
|
||||
}
|
||||
|
||||
/// Parse site configuration from a directory using the recommended strategy (Hjson)
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the directory containing configuration files
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// The parsed site configuration or an error
|
||||
pub fn parse_site_config_recommended<P: AsRef<Path>>(path: P) -> Result<SiteConfig> {
|
||||
parse_site_config_with_strategy(path, ParsingStrategy::Hjson)
|
||||
}
|
||||
|
||||
/// Parse site configuration from a directory using the auto-detect strategy
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the directory containing configuration files
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// The parsed site configuration or an error
|
||||
pub fn parse_site_config_auto<P: AsRef<Path>>(path: P) -> Result<SiteConfig> {
|
||||
parse_site_config_with_strategy(path, ParsingStrategy::Auto)
|
||||
}
|
||||
|
@ -1,161 +0,0 @@
|
||||
use std::fs;
|
||||
use std::path::Path;
|
||||
|
||||
use deser_hjson::from_str;
|
||||
use serde::de::DeserializeOwned;
|
||||
use serde_json::Value;
|
||||
|
||||
use crate::config::{
|
||||
CollectionConfig, PageConfig, SiteConfig,
|
||||
};
|
||||
use crate::error::{Result, WebBuilderError};
|
||||
|
||||
/// Parse a hjson file into a struct
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the hjson file
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// The parsed struct or an error
|
||||
pub fn parse_hjson<T, P>(path: P) -> Result<T>
|
||||
where
|
||||
T: DeserializeOwned,
|
||||
P: AsRef<Path>,
|
||||
{
|
||||
let path = path.as_ref();
|
||||
|
||||
// Check if the file exists
|
||||
if !path.exists() {
|
||||
return Err(WebBuilderError::MissingFile(path.to_path_buf()));
|
||||
}
|
||||
|
||||
// Read the file
|
||||
let content = fs::read_to_string(path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
// Parse the hjson
|
||||
from_str(&content).map_err(|e| WebBuilderError::HjsonError(format!("Error parsing {:?}: {}", path, e)))
|
||||
}
|
||||
|
||||
/// Parse site configuration from a directory
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the directory containing hjson configuration files
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// The parsed site configuration or an error
|
||||
pub fn parse_site_config<P: AsRef<Path>>(path: P) -> Result<SiteConfig> {
|
||||
let path = path.as_ref();
|
||||
|
||||
// Check if the directory exists
|
||||
if !path.exists() {
|
||||
return Err(WebBuilderError::MissingDirectory(path.to_path_buf()));
|
||||
}
|
||||
|
||||
// Check if the directory is a directory
|
||||
if !path.is_dir() {
|
||||
return Err(WebBuilderError::InvalidConfiguration(format!(
|
||||
"{:?} is not a directory",
|
||||
path
|
||||
)));
|
||||
}
|
||||
|
||||
// Create a basic site configuration
|
||||
let mut site_config = SiteConfig {
|
||||
name: "default".to_string(),
|
||||
title: "".to_string(),
|
||||
description: None,
|
||||
keywords: None,
|
||||
url: None,
|
||||
favicon: None,
|
||||
header: None,
|
||||
footer: None,
|
||||
collections: Vec::new(),
|
||||
pages: Vec::new(),
|
||||
base_path: path.to_path_buf(),
|
||||
};
|
||||
|
||||
// Parse main.hjson
|
||||
let main_path = path.join("main.hjson");
|
||||
if main_path.exists() {
|
||||
let main_config: Value = parse_hjson(main_path)?;
|
||||
|
||||
// Extract values from main.hjson
|
||||
if let Some(name) = main_config.get("name").and_then(|v| v.as_str()) {
|
||||
site_config.name = name.to_string();
|
||||
}
|
||||
if let Some(title) = main_config.get("title").and_then(|v| v.as_str()) {
|
||||
site_config.title = title.to_string();
|
||||
}
|
||||
if let Some(description) = main_config.get("description").and_then(|v| v.as_str()) {
|
||||
site_config.description = Some(description.to_string());
|
||||
}
|
||||
if let Some(url) = main_config.get("url").and_then(|v| v.as_str()) {
|
||||
site_config.url = Some(url.to_string());
|
||||
}
|
||||
if let Some(favicon) = main_config.get("favicon").and_then(|v| v.as_str()) {
|
||||
site_config.favicon = Some(favicon.to_string());
|
||||
}
|
||||
if let Some(keywords) = main_config.get("keywords").and_then(|v| v.as_array()) {
|
||||
let keywords_vec: Vec<String> = keywords
|
||||
.iter()
|
||||
.filter_map(|k| k.as_str().map(|s| s.to_string()))
|
||||
.collect();
|
||||
if !keywords_vec.is_empty() {
|
||||
site_config.keywords = Some(keywords_vec);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Parse header.hjson
|
||||
let header_path = path.join("header.hjson");
|
||||
if header_path.exists() {
|
||||
site_config.header = Some(parse_hjson(header_path)?);
|
||||
}
|
||||
|
||||
// Parse footer.hjson
|
||||
let footer_path = path.join("footer.hjson");
|
||||
if footer_path.exists() {
|
||||
site_config.footer = Some(parse_hjson(footer_path)?);
|
||||
}
|
||||
|
||||
// Parse collection.hjson
|
||||
let collection_path = path.join("collection.hjson");
|
||||
if collection_path.exists() {
|
||||
let collection_array: Vec<CollectionConfig> = parse_hjson(collection_path)?;
|
||||
|
||||
// Process each collection
|
||||
for mut collection in collection_array {
|
||||
// Convert web interface URL to Git URL if needed
|
||||
if let Some(url) = &collection.url {
|
||||
if url.contains("/src/branch/") {
|
||||
// This is a web interface URL, convert it to a Git URL
|
||||
let parts: Vec<&str> = url.split("/src/branch/").collect();
|
||||
if parts.len() == 2 {
|
||||
collection.url = Some(format!("{}.git", parts[0]));
|
||||
}
|
||||
}
|
||||
}
|
||||
site_config.collections.push(collection);
|
||||
}
|
||||
}
|
||||
|
||||
// Parse pages directory
|
||||
let pages_path = path.join("pages");
|
||||
if pages_path.exists() && pages_path.is_dir() {
|
||||
for entry in fs::read_dir(pages_path)? {
|
||||
let entry = entry?;
|
||||
let entry_path = entry.path();
|
||||
|
||||
if entry_path.is_file() && entry_path.extension().map_or(false, |ext| ext == "hjson") {
|
||||
let pages_array: Vec<PageConfig> = parse_hjson(&entry_path)?;
|
||||
site_config.pages.extend(pages_array);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(site_config)
|
||||
}
|
@ -1,277 +0,0 @@
|
||||
use std::fs;
|
||||
use std::path::Path;
|
||||
|
||||
use crate::config::{
|
||||
CollectionConfig, HeaderConfig, LogoConfig, PageConfig, SiteConfig,
|
||||
};
|
||||
use crate::error::{Result, WebBuilderError};
|
||||
|
||||
/// Parse site configuration from a directory using a simple approach
|
||||
///
|
||||
/// # Arguments
|
||||
///
|
||||
/// * `path` - Path to the directory containing hjson configuration files
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// The parsed site configuration or an error
|
||||
pub fn parse_site_config<P: AsRef<Path>>(path: P) -> Result<SiteConfig> {
|
||||
let path = path.as_ref();
|
||||
|
||||
// Check if the directory exists
|
||||
if !path.exists() {
|
||||
return Err(WebBuilderError::MissingDirectory(path.to_path_buf()));
|
||||
}
|
||||
|
||||
// Check if the directory is a directory
|
||||
if !path.is_dir() {
|
||||
return Err(WebBuilderError::InvalidConfiguration(format!(
|
||||
"{:?} is not a directory",
|
||||
path
|
||||
)));
|
||||
}
|
||||
|
||||
// Create a basic site configuration
|
||||
let mut site_config = SiteConfig {
|
||||
name: "default".to_string(),
|
||||
title: "".to_string(),
|
||||
description: None,
|
||||
keywords: None,
|
||||
url: None,
|
||||
favicon: None,
|
||||
header: None,
|
||||
footer: None,
|
||||
collections: Vec::new(),
|
||||
pages: Vec::new(),
|
||||
base_path: path.to_path_buf(),
|
||||
};
|
||||
|
||||
// Parse main.hjson
|
||||
let main_path = path.join("main.hjson");
|
||||
if main_path.exists() {
|
||||
let content = fs::read_to_string(&main_path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
// Extract values from main.hjson
|
||||
for line in content.lines() {
|
||||
let line = line.trim();
|
||||
|
||||
// Skip comments and empty lines
|
||||
if line.starts_with('#') || line.is_empty() {
|
||||
continue;
|
||||
}
|
||||
|
||||
// Parse key-value pairs
|
||||
if let Some(pos) = line.find(':') {
|
||||
let key = line[..pos].trim();
|
||||
let value = line[pos + 1..].trim();
|
||||
|
||||
match key {
|
||||
"title" => site_config.title = value.to_string(),
|
||||
"name" => site_config.name = value.to_string(),
|
||||
"description" => site_config.description = Some(value.to_string()),
|
||||
"url" => site_config.url = Some(value.to_string()),
|
||||
"favicon" => site_config.favicon = Some(value.to_string()),
|
||||
_ => {} // Ignore other keys
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Parse header.hjson
|
||||
let header_path = path.join("header.hjson");
|
||||
if header_path.exists() {
|
||||
let content = fs::read_to_string(&header_path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
// Create a basic header configuration
|
||||
let mut header_config = HeaderConfig {
|
||||
logo: None,
|
||||
title: None,
|
||||
menu: None,
|
||||
login: None,
|
||||
};
|
||||
|
||||
// Extract values from header.hjson
|
||||
let mut in_logo = false;
|
||||
let mut logo_src = None;
|
||||
let mut logo_alt = None;
|
||||
|
||||
for line in content.lines() {
|
||||
let line = line.trim();
|
||||
|
||||
// Skip comments and empty lines
|
||||
if line.starts_with('#') || line.is_empty() {
|
||||
continue;
|
||||
}
|
||||
|
||||
// Handle logo section
|
||||
if line == "logo:" {
|
||||
in_logo = true;
|
||||
continue;
|
||||
}
|
||||
|
||||
if in_logo {
|
||||
if line.starts_with("src:") {
|
||||
logo_src = Some(line[4..].trim().to_string());
|
||||
} else if line.starts_with("alt:") {
|
||||
logo_alt = Some(line[4..].trim().to_string());
|
||||
} else if !line.starts_with(' ') {
|
||||
in_logo = false;
|
||||
}
|
||||
}
|
||||
|
||||
// Parse other key-value pairs
|
||||
if let Some(pos) = line.find(':') {
|
||||
let key = line[..pos].trim();
|
||||
let value = line[pos + 1..].trim();
|
||||
|
||||
if key == "title" {
|
||||
header_config.title = Some(value.to_string());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Set logo if we have a source
|
||||
if let Some(src) = logo_src {
|
||||
header_config.logo = Some(LogoConfig { src, alt: logo_alt });
|
||||
}
|
||||
|
||||
site_config.header = Some(header_config);
|
||||
}
|
||||
|
||||
// Parse collection.hjson
|
||||
let collection_path = path.join("collection.hjson");
|
||||
if collection_path.exists() {
|
||||
let content =
|
||||
fs::read_to_string(&collection_path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
// Extract collections
|
||||
let mut collections = Vec::new();
|
||||
let mut current_collection: Option<CollectionConfig> = None;
|
||||
|
||||
for line in content.lines() {
|
||||
let line = line.trim();
|
||||
|
||||
// Skip comments and empty lines
|
||||
if line.starts_with('#') || line.is_empty() {
|
||||
continue;
|
||||
}
|
||||
|
||||
// Start of a new collection
|
||||
if line == "{" {
|
||||
current_collection = Some(CollectionConfig {
|
||||
name: None,
|
||||
url: None,
|
||||
description: None,
|
||||
scan: None,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
|
||||
// End of a collection
|
||||
if line == "}" && current_collection.is_some() {
|
||||
collections.push(current_collection.take().unwrap());
|
||||
continue;
|
||||
}
|
||||
|
||||
// Parse key-value pairs within a collection
|
||||
if let Some(pos) = line.find(':') {
|
||||
let key = line[..pos].trim();
|
||||
let value = line[pos + 1..].trim();
|
||||
|
||||
if let Some(ref mut collection) = current_collection {
|
||||
match key {
|
||||
"name" => collection.name = Some(value.to_string()),
|
||||
"url" => {
|
||||
// Convert web interface URL to Git URL
|
||||
let git_url = if value.contains("/src/branch/") {
|
||||
// This is a web interface URL, convert it to a Git URL
|
||||
let parts: Vec<&str> = value.split("/src/branch/").collect();
|
||||
if parts.len() == 2 {
|
||||
format!("{}.git", parts[0])
|
||||
} else {
|
||||
value.to_string()
|
||||
}
|
||||
} else {
|
||||
value.to_string()
|
||||
};
|
||||
collection.url = Some(git_url);
|
||||
}
|
||||
"description" => collection.description = Some(value.to_string()),
|
||||
"scan" => collection.scan = Some(value == "true"),
|
||||
_ => {} // Ignore other keys
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
site_config.collections = collections;
|
||||
}
|
||||
|
||||
// Parse pages directory
|
||||
let pages_path = path.join("pages");
|
||||
if pages_path.exists() && pages_path.is_dir() {
|
||||
for entry in fs::read_dir(pages_path)? {
|
||||
let entry = entry?;
|
||||
let entry_path = entry.path();
|
||||
|
||||
if entry_path.is_file() && entry_path.extension().map_or(false, |ext| ext == "hjson") {
|
||||
let content =
|
||||
fs::read_to_string(&entry_path).map_err(|e| WebBuilderError::IoError(e))?;
|
||||
|
||||
// Extract pages
|
||||
let mut pages = Vec::new();
|
||||
let mut current_page: Option<PageConfig> = None;
|
||||
|
||||
for line in content.lines() {
|
||||
let line = line.trim();
|
||||
|
||||
// Skip comments and empty lines
|
||||
if line.starts_with('#') || line.is_empty() {
|
||||
continue;
|
||||
}
|
||||
|
||||
// Start of a new page
|
||||
if line == "{" {
|
||||
current_page = Some(PageConfig {
|
||||
name: "".to_string(),
|
||||
title: "".to_string(),
|
||||
description: None,
|
||||
navpath: "".to_string(),
|
||||
collection: "".to_string(),
|
||||
draft: None,
|
||||
});
|
||||
continue;
|
||||
}
|
||||
|
||||
// End of a page
|
||||
if line == "}" && current_page.is_some() {
|
||||
pages.push(current_page.take().unwrap());
|
||||
continue;
|
||||
}
|
||||
|
||||
// Parse key-value pairs within a page
|
||||
if let Some(pos) = line.find(':') {
|
||||
let key = line[..pos].trim();
|
||||
let value = line[pos + 1..].trim();
|
||||
|
||||
if let Some(ref mut page) = current_page {
|
||||
match key {
|
||||
"name" => page.name = value.to_string(),
|
||||
"title" => page.title = value.to_string(),
|
||||
"description" => page.description = Some(value.to_string()),
|
||||
"navpath" => page.navpath = value.to_string(),
|
||||
"collection" => page.collection = value.to_string(),
|
||||
"draft" => page.draft = Some(value == "true"),
|
||||
_ => {} // Ignore other keys
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
site_config.pages.extend(pages);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(site_config)
|
||||
}
|
@ -1,209 +0,0 @@
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use crate::error::WebBuilderError;
|
||||
use crate::parser_simple::parse_site_config;
|
||||
use std::fs;
|
||||
use std::path::PathBuf;
|
||||
use tempfile::TempDir;
|
||||
|
||||
fn create_test_site(temp_dir: &TempDir) -> PathBuf {
|
||||
let site_dir = temp_dir.path().join("site");
|
||||
fs::create_dir(&site_dir).unwrap();
|
||||
|
||||
// Create main.hjson
|
||||
let main_hjson = r#"
|
||||
# Main configuration
|
||||
name: test
|
||||
title: Test Site
|
||||
description: A test site
|
||||
url: https://example.com
|
||||
favicon: favicon.ico
|
||||
"#;
|
||||
fs::write(site_dir.join("main.hjson"), main_hjson).unwrap();
|
||||
|
||||
// Create header.hjson
|
||||
let header_hjson = r#"
|
||||
# Header configuration
|
||||
title: Test Site
|
||||
logo:
|
||||
src: logo.png
|
||||
alt: Logo
|
||||
"#;
|
||||
fs::write(site_dir.join("header.hjson"), header_hjson).unwrap();
|
||||
|
||||
// Create collection.hjson
|
||||
let collection_hjson = r#"
|
||||
# Collections
|
||||
{
|
||||
name: test
|
||||
url: https://git.ourworld.tf/tfgrid/home.git
|
||||
description: A test collection
|
||||
scan: true
|
||||
}
|
||||
{
|
||||
name: test2
|
||||
url: https://git.example.com/src/branch/main/test2
|
||||
description: Another test collection
|
||||
}
|
||||
"#;
|
||||
fs::write(site_dir.join("collection.hjson"), collection_hjson).unwrap();
|
||||
|
||||
// Create pages directory
|
||||
let pages_dir = site_dir.join("pages");
|
||||
fs::create_dir(&pages_dir).unwrap();
|
||||
|
||||
// Create pages/pages.hjson
|
||||
let pages_hjson = r#"
|
||||
# Pages
|
||||
{
|
||||
name: home
|
||||
title: Home
|
||||
description: Home page
|
||||
navpath: /
|
||||
collection: test
|
||||
draft: false
|
||||
}
|
||||
{
|
||||
name: about
|
||||
title: About
|
||||
description: About page
|
||||
navpath: /about
|
||||
collection: test
|
||||
}
|
||||
"#;
|
||||
fs::write(pages_dir.join("pages.hjson"), pages_hjson).unwrap();
|
||||
|
||||
site_dir
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let site_dir = create_test_site(&temp_dir);
|
||||
|
||||
let config = parse_site_config(&site_dir).unwrap();
|
||||
|
||||
// Check basic site info
|
||||
assert_eq!(config.name, "test");
|
||||
assert_eq!(config.title, "Test Site");
|
||||
assert_eq!(config.description, Some("A test site".to_string()));
|
||||
assert_eq!(config.url, Some("https://example.com".to_string()));
|
||||
assert_eq!(config.favicon, Some("favicon.ico".to_string()));
|
||||
|
||||
// Check header
|
||||
assert!(config.header.is_some());
|
||||
let header = config.header.as_ref().unwrap();
|
||||
assert_eq!(header.title, Some("Test Site".to_string()));
|
||||
assert!(header.logo.is_some());
|
||||
let logo = header.logo.as_ref().unwrap();
|
||||
assert_eq!(logo.src, "logo.png");
|
||||
assert_eq!(logo.alt, Some("Logo".to_string()));
|
||||
|
||||
// Check collections
|
||||
assert_eq!(config.collections.len(), 2);
|
||||
|
||||
// First collection
|
||||
assert_eq!(config.collections[0].name, Some("test".to_string()));
|
||||
assert_eq!(
|
||||
config.collections[0].url,
|
||||
Some("https://git.ourworld.tf/tfgrid/home.git".to_string())
|
||||
);
|
||||
assert_eq!(
|
||||
config.collections[0].description,
|
||||
Some("A test collection".to_string())
|
||||
);
|
||||
assert_eq!(config.collections[0].scan, Some(true));
|
||||
|
||||
// Second collection (with URL conversion)
|
||||
assert_eq!(config.collections[1].name, Some("test2".to_string()));
|
||||
assert_eq!(
|
||||
config.collections[1].url,
|
||||
Some("https://git.example.com.git".to_string())
|
||||
);
|
||||
assert_eq!(
|
||||
config.collections[1].description,
|
||||
Some("Another test collection".to_string())
|
||||
);
|
||||
assert_eq!(config.collections[1].scan, None);
|
||||
|
||||
// Check pages
|
||||
assert_eq!(config.pages.len(), 2);
|
||||
|
||||
// First page
|
||||
assert_eq!(config.pages[0].name, "home");
|
||||
assert_eq!(config.pages[0].title, "Home");
|
||||
assert_eq!(config.pages[0].description, Some("Home page".to_string()));
|
||||
assert_eq!(config.pages[0].navpath, "/");
|
||||
assert_eq!(config.pages[0].collection, "test");
|
||||
assert_eq!(config.pages[0].draft, Some(false));
|
||||
|
||||
// Second page
|
||||
assert_eq!(config.pages[1].name, "about");
|
||||
assert_eq!(config.pages[1].title, "About");
|
||||
assert_eq!(config.pages[1].description, Some("About page".to_string()));
|
||||
assert_eq!(config.pages[1].navpath, "/about");
|
||||
assert_eq!(config.pages[1].collection, "test");
|
||||
assert_eq!(config.pages[1].draft, None);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_missing_directory() {
|
||||
let result = parse_site_config("/nonexistent/directory");
|
||||
assert!(matches!(result, Err(WebBuilderError::MissingDirectory(_))));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_not_a_directory() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let file_path = temp_dir.path().join("file.txt");
|
||||
fs::write(&file_path, "not a directory").unwrap();
|
||||
|
||||
let result = parse_site_config(&file_path);
|
||||
assert!(matches!(
|
||||
result,
|
||||
Err(WebBuilderError::InvalidConfiguration(_))
|
||||
));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_minimal() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let site_dir = temp_dir.path().join("site");
|
||||
fs::create_dir(&site_dir).unwrap();
|
||||
|
||||
// Create minimal main.hjson
|
||||
let main_hjson = "name: minimal\ntitle: Minimal Site";
|
||||
fs::write(site_dir.join("main.hjson"), main_hjson).unwrap();
|
||||
|
||||
let config = parse_site_config(&site_dir).unwrap();
|
||||
|
||||
assert_eq!(config.name, "minimal");
|
||||
assert_eq!(config.title, "Minimal Site");
|
||||
assert_eq!(config.description, None);
|
||||
assert_eq!(config.url, None);
|
||||
assert_eq!(config.favicon, None);
|
||||
assert!(config.header.is_none());
|
||||
assert!(config.footer.is_none());
|
||||
assert!(config.collections.is_empty());
|
||||
assert!(config.pages.is_empty());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_empty() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let site_dir = temp_dir.path().join("site");
|
||||
fs::create_dir(&site_dir).unwrap();
|
||||
|
||||
let config = parse_site_config(&site_dir).unwrap();
|
||||
|
||||
assert_eq!(config.name, "default");
|
||||
assert_eq!(config.title, "");
|
||||
assert_eq!(config.description, None);
|
||||
assert_eq!(config.url, None);
|
||||
assert_eq!(config.favicon, None);
|
||||
assert!(config.header.is_none());
|
||||
assert!(config.footer.is_none());
|
||||
assert!(config.collections.is_empty());
|
||||
assert!(config.pages.is_empty());
|
||||
}
|
||||
}
|
@ -1,7 +1,7 @@
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use crate::error::WebBuilderError;
|
||||
use crate::parser_hjson::parse_site_config;
|
||||
use crate::parser::{parse_site_config_with_strategy, ParsingStrategy};
|
||||
use std::fs;
|
||||
use std::path::PathBuf;
|
||||
use tempfile::TempDir;
|
||||
@ -47,29 +47,6 @@ mod tests {
|
||||
}"#;
|
||||
fs::write(site_dir.join("header.hjson"), header_hjson).unwrap();
|
||||
|
||||
// Create footer.hjson
|
||||
let footer_hjson = r#"{
|
||||
# Footer configuration
|
||||
"title": "Footer",
|
||||
"copyright": "© 2023 Test",
|
||||
"sections": [
|
||||
{
|
||||
"title": "Links",
|
||||
"links": [
|
||||
{
|
||||
"label": "Home",
|
||||
"href": "/"
|
||||
},
|
||||
{
|
||||
"label": "About",
|
||||
"href": "/about"
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
}"#;
|
||||
fs::write(site_dir.join("footer.hjson"), footer_hjson).unwrap();
|
||||
|
||||
// Create collection.hjson
|
||||
let collection_hjson = r#"[
|
||||
{
|
||||
@ -118,11 +95,11 @@ mod tests {
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config() {
|
||||
fn test_parse_site_config_hjson() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let site_dir = create_test_site(&temp_dir);
|
||||
|
||||
let config = parse_site_config(&site_dir).unwrap();
|
||||
let config = parse_site_config_with_strategy(&site_dir, ParsingStrategy::Hjson).unwrap();
|
||||
|
||||
// Check basic site info
|
||||
assert_eq!(config.name, "test");
|
||||
@ -147,24 +124,6 @@ mod tests {
|
||||
let logo = header.logo.as_ref().unwrap();
|
||||
assert_eq!(logo.src, "logo.png");
|
||||
assert_eq!(logo.alt, Some("Logo".to_string()));
|
||||
assert!(header.menu.is_some());
|
||||
let menu = header.menu.as_ref().unwrap();
|
||||
assert_eq!(menu.len(), 2);
|
||||
assert_eq!(menu[0].label, "Home");
|
||||
assert_eq!(menu[0].link, "/");
|
||||
|
||||
// Check footer
|
||||
assert!(config.footer.is_some());
|
||||
let footer = config.footer.as_ref().unwrap();
|
||||
assert_eq!(footer.title, Some("Footer".to_string()));
|
||||
assert_eq!(footer.copyright, Some("© 2023 Test".to_string()));
|
||||
assert!(footer.sections.is_some());
|
||||
let sections = footer.sections.as_ref().unwrap();
|
||||
assert_eq!(sections.len(), 1);
|
||||
assert_eq!(sections[0].title, "Links");
|
||||
assert_eq!(sections[0].links.len(), 2);
|
||||
assert_eq!(sections[0].links[0].label, "Home");
|
||||
assert_eq!(sections[0].links[0].href, "/");
|
||||
|
||||
// Check collections
|
||||
assert_eq!(config.collections.len(), 2);
|
||||
@ -213,9 +172,41 @@ mod tests {
|
||||
assert_eq!(config.pages[1].draft, None);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_auto() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let site_dir = create_test_site(&temp_dir);
|
||||
|
||||
let config = parse_site_config_with_strategy(&site_dir, ParsingStrategy::Auto).unwrap();
|
||||
|
||||
// Basic checks to ensure it worked
|
||||
assert_eq!(config.name, "test");
|
||||
assert_eq!(config.title, "Test Site");
|
||||
assert_eq!(config.collections.len(), 2);
|
||||
assert_eq!(config.pages.len(), 2);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_simple() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let site_dir = temp_dir.path().join("site");
|
||||
fs::create_dir(&site_dir).unwrap();
|
||||
|
||||
// Create main.hjson in a format that the simple parser can handle
|
||||
let main_hjson = "name: test\ntitle: Test Site\ndescription: A test site";
|
||||
fs::write(site_dir.join("main.hjson"), main_hjson).unwrap();
|
||||
|
||||
let config = parse_site_config_with_strategy(&site_dir, ParsingStrategy::Simple).unwrap();
|
||||
|
||||
// Basic checks to ensure it worked
|
||||
assert_eq!(config.name, "test");
|
||||
assert_eq!(config.title, "Test Site");
|
||||
assert_eq!(config.description, Some("A test site".to_string()));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_missing_directory() {
|
||||
let result = parse_site_config("/nonexistent/directory");
|
||||
let result = parse_site_config_with_strategy("/nonexistent/directory", ParsingStrategy::Hjson);
|
||||
assert!(matches!(result, Err(WebBuilderError::MissingDirectory(_))));
|
||||
}
|
||||
|
||||
@ -225,7 +216,7 @@ mod tests {
|
||||
let file_path = temp_dir.path().join("file.txt");
|
||||
fs::write(&file_path, "not a directory").unwrap();
|
||||
|
||||
let result = parse_site_config(&file_path);
|
||||
let result = parse_site_config_with_strategy(&file_path, ParsingStrategy::Hjson);
|
||||
assert!(matches!(
|
||||
result,
|
||||
Err(WebBuilderError::InvalidConfiguration(_))
|
||||
@ -242,7 +233,7 @@ mod tests {
|
||||
let main_hjson = r#"{ "name": "minimal", "title": "Minimal Site" }"#;
|
||||
fs::write(site_dir.join("main.hjson"), main_hjson).unwrap();
|
||||
|
||||
let config = parse_site_config(&site_dir).unwrap();
|
||||
let config = parse_site_config_with_strategy(&site_dir, ParsingStrategy::Hjson).unwrap();
|
||||
|
||||
assert_eq!(config.name, "minimal");
|
||||
assert_eq!(config.title, "Minimal Site");
|
||||
@ -261,7 +252,7 @@ mod tests {
|
||||
let site_dir = temp_dir.path().join("site");
|
||||
fs::create_dir(&site_dir).unwrap();
|
||||
|
||||
let config = parse_site_config(&site_dir).unwrap();
|
||||
let config = parse_site_config_with_strategy(&site_dir, ParsingStrategy::Hjson).unwrap();
|
||||
|
||||
assert_eq!(config.name, "default");
|
||||
assert_eq!(config.title, "");
|
||||
@ -273,18 +264,4 @@ mod tests {
|
||||
assert!(config.collections.is_empty());
|
||||
assert!(config.pages.is_empty());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_parse_site_config_invalid_hjson() {
|
||||
let temp_dir = TempDir::new().unwrap();
|
||||
let site_dir = temp_dir.path().join("site");
|
||||
fs::create_dir(&site_dir).unwrap();
|
||||
|
||||
// Create invalid main.hjson
|
||||
let main_hjson = r#"{ name: "test, title: "Test Site" }"#; // Missing closing quote
|
||||
fs::write(site_dir.join("main.hjson"), main_hjson).unwrap();
|
||||
|
||||
let result = parse_site_config(&site_dir);
|
||||
assert!(matches!(result, Err(WebBuilderError::HjsonError(_))));
|
||||
}
|
||||
}
|
Loading…
Reference in New Issue
Block a user