Episode 7: Completing the Model & Wiring a CLI


Julien Truffaut

6 November 2025

Let’s start with a quick status check. So far, the project can:

  • import items from DofusDB,
  • save the raw JSON locally,
  • read and parse that JSON into a (so-far) partial model.

Now it’s time to complete the model so it covers all gear types and characteristics the game throws at us.

Gap analysis

Earlier I made the parser forgiving: unknown characteristics were silently dropped. That was handy for a proof of concept, but now I need to go strict to discover what’s missing.

Before (lenient):

fn parse_characteristics(
    effects: Vec<Effect>
) -> Vec<CharacteristicRange> {
    effects
        .into_iter()
        .filter_map(|e| parse_characteristic_range(e).ok())
        .collect()
}

After (strict):

fn parse_characteristics(
    effects: Vec<Effect>
) -> Result<Vec<CharacteristicRange>, String> {
    effects
        .into_iter()
        .map(parse_characteristic_range)
        .collect()
}

With this change, the parser fails as soon as it meets an unknown characteristic, which is exactly what I need to surface the gaps. The next step is to run it over the whole dataset and see what parses cleanly and what doesn’t.
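
Quick aside on why the strict version works: Result implements FromIterator, so collecting an iterator of Result<T, E> into a Result<Vec<T>, E> keeps all the values only if every item is Ok, and short-circuits on the first Err. A tiny self-contained example:

// Every item parses, so the whole collection is Ok.
let all_good: Result<Vec<i32>, String> = ["1", "2", "3"]
    .into_iter()
    .map(|s| s.parse::<i32>().map_err(|e| e.to_string()))
    .collect();
assert_eq!(all_good, Ok(vec![1, 2, 3]));

// One bad item turns the whole collection into an Err.
let one_bad: Result<Vec<i32>, String> = ["1", "oops", "3"]
    .into_iter()
    .map(|s| s.parse::<i32>().map_err(|e| e.to_string()))
    .collect();
assert!(one_bad.is_err());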

A CLI so I stop commenting out code in main

Right now, main.rs imports and saves data from DofusDB, and I’m about to give it a second job: exporting parsed data. Instead of toggling code by hand (been there…), a small CLI switch is perfect.

Let’s add clap as a dependency:

[dependencies]
clap = { version = "4", features = ["derive"] }

Define the import and export flags:

use clap::Parser;

#[derive(Parser, Debug)]
#[command(author, version, about, long_about = None)]
struct Args {
    /// Import data from DofusDB and save locally
    #[arg(short = 'i', long = "import")]
    import: bool,

    /// Parse local DofusDB JSON files into Rust models and export them
    #[arg(short = 'e', long = "export")]
    export: bool,
}

And finally, wire it up in main.rs:

use anyhow::Result;

#[tokio::main]
async fn main() -> Result<()> {
    let args = Args::parse();

    if args.import {
        println!("Importing data from DofusDB...");
        // fetch & save
    }

    if args.export {
        println!("Exporting DofusDB data to our model...");
        // read, parse, write
    }

    Ok(())
}

Run examples:

> cargo run -- --import
Importing data from DofusDB...
> cargo run -- -i
Importing data from DofusDB...
> cargo run -- -e
Exporting DofusDB data to our model...
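
A nice bonus: the doc comments on the Args fields double as help text, so clap generates --help (and, thanks to the version attribute, --version) for free. The output looks roughly like this; the binary name is a placeholder and the exact layout depends on the clap version:

> cargo run -- --help
Usage: <binary> [OPTIONS]

Options:
  -i, --import   Import data from DofusDB and save locally
  -e, --export   Parse local DofusDB JSON files into Rust models and export them
  -h, --help     Print help
  -V, --version  Print version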

Parsing all the things

Parsing all gears is straightforward:

fn parse_gears(
    dofus_db_objects: Vec<DofusDbObject>
) -> Result<Vec<Gear>, String> {
    dofus_db_objects
        .into_iter()
        .map(parse_gear)
        .collect()
}

But this version returns a Result<Vec<Gear>, String>, which means it either parses every gear successfully or stops at the first error. Ideally, I’d like to know how many gears parsed correctly and what went wrong for the ones that failed. Here is a version that keeps the successes, logs each failure, and prints a summary:

fn parse_gears(
    gear_type: &GearType,
    dofus_db_objects: Vec<DofusDbObject>,
) -> Vec<Gear> {
    let number_of_objects = dofus_db_objects.len();
    let mut gears = Vec::new();

    for object in dofus_db_objects {
        let object_name = object.name.en.clone();
        match parse_gear(object) {
            Ok(gear) => gears.push(gear),
            Err(e) => eprintln!("❌ Failed to parse {object_name}: {e}"),
        }
    }

    println!(
        "✅ Successfully parsed {}/{} {gear_type}",
        gears.len(),
        number_of_objects
    );

    gears
}

Do you know a more idiomatic way to do this in Rust? Please let me know if you do!
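
One alternative I have come across is partition_result from the itertools crate. It splits an iterator of Results into the Ok values and the Err values in a single pass, so the manual loop disappears. This is only a sketch (and an extra dependency), not what the project currently uses:

use itertools::Itertools;

// Sketch: collect successes and failures in one pass, then report the failures.
let (gears, errors): (Vec<Gear>, Vec<String>) = dofus_db_objects
    .into_iter()
    .map(|object| {
        let name = object.name.en.clone();
        parse_gear(object).map_err(|e| format!("{name}: {e}"))
    })
    .partition_result();

for error in &errors {
    eprintln!("❌ Failed to parse {error}");
}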

Exporting model data

Armed with parse_gears, I filled in the remaining CharacteristicType mappings (40 of them!) so the model covers the game properly. Then I exported the model to a dedicated folder:

const IMPORT_PATH: &str = "dofus_db/data";
const EXPORT_PATH: &str = "core/data";

fn export_gears(
    gear_type: &GearType
) -> Result<()> {
    let raw: Vec<DofusDbObject> = read_gears(IMPORT_PATH, gear_type)?;
    let gears: Vec<Gear> = parse_gears(gear_type, raw);
    write_gears(EXPORT_PATH, gear_type, &gears)?;
    Ok(())
}
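
I won’t dwell on read_gears and write_gears, but to give an idea, a minimal write_gears could look like this. It is a sketch, not necessarily the real implementation: it assumes Gear derives Serialize and exposes an id field matching the file names below.

use std::fs;
use std::path::Path;

use anyhow::Result;

// Sketch: one pretty-printed JSON file per gear, grouped in a folder per gear type.
fn write_gears(export_path: &str, gear_type: &GearType, gears: &[Gear]) -> Result<()> {
    let dir = Path::new(export_path).join(gear_type.to_string());
    fs::create_dir_all(&dir)?;

    for gear in gears {
        let json = serde_json::to_string_pretty(gear)?;
        fs::write(dir.join(format!("{}.json", gear.id)), json)?;
    }

    Ok(())
}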

Resulting layout:

core/
└── data/
    ├── Amulet/
    │   ├── aerdala_amulet.json
    │   ├── helsephine_love.json
    │   └── ...
    ├── Belt/
    │   ├── minotoror.json
    │   ├── ogivol.json
    │   └── ...

Sample file:

{
  "id"       : "aerdala_amulet",
  "name"     : "Aerdala Amulet",
  "gear_type": "Amulet",
  "level"    : 62,
  "characteristics": [
    { "kind": "Vitality"    , "min": 16 , "max": 20 },
    { "kind": "Agility"     , "min": 21 , "max": 30 },
    { "kind": "AbilityPoint", "min":  1 , "max":  1 },
    { "kind": "AirDamage"   , "min":  2 , "max":  3 }
  ]
}
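
Each kind above is one of those CharacteristicType variants. Filling in the mappings mentioned earlier boils down to a long match, which I’d sketch like this, assuming DofusDB identifies characteristics by numeric ids (the helper name and the ids below are made up for illustration):

// Sketch: the ids below are placeholders, not the real DofusDB values.
fn characteristic_type_from_effect_id(id: u32) -> Result<CharacteristicType, String> {
    match id {
        111 => Ok(CharacteristicType::Vitality),
        112 => Ok(CharacteristicType::Agility),
        113 => Ok(CharacteristicType::AbilityPoint),
        114 => Ok(CharacteristicType::AirDamage),
        // ... roughly 40 variants in total
        other => Err(format!("unknown characteristic id: {other}")),
    }
}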

Nice. I’ve now got the full gear dataset in my own model format — which means I can finally start focusing on the fun part: the build optimizer.

As usual, all the code for this episode is available here.