| name | parse-tbl |
| description | Parses a plain-text .tbl (table) file into a structured JSON schema, using '$Name:' as the delimiter for each object. |
| version | 1.0.0 |
TBL Metadata Parser - Data Structure Extraction
Instructions for MigrationArchitect Agent
When tasked with analyzing a .tbl file (e.g., ships.tbl), you MUST use the scripts/parse_tbl.py skill.
The skill will return a structured JSON schema (e.g., ships_schema.json). This JSON is the required input for the generate-resource skill.
When to Use
Use this skill whenever you encounter a .tbl file during migration planning. This is a Plan phase skill that extracts structured data from legacy plain-text database files.
What This Skill Does
This skill deconstructs the legacy plain-text database files into a structured, engine-agnostic JSON schema, forming the first step of the data-driven pipeline.
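For orientation, a hypothetical .tbl excerpt of the kind this skill consumes (reconstructed from the example output shown below; real files contain many more fields and the exact layout may vary):

```
$Name: GTF Apollo
$Mass: 20000.0
$Max Velocity: 70.0
$Shields: 150.0
$Armor: 100.0
Weapons: Subach HL-7, Tempest

$Name: GTF Artemis
$Mass: 18500.0
$Max Velocity: 75.0
$Shields: 125.0
$Armor: 85.0
```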
Extracts These Data Patterns
- Object Delimiters: Uses `$Name:` as the delimiter to identify the start of a new object
- Key-Value Pairs: Parses all subsequent lines as key-value pairs (splitting on `:`)
- Data Type Inference: Performs intelligent data type "guessing" (see the sketch after this list) to convert text values to:
  - float (for numeric values with decimals)
  - int (for integer values)
  - array (for comma-separated values)
  - string (for text values)
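A minimal sketch of the type-guessing step, assuming a helper named `infer_value` (the name and the exact precedence of checks are illustrative assumptions, not the skill's actual implementation):

```python
def infer_value(raw: str):
    """Guess a Python type for a raw .tbl value string."""
    text = raw.strip()
    # Comma-separated values become a list, with each element inferred in turn.
    if "," in text:
        return [infer_value(part) for part in text.split(",")]
    # Quoted text stays a string, with the quotes stripped.
    if len(text) >= 2 and text[0] == text[-1] and text[0] in ("'", '"'):
        return text[1:-1]
    # Plain integers.
    if text.lstrip("+-").isdigit():
        return int(text)
    # Values with decimals (and anything else float() accepts).
    try:
        return float(text)
    except ValueError:
        return text  # Fall back to the original string.
```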
Skips Non-Data Content
- Comments and blank lines
- Section headers that don't contain key-value data
- Malformed entries (in non-strict mode)
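A sketch of this filtering, assuming `;` marks a comment and that a data line must contain a `:` separator (both are assumptions about the .tbl dialect rather than guarantees from the skill):

```python
def is_data_line(line: str, strict: bool = False) -> bool:
    """Return True if the line should be parsed as a key-value pair."""
    text = line.strip()
    if not text:                      # blank line
        return False
    if text.startswith(";"):          # comment (assumed ';' marker)
        return False
    if ":" not in text:               # section header without key-value data
        return False
    key, _, value = text.partition(":")
    if not key.strip() or not value.strip():  # malformed entry
        if strict:
            raise ValueError(f"Malformed entry: {line!r}")
        return False
    return True
```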
Expected Output Format
The skill returns structured JSON data like this:
```json
{
  "schema_type": "ship_stats",
  "entries": {
    "GTF Apollo": {
      "$Name": "GTF Apollo",
      "$Mass": 20000.0,
      "$Max Velocity": 70.0,
      "$Shields": 150.0,
      "$Armor": 100.0,
      "Weapons": [
        "Subach HL-7",
        "Tempest"
      ]
    },
    "GTF Artemis": {
      "$Name": "GTF Artemis",
      "$Mass": 18500.0,
      "$Max Velocity": 75.0,
      "$Shields": 125.0,
      "$Armor": 85.0
    }
  }
}
```
Usage Instructions
Basic Usage
```
python scripts/parse_tbl.py input/ships.tbl > output/ships_schema.json
```
The skill handles the following steps (condensed into a runnable sketch after this list):
- Reading the .tbl file in text mode (`'r'`)
- Using `$Name:` as the delimiter to identify the start of a new object
- Parsing all subsequent lines as key-value pairs (splitting on `:`)
- Performing intelligent data type "guessing" to convert text values to appropriate types
- Building a nested dictionary structure (e.g., `{"GTF Apollo": {"$Mass": 20000.0, ...}}`)
- Returning the data as JSON via `print(json.dumps(schema_dict))`
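Putting those steps together, here is a minimal sketch of the overall flow. It assumes the hypothetical `infer_value` and `is_data_line` helpers sketched earlier are in scope, and it hard-codes `"ship_stats"` as the schema type to match the ships example; the shipped `scripts/parse_tbl.py` may differ in structure and naming.

```python
import json
import sys

def parse_tbl(path: str) -> dict:
    """Parse a .tbl file into a {"schema_type": ..., "entries": {...}} dict (sketch only)."""
    entries = {}
    current = None
    with open(path, "r") as handle:        # plain text mode ('r')
        for line in handle:
            if not is_data_line(line):     # skip comments, blanks, headers, malformed lines
                continue
            key, _, raw_value = line.strip().partition(":")
            key, value = key.strip(), infer_value(raw_value)
            if key == "$Name":             # '$Name:' marks the start of a new object
                current = {key: value}
                entries[value] = current
            elif current is not None:
                current[key] = value       # subsequent lines become key-value pairs
    return {"schema_type": "ship_stats", "entries": entries}

if __name__ == "__main__":
    schema_dict = parse_tbl(sys.argv[1])
    print(json.dumps(schema_dict))
```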
Data Type Mapping
- Text with decimals → float
- Text with only digits → int
- Comma-separated values → array
- Text in quotes → string
- Other text → string
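Using the hypothetical `infer_value` helper sketched earlier, these mappings would play out roughly as follows (illustrative expectations, not output captured from the actual skill):

```python
infer_value("20000.0")               # -> 20000.0 (float)
infer_value("150")                   # -> 150 (int)
infer_value("Subach HL-7, Tempest")  # -> ["Subach HL-7", "Tempest"] (array)
infer_value('"GTF Apollo"')          # -> "GTF Apollo" (quoted text, quotes stripped)
infer_value("GTF Apollo")            # -> "GTF Apollo" (other text stays a string)
```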
Next Steps in Pipeline
After using this skill:
- Save the JSON output to a file
- Pass the JSON file path to GDScriptEngineer
- GDScriptEngineer will use generate-resource skill with this data
- The generate-resource skill will create Custom Resource scripts with @export properties
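A small sketch of how the saved schema might be read back before hand-off to the generate-resource skill; the file path comes from the usage example above, and the access pattern follows the example output (how GDScriptEngineer actually consumes it is not specified here):

```python
import json

# Load the schema produced by parse_tbl.py (path taken from the usage example above).
with open("output/ships_schema.json", "r") as handle:
    schema = json.load(handle)

# Each entry is one candidate Custom Resource on the generate-resource side.
for name, fields in schema["entries"].items():
    print(name, fields.get("$Mass"), fields.get("$Max Velocity"))
```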
Critical: Do NOT Use For
- Parsing binary files (use parse-pof instead)
- Converting 3D geometry (not relevant for table data)
- Extracting metadata from model files (handled separately)
This skill focuses solely on extracting structured data from plain-text table files, providing the clean data foundation for the resource generation pipeline.