Conversation

@bosd bosd commented Jul 9, 2025

No description provided.

bosd and others added 24 commits July 9, 2025 11:40
The main use case of this library is import/export, so it makes sense to have an
import-compatible export by default.
The script was previously checking for the database name from the
command-line arguments before it had parsed the config file. This caused
an incorrect "database not found" error when the database was correctly
specified in the config file but not as a `-d` argument.

This commit fixes the bug by reordering the logic in the `main` function
to ensure the config file is read *before* any operations that depend on
its values are executed. The script now correctly uses the database name
from the config file as a fallback if the `-d` flag is not provided.
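
A minimal sketch of the reordering, assuming hypothetical option names and a `[Connection]` section in the config file (the script's actual names may differ):

```python
import argparse
import configparser


def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("-d", "--database", default=None)
    parser.add_argument("--config", default="conf/connection.conf")
    args = parser.parse_args()

    # Read the config file *first*, so its values are available as fallbacks.
    config = configparser.ConfigParser()
    config.read(args.config)

    # Only now resolve the database name: the CLI flag wins, the config file is
    # the fallback. The section/option names here are illustrative.
    database = args.database or config.get("Connection", "database", fallback=None)
    if not database:
        raise SystemExit("No database specified via -d or the config file.")
```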
Adds support for separate source and destination database configurations
to the scaffolding tool, providing a more flexible workflow for data
migrations between different Odoo instances.

Previously, the scaffold assumed a single `connection.conf` for all
operations, making it difficult to read metadata from one database
(e.g., Odoo 12) and generate an import script for another (e.g., Odoo
18).

This commit introduces two new command-line arguments:

- `--source-config`: Specifies the connection file for the source database.
- `--destination-config`: Specifies the connection file for the destination database.

The script now uses these arguments to:

- Generate a `files.py` that references both config files.
- Ensure that operations requiring database inspection (like `--export-fields`)
  use the source configuration.

The generated transformation scripts will be updated in a subsequent commit to
use the destination config for the final import step.

The old `--config` flag is retained as a fallback for single-database
workflows, ensuring backward compatibility.
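
As a rough illustration of how the new flags can coexist with the legacy `--config` fallback (the parser structure and default path below are assumptions, not the tool's exact code):

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Scaffold migration scripts")
    # Legacy single-database flag, kept for backward compatibility.
    parser.add_argument("--config", default="conf/connection.conf")
    # New migration-aware flags.
    parser.add_argument("--source-config", help="connection file for the source database")
    parser.add_argument("--destination-config", help="connection file for the destination database")
    return parser


def resolve_configs(args: argparse.Namespace) -> tuple[str, str]:
    # Fall back to the single --config file when a dedicated flag is not given.
    source = args.source_config or args.config
    destination = args.destination_config or args.config
    return source, destination
```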
Document the new `--source-config` and `--destination-config` flags. For
consistency, use the `.conf` extension.
transformation scripts

This commit continues the refactoring to make the scaffolding tool
"migration-aware".

The generated Python transformation scripts (`your_model.py`) are now
aware of separate source and destination databases.

- The `Processor` is now initialized with the `source_config_file` to ensure it
  reads metadata from the correct source database when necessary.
- The `params` dictionary passed to `processor.process()` now includes a
  `'config'` key pointing to the `destination_config_file`.

This ensures that the final `load.sh` script generated by
`processor.write_to_file()` will correctly target the destination
database for the import. This change depends on a corresponding update
to the `odoo-data-flow` library to correctly handle the new `'config'`
parameter.
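
For illustration, the generated script might take roughly this shape. Only the `Processor` / `process()` / `write_to_file()` flow and the `'config'` params key come from this commit; the import paths, the `config_file=` keyword, and the `mapper.val` helper are assumptions borrowed from the odoo_csv_tools-style API:

```python
# Hypothetical shape of a generated transformation script (your_model.py).
from odoo_data_flow.lib import mapper               # assumed module path
from odoo_data_flow.lib.transform import Processor  # assumed module path

source_config = "conf/source.conf"            # metadata is read from the source DB
destination_config = "conf/destination.conf"  # the generated load.sh targets this DB

# Assumed keyword; the commit only states the Processor receives the source config.
processor = Processor("origin/res_partner.csv", config_file=source_config)

mapping = {
    "id": mapper.val("id"),
    "name": mapper.val("name"),
}

params = {
    "model": "res.partner",
    # New: point the generated load step at the destination database.
    "config": destination_config,
}

processor.process(mapping, "data/res_partner.csv", params)
processor.write_to_file("load.sh")
```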
The `--create-export-script` command was previously using a hardcoded
path (`conf/connection.conf`) instead of respecting the new
migration-aware configuration flags. This caused the generated export
script to point to the wrong database in a migration scenario.

This commit fixes the bug by:

1. Updating the `create_export_script_file` function to accept the source
   config file path as an argument.
2. Passing the correct source config path from the `main` function.
3. Using this dynamic path to generate the correct `--config` flag in the
   export script.

The generated export script now correctly uses the `--source-config`
file, allowing it to pull data from the intended source database.
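
A hedged sketch of the fix. The function name and the `--config`/`--output` flags appear in these commits; the subcommand and the `--model`/`--fields` options shown below are illustrative only:

```python
def create_export_script_file(model: str, fields: list[str], source_config: str) -> None:
    """Generate an export.sh that pulls data from the *source* database."""
    output_csv = f"data/{model.replace('.', '_')}.csv"
    command = (
        "odoo-data-flow export "
        f"--config {source_config} "    # previously hardcoded to conf/connection.conf
        f"--model {model} "
        f"--fields {','.join(fields)} "
        f"--output {output_csv}"        # --file is deprecated for export commands
    )
    with open("export.sh", "w") as handle:
        handle.write("#!/bin/bash\n")
        handle.write(command + "\n")
```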
Performance can be increased by not exporting all of the resized image fields.
Excluding them also makes the resulting file size a lot smaller.
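
For example, a simple way to drop Odoo's standard resized image variants (keeping the original `image_1920`) from an export field list; the helper name is hypothetical:

```python
def strip_resized_images(field_names: list[str]) -> list[str]:
    """Drop Odoo's resized image copies, keeping the original image_1920."""
    resized = {"image_1024", "image_512", "image_256", "image_128"}
    return [name for name in field_names if name not in resized]


# strip_resized_images(["name", "image_1920", "image_128"]) -> ["name", "image_1920"]
```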
--file is deprecated for export commands. Use `--output` now.
performance

The scaffold tool now automatically generates a Polars schema dictionary
(`dtypes`) in the model transformation scripts it creates.

This schema is built by inspecting the field types in the source Odoo
database. By providing this schema to the `Processor`, the CSV reading
step can skip the slow type-inference process, resulting in a
significant performance improvement for large files.

Additionally, the generated `transform.sh` script now includes `export
POLARS_MAX_THREADS` to ensure that Polars can leverage all available CPU
cores for transformations, further boosting performance.
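
A sketch of the kind of schema a generated script could carry and how it is used; the Odoo-type-to-Polars-dtype mapping below is illustrative, and recent Polars releases accept it via `schema_overrides` (older releases used the `dtypes` argument):

```python
import polars as pl

# Example of a generated schema; in the real scripts the dtypes are derived
# from the source database's field types.
dtypes = {
    "id": pl.Utf8,               # external ID
    "name": pl.Utf8,             # char
    "partner_id/id": pl.Utf8,    # many2one exported as an XML-ID
    "active": pl.Boolean,        # boolean
    "credit_limit": pl.Float64,  # float
}

# With an explicit schema, reading the CSV skips Polars' type-inference pass.
df = pl.read_csv("origin/res_partner.csv", schema_overrides=dtypes)
```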
Make sure the fields list contains only clean field names.
The script would crash with an `odoo.exceptions.AccessError` when run by
a user without administrative privileges. This was because it directly
queried protected administrative models (`ir.model.fields`, `ir.model`,
`ir.model.data`) to gather metadata for scaffolding.

This commit introduces several fallback mechanisms to make the process
more resilient and usable in environments with restricted permissions.
The script now gracefully degrades functionality when access is denied,
rather than crashing.

The changes include:

1. **`load_fields()`**: Now attempts to fetch field data via
`ir.model.fields` and, upon failure, falls back to a new
`load_fields_from_model` function. This fallback uses the `fields_get()`
method on the target model itself, which typically has less restrictive
access.

2. **`model_exists()`**: No longer queries `ir.model`. It now confirms a
model's existence and accessibility by attempting to retrieve the model
proxy and perform a lightweight `fields_get()` call, all within a
`try...except` block.

3. **`ModelField.get_info()`**: The lookup for related XML-IDs, which
queries `ir.model.data`, is now wrapped in a `try...except` block. This
makes the helpful feature an optional enhancement that is gracefully
skipped if permissions are insufficient.
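
A hedged sketch of the fallback pattern, assuming an odoo-client-lib style connection object with `get_model()`; the three function names come from this commit, but their exact signatures in the script may differ:

```python
def load_fields(connection, model_name: str) -> dict:
    """Prefer ir.model.fields, but degrade gracefully for non-admin users."""
    try:
        ir_fields = connection.get_model("ir.model.fields")
        records = ir_fields.search_read(
            [("model", "=", model_name)], ["name", "ttype", "required", "relation"]
        )
        return {rec["name"]: rec for rec in records}
    except Exception:  # e.g. an AccessError surfaced over RPC
        return load_fields_from_model(connection, model_name)


def load_fields_from_model(connection, model_name: str) -> dict:
    """Fallback: fields_get() on the model itself is usually less restricted."""
    return connection.get_model(model_name).fields_get()


def model_exists(connection, model_name: str) -> bool:
    """Confirm existence/accessibility without querying ir.model."""
    try:
        connection.get_model(model_name).fields_get(["id"])
        return True
    except Exception:
        return False
```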