taipy.config.Config
Configuration singleton.
add_migration_function(target_version, config, migration_fct, **properties)
staticmethod
¶
Add a migration function for a Configuration to migrate entities to the target version.
Parameters:

Name | Type | Description | Default
---|---|---|---
target_version | str | The production version that entities are migrated to. | required
config | Union[Section, str] | The configuration or the configuration id. | required
migration_fct | Callable | Migration function that takes an entity as input and returns a new entity that is compatible with the target production version. | required
**properties | Dict[str, Any] | A keyworded variable length list of additional arguments. | {}
backup(filename)
classmethod
¶
Backup a configuration.
The backup is done in a toml file.
The backed up configuration is a compilation from the three possible methods to configure the application: the Python code configuration, the file configuration and the environment configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
filename | Union[str, Path] | The path of the file to export. | required
block_update()
classmethod
¶
Block update on the configuration singleton.
check()
classmethod
¶
Check configuration.
This method logs issue messages and returns an issue collector.
Returns:

Type | Description
---|---
IssueCollector | Collector containing the info, warning and error issues.
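A typical validation step before running an application might look like this (a minimal sketch; the data node name is illustrative):

```python
from taipy import Config

# Configure at least one element, then validate the whole configuration.
Config.configure_data_node(id="sales_history", storage_type="csv")

# check() logs issue messages and returns an IssueCollector.
collector = Config.check()
print(f"{len(collector.errors)} error(s), {len(collector.warnings)} warning(s)")
```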
configure_authentication(protocol, secret_key=None, auth_session_duration=3600, **properties)
staticmethod
¶
Configure authentication.
Parameters:

Name | Type | Description | Default
---|---|---|---
protocol | str | The name of the protocol to configure ("ldap", "taipy" or "none"). | required
secret_key | str | A secret string used to internally encrypt the credentials' information. If no value is provided, the first run-time authentication sets the default value to a random text string. | None
auth_session_duration | int | How long, in seconds, credentials remain valid after their creation. The default value is 3600, corresponding to an hour. | 3600
**properties | Dict[str, Any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
AuthenticationConfig | The authentication configuration.
configure_core(root_folder=None, storage_folder=None, taipy_storage_folder=None, repository_type=None, repository_properties=None, read_entity_retry=None, mode=None, version_number=None, force=None, **properties)
staticmethod
¶
Configure the Core service.
Parameters:

Name | Type | Description | Default
---|---|---|---
root_folder | Optional[str] | Path of the base folder for the taipy application. The default value is "./taipy/". | None
storage_folder | str | Folder name used to store user data. The default value is "user_data/". It is used in conjunction with the root_folder attribute: the storage path is root_folder + storage_folder. | None
taipy_storage_folder | str | Folder name used to store Taipy data. The default value is ".taipy/". It is used in conjunction with the root_folder attribute: the storage path is root_folder + taipy_storage_folder. | None
repository_type | Optional[str] | The type of the repository to be used to store Taipy data. The default value is "filesystem". | None
repository_properties | Optional[Dict[str, Union[str, int]]] | A dictionary of additional properties to be used by the repository. | None
read_entity_retry | Optional[int] | Number of retries to read an entity from the repository before returning a failure. The default value is 3. | None
mode | Optional[str] | Indicates the mode of the version management system. Possible values are "development", "experiment", or "production". | None
version_number | Optional[str] | The string identifier of the version. In development mode, the version number is ignored. | None
force | Optional[bool] | If True, Taipy will override a version even if the configuration has changed, and run the application. | None
**properties | Dict[str, Any] | A keyworded variable length list of additional arguments that configure the behavior of the Core service. | {}
configure_csv_data_node(id, default_path=None, encoding=None, has_header=None, exposed_type=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new CSV data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new CSV data node configuration. | required
default_path | Optional[str] | The default path of the CSV file. | None
encoding | Optional[str] | The encoding of the CSV file. | None
has_header | Optional[bool] | If True, indicates that the CSV file has a header. | None
exposed_type | Optional[str] | The exposed type of the data read from the CSV file. | None
scope | Optional[Scope] | The scope of the CSV data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new CSV data node configuration.
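As an illustration, a CSV data node shared across all scenarios could be configured as follows (the identifier and file path are hypothetical):

```python
from taipy import Config, Scope

# A globally scoped CSV data node, exposed as a pandas DataFrame.
sales_cfg = Config.configure_csv_data_node(
    id="sales_data",
    default_path="data/sales.csv",
    has_header=True,
    exposed_type="pandas",
    scope=Scope.GLOBAL,
)
```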
configure_data_node(id, storage_type=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new data node configuration. | required
storage_type | Optional[str] | The data node configuration storage type. The possible values are "pickle" (the default value), "csv", "excel", "sql", "mongo_collection", "in_memory", "json", "parquet", "generic", or "s3_object". The default value can be overloaded by the storage_type value set in the default data node configuration (see set_default_data_node_configuration()). | None
scope | Optional[Scope] | The scope of the data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new data node configuration.
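This generic form is equivalent to the storage-specific methods: storage_type selects the concrete data node kind, and extra keyword arguments become properties. A small sketch with illustrative identifiers:

```python
from taipy import Config

# With no storage_type, the node defaults to "pickle".
model_cfg = Config.configure_data_node(id="trained_model")

# Extra keyword arguments (here default_path) are stored as properties.
raw_cfg = Config.configure_data_node(
    id="raw_data",
    storage_type="csv",
    default_path="data/raw.csv",
)
```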
configure_data_node_from(source_configuration, id, **properties)
staticmethod
¶
Configure a new data node configuration from an existing one.
Parameters:

Name | Type | Description | Default
---|---|---|---
source_configuration | DataNodeConfig | The source data node configuration. | required
id | str | The unique identifier of the new data node configuration. | required
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new data node configuration.
configure_excel_data_node(id, default_path=None, has_header=None, sheet_name=None, exposed_type=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new Excel data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new Excel data node configuration. | required
default_path | Optional[str] | The path of the Excel file. | None
has_header | Optional[bool] | If True, indicates that the Excel file has a header. | None
sheet_name | Optional[Union[List[str], str]] | The list of sheet names to be used. This can be a unique name. | None
exposed_type | Optional[str] | The exposed type of the data read from the Excel file. | None
scope | Optional[Scope] | The scope of the Excel data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new Excel data node configuration.
configure_generic_data_node(id, read_fct=None, write_fct=None, read_fct_args=None, write_fct_args=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new generic data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new generic data node configuration. | required
read_fct | Optional[Callable] | The Python function called to read the data. | None
write_fct | Optional[Callable] | The Python function called to write the data. The provided function must have at least one parameter that receives the data to be written. | None
read_fct_args | Optional[List] | The list of arguments that are passed to the function read_fct to read data. | None
write_fct_args | Optional[List] | The list of arguments that are passed to the function write_fct to write the data. | None
scope | Optional[Scope] | The scope of the generic data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}
configure_global_app(**properties)
classmethod
¶
Configure the global application.
Parameters:

Name | Type | Description | Default
---|---|---|---
**properties | Dict[str, Any] | A dictionary of additional properties. | {}
configure_gui(**properties)
staticmethod
¶
NOT DOCUMENTED Configure the Graphical User Interface.
Parameters:

Name | Type | Description | Default
---|---|---|---
**properties | dict[str, any] | Keyword arguments that configure the behavior of the GUI service. | {}
configure_in_memory_data_node(id, default_data=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new in-memory data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new in_memory data node configuration. | required
default_data | Optional[any] | The default data of the data nodes instantiated from this in_memory data node configuration. | None
scope | Optional[Scope] | The scope of the in_memory data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new in-memory data node configuration.
configure_job_executions(mode=None, max_nb_of_workers=None, **properties)
staticmethod
¶
Configure job execution.
Parameters:

Name | Type | Description | Default
---|---|---|---
mode | Optional[str] | The job execution mode. Possible values are: "standalone" (the default value) or "development". | None
max_nb_of_workers | Optional[int, str] | Parameter used only in default "standalone" mode. This indicates the maximum number of jobs able to run in parallel. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
JobConfig | The new job execution configuration.
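For example, to allow up to four jobs to run in parallel (a minimal sketch):

```python
from taipy import Config

# Standalone mode with at most four concurrent worker jobs.
Config.configure_job_executions(mode="standalone", max_nb_of_workers=4)
```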
configure_json_data_node(id, default_path=None, encoding=None, encoder=None, decoder=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new JSON data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new JSON data node configuration. | required
default_path | Optional[str] | The default path of the JSON file. | None
encoding | Optional[str] | The encoding of the JSON file. | None
encoder | Optional[JSONEncoder] | The JSON encoder used to write data into the JSON file. | None
decoder | Optional[JSONDecoder] | The JSON decoder used to read data from the JSON file. | None
scope | Optional[Scope] | The scope of the JSON data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}
configure_mongo_collection_data_node(id, db_name, collection_name, custom_document=None, db_username=None, db_password=None, db_host=None, db_port=None, db_driver=None, db_extra_args=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new Mongo collection data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new Mongo collection data node configuration. | required
db_name | str | The database name. | required
collection_name | str | The collection in the database to read from and to write the data to. | required
custom_document | Optional[any] | The custom document class to store, encode, and decode data when reading and writing to a Mongo collection. The custom_document can have an optional decode() method to decode data in the Mongo collection to a custom object, and an optional encode() method to encode the object's properties to the Mongo collection when writing. | None
db_username | Optional[str] | The database username. | None
db_password | Optional[str] | The database password. | None
db_host | Optional[str] | The database host. | None
db_port | Optional[int] | The database port. | None
db_driver | Optional[str] | The database driver. | None
db_extra_args | Optional[dict[str, any]] | A dictionary of additional arguments to be passed into the database connection string. | None
scope | Optional[Scope] | The scope of the Mongo collection data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new Mongo collection data node configuration.
configure_parquet_data_node(id, default_path=None, engine=None, compression=None, read_kwargs=None, write_kwargs=None, exposed_type=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new Parquet data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new Parquet data node configuration. | required
default_path | Optional[str] | The default path of the Parquet file. | None
engine | Optional[str] | Parquet library to use. Possible values are "fastparquet" or "pyarrow". | None
compression | Optional[str] | Name of the compression to use. Possible values are "snappy", "gzip", "brotli", or "none" (no compression). The default value is "snappy". | None
read_kwargs | Optional[dict] | Additional parameters passed to the pandas read_parquet function. | None
write_kwargs | Optional[dict] | Additional parameters passed to the pandas DataFrame.to_parquet function. | None
exposed_type | Optional[str] | The exposed type of the data read from the Parquet file. | None
scope | Optional[Scope] | The scope of the Parquet data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new Parquet data node configuration.
configure_pickle_data_node(id, default_path=None, default_data=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new pickle data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new pickle data node configuration. | required
default_path | Optional[str] | The path of the pickle file. | None
default_data | Optional[any] | The default data of the data nodes instantiated from this pickle data node configuration. | None
scope | Optional[Scope] | The scope of the pickle data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new pickle data node configuration.
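For instance, a pickle node pre-filled with default data and a one-day validity period could look like this (the identifier and values are illustrative):

```python
from datetime import timedelta

from taipy import Config

# Data nodes built from this config start with the default dictionary below
# and are considered stale one day after their last edit.
params_cfg = Config.configure_pickle_data_node(
    id="model_params",
    default_data={"learning_rate": 0.01, "epochs": 10},
    validity_period=timedelta(days=1),
)
```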
configure_s3_object_data_node(id, aws_access_key, aws_secret_access_key, aws_s3_bucket_name, aws_s3_object_key, aws_region=None, aws_s3_object_parameters=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new S3 object data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new S3 Object data node configuration. | required
aws_access_key | str | Amazon Web Services ID used to identify the account. | required
aws_secret_access_key | str | Amazon Web Services access key to authenticate programmatic requests. | required
aws_s3_bucket_name | str | The bucket in S3 to read from and to write the data to. | required
aws_s3_object_key | str | The key of the S3 object to read from and to write the data to. | required
aws_region | Optional[str] | Self-contained geographic area where Amazon Web Services (AWS) infrastructure is located. | None
aws_s3_object_parameters | Optional[dict[str, any]] | A dictionary of additional arguments to be passed into the AWS S3 bucket access string. | None
scope | Optional[Scope] | The scope of the S3 Object data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new S3 object data node configuration.
configure_scenario(id, task_configs=None, additional_data_node_configs=None, frequency=None, comparators=None, sequences=None, **properties)
staticmethod
¶
Configure a new scenario configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new scenario configuration. | required
task_configs | Optional[List[TaskConfig]] | The list of task configurations used by this scenario configuration. The default value is None. | None
additional_data_node_configs | Optional[List[DataNodeConfig]] | The list of additional data nodes related to this scenario configuration. The default value is None. | None
frequency | Optional[Frequency] | The scenario frequency. | None
comparators | Optional[Dict[str, Union[List[Callable], Callable]]] | The list of functions used to compare scenarios. A comparator function is attached to a scenario's data node configuration. The key of the dictionary parameter corresponds to the data node configuration id. During the scenarios' comparison, each comparator is applied to all the data nodes instantiated from the data node configuration attached to the comparator. | None
sequences | Optional[Dict[str, List[TaskConfig]]] | Dictionary of sequence descriptions. The default value is None. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
ScenarioConfig | The new scenario configuration.
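Putting the pieces together, a scenario is typically configured from data node and task configurations (a minimal sketch; the identifiers and the business function are illustrative):

```python
from taipy import Config, Frequency

def predict(history):
    # Placeholder business function: naive 5% growth forecast.
    return [x * 1.05 for x in history]

history_cfg = Config.configure_data_node(id="history")
forecast_cfg = Config.configure_data_node(id="forecast")
predict_cfg = Config.configure_task("predict", predict, history_cfg, forecast_cfg)

# Scenarios built from this config attach to monthly cycles.
scenario_cfg = Config.configure_scenario(
    id="monthly_forecast",
    task_configs=[predict_cfg],
    frequency=Frequency.MONTHLY,
)
```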
configure_sql_data_node(id, db_name, db_engine, read_query, write_query_builder, append_query_builder=None, db_username=None, db_password=None, db_host=None, db_port=None, db_driver=None, sqlite_folder_path=None, sqlite_file_extension=None, db_extra_args=None, exposed_type=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new SQL data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new SQL data node configuration. | required
db_name | str | The database name, or the name of the SQLite database file. | required
db_engine | str | The database engine. Possible values are "sqlite", "mssql", "mysql", or "postgresql". | required
read_query | str | The SQL query string used to read the data from the database. | required
write_query_builder | Callable | A callback function that takes the data as an input parameter and returns a list of SQL queries to be executed when writing data to the data node. | required
append_query_builder | Optional[Callable] | A callback function that takes the data as an input parameter and returns a list of SQL queries to be executed when appending data to the data node. | None
db_username | Optional[str] | The database username. Required by the "mssql", "mysql", and "postgresql" engines. | None
db_password | Optional[str] | The database password. Required by the "mssql", "mysql", and "postgresql" engines. | None
db_host | Optional[str] | The database host. | None
db_port | Optional[int] | The database port. | None
db_driver | Optional[str] | The database driver. | None
sqlite_folder_path | Optional[str] | The path to the folder that contains the SQLite file. | None
sqlite_file_extension | Optional[str] | The file extension of the SQLite file. | None
db_extra_args | Optional[dict[str, any]] | A dictionary of additional arguments to be passed into the database connection string. | None
exposed_type | Optional[str] | The exposed type of the data read from the SQL query. | None
scope | Optional[Scope] | The scope of the SQL data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}
configure_sql_table_data_node(id, db_name, db_engine, table_name, db_username=None, db_password=None, db_host=None, db_port=None, db_driver=None, sqlite_folder_path=None, sqlite_file_extension=None, db_extra_args=None, exposed_type=None, scope=None, validity_period=None, **properties)
staticmethod
¶
Configure a new SQL table data node configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of the new SQL table data node configuration. | required
db_name | str | The database name, or the name of the SQLite database file. | required
db_engine | str | The database engine. Possible values are "sqlite", "mssql", "mysql", or "postgresql". | required
table_name | str | The name of the SQL table. | required
db_username | Optional[str] | The database username. Required by the "mssql", "mysql", and "postgresql" engines. | None
db_password | Optional[str] | The database password. Required by the "mssql", "mysql", and "postgresql" engines. | None
db_host | Optional[str] | The database host. | None
db_port | Optional[int] | The database port. | None
db_driver | Optional[str] | The database driver. | None
sqlite_folder_path | Optional[str] | The path to the folder that contains the SQLite file. | None
sqlite_file_extension | Optional[str] | The file extension of the SQLite file. | None
db_extra_args | Optional[dict[str, any]] | A dictionary of additional arguments to be passed into the database connection string. | None
exposed_type | Optional[str] | The exposed type of the data read from the SQL table. | None
scope | Optional[Scope] | The scope of the SQL table data node configuration. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The new SQL table data node configuration.
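A PostgreSQL-backed table could be configured as follows (all connection values below are hypothetical placeholders; replace them with your own):

```python
from taipy import Config

# Reads from and writes to the "orders" table of a PostgreSQL database.
orders_cfg = Config.configure_sql_table_data_node(
    id="orders",
    db_name="erp",
    db_engine="postgresql",
    table_name="orders",
    db_username="app",
    db_password="secret",
    db_host="localhost",
    db_port=5432,
    exposed_type="pandas",
)
```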
configure_task(id, function, input=None, output=None, skippable=False, **properties)
staticmethod
¶
Configure a new task configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
id | str | The unique identifier of this task configuration. | required
function | Callable | The Python function called by Taipy to run the task. | required
input | Optional[Union[DataNodeConfig, List[DataNodeConfig]]] | The list of the function input data node configurations. This can be a unique data node configuration if there is a single input data node, or None if there are none. | None
output | Optional[Union[DataNodeConfig, List[DataNodeConfig]]] | The list of the function output data node configurations. This can be a unique data node configuration if there is a single output data node, or None if there are none. | None
skippable | bool | If True, indicates that the task can be skipped if no change has been made on inputs. | False
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
TaskConfig | The new task configuration.
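For instance, a skippable task wiring one input node to one output node might be configured like this (the identifiers and the function are illustrative):

```python
from taipy import Config

def clean(raw):
    # Placeholder cleaning function: drop missing rows.
    return [row for row in raw if row is not None]

raw_cfg = Config.configure_data_node(id="raw")
clean_cfg = Config.configure_data_node(id="cleaned")

# skippable=True: the task is not re-run if its input did not change.
task_cfg = Config.configure_task(
    id="clean_data",
    function=clean,
    input=raw_cfg,
    output=clean_cfg,
    skippable=True,
)
```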
configure_telemetry(enabled=None, service_name=None, otel_endpoint=None, **properties)
staticmethod
¶
Configure the Telemetry service.
Create a telemetry service section in the Taipy Config, holding attributes to configure the telemetry. When enabled, the Taipy application will connect to the Open Telemetry endpoint specified (or "localhost" if not) to send metrics.
Parameters:

Name | Type | Description | Default
---|---|---|---
enabled | Optional[bool] | Enable telemetry. If True, the telemetry is activated. The default value is False. | None
service_name | Optional[str] | The service name. | None
otel_endpoint | Optional[str] | The Open Telemetry endpoint to send metrics to. If not provided, "localhost" is used. | None
**properties | Dict[str, Any] | A dictionary of additional properties. | {}

Returns:

Type | Description
---|---
The telemetry section.
core()
¶
data_nodes()
¶
export(filename)
classmethod
¶
Export a configuration.
The export is done in a toml file.
The exported configuration is taken from the Python code configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
filename | Union[str, Path] | The path of the file to export. | required
global_config()
¶
Return configuration values related to the global application as a GlobalAppConfig.
job_config()
¶
load(filename)
classmethod
¶
Load a configuration file.
The current Python configuration is replaced and the Config compilation is triggered.
Parameters:

Name | Type | Description | Default
---|---|---|---
filename | Union[str, Path] | The path of the toml configuration file to load. | required
migration_functions()
¶
override(filename)
classmethod
¶
Load a configuration from a file and override the current config.
Parameters:

Name | Type | Description | Default
---|---|---|---
filename | Union[str, Path] | The path of the toml configuration file to load. | required
restore(filename)
classmethod
¶
Restore a configuration file and replace the current applied configuration.
Parameters:

Name | Type | Description | Default
---|---|---|---
filename | Union[str, Path] | The path of the toml configuration file to load. | required
scenarios()
¶
sections()
¶
Return all non-unique sections.
set_default_data_node_configuration(storage_type, scope=None, validity_period=None, **properties)
staticmethod
¶
Set the default values for data node configurations.
This function creates the default data node configuration object, where all data node configuration objects will find their default values when needed.
Parameters:

Name | Type | Description | Default
---|---|---|---
storage_type | str | The default storage type for all data node configurations. The possible values are "pickle" (the default value), "csv", "excel", "sql", "mongo_collection", "in_memory", "json", "parquet", "generic", or "s3_object". | required
scope | Optional[Scope] | The default scope for all data node configurations. | None
validity_period | Optional[timedelta] | The duration since the last edit date for which the data node can be considered up-to-date. Once the validity period has passed, the data node is considered stale and relevant tasks will run even if they are skippable (see the Task configs page for more details). If validity_period is set to None, the data node is always up-to-date. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
DataNodeConfig | The default data node configuration.
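Defaults set here are inherited by every data node configuration created afterwards (a minimal sketch; the values are illustrative):

```python
from datetime import timedelta

from taipy import Config, Scope

# All subsequent data node configs default to parquet storage,
# scenario scope, and a six-hour validity period.
Config.set_default_data_node_configuration(
    storage_type="parquet",
    scope=Scope.SCENARIO,
    validity_period=timedelta(hours=6),
)

# This config inherits the defaults above unless they are overridden.
features_cfg = Config.configure_data_node(id="features")
```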
set_default_scenario_configuration(task_configs=None, additional_data_node_configs=None, frequency=None, comparators=None, sequences=None, **properties)
staticmethod
¶
Set the default values for scenario configurations.
This function creates the default scenario configuration object, where all scenario configuration objects will find their default values when needed.
Parameters:

Name | Type | Description | Default
---|---|---|---
task_configs | Optional[List[TaskConfig]] | The list of task configurations used by this scenario configuration. | None
additional_data_node_configs | Optional[List[DataNodeConfig]] | The list of additional data nodes related to this scenario configuration. | None
frequency | Optional[Frequency] | The scenario frequency. It corresponds to the recurrence of the scenarios instantiated from this configuration. Based on this frequency each scenario will be attached to the relevant cycle. | None
comparators | Optional[Dict[str, Union[List[Callable], Callable]]] | The list of functions used to compare scenarios. A comparator function is attached to a scenario's data node configuration. The key of the dictionary parameter corresponds to the data node configuration id. During the scenarios' comparison, each comparator is applied to all the data nodes instantiated from the data node configuration attached to the comparator. | None
sequences | Optional[Dict[str, List[TaskConfig]]] | Dictionary of sequences. The default value is None. | None
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}

Returns:

Type | Description
---|---
ScenarioConfig | The new default scenario configuration.
set_default_task_configuration(function, input=None, output=None, skippable=False, **properties)
staticmethod
¶
Set the default values for task configurations.
This function creates the default task configuration object, where all task configuration objects will find their default values when needed.
Parameters:

Name | Type | Description | Default
---|---|---|---
function | Callable | The Python function called by Taipy to run the task. | required
input | Optional[Union[DataNodeConfig, List[DataNodeConfig]]] | The list of the input data node configurations. This can be a unique data node configuration if there is a single input data node, or None if there are none. | None
output | Optional[Union[DataNodeConfig, List[DataNodeConfig]]] | The list of the output data node configurations. This can be a unique data node configuration if there is a single output data node, or None if there are none. | None
skippable | bool | If True, indicates that the task can be skipped if no change has been made on inputs. | False
**properties | dict[str, any] | A keyworded variable length list of additional arguments. | {}
|
tasks()
¶
unblock_update()
classmethod
¶
Unblock update on the configuration singleton.
unique_sections()
¶
Return all unique sections.