Creating a New Optimizer Class
Make a copy of `OptimizerTemplate.py` and rename it as `OptimizerName.py`, replacing `Name` with the name or an abbreviation of your optimizer. For example, if this template were being used to add support for the Nelder-Mead simplex optimizer, we might rename `OptimizerTemplate.py` to `OptimizerNMSimplex.py`. You can download the file here: `OptimizerTemplate.py`. It can also be found in the CyRxnOpt repository at `docs/source/developer/tutorials/OptimizerTemplate.py`.

Change the class name to your optimizer class name, for example, `OptimizerName`:

```python
class OptimizerName(OptimizerABC):
```
Add all the necessary libraries required for your optimizer to the `_packages` list in the template file:

```python
_packages = ["package1", "package2", ...]
```
Check that the `install()` and `check_install()` functions do not need to be modified:

- `install()` - This function installs an optimizer and its dependencies. When we run an optimizer for the first time, the install function creates a new virtual environment to contain the installation, then installs all the packages using pip. When we use an optimizer, it activates the relevant virtual environment.
- `check_install()` - This function checks whether the necessary packages are installed.

In the `get_config()` function, you need to update the configuration dictionary. This dictionary describes the configuration options for an optimization algorithm so that user-facing programs can dynamically adjust the offered configuration options for different optimization algorithms. Here you can add variables that are only used by your algorithm, and the front-end user interface should change dynamically to match. Add all the necessary configurations and variables required as user input to run your optimizer, following the same dictionary keys as the other optimizer classes. The following is an example `get_config` dictionary with extra options added:

```python
config = [
    {
        "name": "continuous_feature_names",
        "type": List[str],
        "value": [],
    },
    {
        "name": "continuous_feature_bounds",
        "type": List[List[float]],
        "value": [],
    },
    {
        "name": "continuous_feature_resolutions",
        "type": List[float],
        "value": [],
    },
    {
        "name": "categorical_feature_names",
        "type": List[str],
        "value": [],
    },
    {
        "name": "categorical_feature_values",
        "type": List[List[str]],
        "value": [],
    },
    {
        "name": "budget",
        "type": int,
        "value": 100,
    },
    {
        "name": "objectives",
        "type": List[str],
        "value": ["yield"],
    },
    {
        "name": "direction",
        "type": str,
        "value": "min",
        "range": ["min", "max"],
    },
]
```
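When consuming a configuration like this in `set_config()` or in a front end, it can be handy to flatten the list of entries into a name-to-value mapping. A minimal sketch, where the `config_to_dict` helper is illustrative and not part of CyRxnOpt:

```python
from typing import Any, Dict, List


def config_to_dict(config: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Flatten a get_config()-style list of entries into {name: value}."""
    return {entry["name"]: entry["value"] for entry in config}


defaults = config_to_dict([
    {"name": "budget", "type": int, "value": 100},
    {"name": "direction", "type": str, "value": "min", "range": ["min", "max"]},
])
# defaults == {"budget": 100, "direction": "min"}
```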
In the `set_config()` function, add the necessary code to handle configuration data and initialize the optimizer. This function configures an optimizer before training or prediction begins. Code to generate your reaction space, convert the configuration data into your algorithm's format, and generate initial files goes inside the `set_config()` function.

If your optimizer requires training steps, add the necessary code for training inside the `train()` function. For example, AMLRO requires training the initial ML model before starting the active learning prediction. Therefore, the AMLRO optimizer class's `train()` function includes code to generate the training dataset and perform each training step.

In the `predict()` function, your algorithm should be called to find the optimal reaction conditions. This function returns the suggested reaction conditions to run in the next cycle. The actual optimization loop/prediction step code should be implemented here.

In the `_import_deps()` function, write the necessary package import lines. Each package should be added to the `_imports` dictionary, using the package name as the dictionary key. As an example, `numpy` and `pandas` are imported here:

```python
def _import_deps(self) -> None:
    import numpy as np
    import pandas as pd

    self._imports = {"np": np, "pd": pd}
```
Then, when you want to use the imported library, you can access it through the `self._imports` dictionary:

```python
self._imports["np"].array()
self._imports["pd"].DataFrame()
```
Depending on your optimizer workflow, add more class methods as necessary. Refer to how existing optimizer classes are defined for guidance.
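To make the contract above concrete, here is a self-contained toy class that follows the same `set_config()` / `train()` / `predict()` / `_import_deps()` shape. It is a sketch only: it does not subclass the real `OptimizerABC`, it uses the standard library's `random` module as a stand-in for a heavy dependency, and its "algorithm" is plain uniform sampling within the configured bounds:

```python
from typing import Any, Dict, List


class OptimizerToy:
    """Toy sketch of the optimizer-class contract (not CyRxnOpt code)."""

    _packages: List[str] = []  # random is stdlib, so nothing to pip-install

    def __init__(self) -> None:
        self._imports: Dict[str, Any] = {}
        self.bounds: List[List[float]] = []
        self.best: float = float("-inf")

    def _import_deps(self) -> None:
        import random  # stand-in for a heavy optional dependency

        self._imports = {"random": random}

    def set_config(self, config: List[Dict[str, Any]]) -> None:
        # Pull the bounds entry out of a get_config()-style list.
        for entry in config:
            if entry["name"] == "continuous_feature_bounds":
                self.bounds = entry["value"]

    def train(self, dataset: List[Any]) -> None:
        # A model-based optimizer would fit a surrogate model here;
        # the toy just tracks the best yield seen so far.
        for _conditions, yield_value in dataset:
            self.best = max(self.best, yield_value)

    def predict(self) -> List[float]:
        # Suggest the next reaction conditions; here, uniform sampling
        # stands in for a real optimization step.
        if not self._imports:
            self._import_deps()
        rng = self._imports["random"]
        return [rng.uniform(low, high) for low, high in self.bounds]


opt = OptimizerToy()
opt.set_config([{"name": "continuous_feature_bounds",
                 "type": List[List[float]],
                 "value": [[0.0, 100.0], [25.0, 80.0]]}])
opt.train([([50.0, 30.0], 0.4)])
suggestion = opt.predict()
assert len(suggestion) == 2
```

The deferred-import pattern inside `predict()` mirrors why the template keeps imports in `_import_deps()`: heavy dependencies are only loaded once the relevant virtual environment is active.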
Adding a New Optimizer to CyRxnOpt
After implementing your optimizer class, update the `OptimizerController.py` file to use your optimizer.
At the top of this file, add the optimizer import line, replacing `OptimizerName` with the name of your optimizer class:

```python
from CyRxnOpt.OptimizerName import OptimizerName
```
Update the `get_optimizer()` function to include your optimizer:

```python
elif optimizer_name == "name":
    optimizer = OptimizerName(venv)
```
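In context, `get_optimizer()` is a simple name-to-class dispatch. This sketch shows the pattern with stub classes; the class names `OptimizerAMLRO` and `OptimizerNMSimplex` and the string keys are illustrative, not necessarily the names registered in the real controller:

```python
class OptimizerAMLRO:
    def __init__(self, venv):
        self.venv = venv


class OptimizerNMSimplex:
    def __init__(self, venv):
        self.venv = venv


def get_optimizer(optimizer_name, venv):
    # Extend this chain with one elif per new optimizer class.
    if optimizer_name == "amlro":
        optimizer = OptimizerAMLRO(venv)
    elif optimizer_name == "nmsimplex":
        optimizer = OptimizerNMSimplex(venv)
    else:
        raise ValueError(f"Unknown optimizer: {optimizer_name}")
    return optimizer


assert isinstance(get_optimizer("nmsimplex", venv=""), OptimizerNMSimplex)
```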
All function parameters should match the corresponding abstract function defined in `OptimizerABC`. If you want to add a new parameter to any function, first add it to `OptimizerController` and `OptimizerABC` with a default value of `None`. For example, if your algorithm's predict function requires a new parameter, learning rate:

```python
def predict(
    optimizer_name: str,
    prev_param: List[Any],
    yield_value: float,
    experiment_dir: str,
    config: Dict,
    venv: NestedVenv = "",
    obj_func=None,
    learning_rate=None,
) -> List[Any]:
```