The learning framework class, models, is the primary learning framework for optimising a NoC switch configuration for a given traffic trace and QoR. The models class can accommodate different learning methods. Currently, it supports three different optimization flows: 1. MLE, 2. CMA-ES and 3. RbfOpt, with the ability to add more over time. The user should create an object of this learning models class and choose the method of optimisation. Once the object is created, the user simply needs to invoke the <b>[step](https://git.uwaterloo.ca/watcag-public/hoplite-ml/-/blob/master/evolveNoC/models.py#L14-15)</b> function repeatedly until a stopping condition is met. This condition could be an upper limit on learning time or on the number of optimization steps, a specific QoR goal, or simply the optimization algorithm itself reporting that no further optimization is possible. Below is a breakdown of some core functions of the models class (a usage sketch follows the list):
> * <b>init_model</b>: According to the optimization flow selected, this function initializes that flow: setting up the optimization space and constraining the learning to the user-defined flags and QoRs. You will see other functions following the naming convention "init_XYZ", where XYZ refers to an optimization flow; each "init_XYZ" performs that flow's initialization, with the primary call being made by "init_model". Currently, as mentioned before, the repository supports the "mle", "cma" and "rbf" optimisation flows. A user extending this work with another flow would be expected to add an initialization function for that flow in the same spirit.
> * <b>step</b>: The step function performs one step of NoC switch configuration discovery. Although the exact mechanics of this step vary according to the optimization flow, in general it consists of generating a set of varied NoC configurations by mutating a base NoC state and evaluating each of them on the user-defined QoR using the QoR analysis tool class, pyQoR. Once all generated configurations have been evaluated, a top-performing percentage of them is chosen to generate a new base NoC state, effectively capturing the qualities of the best configurations in the next base state. Finally, step returns a boolean flag indicating whether the optimization is complete; each optimization flow determines its stopping criterion differently. You will also notice functions of the naming convention "step_XYZ" that are called by step according to the optimisation flow chosen, where "XYZ" can currently be "mle", "cma" or "rbf" (a schematic sketch of this dispatch pattern follows the list). A user adding another optimisation flow to extend this work would be expected to wrap that flow's single optimisation/NoC-discovery step (one epoch or evolution) with this step function in a similar fashion.
> * <b>compare_QoR_against_vanilla</b>: This function allows the user to compare the current best NoC discovered by the optimization flow against vanilla HopliteBP and HopliteBuf NoCs. You can call it at any time during learning, or once the best NoC configuration has been discovered and learning is complete, to compare its QoR against the vanilla variants.
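As a rough illustration of the workflow above, the sketch below creates a models object, drives the step loop, and finishes with a vanilla comparison. Only models, init_model, step and compare_QoR_against_vanilla are taken from the description above; the import path and the constructor arguments (flow, trace, qor) are assumptions, so consult evolveNoC/models.py for the real signatures.

```python
# Hypothetical usage sketch -- constructor arguments and import path are
# assumptions; only the listed method names come from the documentation.
from evolveNoC.models import models

# Create the learning object for one traffic trace, QoR target and
# optimisation flow ("mle", "cma" or "rbf").
learner = models(flow="cma", trace="traces/example_trace.txt", qor="sustained_rate")
learner.init_model()

MAX_STEPS = 200                      # user-chosen budget on optimisation steps
for _ in range(MAX_STEPS):
    done = learner.step()            # one NoC-discovery step; True when the flow says stop
    if done:
        break

# Compare the best NoC found so far against vanilla HopliteBP / HopliteBuf.
learner.compare_QoR_against_vanilla()
```

The stopping condition here is a step budget, but a wall-clock limit or a target QoR check inside the loop would work equally well.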
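The "init_XYZ"/"step_XYZ" naming convention described above amounts to a per-flow dispatch. The following is a schematic sketch of that pattern, not the repository's actual code: the method bodies and the hypothetical "bayes" flow are placeholders showing where a new flow would plug in.

```python
# Schematic sketch of the init_XYZ / step_XYZ dispatch pattern (not the
# repository's actual implementation).
class models:
    def __init__(self, flow, trace, qor):
        self.flow = flow             # "mle", "cma", "rbf", or a newly added flow
        self.trace = trace
        self.qor = qor

    def init_model(self):
        # Dispatch to the flow-specific initialiser (init_mle / init_cma / ...).
        getattr(self, "init_" + self.flow)()

    def step(self):
        # Dispatch one optimisation step; each step_XYZ returns True once its
        # stopping criterion is met.
        return getattr(self, "step_" + self.flow)()

    # Adding a new flow means adding a matching pair of methods:
    def init_bayes(self):
        pass                         # set up the search space for the hypothetical flow

    def step_bayes(self):
        return True                  # mutate, evaluate with pyQoR, select; True when done
```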