
Bug fix: Link operator names to the MLGraphBuilder methods #547

Merged
merged 4 commits into webmachinelearning:main from bugfix-op-links on Feb 15, 2024

Conversation

inexorabletash
Member

@inexorabletash inexorabletash commented Jan 31, 2024

For one "reshape" reference, a malformed IDL link was present as *reshape*}}. The other references are now linked appropriately to the builder methods rather than just being styled text.

Documented this convention in SpecCodingConventions.md.
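The convention can be sketched roughly as follows (a hypothetical Bikeshed fragment for illustration, not text taken from the spec):

```html
<!-- Avoid: emphasis-only styling, or a link to the section heading -->
Unlike *batchNormalization*, ...
Unlike [[#api-mlgraphbuilder-batchnorm]], ...

<!-- Prefer: an IDL autolink to the MLGraphBuilder method -->
Unlike {{MLGraphBuilder/batchNormalization()}}, ...
```

With the `{{...}}` form, Bikeshed resolves the reference against the IDL definitions and fails the build when the target does not exist, which is what catches malformed references like the `*reshape*}}` fixed in this PR.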


Preview | Diff

@zolkis
Collaborator

zolkis commented Feb 1, 2024

Personally I'd prefer linking, and it should be the default, generic convention rather than one specific to this spec.
The emphasis style looks like a leftover from the early days of the spec, when draft text was changing fast.

Member

@anssiko anssiko left a comment


Thanks, I support an update to the conventions & style guide doc.

@inexorabletash
Member Author

Updated with a note in the conventions doc.

Contributor

@huningxin huningxin left a comment


LGTM with questions, thanks much!

index.bs: 4 review comments (outdated, resolved)
@inexorabletash inexorabletash force-pushed the bugfix-op-links branch 2 times, most recently from 5e61d26 to bc45246 on February 6, 2024 18:07
@inexorabletash
Member Author

I noticed a few more potential places this style could be applied. Should I roll these in too, or leave them alone?

@@ -923,12 +923,12 @@ interface MLActivation {
 </div>
 
 <div class="note">
-These activations function types are used to create other operations. One such use of this interface is for when an activation function is fused into another operation such as [[#api-mlgraphbuilder-conv2d]] or [[#api-mlgraphbuilder-batchnorm]] during a graph construction session. Such fused activation functions can provide a significant performance improvement when supported natively by the underlying implementation. This is intended as an optimization opportunity for implementers.
+These activations function types are used to create other operations. One such use of this interface is for when an activation function is fused into another operation such as {{MLGraphBuilder/conv2d()}} or {{MLGraphBuilder/batchNormalization()}} during a graph construction session. Such fused activation functions can provide a significant performance improvement when supported natively by the underlying implementation. This is intended as an optimization opportunity for implementers.
 </div>
 
 ### Creating {{MLActivation}} ### {#api-mlactivation-create}
 <div class="note">
-The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid-method]] or [[#api-mlgraphbuilder-relu-method]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example.
+The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a {{MLGraphBuilder/sigmoid()}} or {{MLGraphBuilder/relu()}} can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of {{MLGraphBuilder/conv2d()}} for example.
 </div>
 
 <details open algorithm>
@@ -3674,7 +3674,7 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input
 </details>
 
 ### instanceNormalization ### {#api-mlgraphbuilder-instancenorm}
-Normalize the input using [[Instance-Normalization]]. Unlike [[#api-mlgraphbuilder-batchnorm]] where the mean and variance values used in the normalization are computed across all the samples in the batch dimension while the model is trained, the mean and variance values used in the instance normalization are computed on the fly for each input feature of each individual sample in the batch.
+Normalize the input using [[Instance-Normalization]]. Unlike {{MLGraphBuilder/batchNormalization()}} where the mean and variance values used in the normalization are computed across all the samples in the batch dimension while the model is trained, the mean and variance values used in the instance normalization are computed on the fly for each input feature of each individual sample in the batch.
 
 <script type=idl>
 dictionary MLInstanceNormalizationOptions {
@@ -3776,7 +3776,7 @@ partial interface MLGraphBuilder {
 </div>
 
 ### layerNormalization ### {#api-mlgraphbuilder-layernorm}
-Normalize the input using [[Layer-Normalization]]. Unlike [[#api-mlgraphbuilder-batchnorm]] where the mean and variance values are computed across all the samples in the batch dimension while the model is trained, and in [[#api-mlgraphbuilder-instancenorm]] where the mean and variance values are computed on the fly for each input feature of each individual sample in the batch, the means and variance values of the layer normalization are computed on the fly across all the input features of each individual sample in the batch.
+Normalize the input using [[Layer-Normalization]]. Unlike {{MLGraphBuilder/batchNormalization()}} where the mean and variance values are computed across all the samples in the batch dimension while the model is trained, and in {{MLGraphBuilder/instanceNormalization()}} where the mean and variance values are computed on the fly for each input feature of each individual sample in the batch, the means and variance values of the layer normalization are computed on the fly across all the input features of each individual sample in the batch.
 
 <script type=idl>
 dictionary MLLayerNormalizationOptions {

@zolkis
Collaborator

zolkis commented Feb 7, 2024

Roll'em.

For a "reshape" reference, a malformed IDL link was present. The other
references are linked appropriately to the builder references
rather than just being styled text or links to document sections.
@anssiko anssiko merged commit cfab815 into webmachinelearning:main Feb 15, 2024
2 checks passed
github-actions bot added a commit that referenced this pull request Feb 15, 2024
SHA: cfab815
Reason: push, by anssiko

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
@inexorabletash inexorabletash deleted the bugfix-op-links branch February 15, 2024 15:55