Bug fix: Link operator names to the MLGraphBuilder methods #547
Conversation
Personally I'd prefer linking, and it should be a default generic thing, not a convention specific to this spec.
Thanks, I support an update to the conventions & style guide doc.
Force-pushed from 490e326 to cd31480
Updated with a note in the conventions doc.
LGTM with questions, thanks much!
Force-pushed from cd31480 to 2d52aa7
Force-pushed from 5e61d26 to bc45246
I noticed a few more potential places this style could be applied. Should I roll these in too, or leave them alone?
@@ -923,12 +923,12 @@ interface MLActivation {
</div>
<div class="note">
-These activations function types are used to create other operations. One such use of this interface is for when an activation function is fused into another operation such as [[#api-mlgraphbuilder-conv2d]] or [[#api-mlgraphbuilder-batchnorm]] during a graph construction session. Such fused activation functions can provide a significant performance improvement when supported natively by the underlying implementation. This is intended as an optimization opportunity for implementers.
+These activations function types are used to create other operations. One such use of this interface is for when an activation function is fused into another operation such as {{MLGraphBuilder/conv2d()}} or {{MLGraphBuilder/batchNormalization()}} during a graph construction session. Such fused activation functions can provide a significant performance improvement when supported natively by the underlying implementation. This is intended as an optimization opportunity for implementers.
</div>
### Creating {{MLActivation}} ### {#api-mlactivation-create}
<div class="note">
-The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a [[#api-mlgraphbuilder-sigmoid-method]] or [[#api-mlgraphbuilder-relu-method]] can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of [[#api-mlgraphbuilder-conv2d]] for example.
+The {{MLActivation}} objects (including the ones passed as input to methods) are created by the methods of {{MLGraphBuilder}} and are identified by their name. The |options| dictionary is defined by those methods. The actual creation of the activation function e.g. a {{MLGraphBuilder/sigmoid()}} or {{MLGraphBuilder/relu()}} can then be deferred until when the rest of the graph is ready to connect with it such as during the construction of {{MLGraphBuilder/conv2d()}} for example.
</div>
<details open algorithm>
@@ -3674,7 +3674,7 @@ Create a named {{MLOperand}} based on a descriptor, that can be used as an input
</details>
### instanceNormalization ### {#api-mlgraphbuilder-instancenorm}
-Normalize the input using [[Instance-Normalization]]. Unlike [[#api-mlgraphbuilder-batchnorm]] where the mean and variance values used in the normalization are computed across all the samples in the batch dimension while the model is trained, the mean and variance values used in the instance normalization are computed on the fly for each input feature of each individual sample in the batch.
+Normalize the input using [[Instance-Normalization]]. Unlike {{MLGraphBuilder/batchNormalization()}} where the mean and variance values used in the normalization are computed across all the samples in the batch dimension while the model is trained, the mean and variance values used in the instance normalization are computed on the fly for each input feature of each individual sample in the batch.
<script type=idl>
dictionary MLInstanceNormalizationOptions {
@@ -3776,7 +3776,7 @@ partial interface MLGraphBuilder {
</div>
### layerNormalization ### {#api-mlgraphbuilder-layernorm}
-Normalize the input using [[Layer-Normalization]]. Unlike [[#api-mlgraphbuilder-batchnorm]] where the mean and variance values are computed across all the samples in the batch dimension while the model is trained, and in [[#api-mlgraphbuilder-instancenorm]] where the mean and variance values are computed on the fly for each input feature of each individual sample in the batch, the means and variance values of the layer normalization are computed on the fly across all the input features of each individual sample in the batch.
+Normalize the input using [[Layer-Normalization]]. Unlike {{MLGraphBuilder/batchNormalization()}} where the mean and variance values are computed across all the samples in the batch dimension while the model is trained, and in {{MLGraphBuilder/instanceNormalization()}} where the mean and variance values are computed on the fly for each input feature of each individual sample in the batch, the means and variance values of the layer normalization are computed on the fly across all the input features of each individual sample in the batch.
<script type=idl>
dictionary MLLayerNormalizationOptions {
Roll 'em.
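To make the cross-referenced builder methods in the diff above concrete, here is a minimal, non-normative JavaScript sketch of how a fused activation and the normalization builders might be used together. It assumes an async context; the descriptor and option member names (`dataType`, `dimensions`, `activation`) and the example shapes are illustrative assumptions based on this era of the spec, not part of this PR.

```js
// Non-normative sketch: illustrates the API surface described in the quoted
// spec text. Member names and shapes are assumptions and may differ in
// later revisions of the WebNN API.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

// A named input operand and a constant filter (arbitrary example shapes,
// NCHW input and OIHW filter layouts assumed by default).
const input = builder.input('input',
    {dataType: 'float32', dimensions: [1, 3, 224, 224]});
const filter = builder.constant(
    {dataType: 'float32', dimensions: [16, 3, 3, 3]},
    new Float32Array(16 * 3 * 3 * 3));

// relu() with no input returns an MLActivation, which conv2d() can fuse
// via its options, as the MLActivation note in the diff describes.
const conv = builder.conv2d(input, filter, {activation: builder.relu()});

// instanceNormalization() computes mean and variance on the fly per input
// feature of each sample; layerNormalization() would compute them across
// all input features of each sample, as contrasted in the quoted text.
const output = builder.instanceNormalization(conv);

const graph = await builder.build({output});
```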
For a "reshape" reference a malformed IDL link was present. The other references are linked as appropriately to the builder references rather than just being styled text or links to document sections.
Force-pushed from bc45246 to 4b7abcb
For a "reshape" reference a malformed IDL link was present as
*reshape*}}
. The other references are linked as appropriately to the builder references rather than just being styled text.Documented this convention in SpecCodingConventions.md.