Bug fix: Link operator names to the MLGraphBuilder methods
For a "reshape" reference, a malformed IDL link was present. The
other references are now linked to the builder methods as appropriate,
rather than just being styled text.
inexorabletash committed Feb 6, 2024
1 parent c9d2dd7 commit bc45246
Showing 2 changed files with 7 additions and 5 deletions.
2 changes: 2 additions & 0 deletions SpecCodingConventions.md
@@ -70,6 +70,8 @@ Example:
1. If |shape| is a [=circle=], draw it at |shape|'s [=circle/origin=].
```

* When referencing an operator in text (e.g. sigmoid, tanh, etc), link the operator name to the `MLGraphBuilder` methods for creating the corresponding `MLOperand` or `MLActivation`, e.g. `{{MLGraphBuilder/sigmoid()}}`. This provides consistent styling, and provides a thorough overview of the operator, even if the method itself isn't being discussed.


### Formatting

10 changes: 5 additions & 5 deletions index.bs
@@ -1637,7 +1637,7 @@ partial interface MLGraphBuilder {
<div class="note">
<details open>
<summary>
- The behavior of this operation when the input tensor is 4-D of the {{MLInputOperandLayout/"nchw"}} layout and the activation is of operator type *relu* can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint.
+ The behavior of this operation when the input tensor is 4-D of the {{MLInputOperandLayout/"nchw"}} layout and the activation is {{MLGraphBuilder/relu()}} can be generically emulated from the usage of other operations as follow. However, user agents typically have a more efficient implementation for it, therefore its usage is encouraged from the performance standpoint.
</summary>
<pre highlight="js">
const shape = [1,null,1,1];
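As a rough illustration of the fused-activation equivalence the note above relies on, here is a NumPy sketch (Python standing in for the spec's JavaScript; the tensor values are invented for the example): applying relu as a separate elementwise step on a conv2d output is mathematically the same as computing the convolution with a fused relu activation, which is why the emulation is possible at all.

```python
import numpy as np

def relu(x):
    """Elementwise max(0, x), the activation the note emulates."""
    return np.maximum(0.0, x)

# Toy 4-D "nchw" tensor standing in for a conv2d output (made-up values).
conv_output = np.array([[[[-1.5]], [[0.0]], [[2.5]]]])  # shape (1, 3, 1, 1)

# Applying relu as a separate step yields the same result as a fused
# conv2d-with-relu activation would; negative values are clamped to zero.
fused_equivalent = relu(conv_output)
```

This is only the activation half of the emulation; the spec's snippet additionally rebuilds the bias addition with reshape and add.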
@@ -2508,7 +2508,7 @@ partial interface MLGraphBuilder {
</div>

<div class="note">
- Although operations *greaterOrEqual* and *lesserOrEqual* can each be implemented in terms of operations *not*, *lesser*, and *greater* in other words `greater-or-equal(a, b)` is `not(lesser(a, b))`, they are specifically defined to handle NaN cases and for performance reason to avoid double comparisons.
+ Although operations {{MLGraphBuilder/greaterOrEqual()}} and {{MLGraphBuilder/lesserOrEqual()}} can each be implemented in terms of operations {{MLGraphBuilder/not()}}, {{MLGraphBuilder/lesser()}}, and {{MLGraphBuilder/greater()}} in other words `builder.greaterOrEqual(a, b)` is `builder.not(builder.lesser(a, b))`, they are specifically defined to handle NaN cases and for performance reason to avoid double comparisons.
</div>
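The NaN rationale in the note above can be demonstrated with a short NumPy sketch (an illustration only, not WebNN API code). Any comparison involving NaN is false, so negating a `lesser` result wrongly reports true for NaN elements, whereas a directly defined greater-or-equal does not:

```python
import numpy as np

a = np.array([1.0, np.nan, 3.0])
b = np.array([2.0, 2.0, 2.0])

# Direct comparison: every comparison against NaN is false.
direct = np.greater_equal(a, b)

# Emulation via not(lesser(a, b)): less() is also false for the NaN
# element, so the negation flips it to True - disagreeing with `direct`.
emulated = np.logical_not(np.less(a, b))
```

The two results differ exactly at the NaN position, which is why the spec defines these operators separately rather than in terms of each other.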

<details open algorithm>
@@ -3368,7 +3368,7 @@ partial interface MLGraphBuilder {
<div class="note">
<details open>
<summary>
- The behavior of this operation can be generically emulated via other operations as shown below, when the weight layout is the default {{MLGruWeightLayout/"zrn"}} layout, and the activation functions of the update/reset gate and new gate are of the operator types *sigmoid* and *tanh* respectively.
+ The behavior of this operation can be generically emulated via other operations as shown below, when the weight layout is the default {{MLGruWeightLayout/"zrn"}} layout, and the activation functions of the update/reset gate and new gate are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively.
</summary>
<pre highlight="js">
const one = builder.constant(1);
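To make the gate arithmetic concrete, here is a minimal single-step GRU cell in NumPy using the "zrn" gate ordering with sigmoid for the update/reset gates and tanh for the new gate. This is a hypothetical helper for illustration only (the function name, weight shapes, and the exact placement of the recurrent term in the new gate are assumptions), not the spec's emulation algorithm:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, W, R, b):
    """One GRU step, "zrn" gate order: update (z), reset (r), new (n).
    Assumed shapes: W is (3h, input), R is (3h, h), b is (3h,).
    Hypothetical sketch, not the spec's algorithm."""
    hid = h.shape[0]
    gates = W @ x + b          # input contribution to all three gates
    rec = R @ h                # recurrent contribution to all three gates
    z = sigmoid(gates[:hid] + rec[:hid])                    # update gate
    r = sigmoid(gates[hid:2*hid] + rec[hid:2*hid])          # reset gate
    n = np.tanh(gates[2*hid:] + r * rec[2*hid:])            # new gate
    return (1.0 - z) * n + z * h
```

With all-zero inputs and weights the gates settle at 0.5 and the new hidden state is zero, which is a quick sanity check of the wiring.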
@@ -4372,7 +4372,7 @@ partial interface MLGraphBuilder {
<div class="note">
<details open>
<summary>
- The behavior of this operation can be generically emulated via other operations as shown below, when the weight layout is the default {{MLLstmWeightLayout/"iofg"}} layout, and the activation functions of the input/forget/output gate and the cell gate/the cell state's filter for the output hidden state are of the operator types *sigmoid* and *tanh* respectively.
+ The behavior of this operation can be generically emulated via other operations as shown below, when the weight layout is the default {{MLLstmWeightLayout/"iofg"}} layout, and the activation functions of the input/forget/output gate and the cell gate/the cell state's filter for the output hidden state are {{MLGraphBuilder/sigmoid()}} and {{MLGraphBuilder/tanh()}} respectively.
</summary>
<pre highlight="js">
const zero = builder.constant(0);
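Similarly, a minimal single-step LSTM cell in NumPy with the "iofg" gate ordering shows where sigmoid and tanh enter: sigmoid for the input/output/forget gates, tanh for the cell gate and for filtering the cell state into the output hidden state. A hypothetical sketch (names and shapes are assumptions), not the spec's emulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h, c, W, R, b):
    """One LSTM step, "iofg" gate order: input, output, forget, cell.
    Assumed shapes: W is (4h, input), R is (4h, h), b is (4h,).
    Hypothetical sketch, not the spec's algorithm."""
    hid = h.shape[0]
    g = W @ x + R @ h + b
    i = sigmoid(g[:hid])            # input gate
    o = sigmoid(g[hid:2*hid])       # output gate
    f = sigmoid(g[2*hid:3*hid])     # forget gate
    cell = np.tanh(g[3*hid:])       # cell gate
    c_new = f * c + i * cell
    h_new = o * np.tanh(c_new)      # tanh filters the output hidden state
    return h_new, c_new
```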
@@ -5290,7 +5290,7 @@ partial interface MLGraphBuilder {
<div class="note">
<details open>
<summary>
- Many shape-related operations such as [squeeze](https://pytorch.org/docs/stable/generated/torch.squeeze.html), [unsqueeze](https://pytorch.org/docs/stable/generated/torch.unsqueeze.html), and [flatten](https://pytorch.org/docs/stable/generated/torch.flatten.html) can be generically implemented using the *reshape*}} operation as follows:
+ Many shape-related operations such as [squeeze](https://pytorch.org/docs/stable/generated/torch.squeeze.html), [unsqueeze](https://pytorch.org/docs/stable/generated/torch.unsqueeze.html), and [flatten](https://pytorch.org/docs/stable/generated/torch.flatten.html) can be generically implemented using the {{MLGraphBuilder/reshape()}} operation as follows:
</summary>
<pre highlight="js">
// Returns a tensor with all specified dimensions of input of size 1 removed.
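The reshape-based squeeze described above can be sketched in NumPy (a hypothetical helper mirroring the idea, not the spec's JavaScript sample): since removing size-1 dimensions never moves any element, computing the target shape and calling reshape is sufficient.

```python
import numpy as np

def squeeze_via_reshape(x, axes=None):
    """Remove size-1 dimensions using only reshape.
    Hypothetical helper; loosely mirrors torch.squeeze semantics."""
    if axes is None:
        axes = [i for i, d in enumerate(x.shape) if d == 1]
    # Keep every dimension except listed axes whose size is 1.
    new_shape = [d for i, d in enumerate(x.shape)
                 if not (i in axes and d == 1)]
    return x.reshape(new_shape)
```

Unsqueeze and flatten fall out the same way: both are reshapes to a shape computed from the input's shape and the requested axes.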
