In the context of artificial neural networks, the rectifier or ReLU (rectified linear unit) is an activation function defined as the non-negative part of its argument, i.e., the ramp function:
$ \operatorname{ReLU}(x) = x^{+} = \max(0, x) = \frac{x + |x|}{2} = \begin{cases} x & \text{if } x > 0, \\ 0 & \text{otherwise.} \end{cases} $
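For illustration (a minimal sketch, not part of the original article), the piecewise definition and its closed form $ \frac{x + |x|}{2} $ can be written directly in plain Python:

<syntaxhighlight lang="python">
def relu(x: float) -> float:
    """Rectified linear unit: the non-negative part of x."""
    return max(0.0, x)

def relu_closed_form(x: float) -> float:
    """Equivalent closed form (x + |x|) / 2 from the equation above."""
    return (x + abs(x)) / 2.0

# Both forms agree on positive and negative inputs.
assert relu(3.5) == relu_closed_form(3.5) == 3.5
assert relu(-2.0) == relu_closed_form(-2.0) == 0.0
</syntaxhighlight>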