From 3135e83f6472f38fd9c9dbfaf7fa68b4ef4f8855 Mon Sep 17 00:00:00 2001
From: skindhu
Date: Sun, 27 Oct 2024 11:27:59 +0800
Subject: [PATCH] add third chapter

---
 Book/3.实现注意力机制.md | 10 +++++++---
 1 file changed, 7 insertions(+), 3 deletions(-)

diff --git a/Book/3.实现注意力机制.md b/Book/3.实现注意力机制.md
index 8d44d0e..1a971fd 100644
--- a/Book/3.实现注意力机制.md
+++ b/Book/3.实现注意力机制.md
@@ -284,11 +284,15 @@ Sum: tensor(1.)
 >
 >
 > 1. **How Softmax works**
 >
-> The Softmax formula is as follows: $\operatorname{softmax}\left(z_{i}\right)=\frac{e^{z_{i}}}{\sum_{j} e^{z_{j}}}$
->
->
+> The Softmax formula is as follows:
+>
+> $$
+> \operatorname{softmax}\left(z_{i}\right)=\frac{e^{z_{i}}}{\sum_{j} e^{z_{j}}}
+> $$
+>
+>
 >
 > This is an inline formula: $E=mc^2$.
 >
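
The hunk above centers on the softmax definition $\operatorname{softmax}(z_{i})=\frac{e^{z_{i}}}{\sum_{j} e^{z_{j}}}$ and the chapter's `Sum: tensor(1.)` check. A minimal pure-Python sketch of that formula (the `softmax` helper here is illustrative, not code from the chapter, which appears to use PyTorch):

```python
import math

def softmax(z):
    # softmax(z_i) = e^{z_i} / sum_j e^{z_j}.
    # Subtracting max(z) first is a standard numerical-stability trick;
    # it cancels in the ratio and does not change the result.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

weights = softmax([0.1, 0.2, 0.3])
print(sum(weights))  # the weights sum to 1 (up to floating-point error)
```

The normalization is what makes the outputs usable as attention weights: each entry is positive and the entries sum to 1, mirroring the `Sum: tensor(1.)` output shown in the chapter.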