
MathJax Tests

First post, woo hoo! This blog will roughly be about my thoughts on research, and maybe other things; I haven't really decided yet. Probably some notes and projects.

In the meantime, here are some tests to ensure MathJax is working.

\begin{equation} x_{t+1} = \prod_{\mathcal{K}} \left(x_{t} - \eta \nabla_t\right) \tag{1} \label{eq:OGD} \end{equation}

Equation \eqref{eq:OGD} is the online gradient descent update. See \eqref{eq:vandermonde} for a matrix. Consider \(x, y \in \mathbb{R}\); then \(x + y \in \mathbb{R}\) as well (this math is inline).
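
Since \eqref{eq:OGD} just says "take a gradient step, then project back onto \(\mathcal{K}\)", here's a minimal Python sketch of the update. For concreteness it assumes \(\mathcal{K}\) is a Euclidean ball (the equation doesn't fix a particular set), and the names are made up for illustration:

```python
import numpy as np

def project_onto_ball(x, radius=1.0):
    # Stand-in for the projection onto K in (1): Euclidean projection
    # onto the ball {x : ||x||_2 <= radius}.
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def ogd_step(x_t, grad_t, eta=0.1):
    # One update of equation (1): step along the negative gradient, then project.
    return project_onto_ball(x_t - eta * grad_t)

x = np.zeros(3)
x = ogd_step(x, np.array([1.0, -2.0, 0.5]))
print(x)
```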

Here’s the Vandermonde matrix:

\begin{equation}
\begin{pmatrix}
1 & a_1 & {a_1}^2 & \cdots & {a_1}^n \\
1 & a_2 & {a_2}^2 & \cdots & {a_2}^n \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & a_m & {a_m}^2 & \cdots & {a_m}^n
\end{pmatrix}
\tag{2.1} \label{eq:vandermonde}
\end{equation}
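
If you'd rather poke at \eqref{eq:vandermonde} numerically than squint at the rendering, NumPy builds the same matrix; a quick sketch with some arbitrary \(a_i\):

```python
import numpy as np

a = np.array([2.0, 3.0, 5.0])   # the a_1, ..., a_m (m = 3 here)
n = 4                           # highest power in the last column

# Columns 1, a_i, a_i^2, ..., a_i^n, matching (2.1).
V = np.vander(a, N=n + 1, increasing=True)
print(V)
```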

Here’s some matrix multiplication:

\begin{align}
\begin{pmatrix} 1 & 0 \\ 0 & e^{i\pi} \end{pmatrix}
\begin{pmatrix} u \\ v \end{pmatrix}
&= \begin{pmatrix} u \\ -v \end{pmatrix} \tag{2.2} \\
\large\equiv \\
\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
&= \begin{bmatrix} x \\ -y \end{bmatrix}
\end{align}
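
Same identity in code, just as a throwaway check that \(e^{i\pi}\) really does flip the second coordinate:

```python
import numpy as np

# e^{i*pi} = -1, so the matrix in (2.2) is just diag(1, -1).
R = np.array([[1.0, 0.0],
              [0.0, np.exp(1j * np.pi).real]])

v = np.array([3.0, 7.0])
print(R @ v)   # [ 3. -7.]: the second coordinate flips sign, as claimed.
```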

And finally, we have the Cauchy-Schwarz inequality:

\begin{equation}
\left( \sum_{k=1}^n a_k b_k \right)^2 \leq \left( \sum_{k=1}^n a_k^2 \right) \left( \sum_{k=1}^n b_k^2 \right)
\tag{3}
\end{equation}
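
And a quick numerical sanity check of the inequality on random vectors, nothing more:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = rng.standard_normal(10), rng.standard_normal(10)

lhs = np.dot(a, b) ** 2               # (sum a_k b_k)^2
rhs = np.dot(a, a) * np.dot(b, b)     # (sum a_k^2)(sum b_k^2)
assert lhs <= rhs
print(lhs, rhs)
```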

Here’s a useful link to some MathJax tricks.