(You do not have to) Start Here

Welcome to my page! I made this website to maintain (or retain) my habit of writing.
With the aid of Codex, I organized all the posts chronologically in this meta-post entitled “(You do not have to) Start Here”. This page functions as both the home page and a parent post that nests all the child posts in this repository.
Each post has a comment section where you can start or join a conversation. Feel free to open a post and leave a remark!

Contacts

Salients

Ape, Inc

Plan for Primal Times

  • We live and grow like hermit crabs do; we must keep finding a better shelter that fits us.
    • We live and grow like apes, crafting every recognized entity with our own hands.
  • But what distinguishes us humans from other species is that we have memory, text, and a cognitive system for structures.
    • We thus create incremental, internal, recursive patterns of wall systems to better hide our raw bodies and avoid exposure to the exterior.
  • We are one step behind cyborgs, one step away from apes.
  • If it is not written, it does not exist.

Naive Hope

The Second Birth of Early Human

Since the moment we observed the auto-reproduction of the Tower of Babel that we built while we ascended, and became convinced that we would never see the other end of its fate, we decided to descend.

On the ground, we would find the locals, animals, and nature—things we had never been able to understand so well until then, by virtue of the loss of words.

Questions that Comprise One’s Self

The question that triggers me the most is:

Does a generative model (I see it as the second being) understand the structure of what it generates?

A series of questions awaits a series of answers before I can tackle the above question. For example,

  • What does it mean to generate?
  • What is the second being and what is not?
  • Why do we care about the structure?

I have yet to discover the methods to answer the primal question.

A few words that would describe the area of my research are:

Geometric Deep Learning, Structure Learning, Generative Models

The examples of the data format I deal with every day are:

Graphs, images, and text

My research sounds vague, broad and obscure. But my question, essentially, is about the obscure.

2025/09/11


Notes

2025/09/13

  • There is nothing happier than receiving money for the mathematics I do.
  • All the assumptions I had made to build a self-image were to be discarded before entering the tower.

2025/09/11

  • The definition of one’s intelligence is the ability to cognize and localize one’s self in the embedding context and act on the persona.
  • For sure, I will only be resituated in the world reflecting my position in the stream of society, however implicitly.

About

About me

I work at the intersection of research and engineering, building creative systems with machine learning. My everyday tools include probabilistic models, geometric deep learning, and quick sketches that explain how the pieces attach.

What I’m exploring

  • Generative models for creative applications
  • Optimization techniques that make training actually finish
  • Interfaces that help humans collaborate with their models

Why this feed exists

The post stream doubles as my lab notebook. Ideas, demos, and formal writing show up together so you can see how one informs the next.

Reach out

Collaborations, critiques, and wild hypotheses are all welcome. You can reach me at kunosho1225@g.ecc.u-tokyo.ac.jp or find a more personal note in the tagged posts.

Mathematical Expressions and Visual Content Test

This post demonstrates the rendering capabilities for various mathematical expressions, images, and visual content on the blog.

Basic Mathematical Expressions

Inline Math

Here’s some inline math: $E = mc^2$, and the Pythagorean theorem:

$$\begin{align} a^2 + b^2 &= c^2 \\ b^2 + c^2 &= d^2 + e^2 \end{align}$$

Block Math

Here are some more complex mathematical expressions:

$$\begin{align} \int_{-\infty}^{\infty} e^{-x^2} \, dx &= \sqrt{\pi} \\ \sum_{n=1}^{\infty} \frac{1}{n^2} &= \frac{\pi^2}{6} \end{align}$$

Advanced Mathematical Concepts

Matrix Operations

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} ax + by \\ cx + dy \end{pmatrix}$$
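The product above can be sanity-checked numerically; a minimal NumPy sketch with arbitrary values:

```python
import numpy as np

# A 2x2 matrix times a 2-vector, matching the block equation above
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # plays the role of [[a, b], [c, d]]
v = np.array([5.0, 6.0])     # plays the role of [x, y]

result = A @ v               # [a*x + b*y, c*x + d*y]
print(result)                # [17. 39.]
```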

Probability and Statistics

The probability density function of a normal distribution:

$$f(x|\mu,\sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$

Bayes’ theorem:

$$P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}$$
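To make the formula concrete, a tiny worked example with made-up probabilities:

```python
# Bayes' theorem with made-up numbers: P(A)=0.3, P(B|A)=0.8, P(B|~A)=0.1
p_a = 0.3
p_b_given_a = 0.8
p_b_given_not_a = 0.1

# Total probability: P(B) = P(B|A)P(A) + P(B|~A)P(~A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Posterior via Bayes' theorem
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # 0.7742
```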

Calculus

Fundamental theorem of calculus:

$$\int_a^b f'(x) \, dx = f(b) - f(a)$$

Chain rule for derivatives:

$$\frac{d}{dx}[f(g(x))] = f'(g(x)) \cdot g'(x)$$
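A quick finite-difference check of the chain rule (an illustrative sketch, with h(x) = sin(x²) chosen arbitrarily):

```python
import math

# Chain rule check: h(x) = sin(x^2), so h'(x) = cos(x^2) * 2x
def h(x):
    return math.sin(x ** 2)

x0 = 0.7
analytic = math.cos(x0 ** 2) * 2 * x0

# Central finite difference approximation of h'(x0)
eps = 1e-6
numeric = (h(x0 + eps) - h(x0 - eps)) / (2 * eps)

print(abs(analytic - numeric) < 1e-6)  # True
```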

Machine Learning Mathematics

Neural Network Forward Pass

$$z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]}$$

$$a^{[l]} = \sigma(z^{[l]})$$
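A minimal NumPy sketch of one such layer (random toy weights; sigmoid chosen as σ for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# One layer of the forward pass: z = W a_prev + b, a = sigma(z)
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

a_prev = rng.standard_normal(4)   # activations from layer l-1
W = rng.standard_normal((3, 4))   # weights of layer l
b = rng.standard_normal(3)        # biases of layer l

z = W @ a_prev + b                # pre-activation
a = sigmoid(z)                    # activation

print(a.shape)                           # (3,)
print((a > 0).all() and (a < 1).all())   # True: sigmoid outputs lie in (0, 1)
```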

Loss Functions

Mean squared error:

$$\mathcal{L}_{MSE} = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$$

Cross-entropy loss:

$$\mathcal{L}_{CE} = -\frac{1}{n} \sum_{i=1}^{n} \sum_{c=1}^{C} y_{i,c} \log(\hat{y}_{i,c})$$
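Both losses are easy to evaluate on toy data; a minimal NumPy sketch with made-up values:

```python
import numpy as np

# Mean squared error for a toy regression batch
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 2.0, 2.0])
mse = np.mean((y_true - y_pred) ** 2)
print(round(mse, 4))  # (0.25 + 0 + 1) / 3 -> 0.4167

# Cross-entropy for a toy 3-class batch (rows: samples, cols: classes)
y_onehot = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
y_prob = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])
ce = -np.mean(np.sum(y_onehot * np.log(y_prob), axis=1))
print(round(ce, 4))   # 0.2899
```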

Optimization

Gradient descent update rule:

$$\theta_{t+1} = \theta_t - \alpha \nabla_\theta \mathcal{L}(\theta_t)$$
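A minimal sketch of this update on the toy objective L(θ) = θ² (illustrative values):

```python
# Gradient descent on L(theta) = theta^2, whose gradient is 2*theta
theta = 5.0
alpha = 0.1  # learning rate

for _ in range(100):
    grad = 2 * theta            # nabla L(theta)
    theta = theta - alpha * grad

print(abs(theta) < 1e-6)  # True: converges toward the minimizer theta = 0
```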

Adam optimizer:

$$m_t = \beta_1 m_{t-1} + (1-\beta_1) g_t$$

$$v_t = \beta_2 v_{t-1} + (1-\beta_2) g_t^2$$

$$\hat{m}_t = \frac{m_t}{1-\beta_1^t}, \quad \hat{v}_t = \frac{v_t}{1-\beta_2^t}$$

$$\theta_{t+1} = \theta_t - \frac{\alpha}{\sqrt{\hat{v}_t} + \epsilon} \hat{m}_t$$
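The four Adam updates translate directly into code; a plain-NumPy sketch on the same toy quadratic (all values illustrative):

```python
import numpy as np

# Plain-NumPy Adam sketch minimizing L(theta) = theta^2 (gradient 2*theta)
alpha, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8
theta, m, v = 5.0, 0.0, 0.0

for t in range(1, 501):
    g = 2 * theta                        # gradient g_t
    m = beta1 * m + (1 - beta1) * g      # first-moment estimate
    v = beta2 * v + (1 - beta2) * g**2   # second-moment estimate
    m_hat = m / (1 - beta1**t)           # bias correction
    v_hat = v / (1 - beta2**t)
    theta -= alpha * m_hat / (np.sqrt(v_hat) + eps)

print(abs(theta) < 1.0)  # True: theta has moved from 5 toward the minimum at 0
```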

Set Theory and Logic

Set Operations

$$A \cup B = \{x : x \in A \text{ or } x \in B\}$$

$$A \cap B = \{x : x \in A \text{ and } x \in B\}$$

$$A \setminus B = \{x : x \in A \text{ and } x \notin B\}$$
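Python’s built-in set type mirrors these operations directly:

```python
# Union, intersection, and difference on small example sets
A = {1, 2, 3, 4}
B = {3, 4, 5}

print(A | B)  # union: {1, 2, 3, 4, 5}
print(A & B)  # intersection: {3, 4}
print(A - B)  # difference A \ B: {1, 2}
```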

Logical Expressions

$$\forall x \in \mathbb{R}, \exists y \in \mathbb{R} : y > x$$

$$\neg (P \land Q) \equiv (\neg P) \lor (\neg Q)$$

Complex Mathematical Structures

Fourier Transform

The Fourier Transform reveals the frequency content of a signal by decomposing it into its constituent sinusoidal components:

$$\mathcal{F}\{f(t)\} = F(\omega) = \int_{-\infty}^{\infty} f(t) e^{-i\omega t} \, dt$$
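Numerically, the discrete counterpart is available as np.fft.fft; a small sketch that recovers the frequency of a pure cosine (sampling values chosen for illustration):

```python
import numpy as np

# A pure 5 Hz cosine sampled at 64 Hz for one second should
# produce a spectral peak at exactly 5 Hz.
fs = 64                       # sampling rate in Hz
t = np.arange(fs) / fs
signal = np.cos(2 * np.pi * 5 * t)

spectrum = np.fft.fft(signal)
freqs = np.fft.fftfreq(len(signal), d=1 / fs)

# Index of the largest magnitude among the non-negative frequencies
half = len(signal) // 2
peak = np.argmax(np.abs(spectrum[:half]))
print(freqs[peak])            # 5.0
```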

Fourier Transform Animation

This animated visualization shows the intuitive understanding of the Fourier Transform:

  • Top Left: The combined time-domain signal being built up
  • Top Right: Rotating phasors representing each frequency component
  • Bottom Left: Individual sinusoidal components with different frequencies
  • Bottom Right: The frequency spectrum showing the magnitude of each component

The rotating phasors demonstrate how each frequency component contributes to the overall signal, with the rotation speed corresponding to the frequency and the radius corresponding to the amplitude.

Taylor Series

$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n$$
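A quick check of the series for eˣ around a = 0 (illustrative values):

```python
import math

# Taylor series of e^x around a = 0: sum of x^n / n!
def exp_taylor(x, terms):
    return sum(x ** n / math.factorial(n) for n in range(terms))

x = 1.5
approx = exp_taylor(x, 20)
print(abs(approx - math.exp(x)) < 1e-12)  # True: 20 terms suffice here
```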

Eigenvalue Decomposition

$$A\mathbf{v} = \lambda\mathbf{v}$$

where $\mathbf{v}$ is an eigenvector and $\lambda$ is the corresponding eigenvalue.
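NumPy can verify the relation directly; a small sketch with an arbitrary symmetric matrix:

```python
import numpy as np

# Verify A v = lambda v for a symmetric 2x2 matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order: 1 and 3
v = eigvecs[:, 0]                      # eigenvector for the smallest eigenvalue
lam = eigvals[0]

print(np.allclose(A @ v, lam * v))     # True
```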

Citations Test

Academic References

Mathematical foundations are built upon rigorous proofs \cite{louDiscreteDiffusionModeling2024}. The understanding of discrete structures has evolved significantly \cite{gatDiscreteFlowMatching2024a}.

Visual Content Placeholders

Static Images

Note: In a real blog post, you would include images like:

  • Mathematical diagrams and plots
  • Algorithm flowcharts
  • Neural network architectures
  • Data visualizations

Animated Content

For animated content, you could include:

  • GIFs showing mathematical transformations
  • Interactive plots and graphs
  • Algorithm step-by-step animations
  • Mathematical concept demonstrations

Code Blocks with Math Comments

```python
import numpy as np
import matplotlib.pyplot as plt

# Generate data for f(x) = x^2
x = np.linspace(-10, 10, 100)
y = x**2  # This represents the function f(x) = x²

# Plot the quadratic function
plt.figure(figsize=(8, 6))
plt.plot(x, y, 'b-', linewidth=2, label='$f(x) = x^2$')
plt.xlabel('$x$')
plt.ylabel('$f(x)$')
plt.title('Quadratic Function: $f(x) = x^2$')
plt.grid(True, alpha=0.3)
plt.legend()
plt.show()
```

Mathematical Tables

| Function | Derivative | Integral |
| --- | --- | --- |
| $x^n$ | $nx^{n-1}$ | $\frac{x^{n+1}}{n+1} + C$ |
| $e^x$ | $e^x$ | $e^x + C$ |
| $\ln(x)$ | $\frac{1}{x}$ | $x\ln(x) - x + C$ |
| $\sin(x)$ | $\cos(x)$ | $-\cos(x) + C$ |
| $\cos(x)$ | $-\sin(x)$ | $\sin(x) + C$ |

Conclusion

This test post demonstrates the blog’s capability to render:

  1. Inline and block mathematical expressions using KaTeX

  2. Complex mathematical notation including matrices, integrals, and summations

  3. Academic citations with proper bibliography generation

  4. Code blocks with mathematical comments

  5. Tables with mathematical content

The rendering system successfully handles both simple expressions like $E = mc^2$ and complex multi-line equations with proper formatting and spacing.