TikZ Trove

https://github.com/xiaohanyu/awesome-tikz

https://github.com/PetarV-/TikZ

Neural networks gallery: https://tikz.net/neural_networks

Neural Network 1D-2D Cross-Connection Visualization

Keywords: Technical illustration, Network topology, Dimensional transition diagram, Feature transformation, Node connectivity visualization, Matrix representation

A technical visualization demonstrating the dimensional transformation between 1D and 2D feature spaces in neural networks, rendered as a matrix of interconnected nodes.

1D-2D Cross-Connection Visualization
2D Convolution Operation Visualization

Keywords: Mathematical visualization, Convolution operation, Matrix multiplication, Kernel operation, Technical diagram, Feature extraction illustration

A detailed visualization of the 2D convolution operation, demonstrating how a 3×3 kernel matrix slides across a binary image to produce convolved features, with element-wise multiplication and summation clearly illustrated through color-coding and mathematical notation.

2D Convolution Operation
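
For reference, here is a minimal NumPy sketch of the operation the diagram depicts: a 3×3 kernel slides over a binary image and the element-wise products are summed at each position (the usual deep-learning convention, i.e. cross-correlation without kernel flipping). The example matrices are illustrative, not taken from the figure.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel over the image and sum the element-wise products."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.array([[1, 1, 1, 0, 0],
                  [0, 1, 1, 1, 0],
                  [0, 0, 1, 1, 1],
                  [0, 0, 1, 1, 0],
                  [0, 1, 1, 0, 0]])
kernel = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 0, 1]])
print(conv2d_valid(image, kernel))  # 3x3 convolved feature map
```
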
À Trous Convolution Network Architecture

Keywords: Deep learning architecture, Dilated convolution, Neural network topology, Temporal dependencies, Network flow diagram, Sequential processing

A sophisticated visualization of an à trous (dilated) convolutional network architecture, showing how the receptive field grows exponentially through layers with increasing dilation factors, enabling the network to capture long-range dependencies efficiently without increasing the number of parameters.

À Trous Convolution Architecture
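
A rough Python sketch of the idea, assuming a kernel of size 3 and dilation factors that double per layer: each layer spaces its taps `dilation` samples apart, so the receptive field grows by 2·d per layer while the number of weights stays fixed.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """1D à trous convolution: kernel taps are spaced `dilation` samples apart."""
    k = len(w)
    span = (k - 1) * dilation + 1              # span covered by one kernel application
    out = np.zeros(len(x) - span + 1)
    for i in range(len(out)):
        out[i] = sum(w[j] * x[i + j * dilation] for j in range(k))
    return out

x = np.arange(32, dtype=float)
h = x
receptive = 1
for d in (1, 2, 4, 8):                         # dilation doubles at each layer
    h = dilated_conv1d(h, np.array([0.25, 0.5, 0.25]), d)
    receptive += 2 * d                         # kernel size 3 adds 2*d per layer
    print(f"dilation {d}: receptive field {receptive} samples")
```
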
A3C Algorithm Execution Flow

Keywords: Reinforcement learning architecture, Asynchronous advantage actor-critic, Parallel agents, Queue visualization, State-action flow diagram, Parameter synchronization

A technical visualization of the Asynchronous Advantage Actor-Critic (A3C) algorithm’s execution flow, showing multiple parallel agents interacting with their environments, a global parameter server, and a queue system for gradient updates. The diagram illustrates the asynchronous nature of the algorithm with agents maintaining local copies of parameters (θ’, ψ’) while periodically updating the global parameters (θ, ψ).

A3C Algorithm Execution Flow
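
A structural sketch of that execution flow in Python: each worker thread copies the global parameters, interacts with its own environment, and pushes updates back asynchronously. The `compute_gradients` helper and the rollout are hypothetical placeholders standing in for the real actor-critic losses and environment interaction.

```python
import threading
import numpy as np

# Hypothetical global parameters (theta: policy, psi: value) and a lock
# standing in for the parameter server / gradient queue shown in the diagram.
global_params = {"theta": np.zeros(8), "psi": np.zeros(8)}
lock = threading.Lock()

def compute_gradients(local_params, rollout):
    """Placeholder: a real worker would backpropagate the A3C losses here."""
    return {k: np.random.randn(*v.shape) * 0.01 for k, v in local_params.items()}

def worker(env_seed, steps=100, lr=0.1):
    rng = np.random.default_rng(env_seed)
    for _ in range(steps):
        with lock:                                    # sync local copies theta', psi'
            local = {k: v.copy() for k, v in global_params.items()}
        rollout = rng.normal(size=16)                 # stand-in for environment interaction
        grads = compute_gradients(local, rollout)
        with lock:                                    # asynchronous global update
            for k in global_params:
                global_params[k] -= lr * grads[k]

threads = [threading.Thread(target=worker, args=(seed,)) for seed in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```
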
A3C Neural Network Architecture

Keywords: Deep learning architecture, Actor-critic network, Policy network, Value network, Shared parameters, Neural network topology

A detailed visualization of the A3C (Asynchronous Advantage Actor-Critic) neural network architecture, featuring shared layers that branch into policy (πθ) and value (Vψ) networks. The diagram illustrates how a single state input flows through shared feature extraction layers before splitting into action probabilities (blue) and state value estimation (red), with the shared parameters highlighted by the green overlay.

A3C Neural Network Architecture
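
A minimal NumPy forward pass showing the same branching: one shared feature layer feeding a softmax policy head and a scalar value head. The layer sizes are arbitrary placeholders, not taken from the figure.

```python
import numpy as np

rng = np.random.default_rng(0)
state_dim, hidden, n_actions = 4, 16, 3

# Shared feature-extraction layer (the green overlay in the diagram),
# followed by separate policy (pi_theta) and value (V_psi) heads.
W_shared = rng.normal(size=(state_dim, hidden))
W_policy = rng.normal(size=(hidden, n_actions))
W_value  = rng.normal(size=(hidden, 1))

def forward(state):
    h = np.tanh(state @ W_shared)                   # shared representation
    logits = h @ W_policy
    probs = np.exp(logits) / np.exp(logits).sum()   # action probabilities pi_theta
    value = (h @ W_value).item()                    # state value V_psi
    return probs, value

probs, value = forward(rng.normal(size=state_dim))
print(probs, value)
```
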
Amplitude Modulation Wave Visualization

Keywords: Signal processing, Wave modulation, Carrier wave, AM signal, Waveform visualization, Signal multiplication

A mathematical visualization of the amplitude modulation (AM) process, showing three key components: the original message signal x(t) (top), the high-frequency carrier wave (middle, blue), and the resulting amplitude-modulated signal (bottom, red). The diagram demonstrates how the carrier wave’s amplitude is modulated by the message signal to produce the final AM waveform.

Amplitude Modulation Wave
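
As a worked example, one common AM formulation is s(t) = (1 + m·x(t))·cos(2π f_c t); the figure may simply show x(t) multiplied by the carrier, but the envelope behaviour is the same. The frequencies and modulation index below are illustrative.

```python
import numpy as np

fs, f_carrier, f_msg = 10_000, 500.0, 20.0      # sample rate and frequencies (Hz)
t = np.arange(0, 0.1, 1 / fs)

message = np.sin(2 * np.pi * f_msg * t)          # x(t), the message signal (top panel)
carrier = np.cos(2 * np.pi * f_carrier * t)      # high-frequency carrier (middle panel)
m = 0.8                                          # modulation index
am_signal = (1 + m * message) * carrier          # amplitude-modulated waveform (bottom panel)
```
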
Bidirectional LSTM Network Architecture

Keywords: Neural network topology, Bidirectional processing, LSTM architecture, Sequential data processing, Forward-backward connections, Deep learning diagram

A technical visualization of a Bidirectional Long Short-Term Memory (BiLSTM) network, showing parallel forward (LSTM→) and backward (LSTM←) processing paths. The diagram illustrates how the network processes input sequences (x̄ᵢ) in both directions simultaneously, combining forward (h̄ᵢ→) and backward (h̄ᵢ←) hidden states to produce final outputs (h̄ᵢ), enabling the network to capture both past and future context.

Bidirectional LSTM Architecture
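
A simplified Python sketch of the bidirectional combination: one recurrent pass left-to-right, one right-to-left, and the two hidden states concatenated per time step. A plain tanh cell stands in for the LSTM gates here, since the point is the forward/backward wiring rather than the cell internals.

```python
import numpy as np

rng = np.random.default_rng(1)
d_in, d_h, T = 3, 5, 6
xs = rng.normal(size=(T, d_in))                      # input sequence x_1 .. x_T

def run_direction(xs, Wx, Wh):
    """Simplified recurrent cell (stands in for an LSTM) run over a sequence."""
    h = np.zeros(d_h)
    states = []
    for x in xs:
        h = np.tanh(x @ Wx + h @ Wh)
        states.append(h)
    return states

Wx_f, Wh_f = rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h))
Wx_b, Wh_b = rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h))

forward  = run_direction(xs, Wx_f, Wh_f)              # left-to-right pass
backward = run_direction(xs[::-1], Wx_b, Wh_b)[::-1]  # right-to-left pass, re-aligned
outputs  = [np.concatenate([f, b]) for f, b in zip(forward, backward)]
print(outputs[0].shape)                               # (2 * d_h,) per time step
```
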
Burrows-Wheeler Transform Process

Keywords: String transformation, BWT algorithm, Cyclic rotations, Lexicographical sorting, Text preprocessing, Compression visualization

A step-by-step visualization of the Burrows-Wheeler Transform (BWT) process, showing how the input string “ACAACG” is transformed through cyclic rotations and lexicographical sorting. The diagram illustrates the three key steps: generating all cyclic rotations, sorting them alphabetically, and extracting the last column along with the original string’s position to produce the final BWT output (CGAAAC, 2).

Burrows-Wheeler Transform Process
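
The transform itself is short enough to state directly; this sketch reproduces the (CGAAAC, 2) result from the diagram (the row index is 0-based in the code, so it prints 1).

```python
def bwt(s):
    """Burrows-Wheeler Transform: rotate, sort, take the last column."""
    rotations = [s[i:] + s[:i] for i in range(len(s))]   # all cyclic rotations
    rotations.sort()                                     # lexicographic sort
    last_column = "".join(rot[-1] for rot in rotations)
    original_row = rotations.index(s)                    # position of the original string
    return last_column, original_row

print(bwt("ACAACG"))   # ('CGAAAC', 1), i.e. row 2 when counting from 1
```
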
Convolutional Autoencoder Architecture

Keywords: Deep learning architecture, Autoencoder visualization, Convolutional layers, Feature compression, Dimensionality reduction, Neural network topology

A detailed visualization of a convolutional autoencoder (CAE) architecture, illustrating the transformation of input data X through the encoder (green overlay) and decoder (red overlay) pathways. The diagram shows the progressive feature extraction and compression through convolutional and pooling layers, followed by reconstruction through deconvolutional layers, with ReLU and logistic activations. The architecture demonstrates how the input is compressed into a latent representation z before being reconstructed into X’.

Convolutional Autoencoder Architecture
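
A compact PyTorch sketch of the same encode/decode pathway; the channel counts, input size, and layer depths are illustrative rather than taken from the figure.

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Sketch of the X -> z -> X' pathway described above."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                  # encoder pathway (green overlay)
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                           # 28x28 -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                           # 14x14 -> 7x7 latent z
        )
        self.decoder = nn.Sequential(                  # decoder pathway (red overlay)
            nn.ConvTranspose2d(16, 8, kernel_size=2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(8, 1, kernel_size=2, stride=2),
            nn.Sigmoid(),                              # logistic output activation
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z)

x = torch.rand(1, 1, 28, 28)
print(ConvAutoencoder()(x).shape)   # torch.Size([1, 1, 28, 28])
```
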
Convolutional Cross-Connection Network Architecture

Keywords: Neural network topology, Cross-modal architecture, Feature fusion, Convolutional networks, Multi-stream processing, Deep learning visualization

A detailed visualization of a convolutional cross-connection network architecture, featuring parallel processing streams with cross-modal connections. The diagram illustrates two parallel pathways with convolution and max-pooling operations, interconnected through cross-stream convolutions, leading to merged feature representations. The cascaded shadow effect emphasizes the depth and hierarchical nature of the feature processing, while dashed lines indicate feature map dimensions throughout the network.

Convolutional Cross-Connection Architecture
Cartesian-Polar Coordinate System Transformation

Keywords: Mathematical visualization, Coordinate transformation, Polar coordinates, Cartesian coordinates, Geometric representation, Angular measurement

A fundamental mathematical visualization demonstrating the relationship between Cartesian (x,y) and polar (r,α) coordinate systems. The diagram shows a point P with its Cartesian components (red and blue lines representing x and y coordinates) and its polar representation (green radius vector r and angle α), all within a unit circle. This classical representation illustrates how the same point can be uniquely described in both coordinate systems.

Coordinate Systems Transformation
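
The underlying conversions are x = r·cos α, y = r·sin α and r = √(x² + y²), α = atan2(y, x); a small Python check with an illustrative point on the unit circle:

```python
import math

def to_polar(x, y):
    """Cartesian (x, y) -> polar (r, alpha)."""
    return math.hypot(x, y), math.atan2(y, x)

def to_cartesian(r, alpha):
    """Polar (r, alpha) -> Cartesian (x, y)."""
    return r * math.cos(alpha), r * math.sin(alpha)

r, alpha = to_polar(0.6, 0.8)
print(r, math.degrees(alpha))    # 1.0, ~53.13 degrees (point on the unit circle)
print(to_cartesian(r, alpha))    # (0.6, 0.8) recovered
```
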
CRT Display Scanning Pattern

Keywords: Display technology, Raster scanning, Horizontal blanking, Vertical blanking, Electron beam path, CRT visualization

A technical visualization of the Cathode Ray Tube (CRT) display scanning pattern, showing the electron beam’s path across 144 scan lines. The diagram illustrates both horizontal blanking (HBlank, blue) periods between lines and the vertical blanking (VBlank, red) period at the frame’s end. The systematic left-to-right scanning and retrace pattern demonstrates the fundamental mechanism behind CRT display technology.

CRT Scanning Pattern
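
A structural sketch of the raster loop described above. The 144 visible lines come from the description; the horizontal resolution and the length of the vertical-blanking interval are illustrative placeholders.

```python
VISIBLE_LINES = 144        # scan lines drawn per frame (from the diagram)
PIXELS_PER_LINE = 160      # illustrative horizontal resolution
VBLANK_LINES = 10          # illustrative vertical-blanking interval

def scan_frame(draw_pixel, hblank, vblank):
    """One raster frame: draw left to right, HBlank after each line, VBlank at the end."""
    for line in range(VISIBLE_LINES):
        for x in range(PIXELS_PER_LINE):
            draw_pixel(x, line)        # electron beam sweeps left to right
        hblank(line)                   # beam retraces to the start of the next line
    for line in range(VBLANK_LINES):
        vblank(line)                   # beam retraces to the top of the screen

scan_frame(lambda x, y: None,
           lambda line: None,
           lambda line: None)
```
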
CycleGAN Architecture Flow

Keywords: Generative adversarial network, Cycle-consistency, Domain translation, Bidirectional mapping, GAN architecture, Neural network topology

A technical visualization of the Cycle-Consistent Generative Adversarial Network (CycleGAN) architecture, showing the bidirectional translation between domains A and B. The diagram illustrates the cycle-consistency path (a_real → b_fake → a_rec), the generators G_AB and G_BA, and the discriminator D_B evaluating the authenticity of domain B samples. The dashed lines indicate the cycle-consistency constraint and the comparison pathway for real versus generated samples.

CycleGAN Architecture
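
A data-flow sketch of the cycle-consistency path, with trivial placeholder functions standing in for the generators and discriminator (real CycleGAN networks are convolutional; only the wiring from the diagram is shown here).

```python
import numpy as np

# Placeholder "networks": simple functions stand in for the CNN generators
# and discriminator so the a_real -> b_fake -> a_rec flow is visible.
def G_AB(a): return a + 1.0          # translate domain A -> B
def G_BA(b): return b - 1.0          # translate domain B -> A
def D_B(b):  return 1.0 / (1.0 + np.exp(-b.mean()))   # realness score for domain B

a_real = np.random.randn(4)
b_fake = G_AB(a_real)                # a_real -> b_fake
a_rec  = G_BA(b_fake)                # b_fake -> a_rec (cycle back to domain A)

adv_score  = D_B(b_fake)                        # discriminator evaluates the fake B sample
cycle_loss = np.abs(a_rec - a_real).mean()      # cycle-consistency: a_rec should match a_real
print(adv_score, cycle_loss)
```
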
de Bruijn Graph for DNA Sequence Assembly

Keywords: Bioinformatics visualization, Graph theory, DNA sequencing, Eulerian path, Sequence reconstruction, Genomic assembly

A visualization of a de Bruijn graph used in DNA sequence assembly, showing overlapping k-mers as vertices (AT, TG, GG, etc.) connected by directed edges labeled with (k+1)-mers. The red dashed line highlights the Eulerian cycle that reconstructs the original DNA sequence, demonstrating how the graph structure can be used to assemble longer sequences from shorter overlapping fragments.

de Bruijn Graph
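
A minimal Python construction of such a graph, using the same convention as the figure: k-mers as vertices and (k+1)-mers as edges. The example sequence and k = 2 are illustrative; walking the edges in an Eulerian path spells the sequence back out.

```python
from collections import defaultdict

def de_bruijn(sequence, k):
    """Build a de Bruijn graph: k-mers as vertices, (k+1)-mers as edges."""
    graph = defaultdict(list)
    for i in range(len(sequence) - k):
        edge = sequence[i:i + k + 1]           # a (k+1)-mer read from the sequence
        graph[edge[:-1]].append(edge[1:])      # prefix k-mer -> suffix k-mer
    return graph

for src, dsts in de_bruijn("ATGGCGTGCA", 2).items():
    print(src, "->", dsts)
```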
