# BragitOff.com

## READ-LEARN-BRAG!

# Category: Compu geek

## Handwritten digit recognition using CrysX-NN neural network trained on MNIST_Plus

## Handwritten digit recognition using CrysX – Neural Network library and Streamlit Web App

## Digit recognition App (Streamlit) trained on the MNIST_Plus dataset using PyTorch CNN model

## Streamlit App for Digit recognition trained on the MNIST dataset using PyTorch

## Mean Squared Error loss function and its gradient (derivative) for a batch of inputs [Python Code]

## Efficient implementation of Tanh activation function and its Derivative (gradient) in Python

## Efficient implementation of Softplus activation function and its Derivative (gradient) in Python

## Efficient implementation of Softmax activation function and its Derivative (jacobian) in Python

## Efficient implementation of Sigmoid activation function and its Derivative (gradient) in Python

## Efficient implementation of ReLU activation function and its Derivative (gradient) in Python

Find all the latest tips and news regarding computers, gadgets, hacking and web publishing. Also regularly updated jokes full of geeky humor.

In the last post, I showcased an app based on a neural network trained on the MNIST dataset using the CrysX-NN library. The…

Recently, I wrote a blog post on creating a Streamlit app for recognizing handwritten digits using a convolutional neural network model with the…

In my previous blog post, I shared with you a Streamlit app that recognizes handwritten digits entered by the user through mouse/touch input…

Recently, I learned how to program deep and convolutional neural networks using various frameworks like PyTorch and TensorFlow-Keras. Here is an application of…

In this post, I show you how to implement the Mean Squared Error (MSE) Loss/Cost function as well as its derivative for Neural…
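The post's own CrysX-NN code is not reproduced in this excerpt; as a minimal sketch, a batched MSE loss and its gradient can be vectorized with NumPy along these lines (the function names here are illustrative, not the library's API):

```python
import numpy as np

def mse_loss(predictions, targets):
    # Mean squared error averaged over every element in the batch
    return np.mean((predictions - targets) ** 2)

def mse_loss_grad(predictions, targets):
    # Gradient of the batch-averaged MSE with respect to the predictions:
    # d/dp mean((p - t)^2) = 2 * (p - t) / N, where N is the element count
    return 2.0 * (predictions - targets) / predictions.size
```

Averaging over the full batch (rather than summing) keeps the gradient scale independent of batch size.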

The mathematical definition of the Tanh activation function is $\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$ and its derivative is defined as $\tanh'(x) = 1 - \tanh^2(x)$. The Tanh function and its derivative for a…
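As a sketch of what an efficient vectorized Tanh and its derivative look like (illustrative names, not necessarily the post's exact code):

```python
import numpy as np

def tanh_act(x):
    # NumPy's built-in tanh is vectorized and numerically robust
    return np.tanh(x)

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2; compute tanh once and reuse it
    t = np.tanh(x)
    return 1.0 - t * t
```

Computing `tanh(x)` once and squaring it avoids a second transcendental evaluation in the gradient.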

The mathematical definition of the Softplus activation function is $\mathrm{softplus}(x) = \ln(1 + e^{x})$ with the derivative defined as $\frac{1}{1 + e^{-x}}$, which is actually the Sigmoid function. We have…
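A minimal NumPy sketch of a numerically stable Softplus and its derivative (illustrative names; a naive `np.log(1 + np.exp(x))` overflows for large `x`):

```python
import numpy as np

def softplus(x):
    # log(1 + exp(x)) computed stably via logaddexp(log(1), x)
    return np.logaddexp(0.0, x)

def softplus_grad(x):
    # The derivative of softplus is exactly the sigmoid function
    return 1.0 / (1.0 + np.exp(-x))
```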

The mathematical definition of the Softmax activation function is $\sigma(\mathbf{z})_i = \frac{e^{z_i}}{\sum_{j} e^{z_j}}$ with the derivative (Jacobian) defined as $\frac{\partial \sigma_i}{\partial z_j} = \sigma_i \left(\delta_{ij} - \sigma_j\right)$. The Softmax function and its derivative for a batch…
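As an illustrative sketch (not the post's exact code), a batch-ready Softmax and the Jacobian for a single input vector can be written as:

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max before exponentiating for numerical stability;
    # works on a single vector or on a batch (last axis = classes)
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def softmax_jacobian(z):
    # For one length-n input vector, returns the n x n Jacobian
    # J[i, j] = s_i * (delta_ij - s_j)
    s = softmax(z)
    return np.diag(s) - np.outer(s, s)
```

A useful sanity check: every row (and column) of the Jacobian sums to zero, because softmax outputs always sum to one.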

The mathematical definition of the Sigmoid activation function is $\sigma(x) = \frac{1}{1 + e^{-x}}$ and its derivative is $\sigma'(x) = \sigma(x)\left(1 - \sigma(x)\right)$. The Sigmoid function and its derivative for a batch of…
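A minimal vectorized sketch of the Sigmoid and its derivative (illustrative names, not necessarily the post's code):

```python
import numpy as np

def sigmoid(x):
    # Elementwise logistic function; vectorized over arrays of any shape
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x)); compute sigma once and reuse it
    s = sigmoid(x)
    return s * (1.0 - s)
```

Reusing the already-computed `sigmoid(x)` in the gradient is the standard trick that makes the backward pass cheap.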

The mathematical definition of the ReLU activation function is $\mathrm{ReLU}(x) = \max(0, x)$ and its derivative is defined as $1$ for $x > 0$ and $0$ otherwise. The ReLU function and its derivative for a…
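An illustrative vectorized sketch of ReLU and its (sub)gradient (names are mine, not the post's):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x), vectorized over the whole array
    return np.maximum(0.0, x)

def relu_grad(x):
    # Subgradient: 1 where x > 0, else 0 (we take 0 at x == 0)
    return (x > 0).astype(x.dtype)
```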
