# Getting warm with Machine Learning

My current role as a Design Technologist is centred around building prototypes to describe ideas, stimulate conversations, and even provoke wilder imaginations. Designing for the future can sometimes mean building things that are not solving an immediate problem or pain point.

Anyhow, I’ve been diving into ML for quite some time, and am taking notes along the way as I grow from “copy, paste & pray” to understanding what’s really going on. Instead of taking an online course, I learn better while trying to build something first and catching up on the theory later. So, this page is merely a place for me to braindump stuff I’ve learned.

## Basics

• A tensor is simply a container that can hold data in `N` dimensions
• Tensor rank is the number of indices required to uniquely select each element of the tensor:
```
# rank = 0, otherwise known as a 0-D tensor or scalar
"string_1"
# rank = 1, a 1-D tensor, i.e. a vector
["string_1", "string_2"]
# rank = 2, a 2-D tensor, i.e. a matrix
[ ["string_1", "string_2"] ]
```
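In practice, NumPy's `ndim` reports exactly this rank — a quick sketch to sanity-check the examples above:

```python
import numpy as np

# rank 0: a scalar needs no indices to select its single element
scalar = np.array(5.0)
# rank 1: one index selects an element of a vector
vector = np.array([1.0, 2.0])
# rank 2: two indices (row, column) select an element of a matrix
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])

print(scalar.ndim, vector.ndim, matrix.ndim)  # → 0 1 2
```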

## Convolutional Neural Networks

• A kernel is a small matrix of weights. It’s what slides over 2D input data in a 2D convolution operation, performing an elementwise multiplication then summing up the results into a single output pixel:
• e.g. for a 5x5 greyscale image, a 3x3 kernel (stride 1, no padding) produces a 3x3 output, essentially mapping one 2D matrix of features onto another, smaller one
• And a filter is a collection of kernels, with there being one unique kernel for each input channel
• Each filter is independently convolved with the image, each producing a smaller output called a feature map (or activation map)
• For a particular feature map, each neuron is only connected to a small chunk of the input image (local connectivity), and all neurons have the same connection weights (parameter sharing), i.e. not fully connected
• Pooling layers are meant to reduce the representation’s spatial size. This reduces the number of parameters and hence computation in the network

## Image classification problems

• Binary: two exclusive classes -> use binary cross-entropy
• Multi-class: more than two exclusive classes -> use categorical cross-entropy
• Multi-label: non-exclusive classes -> use binary cross-entropy
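A minimal NumPy sketch of the two losses behind those three cases (the helper names are my own; frameworks like Keras ship equivalents):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred):
    # Per-label BCE: works for binary and multi-label setups (sigmoid outputs),
    # since each label is scored independently.
    y_pred = np.clip(y_pred, 1e-7, 1 - 1e-7)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_true, y_pred):
    # CCE over one-hot targets (softmax outputs): only the true class's
    # predicted probability contributes to the loss.
    y_pred = np.clip(y_pred, 1e-7, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))

# binary: one label, model is 90% confident in the true class
bce = binary_cross_entropy(np.array([1.0]), np.array([0.9]))
# multi-class: one-hot target over three exclusive classes
cce = categorical_cross_entropy(np.array([[0.0, 1.0, 0.0]]),
                                np.array([[0.1, 0.8, 0.1]]))
```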