computer_science:machine_learning:machine_learning_terms

  * **Forward pass:**  The computation of output values from input.
  * **Backward pass (backpropagation):**  The calculation of internal variable adjustments according to the optimizer algorithm, starting from the output layer and working back through each layer to the input.
  * **Flattening:**  The process of converting a 2D image into a 1D vector.
  * **ReLU:**  Rectified Linear Unit; an activation function that outputs the input when it is positive and 0 otherwise, allowing a model to solve nonlinear problems (see the sketch after this list).
  * **Softmax:**  A function that provides probabilities for each possible output class (see the sketch after this list).
  * **Training set:**  The data used for training the neural network.
  * **Test set:**  The data used for testing the final performance of our neural network.
  * **Regression:**  A model that outputs a single value. For example, an estimate of a house’s value.
  * **Classification:**  A model used to distinguish among two or more output categories by outputting a probability distribution across them. For example, in Fashion MNIST the output was 10 probabilities, one for each of the different types of clothing. Remember, we use Softmax as the activation function in our last Dense layer to create this probability distribution.
  * **CNNs:**  Convolutional neural networks; networks that have at least one convolutional layer. A typical CNN also includes other types of layers, such as pooling layers and dense layers (see the architecture sketch after this list).
  * **Convolution:**  The process of applying a kernel (filter) to an image (see the convolution sketch after this list).
  * **Kernel / filter:**  A matrix, smaller than the input, that is slid across the input and applied to one small chunk of it at a time.
  * **Padding:**  Adding pixels of some value, usually 0, around the input image.
  * **Pooling:**  The process of reducing the size of an image through downsampling. There are several types of pooling layers; for example, average pooling converts many values into a single value by taking the average, but maxpooling is the most common.
  * **Maxpooling:**  A pooling process in which many values are converted into a single value by taking the maximum value among them.
  * **Stride:**  The number of pixels by which the kernel (filter) slides across the image at each step.
  * **Downsampling:**  The act of reducing the size of an image.
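
The ReLU and Softmax entries above are small enough to write out directly. The following is a minimal NumPy sketch (not part of the original page; the helper names ''relu'' and ''softmax'' are only illustrative):

<code python>
import numpy as np

def relu(x):
    # ReLU: keep positive values, clamp negative values to 0
    return np.maximum(0, x)

def softmax(x):
    # Subtract the maximum first for numerical stability
    e = np.exp(x - np.max(x))
    # Normalize so the outputs sum to 1, i.e. a probability distribution
    return e / e.sum()

print(relu(np.array([-3.0, 0.0, 2.5])))    # [0.  0.  2.5]
print(softmax(np.array([2.0, 1.0, 0.5])))  # three probabilities summing to 1
</code>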
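
Convolution, kernel, padding, stride, maxpooling and downsampling can all be seen together in the rough NumPy sketch below (the helpers ''convolve2d'' and ''maxpool2d'' are illustrative, not an optimized or framework implementation):

<code python>
import numpy as np

def convolve2d(image, kernel, stride=1, padding=0):
    # Padding: surround the image with zeros so the kernel can cover border pixels
    if padding:
        image = np.pad(image, padding, mode="constant", constant_values=0)
    kh, kw = kernel.shape
    ih, iw = image.shape
    # The output shrinks according to kernel size and stride
    oh = (ih - kh) // stride + 1
    ow = (iw - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Convolution: multiply the kernel element-wise with one chunk of the
            # image and sum the result into a single output value
            chunk = image[i * stride:i * stride + kh, j * stride:j * stride + kw]
            out[i, j] = np.sum(chunk * kernel)
    return out

def maxpool2d(image, size=2, stride=2):
    # Maxpooling: downsample by keeping only the maximum of each window
    ih, iw = image.shape
    oh = (ih - size) // stride + 1
    ow = (iw - size) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = image[i * stride:i * stride + size, j * stride:j * stride + size].max()
    return out

image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])
feature_map = convolve2d(image, kernel, stride=1, padding=1)  # 6x6 input -> 7x7 output
print(maxpool2d(feature_map))                                 # downsampled to 3x3
</code>

As in most deep learning libraries, the kernel is applied without being flipped, so strictly speaking this is cross-correlation rather than a mathematical convolution.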
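
As a sketch of how these terms fit together in the kind of Fashion MNIST classifier described above, here is an illustrative Keras-style model (assuming TensorFlow is installed; the layer counts and sizes are my own choices, not taken from the original page):

<code python>
import tensorflow as tf

# A typical CNN: convolutional and maxpooling layers, then flattening
# and dense layers ending in a softmax classifier over 10 clothing classes.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),                                       # 28x28 grayscale image
    tf.keras.layers.Conv2D(32, (3, 3), padding="same", activation="relu"),   # 3x3 kernels, zero padding
    tf.keras.layers.MaxPooling2D((2, 2), strides=2),                         # downsample 28x28 -> 14x14
    tf.keras.layers.Conv2D(64, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2), strides=2),                         # 14x14 -> 7x7
    tf.keras.layers.Flatten(),                                               # flatten 7x7x64 -> vector of 3136
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),                         # one probability per class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
</code>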
  
  