15.14 Layers: the building blocks of deep learning
- Deep-learning model: directed acyclic graph of layers
- Most common is a linear stack of layers, mapping a single input to a single output
- Layer: a data-processing module that takes one or more tensors (e.g., vectors) as input and outputs one or more tensors
- Different layer types for different tensor formats/data-processing types (e.g., densely connected layers, recurrent layers, etc.)
- Layer’s state: the layer’s weights, which together contain the network’s knowledge
- Layers = LEGO bricks of deep learning (this metaphor is made explicit by frameworks like Keras)
- Building deep-learning models in Keras: Clip together compatible layers to form useful data-transformation pipelines
- Layer compatibility = each layer accepts only input tensors of a certain shape and returns output tensors of a certain shape
- Keras takes care of layer compatibility: layers are built dynamically to match the output shape of the preceding layer
# Example code: DL model
# Network architecture
model <- keras_model_sequential() %>%             # Linear stack of layers
  layer_dense(units = 64,                         # Output space dimensionality
              activation = "relu",                # Activation function (non-linear)
              input_shape = dim(train_data)[[2]]) %>%  # Input dimensionality
  layer_dense(units = 64,
              activation = "relu") %>%
  layer_dense(units = 1)                          # Single-unit output, no activation
# "relu": rectified linear unit activation function, max(0, x)
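As a sketch of the automatic shape matching described above (assuming the keras R package is installed with a TensorFlow backend; `train_data` is a hypothetical stand-in matrix, not from the original notes): only the first layer needs an explicit `input_shape`; the later layers infer their input shapes from the preceding layer's output.

```r
library(keras)

# Hypothetical training data: 100 samples with 13 features each
train_data <- matrix(rnorm(100 * 13), nrow = 100, ncol = 13)

model <- keras_model_sequential() %>%
  layer_dense(units = 64, activation = "relu",
              input_shape = dim(train_data)[[2]]) %>%  # only layer given a shape
  layer_dense(units = 64, activation = "relu") %>%     # input shape inferred: 64
  layer_dense(units = 1)

summary(model)  # per-layer output shapes and parameter counts
```

Printing the summary makes the compatibility bookkeeping visible: each layer's output shape is exactly the input shape the next layer was dynamically built against.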