Hello Everyone,
Welcome to the 35th edition of my newsletter ML & AI Cupcakes!
The goal of today’s newsletter is to give you a quick knowledge check on your understanding of neural network basics.
Good luck!
1. What is the main objective of using activation functions in neurons?
A) Missing values imputation
B) Feature selection
C) Feature scaling
D) Introducing non-linearity
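To make this concrete, here is a tiny NumPy sketch of a single neuron (the weights and inputs are toy values I made up): the weighted sum is purely linear, and it is the activation wrapped around it that adds anything non-linear.

```python
import numpy as np

x = np.array([0.5, -1.2, 2.0])   # toy input features
w = np.array([0.8, 0.3, -0.5])   # toy weights
b = 0.1                          # bias

z = np.dot(w, x) + b                     # linear pre-activation
relu_out = np.maximum(0.0, z)            # ReLU: non-linear
sigmoid_out = 1.0 / (1.0 + np.exp(-z))   # sigmoid: non-linear

print(z, relu_out, sigmoid_out)
```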
2. What is the main role of bias in a neuron?
A) Helps shifting activation functions
B) Replaces activation functions
C) Scales the value of learning rate with each iteration
D) Introduces sparsity by randomly deactivating neurons
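One way to see the “shifting” effect with a toy sigmoid neuron (a minimal sketch, the values are mine): changing the bias moves the point where the output crosses 0.5 along the input axis, without changing the shape of the curve.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = 1.0
for b in (-2.0, 0.0, 2.0):       # three different biases
    crossing = -b / w            # input where w*x + b = 0
    print(f"bias={b:+.1f} -> sigmoid(w*x + b) crosses 0.5 at x = {crossing:+.1f}",
          f"(check: {sigmoid(w * crossing + b):.2f})")
```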
3. Which property mainly allows neural networks to learn complex patterns from the data?
A) Initializing all weights to zero
B) Adding bias to each neuron
C) Adding non-linear activation functions
D) Using multiple layers with linear activation functions
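A quick way to probe this one (a minimal NumPy sketch with random toy matrices): compose two layers with no activation in between and check whether the result is still a single linear map.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # first "layer" weights
W2 = rng.normal(size=(2, 4))   # second "layer" weights
x = rng.normal(size=3)         # toy input

two_linear_layers = W2 @ (W1 @ x)      # no activation in between
single_linear_layer = (W2 @ W1) @ x    # one equivalent linear map

print(np.allclose(two_linear_layers, single_linear_layer))  # True
```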
4. What is a possible downside of using the sigmoid activation function in hidden layers?
A) Zero gradients
B) Vanishing gradients
C) Exploding gradients
D) Faster convergence to the optimal solution
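For context, the derivative of the sigmoid is σ'(z) = σ(z)(1 − σ(z)), which never exceeds 0.25 and shrinks towards zero for large |z|; the toy sketch below just prints a few of those values.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for z in (0.0, 2.0, 5.0, 10.0):
    grad = sigmoid(z) * (1.0 - sigmoid(z))   # derivative of the sigmoid
    print(f"z={z:5.1f}  sigmoid'(z)={grad:.6f}")
# the gradient is at most 0.25 and nearly vanishes for large |z|,
# which is what makes deep stacks of sigmoids hard to train
```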
5. Activation functions can be used only in hidden layers.
A) True
B) False
6. Biases are not updated during backpropagation.
A) True
B) False
7. Bias is also used in the output layer.
A) True
B) False
8. What is the main role of backpropagation?
A) To initialize weights and biases
B) To update weights and biases
C) To select activation functions
D) To choose optimal value of learning rate
9. Which rule from calculus is used by backpropagation?
A) Chain rule
B) Product rule
C) Quotient rule
D) L’Hôpital’s rule
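Here is a tiny hand-rolled example of how gradients flow through a composed function (scalar toy values I picked): with a loss L = (y − t)² and y = sigmoid(w·x + b), the gradient of L with respect to w is the product of the local derivatives along the path from L back to w.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, t = 1.5, 1.0          # toy input and target
w, b = 0.4, -0.2         # toy parameters

# forward pass
z = w * x + b
y = sigmoid(z)
L = (y - t) ** 2

# backward pass: dL/dw = dL/dy * dy/dz * dz/dw
dL_dy = 2.0 * (y - t)
dy_dz = y * (1.0 - y)
dz_dw = x
dL_dw = dL_dy * dy_dz * dz_dw
print(L, dL_dw)
```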
10. Which layer is updated first during backpropagation?
A) Input layer
B) First hidden layer
C) Last hidden layer
D) Output layer
11. Which activation function commonly faces the “dying neurons” problem?
A) Sigmoid
B) Tanh
C) ReLU
D) Leaky ReLU
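A minimal sketch of the “dying neuron” effect (toy numbers of my own): when a ReLU unit’s pre-activation is negative, both its output and its gradient are exactly zero, so no gradient flows back and the unit stops learning; Leaky ReLU keeps a small slope instead.

```python
import numpy as np

z = np.array([-3.0, -0.5, 0.0, 2.0])        # toy pre-activations

relu_out = np.maximum(0.0, z)
relu_grad = (z > 0).astype(float)           # zero gradient wherever z <= 0

alpha = 0.01                                # small slope used by Leaky ReLU
leaky_grad = np.where(z > 0, 1.0, alpha)    # never exactly zero

print(relu_out, relu_grad, leaky_grad)
```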
12. Which of the following holds true for the number of iterations in one epoch?
A) Iterations = total samples × batch size
B) Iterations = total samples / batch size
C) Iterations = epochs × batch size
D) Iterations = epochs / batch size
13. If the total number of samples is 1000 and the batch size is 50, the number of iterations in one epoch will be __.
A) 10
B) 20
C) 25
D) 50
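The relationship between samples, batch size, and iterations in one epoch is a single division; the snippet below just plugs in the numbers from this question.

```python
total_samples = 1000
batch_size = 50

iterations_per_epoch = total_samples // batch_size  # one weight update per batch
print(iterations_per_epoch)  # 20
```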
14. ________ is not a data preprocessing step in neural networks.
A) Backpropagation
B) Missing values imputation
C) Removing outliers
D) Removing duplicates
15. Which core element of a neural network design measures the difference between predicted and actual values?
A) Activation function
B) Optimization algorithm
C) Loss function
D) None of the above
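As one concrete example of such a component, here is mean squared error on a handful of toy predictions and targets (all values made up):

```python
import numpy as np

y_pred = np.array([2.5, 0.0, 2.1, 7.8])   # toy predictions
y_true = np.array([3.0, -0.5, 2.0, 8.0])  # toy actual values

mse = np.mean((y_pred - y_true) ** 2)     # average squared difference
print(mse)
```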
16. Which layer provides final predictions in a neural network?
A) Input layer
B) First hidden layer
C) Last hidden layer
D) Output layer
17. Which activation function can’t predict values greater than 1?
A) ReLU
B) Leaky ReLU
C) Linear
D) Tanh
18. The ReLU activation function can give only non-negative outputs.
A) True
B) False
19. Which activation function can give negative output values?
A) Sigmoid
B) Tanh
C) ReLU
D) None of the above
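The last three questions all come down to the output ranges of the common activations; this toy sketch prints the minimum and maximum outputs of each over a wide range of inputs.

```python
import numpy as np

z = np.linspace(-10, 10, 1001)             # a wide range of pre-activations

activations = {
    "sigmoid":    1.0 / (1.0 + np.exp(-z)),        # (0, 1)
    "tanh":       np.tanh(z),                      # (-1, 1)
    "relu":       np.maximum(0.0, z),              # [0, inf)
    "leaky_relu": np.where(z > 0, z, 0.01 * z),    # (-inf, inf)
    "linear":     z,                               # (-inf, inf)
}

for name, out in activations.items():
    print(f"{name:10s}  min={out.min():7.3f}  max={out.max():7.3f}")
```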
20. What is minimized during backpropagation?
A) Loss function
B) Number of neurons
C) Number of layers
D) Number of activation functions
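Tying this back to the backpropagation questions above: the gradients computed via the chain rule are used to nudge each parameter in the direction that reduces the loss. Below is a minimal gradient-descent sketch on a single weight (toy data, learning rate picked arbitrarily).

```python
# fit y = w * x to a toy sample by minimising squared error
x, y_true = 2.0, 6.0          # toy sample (the true w would be 3.0)
w = 0.0                       # initial weight
lr = 0.05                     # learning rate (arbitrary choice)

for step in range(50):
    y_pred = w * x
    loss = (y_pred - y_true) ** 2
    grad = 2.0 * (y_pred - y_true) * x   # dLoss/dw via the chain rule
    w -= lr * grad                       # update moves the loss downhill

print(round(w, 3), round(loss, 6))       # w approaches 3.0, loss approaches 0
```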
Answers
1. D   2. A   3. C   4. B   5. B   6. B   7. A   8. B   9. A   10. D
11. C   12. B   13. B   14. A   15. C   16. D   17. D   18. A   19. B   20. A
Helpful links
Core elements of a neural network design
Writing each newsletter takes a lot of research, time, and effort. I just want to make sure it reaches as many people as possible and helps them grow in their AI/ML journey.
It would be great if you could share this newsletter with your network.
Also, please let me know your feedback and suggestions in the comments section. That will help me keep going. Even a “like” on my posts tells me they are helpful to you.
See you soon!
-Kavita