
SAS Exam A00-405 Topic 9 Question 53 Discussion

Actual exam question for SAS's A00-405 exam
Question #: 53
Topic #: 9

Which option is the correct activation function for the output layer in a CNN model trained to classify an image as belonging to one of the n classes (C1, C2, C3, ..., Cn)?

Suggested Answer: A
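For reference, here is a minimal sketch of what a softmax output layer does (plain NumPy rather than the SAS tooling the exam covers; the logit values are made up): it converts the final layer's raw scores into a probability distribution over the n classes C1, C2, ..., Cn, which is why it is the standard choice for multi-class image classification.

import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability, exponentiate, then normalise
    # so the outputs are non-negative and sum to 1 (one probability per class).
    shifted = np.exp(logits - np.max(logits))
    return shifted / shifted.sum()

# Hypothetical raw scores from the last dense layer of a CNN with n = 4 classes
logits = np.array([2.0, 1.0, 0.1, -1.2])
probs = softmax(logits)
print(probs)                                          # approx. [0.64 0.24 0.10 0.03]
print(probs.sum())                                    # 1.0
print("Predicted class: C%d" % (probs.argmax() + 1))  # C1, the largest probability

By contrast, a Sigmoid squashes a single output into (0, 1) and suits a binary target, while ReLU belongs in the hidden layers rather than at the output.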

Contribute your Thoughts:

Shawnta
6 months ago
Yes, ReLU is commonly used for hidden layers to introduce non-linearity.
upvoted 0 times
Olga
6 months ago
What about ReLU? Isn't it commonly used for hidden layers in CNNs?
upvoted 0 times
Dong
6 months ago
I agree, Softmax is used for multi-class classification tasks like in CNN models.
upvoted 0 times
Shawnta
6 months ago
I think the correct activation function is Softmax.
upvoted 0 times
William
7 months ago
I prefer Sigmoid as the activation function for the output layer; it helps in binary classification tasks.
upvoted 0 times
Elden
7 months ago
I believe Softmax is the correct activation function because it gives probabilities for each class.
upvoted 0 times
Glory
7 months ago
I would go with ReLU as the correct activation function for the output layer.
upvoted 0 times
Mable
7 months ago
I think the correct activation function for the output layer in a CNN model is Softmax.
upvoted 0 times
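Summing up the thread, the commenters' points can be seen together in one small model definition. This is only a sketch assuming a TensorFlow/Keras-style API (not the SAS deep learning tooling the exam actually covers), with a hypothetical input shape and class count: ReLU activations in the hidden convolution and dense layers, Softmax on the n-unit output layer, and Sigmoid reserved for a single binary output.

from tensorflow.keras import layers, models

n_classes = 5  # hypothetical number of classes C1..Cn

model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),                 # hypothetical 32x32 RGB images
    layers.Conv2D(16, 3, activation="relu"),         # hidden conv layer: ReLU
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),         # hidden conv layer: ReLU
    layers.Flatten(),
    layers.Dense(64, activation="relu"),             # hidden dense layer: ReLU
    layers.Dense(n_classes, activation="softmax"),   # output layer: Softmax over C1..Cn
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",       # pairs with one-hot labels and softmax outputs
              metrics=["accuracy"])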
