Welcome to Pass4Success


Cisco Exam 350-901 Topic 5 Question 68 Discussion

Actual exam question for Cisco's 350-901 exam
Question #: 68
Topic #: 5
[All 350-901 Questions]

A developer is designing a modern, distributed microservice enterprise application. The application will integrate with other systems and must support a large deployment, so control of API calls is necessary. What is the best practice to reduce application response latency and protect the application from excessive use?

Suggested Answer: B
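The suggested answer (rate limiting on both the client and server sides) can be illustrated with a minimal server-side token-bucket limiter. This is a hedged sketch, not a Cisco or vendor API: the `TokenBucket` class and its `rate`/`capacity` parameters are hypothetical names chosen for illustration.

```python
import time


class TokenBucket:
    """Illustrative token-bucket rate limiter (hypothetical, server-side).

    `rate` is tokens replenished per second; `capacity` is the burst size.
    A real deployment would typically enforce this at an API gateway and
    answer rejected calls with HTTP 429 Too Many Requests.
    """

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at the bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller would map this to HTTP 429


# 5 requests/second sustained, bursts of up to 10.
bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(15)]
print(results.count(True))  # the burst of 10 passes; the remaining 5 are rejected
```

Because the bucket refills continuously, well-behaved clients see no added latency under normal load, while a flood of calls is cut off at the configured burst size instead of degrading the whole application.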

Contribute your Thoughts:

Clarinda
4 months ago
I think not enforcing any rate limiting could lead to high usage and impact application performance.
upvoted 0 times
...
Terrilyn
4 months ago
That might be true, but having it on the server side adds an extra layer of protection.
upvoted 0 times
...
Allene
4 months ago
But wouldn't enforcing rate limiting on the client side only be enough to reduce response latency?
upvoted 0 times
...
Clarinda
4 months ago
I agree, it helps to control the flow of API calls and protect the application.
upvoted 0 times
...
Terrilyn
4 months ago
I think implementing rate limiting on both client and server sides is the best practice.
upvoted 0 times
...
Ty
5 months ago
I believe option B is the most secure approach, covering both client and server sides.
upvoted 0 times
...
Rachael
5 months ago
But wouldn't implementing rate limiting on both client and server sides be more effective in controlling API calls?
upvoted 0 times
...
Dean
5 months ago
I agree with Tammara; server-side rate limiting can prevent excessive use and reduce latency.
upvoted 0 times
...
Tammara
5 months ago
I think implementing rate limiting on the server side would be the best option.
upvoted 0 times
...
Teddy
6 months ago
I'm with Beckie on this one. B is the way to go. Gotta protect that application from getting hammered, but also keep those response times snappy. Can't have one without the other, you know?
upvoted 0 times
...
Tequila
6 months ago
Haha, C is just begging for some serious performance issues, don't you think? I mean, who in their right mind would choose to not enforce any rate limiting at all? That's just asking for trouble.
upvoted 0 times
...
Claribel
6 months ago
You know, I was initially thinking D - server-side rate limiting only. But then I realized that could lead to a lot of wasted client-side resources if the requests are getting throttled. B makes a lot more sense to me.
upvoted 0 times
...
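Claribel's point about wasted client-side resources can be sketched too: a client that throttles itself avoids sending calls that the server would reject anyway. This is a minimal, hypothetical example (the `throttled` decorator and `min_interval` parameter are illustrative names, not a real library API).

```python
import time


def throttled(min_interval: float):
    """Illustrative client-side throttle: space calls at least
    `min_interval` seconds apart instead of burning requests on 429s."""
    def wrap(fn):
        last_call = [0.0]

        def inner(*args, **kwargs):
            wait = last_call[0] + min_interval - time.monotonic()
            if wait > 0:
                time.sleep(wait)  # client pauses locally rather than spamming the API
            last_call[0] = time.monotonic()
            return fn(*args, **kwargs)

        return inner
    return wrap


@throttled(0.05)  # at most ~20 calls per second from this client
def call_api(i):
    return i  # stand-in for a real HTTP request


start = time.monotonic()
for i in range(5):
    call_api(i)
elapsed = time.monotonic() - start  # ~0.2s: four enforced 0.05s gaps
```

Pairing this with the server-side limit is the defense-in-depth argument for answer B: the client keeps its own latency predictable, and the server stays protected even from clients that do not throttle.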
Beckie
6 months ago
Hmm, this is a tricky one. I'm definitely leaning towards B - implementing rate limiting on both the client and server sides. That way, we can protect the application from excessive use and also ensure low response latency.
upvoted 0 times
Lemuel
5 months ago
I see, that makes sense.
upvoted 0 times
...
Casey
5 months ago
Because implementing rate limiting on the server side can provide more control over API calls.
upvoted 0 times
...
Lynda
5 months ago
Why do you think option D would be better?
upvoted 0 times
...
Regenia
5 months ago
D
upvoted 0 times
...
Fletcher
5 months ago
Good point, I agree with option B as well.
upvoted 0 times
...
Tamekia
5 months ago
B
upvoted 0 times
...
Geoffrey
5 months ago
A
upvoted 0 times
...
