
Salesforce Exam Data Architect Topic 1 Question 43 Discussion

Actual exam question for Salesforce's Data Architect exam
Question #: 43
Topic #: 1

Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following:

1. Remove outdated information that is not required on a day-to-day basis.

2. Improve Salesforce performance.

Which solution should be used to meet these requirements?

Suggested Answer: A

Identifying a location to store archived data and using scheduled batch jobs to migrate and purge the aged data on a nightly basis meets both requirements: outdated records are removed from day-to-day visibility, and reducing the volume of data in the org improves Salesforce performance. The referenced article walks through a use case that combines Heroku Connect, Postgres, and Salesforce Connect to archive old data, free up space in the org, and retain the option to unarchive the data if needed, while also helping the org meet its data retention policies.
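Option A describes the classic scheduled Batch Apex pattern for archiving. Below is a minimal, hypothetical sketch of what such a nightly job could look like; the Order__c source object, its CloseDate__c field, the Archived_Order__c staging object, the Source_Id__c field, and the two-year retention cutoff are all placeholder assumptions, not part of the question. In the Heroku Connect/Postgres variant, the archive copy would live off-platform instead of in a custom object.

```apex
// Hypothetical sketch of a nightly archive-and-purge batch job.
// Order__c, CloseDate__c, Archived_Order__c, and Source_Id__c are placeholder names.
global class ArchiveAgedOrdersBatch implements Database.Batchable<sObject>, Schedulable {

    // Select records older than the assumed two-year retention window.
    global Database.QueryLocator start(Database.BatchableContext bc) {
        Date cutoff = Date.today().addYears(-2);
        return Database.getQueryLocator(
            'SELECT Id, Name FROM Order__c WHERE CloseDate__c < :cutoff'
        );
    }

    // Migrate each chunk to the archive object, then purge the originals.
    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        List<Archived_Order__c> copies = new List<Archived_Order__c>();
        for (sObject s : scope) {
            copies.add(new Archived_Order__c(
                Source_Id__c = s.Id,                  // keep a pointer back to the source record
                Name         = (String) s.get('Name')
            ));
        }
        insert copies;                    // migrate to the archive location
        delete scope;                     // purge from the live org
        Database.emptyRecycleBin(scope);  // hard-delete to reclaim storage
    }

    global void finish(Database.BatchableContext bc) {
        // Optionally notify an admin or chain a follow-up job here.
    }

    // Schedulable hook so the job can run nightly, e.g.:
    // System.schedule('Nightly archive', '0 0 1 * * ?', new ArchiveAgedOrdersBatch());
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new ArchiveAgedOrdersBatch(), 200);
    }
}
```

The System.schedule call shown in the comment is what makes this a scheduled nightly job, matching the "scheduled batch jobs ... on a nightly basis" wording of option A.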


Contribute your Thoughts:

Noah
3 months ago
I agree with Stephaine, option A seems like the most practical choice.
upvoted 0 times
...
Stephaine
3 months ago
But option A involves scheduled batch jobs for purging aged data.
upvoted 0 times
...
Shawna
3 months ago
I disagree, I believe option C is more efficient.
upvoted 0 times
...
Stephaine
3 months ago
I think option A is the best solution.
upvoted 0 times
...
Howard
3 months ago
I prefer option D, creating a full copy sandbox seems like a secure way to retain archived data.
upvoted 0 times
...
Junita
4 months ago
But option A involves scheduled batch jobs to migrate and purge data, which can be automated and reliable.
upvoted 0 times
...
Deandrea
4 months ago
If they're trying to 'remove outdated information,' I hope they're not just deleting it all willy-nilly. That's a surefire way to end up in the doghouse with the sales team.
upvoted 0 times
Leonie
2 months ago
C) Using a formula field to identify aged data seems like a safer option.
upvoted 0 times
...
Hollis
3 months ago
B) I agree, deleting important data could cause issues with the sales team.
upvoted 0 times
...
Bette
3 months ago
A) Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
upvoted 0 times
...
...
Louisa
4 months ago
Option D with the full copy sandbox? That's like using a bazooka to swat a fly. Talk about overkill!
upvoted 0 times
...
Dorothy
4 months ago
Using a formula field to identify old records and then exporting to SharePoint? That's a creative approach, I'll give them that. But I'm not sure it's the most robust solution.
upvoted 0 times
...
Chu
4 months ago
I don't know, the time-based workflow in Option B sounds pretty interesting too. Automation is key when it comes to archiving, you know?
upvoted 0 times
Herminia
4 months ago
True, automation is definitely key. Option B with the time-based workflow could also be a good solution for archiving.
upvoted 0 times
...
Yaeko
4 months ago
I think Option A is the best choice. Scheduled batch jobs can efficiently migrate and purge aged data.
upvoted 0 times
...
...
Minna
4 months ago
I disagree, I believe option C is more efficient as it uses a formula field to identify aged data.
upvoted 0 times
...
Junita
4 months ago
I think option A is the best choice for archiving Salesforce data.
upvoted 0 times
...
Jacinta
4 months ago
Option A seems like the way to go. Scheduled batch jobs to archive and purge data - that's a clean and efficient solution right there.
upvoted 0 times
Daren
3 months ago
I think Option A is more practical for our needs. It's a straightforward process to implement.
upvoted 0 times
...
Cheryl
3 months ago
C) Use a formula field that shows true when a record reaches a defined age and use that field to run a report and export a report into SharePoint.
upvoted 0 times
...
...
