
Salesforce Exam Data-Architect Topic 1 Question 43 Discussion

Actual exam question for the Salesforce Certified Data Architect exam
Question #: 43
Topic #: 1

Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following:

1. Remove outdated information not required on a day-to-day basis.

2. Improve Salesforce performance.

Which solution should be used to meet these requirements?

Suggested Answer: A

Identifying a location to store archived data and using scheduled batch jobs to migrate and purge the aged data on a nightly basis meets both requirements: outdated records are moved out of the org, and removing them frees up space and improves performance. The referenced article describes a use case that combines Heroku Connect, Postgres, and Salesforce Connect to archive old data, free up space in the org, and still retain the option to unarchive the data if needed. It also explains how this approach improves Salesforce performance and helps meet data retention policies.
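
As a rough illustration of how the nightly batch job in option A could be built, here is a minimal Apex sketch. Everything in it is an assumption for illustration: the Case object, the 730-day cutoff, the class name NightlyArchiveJob, the batch size, and the cron expression; the copy-to-archive step is only stubbed out as a comment.

    global class NightlyArchiveJob implements Database.Batchable<SObject>, Schedulable {

        // Schedulable entry point: the cron trigger kicks off the batch run.
        global void execute(SchedulableContext sc) {
            Database.executeBatch(new NightlyArchiveJob(), 200);
        }

        global Database.QueryLocator start(Database.BatchableContext bc) {
            // Closed Cases untouched for more than ~2 years (cutoff is assumed).
            return Database.getQueryLocator(
                'SELECT Id FROM Case WHERE IsClosed = true ' +
                'AND LastModifiedDate < LAST_N_DAYS:730');
        }

        global void execute(Database.BatchableContext bc, List<SObject> scope) {
            // A production job would first copy this chunk to the archive
            // store (e.g. Postgres via Heroku Connect) and purge only after
            // the copy is confirmed, so no data is lost.
            delete scope;
            Database.emptyRecycleBin(scope);
        }

        global void finish(Database.BatchableContext bc) {}
    }

    // One-time setup from Anonymous Apex: run the job nightly at 1:00 AM.
    // System.schedule('Nightly archive purge', '0 0 1 * * ?', new NightlyArchiveJob());

Purging only after a confirmed copy is what separates archiving from plain deletion, which is exactly the concern raised in the comments below.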


Contribute your Thoughts:

Stephaine
11 days ago
I think option A is the best solution.
upvoted 0 times
...
Howard
13 days ago
I prefer option D; creating a full copy sandbox seems like a secure way to retain archived data.
upvoted 0 times
...
Junita
15 days ago
But option A involves scheduled batch jobs to migrate and purge data, which can be automated and reliable.
upvoted 0 times
...
Deandrea
15 days ago
If they're trying to 'remove outdated information,' I hope they're not just deleting it all willy-nilly. That's a surefire way to end up in the doghouse with the sales team.
upvoted 0 times
Hollis
2 days ago
B) I agree, deleting important data could cause issues with the sales team.
upvoted 0 times
...
Bette
5 days ago
A) Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
upvoted 0 times
...
...
Louisa
26 days ago
Option D with the full copy sandbox? That's like using a bazooka to swat a fly. Talk about overkill!
upvoted 0 times
...
Dorothy
27 days ago
Using a formula field to identify old records and then exporting to SharePoint? That's a creative approach, I'll give them that. But I'm not sure it's the most robust solution (see the formula sketch after this comment).
upvoted 0 times
...
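For anyone curious what the formula field in option C might look like, here is a hypothetical sketch in Salesforce formula syntax (the Checkbox field name Is_Aged__c and the 365-day threshold are assumptions, not part of the question):

    /* Hypothetical Checkbox formula, e.g. Is_Aged__c: evaluates to true
       once the record was last modified more than 365 days ago. */
    NOW() - LastModifiedDate > 365

A report filtered on this field would list the aged records, but the export to SharePoint would still be a manual or externally automated step, which is part of why commenters find option C less robust than option A's scheduled jobs.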
Chu
28 days ago
I don't know, the time-based workflow in Option B sounds pretty interesting too. Automation is key when it comes to archiving, you know?
upvoted 0 times
Herminia
13 days ago
True, automation is definitely key. Option B with the time-based workflow could also be a good solution for archiving.
upvoted 0 times
...
Yaeko
14 days ago
I think Option A is the best choice. Scheduled batch jobs can efficiently migrate and purge aged data.
upvoted 0 times
...
...
Minna
30 days ago
I disagree; I believe option C is more efficient, as it uses a formula field to identify aged data.
upvoted 0 times
...
Junita
1 month ago
I think option A is the best choice for archiving Salesforce data.
upvoted 0 times
...
Jacinta
1 month ago
Option A seems like the way to go. Scheduled batch jobs to archive and purge data - that's a clean and efficient solution right there.
upvoted 0 times
Cheryl
2 days ago
C) Use a formula field that shows true when a record reaches a defined age and use that field to run a report and export a report into SharePoint.
upvoted 0 times
...
Willow
29 days ago
A) Identify a location to store archived data and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
upvoted 0 times
...
...
