A client receives multiple feeds from third parties on the same SFTP location:
* Product prices (sftp: prod/prices)
* Stores information (sftp: prod/stores)
* Product information (sftp: prod/catalog)
* Categories information (sftp: prod/marketing)
* Content (sftp: prod/marketing)
Some of the feeds are placed on the SFTP server multiple times a day, as the information is updated in the source system.
The Architect decides to have only two jobs:
* One that checks and downloads available feeds every hour
* One that imports the files from WebDAV once a day, before the data replication, using the standard steps available in the Job Framework
Which design is correct for the import Job, taking the scope of the steps into consideration?
This design maximizes efficiency and concurrency. Running the steps that import products, stores, prices, and content in parallel lets the job handle multiple data streams simultaneously, reducing total processing time. Executing the categories import and the reindex sequentially afterwards ensures that all new and updated information is properly indexed and available to the site once the more frequently updated data has been imported. This order respects the dependencies between steps and aligns with best practices for handling complex data workflows in B2C Commerce environments.
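For context on the first job in the scenario (the hourly check-and-download), a custom step along the following lines is one common way to implement it. This is a minimal sketch only: the host, credentials, and local IMPEX folder names are hypothetical, and in a real cartridge they would come from job step parameters rather than being hard-coded.

```js
'use strict';

var SFTPClient = require('dw/net/SFTPClient');
var File = require('dw/io/File');
var Status = require('dw/system/Status');

/**
 * Hypothetical custom job step: check the third-party SFTP location and
 * download any available feed files into local IMPEX (WebDAV) folders,
 * where the daily import job can later pick them up.
 */
exports.downloadFeeds = function (parameters) {
    var client = new SFTPClient();

    // Assumed connection details; a real step would read these from parameters.
    if (!client.connect('sftp.example.com', 'feeduser', 'secret')) {
        return new Status(Status.ERROR, 'ERROR', 'Unable to connect to SFTP host');
    }

    // Remote folders named in the scenario, mapped to illustrative local IMPEX folders.
    var feeds = [
        { remote: 'prod/prices',    local: 'src/feeds/prices' },
        { remote: 'prod/stores',    local: 'src/feeds/stores' },
        { remote: 'prod/catalog',   local: 'src/feeds/catalog' },
        { remote: 'prod/marketing', local: 'src/feeds/marketing' }
    ];

    feeds.forEach(function (feed) {
        var localDir = new File(File.IMPEX + File.SEPARATOR + feed.local);
        if (!localDir.exists()) {
            localDir.mkdirs();
        }

        // List the remote folder and download every regular file found there.
        var entries = client.list(feed.remote) || [];
        for (var i = 0; i < entries.length; i++) {
            if (!entries[i].directory) {
                var target = new File(localDir, entries[i].name);
                client.getBinary(feed.remote + '/' + entries[i].name, target);
            }
        }
    });

    client.disconnect();
    return new Status(Status.OK);
};
```

Such a step would be registered in the cartridge's steptypes.json and scheduled hourly, while the daily import job would point its standard steps (catalog, price book, stores, and content imports plus the reindex) at the corresponding IMPEX folders.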