The current solution is to request all items via the public API, using a special call for the eZ Pers. Service, and either to fetch every single object individually or to page the request and fetch the items in batches of hundreds or thousands. This concept is very error-prone and performance is very poor. For the customer Hegnar, with 500k content objects, the export ran for several days and required several more days of fixing, because some of the "pages" failed and had to be reloaded manually.
The YOOCHOOSE Recommender also supports a bulk upload function for customers. This allows a download URL to be sent in an API call; the service will then download the files (CSV, JSON, XML) and import them into the eZ Pers. Service.
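As a rough illustration of that trigger call, the sketch below builds the JSON body that tells the service which files to download. The field names (`format`, `files`) and the overall shape are assumptions for illustration only; the real endpoint and payload contract must come from the YOOCHOOSE API documentation.

```python
import json


def build_bulk_import_notification(download_urls, content_format="json"):
    """Build a JSON body for a (hypothetical) bulk-import trigger call.

    download_urls  -- internet-reachable URLs the service will fetch itself
    content_format -- "csv", "json" or "xml", per the formats listed above

    NOTE: field names are illustrative assumptions, not the real API contract.
    """
    return json.dumps({
        "format": content_format,
        "files": list(download_urls),
    })
```

The caller would POST this body to the bulk-import endpoint of the eZ Pers. Service; the service then pulls the files from the given URLs instead of receiving them over the same HTTP connection.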
Create a local command / flag that does not send the API query results for the eZ Pers. Service via HTTP to the requester (as today), but instead dumps the result into 1 to X files on the local disk. The files should be placed in a folder that is reachable from the internet (with or without basic auth), and the eZ Pers. Service must be informed via an API call about the file name(s) to be downloaded and imported.
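The dump step of that command could look roughly like the following sketch: the query result is split into chunks and written as 1 to X files into the web-reachable folder, and the generated file names are returned so they can be passed to the notification call. Chunk size, naming scheme, and JSON as the output format are all assumptions for illustration.

```python
import json
from pathlib import Path


def dump_items_to_files(items, out_dir, chunk_size=1000, prefix="export"):
    """Write items into 1..X JSON files of at most chunk_size items each.

    out_dir should map to a folder that is reachable from the internet
    (with or without basic auth). Returns the generated file names, which
    are then reported to the eZ Pers. Service via an API call.

    NOTE: chunk size, file naming, and JSON output are illustrative choices.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    names = []
    for i in range(0, len(items), chunk_size):
        name = f"{prefix}-{i // chunk_size}.json"
        (out / name).write_text(json.dumps(items[i : i + chunk_size]))
        names.append(name)
    return names
```

Writing fixed-size chunks keeps each file small enough to re-download individually, so a single failed file can be retried without repeating the whole 500k-object export.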
This change should also take into account the multi-site change request that is being developed in parallel (see https://jira.ez.no/browse/EZS-789). The dump should only contain the published items of ONE site access, as they will be imported into separate accounts in the eZ Pers. Service.