1. Configuration of the CERNBox sync client to store data locally: the directories listed
above are stored locally on the hard drive. The CERNBox client uses the local
CERNBox path to synchronise its content with the server. The Windows Registry is
then configured accordingly for the typical Windows folders (refer to Section 5.2).
2. Configuration of a network mapped drive using the SMB protocol: in this case, data
is not stored locally. A network drive is created pointing to the SMB gateway for
CERNBox, and the Windows Registry is updated to point to the corresponding remote
folders. This configuration is recommended for shared computers (e.g. meeting room PCs),
where having the client synchronise the profiles of many users would not be optimal
and could quickly fill up the hard drive.
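As an illustrative sketch of the second option, the folder redirection described above could be expressed as a registry fragment updating the User Shell Folders key. The drive letter and paths below are hypothetical and depend on the deployment; in our setup the equivalent keys are set by the configuration tooling rather than imported by hand:

```ini
Windows Registry Editor Version 5.00

; Hypothetical example: redirect the user's Documents and Desktop folders
; to a CERNBox location. Z: stands for a network drive mapped to the
; CERNBox SMB gateway (option 2); for option 1 it would instead be the
; local sync folder, e.g. C:\Users\<user>\cernbox.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Personal"="Z:\\Documents"
"Desktop"="Z:\\Desktop"
```

The drive mapping itself would be created beforehand, e.g. with `net use Z: \\<smb-gateway>\<share>`, where the gateway hostname and share are site-specific placeholders.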
6 Outlook
We plan to introduce more application integrations with alternative open-source solutions,
as discussed in Section 3, to reduce our dependence on existing commercial software. The
usage and ease of adoption of these alternative products will be monitored during the coming
years. At the same time, we aim to increase the productivity of our users in their daily
workflows, in particular when it comes to uploading and downloading files from different
clouds: for example, many users edit LaTeX documents on the Overleaf platform to
prepare scientific papers, and the data needed to prepare them is typically already in CERNBox. We
are also exploring a concept we call Bring Your Own Application, to allow users
and developers to efficiently connect their preferred applications to CERNBox (refer to [9] for a more
detailed explanation) using the micro-services deployment pattern [6, 7].
References
[1] E. Ormancey, Migrating to open-source technologies, CERN Computing Newsletter 12
June 2019, https://home.cern/news/news/computing/migrating-open-source-technologies
[2] ownCloud, https://owncloud.com (access time: 10/03/2018)
[3] L. Mascetti, H. Gonzalez Labrador, M. Lamanna, J.T. Mościcki, A.J. Peters, Journal
of Physics: Conference Series 664, 062037, CERNBox + EOS: end-user storage for
science (2015)
[4] H. G. Labrador, CERNBox: Petabyte-Scale Cloud Synchronisation and Sharing Platform
(University of Vigo, Ourense, 2015) EI15/16-02
[5] H. G. Labrador et al., EPJ Web of Conferences 214, 04038, CERNBox: the CERN storage
hub (2019)
[6] J. Thönes, IEEE Software 32(1), Microservices (2015)
[7] N. Dragoni, S. Giallorenzo, A. L. Lafuente, M. Mazzara, F. Montesi, R. Mustafin,
L. Safina, Springer, Cham, Microservices: yesterday, today, and tomorrow, in Present
and Ulterior Software Engineering (2017)
[8] WOPI, https://wopi.readthedocs.io/en/latest/ (access time: 10/03/2020)
[9] H. G. Labrador et al., Increasing interoperability: CS3APIs, Proceedings of the 24th
International Conference on Computing in High Energy & Nuclear Physics (CHEP 2019)
[10] Maria Alandes Pradillo et al., Experience Finding MS Project Alternatives at CERN,
Proceedings of the 24th International Conference on Computing in High Energy & Nuclear
Physics (CHEP 2019)
[11] Ascensio System SIA, https://www.onlyoffice.com/ (access time: 10/03/2020)
[12] Collabora Ltd, https://www.collabora.com/ (access time: 10/03/2020)
EPJ Web of Conferences 245, 08015 (2020)
CHEP 2019
https://doi.org/10.1051/epjconf/202024508015