Remote Data Access
The storage can be accessed remotely using the S3 protocol.
To access the storage remotely you will need an access token. Log in to the Analysis Platform and go to your user profile page, where the token can be found.
MinIO client
The recommended way of accessing the remote storage is the MinIO client, a command line tool that, once installed, lets you transfer files between your local machine and the remote storage. After installing it, register the storage endpoint with an alias:
mc alias set ioa https://darkroom.ast.cam.ac.uk:9443 username accessToken

Here username is your username and accessToken is the token available in your user profile page.
The MinIO client commands are described extensively in the MinIO Client User Guide. Some useful commands are described below.
To list the contents of the storage:
mc ls ioa

Each directory name that appears in the output of the command above is called a bucket.
You can copy a file to the remote storage using:
mc cp some-local-file ioa/bucket_name/

To copy a directory, use the recursive flag -r:
mc cp -r some-local-dir ioa/bucket_name/

To copy data from the remote storage to your local machine:
mc cp -r ioa/bucket_name/dataset-name .

You can also delete a file or directory with:
mc rm -r --force ioa/bucket_name/name_to_delete

Remote mounting the storage
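The mc tool itself does not mount the storage as a filesystem. One possible approach, offered as a sketch rather than an official recommendation for this platform, is rclone, which can mount an S3-compatible remote via FUSE. The remote name ioa, the bucket name, and the mount point below are illustrative:

```shell
# Illustrative remote definition, e.g. in ~/.config/rclone/rclone.conf
# (same credentials as the mc alias):
#
#   [ioa]
#   type = s3
#   provider = Minio
#   access_key_id = username
#   secret_access_key = accessToken
#   endpoint = https://darkroom.ast.cam.ac.uk:9443

# Mount a bucket at an empty local directory (requires FUSE):
mkdir -p ~/ioa-mount
rclone mount ioa:bucket_name ~/ioa-mount
```

Alternatively, Mountain Duck (listed under graphical clients below) mounts the storage as a drive on Windows and macOS.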
Access from Python scripts
There are two main ways of connecting to the S3 storage from Python: the minio package, or the more generic s3fs package.
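A minimal sketch using the minio package (install with pip install minio). The credentials are the same username and access token used for mc; the bucket and object names are placeholders. Note that this requires network access to the storage endpoint:

```python
from minio import Minio  # pip install minio

# Connect using the same endpoint and credentials as the mc alias.
client = Minio(
    "darkroom.ast.cam.ac.uk:9443",
    access_key="username",      # your username
    secret_key="accessToken",   # token from your user profile page
)

# List buckets -- the Python equivalent of `mc ls ioa`.
for bucket in client.list_buckets():
    print(bucket.name)

# Download an object, like `mc cp ioa/bucket_name/dataset-name .`
client.fget_object("bucket_name", "dataset-name", "dataset-name")
```

The s3fs package instead exposes the storage through a filesystem-like interface (s3fs.S3FileSystem, passing the endpoint via client_kwargs={"endpoint_url": ...}), which is convenient for libraries that accept fsspec-style paths.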
Graphical clients
- CyberDuck (Windows, macOS)
- MountainDuck (Windows, macOS)
- Transmit (macOS)
- WinSCP (Windows)
Sharing data
It is possible to share data with external collaborators without giving them access to the full storage, using the mc share command. For example, the following command creates a URL that allows the specified file to be downloaded:
mc share download ioa/imaxt/eglez/imaxtreg/axio2stpt.tar.gz
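URLs generated this way are valid only for a limited time. If memory serves, mc share download accepts an --expire flag taking a duration (up to seven days); check the MinIO Client Guide for the exact limits. For example:

```shell
# Create a download link valid for 4 hours (path reused from above):
mc share download --expire=4h ioa/imaxt/eglez/imaxtreg/axio2stpt.tar.gz
```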