
Integration Storage API

date: "2023-09-05" Author: "Sinisa Jovic"


Integration guide

Direct AWS CLI integration

The initial data export steps are explained below to guide you through the integration process.

The tables below are the main source of information for your data export. They list all data categories (Transactions, Items, Leaflets, Customers, Stores) required by the Solver Studios (Campaign Studio, Touchpoint Studio, Segmentation Studio, AI Studios recommender, CLV models, etc.), with all the details and the necessity level of each field (Mandatory, Recommended, Optional). Examples of source data are also given on a separate page to illustrate what the data you're sending us should look like.

A unique file is generated for each data category (e.g. Items, Customers, Transactions).

Where?

Data is delivered to an AWS S3 bucket named after the client (e.g. s3://sais-{client_name}). Credentials for the client's bucket on Solver infrastructure are delivered by email and SMS.

Entity        Path
Transactions  s3://sais-{client_name}/initial-data/transactions/transactions_{date}.csv
Items         s3://sais-{client_name}/initial-data/items/items.csv
Customers     s3://sais-{client_name}/initial-data/customers/customers.csv
Leaflet       s3://sais-{client_name}/initial-data/leaflet/leaflet.csv
Store         s3://sais-{client_name}/initial-data/stores/stores.csv
All of the paths listed above are fixed and are used for the initial data export as well as for all future daily data exports.
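To avoid typos in the fixed paths above, it can help to generate them programmatically. A minimal Python sketch (the `client` value is a placeholder, and the ISO date format for the transactions file is an assumption — confirm the exact `{date}` format with the Things Solver team):

```python
from datetime import date

def entity_path(client: str, entity: str) -> str:
    """Fixed path for entities whose file name never changes (items, customers, ...)."""
    return f"s3://sais-{client}/initial-data/{entity}/{entity}.csv"

def transactions_path(client: str, day: date) -> str:
    """Dated transactions file; ISO date format is an assumption, not confirmed."""
    return f"s3://sais-{client}/initial-data/transactions/transactions_{day.isoformat()}.csv"

print(entity_path("acme", "items"))
# -> s3://sais-acme/initial-data/items/items.csv
```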

How often?

All data is delivered once per day, ideally during the night (around 2-3am). The transactions file delivered each day contains the previous day's transactions.
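Since each nightly export carries the previous day's transactions, the date in the file name is "yesterday" relative to the upload time. A small Python sketch of this convention (a sketch only, not part of the official tooling):

```python
from datetime import date, timedelta

def export_date(today: date) -> date:
    """The transactions file uploaded tonight covers the previous day."""
    return today - timedelta(days=1)

# An upload running in the night of 2023-09-05 carries data for 2023-09-04.
print(export_date(date(2023, 9, 5)))  # -> 2023-09-04
```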

Data Storage (S3) Instructions

We create the 'sais-{client_name}' bucket on S3. Your data should be stored in the folder 'initial-data'. If you have never used S3, the following four steps are useful:
  1. First, install the AWS command line interface (aws-cli). The AWS documentation provides installation steps for each OS. On Windows you will use the graphical installer; on Linux it is enough to copy the first block of commands from the documentation.
  2. After installation, add the user profile with the command: 'aws configure import --csv file:///path/to/file/sais-{client_name}.csv'. This CSV file is attached to the email the Things Solver team sent to the client after the initial environment setup; that email contains the credentials for accessing S3.
  3. You can check that the profile was added with the following command: 'aws configure list-profiles'. The output should be a list of all configured profiles (if there is more than one), one of which is 'sais-{client_name}'.
  4. To upload files to S3, use the 'cp' command in the following format: 'aws s3 cp /path/to/file/data_to_upload.csv s3://sais-{client_name}/initial-data/folder-you-want-to-upload-to/ --profile sais-{client_name}'.
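The destination in step 4 changes with the entity being uploaded, so it is easy to mistype. A Python sketch that assembles the 'aws s3 cp' invocation from step 4 for one file (the helper and its names are illustrative, not part of the Things Solver tooling):

```python
def upload_command(client: str, local_file: str, entity_folder: str) -> str:
    """Assemble the 'aws s3 cp' command from step 4 for a single file."""
    dest = f"s3://sais-{client}/initial-data/{entity_folder}/"
    return f"aws s3 cp {local_file} {dest} --profile sais-{client}"

print(upload_command("acme", "/data/items.csv", "items"))
# -> aws s3 cp /data/items.csv s3://sais-acme/initial-data/items/ --profile sais-acme
```

The resulting string can be pasted into a shell, or run from a scheduler once the 'sais-{client_name}' profile from step 2 is in place.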

After setup, you can upload all files from a local directory at once. For CSV files, use the following command:

aws s3 cp /path/to/folder s3://sais-{client_name}/initial-data --recursive --exclude "*" --include "*.csv" --profile sais-{client_name}
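The --exclude "*" --include "*.csv" pair makes the recursive copy pick up only CSV files. If you prefer to preselect the files before uploading, an equivalent filter in Python (a sketch under that assumption, not part of the official tooling):

```python
from pathlib import Path

def csv_files(folder: str) -> list[str]:
    """Mirror --recursive --exclude '*' --include '*.csv': recursively keep only CSVs."""
    return sorted(str(p) for p in Path(folder).rglob("*.csv"))
```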

Because the bucket created for you has a predefined structure, pay attention to the location to which you send the data. All the data you send should be placed under 's3://sais-{client_name}/initial-data'.