PostgreSQL (Beta)

How to set up PostgreSQL imports and deployments.
PostgreSQL support is currently in beta. Please raise any issues you encounter via the chat bubble in the bottom-right of the site, and we'll be happy to help.

Overview

Our PostgreSQL integration lets you import your data so you can clean and transform it, and ask it questions via Chat Explore. When ready, you can deploy models trained on PostgreSQL data to make predictions on data still in your PostgreSQL database.
You must allow network traffic from the IP address 52.44.27.22 in order to allow Akkio to access your data.
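Where this allowlist lives depends on your hosting: on managed services it is usually a firewall or security-group rule, while on a self-managed server it may also involve a `pg_hba.conf` entry. As a sketch, a host-based authentication line permitting our IP might look like the following (the user name `akkio_user` is a placeholder for whatever database user you create for us):

```
# TYPE  DATABASE  USER        ADDRESS         METHOD
host    all       akkio_user  52.44.27.22/32  scram-sha-256
```

After editing `pg_hba.conf`, reload the server configuration for the change to take effect.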

Importing Data

To import data, first create a new flow.
Create a new flow from the homepage.
Select the PostgreSQL button.
Click "Connect Dataset" to add a new dataset.
Enter connection details for your database.
Make sure your security settings allow us to connect to your database. You'll need to add our IP Address, 52.44.27.22, to your security settings so that we can connect and pull your data in.
To pull in your data, input the following:
  • Hostname: The hostname of your PostgreSQL instance (the server address only, without a protocol prefix or port).
  • Database / Schema Name: The name of the database within your PostgreSQL instance that your table is inside. NOTE: a database instance holds multiple databases or schemas, and each database or schema holds multiple tables.
  • Username/Password: Credentials for the database user to authenticate. This user must have read permissions for a table to import and write permissions to deploy.
  • Port: The port we should use to connect to your database. This is generally 5432, PostgreSQL's default, but should be verified.
  • Table Name: The name of the table to pull from.
When complete, hit submit.
We will then validate the connection.
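If validation fails, it can help to check the same details outside Akkio first. The sketch below builds a libpq-style key/value connection string from the same fields the form asks for; `db.example.com`, `analytics`, `akkio_reader`, and `s3cret` are placeholder values. You could pass the resulting string to a client such as `psql` or `psycopg2.connect()` to confirm the credentials work.

```python
def make_dsn(host, dbname, user, password, port=5432):
    """Build a libpq key/value connection string from the same
    fields the import form asks for. Port defaults to PostgreSQL's
    standard 5432."""
    return (
        f"host={host} port={port} dbname={dbname} "
        f"user={user} password={password}"
    )

# Placeholder values for illustration only:
dsn = make_dsn("db.example.com", "analytics", "akkio_reader", "s3cret")
```

If connecting with this string fails from your own machine, the problem is likely the credentials or network rules rather than anything on our side.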
If we're successful, you'll move forward to the data loading screen, where your data is syncing behind the scenes.
You may need to wait a little while for the data to import. Small datasets should only take a few minutes; large datasets may take longer.
Once your data is done importing, you'll see it populate onto the screen.
From here, you can do anything you like to the data - clean and transform it on the Prepare tab, ask Chat Explore questions on the Explore tab, you name it!

Deployments

We also offer the ability to deploy models based on your PostgreSQL datasets. Deployments allow you to make predictions on your PostgreSQL data based on a trained model.
PostgreSQL deployments are currently only supported for data imported from PostgreSQL.
To get started, first train a model on your PostgreSQL data.
Then, select the PostgreSQL option on the Deploy tab:
Provide the Schema Name and Table Name of the table you wish to run predictions on.
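One thing to watch for here: PostgreSQL folds unquoted identifiers to lowercase, so a table created with quoted mixed-case names must be referenced with exactly that casing. The sketch below applies standard PostgreSQL identifier quoting (wrap in double quotes, doubling any embedded double quotes); the helper name `qualified_name` is our own, not part of any API.

```python
def qualified_name(schema, table):
    """Join a schema and table into a fully qualified PostgreSQL
    name, quoting each identifier per standard PostgreSQL rules:
    wrap in double quotes and double any embedded double quotes."""
    def quote(ident):
        return '"' + ident.replace('"', '""') + '"'
    return f"{quote(schema)}.{quote(table)}"
```

For example, `qualified_name("public", "orders")` yields `"public"."orders"`, which refers to the same table as the unquoted `public.orders`, whereas a mixed-case name like `MyTable` only matches if it was created quoted.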
Click "Verify Table" to validate that we can connect to the table.
Once the table is validated, deployment options will appear.
The available options are:
  • Scheduler: Select how often we should run predictions on your provided table.
  • Run on Deploy: Select whether you want us to run predictions immediately when this deployment is created, in addition to the scheduled runs.
  • Map Fields: Allows you to map the fields from your trained model to the fields they should correspond to in this table, if they differ.
  • Apply Data Prep: Whether we should transform the data with any Data Prep steps you specified in the Prepare tab.
Once you've selected your options, hit "Show Preview" for us to run some sample predictions.
Depending on the size of your table, this operation may be long-running.
Once complete, a preview table will be rendered with additional columns containing our predictions.
For classification models, these will be one column per category containing the probability of the row belonging to that category, as well as a column containing the Unix timestamp at which we made the prediction.
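To make that concrete, here is a sketch of reading one such output row. The column names (`"Probability Churn"`, `"Prediction Timestamp"`, etc.) are hypothetical placeholders, not the exact names we emit; the point is the shape: per-category probability columns plus a Unix-timestamp column.

```python
from datetime import datetime, timezone

# Hypothetical output row: per-category probability columns plus a
# Unix-timestamp column recording when the prediction was made.
row = {
    "Probability Churn": 0.82,
    "Probability No Churn": 0.18,
    "Prediction Timestamp": 1700000000,
}

# Pick the most likely category from the probability columns.
probs = {k: v for k, v in row.items() if k.startswith("Probability")}
top_category = max(probs, key=probs.get)

# Convert the Unix timestamp into a timezone-aware UTC datetime.
predicted_at = datetime.fromtimestamp(row["Prediction Timestamp"],
                                      tz=timezone.utc)
```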
To actually deploy this model so it will make scheduled predictions, hit the "Deploy" button in the top-right:
We will push this new predictions table to a table that's named the same as the input table, but with _akkio_predictions added to the end.
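In other words, the output table name is derived from the input table name, which the following one-liner sketches (the function name is ours, for illustration):

```python
def predictions_table_name(input_table):
    """Name of the output table: the input table's name with
    _akkio_predictions appended."""
    return f"{input_table}_akkio_predictions"
```

So predictions for a table named `orders` would land in `orders_akkio_predictions` in the same schema.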
Once we've deployed this model, you'll get a notification like this.
And there you go! Model deployed.