Python Library
Python library for Akkio
API Keys
As shown in the code samples below, you must copy your API key into your code. You can find it on the team settings page, linked at the bottom of the Akkio app.
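Rather than pasting the key directly into source files, it can be loaded from the environment. A minimal sketch; the variable name AKKIO_API_KEY and the helper are our own convention, not part of the library:

```python
import os

def load_api_key(env_var="AKKIO_API_KEY"):
    """Read the Akkio API key from an environment variable instead of hard-coding it."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(
            f"Set {env_var} to the API key from your Akkio team settings page"
        )
    return key

# akkio.api_key = load_api_key()
```

This keeps the key out of version control; the commented line shows where it would plug into the examples below.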

Installation
pip install akkio
Example Usage
import akkio
import random

akkio.api_key = 'YOUR-API-KEY-HERE'  # get your API key at https://app.akkio.com/team-settings

# list models in your organization
models = akkio.get_models()['models']
for model in models:
    print(model)

# list datasets in your organization
datasets = akkio.get_datasets()['datasets']
for dataset in datasets:
    print(dataset)

# create a new empty dataset
new_dataset = akkio.create_dataset('python api test')
print(new_dataset)

# add 1,000 rows of random data to the dataset
rows = []
for i in range(1000):
    rows.append({'x': random.random()})
    rows[-1]['y'] = rows[-1]['x'] > 0.5
akkio.add_rows_to_dataset(new_dataset['dataset_id'], rows)

# create a model that predicts 'y', training for 1 second
new_model = akkio.create_model(new_dataset['dataset_id'], ['y'], [], {'duration': 1})
print(new_model)

# make a prediction using the model
prediction = akkio.make_prediction(new_model['model_id'], [{'x': 0.1}, {'x': 0.7}], explain=True)
print(prediction)
Datasets
create_dataset(dataset_name)
Create a new empty dataset.
dataset_name: The name of your newly created dataset.
add_rows_to_dataset(dataset_id, rows)
Add rows to a dataset.
dataset_id: A dataset id
rows: An array of rows to be added to the dataset, in the following form:
[{ "field 1": "data", "field 2": "data" }, { ... }, ... ]
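For large datasets it can help to upload rows in batches rather than in a single call. A minimal sketch; the batch size and helper names are our own, not part of the Akkio API:

```python
def chunked(rows, size=500):
    """Yield successive batches of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

def add_rows_in_batches(dataset_id, rows, size=500):
    """Upload rows to a dataset in batches via add_rows_to_dataset."""
    import akkio  # imported lazily so the helper can be defined without the package
    for batch in chunked(rows, size):
        akkio.add_rows_to_dataset(dataset_id, batch)
```

Smaller batches keep individual requests lightweight and make it easier to retry just the batch that failed.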
get_datasets()
Get all datasets in your organization.
get_dataset(dataset_id)
Get a dataset.
dataset_id: A dataset id
parse_dataset(dataset_id)
Recalculate the field types for a dataset.
dataset_id: A dataset id
delete_dataset(dataset_id)
Delete a dataset.
dataset_id: A dataset id
Models
get_models()
Get all models in your organization.
delete_model(model_id)
Delete a model in your organization.
model_id: A model id
create_model(dataset_id, predict_fields, ignore_fields, params)
Create a model (requires a dataset).
dataset_id: A dataset id
predict_fields: An array of field names to predict (case sensitive)
ignore_fields: An array of field names to ignore (case sensitive) (optional)
params: A dict with a default value of:
{ "duration": 10, "extra_attention": False, "force": False }
duration: the duration in seconds to be used for model training
extra_attention: can be enabled to help with predicting rare cases
force: forces a new model to be created
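The documented defaults can be merged with per-call overrides before passing them to create_model. A small sketch; the helper and its validation are our own, not part of the library:

```python
# Defaults as documented for create_model's params argument.
DEFAULT_PARAMS = {"duration": 10, "extra_attention": False, "force": False}

def build_params(**overrides):
    """Return the documented defaults with any overrides applied."""
    unknown = set(overrides) - set(DEFAULT_PARAMS)
    if unknown:
        raise ValueError(f"unknown params: {sorted(unknown)}")
    return {**DEFAULT_PARAMS, **overrides}

# akkio.create_model(dataset_id, ['y'], [], build_params(duration=60))
```

Rejecting unknown keys catches typos like "durations" before the request is ever sent.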
make_prediction(model_id, data, explain=False)
Make a prediction using your model and new data.
model_id: A model id
data: An array of rows to be predicted, in the following form:
[{ "field 1": "data", "field 2": "data" }, { ... }, ... ]
explain: (optional) When True, an explanation is returned with each prediction, as in the example above.
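Because field names are case sensitive, it can be worth checking prediction rows against the model's input fields before calling make_prediction. A sketch; this validation helper is our own addition, not an API feature:

```python
def missing_fields(rows, required_fields):
    """Return (row_index, [missing field names]) for each incomplete row."""
    problems = []
    for i, row in enumerate(rows):
        absent = [f for f in required_fields if f not in row]
        if absent:
            problems.append((i, absent))
    return problems

rows = [{'x': 0.1}, {'X': 0.7}]  # note the capital 'X' typo in the second row
print(missing_fields(rows, ['x']))  # → [(1, ['x'])]
```

Catching a mis-cased or missing field locally avoids sending a request whose rows the model cannot score.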