Speed Running the Ingestion

Objective

After completing this lesson, you will be able to expedite data ingestion in the AI Workbench.

Ingesting a Large Dataset

The purpose of this lesson is to quickly review the steps required to ingest a large dataset into the AI Workbench, run the models, publish the predictive indicators, and ultimately obtain trend insights.

The following video demonstrates a speed run of the customer data ingestion process:

If you decide to replicate the ingestion steps shown in the video, use the following JSON configuration snippets.

The following snippet shows the Profile Event Model JSON configuration:

JSON
{
  "title": "Profiles",
  "type": "object",
  "additionalProperties": true,
  "properties": {
    "firstName": { "type": "string" },
    "lastName": { "type": "string" },
    "gender": { "type": "integer" },
    "primaryEmail": { "type": "string" },
    "primaryPhone": { "type": "string" },
    "masterDataId": { "type": "string" },
    "birthDate": { "type": "string", "format": "date-time" },
    "timestamp": { "type": "string", "format": "date-time" }
  }
}
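To make the schema concrete, the following is a minimal Python sketch of a profile event that conforms to the Profile Event Model above. All field values are illustrative assumptions, not real customer data, and the type check uses only the standard library (no schema validator is assumed to be available):

```python
import json
from datetime import datetime, timezone

# Illustrative profile event matching the Profile Event Model schema.
# Every value here is made up for demonstration purposes.
profile_event = {
    "firstName": "Jane",
    "lastName": "Doe",
    "gender": 1,
    "primaryEmail": "jane.doe@example.com",
    "primaryPhone": "+1-555-0100",
    "masterDataId": "cust-0001",
    "birthDate": "1990-04-12T00:00:00Z",
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# Lightweight sanity check against the types the schema declares.
expected_types = {
    "firstName": str, "lastName": str, "gender": int,
    "primaryEmail": str, "primaryPhone": str, "masterDataId": str,
    "birthDate": str, "timestamp": str,
}
for field, expected in expected_types.items():
    assert isinstance(profile_event[field], expected), field

# Serialize the event as it might be submitted for ingestion.
payload = json.dumps(profile_event)
```

Note that `gender` is an integer in this model while the date fields are ISO 8601 strings, matching the `date-time` format declared in the schema.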

This next snippet shows the Order Event Model JSON configuration used to ingest orders:

JSON
{
  "title": "Orders",
  "type": "object",
  "additionalProperties": true,
  "properties": {
    "masterDataId": { "type": "string" },
    "timestamp": { "type": "string", "format": "date-time" },
    "currency": { "type": "string" },
    "id": { "type": "string" },
    "productId": { "type": "string" },
    "amount": { "type": "string" },
    "tax": { "type": "string" }
  }
}
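Likewise, a minimal Python sketch of an order event conforming to the Order Event Model above might look like this. The identifiers and amounts are illustrative assumptions; note that `amount` and `tax` are declared as strings in the schema, so monetary values are serialized as text:

```python
import json

# Illustrative order event matching the Order Event Model schema.
# All values are made up for demonstration purposes.
order_event = {
    "masterDataId": "cust-0001",
    "timestamp": "2024-03-01T12:00:00Z",
    "currency": "USD",
    "id": "order-1001",
    "productId": "sku-42",
    "amount": "19.99",  # string, per the schema
    "tax": "1.60",      # string, per the schema
}

# Every property in this model is a string, so a single check suffices.
for field, value in order_event.items():
    assert isinstance(value, str), field

# Serialize the event as it might be submitted for ingestion.
payload = json.dumps(order_event)
```

Keeping monetary fields as strings avoids floating-point rounding at the ingestion boundary; convert them to a decimal type downstream if arithmetic is needed.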